
MODULE 3

SOFTWARE ESTIMATION AND SCHEDULING

DR. MADHAVI WAGHMARE


THE MANAGEMENT SPECTRUM
• Effective software project management focuses on the four Ps: people, product, process, and project.
• People: The People Capability Maturity Model (People-CMM) was developed in recognition of the fact that
“every organization needs to continually improve its ability to attract, develop, motivate,
organize, and retain the workforce needed to accomplish its strategic business objectives.”
• The People-CMM defines the following key practice areas for software people: staffing,
communication and coordination, work environment, performance management, training,
compensation, competency analysis and development, career development, workgroup
development, and team/culture development, among others.
THE MANAGEMENT SPECTRUM
• The Product : Before a project can be planned, product objectives and scope should
be established, alternative solutions should be considered, and technical and
management constraints should be identified.
• Without this information, it is impossible to define reasonable (and accurate)
estimates of the cost, an effective assessment of risk, a realistic breakdown of project
tasks, or a manageable project schedule that provides a meaningful indication of
progress.
• As a software developer, you and other stakeholders must meet to define product
objectives and scope.
• Objectives identify the overall goals for the product (from the stakeholders’ points of
view) without considering how these goals will be achieved.
• Scope identifies the primary data, functions, and behaviors that characterize the
product, and more important, attempts to bound these characteristics in a quantitative
manner.
THE MANAGEMENT SPECTRUM
• The Process
• A software process provides the framework from which a comprehensive plan for software
development can be established.
• A small number of framework activities are applicable to all software projects, regardless of
their size or complexity.
• A number of different task sets—tasks, milestones, work products, and quality assurance
points—enable the framework activities to be adapted to the characteristics of the software
project and the requirements of the project team. Finally, umbrella activities—such as software
quality assurance, software configuration management, and measurement—overlay the process
model.
• Umbrella activities are independent of any one framework activity and occur throughout the
process.
THE MANAGEMENT SPECTRUM

The Project
• Planned and controlled software projects are conducted for one primary reason: it is the
only known way to manage complexity.
• To avoid project failure, a software project manager and the software engineers who
build the product must avoid a set of common warning signs, understand the critical
success factors that lead to good project management, and develop a commonsense
approach for planning, monitoring, and controlling the project.
PROCESS AND PROJECT METRICS
• What is it? Quantitative measures that enable you to gain insight into the efficacy of
the software process and the projects that are conducted using the process as a
framework.
• Who does it? Software metrics are analyzed and assessed by software managers;
measures are often collected by software engineers.
• Why is it important? If you don’t measure, judgment can be based only on subjective
evaluation. With measurement, trends (either good or bad) can be spotted, better
estimates can be made, and true improvement can be accomplished over time.
• What are the steps? Begin by defining a limited set of process, project, and product
measures that are easy to collect.
SOFTWARE MEASUREMENT

Software measurement is performed for four reasons:
(1) to characterize, in an effort to gain an understanding “of processes, products,
resources, and environments, and to establish baselines for comparisons with future
assessments”;
(2) to evaluate “to determine status with respect to plans”;
(3) to predict by “gaining understandings of relationships among processes and
products and building models of these relationships”; and
(4) to improve by “identify[ing] roadblocks, root causes, inefficiencies, and other
opportunities for improving product quality and process performance.”
Process Metrics and Software Process Improvement

Figure: Determinants for software quality and organizational effectiveness (the process at the
center of a triangle of people, product, and technology, surrounded by the development
environment, business conditions, and customer characteristics).
• The process is only one of a number of “controllable factors in improving software
quality and organizational performance.”
• The efficacy of a software process (how satisfactorily it performs its tasks) is measured
indirectly.
• The process sits at the center of a triangle connecting three factors that have a
profound influence on software quality and organizational performance.
• The skill and motivation of people have been shown to be the most influential
factors in quality and performance.
• The complexity of the product can have a substantial impact on quality and
team performance.
• The technology (i.e., the software engineering methods and tools) that populates
the process also has an impact.
• The process triangle exists within a circle of environmental conditions that
include the development environment (e.g., integrated software tools), business
conditions (e.g., deadlines, business rules), and customer characteristics (e.g.,
ease of communication and collaboration).
SOFTWARE PROJECT ESTIMATION
• Software cost and effort estimation will never be an exact science. Too many variables
(human, technical, environmental, political) can affect the ultimate cost of software and
the effort applied to develop it.
• However, software project estimation can be transformed from a black art to a series
of systematic steps that provide estimates with acceptable risk.
To achieve reliable cost and effort estimates, a number of options arise:
1. Delay estimation until late in the project (obviously, we can achieve
100 percent accurate estimates after the project is complete!).
2. Base estimates on similar projects that have already been completed.
3. Use relatively simple decomposition techniques to generate project cost
and effort estimates.
4. Use one or more empirical models for software cost and effort estimation.
DECOMPOSITION TECHNIQUES
• Decompose the problem, recharacterizing it as a set of smaller (and, hopefully, more manageable) problems.
• The decomposition approach was discussed from two different points of view:
decomposition of the problem and decomposition of the process.
• Estimation uses one or both forms of partitioning.
• But before an estimate can be made, you must understand the scope of the
software to be built and generate an estimate of its “size.”
DECOMPOSITION TECHNIQUES

Software Sizing: The accuracy of a software project estimate is predicated on a number
of things:
(1) the degree to which you have properly estimated the size of the product to be built;
(2) the ability to translate the size estimate into human effort, calendar time, and dollars
(a function of the availability of reliable software metrics from past projects);
(3) the degree to which the project plan reflects the abilities of the software team; and
(4) the stability of product requirements and the environment that supports the software
engineering effort.
Software sizing approaches:
• Direct approach: Lines of Code (LOC)
• Indirect approach: Function Points (FP)
DECOMPOSITION TECHNIQUES

• Size can be estimated by considering:
• the type of project and its application domain,
• the functionality delivered (i.e., the number of function points),
• the number of components to be delivered,
• the degree to which a set of existing components must be modified for the new system.
• These sizing approaches can be combined statistically to create a three-point or
expected-value estimate: optimistic (low), most likely, and pessimistic (high) values for
size are developed and then combined.
DECOMPOSITION TECHNIQUES
A three-point or expected value can then be computed. The expected value for the
estimation variable (size) S is computed as a weighted average of the optimistic (Sopt),
most likely (Sm), and pessimistic (Spess) estimates:

S = (Sopt + 4*Sm + Spess) / 6
DECOMPOSITION TECHNIQUES
An Example of LOC-Based Estimation

Figure: Estimation table for the LOC method.
DECOMPOSITION TECHNIQUES

The range of LOC estimates for the 3D geometric analysis function is: optimistic, 4,600
LOC; most likely, 6,900 LOC; and pessimistic, 8,600 LOC. Applying the equation above, the
expected value for the 3D geometric analysis function is 6,800 LOC.

• A review of historical data indicates that the organizational average productivity for
systems of this type is 620 LOC/pm.
• Based on a burdened labor rate of $8,000 per month, the cost per line of code is
approximately $13.
• Based on the LOC estimate for all functions (about 33,200 LOC from the estimation
table) and the historical productivity data, the total estimated project cost is approximately
$431,000 (33,200 x $13) and the estimated effort is 54 person-months (33,200 / 620).
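To make the arithmetic concrete, here is a minimal Python sketch of the LOC-based calculation for the 3D geometric analysis function, using only the figures quoted above (the three-point size estimates, 620 LOC/pm, and the $8,000 burdened labor rate); the variable names are illustrative.

```python
# LOC-based estimate for the 3D geometric analysis function (figures from the text).

def expected_size(s_opt, s_likely, s_pess):
    """Three-point (expected value) estimate: S = (Sopt + 4*Sm + Spess) / 6."""
    return (s_opt + 4 * s_likely + s_pess) / 6

loc = expected_size(4600, 6900, 8600)     # -> 6800 LOC
productivity = 620                        # LOC per person-month (historical average)
labor_rate = 8000                         # burdened labor rate, $ per person-month

cost_per_loc = labor_rate / productivity  # ~ $12.90 per LOC (quoted as ~$13)
effort_pm = loc / productivity            # ~ 11 person-months for this function
cost = loc * cost_per_loc                 # ~ $87,700 for this function

print(f"{loc:.0f} LOC, {effort_pm:.1f} person-months, ${cost:,.0f}")
```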
DECOMPOSITION TECHNIQUES
An Example of FP-Based Estimation
Decomposition for FP-based estimation focuses on information domain values rather than software
functions.
DECOMPOSITION TECHNIQUES

Estimating
information
domain values

The organizational average productivity for systems of this type is 6.5 FP/pm. Based on a burdened labor rate of
$8,000 per month, the cost per FP is approximately $1,230. Based on the FP estimate and the historical
productivity data, the total estimated project cost is $461,000 and the estimated effort is 58 person-months.
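A small Python sketch of the same conversion. The FP total of 375 is an assumption chosen to be consistent with the quoted $461,000 and 58 person-month figures; the 6.5 FP/pm productivity and $8,000/pm labor rate are from the text.

```python
# Converting an FP estimate into cost and effort (productivity and labor rate from the text).

fp_estimate = 375        # assumed FP total, consistent with the quoted cost/effort
fp_per_pm = 6.5          # historical productivity, FP per person-month
labor_rate = 8000        # burdened labor rate, $ per person-month

cost_per_fp = labor_rate / fp_per_pm      # ~ $1,230 per FP
effort_pm = fp_estimate / fp_per_pm       # ~ 58 person-months
cost = fp_estimate * cost_per_fp          # ~ $461,500

print(f"${cost_per_fp:,.0f} per FP, {effort_pm:.0f} person-months, ${cost:,.0f}")
```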
DECOMPOSITION TECHNIQUES
Process-Based Estimation

The process is decomposed into a relatively small set of activities, actions,
and tasks, and the effort required to accomplish each is estimated.
Like the problem-based techniques, process-based estimation begins with a
delineation of software functions obtained from the project scope. A series
of framework activities must be performed for each function. Functions
and related framework activities may be represented in a table (see the
process-based estimation table below).
DECOMPOSITION TECHNIQUES
Example of Process-Based Estimation
• Estimates of effort (in person-months) for each software engineering activity are
provided for each CAD software function (abbreviated for brevity).
• The engineering and construction release activities are subdivided into the major software engineering tasks shown in the table.
• Gross estimates of effort are provided for customer communication, planning,
and risk analysis. These are noted in the total row at the bottom of the table.
• Horizontal and vertical totals provide an indication of estimated effort required
for analysis, design, code, and test.
• It should be noted that 53 percent of all effort is expended on front-end
engineering tasks (requirements analysis and design), indicating the relative
importance of this work.
• Based on an average burdened labor rate of $8,000 per month,
• the total estimated project cost is $368,000
• the estimated effort is 46 person-months.
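A sketch of the mechanics of process-based estimation: sum the per-activity effort estimates for each function, then convert to cost with the burdened labor rate. The per-activity numbers below are placeholders for illustration, not the values from the actual estimation table.

```python
# Process-based estimation: sum per-activity effort (person-months) per function,
# then convert to cost. The numbers below are illustrative placeholders only.

labor_rate = 8000  # $ per person-month

effort = {  # effort[function][activity] in person-months
    "UICF": {"analysis": 0.5, "design": 0.8, "code": 0.4, "test": 0.6},
    "3DGA": {"analysis": 2.5, "design": 4.0, "code": 1.5, "test": 2.0},
}

total_pm = sum(sum(activities.values()) for activities in effort.values())
total_cost = total_pm * labor_rate

print(f"{total_pm:.1f} person-months, ${total_cost:,.0f}")
```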
DECOMPOSITION TECHNIQUES

Figure: Process-based estimation table.
DECOMPOSITION TECHNIQUES
Estimation with Use Cases
Use cases provide a software team with insight into software scope and requirements.
Use-case points (UCP) are computed as:

UCP = (UUCW + UAW) x TCF x ECF

where UUCW is the unadjusted use-case weight, UAW is the unadjusted actor weight,
TCF is the technical complexity factor, and ECF is the environment complexity factor.
DECOMPOSITION TECHNIQUES
The example CAD software has three subsystem groups:
User interface subsystem (includes UICF)
Engineering subsystem group (includes the 2DGA, 3DGA, and DAM subsystems)
Infrastructure subsystem group (includes the CGDF and PCF subsystems)

16 complex use cases describe the user interface subsystem.
The engineering subsystem group is described by 14 average use cases and 8 simple
use cases.
The infrastructure subsystem group is described by 10 simple use cases.

Figure: Unadjusted Use-Case Weight (UUCW) table; UUCW = 470.
DECOMPOSITION TECHNIQUES
Analysis of the use cases indicates that there
are 8 simple actors, 12 average actors,
and 4 complex actors.

Figure: Unadjusted Actor Weight (UAW) table; UAW = 44.
DECOMPOSITION TECHNIQUES

Technical complexity factors (TCF), also called technical factors (TF): each factor has a
fixed weight and is rated on a scale of 0 to 5.

TCF = 1.04
DECOMPOSITION TECHNIQUES

ECF (Environment
complexity factors)

ECF = 0.96
DECOMPOSITION TECHNIQUES

TCF = 1.04

ECF = 0.96

Use-case points: UCP = (UUCW + UAW) x TCF x ECF = (470 + 44) x 1.04 x 0.96 = 513


DECOMPOSITION TECHNIQUES
• Using past project data as a guide, the development group has produced 85 LOC per UCP.
Therefore, an estimate of the overall size of the CAD project is 43,600 LOC (85 x 513).
• Similar computations can be made for applied effort or project duration.
• Using 620 LOC/pm as the average productivity for systems of this type and
a burdened labor rate of $8,000 per month,
The cost per line of code is approximately $13 ($8,000 / 620).
• Based on the use-case estimate and the historical productivity data, the total estimated
project cost is approximately $552,000 and the estimated effort is about 70 person-months
(43,600 / 620). (Note: 43,600 LOC x $13/LOC is about $566,800; the difference arises from
rounding in the published example.)
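The whole use-case-point chain can be reproduced with a short Python sketch. The use-case and actor counts, TCF, ECF, 85 LOC/UCP, and 620 LOC/pm are taken from the example above; the per-item weights (5/10/15 for use cases, 1/2/3 for actors) are the standard UCP weights assumed here.

```python
# Use-case point (UCP) estimate for the CAD example.

UC_WEIGHTS = {"simple": 5, "average": 10, "complex": 15}   # standard UCP weights (assumed)
ACTOR_WEIGHTS = {"simple": 1, "average": 2, "complex": 3}

use_cases = {"simple": 8 + 10, "average": 14, "complex": 16}  # counts from the example
actors = {"simple": 8, "average": 12, "complex": 4}

uucw = sum(UC_WEIGHTS[k] * n for k, n in use_cases.items())    # 470
uaw = sum(ACTOR_WEIGHTS[k] * n for k, n in actors.items())     # 44
ucp = (uucw + uaw) * 1.04 * 0.96                               # ~513, using TCF and ECF

loc = 85 * ucp            # ~43,600 LOC, at 85 LOC per UCP
effort_pm = loc / 620     # ~70 person-months, at 620 LOC per person-month

print(f"UCP = {ucp:.0f}, size = {loc:,.0f} LOC, effort = {effort_pm:.0f} person-months")
```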
METRICS FOR THE REQUIREMENTS MODEL

• Function-Based Metrics : The function point (FP) metric can be used effectively as
a means for measuring the functionality delivered by a system. Using historical
data, the FP metric can then be used to (1) estimate the cost or effort required to
design, code, and test the software; (2) predict the number of errors that will be
encountered during testing; and (3) forecast the number of components and/or the
number of projected source lines in the implemented system.
Function points are derived using an empirical relationship based on countable (direct)
measures of software’s information domain and qualitative assessments of software complexity.
Information domain values are defined in the following manner:
Number of external inputs (EIs). Each external input originates from a user or is transmitted
from another application and provides distinct application-oriented data or control information.
Inputs are often used to update internal logical files (ILFs). Inputs should be distinguished from
inquiries, which are counted separately.
Number of external outputs (EOs). Each external output is derived data within the application
that provides information to the user. In this context external output refers to reports, screens,
error messages, and the
like. Individual data items within a report are not counted separately.
Number of external inquiries (EQs). An external inquiry is defined as an online input that
results in the generation of some immediate software response in the form of an online output
(often retrieved from an ILF).
Number of internal logical files (ILFs). Each internal logical file is a logical grouping of data
that resides within the application’s boundary and is maintained via external inputs.
Number of external interface files (EIFs). Each external interface file is a logical grouping of
data that resides external to the application but provides data that may be of use to the
application.
Computing function points
Once these data have been collected, the table in Figure is completed and a complexity value is
associated with each count. Organizations that use function point methods develop criteria for
determining whether a particular entry is simple, average, or complex. Nonetheless, the determination
of complexity is somewhat subjective.
To compute function points (FP), the following relationship is used:
FP = count total x [0.65 + 0.01 x sum(Fi)]
where count total is the sum of all weighted information domain entries and the Fi (i = 1 to 14)
are value adjustment factors based on responses to the following questions:
1. Does the system require reliable backup and recovery?
2. Are specialized data communications required to transfer information to or from the application?
3. Are there distributed processing functions?
4. Is performance critical?
5. Will the system run in an existing, heavily utilized operational environment?
6. Does the system require online data entry?
7. Does the online data entry require the input transaction to be built over multiple screens or
operations?
8. Are the ILFs updated online?
9. Are the inputs, outputs, files, or inquiries complex?
10. Is the internal processing complex?
11. Is the code designed to be reusable?
12. Are conversion and installation included in the design?
13. Is the system designed for multiple installations in different organizations?
14. Is the application designed to facilitate change and ease of use by the user?
Figure: A flow model for the SafeHome user interaction function.
Each of these questions is answered using an ordinal scale that ranges from 0
(not important or applicable) to 5 (absolutely essential). The constant values in
the FP equation above and the weighting factors that are applied to the information domain
counts are determined empirically.

The flow diagram is evaluated to determine a set of key information domain
measures required for computation of the function point metric. Three external
inputs (password, panic button, and activate/deactivate) are shown in the figure
along with two external inquiries (zone inquiry and sensor inquiry). One ILF
(system configuration file) is shown. Two external outputs (messages and sensor
status) and four EIFs (test sensor, zone setting, activate/deactivate, and alarm
alert) are also present. These data, along with the appropriate complexity, are
shown in the figure.
Computing function points:

FP = 50 x [0.65 + (0.01 x 46)] = 56


• Based on the projected FP value derived from the requirements model, the
project team can estimate the overall implemented size of the SafeHome
user interaction function.
• Assume that past data indicates that one FP translates into 60 lines of code
(an object-oriented language is to be used) and that 12 FPs are produced for
each person-month of effort. These historical data provide the project
manager with important planning information that is based on the
requirements model rather than preliminary estimates.
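As a quick check of the SafeHome numbers, a few lines of Python reproduce the FP value and the planning figures implied by the historical conversion factors quoted above (60 LOC per FP, 12 FP per person-month).

```python
# SafeHome user interaction function: FP value and derived planning figures.

count_total = 50                            # weighted information domain count (from the text)
sum_fi = 46                                 # sum of the 14 value adjustment factors

fp = count_total * (0.65 + 0.01 * sum_fi)   # ~55.5, rounded to 56 in the text
loc_estimate = fp * 60                      # ~3,300 LOC, at 60 LOC per FP
effort_pm = fp / 12                         # ~4.6 person-months, at 12 FP per pm

print(f"FP = {fp:.1f}, size = {loc_estimate:,.0f} LOC, effort = {effort_pm:.1f} pm")
```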
Exercise: Given the following values, compute FP when all complexity adjustment factors and
weighting factors are average:
User inputs = 50
User outputs = 40
User inquiries = 35
User files = 6
External interfaces = 4
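One way to work this exercise is sketched below. It assumes the standard “average” weights for the information domain values (EI = 4, EO = 5, EQ = 4, ILF = 10, EIF = 7) and an “average” rating of 3 for each of the 14 complexity adjustment questions.

```python
# Exercise: FP with all average weights and average (3) adjustment ratings.

counts = {"EI": 50, "EO": 40, "EQ": 35, "ILF": 6, "EIF": 4}
avg_weights = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}   # assumed "average" weights

count_total = sum(counts[k] * avg_weights[k] for k in counts)    # 628
sum_fi = 14 * 3                                                  # 14 questions rated 3
fp = count_total * (0.65 + 0.01 * sum_fi)                        # 628 * 1.07 ~= 672

print(f"count total = {count_total}, FP ~= {fp:.0f}")
```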
Empirical Estimation Models - COCOMO II Model
• Boehm proposed COCOMO (COnstructive COst MOdel) in 1981.
• COCOMO is one of the most generally used software estimation models in the
world.
• COCOMO predicts the efforts and schedule of a software product based on the size
of the software.
Empirical Estimation Models - COCOMO II Model
The necessary steps in this model are:
1. Get an initial estimate of the development effort from an evaluation of the estimated
size in thousands of delivered lines of source code (KDLOC).
2. Determine a set of 15 multiplying factors from various attributes of the project.
3. Calculate the effort estimate by multiplying the initial estimate by all the
multiplying factors (i.e., multiply the values obtained in steps 1 and 2).
Empirical Estimation Models - COCOMO II Model

The initial estimate (also called the nominal estimate) is determined by
an equation of the form used in static single-variable models, with
KDLOC as the measure of size. To determine the initial effort Ei in
person-months, an equation of the following type is used:
Ei = a*(KDLOC)^b
The values of the constants a and b depend on the project type.

In COCOMO, projects are categorized into three types:
1. Organic
2. Semidetached
3. Embedded
Empirical Estimation Models - COCOMO II Model
1. Organic: A development project can be treated as organic type if the
project deals with developing a well-understood application program, the
size of the development team is reasonably small, and the team members
are experienced in developing similar types of projects.
Examples of this type of project are simple business systems,
simple inventory management systems, and data processing systems.
Empirical Estimation Models - COCOMO II Model

2. Semidetached: A development project can be treated as semidetached type if
the development team consists of a mixture of experienced and inexperienced staff.
Team members may have limited experience with related systems and may be
unfamiliar with some aspects of the system being developed.
Examples of semidetached systems include a new operating system (OS),
a database management system (DBMS), and a complex inventory
management system.
Empirical Estimation Models - COCOMO II Model
3. Embedded: A development project is treated as embedded type if the software
being developed is strongly coupled to complex hardware, or if stringent
regulations on the operational procedures exist.
Examples: ATM software, air traffic control.
For the three product categories, Boehm provides different sets of expressions
to predict effort (in units of person-months) and development time from the
size estimate in KLOC (kilo lines of code). The effort estimation takes into
account the productivity loss due to holidays, weekly offs, coffee breaks, etc.
Empirical Estimation Models - COCOMO II Model

According to Boehm, software cost estimation should be done
through three stages:
1. Basic Model
2. Intermediate Model
3. Detailed Model
Empirical Estimation Models - COCOMO II Model
Basic COCOMO Model:
The basic COCOMO model gives an approximate (rough) estimate of the project
parameters. The following expressions give the basic COCOMO
estimation model:
Effort = a1*(KLOC)^a2 PM
Tdev = b1*(Effort)^b2 Months
Where
KLOC is the estimated size of the software product indicate in Kilo Lines
of Code,
a1,a2,b1,b2 are constants for each group of software products,
Tdev is the estimated time to develop the software, expressed in months,
Effort is the total effort required to develop the software product,
expressed in person months (PMs).
Empirical Estimation Models - COCOMO II Model
Basic COCOMO Model:

Estimation of development effort:
For the three classes of software products, the formulas for
estimating the effort based on the code size are shown below:
Organic: Effort = 2.4*(KLOC)^1.05 PM
Semi-detached: Effort = 3.0*(KLOC)^1.12 PM
Embedded: Effort = 3.6*(KLOC)^1.20 PM
Empirical Estimation Models - COCOMO II Model
Basic COCOMO Model:

Estimation of development time:
For the three classes of software products, the formulas for estimating the
development time based on the effort are given below:
Organic: Tdev = 2.5*(Effort)^0.38 Months
Semi-detached: Tdev = 2.5*(Effort)^0.35 Months
Embedded: Tdev = 2.5*(Effort)^0.32 Months
Empirical Estimation Models - COCOMO II Model
Basic COCOMO Model:

Example 1: Suppose a project was estimated to be 400 KLOC.
Calculate the effort and development time for each of the three
modes, i.e., organic, semi-detached, and embedded.

Mode            a1    a2    b1    b2
Organic         2.4   1.05  2.5   0.38
Semi-detached   3.0   1.12  2.5   0.35
Embedded        3.6   1.20  2.5   0.32
Empirical Estimation Models - COCOMO II Model
Basic COCOMO Model:
Solution: The basic COCOMO equations take the form:
Effort = a1*(KLOC)^a2 PM
Tdev = b1*(Effort)^b2 Months
Estimated size of project = 400 KLOC
(i) Organic mode
E = 2.4 * (400)^1.05 = 1295.31 PM
D = 2.5 * (1295.31)^0.38 = 38.07 Months
(ii) Semidetached mode
E = 3.0 * (400)^1.12 = 2462.79 PM
D = 2.5 * (2462.79)^0.35 = 38.45 Months
(iii) Embedded mode
E = 3.6 * (400)^1.20 = 4772.81 PM
D = 2.5 * (4772.81)^0.32 = 38 Months
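The same calculation can be scripted. The sketch below encodes the a1, a2, b1, b2 constants from the tables above and reproduces the Example 1 results for all three modes.

```python
# Basic COCOMO for Example 1 (400 KLOC), all three modes.

COEFFS = {  # mode: (a1, a2, b1, b2)
    "organic": (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded": (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode):
    a1, a2, b1, b2 = COEFFS[mode]
    effort = a1 * kloc ** a2   # person-months
    tdev = b1 * effort ** b2   # months
    return effort, tdev

for mode in COEFFS:
    effort, tdev = basic_cocomo(400, mode)
    print(f"{mode:>12}: effort = {effort:7.2f} PM, tdev = {tdev:5.2f} months")
```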
Empirical Estimation Models - COCOMO II Model
Basic COCOMO Model:

Example 2: A project of size 200 KLOC is to be developed. The software
development team has average experience on similar types of projects. The
project schedule is not very tight. Calculate the effort, development time,
average staff size, and productivity of the project.
Empirical Estimation Models - COCOMO II Model
Basic COCOMO Model:
Solution: The semidetached mode is the most appropriate, keeping in view the
size, schedule, and experience of the development team.
Hence E = 3.0*(200)^1.12 = 1133.12 PM
D = 2.5*(1133.12)^0.35 = 29.3 Months
Average staff size = E / D = 1133.12 / 29.3 = approx. 39 persons
Productivity = KLOC / E = 200 / 1133.12 = approx. 0.1765 KLOC/PM (about 177 LOC/PM)
Empirical Estimation Models - COCOMO II Model
Intermediate COCOMO Model:
• The basic COCOMO model assumes that effort is only a function of the number of lines
of code and some constants determined for different classes of software systems. In
reality, other project attributes (product, hardware, personnel, and project factors) also
influence effort.
• The intermediate COCOMO model recognizes these facts and refines the initial
estimates obtained through the basic COCOMO model by using a set of 15 cost
drivers based on various attributes of software engineering.
Empirical Estimation Models - COCOMO II Model
Intermediate COCOMO Model:
Classification of Cost Drivers and their attributes:
(i) Product attributes -
•Required software reliability extent (RELY)
•Size of the application database (DATA)
•The complexity of the product (CPLX)
(ii) Hardware attributes -
•Run-time performance (execution time) constraints (TIME)
•Memory storage constraints (STOR)
•The volatility of the virtual machine environment (VIRT)
•Required turnabout time (TURN)
Empirical Estimation Models - COCOMO II Model
Intermediate COCOMO Model:

(iii) Personnel attributes -
•Analyst capability (ACAP)
•Software engineering capability (PCAP)
•Applications experience (AEXP)
•Virtual machine experience (VEXP)
•Programming language experience (LEXP)
(iv) Project attributes -
•Use of software tools (TOOL)
•Application of software engineering methods (MODP)
•Required development schedule (SCED)
Empirical Estimation Models - COCOMO II Model
Intermediate COCOMO Model: The cost drivers are divided into four categories:
Empirical Estimation Models - COCOMO II Model
Intermediate COCOMO Model:

EAF: Effort Adjustment Factor
It is calculated by multiplying the multiplier values obtained after rating each
of the 15 cost drivers. The adjusted effort is then:
Effort = EAF * a1*(KLOC)^a2 PM
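A brief sketch of how the EAF is applied, assuming Example 2's semidetached nominal effort; the four multiplier values shown are placeholders, not the published cost-driver tables.

```python
# Applying the Effort Adjustment Factor to a nominal (basic) effort estimate.
import math

multipliers = {"RELY": 1.15, "CPLX": 1.30, "ACAP": 0.86, "TOOL": 0.91}  # placeholder values
eaf = math.prod(multipliers.values())     # product of the rated cost-driver multipliers

nominal_effort = 3.0 * 200 ** 1.12        # semidetached, 200 KLOC (Example 2) ~= 1133 PM
adjusted_effort = nominal_effort * eaf

print(f"EAF = {eaf:.2f}, adjusted effort = {adjusted_effort:.0f} PM")
```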
Empirical Estimation Models - COCOMO II Model
Detailed COCOMO Model:

Detailed COCOMO incorporates all characteristics of the intermediate version
with an assessment of the cost driver's impact on each phase of the
software engineering process. The detailed model uses different effort
multipliers for each cost-driver attribute in each phase. In detailed COCOMO,
the whole software is divided into multiple modules; COCOMO is then applied
to the individual modules to estimate effort, and the module efforts are summed.
Empirical Estimation Models - COCOMO II Model
Detailed COCOMO Model:

The six phases of detailed COCOMO are:
1. Planning and requirements
2. System structure
3. Complete structure
4. Module code and test
5. Integration and test
6. Cost constructive model
The effort is determined as a function of program size, and a set of cost
drivers is applied according to every phase of the software life cycle.
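A minimal sketch of the module-by-module idea described above: apply a COCOMO effort equation to each module and sum the results. The module sizes are illustrative placeholders, and the basic organic coefficients are used for simplicity.

```python
# Detailed-COCOMO idea: estimate each module separately, then sum the effort.
# Module sizes are placeholders; basic organic coefficients are used for simplicity.

modules_kloc = {"ui": 12, "engine": 30, "reports": 8}

total_effort = sum(2.4 * kloc ** 1.05 for kloc in modules_kloc.values())

print(f"Total effort ~= {total_effort:.1f} person-months")
```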
