
Effort Estimation

Software Effort Estimation


Effort Estimation

 Estimating
 The process of forecasting or approximating the time and cost of completing project deliverables.
 The task of balancing the expectations of stakeholders and the need for control while the project is implemented.
 Types of Estimates
 Top-down (macro) estimates: analogy, group consensus, or mathematical relationships.
 Bottom-up (micro) estimates: estimates of elements of the work breakdown structure.
Which view is correct?
 Rough order of magnitude is good enough. Spending time
on detailed estimating wastes money
 Time is everything; our survival depends on getting there
first! Time and cost accuracy is not an issue.
 The project is internal. We don’t need to worry about cost.
 The uncertainty is so great, spending time and money on
estimates is a waste.
 The project is so small, we don’t need to bother with
estimates. Just do it.
 They used an internal estimate “for strategic decisions”
and then we had to live with it.
 We were burned once. I want a detailed estimate of every
task by the people responsible.
Macro versus Micro Estimating
Conditions for Preferring Top-Down or Bottom-up
Time and Cost Estimates

Condition                        Macro Estimates   Micro Estimates
Strategic decision making              X
Cost and time important                                  X
High uncertainty                       X
Internal, small project                X
Fixed-price contract                                     X
Customer wants details                                   X
Unstable scope                         X
Estimating Projects: Preferred
Approach

 Make rough top-down estimates.


 Develop the WBS/OBS.
 Make bottom-up estimates.
 Develop schedules and budgets.
 Reconcile differences between top-down and
bottom-up estimates
Estimating Guidelines for Times,
Costs, and Resources
1. Have people familiar with the tasks make the estimate.
2. Use several people to make estimates.
3. Base estimates on normal conditions, efficient
methods, and a normal level of resources.
4. Use consistent time units in estimating task times.
5. Treat each task as independent; don't aggregate.
6. Don’t make allowances for contingencies.
7. Adding a risk assessment helps avoid surprises to
stakeholders.
Refining Estimates
 Reasons for Adjusting Estimates
 Interaction costs are hidden in estimates.
 Normal conditions do not apply.
 Things go wrong on projects.
 Changes in project scope and plans.
 Adjusting Estimates
 Time and cost estimates of specific activities are
adjusted as the risks, resources, and situation
particulars become more clearly defined.
Refining Estimates (cont’d)
 Contingency Funds and Time Buffers
 Are created independently to offset uncertainty.
 Reduce the likelihood of cost and completion time
overruns for a project.
 Can be added to the overall project or to specific
activities or work packages.
 Can be determined from previous similar projects.
 Changing Baseline Schedule and Budget
 Unforeseen events may dictate a reformulation of the
budget and schedule.
Why Refine an Estimate?
Methods for Estimating Project
Times and Costs

 Macro (Top-down) Approaches
 Consensus methods
 Ratio methods
 Apportion method
 Function point methods for software and system projects
 Learning curves
(Diagram: these methods produce the project estimate of times and costs.)
Apportion Method of Allocating Project
Costs Using the Work Breakdown Structure
Methods for Estimating Project
Times and Costs (cont’d)

 Micro (Bottom-up) Approaches
 Template method
 Parametric procedures applied to specific tasks
 Detailed estimates for the WBS work packages
 Phase estimating: a hybrid
Duration vs. Effort vs. Productive Time
 Duration is the elapsed time in business working days
 Work effort is the labor required to complete an activity: typically the amount of focused and uninterrupted labor time
 Productive time considers the percentage of the work day that can be devoted to project activity work. Estimates in IT have ranged from 66-75%; more recent estimates (same client base) are about 50-65%. This doesn't include unexpected interruptions!
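The effort-to-duration conversion above can be sketched in a few lines. This is an illustrative sketch; the 8-hour day and 60% productive fraction below are assumed values for the example, not figures from the slide.

```python
import math

def duration_days(effort_hours, hours_per_day=8, productive_fraction=0.6):
    """Elapsed working days needed to deliver a given work effort,
    when only a fraction of each day is productive project time."""
    return math.ceil(effort_hours / (hours_per_day * productive_fraction))

# 80 hours of focused work effort at 60% productive time:
print(duration_days(80))            # 17 working days
print(duration_days(80, 8, 1.0))    # vs. 10 if every hour counted
```

The gap between 10 and 17 days is exactly the duration-versus-effort distinction the slide makes.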
Elapsed time vs. work time
Software Cost Estimation
•What is the Problem?

•100-200% cost overruns are not uncommon

•15% of large projects never deliver anything

•31% of new IS projects are cancelled before completion ($81 billion)


•What are the consequences?

•Economic

•Technical

•Managerial

•What is gained through effective software cost-estimation?


•schedule/staffing estimates
•better understanding of a particular project
Why are we bad at software
estimation?
•Complexity

•Infrequency

•Uniqueness

•Underestimation bias

•Goals, not estimates


Basic Steps in Software Estimation

 Identify project objectives and requirements


 Plan the activities
 Estimate product size and complexity
 Estimate effort, cost and resources
 Develop projected schedule
 Compare and iterate estimates
 Follow up
Software Cost-Estimation Methods

 algorithmic
 expert judgement
 similar, completed projects
 equate to available resources
 Price-to-win
 Top-down (global estimate)
 Bottom-up (each component separately estimated)
Algorithmic
Models

COCOMO           TRW (Boehm)
ESTIMACS         Computer Associates (Rubin)
ESTIPLAN         AGS Management Systems
FAST             Freiman Parametric Systems (Freiman)
FUNCTION POINTS  IBM (Albrecht)
MAINSTAY         Mainstay Software Corporation
PRICE            RCA
SLIM             QSM (Putnam)
SOFTCOST-R       Reifer Consultants (Tausworthe)
SPQR             Software Productivity Research (Jones)
Basic Algorithmic Form

Effort = constant + coefficient*(size metric) +
         coefficient*(cost driver 1) +
         coefficient*(cost driver 2) +
         coefficient*(cost driver 3) +
         ...

size metric:
 lines of code ('new' versus 'old' lines of code)
 function points
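A minimal sketch of this linear form in Python. The constant, size coefficient, and cost-driver coefficients below are invented for illustration; in a real model they would come from calibration against historical project data.

```python
# All numbers here are hypothetical, not from any calibrated model.
CONSTANT = 2.0
SIZE_COEFF = 3.0                      # person-months per KLOC
DRIVER_COEFFS = {"complexity": 4.0,   # example cost drivers
                 "team_experience": -2.5,
                 "schedule_pressure": 1.5}

def effort_pm(size_kloc, driver_ratings):
    """Effort = constant + coefficient*size + sum of coefficient*driver."""
    return (CONSTANT + SIZE_COEFF * size_kloc
            + sum(DRIVER_COEFFS[d] * r for d, r in driver_ratings.items()))

print(round(effort_pm(10, {"complexity": 1.2,
                           "team_experience": 1.0,
                           "schedule_pressure": 0.8}), 1))   # 35.5
```

Note the negative coefficient: a capable, experienced team reduces estimated effort, while complexity and schedule pressure increase it.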
SLOC as an Estimation Tool

 Why used?
 early systems emphasis on coding

 Criticisms
 cross-language inconsistencies

 within language counting variations

 change in program structure can affect count

 stimulates programmers to write lots of code

 system-oriented, not user-oriented


How many Lines of Code in this program?

#define LOWER 0    /* lower limit of table */
#define UPPER 300  /* upper limit */
#define STEP  20   /* step size */

main() /* print a Fahrenheit-Celsius conversion table */
{
    int fahr;
    for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
        printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
}
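To see how the counting rules change the answer, here is a sketch in Python. The "logical" convention below (semicolon-terminated statements plus preprocessor directives) is an assumption for illustration; real SLOC counters use far more elaborate rules.

```python
KR_SNIPPET = r'''#define LOWER 0
#define UPPER 300
#define STEP 20

main()
{
    int fahr;
    for (fahr = LOWER; fahr <= UPPER; fahr = fahr + STEP)
        printf("%4d %6.1f\n", fahr, (5.0/9.0)*(fahr-32));
}'''

def count_sloc(source):
    """Three different 'lines of code' answers for the same program."""
    lines = source.splitlines()
    nonblank = [ln for ln in lines if ln.strip()]
    # One common convention: statements end in ';', directives start with '#'.
    # Note it even counts the for-loop header's two semicolons as statements,
    # exactly the kind of within-language counting variation criticized above.
    logical = (sum(ln.count(";") for ln in nonblank)
               + sum(1 for ln in nonblank if ln.lstrip().startswith("#")))
    return {"physical": len(lines), "nonblank": len(nonblank),
            "logical": logical}

print(count_sloc(KR_SNIPPET))   # three different answers: 10, 9, 7
```

Three defensible conventions, three different counts: this is the "within language counting variations" criticism in miniature.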
COCOMO Cost Drivers
 Required software reliability
 Database size
 Product complexity
 Computer execution time constraint
 Computer storage constraint
 Computer turnaround time
 Analyst capability
 Programmer capability
 Application experience
 Hardware/software experience
 Programming language experience
 Use of modern programming practices
 Use of software tools
 Required development schedule
Algorithmic Model Conclusions

 Algorithmic Models can do a good job in estimating


required effort
• Good project data must be collected and analyzed in
order to derive useful algorithms
• Calibration is essential as the specific environment
is critically important

 Effort estimates do have other uses


 Productivity evaluation of project teams or software
development technologies
 Objective negotiating tool with users in changes in
scope and impact on budget/schedule
Function Count Systems View:
Functionality Types

(Diagram: Inputs, Outputs, and Queries flow between users and the application; Internal Files sit inside the application; Interface Files are shared with other applications.)
Function
Points
History
 Non-code-oriented size measure
 Developed at IBM by A. Albrecht (1979; revised 1983)
 Now in use by more than 500 organizations worldwide

What are they?


 5 weighted functionality types
 14 complexity factors
Functionality Types

(Diagram: the external user interacts with the application through input, output, and inquiry types that cross the Application Boundary; Internal Logical Files sit inside the boundary; External Interface Files are shared with Other Applications.)


Processing Complexity Adjustment

Each of the 14 factors is rated on a scale:
Not present = 0
Incidental influence = 1
Moderate influence = 2
Average influence = 3
Significant influence = 4
Strong influence = 5

1) data communications
2) distributed functions
3) performance
4) heavily used configuration
5) transaction rate
6) on-line data entry
7) end user efficiency
8) on-line update
9) complex processing
10) reusability
11) installation ease
12) operational ease
13) multiple sites
14) facilitates change
Function Point
Calculation

Function Counts:   FC = sum(i=1..5) sum(j=1..3) x_ij * w_ij

Function Points:   FP = FC * (0.65 + 0.01 * sum(k=1..14) c_k)

where
x_ij = number of functions of type i at complexity level j
w_ij = weight for functionality type i at complexity level j
c_k = complexity factor k (rated 0-5)
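The two formulas can be sketched directly in Python. The weight matrix is the one shown in the unadjusted-FP tables later in these slides; the three calls illustrate the +/- 35% range of the complexity adjustment.

```python
# Weights w_ij for the 5 functionality types at the 3 complexity levels
# (from the unadjusted-FP tables in these slides).
WEIGHTS = {"EI":  {"low": 3, "avg": 4,  "high": 6},
           "EO":  {"low": 4, "avg": 5,  "high": 7},
           "EQ":  {"low": 3, "avg": 4,  "high": 6},
           "ILF": {"low": 7, "avg": 10, "high": 15},
           "EIF": {"low": 5, "avg": 7,  "high": 10}}

def function_counts(x):
    """FC = sum over types i and complexity levels j of x_ij * w_ij."""
    return sum(x[t][c] * WEIGHTS[t][c] for t in x for c in x[t])

def function_points(fc, c):
    """FP = FC * (0.65 + 0.01 * sum of the 14 complexity factors c_k)."""
    assert len(c) == 14 and all(0 <= ck <= 5 for ck in c)
    return fc * (0.65 + 0.01 * sum(c))

# The adjustment can move FC by +/- 35%:
print(round(function_points(100, [0] * 14), 2))   # 65.0  (all factors absent)
print(round(function_points(100, [3] * 14), 2))   # 107.0 (all factors average)
print(round(function_points(100, [5] * 14), 2))   # 135.0 (all factors strong)
```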
 Need to track employees and their work
- Add, change, delete, queries, and reports
- Two types of employees, salaried and hourly

 Employees can have more than one job assignment

 Standard job descriptions are retained by system

 Employees can have more than one location and locations can have more than one employee
- Another system stores the location data
Detailed Function Point Counting Rules

(1) Internal Logical Files (ILFs) Rules:


 Each major logical group of user data or control
information
 Data is generated, used and maintained by the
application
In Practice:
 Count at logical (external design) level
 In DB environment generally a relational table = a logical
file (before extensive normalization)
 Ignore multiple views
Detailed Function Point Counting Rules

(2) External Interface Files (EIFs) Rules:


 Files passed or shared between
applications
 Reference data only (not transactions)

In Practice:
 Look for “read only” usage

 Count special database extracts


Example - ILFs and EIFs

 Employee - entity type


- Employee name
- SSN
- Number of dependents
- Type (salary or hourly)
- Location name (foreign key)
 Salaried employee - entity subtype
- Supervisory level
 Hourly employee - entity subtype
- Standard Hourly rate
- Collective Bargaining Unit Number
Example - ILFs and EIFs

 Job - entity type


- Job name
- Job number
- Pay grade
 Job Assignment - entity type
- Effective Date
- Salary
- Performance Rating
- Job Number (foreign key)
- Employee SSN (foreign key)
 Job Description
- Job Number (foreign key)
- Line number (not known to users)
- Description line
Example - ILFs and EIFs

 Location - entity --maintained in another system


- Location Name
- Address
- Employee SSN (foreign key)

 COUNTING STEPS:
- Count number of ILFs and EIFs
- Assign them a complexity weighting
Counting ILFs and EIFs
 Three ILFs:
- Employee
- Job
- Job Assignment
- not Job Description (logically part of Job)
- not Location (an EIF)
- not Salaried Employee (a Record Element Type)
- not Hourly Employee (a Record Element Type)

 One EIF:
- Location
Counting ILFs/EIFs - Complexity
Record Element       Data Element Types (DETs)
Types (RETs)         1-19      20-50     51+
<2                   Low       Low       Average
2-5                  Low       Average   High
>5                   Average   High      High

Three ILFs:
•Employee - 8 DETs and 2 RETs
•Job - 4 DETs and 1 RET
•Job Assignment - 5 DETs and 1 RET
One EIF: Location - 3 DETs and 1 RET
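The classification above can be sketched as a small table lookup (the matrix is the one in this slide; the example files all come out Low, matching the worked example):

```python
def file_complexity(dets, rets):
    """ILF/EIF complexity from the RET x DET matrix above."""
    det_band = 0 if dets <= 19 else (1 if dets <= 50 else 2)
    ret_band = 0 if rets < 2 else (1 if rets <= 5 else 2)
    return [["Low", "Low", "Average"],
            ["Low", "Average", "High"],
            ["Average", "High", "High"]][ret_band][det_band]

for name, dets, rets in [("Employee", 8, 2), ("Job", 4, 1),
                         ("Job Assignment", 5, 1), ("Location", 3, 1)]:
    print(name, file_complexity(dets, rets))   # all four are Low
```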
ILF and EIF Unadjusted FPs

                           Low     Average   High
External Input             x3      x4        x6
External Output            x4      x5        x7
Internal Logical File      3 x7    x10       x15
External Interface File    1 x5    x7        x10
External Inquiry           x3      x4        x6
Detailed Function Point Counting Rules

(3) External Inputs (EIs) Rules:


 Each unique user data/control type that enters
application
 Adds/Changes/Deletes data in Internal logical file
 Each transaction type is an external input

In Practice:
 Not necessarily equal to screens
 Don’t confuse with inquiries (no change to data)
Counting EIs - Raw Data

 Employee Maintenance
- Add, change, delete Employee
- Employee Inquiry; Employee Report
 Job Maintenance
- Add, change, delete Job
- Job Inquiry; Job Report
 Job Assignment Maintenance
- Assign Employee to Job
- Transfer Employee
- Evaluate Employee
- Delete Assignment
- Job Assignment Inquiry; Job Assignment Report
 Location Reporting
- Location Inquiry; Location Report
Counting EIs - Complexity

File Types           Data Element Types (DETs)
Referenced (FTRs)    1-4       5-15      15+
<2                   Low       Low       Average
2                    Low       Average   High
>2                   Average   High      High
Example EIs (3 of 10):
• Create Employee - 10 DETs, 2 FTRs (Employee and Location) => Average
• Delete Employee - 3 DETs and 1 FTR => Low
• Assign Employee to Job - 6 DETs and 3 FTRs (Employee, Job and Job Assignment) => High
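The same kind of lookup works for External Inputs, reproducing the three example classifications above:

```python
def ei_complexity(dets, ftrs):
    """External Input complexity from the FTR x DET matrix above."""
    det_band = 0 if dets <= 4 else (1 if dets <= 15 else 2)
    ftr_band = 0 if ftrs < 2 else (1 if ftrs == 2 else 2)
    return [["Low", "Low", "Average"],
            ["Low", "Average", "High"],
            ["Average", "High", "High"]][ftr_band][det_band]

print(ei_complexity(10, 2))   # Average  (Create Employee)
print(ei_complexity(3, 1))    # Low      (Delete Employee)
print(ei_complexity(6, 3))    # High     (Assign Employee to Job)
```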
External Input (EI) Unadjusted FPs

                           Low     Average   High
External Input             6 x3    2 x4      2 x6
External Output            x4      x5        x7
Internal Logical File      x7      x10       x15
External Interface File    x5      x7        x10
External Inquiry           x3      x4        x6
Detailed Function Point Counting Rules

(4) External Outputs (EOs) Rules:


 Each unique user data/control type that exits
application
 Unique means different format or processing logic
 Can be sent directly to users as reports/messages, or
to other applications as a file

In Practice:
 Processing must be involved (don’t count output
response to an inquiry)
 Detail and summary outputs count separately
Counting EOs - Raw Data
 Employee Maintenance
- Add, change, delete Employee
- Employee Inquiry; Employee Report - 6-19 DETs
 Job Maintenance
- Add, change, delete Job
- Job Inquiry; Job Report - 5 DETs
 Job Assignment Maintenance
- Assign Employee to Job
- Transfer Employee
- Evaluate Employee
- Delete Assignment
- Job Assignment Inquiry; Job Assignment Report
 Location Reporting
- Location Inquiry; Location Report - 6-19 DETs
Counting EOs - Complexity

File Types           Data Element Types (DETs)
Referenced (FTRs)    1-5       6-19      20+
<2                   Low       Low       Average
2-3                  Low       Average   High
>3                   Average   High      High
Example EOs:
• Employee Report - 6-19 DETs, 2 FTRs (Employee and Location) => Average
• Job Report - 5 DETs and 1 FTR => Low
• Job Assignment Report - 6-19 DETs, 3 FTRs (Employee, Job and Job Assignment) => Average
External Output Unadjusted FPs

                           Low     Average   High
External Input             x3      x4        x6
External Output            1 x4    3 x5      x7
Internal Logical File      x7      x10       x15
External Interface File    x5      x7        x10
External Inquiry           x3      x4        x6
Detailed Function Point Counting Rules

(5) External Inquiry (EQ) Rules:


 Each unique input/output combination where an input
causes and generates an immediate output
 Unique means different format or processing logic

In Practice:
 No processing involved. If result is calculated or derived
field, then it is an input and an output
 Help systems typically counted as external inquiry
 Rate complexity as the higher of the input/output value
Counting EQs - “Medium Cooked” Data

 Employee Maintenance
- Employee Inquiry - 2 FTRs and 9 DETs (output)
 Job Maintenance
- Job Inquiry - 1 FTR and 4 DETs (output)
 Job Assignment Maintenance
- Job Assignment Inquiry - 1 FTR and 5 DETs (output)
 Location Reporting
- Location Inquiry - 2 FTRs and 5 DETs (output)

RESULT - Use the EI and EO matrices => 3 low complexity and 1 average (Employee)
EQ Unadjusted FPs

                           Low     Average   High
External Input             x3      x4        x6
External Output            x4      x5        x7
Internal Logical File      x7      x10       x15
External Interface File    x5      x7        x10
External Inquiry           3 x3    1 x4      x6
Total Unadjusted Function Points

                           Low     Average   High
External Input             6 x3    2 x4      2 x6
External Output            1 x4    3 x5      x7
Internal Logical File      3 x7    x10       x15
External Interface File    1 x5    x7        x10
External Inquiry           3 x3    1 x4      x6

Total = 96 Unadjusted FPs
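The total can be checked with a few lines of Python, using the counts and weights exactly as in the table above:

```python
# (low, average, high) weights and example counts per functionality type.
WEIGHTS = {"EI": (3, 4, 6), "EO": (4, 5, 7), "ILF": (7, 10, 15),
           "EIF": (5, 7, 10), "EQ": (3, 4, 6)}
COUNTS = {"EI": (6, 2, 2), "EO": (1, 3, 0), "ILF": (3, 0, 0),
          "EIF": (1, 0, 0), "EQ": (3, 1, 0)}

total = sum(n * w for t in COUNTS for n, w in zip(COUNTS[t], WEIGHTS[t]))
print(total)   # 96 unadjusted function points
```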


Are Function Points a “Silver Bullet”?

"The function-point metric, like LOC, is relatively controversial... Opponents claim that the method requires some 'sleight of hand' in that computation is based on subjective, rather than objective, data..."
    R. Pressman, Software Engineering, p. 94

"Variants in FP counting methodologies can result in variances of up to +/- 50%."
    Capers Jones, Selecting a FP Counting Method

"Within organizations the variation in function point counts about the mean appears to be within 30%..."
    G. Low and D.R. Jeffery, IEEE TSE, Jan. 1990
Software Estimating Rules of
Thumb
 Rule 1: One function point = 100 logical source code
statements (procedural languages)
 300 for assembly languages, < 20 for some OO languages
 Rule 2: Raising the number of function points to the
1.15 power predicts the approximate page counts
for paper documents associated with software
projects
 Rule 3: Creeping user requirements will grow at an average rate of 1% per month over the development schedule
 For a 2-year project, functionality at delivery will be 24% larger than when requirements were collected.
Software Estimating Rules of
Thumb (continued)
 Rule 4: Raising the number of function points to 1.2
power predicts the approximate number of test
cases created.
 Assume each test case will be executed about 4 times
 Rule 5: Raising the number of function points to the
1.25 power predicts the approximate defect potential
for new software projects
 Defect potential is the sum of bugs (errors) in requirements, design, coding, and user documentation, plus bad fixes (secondary errors introduced while fixing prior errors).
 For enhancements: raise to 1.27 power
Software Estimating Rules of
Thumb (continued)
 Rule 6: Each software review, inspection, or
test step will find and remove 30% of the
bugs that are present
 Implies 6-12 consecutive defect-removal
operations to achieve high-quality software
 Rule 7: Raising the number of function points
to the .4 power predicts the approximate
development schedule in calendar months.
 Longer for military projects; for enhancements
applies to size of enhancement (not base product)
Software Estimating Rules of
Thumb (continued)
 Rule 8: Dividing the number of function points
by 150 predicts the approximate number of
personnel for the application
 Includes software developers, QA, testers,
technical writers, DBAs, project managers
 Rule 9: Dividing the number of function points
by 500 predicts the approximate number of
maintenance personnel
 Raising function point to .25 power predicts
approximate number of years the application will
stay in use
Software Estimating Rules of
Thumb (continued)
 Rule 10: Multiply software development
schedules by number of personnel to predict
the approximate number of staff months of
effort.
 1000 function points raised to .4 = 16 calendar
months
 1000 function points / 150 = 6.6 full time staff
 16 * 6.6 = 106 staff months to build project
Software Estimating Rules of
Thumb (continued)
 Staff month: 22 working days with 6
productive work hours each day
 132 work hours per month

 Capers Jones (IEEE Computer, March 1996) notes the limitations of these types of heuristics
