Estimation
4/6/2017
Boss with spreadsheet…
Fundamental estimation questions
What is the size of the software to be developed?
How much effort is required to complete an activity?
How much calendar time is needed to complete an activity?
What is the total cost of an activity?
Estimations and Scheduling
[Diagram: Size Estimation feeds Effort Estimation and Duration Estimation; effort drives Cost Estimation and Staffing Estimation; duration drives Scheduling.]
Software cost components
Hardware and software costs.
Travel and training costs.
Effort costs:
The dominant factor in most projects.
The salaries of engineers involved in the project.
Overheads:
Costs of buildings, heating, lighting.
Costs of networking and communications.
Costs of shared facilities (e.g., library, staff restaurant, etc.).
Rule of thumb: overheads can be as much as the effort costs.
Why overheads bloat up…
And how managers handle it…
Computing Overhead Expenses…
Person Month
Suppose a project is estimated to take 300 person-months to develop: can 300 people complete it in one month?
No? Why not?
Person-month put into practice…
Why Person-Month and not Person-Days or Person-Years?
Modern projects typically take a few months to complete…
Mythical Man-Month
"Cost varies as the product of men and months; progress does not."
Hence the man-month as a unit for measuring the size of a job is a dangerous and deceptive myth.
Reason for Rule 1
Mythical Man-Month
Optimism
"All programmers are optimists"
1st false assumption: "all will go well" or "each task takes only as long as it 'ought' to take"
The Fix: consider the larger probabilities.
Cost (overhead) of communication (and training)
His formula: n(n-1)/2
How long does a 12-person-month project take?
1 person = 12 months
2 persons = 7 months (2 person-months extra)
3 persons = 5 months (3 person-months extra)
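Brooks' n(n-1)/2 formula is easy to sketch; this illustrative snippet (not from the slides) shows how quickly communication channels grow as people are added:

```python
def communication_paths(n: int) -> int:
    """Brooks' formula: number of pairwise communication channels among n people."""
    return n * (n - 1) // 2

# Overhead grows quadratically: each new person must talk to everyone already there.
print(communication_paths(2))   # 1 channel
print(communication_paths(3))   # 3 channels
print(communication_paths(10))  # 45 channels
```

This is why adding people to a late project can make it later: the work divides by n, but the coordination cost grows with n(n-1)/2.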
Mythical Man-Month
Q: "How does a project get to be a year late?"
A: "One day at a time."
Continued attention to meeting small individual milestones is required at each level of management.
Other Fixes:
No "fuzzy" milestones
Identify the "true status"
Take corrective action
How Projects Become Further Late…
Brooks' Other Ideas…
The Second-System Effect:
The second system a team designs is the most dangerous system they will ever design…
It tends to incorporate all the additions that were thought of, but not added due to the inherent time constraints, in the first system.
The team should be mindful that it is susceptible to over-engineering.
Costing and pricing
Estimates are made to discover the cost of producing a software system.
However, there is no simple relationship between the development cost and the price charged to the customer.
Cost Estimation Process
[Diagram: Requirements feed the estimation process, which produces effort, development time, and number of personnel.]
Software pricing factors
Market opportunity: A development organisation may quote a low price because it wishes to move into a new segment of the software market. Accepting a low profit on one project may give the opportunity of more profit later. The experience gained may allow new products to be developed.
Cost estimate uncertainty: If an organisation is unsure of its cost estimate, it may increase its price by some contingency over and above its normal profit.
Contractual terms: A customer may be willing to allow the developer to retain ownership of the source code and reuse it in other projects. The price charged may then be less than if the software source code is handed over to the customer.
Requirements volatility: If the requirements are likely to change, an organisation may lower its price to win a contract. After the contract is awarded, high prices can be charged for changes to the requirements.
Financial health: Developers in financial difficulty may lower their price to gain a contract. It is better to make a smaller than normal profit or break even than to go out of business.
Giving the Estimate to the Boss…
Some problems with estimating
Subjective nature of much of estimating:
It may be difficult to produce evidence to support your precise target.
Political pressures:
Managers may wish to reduce estimated costs in order to win support for acceptance of a project proposal.
Changing technologies:
These bring uncertainties.
Projects differ:
Experience on one project may not be applicable to another.
Pitfalls of underestimating…
Words of wisdom
[Figure: the cone of uncertainty. Early estimates range widely (roughly 0.25x to 2x on one scale, 0.6x to 1.25x on the other), converging to 1.0x as the project moves through initial product definition, approved product definition, requirements specification, architecture design, detailed design, and product completion.]
A Case of Poor Estimation…
Over- and under-estimating
Parkinson's Law: "Work expands to fill the time available."
Weinberg's Zeroth Law of Reliability: "A software project that does not have to meet a reliability requirement can meet any other requirement."
Underestimate:
Advantage: no overspend.
Disadvantage: the system is usually unfinished.
Effect of Underestimation
Research results confirm:
Basis for successful estimating
Information about past projects:
Need to collect performance details about past projects: how big were they? How much effort/time did they need?
Need to be able to measure the amount of work involved.
The traditional size measurement for software is 'lines of code', but it is known to have problems.
Off-hand estimations…
Which view is correct?
Rough order of magnitude is good enough.
Spending time on detailed estimating wastes money.
Words of wisdom
Refining Estimates
Reasons for Adjusting Estimates:
Interaction costs are hidden in estimates.
Normal conditions do not apply.
Things go wrong on projects.
Changes in project scope and plans.
Adjusting Estimates:
Time and cost estimates of specific activities are adjusted as the risks, resources, and situation particulars become more clearly defined.
A taxonomy of estimating methods
Price to win
Pricing to win
The project costs whatever the customer can spend on it.
Advantage: you get the contract.
Disadvantages: costs do not accurately reflect the work required. Either:
(1) the customer does not get the desired system, or
(2) the customer overpays.
Pricing to win
This approach may seem unethical and unbusinesslike…
However, when detailed information is lacking, it may be the only appropriate strategy…
Which is the most ethical approach?
The project cost is agreed on the basis of an outline proposal, and the development is constrained by that cost.
A detailed specification may be negotiated, or an evolutionary approach used for system development.
Parameters to be Estimated
Size is a fundamental measure of work.
Based on the estimated size, two parameters are estimated: Effort and Duration.
[Diagram: Size drives both the Effort estimate and the Duration estimate.]
KSLOC ≡ Thousands of Source Lines of Code
NCKSLOC ≡ New or Changed KSLOC
LOC: A few things are counter-intuitive…
The lower the level of the language, the more productive the programmer appears:
The same functionality takes more code to implement in a lower-level language than in a high-level language, so an LOC-based productivity measure rewards verbosity.
Weighted average estimates
Weighted average estimating is also known as sensitivity analysis estimating.
Three estimates are obtained rather than one:
Best case (O = Optimistic), worst case (P = Pessimistic), and most likely (M).
This provides a more accurate estimate than when only one estimate is used.
These are then used in the following formula:
Estimated effort = (O + 4M + P) / 6
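The three-point formula can be sketched directly (the function name and sample figures are my own):

```python
def pert_estimate(o: float, m: float, p: float) -> float:
    """Weighted average of optimistic, most likely and pessimistic estimates."""
    return (o + 4 * m + p) / 6

# E.g. a task estimated at 10 days best case, 14 most likely, 30 worst case:
print(pert_estimate(10, 14, 30))  # (10 + 56 + 30) / 6 = 16.0 days
```

The 4x weight on M pulls the result toward the most likely value while still letting a long pessimistic tail raise the estimate.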
Consensus estimating
Steps in conducting a consensus estimating session:
A briefing is provided to the estimating team on the project.
Each person is provided with a list of work components to estimate.
Each person independently estimates O, M and P for each work component.
The estimates are written up on the whiteboard.
Each person discusses the basis and assumptions for their estimates.
A revised set of estimates is produced.
Averages for the O, M and P values are calculated.
These values are used in the formula.
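The final averaging step above can be sketched as follows (the estimators' triples are hypothetical):

```python
def consensus_estimate(triples):
    """Average each estimator's (O, M, P) triple, then apply (O + 4M + P) / 6."""
    n = len(triples)
    o = sum(t[0] for t in triples) / n
    m = sum(t[1] for t in triples) / n
    p = sum(t[2] for t in triples) / n
    return (o + 4 * m + p) / 6

# Three estimators' (O, M, P) values, in days, for one work component:
print(consensus_estimate([(2, 5, 16), (4, 7, 14), (3, 6, 15)]))  # 7.0
```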
Expert judgment
One or more experts predict software costs.
The process iterates until some consensus is reached.
Advantages: relatively simple estimation method; can be accurate if experts have direct experience of similar systems.
Disadvantages: very inaccurate if there are no experts available!
Expert Judgment Method: Steps
1. Coordinator presents each expert with a specification and an estimation form.
2. Coordinator calls a group meeting in which the experts discuss estimation issues with the coordinator and each other.
3. Experts fill out forms anonymously.
4. Coordinator prepares and distributes a summary of the estimation on an iteration form.
5. Coordinator calls a group meeting, focusing especially on having the experts discuss points where their estimates varied widely.
6. Experts fill out forms, again anonymously, and steps 4 to 6 are iterated for as many rounds as appropriate.
Expert Judgement: Cons
Hard to quantify:
It is hard to document the factors used by the experts or expert group.
Experts may be biased, optimistic, or pessimistic, even though such biases tend to be reduced by the group consensus.
The expert judgment method often complements other cost estimating methods, such as algorithmic methods.
Expert judgement
An expert is familiar with and knowledgeable about the application area and the technologies.
Particularly appropriate where existing code is to be modified.
Research shows that expert judgement in practice tends to be based on analogy…
Stages
Identify significant features of the current project.
Possibilities include:
The type of application domain,
The number of inputs, the number of distinct entities referenced,
The number of screens, and so forth.
Estimating by analogy
The effort of the most similar source case (a past project) is used as the estimate for the target project.
Source and target projects are compared on attribute values, e.g. the number of inputs (I) and the number of outputs (O).
Euclidean distance = sqrt((It - Is)² + (Ot - Os)²)
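Picking the nearest source project by Euclidean distance might look like this (the projects, attributes and effort figures are hypothetical):

```python
import math

def distance(target, source):
    """Euclidean distance over numeric attributes, e.g. (inputs, outputs)."""
    return math.sqrt(sum((t - s) ** 2 for t, s in zip(target, source)))

target = (7, 15)                      # (inputs, outputs) of the new project
sources = {"proj_a": ((8, 17), 150),  # attribute values and known effort (days)
           "proj_b": ((5, 10), 90)}

# The closest past project supplies the effort estimate:
nearest = min(sources, key=lambda k: distance(target, sources[k][0]))
print(nearest, sources[nearest][1])
```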
Delphi Estimation
A variation of the basic expert judgment technique.
A team of experts and a coordinator.
Experts carry out estimation independently:
They mention the rationale behind their estimation.
The coordinator notes down any extraordinary rationale:
And circulates it among the experts.
Experts re-estimate.
Experts never meet each other to discuss their viewpoints.
Types of Estimation Techniques
Though there are many techniques of estimating, they can broadly be classified into:
Top-down
Bottom-up
What about:
Algorithmic models?
Expert opinion?
Analogy?
Price to win?
Bottom-up versus top-down
Bottom-up:
Identify all tasks that have to be done, so quite time-consuming.
Use when you have no data about similar past projects.
Top-down:
Produce an overall estimate based on project cost drivers and past project data.
Divide the overall estimate between the jobs to be done.
Bottom-up estimating
1. Break the project activities into smaller and smaller components.
Stop when you get to what one person can do in one or two weeks.
2. Distribute proportions of the overall estimate to the components, e.g. with a 100-day overall estimate: design 30% (30 days), code 30% (30 days), test 40% (40 days).
Top-down Example
Algorithmic Models
For project planning, we need:
Effort (cost)
Duration
It is hard to estimate effort (or cost) or duration directly from a problem description.
Effort and duration can be measured in terms of certain project characteristics that correlate with them:
Called size.
Software Size
What exactly is the size of a software project?
How do you measure it?
One answer: function points.
Algorithmic/Parametric models
COCOMO (lines of code) and function points are examples of these.
A problem with LOC-based models (COCOMO etc.):
You must first guess the LOC (guess → algorithm → estimate).
Bottom-up Estimating: Pro
It permits the software group to estimate in an almost traditional fashion:
[Diagram: a 'system model size' built up from counts such as the number of file types, the numbers of input and output transaction types, and configuration management items.]
Parametric model for Effort
Effort is modelled as a function of size and productivity factors.
Parametric models for Size
We shall now examine four parametric models more closely:
1. Albrecht/IFPUG function points
IFPUG benefits:
• Networking with other counters
• IFPUG Counting Practices Manual
• Research projects
• Hotline
• Newsletter
• Certification
Extent of usage:
• Over 1200 members in more than 30 countries
• Member companies include all industry sectors
Albrecht/IFPUG function points
Albrecht worked at IBM:
Needed a way of measuring the relative productivity of different programming languages.
Needed some way of measuring the size of an application without counting lines of code.
[Diagram: the application boundary separates the system from other applications; the counted types are Internal Logical Files, External Interface Files, input types, output types, and inquiry types.]
Albrecht/IFPUG function points - continued
Five function types
1. Logical interface file (LIF) types
Equates roughly to a data store in systems analysis terms. Created and accessed by the target system.
External Output (Outputs): count each unique user data or control output type that leaves the external boundary of the software system being measured.
Internal Logical File (Files): count each major logical group of user data or control information in the software system as a logical internal file type. Include each logical file (e.g., each logical group of data) that is generated, used, or maintained by the software system.
[Example diagrams: an External Input, a multi-screen on-line entry transaction that updates customer information; and an External Output, a categorize-customer process that produces a category summary for the end user.]
Definition Of An Inquiry
An External Inquiry (EQ) is an output that results in data retrieval. The result contains no derived data.
[Example diagram: the end user selects a customer, and a display-customer-info process retrieves the record from the customer info file.]
Definition Of An ILF
An Internal Logical File (ILF) is a user-identifiable group of logically related data that is maintained within the boundary of the application.
[Example diagram: the end user runs an update-customer-info process that maintains the customer info file.]
Example
Place Purchase Order:
Input data items: Date, Supplier Number, Product Code, Quantity Required, Date Required
Output data items: PO Number (system generated)
Entities referenced: Product, Purchase Order, Supplier, Purchase Order Item
Calculating the System Size
For each function count:
Number of input data items, ni
Number of output data items, no
Number of entities read/updated, ne
Function Point: Refinement
14 General Systems Characteristics are evaluated and used to compute a Value Adjustment Factor (VAF), e.g. VAF = 1.07.
Example 2
Technical Complexity Factors:
1. Data Communication 3
2. Distributed Data Processing 0
3. Performance Criteria 4
4. Heavily Utilized Hardware 0
5. High Transaction Rates 3
6. Online Data Entry 3
7. Online Updating 3
8. End-user Efficiency 3
9. Complex Computations 0
10. Reusability 3
11. Ease of Installation 3
12. Ease of Operation 5
13. Portability 3
14. Maintainability 3
TDI = 30 (Total Degree of Influence)
Example 1 cont…
Function Points:
FP = UFP × (0.65 + 0.01 × TDI) = 55 × (0.65 + 0.01 × 30) = 52.25
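The adjustment formula can be expressed directly; a sketch using the numbers above:

```python
def adjusted_fp(ufp: float, tdi: int) -> float:
    """FP = UFP x (0.65 + 0.01 x TDI), where TDI is the total degree of influence."""
    return ufp * (0.65 + 0.01 * tdi)

print(adjusted_fp(55, 30))  # approximately 52.25
```

Note that TDI ranges over 0 to 70, so the multiplier (0.65 + 0.01 × TDI) can swing the unadjusted count from 0.65x to 1.35x.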
Example 3
A payroll application has:
1. A transaction to input, amend and delete employee details – an EI rated of medium complexity
2. A transaction that calculates pay details from timesheet data that is input – an EI of high complexity
3. A transaction of medium complexity that prints out pay-to-date details for each employee – an EO
4. A file of payroll details for each employee – assessed as a LIF of medium complexity
5. A personnel file maintained by another system, accessed for name and address details – a simple EIF
What would be the FP counts for these?
FP counts
External user type | Low complexity | Medium complexity | High complexity
EI  | 3 | 4  | 6
EO  | 4 | 5  | 7
EQ  | 3 | 4  | 6
LIF | 7 | 10 | 15
EIF | 5 | 7  | 10

1. Medium EI = 4 FPs
2. High complexity EI = 6 FPs
3. Medium complexity EO = 5 FPs
4. Medium complexity LIF = 10 FPs
5. Simple EIF = 5 FPs
Total = 30 FPs
If previous projects delivered 5 FPs a day, implementing the above should take 30/5 = 6 days.
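The table lookup and totalling can be sketched as follows, using the weights above:

```python
FP_WEIGHTS = {  # rows: function type; columns: (low, medium, high) complexity
    "EI": (3, 4, 6), "EO": (4, 5, 7), "EQ": (3, 4, 6),
    "LIF": (7, 10, 15), "EIF": (5, 7, 10),
}
LEVEL = {"low": 0, "medium": 1, "high": 2}

def count_fps(components):
    """Sum the weight for each (type, complexity) pair."""
    return sum(FP_WEIGHTS[t][LEVEL[c]] for t, c in components)

payroll = [("EI", "medium"), ("EI", "high"), ("EO", "medium"),
           ("LIF", "medium"), ("EIF", "low")]
total = count_fps(payroll)
print(total)      # 4 + 6 + 5 + 10 + 5 = 30 FPs
print(total / 5)  # at 5 FPs/day: 6.0 days
```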
Exercise 1: Tic-Tac-Toe Computer Game
As soon as either the human player or the computer wins, a message announcing the winner should be displayed.
If neither player manages to get three consecutive marks along a straight line, and all the squares on the board are filled up, then the game is drawn.
The computer always tries to win a game.
Exercise 2
An Alumni Repository software is to be developed for IIM, Ranchi. The software will extract the details of students from the existing academic software of IIM, Ranchi. It will provide an online display of the alumni names. The details of an alumnus can be entered by anyone by double-clicking on the alumni name. The details of the alumni would be stored in a file. It should be possible to print out a report detailing all alumni.
Function points Mark II
Developed by Charles R. Symons.
[Diagram: a transaction receives a request for a service from higher layers and supplies that service; it in turn makes a request for a service to lower layers and receives the service.]
COSMIC FPs
The following are counted:
[Diagram residue: project requirement size and complexity, with risk factors, feed adjustments that produce effort, cost and schedule estimates.]
Accurate Size Estimation
Object Points
Object points have nothing to do with object-oriented programming.
The number of object points is estimated based on:
Number of separate screens displayed
Number of reports that are produced
Number of modules in the code
Object points are simpler to estimate and take the GUI into account.
Productivity = f(size)
[Figure: productivity (function points per staff-month, 0 to 12) falls as system size grows from 20 to about 40,960 function points; both the Bell Laboratories data and the Capers Jones data show the same downward trend.]
Bernstein's rule of thumb
Productivity per staff-month:
50 NCSLOC for OS code (or real-time systems)
Reuse note: sometimes, reusing code that does not provide the exact functionality needed can be achieved by reformatting input/output. This decreases performance but dramatically shortens development time.
COCOMO
COCOMO (COnstructive COst MOdel)
First published by Dr. Barry Boehm, 1981.
Semidetached Mode:
Somewhere between Organic and Embedded.
Embedded Mode:
A new product requiring a great deal of innovation, with inflexible constraints and interface requirements (e.g. real-time systems).
Modes
[Table residue: feature comparison across the Organic, Semidetached and Embedded modes.]
Intermediate Model
Uses an Effort Adjustment Factor (EAF) from 15 cost drivers.
Detailed Model
Uses different effort multipliers for each phase of the project.
(Most project managers use the intermediate model.)
Basic Effort Equation (COCOMO 81)
Effort = A × (Size)^exponent
A is a constant based on the development mode:
organic = 2.4
semidetached = 3.0
embedded = 3.6
Size is in thousands of source lines of code (KSLOC).
The exponent is constant for a given mode:
organic = 1.05
semidetached = 1.12
embedded = 1.20
The COCOMO constants
System type                            | c   | k
Organic (broadly, information systems) | 2.4 | 1.05
Semi-detached                          | 3.0 | 1.12
Embedded (broadly, real-time)          | 3.6 | 1.20
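The basic equation with these constants can be sketched as follows (the 32-KSLOC project is a hypothetical example):

```python
COCOMO81 = {  # mode: (c, k) from the table above
    "organic": (2.4, 1.05),
    "semidetached": (3.0, 1.12),
    "embedded": (3.6, 1.20),
}

def basic_effort(ksloc: float, mode: str) -> float:
    """Basic COCOMO 81: effort in person-months = c x (KSLOC)^k."""
    c, k = COCOMO81[mode]
    return c * ksloc ** k

# A hypothetical 32-KSLOC information system:
print(round(basic_effort(32, "organic"), 1))  # roughly 91 person-months
```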
Intermediate COCOMO
Takes basic COCOMO as the starting point.
Identifies personnel, product, computer and project attributes which affect cost and development time.
Multiplies the basic cost by attribute multipliers, which may increase or decrease costs.
Attributes
Personnel attributes:
Analyst capability
Virtual machine experience
Programmer capability
Programming language experience
Application experience
Product attributes:
Reliability requirement
Database size
Product complexity
More Attributes
Computer attributes:
Execution time constraints
Storage constraints
Virtual machine volatility
Computer turnaround time
Project attributes:
Modern programming practices
Software tools
Required development schedule
COCOMO Models
[Figure: effort (person-months, 0 to 7000) against size (0 to 600 thousand lines of code) for the Organic, Semidetached and Embedded modes; the Embedded curve rises fastest.]
Effort for increasing LOC: exponent > 1 (e.g. 1.12), so tripling the code requires more than three times the effort.
Duration for increasing effort: exponent < 1 (e.g. duration = 2.5 × Effort^0.35), so duration grows much more slowly than effort.
Intermediate Model
Effort Equation (COCOMO 81):
Effort = EAF × A × (Size)^exponent
EAF (effort adjustment factor) is the product of the effort multipliers corresponding to each cost driver rating.
A is a constant based on the development mode:
organic = 3.2
semidetached = 3.0
embedded = 2.8
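As a sketch, with two hypothetical cost-driver multipliers (high reliability at 1.15, low complexity at 0.85; the ratings chosen are illustrative):

```python
def intermediate_effort(ksloc, c, k, multipliers):
    """Intermediate COCOMO 81: effort = EAF x c x (KSLOC)^k,
    where EAF is the product of the cost-driver effort multipliers."""
    eaf = 1.0
    for m in multipliers:
        eaf *= m
    return eaf * c * ksloc ** k

# Organic-mode project of 10 KSLOC with the two hypothetical drivers above:
print(round(intermediate_effort(10, 3.2, 1.05, [1.15, 0.85]), 1))  # about 35 person-months
```

With all multipliers at their nominal value of 1.0, the equation reduces to the basic form.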
[Figure: development time against staff-months. Below about 75% of the theoretical time T lies an impossible-design region, while stretching the schedule beyond T gives a roughly linear increase.]
[Figure: estimate uncertainty across the COCOMO II milestones. The relative size range narrows from 0.25x to 2x at concept of operation toward 1.0x at accepted software, passing through the requirements, design and detail design specifications. The applicable sub-models are Applications Composition (3 parameters) earliest, then Early Design (13 parameters), then Post-Architecture (23 parameters).]
ICASE maturity and capability: Very low | Low | Nominal | High | Very high
PROD (NAP/month):               4        | 7   | 13      | 25   | 50
The Scale Drivers (Exponents)
Important factors contributing to a project's duration and cost are the Scale Drivers.
The Scale Drivers determine the exponent used in the Effort Equation.
Scale Drivers have replaced the Development Modes of COCOMO 81.
The 5 Scale Drivers are:
Precedentedness
Development Flexibility
Architecture / Risk Resolution
Team Cohesion
Process Maturity
Early Design and Post-Architecture Model
Effort = (product of the environment effort multipliers) × A × (Size)^(exponent from the process scale factors)
COCOMO 2 Scaling Exponent Approach
Nominal person-months = A × (Size)^B
B = 0.91 + 0.01 × Σ(scale factor ratings)
B ranges from 0.91 to 1.23
5 scale factors; 6 rating levels each
Scale factors:
Precedentedness (PREC)
Development flexibility (FLEX)
Architecture/risk resolution (RESL)
Team cohesion (TEAM)
Process maturity (PMAT, derived from the SEI CMM)
Project Scale Factors
PM_estimated = 2.94 × (Size)^B × Π EM_i, where B = 0.91 + 0.01 × Σ SF_i

Scale factor (Wi) | Very Low | Low | Nominal | High | Very High | Extra High
PREC | thoroughly unprecedented | largely unprecedented | somewhat unprecedented | generally familiar | largely familiar | thoroughly familiar
FLEX | rigorous | occasional relaxation | some relaxation | general conformity | some conformity | general goals
RESL | little (20%) | some (40%) | often (60%) | generally (75%) | mostly (90%) | full (100%)
TEAM | very difficult interactions | some difficult interactions | basically cooperative | largely cooperative | highly cooperative | seamless interactions
PMAT | weighted sum of 18 KPA achievement levels
The reuse model
Reuse costs:
Overhead for assessing, selecting and assimilating components.
Small modifications generate disproportionately large costs.
Development Flexibility
Need for software conformance with pre-established requirements: Full | Considerable | Basic
Need for software conformance with external interface specifications: Full | Considerable | Basic
Premium on early completion: High | Medium | Low
Architecture / Risk Resolution (RESL)
Team Cohesion (TEAM)
COCOMO II Scale factor values
Driver | Very low | Low | Nominal | High | Very high | Extra high
Example Usage of scale factors
A software development team is developing an application:
It is very similar to previous ones it has developed, so PREC is very high (score 1.24).
A very precise software engineering document lays down very strict requirements, so FLEX is very low (score 5.07).
The good news is that requirements are unlikely to change: RESL is high, with a score of 2.83.
The team is tightly knit (high, score 2.19), but processes are informal (low PMAT, score 6.24).
Scale factor calculation
The formula for sf is:
sf = B + 0.01 × Σ(scale factor values)
i.e. sf = 0.91 + 0.01 × (1.24 + 5.07 + 2.83 + 2.19 + 6.24) = 1.0857
If the system contained 10 KLOC, then the estimate would be:
2.94 × 10^1.0857 = 35.8 person-months
Using exponentiation ('to the power of') adds disproportionately more to the estimates for larger applications.
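The calculation above as code, using the constants from the slide:

```python
def cocomo2_effort(ksloc, scale_factors, a=2.94):
    """COCOMO II: PM = A x Size^B, with B = 0.91 + 0.01 x sum(scale factors)."""
    b = 0.91 + 0.01 * sum(scale_factors)
    return a * ksloc ** b

sf = [1.24, 5.07, 2.83, 2.19, 6.24]  # PREC, FLEX, RESL, TEAM, PMAT
print(round(cocomo2_effort(10, sf), 1))  # 35.8 person-months
```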
Scale Factor: Example 2
The exponent depends on 5 scale factors; their sum divided by 100 is added to 0.91.
A company takes on a project in a new domain. The client will not be participating during development and has not allowed time for risk analysis. The company has a CMM level 2 rating.
Precedentedness – new project (4)
Development flexibility – no client involvement – very high (1)
Architecture/risk resolution – no risk analysis – very low (5)
Team cohesion – new team – nominal (3)
Process maturity – some control – nominal (3)
The scale factor is therefore 0.91 + 16/100 = 1.07.
Effort multipliers
In addition to the scale factor, effort multipliers are also assessed:
RCPX Product reliability and complexity
RUSE Reuse required
PDIF Platform difficulty
PERS Personnel capability
FCIL Facilities available
SCED Schedule pressure
Effort multipliers
     | Extra low | Very low | Low  | Nominal | High | Very high | Extra high
RCPX | 0.49      | 0.60     | 0.83 | 1.00    | 1.33 | 1.91      | 2.72

L = f(K, td)
Staffing
Norden was one of the first to investigate staffing patterns:
He considered general research and development (R&D) type projects.
Norden concluded:
The staffing pattern for any R&D project can be approximated by the Rayleigh distribution curve.
[Figure: manpower over time rises to a peak at TD and then tails off, following the Rayleigh curve.]
Putnam's Work
In 1976, Putnam studied the problem of staffing of software projects:
He observed that the level of effort required in software development efforts has a similar envelope.
He found that the Rayleigh-Norden curve relates the number of delivered lines of code to effort and development time.
Putnam's Work (cont.)
Lines of code: Ss
Time to develop: td
Technology coefficient: Ck
Putnam's Model
Lines of code: Ss = Ck × K^(1/3) × td^(4/3)
Person-years invested: K = (Ss / (Ck × td^(4/3)))^3
Time to develop: td = (Ss / (Ck × K^(1/3)))^(3/4)
Putnam's Work
Putnam adapted the Rayleigh-Norden curve:
He related the number of delivered lines of code to the effort and the time required to develop the product.
He studied the effect of schedule compression:
Effort Applied vs. Delivery Time
There is a nonlinear relationship between effort applied and delivery time (the Putnam-Norden-Rayleigh curve).
Effort increases rapidly as the delivery time is reduced.
[Figure: effort/cost against delivery time, with an impossible region below a minimum time; E_theoretical and E_optimal are marked.]
Observe:
A relatively small compression in delivery schedule can result in a substantial penalty in human effort.
Also observe:
Benefits can be gained by using fewer people over a somewhat longer time span.
Example
If the estimated development time is 1 year, then in order to develop the product in 6 months, the total effort and hence the cost increases 16 times (since K varies as 1/td^4, halving td multiplies effort by 2^4 = 16).
In other words, the relationship between effort and the chronological delivery time is highly nonlinear.
Putnam's Model
Example: given Ss = 100,000, Ck = 10,040, and varying td, compute K:
td (years) | K (person-years)
1          | 988
1.5        | 195
2          | 62
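These figures follow from rearranging Putnam's equation for K; a sketch:

```python
def putnam_effort(loc, ck, td_years):
    """Putnam: K = (Ss / (Ck x td^(4/3)))^3, effort in person-years."""
    return (loc / (ck * td_years ** (4 / 3))) ** 3

for td in (1.0, 1.5, 2.0):
    print(td, round(putnam_effort(100_000, 10_040, td)))  # 988, 195, 62
```

Doubling td from 1 to 2 years cuts the effort by a factor of 16 (988 down to 62), which is the td^4 sensitivity discussed above.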
Effect of Schedule Change on Cost (cont.)
The Putnam model indicates an extreme penalty for schedule compression and an extreme reward for expanding the schedule.
The Putnam estimation model works reasonably well for very large systems, but seriously overestimates the effort for medium and small systems.
Effect of Schedule Change on Cost (cont.)
Boehm observed:
"There is a limit beyond which the schedule of a software project cannot be reduced by buying any more personnel or equipment."
This limit occurs roughly at 75% of the nominal time estimate.
Capers Jones' Rules
Rule 9: Raising the number of function points to the 1.25 power predicts the approximate defect potential for new software projects.
Defect potential is the sum of bugs (errors) in requirements, design, coding and user documentation, plus bad fixes or secondary errors introduced while fixing prior errors.
For enhancements: raise to the 1.27 power.
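Rule 9 as a one-liner (the function name is my own):

```python
def defect_potential(fp: float, enhancement: bool = False) -> float:
    """Capers Jones rule of thumb: FP^1.25 (FP^1.27 for enhancements)."""
    return fp ** (1.27 if enhancement else 1.25)

print(round(defect_potential(100)))  # 100^1.25 = about 316 potential defects
```

Because the exponent is greater than 1, defect potential grows faster than size: a 10x larger system has roughly 17.8x the potential defects.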
Estimations Still Very Inexact…
Conventional cost models have been described as "within 20% of actuals, 70% of the time."
This is scary to investors and stakeholders, as very seldom are variations of estimates 'for the better'.
Some conclusions: how to review estimates
Ask the following questions about an estimate:
What are the task size drivers?
What productivity rates have been used?
Is there an example of a previous project of about the same size?
Are there examples of where the productivity rates used have actually been found?
IEEE 1058.1-1987 SPMP
Table of Contents
1. Introduction
1.1 Project overview
1.2 Project deliverables
1.3 Evolution of the SPMP
1.4 Reference materials
1.5 Definitions and acronyms
2. Project organization
2.1 Process model
2.2 Organizational structure
2.3 Organizational boundaries and interfaces
2.4 Project responsibilities
3. Managerial process
3.1 Managerial objectives & priorities
3.2 Assumptions, dependencies & constraints
3.3 Risk management
3.4 Monitoring & controlling mechanisms
3.5 Staffing plan
4. Technical process
4.1 Methods, tools & techniques
4.2 Software documentation
4.3 Project support functions
5. Work packages, schedule & budget
5.1 Work packages
5.2 Dependencies
5.3 Resource requirements
5.4 Budget & resource allocation
5.5 Schedule
Organization of SPMP Document
Introduction (Objectives, Major Functions, Performance Issues, Management and Technical Constraints)
Further Questions?