
Software Engineering

SE 2081
Chapter Two
Software processes

School of Engineering
Department of Computing
ASTU

1


The software process
A software process is a structured set of activities required to develop a software system.
There are many different software processes, but all involve the following activities:
Requirements gathering (identification)
Specification – defining what the system should do;
Design and implementation – defining the organization of the system and implementing it;
Validation – checking that the system does what the customer wants;
Evolution – changing the system in response to changing customer needs.

2
The Software Process
A structured set of activities required to develop
a software system
Requirement Identification
Specification
Analysis, design and implementation.
Validation
Evolution
A software process model is an abstract
representation of a process
It presents a description of a process from some
particular perspective

3
Process Model
1. Waterfall Model

Requirements definition → System and software design → Implementation and unit testing → Integration and system testing → Operation and maintenance

4
Waterfall model phases
Requirements analysis and definition
System and software design
Implementation and unit testing
Integration and system testing
Operation and maintenance

The drawback of the waterfall model is the difficulty of accommodating change after the process is underway.

5
Waterfall model: requirements and design
Artefacts produced in the requirements and design phases:
SRS – Software Requirements Specification document
The SRS might include:
User stories (scenarios) – use cases.
Static analysis – class diagrams.
Behavioural analysis – sequence diagrams, state charts.

The specification and design activities are highly time-consuming.

6
Waterfall model problems
Inflexible partitioning of the project into distinct stages
Difficult to respond to changing customer requirements
This model is only appropriate when the requirements are well understood.

The waterfall model describes a staged development process:
 Based on hardware engineering models
 Change during development is assumed to be unlikely
 Widely used for large systems, e.g. in the military and aerospace industries

7
When to use the Waterfall Model
Requirements are very well known
Product definition is stable
Technology is understood
New version of an existing product
Porting an existing product to a new platform.

8
2. V-Shaped SDLC Model
A variant of the Waterfall
that emphasizes the
verification and validation
of the product.
Testing of the product is
planned in parallel with a
corresponding phase of
development

9
V-Shaped Steps
Development side (left arm of the V):
 Project and Requirements Planning – allocate resources
 Product Requirements and Specification Analysis – complete specification of the software system
 Architecture or High-Level Design – define how software functions fulfill the design
 Detailed Design – develop algorithms for each architectural component
 Coding – transform algorithms into software
Testing side (right arm of the V):
 Unit testing – check that each module acts as expected
 Integration and Testing – check that modules interconnect correctly
 System and acceptance testing – check the entire software system in its environment
 Production, operation and maintenance – provide for enhancement and corrections
10
V-Shaped Strengths
Emphasize planning for verification and validation of
the product in early stages of product development
Each deliverable must be testable
Project management can track progress by milestones
Easy to use

11
V-Shaped Weaknesses
Does not easily handle concurrent events
Does not handle iterations or phases
Does not easily handle dynamic changes in
requirements
Does not contain risk analysis activities

12
When to use the V-Shaped Model
Excellent choice for systems requiring high reliability
– hospital patient control applications
All requirements are known up-front
When it can be modified to handle changing
requirements beyond analysis phase
Solution and technology are known

13
3. Structured Evolutionary Prototyping Model
Developers build a prototype during the requirements
phase
Prototype is evaluated by end users
Users give corrective feedback
Developers further refine the prototype
When the user is satisfied, the prototype code is
brought up to the standards needed for a final
product.

14
Structured Evolutionary Prototyping Steps
A preliminary project plan is developed
A partial high-level paper model is created
The model is the source for a partial requirements
specification
A prototype is built with basic and critical attributes
The designer builds
 the database
 user interface
 algorithmic functions
The designer demonstrates the prototype, the user
evaluates for problems and suggests improvements.
This loop continues until the user is satisfied

15
Structured Evolutionary Prototyping
Strengths
Customers can “see” the system requirements as
they are being gathered
Developers learn from customers
A more accurate end product
Unexpected requirements accommodated
Allows for flexible design and development
Steady, visible signs of progress produced
Interaction with the prototype stimulates
awareness of additional needed functionality

16
Structured Evolutionary Prototyping
Weaknesses
Tendency to abandon structured program
development for “code-and-fix” development
Bad reputation for “quick-and-dirty” methods
Overall maintainability may be overlooked
The customer may want the prototype delivered.
Process may continue forever (scope creep)

17
When to use
Structured Evolutionary Prototyping
Requirements are unstable or have to be clarified
As the requirements clarification stage of a
waterfall model
Develop user interfaces
Short-lived demonstrations
New, original development
With the analysis and design portions of object-
oriented development.

18
4. Rapid Application Development (RAD) Model
Requirements planning phase (a workshop
utilizing structured discussion of business
problems)
User description phase – automated tools capture
information from users
Construction phase – productivity tools, such as
code generators, screen generators, etc. inside a
time-box. (“Do until done”)
Cutover phase -- installation of the system, user
acceptance testing and user training

19
RAD Strengths
Reduced cycle time and improved productivity
with fewer people mean lower costs
Time-box approach mitigates cost and schedule
risk
Customer involved throughout the complete cycle
minimizes risk of not achieving customer
satisfaction and business needs
Focus moves from documentation to code.
Uses modeling concepts to capture information
about business, data, and processes.

20
RAD Weaknesses
Accelerated development process must give quick
responses to the user
Risk of never achieving closure
Hard to use with legacy systems
Requires a system that can be modularized
Developers and customers must be committed to
rapid-fire activities in an abbreviated time frame.

21
When to use RAD
Reasonably well-known requirements
User involved throughout the life cycle
Project can be time-boxed
Functionality delivered in increments
High performance not required
Low technical risks
System can be modularized

22
5. Incremental Development
Rather than deliver the system as a single delivery,
the development and delivery is broken down
into increments with each increment delivering
part of the required functionality.
User requirements are prioritised and the
highest-priority requirements are included in
early increments (see the sketch at the end of this slide).
Once the development of an increment is
started, the requirements are frozen though
requirements for later increments can continue to
evolve.
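
As a rough, hypothetical illustration of this prioritisation (the requirement names and the per-increment capacity below are invented, not taken from these slides), a Python sketch might sort requirements by priority and pack the highest-priority ones into the earliest increments:

```python
# Minimal sketch: assign prioritised requirements to increments.
# Requirement names, priorities, and the per-increment capacity are
# illustrative assumptions only.

def plan_increments(requirements, capacity_per_increment):
    """Sort by priority (1 = highest) and fill increments in order."""
    ordered = sorted(requirements, key=lambda r: r["priority"])
    increments, current = [], []
    for req in ordered:
        current.append(req["name"])
        if len(current) == capacity_per_increment:
            increments.append(current)
            current = []
    if current:
        increments.append(current)
    return increments

requirements = [
    {"name": "user login", "priority": 1},
    {"name": "report export", "priority": 3},
    {"name": "order entry", "priority": 1},
    {"name": "audit trail", "priority": 2},
]

for number, contents in enumerate(plan_increments(requirements, 2), start=1):
    print(f"Increment {number}: {contents}")
```

Running this places the two priority-1 requirements in increment 1, mirroring the idea that the most valuable functionality is developed and delivered first.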

23
Incremental Development

Define outline requirements → Assign requirements to increments → Design system architecture →
Develop system increment → Validate increment → Integrate increment → Validate system → Final system
(While the system is incomplete, the develop–validate–integrate cycle repeats for the next increment.)

24
Incremental Development – Advantages
Customer value can be delivered with each
increment so system functionality is available
earlier.
Early increments act as a prototype to help
elicit requirements for later increments.
Lower risk of overall project failure.
The highest priority system services
tend to receive the most testing.

25
Incremental Model Strengths
Each release delivers an operational product
Customer can respond to each build
Uses “divide and conquer” breakdown of tasks
Lowers initial delivery cost
Initial product delivery is faster
Customers get important functionality early
Risk of changing requirements is reduced

26
Incremental Development – Problems
Lack of process visibility.
Systems are often poorly structured.

Applicability claims in the literature:


For small or medium-size interactive systems.
For parts of large systems (e.g. the user interface).
For short-lifetime systems.

27
When to use the Incremental Model
Risk, funding, schedule, program complexity, or
need for early realization of benefits.
Most of the requirements are known up-front but
are expected to evolve over time
A need to get basic functionality to the market
early
On projects which have lengthy development
schedules
On a project with new technology

28
Incremental means adding, iterative means
reworking (by Alistair Cockburn)
 Incremental development is a staging and scheduling strategy in which
the various parts of the system are developed at different times or rates and
integrated as they are completed. The alternative is to develop the entire
system with a big bang integration at the end.
 Iterative development is a rework scheduling strategy in which time is
set aside to revise and improve parts of the system. The alternative
development is to get it right the first time (or at least declare that it is
right!).

Increment – fundamentally means “add onto”: repeating the process (design, implement, evaluate) on a new section of work.
Iterate – fundamentally means “change”: repeating the process (design, implement, evaluate) on the same section of work.
29
Incremental Development

 The first increment delivers one slice of functionality through the whole system.
 The second increment delivers another slice of functionality through the whole system.
 The third increment delivers the rest of the system.

30
Iterative Development

 The first iteration delivers enough of feature 1 to evaluate what is correct and what needs revision.
 After the second iteration, some revised parts still need improvement.
 The third iteration produces the final and stable feature.

31
Problems with incremental development (from a
waterfall eye…)
 Management problems
 Progress can be hard to judge and problems hard to find because
there is no documentation to demonstrate what has been done.

 Contractual problems
 The normal contract may include a specification; without a
specification, different forms of contract have to be used.

 Validation problems
 Without a specification, what is the system being tested against?

 Maintenance problems
 Continual change tends to corrupt software structure making it more
expensive to change and evolve to meet new requirements.

32
Incremental & iterative - summary
 Iterative model – iterates the requirements, design, build and test phases again and again for each requirement, building the system up iteratively until it is completely built.
 Incremental model – the work is divided into chunks that are developed and delivered separately rather than as one integrated whole; one team can work on many chunks. It is more flexible.
 Spiral model – uses a series of prototypes developed in stages; development ends when the prototypes have grown into a functional system. It is a flexible model used for large and complicated projects.
 Evolutionary model – a more customer-focused model in which the software is divided into small units that are delivered to the customer early.
 V-Model – more focused on testing; for every development phase a corresponding testing activity is performed.

33
6. Spiral Development
Process is conceived as a spiral rather than as a
sequence of activities with backtracking.
Each loop in the spiral represents a phase in
the process.
No fixed phases such as specification or design
- loops in the spiral are chosen depending on what
is required.
Risks are explicitly assessed and resolved
throughout the process.

34
Spiral model (Boehm 87)
(Spiral diagram: each loop of the spiral passes through four sectors – objective setting, risk assessment and reduction, development and validation, and planning – described on the next slide.)

35
Spiral model sectors
Objective setting
Specific objectives for the phase are identified.
Risk assessment and reduction
Risks are assessed and activities are put in place to
reduce the key risks (a risk-exposure sketch follows this list).
Development and validation
A development model for the system is chosen
which can be any of the generic models.
Planning
The project is reviewed and the next phase of the
spiral is planned.
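
To make the risk assessment and reduction sector more concrete, one common technique is to rank risks by exposure, the probability of a loss times its estimated cost. The sketch below is a hypothetical example; the risks, probabilities and loss figures are invented, not part of the original slides:

```python
# Minimal sketch of risk assessment: rank risks by exposure, where
# exposure = probability of the loss * estimated cost of the loss.
# The risks and the numbers are hypothetical.

risks = [
    {"risk": "key requirements misunderstood", "probability": 0.4, "loss": 50_000},
    {"risk": "new database technology underperforms", "probability": 0.2, "loss": 80_000},
    {"risk": "lead developer leaves mid-project", "probability": 0.1, "loss": 120_000},
]

for r in risks:
    r["exposure"] = r["probability"] * r["loss"]

# Address the highest-exposure risks first (e.g. by prototyping or re-planning).
for r in sorted(risks, key=lambda item: item["exposure"], reverse=True):
    print(f'{r["risk"]}: exposure = {r["exposure"]:.0f}')
```

The highest-exposure risks would then be targeted first in the loop, for example by prototyping the risky component or revising the plan.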

36
Spiral Model Strengths
Provides early indication of insurmountable risks,
without much cost
Users see the system early because of rapid
prototyping tools
Critical high-risk functions are developed first
The design does not have to be perfect
Users can be closely tied to all lifecycle steps
Early and frequent feedback from users
Cumulative costs assessed frequently
37
Spiral Model Weaknesses
Time spent for evaluating risks too large for small or low-
risk projects
Time spent planning, resetting objectives, doing risk
analysis and prototyping may be excessive
The model is complex
Risk assessment expertise is required
Spiral may continue indefinitely
Developers must be reassigned during non-development
phase activities
May be hard to define objective, verifiable milestones that
indicate readiness to proceed through the next iteration

38
When to use Spiral Model
When creation of a prototype is appropriate
When costs and risk evaluation is important
For medium to high-risk projects
Long-term project commitment unwise because of
potential changes to economic priorities
Users are unsure of their needs
Requirements are complex
New product line
Significant changes are expected (research and
exploration)

39
7. The (Rational) Unified Process
A modern process model derived from the work on
the UML.
Normally described from 3 perspectives
A dynamic perspective that shows phases over time;
A static perspective that shows process activities;
A practice perspective that suggests good practice.

40
(R)UP phase model

Phases (each phase may be iterated): Inception → Elaboration → Construction → Transition

41
(R)UP phases
One cycle consists of four phases:
Inception
 Establish the business case for the system.
Elaboration
 Develop an understanding of the problem domain and the
system architecture.
Construction
 System design, programming and testing.
Transition
 Deploy the system in its operating environment.

42
(R)UP phases and iterations

Picture taken from: http://www.ibm.com/developerworks/webservices/library/ws-soa-term2/
43
(R)UP
In each phase many different workflows (such as the management, environment, design, and implementation workflows) can be addressed simultaneously.
At the end of each cycle some kind of prototype or artifact is produced.
The phases can be repeated many times (i.e. iterations), producing one or many prototypes or artifacts.
Within a cycle the requirements are kept stable, which makes it possible to plan the development process for that cycle.
Between cycles the requirements may change.

44
(R)UP good practice
Develop software iteratively
Manage requirements
Use component-based architectures
Visually model software
Verify software quality
Control changes to software

45
8. Agile Development

46
Project Failure – the trigger for Agility
 One of the primary causes of project failure was the extended period of time it took to develop a system.
 Costs escalated and requirements changed.

 Agile methods aim to develop systems more quickly, with limited time spent on analysis and design.

47
What is an Agile method? (1)
 Focus on the code rather than the design.
 Based on an iterative approach to software
development.
 Intended to deliver working software quickly.
 Evolve quickly to meet changing requirements.
 There are claims that agile methods are probably best
suited to small/medium-sized business systems or PC
products.

48
What is an agile method? (2)
 Agile methods are considered
 Lightweight
 People-based rather than Plan-based
 Several agile methods
 No single agile method
 Extreme Programming (XP) most popular
 No single definition
 Agile Manifesto closest to a definition
 Set of principles
 Developed by Agile Alliance

49
Summary of Principles of agile methods
Principle Description
Customer involvement The customer should be closely involved throughout the
development process. Their role is to provide and prioritise new
system requirements and to evaluate the iterations of the system.
Incremental delivery The software is developed in increments with the customer
specifying the requirements to be included in each increment.
People not process The skills of the development team should be recognised and
exploited. The team should be left to develop their own ways of
working without prescriptive processes.
Embrace change Expect the system requirements to change and design the system
so that it can accommodate these changes.
Maintain simplicity Focus on simplicity in both the software being developed and in
the development process used. Wherever possible, actively work
to eliminate complexity from the system.

50
eXtreme Programming
Development and delivery of very small
increments of functionality.
Relies on constant code improvement, user involvement in the development team, and pair programming.
Emphasizes Test Driven Development (TDD) as
part of the small development iterations.
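
A minimal sketch of the test-first rhythm behind TDD, using Python's built-in unittest module; the Cart class and its behaviour are invented here purely to illustrate the cycle (write a failing test, write just enough code to pass it, then refactor):

```python
# Test-driven development in miniature: the test is written first and fails,
# then just enough production code is written to make it pass.
import unittest

class Cart:
    """Hypothetical production code, written only after the test existed."""
    def __init__(self):
        self._prices = []

    def add(self, price):
        self._prices.append(price)

    def total(self):
        return sum(self._prices)

class CartTest(unittest.TestCase):
    def test_total_sums_item_prices(self):
        cart = Cart()
        cart.add(2.50)
        cart.add(1.25)
        self.assertAlmostEqual(cart.total(), 3.75)

if __name__ == "__main__":
    unittest.main()
```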

51
XP Practices (1-6)
1. Planning game – determine scope of the next release by
combining business priorities and technical estimates
2. Small releases – put a simple system into production,
then release new versions in very short cycle
3. Metaphor – all development is guided by a simple shared
story of how the whole system works
4. Simple design – system is designed as simply as possible
(extra complexity removed as soon as found)
5. Testing – programmers continuously write unit tests;
customers write tests for features
6. Refactoring – programmers continuously restructure the
system without changing its behavior to remove
duplication and simplify
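
A small, hypothetical illustration of practice 6: duplicated discount logic is pulled into a single helper without changing observable behaviour, which the tests from practice 5 should confirm. The function names and discount rule are invented for this sketch.

```python
# Before: the same discount rule is duplicated in two functions.
def invoice_total_before(prices):
    return sum(p * 0.9 if p > 100 else p for p in prices)

def quote_total_before(prices):
    return sum(p * 0.9 if p > 100 else p for p in prices)

# After refactoring: the shared rule lives in one helper; behaviour is unchanged.
def discounted(price):
    return price * 0.9 if price > 100 else price

def invoice_total(prices):
    return sum(discounted(p) for p in prices)

def quote_total(prices):
    return sum(discounted(p) for p in prices)

# The existing unit tests should still pass unchanged.
assert invoice_total([50, 200]) == invoice_total_before([50, 200])
```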

52
XP Practices (7 – 12)
7. Pair-programming -- all production code is written with
two programmers at one machine
8. Collective ownership – anyone can change any code
anywhere in the system at any time.
9. Continuous integration – integrate and build the system
many times a day, every time a task is completed (see the sketch after this list).
10. 40-hour week – work no more than 40 hours a week as a
rule
11. On-site customer – a user is on the team and available
full-time to answer questions
12. Coding standards – programmers write all code in
accordance with rules emphasizing communication
through the code
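
One hedged sketch of how a developer might gate integration on a green build (practice 9); the "unittest discover" command is an assumed project convention, not a prescribed XP tool:

```python
# Minimal sketch of a pre-integration check: run the test suite and only
# proceed with integration when everything passes.
import subprocess
import sys

def tests_pass():
    result = subprocess.run([sys.executable, "-m", "unittest", "discover"])
    return result.returncode == 0

if __name__ == "__main__":
    if tests_pass():
        print("Build is green - safe to integrate this task.")
    else:
        print("Tests failed - fix them before integrating.")
        sys.exit(1)
```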

53
Claimed Problems with agile methods
 It can be difficult to keep the interest of customers
who are involved in the process.
 Team members may be unsuited to the intense
involvement that characterizes agile methods.
 Prioritising changes can be difficult where there
are multiple stakeholders.
 Maintaining simplicity requires extra work.
 Contracts may be a problem as with other
approaches to iterative development.

54
Is / Isn’t: Misinterpreting the message
1. Agile SD is cheating

2. Agile SD requires the best developers

3. Agile SD is hacking

4. Agile SD won’t work for all projects

55
1. Agile techniques are “cheating”.
· Hire good people;
· Seat them close together to help each other out;
· Get them close to the customers and users;
· Arrange for rapid feedback on decisions;
· Let them find fast ways to document their work;
· Cut out the bureaucracy.

This is:
cheating
stacking the deck
a good idea
the heart of agile software development

56
2. Agile only works with the best developers.
 Every project needs at least one experienced and competent lead person. (Critical Success Factor)

 Each experienced and competent person on the team permits the presence of 4-5 “average” or learning people.

 With that skill mix, agile techniques have been shown to work many times.

57
3. Agile is hacking. (Hacker interpretations are
available & inevitable.)
Hackers: “...spend all their time coding”
Agilists: ...test according to project priorities and recheck results with users often.

Hackers: “...talk to each other when they are stuck”
Agilists: ...talk to each other and to customers as a matter of practice.

Hackers: “...avoid planning”
Agilists: ...plan regularly.

Hackers: “...management caves in out of fear”
Agilists: ...expect management to provide priorities and to participate jointly in project adjustments.

58
4. Agile won’t work for all projects.
Right. (Business isn’t fair).
Agile is an attitude prioritizing:
Project evaluation based on delivered code.
Rapid feedback.
People as a value center.
Creativity in overcoming obstacles.
Not every team
... values the Agile value set.
... can set up the needed trust and communication.

59
Process Assessment
• In the CMM model, the maturity level of an organization indicates to what extent it can produce low-cost, high-quality software.
• Once its current maturity level is known, an organization can work to reach the next higher level.

60
Software Process Improvement Efforts
Carnegie Mellon University’s Software Engineering Institute’s Capability Maturity Model (SEI’s CMM)

The International Organization for Standardization’s ISO 9001 specification

Proprietary SPI approaches from consulting firms

61
SEI Capability Maturity Model
 Level 5 – Optimizing (< 1%): process control
 Level 4 – Managed (2-3%): process measurement
 Level 3 – Defined (20%): process definition
 Level 2 – Repeatable (30%): basic management control
 Level 1 – Initial (45%): basically no control
62
CMM - Initial (Level 1)
 The software process is characterized as ad hoc, occasionally even chaotic
 Few processes are defined
 Success depends on individual effort and heroics

“BASICALLY NO CONTROL”
63
CMM - Repeatable (Level 2)
 Basic project management processes are
established to track cost, schedule, and
functionality
 The necessary process discipline is in
place to repeat earlier successes on
projects with similar applications
 Success achieved through basic project
management; not advanced technologies

“BASIC MANAGEMENT CONTROL”


64
CMM - Defined (Level 3)
 The software process for both management
and engineering activities is documented,
standardized, and integrated into a
standard software process for the
organization
 All projects use an approved, tailored
version of the organization’s standard
software process for developing and
maintaining software
 Formality lends itself to improvement

“PROCESS DEFINITION”
65
CMM - Managed (Level 4)
 Detailed measures of the software process and product quality are collected
 Both the software process and products are quantitatively understood and controlled
 A software metrics program is in use

“PROCESS MEASUREMENT”
66
CMM - Optimizing (Level 5)
 Continuous process improvement is enabled by quantitative (metrics) feedback from the process
 Continuous process improvement is enabled by piloting innovative ideas and technologies

“PROCESS CONTROL”
67
SW-CMM Process Assessment

• General Classes of SW-CMM Appraisal


1. Software Process Assessment
 Determine state of organization’s
software process
2. Software Capability Evaluations
 Identify contractors qualified to perform
software work

68
SW-CMM Process Assessment …

• Software Process Assessment:


- Identify improvement priorities within
organization
- Assessment team uses CMM to guide identifying
& prioritizing findings
- Findings & KPA guidance used to plan
improvement strategy for organization

69
SW-CMM Process Assessment …
• Software Capability Evaluations:
- Identify risks associated with a project or contract
to build high quality on schedule & budget
- During acquisition process, capability evaluation
may be performed on bidders
- Findings of an evaluation may be used to identify
risk with using a contractor
- Performed on existing contracts to monitor
process performance
70
Software Process Assessment &
Capability Evaluation Steps:

71
SW-CMM Process Assessment …
• Common Steps:
- Team Selection
 Select team trained in CMM
 Knowledgeable in SE & mgmt
- Maturity Questionnaire
 Site reps complete questionnaire
- Response Analysis
 Analyze results of questionnaire
 Investigation areas = KPAs

72
SW-CMM Process Assessment …
• Common Steps …
- On-site Visit
 Using results analysis, conduct
on-site visit to view process areas
 Using KPAs as guide, question,
listen, review & synthesize info
 Apply professional judgment
 Document rationale for situations
where KPAs not met

73
SW-CMM Process Assessment …
• Common Steps …
- Findings
 At end of on-site period, team produces list of findings
 Identifies strengths & weaknesses of org’s software
processes
 Software Process Assessment -> Basis for PI
recommendations
 Software Capability Evaluation -> Findings part of risk
analysis

74
SW-CMM Process Assessment …
• Common Steps …
- KPA Profile
 Team prepares KPA profile,
showing where KPAs satisfied /
not satisfied by organization
 KPA can be satisfied and still have
associated findings, as long as
findings don’t identify major
problems achieving goals of KPA

75
SW-CMM Process Assessment …
• Differences: Process Assessments &
Capability Evaluation
- Results of process assessment or
capability evaluation may differ
- Assessment / evaluation scope may vary:
 Different definitions of
‘Organization’
 Org may be based on senior
management, geo location,
common app, profit/loss center, etc.
 Sample of selected projects
76
SW-CMM Process Assessment …
• Differences …
- Motivation, objective, outcome & results
ownership differ
 These factors lead to differences in
dynamics of interviews, scope of
inquiry, info collected, & results
 Assessment & evaluation methods
are different
 Assessment training doesn’t prepare
team to do evaluation, vice versa

77
SW-CMM Process Assessment …
• Differences …
- Process Assessment – performed in
open, collaborative environment
 Commitment from mgmt & staff to
do process improvement
 Objective: surface problems & help
improve organization
 Emphasis on interviews as tool for
understanding organization’s
software process
78
SW-CMM Process Assessment …
• Differences …
- Capability Evaluation – performed in
audit-oriented environment
 Objective tied to monetary
considerations
 Emphasis on documented audit
trail that reveals software process
actually implemented by
organization

79
Why Measure Software?
Metric – a quantitative measure of the degree to which a system, component, or process possesses a given attribute; “a handle or guess about a given attribute.”
E.g., the number of errors found per person-hour expended (a worked sketch follows at the end of this slide)
Determine the quality of the current product or process

Predict qualities of a product/process

Improve quality of a product/process
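
A worked sketch of the errors-per-person-hour example above; all figures are invented for illustration only:

```python
# Worked example of a simple process metric: errors found per person-hour.
errors_found = 45      # defects logged during reviews and testing
person_hours = 600     # total effort expended on the activity

errors_per_person_hour = errors_found / person_hours
print(f"Errors per person-hour: {errors_per_person_hour:.3f}")  # 0.075

# Tracked release after release, the same attribute becomes a trend indicator.
history = {"release 1": 0.075, "release 2": 0.061, "release 3": 0.052}
for release, rate in history.items():
    print(release, rate)
```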

80
Motivation for Metrics
Estimate the cost & schedule of future projects

Evaluate the productivity impacts of new tools and techniques

Establish productivity trends over time

Improve software quality

Forecast future staffing needs

Anticipate and reduce future maintenance needs

81
Metrics in the Process Domain
 A metric is “a quantitative measure of the degree to which a system, component, or process possesses a given attribute”.

 Process metrics are collected across all projects and over long periods of time.

 They are used for making strategic decisions.

 The intent is to provide a set of process indicators that lead to long-term software process improvement.

 The only way to know how/where to improve any process is to
  Measure specific attributes of the process
  Develop a set of meaningful metrics based on these attributes
  Use the metrics to provide indicators that will lead to a strategy for improvement
82
Metrics in the Process Domain (continued)
 We measure the effectiveness of a process by deriving a set of metrics based on outcomes of the process, such as
 Errors uncovered before release of the software

 Defects delivered to and reported by the end users

 Work products delivered

 Human effort expended

 Calendar time expended

 Conformance to the schedule

 Time and effort to complete each generic activity

83
Cont.…
Private metrics
There are “private” and “public” uses for different types of process data.
Data private to the individual serve as an indicator for that individual only.
Examples of metrics private to the individual:
defect rates (by individual)
defect rates (by module)
errors found during development

84
Cont.…
Public metrics
Public metrics assimilate information that was originally private to individuals and teams.
Project-level defect rates, effort, calendar times, and related data are collected and evaluated in an attempt to uncover indicators that can improve organizational process performance.
85
Protocol of Process Metrics
 Use common sense and organizational sensitivity when interpreting
metrics data
 Provide regular feedback to the individuals and teams who collect
measures and metrics
 Don’t use metrics to evaluate individuals
 Work with practitioners and teams to set clear goals and metrics that
will be used to achieve them
 Never use metrics to threaten individuals or teams
 Metrics data that indicate a problem should not be considered
“negative”
 Such data are merely an indicator for process improvement
 Don’t obsess on a single metric to the exclusion of other important
metrics

86
Thank You!
Q?

87
