Reviews - Workshop

Saud Bahwan Group
CSD
Why Reviews?
Isn't Testing Enough?
- Reviews improve schedule performance.
- Reviews reduce rework.
  - Rework accounts for 44% of development cost!
  - Requirements (1%), Design (12%), Coding (12%), Testing (19%)
- Reviews are pro-active tests.
  - They find errors that cannot be found through testing.
- Reviews are training.
  - Domain knowledge, corporate standards, group practices.
Why review? We test!
[Figure: two project timelines, with and without reviews. Inserting a review (R) after Requirements, Design, and Code shortens the Test phase and the overall schedule.]
Industry Experiences
- Aetna Insurance Company:
  - FTR found 82% of errors, with a 25% cost reduction.
- Bell-Northern Research:
  - Inspection cost: 1 hour per defect.
  - Testing cost: 2-4 hours per defect.
  - Post-release cost: 33 hours per defect.
- Hewlett-Packard:
  - Estimated inspection savings per year: $21,454,000.
- IBM (using Cleanroom):
  - C system software
  - No errors from the time of first compile.
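To see what the Bell-Northern cost-per-defect rates imply, here is a small illustration. The defect distributions are invented for the example, and the testing rate uses the midpoint of the quoted 2-4 hours; only the hours-per-defect rates come from the slide.

```python
# Hours-per-defect rates from the Bell-Northern Research figures above;
# the "testing" rate is the midpoint of the quoted 2-4 hours.
HOURS_PER_DEFECT = {"inspection": 1, "testing": 3, "post-release": 33}

def removal_cost(defects_by_stage):
    """Total effort (hours) to remove the given defects at each stage."""
    return sum(HOURS_PER_DEFECT[stage] * n for stage, n in defects_by_stage.items())

# 100 defects: catching most of them in inspection vs. letting them slip.
early = removal_cost({"inspection": 80, "testing": 15, "post-release": 5})
late  = removal_cost({"inspection": 0,  "testing": 70, "post-release": 30})
print(early, late)  # 290 1200
```

The same defect count costs roughly four times as much effort to remove when most defects slip past inspection.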
Defect amplification and removal (1)
[Figure: defect amplification model. Each development step receives errors from the previous step; some pass through, some are amplified 1:x, and new errors are generated. A percentage of errors is detected and removed, and the rest are passed to the next step.]
Defect amplification and removal (2)

Without reviews (detection efficiency 0% before testing):

Step                 | From prev. | Amplified (1:0.5) | New | Detection | Passed on
Preliminary design   |      0     |         0         |  10 |     0%    |    10
Detailed design      |     10     |         5         |  25 |     0%    |    40
Coding               |     40     |        20         |  25 |     0%    |    85
Unit testing         |     85     |         0         |   0 |    50%    |    42
Integration testing  |     42     |         0         |   0 |    50%    |    21

Assumption: amplification 1:0.5 (each error passed on generates half an additional error in the next development step).
Defect amplification and removal - revisited

With reviews (detection efficiency 50% in every step; values without reviews in parentheses):

Step                 | From prev. | Amplified (1:0.5) | New | Detection | Passed on
Preliminary design   |      0     |         0         |  10 |    50%    |   5 (10)
Detailed design      |      5     |         3         |  25 |    50%    |  16 (40)
Coding               |     16     |         8         |  25 |    50%    |  25 (85)
Unit testing         |     25     |         0         |   0 |    50%    |  12 (42)
Integration testing  |     12     |         0         |   0 |    50%    |   6 (21)

Assumption: amplification 1:0.5
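The propagation arithmetic behind both tables can be sketched as a short model. Fractions are kept internally and rounded only for display, which reproduces the tabulated values:

```python
def propagate(phases):
    """Push errors through development steps.

    Each step receives the errors passed on by the previous one; a fraction
    of them is amplified (1:ratio), newly generated errors are added, and a
    detection step removes `efficiency` of the total. Fractions are kept
    internally and only rounded for display.
    """
    passed = 0.0
    out = []
    for amplification, new, efficiency in phases:
        total = passed * (1 + amplification) + new
        passed = total * (1 - efficiency)
        out.append(round(passed))
    return out

# (amplification ratio, newly generated errors, detection efficiency) per
# step: preliminary design, detailed design, coding, unit test, integration.
no_reviews   = [(0.5, 10, 0.0), (0.5, 25, 0.0), (0.5, 25, 0.0),
                (0.0, 0, 0.5), (0.0, 0, 0.5)]
with_reviews = [(0.5, 10, 0.5), (0.5, 25, 0.5), (0.5, 25, 0.5),
                (0.0, 0, 0.5), (0.0, 0, 0.5)]
print(propagate(no_reviews))    # [10, 40, 85, 42, 21]
print(propagate(with_reviews))  # [5, 16, 25, 12, 6]
```

With 50% detection at every step, 6 errors instead of 21 survive integration testing.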
What can we do to remove errors in early development phases?
Non-execution-based testing!
- Reviews: inspections & walkthroughs
- Presentations
The later a change is addressed, the greater the cost, risk, and duration.
[Figure: cost, risk, and duration rising the later in the lifecycle a change is addressed.]
Reviews: an effectiveness scale
From most effective to least (decreasing formality):
1. inspection (FTR)
2. walkthrough (FTR)
3. formal presentation
4. informal presentation
5. peer group review
6. casual conversation
Effectiveness of inspections
- [Fagan 1976] inspections of design & code:
  - 67%-82% of all faults were found by inspections.
  - 25% of programmer resources were saved (despite the cost of inspections).
- [Fagan 1986]:
  - 93% of all faults were found by inspections.
- Cost reduction for fault detection (compared with testing):
  - [Ackerman, Buchwald, Lewski 1989]: 85%
  - [Fowler 1986]: 90%
  - [Bush 1990]: US$25,000 saved PER inspection
Why review? Who benefits?
- Formal technical review provides:
  - Defect information to the author.
  - Information on the work product and its development to peers.
  - Fault likelihood data to testers.
  - Product status to management.
  - Process status to the SPI group.
Objectives of reviews
- Uncover errors in any representation of the software.
- Verify that:
  - the software meets its requirements
  - the software follows predefined standards
  - the software is developed in a uniform manner
- Make projects more manageable.
- Educate new team members.
Basic Review Principles

What is Formal Technical Review?
- A method involving a structured encounter in which a group of technical personnel analyzes or improves the quality of the original work product, as well as the quality of the method.
True FTR is well-defined
- Well-defined process
  - Phases (kick-off, etc.)
  - Procedures (checklists, etc.)
- Well-defined roles
  - Coordinator, Reviewer, Recorder, Author, etc.
- Well-defined objectives
  - Defect removal, requirements elicitation, etc.
- Well-defined measurements
  - Forms, consistent data collection, etc.
FTR is effective quality improvement
- Reviews can find 60-100% of all defects.
- Reviews are technical, not managerial.
- Review data can assess and improve the quality of:
  - the work product
  - the software development process
  - the review process
- Reviews reduce total project cost, but have a non-trivial cost (~15%).
- Upstream defect removal is 10-100 times cheaper than downstream removal.
- Reviews disseminate domain knowledge, development skills, and corporate culture.
Who, What, and When
- Who decides what should be reviewed?
  - Senior technical personnel, project leader.
- What should be reviewed?
  - Work products with high impact on project risks.
  - Work products directly related to quality objectives.
  - "Upstream" work products have higher impact.
- When should reviews be planned?
  - Specify the review method and target work products in the project plan / software development plan / quality plan.
The range of review practices
[Figure: review practices classified by development method (non-Cleanroom vs. Cleanroom), FTR vs. non-FTR, and manual vs. tool-based.]
- Non-FTR: Walkthrough (Yourdon89), Code Reading (McConnell93)
- FTR, manual: Code Inspection (Fagan76), Inspection (Gilb93), 2-Person Inspection (Bisant89), N-Fold Inspection (Martin90), Active Design Reviews (Parnas85), Phased Insp. (Knight93), Software Review (Humphrey90)
- FTR, Cleanroom: Verification-based Inspection (Dyer92)
- FTR, tool-based: TekInspect, FTArm (Johnson94), ICICLE (Brothers90), Scrutiny (Gintell93), CAIS (Mashayekhi94)
How to carry out reviews: the review team
- review leader
- recorder
- producer
- reviewer(s)
Basic guidelines (1)
- 3-6 people (typical):
  - experienced senior technical staff
  - representatives of:
    - the team that created the document
    - the client
    - the team for the next development phase
    - the software quality assurance group
- IEEE Standard for Software Reviews and Audits [IEEE 1028, 1988]
Basic guidelines (2)
- The review leader should be an SQA representative:
  - has the most to lose
  - creator: eager to get approval (to start the next job)
  - client: can wait for acceptance testing
- The review leader distributes the material.
- Advance preparation: max. 2 hours before the meeting.
- Duration: less than 2 hours.
Result of a review
- Decision about the product:
  - accept without further modification
  - reject the work due to severe errors (the review must be repeated)
  - accept with minor modifications (that can be incorporated into the document by the producer)
- All participants have to sign off:
  - shows participation / responsibility
  - shows their concurrence with the findings
Reviewer's preparation
- Be sure that you understand the context.
- First, skim all the product material to understand the location and format of the information.
- Next, read the product material and annotate a hardcopy.
- Pose your written comments as questions.
- Avoid issues of style.
- Inform the review leader if you can't prepare.
Families of Review Methods

Method Family     | Typical Goals                 | Typical Attributes
Walkthroughs      | Minimal overhead;             | Little/no preparation;
                  | developer training;           | informal process;
                  | quick turnaround              | no measurement. Not FTR!
Technical Reviews | Requirements elicitation;     | Formal process;
                  | ambiguity resolution;         | author presentation;
                  | training                      | wide range of discussion
Inspections       | Detect and remove all defects | Formal process; checklists;
                  | efficiently and effectively   | measurements; Verify phase
Reviews
Types of Reviews
- Walkthroughs
- Inspections
- Audits
Walkthrough
- The producer guides the review.
- Many variations.
- Presentation reviews: many details are overlooked; the presentation overshadows the review.
- Ego is a key problem.
Inspection
- Formal process.
- Requires intensive advance preparation.
- Checklists are utilized.
- Many variations.
- Product reviews.
Audits
- External review.
- Audit the product.
- Audit the process.
The Process: Roles, Rules, Reports
- Roles: Coordinator, Presenter (producer), Reviewers, Recorder
- Rules: etiquette, order, issues
- Reports: deliverables, Recorder report, Review report, Audit
Monitor (Review Leader)
- Technical, a leader, a mediator, well-organized.
- Calls the meeting to order.
- Calls on people to bring issues.
- Assures recording.
- Takes votes.
Reviewer
- Receives deliverables early.
- Reviews deliverables prior to the meeting.
- Addresses problems known prior to the meeting.
- Comes to the meeting with issues prepared.
- Held accountable for the review.
Guidelines for Reviewers
- MUST BE PREPARED.
- Raise issues, not resolutions.
- Avoid discussions of style.
- Record issues in public.
- Stick to technical issues.
- No managers (PR HOLDERS).
Recorder
- Records all issues with all necessary information.
- Not a secretary.
- Records votes.
Presenter (producer)
- Presents the material objectively.
- Has the team for support.
- Etiquette: address the product, not the person; good before bad.
- Order: proceed until complete (present, review, raise issues, record).
- Issues: state only what the problem is, not the resolution.
Conducting the review
- Be prepared: evaluate the product before the review.
- Develop a checklist for each kind of work product.
- Review the product, not the producer.
- Keep your tone mild; ask questions instead of making accusations.
- Stick to the review agenda.
- Raise issues, don't resolve them!
- Limit discussions (take them offline!).
- Avoid discussions of style; stick to technical correctness.
- Schedule reviews as project tasks (allocate resources).
- Record and report all review results.
An Exemplary "Generic" Inspection Process
Effort for Technical Reviews

                                               Documents (except code)   Code
Maximum document/code size for one review      50 pages                  20 pages
Number of reviewers                            5                         3
Review preparation (relative)                  10 pages/hr               5 pages/hr
Effort, review preparation                     25 hrs                    12 hrs
Effort, review session (absolute)              14 hrs                    10 hrs
Total review effort                            5 person days             3 person days
Effort for document/code creation (relative)   2 pages/day               1 page/day
Effort for document/code creation (absolute)   25 person days            20 person days
Review effort vs. creation effort              20%                       15%
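The figures in the effort table are internally consistent. A quick sketch of the arithmetic, assuming an 8-hour person day (the table itself only gives rounded person-day totals):

```python
# Recomputing the review-effort figures from the table above.
# Inputs per column: size reviewed, number of reviewers, preparation rate,
# session effort, and creation rate. A person day is taken as 8 hours.

def review_effort(pages, reviewers, prep_rate_pages_per_hr,
                  session_hrs, creation_rate_pages_per_day):
    prep_hrs = pages / prep_rate_pages_per_hr * reviewers
    total_days = (prep_hrs + session_hrs) / 8       # total review effort
    creation_days = pages / creation_rate_pages_per_day
    return prep_hrs, total_days, creation_days, total_days / creation_days

doc = review_effort(50, 5, 10, 14, 2)   # documents (except code)
code = review_effort(20, 3, 5, 10, 1)   # code
print(doc)   # (25.0, 4.875, 25.0, 0.195)   -> ~5 person days, ~20%
print(code)  # (12.0, 2.75, 20.0, 0.1375)  -> ~3 person days, ~15%
```

The recomputed ratios round to the table's 20% and 15%: review effort is a fraction of creation effort.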

The "Generic" Inspection Process
- Choose team, materials, dates.
- Present product, process, goals.
- Check product, note issues.
- Consolidate issues.
- Correct defects.
- Verify product/process quality.

Phases: Planning -> Kick-Off -> Preparation -> Review Meeting -> Rework -> Verify
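The six phases form a strict pipeline: each phase must complete before the next starts. A minimal sketch (my own illustration, not from the slides) of that ordering as a state machine:

```python
# The generic inspection process as a linear state machine; each phase
# must complete before the next begins. Names follow the phase list above.
PHASES = ["Planning", "Kick-Off", "Preparation",
          "Review Meeting", "Rework", "Verify"]

class Inspection:
    def __init__(self, work_product):
        self.work_product = work_product
        self.phase_index = 0          # start in Planning

    @property
    def phase(self):
        return PHASES[self.phase_index]

    def advance(self):
        """Complete the current phase and move to the next one."""
        if self.phase_index == len(PHASES) - 1:
            raise RuntimeError("inspection already in final phase")
        self.phase_index += 1
        return self.phase

insp = Inspection("Software Quality Plan")
insp.advance()        # Planning -> Kick-Off
print(insp.phase)     # Kick-Off
```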
Planning
- Objectives
  - Gather the review package: work product, checklists, references, and data sheets. (PL & Review Coordinator)
  - Form the inspection team (review members).
  - Determine dates for meetings.
- Procedure
  - Review Coordinator assembles the team and review package.
  - Review Coordinator enhances the checklist if needed.
  - Review Coordinator plans dates for meetings.
  - Review Coordinator checks the work product for readiness.
  - Review Coordinator helps the Author prepare an overview.
Planning Kick-Off Preparation Review Mt Rework Verify
Example Planning Data
Planning  1. Inspection ID ____________   Date: _____________
2. Team
   Moderator  _________________________
   Authors    _________________________   _________________________
   Reviewers  _________________________   _________________________
              _________________________   _________________________
3. Documents
   Work Product  _________________________   _________________________
   References    _________________________   _________________________
   Checklists    _________________________   _________________________
4. Meetings        Date          Location      Start         End
   Orientation     ____________  ____________  ____________  ____________
   Review Meeting  ____________  ____________  ____________  ____________
5. Planning    Q References obtained for work product.
   Objectives  Q Checklists obtained for work product.
               Q Moderator is trained in the TekInspect method.
               Q Team members agree to proposed times/dates.
               Q Moderator's quick review yields less than 5 major issues.
               Q Reviewers understand responsibilities and are committed.
6. Plan. Effort ________ min
Kick-Off
- Objectives
  - Author provides an overview.
  - Reviewers obtain the review package.
  - Preparation goals are established.
  - Reviewers commit to participate.
- Procedure
  - Moderator distributes the review package.
  - Author presents an overview, if necessary.
  - Scribe duty for the Review Meeting is assigned.
  - Moderator reviews the preparation procedure.
Example Orientation Data
Orientation  7. Prep. Goals  ____ min/pg x ____ pgs. = ____ prep time/reviewer
8. Orient.     Q Reviewers understand the scope and purpose of the work product.
   Objectives  Q Reviewers understand the checking process, checklists, and references.
               Q Work product, references, checklists, and checking forms provided.
9. Orient. Effort  ____ min. meet x ____ particip. = ____ min
Preparation
- Objectives
  - Find the maximum number of non-minor issues.
- Procedure for reviewers:
  - Allocate the recommended time to preparation.
  - Perform an individual review of the work product.
  - Use checklists and references to focus attention.
  - Note critical, severe, and moderate issues on the Reviewer Data Form.
  - Note minor issues and author questions on the work product.
Example Issue Classification
- Critical
  - Defects that may cause the system to hang, crash, produce incorrect results or behavior, or corrupt user data. No known work-arounds.
- Severe
  - Defects that cause incorrect results or behavior, with known work-arounds. Large and/or important areas of the system are affected.
- Moderate
  - Defects that affect limited areas of functionality, and can either be worked around or ignored.
- Minor
  - Defects that can be overlooked with no loss of functionality.
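The classification above maps naturally onto a small data structure, so issues can be filtered and counted by severity on the data sheets. A sketch; the type and field names are my own, not from the TekInspect forms:

```python
# Issue severity levels from the classification above, as an ordered enum
# plus a record type. Names here are illustrative, not from the slides.
from dataclasses import dataclass
from enum import IntEnum

class Severity(IntEnum):     # ordered: higher value = more severe
    MINOR = 1
    MODERATE = 2
    SEVERE = 3
    CRITICAL = 4

@dataclass
class Issue:
    location: str            # e.g. page/line in the work product
    severity: Severity
    description: str

def non_minor(issues):
    """Issues that belong on the Reviewer Data Form (minor ones stay
    annotated on the work product itself)."""
    return [i for i in issues if i.severity > Severity.MINOR]

issues = [
    Issue("p.3", Severity.CRITICAL, "crash on empty input, no work-around"),
    Issue("p.7", Severity.MINOR, "typo in heading"),
]
print(len(non_minor(issues)))  # 1
```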
Example checklist
Checklist for Software Quality Plans
Q 1. Does the plan reference the Tektronix Test Plan process document to be used in this project?
Q 2. Does the plan list the set of measurements to be used to assess the quality of the product?
Q 3. Is a rationale provided for each feature to be tested?
4. According to this document, what features won't be tested? Are any missing? List all below:
______________________________________________________________________________
______________________________________________________________________________
Q Does the plan provide a rationale for why each of these features will not be tested?
5. How well does the plan describe how tests will be traced back to requirements?
Check one of the following:
Q Very well   Q Fairly well   Q Poorly   Q No traceability
6. Refer to the corresponding software development plan. Does the quality plan discuss each of the test milestones and test transmittal events from this document?
Check all that apply:
Q I cannot access the software development plan.
Q The software development plan has no test milestones.
Q The software development plan has no test transmittal events.
Q The quality plan has no test milestones.
Q The quality plan has no test transmittal events.
Q Both documents include the same set of test milestones and test transmittal events.
Example references
- Corporate standards:
  - Procedure for Software Quality Plans
- Exemplary documents:
  - Foo System Software Quality Plan
- High-quality reference texts:
  - Software Quality: Concepts and Plans, Ch. 13 (Plan following an industrial model), Robert Dunn.
- On-line resources:
  - http://flute.lanl.gov/SWQA/SMP.html
Example Preparation Data
1. Inspection ID _________   2. Document: _____________   3. Name: ________________
4. Critical, Severe, and Moderate Issues
   Num    Location  Severity  Chk/Ref  Description
   _____  _______   ______    ______   ______________________________________________________
   _____  _______   ______    ______   ______________________________________________________
   _____  _______   ______    ______   ______________________________________________________
5. Effort: ____ min   6. Issue Totals  ________  ________  ________  ________  ________
                                       critical  severe    moderate  minor     author Qs
7. Preparation  Q Work product has been completely checked.
   Objectives   Q All critical, severe, and moderate issues are noted on this form.
                Q All minor issues and author questions are noted on the work product.
Why not write on the work product?
- Advantages of the Reviewer Data Sheet:
  - Minor issues are "pre-filtered" from the review meeting, saving meeting time.
  - Reviewers articulate issues clearly during preparation, saving meeting time.
  - Preparation statistics gathering is simplified.
  - Preparation effectiveness (% true defects, % redundancy) and checklist effectiveness become measurable.
  - Issues can be presented in order of importance.
Why not write on the work product?
- Disadvantages of the Reviewer Data Sheet:
  - Requires extra time (15 minutes?).
  - Discourages last-minute preparation.
  - Makes the quality of preparation more visible.
Review Meeting
- Objectives
  - Create a consolidated, comprehensive listing of non-minor issues.
  - Provide an opportunity for group synergy.
  - Improve reviewing skill by observing others.
  - Create shared knowledge of the work product.
- Procedure
  - Moderator requests issues sequentially.
  - Reviewers raise issues.
  - Recorder notes issues on the Scribe Data Sheet.
  - The Scribe Data Sheet is visible to everyone.
Example Review Meeting Data
Review Meeting: Aggregate Meeting Data
                     R    R    R    R    R    R    Total
1. Prep. Effort     __ + __ + __ + __ + __ + __ = ____ min
2. Critical Iss.    __ + __ + __ + __ + __ + __ = ____ iss.
3. Severe Iss.      __ + __ + __ + __ + __ + __ = ____ iss.
4. Moderate Iss.    __ + __ + __ + __ + __ + __ = ____ iss.
5. Minor Iss.       __ + __ + __ + __ + __ + __ = ____ iss.
6. Author Q's       __ + __ + __ + __ + __ + __ = ____
Review Meeting Objectives:
   Q All reviewers prepared sufficiently for the meeting.
   Q All issues noted by the Scribe ...
   Q ...
7. R.M. Effort  ____ min. meet x ____ particip. = ____ min
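The aggregation this data sheet performs, summing each reviewer's preparation data into meeting totals, can be sketched as follows (field names are my own):

```python
# Summing per-reviewer data sheets into aggregate meeting totals,
# as on the Review Meeting data sheet above. Field names are illustrative.
from collections import Counter

def aggregate(reviewer_sheets):
    """Sum preparation effort and issue counts across reviewer data sheets."""
    totals = Counter()
    for sheet in reviewer_sheets:
        totals.update(sheet)
    return dict(totals)

sheets = [
    {"prep_min": 90, "critical": 1, "severe": 2, "moderate": 3, "minor": 5},
    {"prep_min": 60, "critical": 0, "severe": 1, "moderate": 4, "minor": 2},
]
print(aggregate(sheets))
# {'prep_min': 150, 'critical': 1, 'severe': 3, 'moderate': 7, 'minor': 7}
```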
Rework
- Objectives
  - Assess each issue, determine if it is a defect, and remove it if necessary.
  - Produce a written disposition for each non-minor issue.
  - Resolve minor issues as necessary.
Rework (cont.)
- Procedure
  - Author obtains the Scribe Data Sheet containing the consolidated issues list, as well as copies of the work products.
  - Author assesses each issue and notes the action taken on the Author Data Sheet.
  - Author determines the 'type' of each defect (reqs/spec/design/imp, etc.).
  - When finished, the Author provides the Author Data Sheet and the reworked product to the Moderator for Verify.
Example Rework Data
1. Inspection ID _________   2. Document ____________   3. Author ____________
4. Issue Disposition
   Num    Fixed  Type        Explanation
   _____  ____   __________  _____________________________________________________________
   _____  ____   __________  _____________________________________________________________
   _____  ____   __________  _____________________________________________________________
5. Effort ____ min
6. Rework      Q Outcomes of all Review Meeting Data Sheet issues are noted on this form.
   Objectives  Q All minor issues have been addressed.
               Q No known defects remain in the work product.
Verify
- Objectives
  - Assess the (reworked) work product quality.
  - Assess the inspection process.
  - Pass or fail the work product.
- Procedure for the moderator:
  - Obtain the reworked product and Author Data Sheet.
  - Review the work product and data sheet for problems.
  - Provide a recommendation for the work product.
  - Perform sign-off with the reviewers.
  - Compute summary statistics for the inspection.
  - Generate any process improvement proposals.
  - Enter the review data into the quality database.
Example Verify Data
Verify       Total    Planning        ____ (minutes)
Effort                Orientation     ____ (minutes)
                      Preparation     ____ (minutes)
                      Review Meeting  ____ (minutes)
                      Rework          (see Rework Data Sheet)
                      Total Inspection Effort  ____
Total        Critical ____  (all from Rework Data Sheet)
Defects      Severe   ____
Removed      Moderate ____
             Minor    ____
             Total Defects Removed  ____
Meth.        Q Reviewer forms were not filled out completely.
Variations   Q Review Meeting involved issue discussion and resolution.
             Q Checklists did not appear to be helpful.
             Q References did not appear to be helpful.
             Q Other:
Verify       Q Moderator's quick review yields less than 5 major issues.
Objectives   Q Moderator has collected all TekInspect forms for filing.
             Q Moderator has entered data into the quality engineering database.
Process
Improvement
Inspection   Q Pass
Status       Q Conditional pass:
             Q Fail:
Moderator signature: __________   Date: ______
I agree/disagree with the moderator's decision:
Q Agree   Q Disagree   Date: ______
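The summary statistics the Moderator computes in Verify can be sketched as follows; the names and the minutes-per-defect metric are my own illustration of the totals on the form above:

```python
# Summary statistics for an inspection: total effort across phases,
# total defects removed, and effort per defect. Names are illustrative.
def verify_summary(phase_minutes, defects_removed):
    """phase_minutes: minutes per phase; defects_removed: count per severity."""
    total_min = sum(phase_minutes.values())
    total_defects = sum(defects_removed.values())
    cost = total_min / total_defects if total_defects else None  # min/defect
    return {"effort_min": total_min,
            "defects": total_defects,
            "minutes_per_defect": cost}

summary = verify_summary(
    {"planning": 60, "orientation": 30, "preparation": 300,
     "review_meeting": 240, "rework": 180},
    {"critical": 1, "severe": 4, "moderate": 10, "minor": 12},
)
print(summary)  # {'effort_min': 810, 'defects': 27, 'minutes_per_defect': 30.0}
```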

Why Reviews?
Isn¶t Testing Enough?

Why review? We test!
R R R ‡ Reviews improve schedule performance Req
Revs No Revs.

Design

Code

Test

‡ Reviews reduce rework.
± Rework accounts for 44% of dev. cost! ± Reqs (1%), Design (12%), Coding (12%), Testing (19%)

‡ Reviews are pro-active tests.
± Find errors not possible through testing.

‡ Reviews are training.
± Domain, corporate standards, group.

Industry Experiences .1 .

Industry Experiences .2 .

Industry Experiences .3 .

Industry Experiences .4 .

000 ‡ IBM (using Cleanroom) ± C system software ± No errors from time of first compile. ± Post-release cost: 33 hours per defect. ‡ Bell-Northern Research: ± Inspection cost: 1 hour per defect. . inspection savings/year: $21.454.Industry Experiences . 25% cost reduction. ‡ Hewlett-Packard ± Est.5 ‡ Aetna Insurance Company: ± FTR found 82% of errors. ± Testing cost: 2-4 hours per defect.

Defect amplification and removal (1) Errors passed through Amplified errors 1:x Errors from previous step Newly generated errors Percent efficiency for error detection Errors passed to next step .

Preliminary design Defect amplification and removal (2) Detailed design 0 0 10 Assumption: amplification 1:0.5 Coding 40 0% 10 10 5 25 0% 40 20 25 0% 85 Unit testing 85 85 0 0 50% 42 Integration testing 42 0 0 50% 21 .

5 Coding 5 3 25 50% 16 (40) 16 8 25 50% 25 (85) Unit testing 25 (85) 25 0 0 50% 12 (42) Integration testing 12 0 0 50% 6 (21) .Preliminary design Defect amplification and removal .revisited 0 0 10 Detailed design 50% 5 (10) Assumption: amplification 1:0.

What can we do to remove errors in early development phases Non-execution-based testing ! ± Reviews: Inspections & walkthroughs ± Presentations .

The later a change is addressed. risk and duration Cost. Risk. Duration . the greater the cost.

Reviews: an effectiveness scale most effective inspection (FTR) walk through (FTR) formal presentation informal presentation peer group review casual conversation formality .

Buchwald.Effectiveness of inspections ‡ [Fagan 1976] inspections of design & code ± 67%-82% of all faults were found by inspections ± 25% time saved on programmer resources (despite inspections) ‡ [Fagan 1986] ± 93% of all faults were found by inspections ‡ Cost reduction for fault detection (compared with testing) ± [Ackerman. Lewski 1989]: 85% ± [Fowler 1986]: 90% ± [Bush 1990]: 25.000US$ saved PER inspection .

± Information on work product and development to peers.Why review? Who benefits? ‡ Formal technical review provides: ± Defect information to the author. ± Product status to management. ± Process status to SPI group. ± Fault likelihood data to testers. .

Objectives of reviews ‡ Uncover errors in any representation of software ‡ Verify that ± software meets its requirements ± software follows predefined standards ± software is developed in uniform manner ‡ Make projects more manageable ‡ Educate new team members .

Basic Review Principles .

What is Formal Technical Review? ‡ A method involving a structured encounter in which a group of technical personnel analyzes or improves the quality of the original work product as well as the quality of the method. .

consistent data collection. etc. Reviewer.True FTR is well-defined ‡ Well-defined process ± Phases (kick-Off. etc. etc. . etc. requirements elicitation. Recorder.) ± Procedures (checklists. ‡ Well-defined objectives ± Defect removal. Author.) ‡ Well-defined roles ± Coordinator. etc. ‡ Well-defined measurements ± Forms.

‡ Review data can assess/improve quality of: ± work product ± software development process ± review process ‡ Reviews reduce total project cost. ‡ Reviews are technical. but have non-trivial cost (~15%) ‡ Upstream defect removal is 10-100 times cheaper. . and corporate culture. ‡ Reviews disseminate domain knowledge.FTR is effective quality improvement ‡ Reviews can find 60-100% of all defects. not management. development skills.

What. and When ‡ Who decides what should be reviewed? ± Senior technical personnel. . project leader ‡ What should be reviewed? ± Work products with high impact upon project risks. ± Work products directly related to quality objectives. ‡ When should review be planned? ± Specify review method and target work products in project plan/software development plan/quality plan.Who. ± ³Upstream´ work products have higher impact.

The range of review practices Development Method Non-Cleanroom inFTR Walkthrough (Yourdon89) Code Reading (McConnell93) Tool-Based Active Design Reviews (Parnas85) FTR Manual Code Inspection (Fagan76) Cleanroom Verificationbased Inspection (Dyer92) Software Review (Humphrey90) Inspection (Gilb93) TekInspect ICICLE (Brothers90) Phased Insp. (Knight93) 2-Person Inspection (Bisant89) N-Fold Inspection (Martin90) FTArm (Johnson94) Scrutiny (Gintell93) CAIS (Mashayekhi94) .

How to carry out reviews: The review team producer review leader reviewer(s) recorder .

Basic guidelines (1) ‡ 3-6 people (typical) ± experienced senior technical staff ± representatives of ‡ team that created the document ‡ client representative ‡ team for next development phase ‡ software quality assurance group ‡ IEEE Standard for Software Reviews and Audits [IEEE 1028. 1988] .

2 hours before the meeting ‡ Duration: less than 2 hours .Basic guidelines (2) ‡ Review leader should be SQA representative ± has the most to lose ± creator: eager to get approval (to start next job) ± client: can wait for acceptance testing ‡ Review leader distributes material ‡ Advance preparation of max.

Result of a review ‡ Decision about the product ± accept without further modification ± reject the work due to severe errors (review must be repeated) ± accept with minor modifications (that can be incorporated into the document by the producer) ‡ All participants have to sign-off ± shows participation / responsibility ± shows their concurrence with the findings .

read product material and annotate hardcopy ‡ pose your written comments as questions ‡ avoid issues of style ‡ inform the review leader if you can¶t prepare .Reviewer¶s preparation ‡ be sure that you understand the context ‡ first. skim all the product material to understand location and format of the information ‡ next.

Families of Review Methods Meth d Family Typical G al Minimal rh ad D l p r training Typical Attributes Little/n preparati n Inf rmal pr cess No measurement Not FT ! Walkthroughs Quick turnar und Technical Reviews quir m nt licitati n Ambiguity r luti n Training Formal process Author presentation Wide range of discussion Formal process Checklists Measurements Verify phase Inspections Detect and remove all defects efficiently and effectively .

Reviews Types of Reviews ‡ ‡ ‡ Walkthroughs Inspections Audits .

Reviews Walkthrough producer guides review many variations presentation reviews overlook of many details presentation overshadows review Ego is a key problem .

Reviews Inspection Formal process Requires intensive advance preps checklists utilized many variations product reviews .

Reviews Audits external review audit the product audit the process .

Review Report. Issues Reports: Deliverables. Recorder report. Order. Rules. Recorder Rules: Etiquette.Reviews The Process Roles. Presenter(producer). Reports Roles: Coordinator. Reviewers. Audit .

leader.Reviews Monitor . mediator.Review Leader (technical. well-organized) calls meeting to order calls on people to bring issues assures recording takes votes .

Reviews Reviewer .receives deliverables early reviews deliverables prior to meeting addresses problems known prior to meeting reviews in meeting with issues held accountable for review .

Reviews Guidelines for Reviewer MUST BE PREPARED Raise Issues. not resolutions Avoid discussions of style Record issues in public Stick to technical issues No managers (PR HOLDERS) .

Reviews Recorder records all issues with all necessary information not a secretary records votes Presenter (producer) presents material objectively has team for support .

Reviews
Etiquette - product not person good before bad Order do until complete present, review, issues, recording Issue: not resolution only what is the problem

Conducting the review
‡ ‡ ‡ ‡ ‡ ‡ ‡ ‡ ‡ ‡ Be prepared - evaluate product before review develop check list for each kind of work product review the product, not the producer keep your tone mild, ask questions instead of making accusations stick to the review agenda raise issues, don¶t resolve them! limit discussions (do them off line!) avoid discussions of style - stick to technical correctness schedule reviews as project tasks (allocate resources) record and report all review results

An Exemplary ³Generic´ Inspection Process

Effort for Tec  ical Revie s  ' RFXP HQW  V H[FHSW FRGH.

  & RGH  0 D[L XP  RFXP HQW & RGH L I RQH HYL  P '  6 ]H RU 5 HZ  1 XP EHU  HYL HUV RI 5 HZ  5 HYL  UHSDUDWRQ DWYH.

 HZ 3 L UHO L ( I RUW 5 HYL  UHSDUDWRQ I  HZ 3 L ( I RUW 5 HYL  HVVL DEVRO H.

 I  HZ 6 RQ XW 7 RW  HYL  I RUW  DO5 HZ ( I   SDJHV     SDJHV KUV     KUV    KUV   SDJHV     SDJHV  KUV   KUV    KUV    SHUVRQ GD\V  SHUVRQ GD\V  SDJHV GD\   SHUVRQ GD\V   SDJH GD\   SHUVRQ GD\V  ( I RUW I ' RFXP HQW & RGH UHDWRQ DWYH.

 I  RU  & L UHO L  ( I RUW I ' RFXP HQW & RGH UHDWRQ I  RU  & L DEVRO H.

 XW 5 HYL  I RUW YV ( I RUW I ' RFXP HQW & RGH HZ ( I   I  RU  & UHDWRQ L      15 .

dates. note issues. ‡ Correct defects.The ³Generic´ Inspection Process Planning ‡ Choose team. materials. ‡ Consolidate issues. goals. ‡ Present product. process. ‡ Check product. ‡ Verify product/process quality Kick-Off Preparation Review Meeting Rework Verify .

checklists. Review Coordinator plans dates for meetings. Review Coordinator checks work product for readiness. . ± Determine dates for meetings.Planning Planning Kick-Off Preparation Review Mt Rework Verify ‡ Objectives ± Gather review package: work product. and data sheets. Review Coordinator helps Author prepare overview. references. Review Coordinator enhances checklist if needed. ‡ Procedure ± ± ± ± ± Review Coordinator assembles team and review package. (PL & Review Coordinator) ± Form inspection team (Review members).

Q hecklists obtained or ork product. Q Team members agree to proposed times/dates. Q oderator is trained in TekInspect method. Team oderator _________________________ Authors _________________________ evie ers _________________________ Date: _____________ _________________________ _________________________ 3. Q oderator's quick revie yields less than 5 ma or issues. Documents Work Product _________________________ e erences _________________________ hecklists _________________________ _________________________ _________________________ _________________________ _________________________ _________________________ _________________________ _________________________ tart End 4.Example Planning Data Pla i g 1. min 6. eetings Date ocation Orientation ____________ evie eeting ____________ ____________ ____________ ____________ ____________ ____________ ____________ 5. Q evie ers understand responsibilities and are committed. Planning Ob ectives Q e erences obtained or ork product. Inspection ID ____________ 2. ort . lan.

Preparation goals established. Moderator reviews preparation procedure. Scribe duty for Review Meeting assigned. if necessary. Reviewers commit to participate. Reviewers obtain review package.Kick-Off Planning Kick-Off Preparation Review Mt Rework Verify ‡ Objectives ± ± ± ± ± ± ± ± Author provides overview. Moderator distributes review package. Author presents overview. ‡ Procedure .

rep. and checking orms provided.Example Orientation Data Orientation 7. meet x particip. ort min/pg x pgs. = prep time/revie er Q evie ers understand scope and purpose o ork product. checklists. oals 8. = min . checklists. min. Q Work product. and re erences. Orient. Orient. Ob ectives 9. Q evie ers understand checking process.. re erences.

and moderate issues on Reviewer Data Form. Perform individual review of work product. Note critical. severe.Preparation Planning Orientation Preparation Review Mt Rework Verify ‡ Objectives ± Find maximum number of non-minor issues. ‡ Procedure for reviewers: Allocate recommended time to preparation. ± ± ± ± . Use checklists and references to focus attention. ± Note minor issues and author questions on work product.

No known work-arounds. . ‡ Severe ± Defects that cause incorrect results or behavior with known work-arounds. ‡ Minor ± Defects that can be overlooked with no loss of functionality.Example Issue Classification ‡ Critical ± Defects that may cause the system to hang. Large and/or important areas of the system is affected. produce incorrect results or behavior. crash. ‡ Moderate ± Defects that affect limited areas of functionality that can either be worked around or ignored. or corrupt user data.

Example checklist

Checklist for Software Quality Plans

1. Does the plan reference the Tektronix Test Plan process document to be used in this project? Q

2. Refer to the corresponding software development plan. Does the quality plan discuss each of the test milestones and test transmittal events from that document? Check all that apply:
   Q I cannot access the software development plan.
   Q The software development plan has no test milestones.
   Q The software development plan has no test transmittal events.
   Q The quality plan has no test milestones.
   Q The quality plan has no test transmittal events.
   Q Both documents include the same set of test milestones and test transmittal events.

3. Does the plan list the set of measurements to be used to assess the quality of the product? Q

4. According to this document, what features won't be tested? Are any missing? List all below:
   ______________________________________________________________________________
   ______________________________________________________________________________
   ______________________________________________________________________________
   Q Is a rationale provided for each feature to be tested?
   Q Does the plan provide a rationale for why each of these features will not be tested?

5. How well does the plan describe how tests will be traced back to requirements? Check one of the following:
   Q Very well   Q Fairly well   Q Poorly   Q No traceability

Example references

• Corporate standards:
  – Procedure for Software Quality Plans
• Exemplary documents:
  – Foo System Software Quality Plan
• High-quality reference texts:
  – Software Quality: Concepts and Plans, Robert Dunn, Ch. 13 (Plan following an industrial model).
• On-line resources:
  – http://flute.lanl.gov/SWQA/SMP.html

Example Preparation Data

1. Inspection ID _________   2. Document: _____________   3. Name: ________________

4. Critical, Severe, and Moderate Issues
   Num     Location   Chk/Ref   Severity   Description
   _____   _______    ______    ______     ______________________________________________
   _____   _______    ______    ______     ______________________________________________
   _____   _______    ______    ______     ______________________________________________
   _____   _______    ______    ______     ______________________________________________
   _____   _______    ______    ______     ______________________________________________

5. Issue Totals
   ________ critical   ________ severe   ________ moderate   ________ minor   ________ author Qs

6. Effort: ______ min

7. Preparation Objectives
   Q Work product has been completely checked.
   Q All critical, severe, and moderate issues are noted on this form.
   Q All minor issues and author questions are noted on the work product.
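Section 5 of this form is just a tally of the section 4 rows by severity; a minimal sketch (the sample rows are invented for illustration):

```python
from collections import Counter

# Each tuple mirrors one row of section 4:
# (num, location, checklist/reference, severity, description)
rows = [
    (1, "p. 3", "CL-2", "critical", "Quality plan omits test milestones"),
    (2, "p. 7", "CL-5", "moderate", "No rationale for untested feature"),
    (3, "p. 9", "CL-5", "severe",   "Tests not traceable to requirements"),
]

# Section 5, Issue Totals, is a count per severity level:
totals = Counter(severity for (_, _, _, severity, _) in rows)
print(totals["critical"], totals["severe"], totals["moderate"])  # 1 1 1
```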

Why not write on the work product?

• Advantages of Reviewer Data Sheet:
  – Minor issues are “pre-filtered” from the review meeting, saving meeting time.
  – Reviewers articulate issues clearly during preparation, saving meeting time.
  – Issues can be presented in order of importance.
  – Preparation statistics gathering is simplified.
  – Preparation effectiveness (% true defects, % redundancy) and checklist effectiveness are measurable.
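The two effectiveness percentages above are simple ratios over the issues a reviewer raised; a sketch, assuming “true defects” means issues later confirmed as defects and “redundancy” means issues another reviewer also raised:

```python
def preparation_effectiveness(raised: int, true_defects: int, redundant: int):
    """Return (% true defects, % redundancy) for one reviewer's data sheet."""
    return true_defects / raised, redundant / raised

# 20 issues raised; 15 later confirmed as defects; 4 also raised
# by another reviewer:
true_pct, redundant_pct = preparation_effectiveness(20, 15, 4)
print(f"{true_pct:.0%} true defects, {redundant_pct:.0%} redundancy")
# 75% true defects, 20% redundancy
```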

Why not write on the work product?

• Disadvantages of Reviewer Data Sheet:
  – Requires extra time (15 minutes?).
  – Discourages last-minute preparation.
  – Makes quality of preparation more visible.

Review Meeting

Planning  Kick-Off  Preparation  Review Mtg  Rework  Verify

• Objectives
  – Create consolidated, comprehensive listing of non-minor issues.
  – Create shared knowledge of work product.
  – Improve reviewing skill by observing others.
  – Provide opportunity for group synergy.
• Procedure
  – Moderator requests issues sequentially.
  – Reviewers raise issues.
  – Recorder notes issues on Scribe Data Sheet.
  – Scribe Data Sheet is visible to everyone.

Example Review Meeting Data

Review Meeting

1. Aggregate Checking Data
   ______ Critical Iss.   ______ Severe Iss.   ______ Moderate Iss.   ______ Minor Iss.   ______ Prep. Effort (min)

2. Review Meeting Effort: ______ min. meet x ______ particip. = ______ min.

3. Review Meeting Objectives
   Q All issues raised by reviewers are noted by the Scribe.
   Q All reviewers prepared sufficiently for the meeting. List any who did not: ______________

Rework

Planning  Kick-Off  Preparation  Review Mtg  Rework  Verify

• Objectives
  – Assess each issue, determine if it is a defect, and remove it if necessary.
  – Produce written disposition of non-minor issues.
  – Resolve minor issues as necessary.

Rework (cont.)

• Procedure
  – Author obtains Scribe Data Sheet containing consolidated issues list, as well as copies of work products.
  – Author assesses each issue and notes action taken using Author Data Sheet.
  – Author determines the ‘type’ of each defect (reqs/spec/design/imp., etc.).
  – When finished, Author provides Author Data Sheet and reworked product to Moderator to Verify.
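One row of the Author Data Sheet can be modeled as a small record; a sketch whose field names mirror the form above, with the defect-type codes taken from the slide (the class itself and the sample rows are illustrative):

```python
from dataclasses import dataclass

# Defect 'types' the Author assigns during rework (from the slide).
DEFECT_TYPES = {"reqs", "spec", "design", "imp"}

@dataclass
class Disposition:
    """One row of the Author Data Sheet."""
    issue_num: int
    fixed: bool
    defect_type: str   # one of DEFECT_TYPES, or "" if judged not a defect
    explanation: str

sheet = [
    Disposition(1, True, "design", "Added missing test milestone to plan"),
    Disposition(2, False, "", "Not a defect; matches the dev plan"),
]

# Rework objective check: every confirmed defect has been removed.
unfixed = [d.issue_num for d in sheet if d.defect_type and not d.fixed]
print(unfixed)  # [] -> no known defects remain
```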

Example Rework Data

1. Inspection ID _________   2. Document ____________   3. Author ____________

4. Issue Disposition
   Num     Fixed   Type         Explanation
   _____   ____    __________   _____________________________________________________________
   _____   ____    __________   _____________________________________________________________
   _____   ____    __________   _____________________________________________________________

5. Effort ______ min

6. Rework Objectives
   Q Outcomes of all Review Meeting Data Sheet issues are noted on this form.
   Q All minor issues have been addressed.
   Q No known defects remain in the work product.

Verify

Planning  Kick-Off  Preparation  Review Mtg  Rework  Verify

• Objectives
  – Assess the (reworked) work product quality.
  – Assess the inspection process.
  – Pass or fail the work product.
• Procedure for moderator:
  – Obtain reworked product and Author Data Sheet.
  – Review work product/data sheet for problems.
  – Compute summary statistics for inspection.
  – Enter review data into quality database.
  – Generate any process improvement proposals.
  – Provide recommendation for work product.
  – Perform sign-off with reviewers.
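“Compute summary statistics” can be as simple as totaling effort and defects; a minimal sketch of two numbers a moderator might report (the phase names mirror this deck, but the formulas are illustrative, not official TekInspect outputs):

```python
def summary_statistics(phase_minutes: dict, defects_removed: dict):
    """Return (total inspection effort in person-minutes,
    effort cost per defect removed)."""
    total_effort = sum(phase_minutes.values())
    total_defects = sum(defects_removed.values())
    return total_effort, total_effort / total_defects

effort, cost = summary_statistics(
    phase_minutes={"planning": 30, "orientation": 150, "preparation": 300,
                   "review_meeting": 240, "rework": 120, "verify": 30},
    defects_removed={"critical": 1, "severe": 3, "moderate": 6, "minor": 10},
)
print(effort, cost)  # 870 43.5
```

A cost per defect well under the testing figures cited earlier (2-4 hours per defect) is the argument the deck's industry-experience slides are making.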

Example Verify Data

Verify

1. Inspection Status
   Q Pass   Q Conditional pass:   Q Fail:
   Moderator signature: ____________   Date: ____________
   I agree/disagree with the moderator's decision:   Q Agree   Q Disagree
   Author signature: ____________   Date: ____________

2. Total Effort
   Planning ______ min   Orientation ______ min   Preparation ______ min (from Reviewer Data Sheets)
   Review Meeting ______ min   Rework ______ min (from Author Data Sheet)   Verify ______ min
   Total Inspection ______ min

3. Total Defects Removed
   ______ Critical   ______ Severe   ______ Moderate   ______ Minor   ______ total defects removed (all from Author Data Sheet)

4. Meeting Variations
   Q Reviewer forms were not filled out completely.
   Q Review Meeting included issue discussion and resolution.
   Q Checklists did not appear to be helpful.
   Q References did not appear to be helpful.
   Q Other:

5. Verify Objectives
   Q Moderator has entered data into quality engineering database.
   Q Moderator has collected all TekInspect forms for filing.
   Q Process improvement proposals generated, if any.
