For Software Testing Information visit: www.gcreddy.


Important Terms:

Static Testing

Static techniques and the test process: dynamic testing, static testing, static technique.
Review process: entry criteria, formal review, informal review, inspection, metric, moderator/inspection leader, peer review, reviewer, scribe, technical review, walkthrough.
Static analysis by tools: compiler, complexity, control flow, data flow, static analysis.

I) Phases of a formal review
1) Planning: selecting the personnel, allocating roles, and defining entry and exit criteria for the more formal reviews.
2) Kick-off: distributing documents, explaining the objectives, checking entry criteria, etc.
3) Individual preparation: work done by each of the participants on their own before the review meeting, noting questions and comments.
4) Review meeting: discussion or logging; making recommendations for handling the defects, or making decisions about the defects.
5) Rework: fixing the defects found, typically done by the author.
6) Follow-up: checking that the defects have been addressed, gathering metrics, and checking the exit criteria.

II) Roles and responsibilities
Manager: decides on the execution of reviews, allocates time in project schedules, and determines whether the review objectives have been met.
Moderator: leads the review, including planning, running the meeting, and follow-up after the meeting.
Author: the writer or the person with chief responsibility for the document(s) to be reviewed.
Reviewers: individuals with a specific technical or business background; they identify defects and describe findings.
Scribe (recorder): documents all the issues and problems raised.

For QTP Information visit:


III) Types of review
Informal review: no formal process; e.g. pair programming, or a technical lead reviewing designs and code. Main purpose: an inexpensive way to get some benefit.
Walkthrough: a meeting led by the author; may use scenarios, dry runs, and peer-group participation; open-ended sessions. Main purpose: learning, gaining understanding, finding defects.
Technical review: a documented, defined defect-detection process, ideally led by a trained moderator; may be performed as a peer review; pre-meeting preparation; involves peers and technical experts. Main purpose: discuss, make decisions, find defects, solve technical problems, and check conformance to specifications and standards.
Inspection: led by a trained moderator (not the author); usually peer examination; defined roles; includes metrics; a formal process with pre-meeting preparation and a formal follow-up process. Main purpose: find defects.


Note: walkthroughs, technical reviews and inspections can be performed within a peer group, i.e. by colleagues at the same organizational level. This type of review is called a "peer review".

IV) Success factors for reviews
- Each review has a clear, predefined objective.
- The right people for the review objectives are involved.
- Defects found are welcomed, and expressed objectively.
- People issues and psychological aspects are dealt with (e.g. making it a positive experience for the author).
- Review techniques are applied that are suitable to the type and level of software work products and reviewers.
- Checklists or roles are used if appropriate to increase the effectiveness of defect identification.
- Training is given in review techniques, especially the more formal techniques, such as inspection.
- Management supports a good review process (e.g. by incorporating adequate time for review activities in project schedules).
- There is an emphasis on learning and process improvement.


V) Cyclomatic Complexity
Cyclomatic Complexity is the number of independent paths through a program. It is defined as: CC = L - N + 2P

L = the number of edges/links in the graph
N = the number of nodes in the graph
P = the number of disconnected parts of the graph (connected components)

Alternatively, Cyclomatic Complexity may be calculated using the decision-point rule: decision points + 1.

Cyclomatic Complexity and risk evaluation:
1 to 10: a simple program, without very much risk
11 to 20: a complex program, moderate risk
21 to 50: a more complex program, high risk
> 50: an un-testable program (very high risk)
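The graph formula and the decision-point rule above can be checked against each other on a small example. The sketch below (the function name and the example control-flow graph are illustrative, not from the source) computes CC = L - N + 2P for a function containing a single if/else:

```python
# Sketch: computing Cyclomatic Complexity (CC) from a control-flow graph.
# Graph and names are illustrative assumptions, not from the source notes.

def cyclomatic_complexity(edges, nodes, parts=1):
    """CC = L - N + 2P, where L = edges, N = nodes, P = connected components."""
    return len(edges) - len(nodes) + 2 * parts

# Control-flow graph of a function with one if/else (one decision point):
#   entry -> cond -> then -> exit
#            cond -> else -> exit
nodes = ["entry", "cond", "then", "else", "exit"]
edges = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
         ("then", "exit"), ("else", "exit")]

print(cyclomatic_complexity(edges, nodes))  # 5 - 5 + 2*1 = 2
```

The decision-point rule gives the same answer here: one decision point (the if) plus 1 equals 2, matching the graph calculation.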
