
Watercooler

Software Quality
Assurance (SQA) Plan

Submitted by:

Alvarez, Guillard

Dalangin, Cris

Lu, Pamela

Villamejor, Joey Nyl


P R E F A C E

This document contains the Software Quality Assurance Plan for the Watercooler Project.
The activities and tasks outlined in this document are consistent with the existing Software
Development Plan and project documentation for the system.

The Watercooler Development Team assumes responsibility for the maintenance of this
document according to the needs of the Watercooler Project. Users of this document may
report deficiencies or corrections through a Document Change Request Form that can be
found at the end of the document.
S I G N A T U R E P A G E

Prepared By:

______________________________ ______________
(Signature above printed name) (Date)

Reviewed By:

______________________________ ______________
(Signature above printed name) (Date)

______________________________ ______________
(Signature above printed name) (Date)

Approved By:

______________________________ ______________
(Signature above printed name) (Date)
A M E N D M E N T H I S T O R Y

Version Number    Description of Change    Change Request Number    Approved By    Date Approved
T A B L E O F C O N T E N T S

1. Purpose
   1.1. Scope
2. Reference Documents
3. Management
   3.1. Management Organization
      3.1.1. The Watercooler Project Team
      3.1.2. The Assurance Management Team
   3.2. Resources
      3.2.1. Facilities and Equipment
      3.2.2. Personnel
         3.2.2.1. The Watercooler Project Team
         3.2.2.2. The Assurance Management Team
   3.3. SQA Tasks
      3.3.1. Evaluate the Overall Development Environment
      3.3.2. Evaluate the Requirements Analysis Phase
      3.3.3. Evaluate the Software Design and Development Process
      3.3.4. Evaluate the Implementation and Testing Phase
      3.3.5. Evaluate the End-Item Delivery
      3.3.6. Product Assessments
      3.3.7. Process Assessments
   3.4. Roles and Responsibilities
      3.4.1. Quality Assurance Manager
      3.4.2. Quality Assurance Personnel
4. Documentation
   4.1. Purpose
   4.2. Minimum Project Documentation
5. Standards, Practices, Conventions, and Metrics
   5.1. Purpose
   5.2. Software Quality Program
      5.2.1. Standards, Practices, and Conventions
      5.2.2. Metrics
6. Software Reviews
   6.1. Purpose
   6.2. Minimum Software Reviews
      6.2.1. Software Specification Review (SSR)
      6.2.2. Software Preliminary Design Review (PDR)
      6.2.3. Software Critical Design Review (CDR)
      6.2.4. Software Test Readiness Review (TRR)
      6.2.5. Formal Qualification Review (FQR)
      6.2.6. Production Readiness Review (PRR)
      6.2.7. Software Concept Review (SCR)
      6.2.8. Acceptance Review (AR)
      6.2.9. Operational Readiness Review (ORR)
      6.2.10. Peer Reviews
7. Software Testing
8. Problem Reporting and Corrective Action
   8.1. Internal Audit Report
   8.2. Corrective Action Report
   8.3. Preventive Action Report
   8.4. Software Tool Evaluation Report
   8.5. Facilities Evaluation Report
9. Tools, Techniques, and Methodologies
   9.1. Software Quality Tools
   9.2. Project Tools
   9.3. Software Quality Techniques and Methodologies
10. Media Control
11. Supplier Control
12. Record Collection, Maintenance, and Retention
13. Training
14. Risk Management
15. SQA Plan Change Procedure and History

Table of Appendices
   Appendix A – List of Abbreviations and Acronyms
   Appendix B – Business Process / System Flowchart
   Appendix C – Sequence Diagrams
   Appendix D – Test Procedures
   Appendix E – System Screenshots
   Appendix F – Forms


1. Purpose

The purpose of this Software Quality Assurance (SQA) Plan is to establish the goals,
processes, and responsibilities required to implement effective quality assurance functions
for the Watercooler project.

The Watercooler SQA Plan provides the framework necessary to ensure a consistent
approach to software quality assurance throughout the project life cycle. It defines the
approach that will be used by the Software Quality (SQ) personnel to monitor and assess
software development processes and products to provide objective insight into the maturity
and quality of the software. Watercooler products, processes, and services will be
systematically monitored and evaluated to ensure they meet requirements and comply with
the Watercooler Development Team’s existing policies, standards, and procedures, as well
as applicable Institute of Electrical and Electronics Engineers (IEEE) standards.

1.1. Scope
This plan covers SQA activities throughout the lifecycle of the Watercooler project.

Watercooler offers a simple, easy-to-use, clutter-free interface. It will allow software
developers and other IT professionals to use tags to categorize topics. Tags will make topics
easier to search and will prevent the deep category hierarchies that usually add clutter to a
forum. In addition, users will be allowed to follow any tags they like, so they can quickly
check the threads most relevant to them. Users will also be able to vote for or against any
post. Post ratings will allow for easy filtering of the most helpful answers.
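The tag-following and rating behaviour described above can be sketched as follows. This is an illustrative Python sketch, not Watercooler's actual PHP implementation; the thread data and tag names are hypothetical.

```python
# Illustrative sketch: filter threads by a user's followed tags,
# then rank the matches by post rating (votes).

threads = [
    {"title": "PHP PDO vs mysqli?", "tags": {"php", "mysql"}, "votes": 12},
    {"title": "Apache rewrite rules", "tags": {"apache"}, "votes": 7},
    {"title": "CSS grid layout", "tags": {"css"}, "votes": 3},
]

followed_tags = {"php", "apache"}

# Keep only threads carrying at least one followed tag, best-rated first.
relevant = sorted(
    (t for t in threads if t["tags"] & followed_tags),
    key=lambda t: t["votes"],
    reverse=True,
)

for t in relevant:
    print(t["title"], t["votes"])
```

Representing each thread's tags as a set makes the "is any followed tag present" test a simple set intersection, which is the same flat, hierarchy-free lookup the tag design aims for.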

This project covers user registration, login/logout, thread creation, question posting, tag
following, and post rating. The system will be built in PHP with a MySQL database and will
run on an Apache server.

The team will use Agile software development, based on iterative and incremental
development, in which requirements and solutions evolve through collaboration between
self-organizing, cross-functional teams.

SQA activities will be performed in every iteration: the SQA Team will test the system or
module under development in each iteration to ensure its quality.
2. Reference Documents
 IEEE Standard for Software Quality Assurance Plans, Software Engineering Standards
Committee, September 2002.
 Reaves, Michael J. Software Quality Assurance and Software Verification and
Validation Plan for Web Accessible Alumni Database. Jacksonville State University,
2003.
 Defense Finance and Accounting Service. Program Quality Assurance Plan, May
2002.
 Mochal, Tom. Integration testing will show you how well your modules get along,
September 2001.
 Applying Broadcasting/Multicasting/Secured Communication in Multi-Agent Systems
Software Quality Assurance Plan, 2003.
 IEEE Guide for Software Quality Assurance Planning.
 IEEE Standard for Software Quality Assurance Planning, IEEE Std 730.1-1995.

3. Management

3.1. Management Organization


Watercooler efforts are supported by numerous entities, organizations and personnel.
Relevant entities/roles that are of interest and applicable to this SQA Plan and the software
assurance effort are described at a high level below.

[Organization chart: the Project Manager, supported by SQA, oversees the Database
Administrator, Business Analyst, Web Developers, Lead Designer, and Tester.]

Software Quality Assurance is responsible for ensuring compliance with SQA requirements
as delineated in this Software Quality Assurance Plan. The SQA organization assures the
quality of deliverable software and its documentation, non-deliverable software, and the
engineering processes used to produce software.

This section describes the functional groups that influence and control software quality.
3.1.1. The Watercooler Project Team
The Watercooler Project Team is responsible for the management of project objectives
within the guidelines and controls prescribed by the Watercooler Project Plan.

The Project Manager is responsible for the following:

 Implementing the quality program in accordance with the SQA Policy


 Identifying the SQA activities to be performed by SQA
 Reviewing and approving the Watercooler SQAP
 Identifying and funding an individual or group independent from the project to perform
the SQA functions
 Resolving and following-up on any quality issues raised by SQA
 Identifying and ensuring the quality factors to be implemented in the system and
software
 Identifying, developing and maintaining planning documents such as the Program
Management Plan, SDP, SCMP, Test Plans, and this SQAP

The Business Analyst is responsible for the following:

 Reviewing and commenting on the Watercooler SQAP


 Implementing the quality program in accordance with this SQAP
 Resolving and following-up on any quality issues raised by SQA related to software
engineering activities
 Identifying, implementing, and evaluating the quality factors to be implemented in the
system (software and hardware)
 Implementing the software engineering practices, processes, and procedures as
defined in the Watercooler SDP and other project’s planning documents

The Software Designer and Developer are responsible for the following:

 Reviewing and commenting on the Watercooler SQAP


 Implementing the quality program in accordance with this SQAP
 Resolving and following-up on any quality issues raised by SQA related to software
design and development
 Identifying, implementing, and evaluating the quality factors to be implemented in the
software
 Implementing the software design/development practices, processes, and
procedures as defined in the Watercooler SDP and other project’s planning
documents
3.1.2. The Assurance Management Team
The Assurance Management Team provides mission assurance support to the project team.

The Assurance Management Team is responsible for the following:

 Reviewing and commenting on the Watercooler SQAP


 Implementing the quality program in accordance with this SQAP
 Resolving and following-up on any quality issues raised by SQA related to software
test
 Verifying the quality factors are implemented in the system, specifically software
 Implementing the software test practices, processes, and procedures as defined in
the Watercooler SDP and other project’s planning documents

3.2. Resources

3.2.1. Facilities and Equipment


SQA will have access to the facilities and equipment as described in the Watercooler SDP.
SQA will have access to computer resources to perform SQA functions such as process and
product evaluations and audits.

3.2.2. Personnel

3.2.2.1. The Watercooler Project Team


The Watercooler Project Team is responsible for management of project objectives within
the guidelines and controls prescribed by the Watercooler Project Plan. The Project
Manager (PM) is specifically responsible for the success of the Watercooler Project,
including but not limited to cost, schedule, and quality.

The Development Team comprises one Project Manager (PM), one Business Analyst, and
Developers, each specializing in a particular web component. Following a web framework
methodology, tasks will be split according to their requirements.

The PM will also be responsible for implementing and reviewing the SQA Plan. The PM will
coordinate regularly with the independent Quality Assurance Team with regard to quality
audits of the system, and will discuss with the project team members any issues or
discrepancies encountered during the audit process.

The senior programmer will be in charge of all backend logical processes. The Database
Administrator will focus on database design and structure and will constantly coordinate with
the senior programmer. The Lead Web Designer will be in charge of the general appearance
of the system. The tester will be in charge of all tests during the development process.

The Business Analyst will ensure that the project scope is clear and complete, and will be
responsible for defining and documenting the scope as part of the requirements-gathering
task.

3.2.2.2. The Assurance Management Team


The Assurance Management Team provides mission assurance support to the project team.
The team comprises a Quality Assurance Manager (QAM) and Quality Assurance
Personnel (QAP).

The Quality Assurance Manager assigned to the project is responsible for supporting the
Project Manager in ensuring the quality of the product. The QAM provides Project
Management with visibility into the processes being used by the software development
teams and the quality of the products being built. The QAM maintains a level of
independence from the project and the software developers. Risk escalation begins at the
project level, and extends to the Assurance Management Team.

In support of software quality assurance activities, the QAM has assigned Quality Assurance
Personnel to coordinate and conduct the Quality Assurance (QA) activities for the project
and to identify and document noncompliance issues.

Quality Assurance Personnel responsibilities include, but are not limited to:

 Prepare and manage the project software quality assurance plan
 Create and maintain a schedule of software quality assurance activities
 Lead process and product assessments, as described in this plan, using objective
standards
 Interface with Safety, Reliability, and IV&V personnel on software assurance
activities
 Identify and document non-compliances, observations, and risks from all software
assurance activities, and report them to the Systems Assurance Manager
 Communicate assessment results to key stakeholders
 Ensure resolution of non-compliances and escalate any issues that cannot be
resolved within the project
 Identify lessons learned that could improve processes for future projects
 Prepare and manage metrics
3.3. SQA Tasks
This section discusses the SQA tasks performed by the SQ Personnel during the
development and maintenance of the Watercooler project. These tasks are selected in
accordance with planned and contractual deliverables, the Software Management Plan
(SMP), and the Project Schedule set by the Watercooler Project team.

3.3.1. Evaluate the Overall Development Environment


The Software Development Environment (SDE) must have the necessary tools and facilities
needed for the smooth development flow of the Watercooler project. The SDE must contain,
but is not limited to, the following components:

 Integrated Development Environment (IDE) – An application that will be used by the


developers throughout the development phase. The IDE must be installed and
configured with the needed plugins and extensions to support software development
needs such as source code editing/debugging and build automation.
 Version Control System – A system that allows developers to collaborate and to keep
track of all files and file changes in the project. This allows the developer to work on
the same files and to avoid accidentally overriding important changes made by
another developer.
 Issue Tracking System – A system that manages and keeps a list of software issues
(additional software features, bugs, etc.) raised for the project. This helps developers
see pending issues and their relative priorities, so that higher-priority issues are
attended to first.

 Set of Coding Standards and Best Practices – A set of coding rules to be followed
by the developers to ensure uniform coding. Following these standards reduces time
and effort in code reviews and also allows easier maintenance of the project.
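The issue-prioritization behaviour described above can be sketched as follows. This is an illustrative Python sketch; the issue titles and priority scheme are hypothetical, not taken from the project's actual tracker.

```python
# Illustrative sketch: keep open issues in a priority queue so the
# most urgent pending issue always surfaces first.
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Issue:
    priority: int                       # lower number = more urgent
    title: str = field(compare=False)   # not used for ordering

backlog = []
heapq.heappush(backlog, Issue(2, "Add tag auto-complete"))
heapq.heappush(backlog, Issue(1, "Login fails on Apache restart"))
heapq.heappush(backlog, Issue(3, "Improve thread page styling"))

# A developer pulls the most urgent issue and attends to it first.
next_issue = heapq.heappop(backlog)
print(next_issue.title)
```

A heap keeps the pull operation cheap no matter how long the backlog grows; any tracker that can sort by a priority field gives the same ordering.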

3.3.2. Evaluate the Requirements Analysis Phase


One of the key assurance activities for any project is the expert evaluation and assessment
of the requirements. Incorrect, ambiguous, and incomplete requirements have been shown
to be the major cause of system failures and accidents. Getting the requirements right is not
easy, but it is vital to the success of the project.

Requirements Analysis involves gathering information about the customer's needs and
defining, in the clearest possible terms, the problem that the product is expected to solve.

This analysis includes understanding the customer's business context and constraints, the
functions the product must perform, the performance levels it must adhere to, and the
external systems it must be compatible with. Techniques used to obtain this understanding
include customer interviews, use cases, and "shopping lists" of software features. The
results of the analysis are typically captured in a formal requirements specification, which
serves as input to the next step.

The requirement for this project is a forum for IT professionals where they can ask
programming questions. Users should be able to post questions, answer questions, follow
threads of interest, and rate posts. They should also be able to register, log in, and edit their
profiles in Watercooler.

Several techniques such as interviews, questionnaires, client documents and scenarios will
be used to elicit requirements.

The System Requirements Document (SRD) will be validated to confirm that a system
possessing the characteristics defined in the SRD satisfies the User Requirements
Document (URD) within the effectiveness envelope. Operational Analysis is the usual
technique for SRD validation.

Feedback from the SRD and from solution-option development confirms that the candidate
requirements can be satisfied within the target or approved capability, cost, timescale, and
risk envelope.

3.3.3. Evaluate the Software Design and Development Process


Preliminary design and development activity determines the overall structure of the software
to be built. Based on the requirements identified in the previous phase, the software is
partitioned into modules, and the function(s) of each module and relationships among these
modules are defined.

A goal of detailed design is to define logically how the software will satisfy the allocated
requirements. The level of detail of this design and development must be such that the
coding of the computer program can be accomplished by someone other than the original
designer.

SQA shall perform the following:

 Ensure that the software design process and associated design reviews are
conducted in accordance with standards and procedures established by the project
and as described in the SDP
 Ensure that action items resulting from reviews of the design are resolved in
accordance with these standards and procedures
 Evaluate the method used for tracking and documenting the development of a
software unit in order to determine the method's utility as a management tool for
assessing software unit development progress. Example criteria to be applied for the
evaluation are the inclusion of schedule information, results of audits, and an
indication of internal review and approval of its constituent parts
 Ensure that the method used for tracking and documenting the development of a
software unit is implemented and is kept current

The results of this task shall be documented using the Process Audit Form described in
Section 8 and provided to project management. SQA’s recommendation for corrective
action requires project management’s disposition and will be processed in accordance with
the Corrective Action Process.

3.3.4. Evaluate the Implementation and Testing Phase


In the implementation, integration, and testing phases, the separately developed modules
are combined. Software testing starts with the functionality of the individual modules and
then moves up to the interfaces between them, assuring that the modules work together to
deliver the expected software functionality.

It must be ensured that the software integration process, and testing activities or procedures
are being accomplished in accordance with established software standards and procedures,
and the software design. Also, it must be documented and ensured that the approved test
procedures are being followed, that accurate records of test results are being kept, that all
discrepancies discovered during the tests are being properly reported, that test results are
being analyzed, and the associated test reports are completed.

When fixes are made, the software integration and unit testing procedures must be
re-executed to validate the correctness of the changes made to the source code and to
check for any child bugs the fixes may have introduced.
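The re-execution of unit tests after a fix can be sketched with a small regression suite. This is an illustrative Python sketch; `rate_post` is a hypothetical stand-in for a Watercooler function, not actual project code.

```python
# Illustrative sketch: a small regression suite that is re-run in full
# after every fix, so a change cannot silently break existing behaviour.
import unittest

def rate_post(score, vote):
    """Apply an up-vote (+1) or down-vote (-1) to a post's score (hypothetical)."""
    if vote not in (1, -1):
        raise ValueError("vote must be +1 or -1")
    return score + vote

class RatePostRegression(unittest.TestCase):
    def test_upvote(self):
        self.assertEqual(rate_post(5, 1), 6)

    def test_downvote(self):
        self.assertEqual(rate_post(5, -1), 4)

    def test_invalid_vote_rejected(self):
        with self.assertRaises(ValueError):
            rate_post(5, 2)

# Re-running the whole suite, not just the test for the fixed case,
# is what catches child bugs introduced by the change.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(RatePostRegression)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```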

3.3.5. Evaluate the End-Item Delivery


End-item deliverables will include the following:

 Software Version Description (SVD) – this document identifies and describes a


software version of Watercooler. It is used to release, track and control Watercooler
builds.
 User Manual – this is a technical communication document intended to assist users in
using Watercooler.
 Media Distribution List – this is a list containing the names and addresses of the
organizations to which the end-item deliverables are to be given.

The SQA Personnel must make sure that the SVD and User Manual documents are the
correct version for Watercooler. Also, the Media Distribution List must contain the complete
list of organizations, along with their correct and current addresses, where the end-item is to
be delivered.

3.3.6. Product Assessments


Software Development Folders

A software development folder or file is a physical or virtual container for software project
artifacts, including: requirements, plans, designs, source code, test plans and results,
problem reports, reviews, notes, and other artifacts of the development process.

The software development folder check-off list requires the checking of requirements,
functionality, source code, libraries/directories, development methodologies, test data, and
test analysis. The requirements check verifies that the requirements traceability matrix
matches the requirements addressed by the module. The functionality check verifies that a
design walk-through was conducted and that permission was given to start programming.
The unit's source code should be included in the folder, along with its libraries and
directories. Test analysis is included to verify that the unit has been thoroughly tested.
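The check-off list above can be sketched as a simple automated audit. This is an illustrative Python sketch; the artifact names are hypothetical labels, not the project's actual folder layout.

```python
# Illustrative sketch: audit a software development folder against the
# check-off list and report which required artifacts are missing.
REQUIRED_ITEMS = {
    "requirements", "functionality", "source_code",
    "libraries", "development_methodologies", "test_data", "test_analysis",
}

def audit_folder(folder_contents):
    """Return the checklist items missing from a development folder, sorted."""
    return sorted(REQUIRED_ITEMS - set(folder_contents))

# A hypothetical folder that has not yet completed its walk-through or testing.
folder = ["requirements", "source_code", "libraries", "test_data"]
missing = audit_folder(folder)
print("Missing artifacts:", missing)
```

Reporting the gap as a set difference keeps the audit honest: an item either is in the folder or it is not, with no partial credit.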

Software configuration management

The software configuration management (SCM) process handles changes in software
projects. It identifies the functional and physical attributes of software at various points in
time and performs systematic control of changes to the identified attributes, maintaining
software integrity and traceability throughout the software development life cycle.
Configuration management practices include revision control and the establishment of
baselines. CVS will be used for version control.

Requirements Traceability

Traceability is maintained through a spreadsheet matrix that ties individual Contract End
Item (CEI) specifications, applicable Interface Control Documents (ICDs), and Software
Requirements Document entries to lower-level design and test specification paragraphs or
sections. These traceability products are produced and maintained by SQA. SQA is included
in the review process for all software document generation. During these reviews, checklists
and traceability spreadsheets are used to ensure that requirements are met by both the
design and test functions.
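The traceability check described above can be sketched as follows. This is an illustrative Python sketch; the requirement, design, and test identifiers are hypothetical.

```python
# Illustrative sketch: flag requirements that do not trace to at least
# one design section AND at least one test case.
trace_matrix = {
    "REQ-001 user registration": {"design": ["SDD 3.1"], "tests": ["TP-01"]},
    "REQ-002 follow tags":       {"design": ["SDD 3.4"], "tests": []},
    "REQ-003 post ratings":      {"design": [],          "tests": ["TP-07"]},
}

# A requirement lacks full traceability if either link list is empty.
untraced = [
    req for req, links in trace_matrix.items()
    if not links["design"] or not links["tests"]
]

print("Requirements lacking full traceability:", untraced)
```

Run against the real spreadsheet export, a check like this turns the review-time checklist into a repeatable audit that can be re-run before every design and test review.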

3.3.7. Process Assessments


Listed below are the process assessments that will be conducted by SQ personnel.

 Project Planning
 Project Monitoring and Control
 Software Reviews
 Requirements Management
 Software Configuration Management and Configuration Audits (FCA/PCA)
 Test Management (Verification & Validation)
 Software Problem Reporting and Corrective Action
 Risk Management

3.4. Roles and Responsibilities


This section describes the roles and responsibilities for each assurance personnel assigned
as Quality Assurance Manager and Quality Assurance Personnel for the Watercooler Project.

3.4.1. Quality Assurance Manager


The Quality Assurance (QA) Manager is responsible for developing the QA Plan and for
measuring, assessing, and reporting quality performance against objectives.

3.4.2. Quality Assurance Personnel


Responsibilities include, but are not limited to:

 Generate and maintain a schedule of quality assurance activities


 Conduct process and product assessments
 Interface with all personnel on quality assurance activities
 Identify and document noncompliance
 Communicate results from assessments with relevant stakeholders
 Ensure resolution of noncompliance and escalate any issues that cannot be resolved
within the project
 Identify lessons learned
 Develop and maintain metrics
4. Documentation

4.1. Purpose
This section defines the documentation required to properly assess the correctness of the
system. Essential documentation pertaining to requirements, development, verification,
validation, and maintenance of the software that falls within the scope of this Software
Quality Plan shall be reviewed by the Software Quality Personnel.

4.2. Minimum Project Documentation


 Project Plan. The Project Plan is the approved document that will be used as guide
for project execution and control. Scope, cost, resources, project milestones and
activities are all indicated in the Project Plan.
 Configuration Management Plan. The Configuration Management Plan’s main
purpose is to maintain consistency in a software product by establishing change
control processes, identifying all aspects of configuration items and recording all
configuration baselines.
 Risk Management Plan. The Risk Management Plan contains effective controls for
the identification and assessment of risks. The perceived risks, measures for
mitigation and treatment of risks, schedule for risk control implementation and the
responsible persons for these actions are all documented in the Risk Management
Plan.
 Test Plan. The Test Plan includes the test strategies and test coverage that will be
carried out to ensure that the software meets its requirements. There are different
test plans depending on the types of test that will be carried out.
 User Acceptance Test Plan. The User Acceptance Test Plan outlines the detailed
plan for user acceptance testing. This document will be used to record the client’s
sign off of the documented scenarios.
 System Training Plan. The Training Plan defines the support activities, schedules,
curriculum, methods and tools, and equipment required for system training.
 Software Maintenance Plan. The Software Maintenance Plan defines the support
environment, roles and responsibilities and scheduled activities to provide
maintenance personnel with information to effectively maintain the system.
 Software Requirements Specifications. The Software Requirements Specifications
document contains the functional and non-functional requirements of the system. It
includes the complete description of the expected behaviour of the system.
 Functional Specifications. The Functional Specifications document details what the
finished product will do and how the user will interact with the system.
 SQA Audit Checklist. The SQA Audit Checklist shall be used by the Quality
Assurance Team to verify the compliance of the Project Team with the project
processes and procedures.
 Corrective Action Report. The Corrective Action Report records any operational
or processing problems and the corrective actions taken to resolve them.
 Preventive Action Report. The Preventive Action Report details the actions done to
eliminate potential root causes of possible problems.
 Software User’s Manual. The Software User’s Manual is a guide to the usage of the
system. It contains details for performing certain functions using the system.

5. Standards, Practices, Conventions, and Metrics

5.1. Purpose
This section highlights the standards, practices, conventions, and metrics to be applied to
ensure a successful software quality program.

5.2. Software Quality Program


The Software Quality Program for the Watercooler project is governed by the policies
described in the ISO 9001:2008 standard. All software quality assurance staff members are
experienced in ISO 9001:2008 and apply generic and specific practices for Process and
Product Quality Assurance (PPQA).

In accordance with ISO 9001:2008, the process model for the Watercooler project
will involve continuous improvement by planning, implementing, and monitoring project
aspects and repeatedly reviewing project procedures. The Watercooler project will focus on
the following four characteristics of Quality Management:
 Process Improvement
 Customer Focus
 Measurement, Analysis and Improvement
 Quality Culture

The Watercooler project will involve business planning, management planning,
establishment of parameters and procedures, and training to ensure that the project stays
aligned with the focus of its quality program. In particular, the involvement of the clients,
management, and staff will be important to the successful implementation of the quality
program.
5.2.1. Standards, Practices, and Conventions
The following table lists all standards, practices, and conventions that will be used in the
Watercooler project.

Standards / Practices / Conventions | Description | Inspection Type
UML Notation The UML notation will be used for Document Inspection
design documents.
Pseudocode Detailed Design documents will Document Inspection
follow the pseudocode standard.
Coding Standard Programs will have to follow the Automated / Manual Code
coding standards. Different Review
programming languages will have
different coding standards.
Unit Test Standard All programs will have to undergo Code Review
unit testing. Unit testing standards
will be documented and will have to
be followed.
Usage of Wiki Practices, standards and pertinent Process Inspection
project information must be put in
the project wiki. All team members
must have access to the project
wiki.
Bug Tracking Bugzilla will be used for Bug Implementation Inspection
Tracking.
Best Practices for All bug reports are expected to Process Inspection
Bug Reporting have the following details:
 Current behaviour of the
system
 Expected behaviour of the
system
 Screenshot
 Replication Steps
Email All members must follow the Process Inspection
standard for email communication.
The subject of all email messages
must conform to the standard.
Progress Reports The team must produce progress Implementation Inspection
reports at the end of each week.
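For illustration only, a record type enforcing the four details required by the bug-reporting standard could look like the following sketch (the class and field names are hypothetical and are not part of Bugzilla’s actual API):

```python
from dataclasses import dataclass


@dataclass
class BugReport:
    """Illustrative record for the project's bug-reporting standard.

    Field names are hypothetical; they do not correspond to Bugzilla's API.
    """
    current_behaviour: str     # what the system actually does
    expected_behaviour: str    # what the system should do
    replication_steps: list    # ordered steps to reproduce the bug
    screenshot: str            # path or reference to the screenshot attachment

    def is_complete(self) -> bool:
        # A report meets the standard only when every required detail is present.
        return bool(self.current_behaviour
                    and self.expected_behaviour
                    and self.replication_steps
                    and self.screenshot)
```

A report failing `is_complete()` would be returned to the reporter before it is entered into the tracker.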

5.2.2. Metrics
The use of metrics provides an objective means of gauging whether the project is on track
with respect to its cost, scope, and the quality of its product. Selecting the correct metrics
and monitoring them will aid in more timely escalation and resolution of issues or misses,
and will prevent the team from reverting to fire-fighting as its means of addressing issues.
The fire-fighting approach is reactive, while the use of metrics to control and monitor the
project is a proactive approach intended to mitigate risks early on.

To establish the most appropriate quality metrics that will be used in the project, the
following activities will be performed by the Quality Assurance team:

 Establish software quality requirements
 Identify software quality metrics
 Implement software quality metrics
 Analyze results of metrics
 Validate metrics

The following standard metrics are the minimum planned metrics that will be collected,
reported, and maintained:

 Project size (Planned vs Actual)


 Development effort (Planned vs Actual)
 Module completion rate (Planned vs Actual)
 Code coverage of automated tests
 Defect density
 Number of defects categorized according to Priority, Severity
 Number of defects categorized according to Type (Functional, User Interface,
Performance, Usability, etc.)
 Defect rejection rate
 Defect removal rate
 Defect leakage rate
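As an illustrative sketch only (the formulas below are common industry definitions assumed here, since the plan itself does not define them), the defect metrics in the list above could be computed as:

```python
def defect_density(defects_found: int, size_kloc: float) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defects_found / size_kloc


def defect_removal_rate(defects_removed: int, defects_found: int) -> float:
    """Fraction of found defects that were fixed before release."""
    return defects_removed / defects_found


def defect_leakage_rate(defects_post_release: int, defects_pre_release: int) -> float:
    """Fraction of all known defects that escaped into production."""
    total = defects_pre_release + defects_post_release
    return defects_post_release / total


# Example: 30 defects found in a 12-KLOC module, 27 fixed before release, 3 leaked
assert defect_density(30, 12.0) == 2.5
```

Reporting these values weekly alongside the planned-vs-actual figures gives the early-warning signal the section describes.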
6. Software Reviews

6.1. Purpose
This section covers all software reviews to be conducted by the SQA Team on the system.
The reviews will be based primarily on the Software Development Plan or the Project Plan,
which outlines the processes in the Systems Development Life Cycle as well as the
scheduling, budget analysis, and other pertinent information useful during the review
process.

The software items produced during the software life cycle should be reviewed and audited
on a planned basis to determine the extent of progress and to evaluate the technical
adequacy of the work and its conformance to software requirements and standards.
Technical reviews and audits should be conducted to evaluate the status and quality of the
software development effort. This ensures that each element completely and correctly
embodies its baseline specification.

6.2. Minimum Software Reviews


For each review, Software Quality Assurance will assess the review products to assure that
review packages are developed according to the specified criteria; that the review content is
complete, accurate, and sufficiently detailed; and that Requests for Action are captured,
reviewed, and tracked to closure. In addition, Software Quality Assurance will assess the
processes used to conduct the reviews to determine whether appropriate personnel are in
attendance, correct information is presented, entry and exit criteria are met, and
appropriate documents are identified for update.

The following software reviews may be assessed by the Software Quality Assurance:

6.2.1. Software Specification Review (SSR)


The objective of the Software Specification Review (SSR) is to review the finalized Computer
Software Configuration Item requirements and operational concept. A successful SSR
shows that the SRS, IRS, and Operational Concept Document form a satisfactory basis for
proceeding into preliminary software design.

6.2.2. Software Preliminary Design Review (PDR)


The objective of the Software Preliminary Design Review (PDR) is to evaluate the progress,
consistency, and technical adequacy of the selected top-level design and test approach,
compatibility between software requirements and preliminary design, and the preliminary
version of the operation and support documents.
6.2.3. Software Critical Design Review (CDR)
The objective of the Software Critical Design Review (CDR) is to determine acceptability of
the detailed design, performance, and test characteristics of the design solution, and on the
adequacy of the operation and support documents.

6.2.4. Software Test Readiness Review (TRR)


The objective of the Software Test Readiness Review (TRR) is to determine whether the
software test procedures are complete and to assure that the developer is prepared for
formal Computer Software Configuration Item testing.

6.2.5. Formal Qualification Review (FQR)


The Formal Qualification Review (FQR) is the test, inspection, or analytical process by which
a group of configuration items comprising the system are verified to have met specific
program or project management performance requirements.

6.2.6. Production Readiness Review (PRR)


The objective of the Production Readiness Review (PRR) is to determine the status of
completion of the specific actions which must be satisfactorily accomplished prior to
executing a production go-ahead decision.

6.2.7. Software Concept Review (SCR)


The Software Concept Review (SCR) is a buyer control gate which reviews and approves
the recommended system concept configured to satisfy the system requirements document.
The SCR is the decision point to proceed with the development of the system specification.

6.2.8. Acceptance Review (AR)


The Acceptance Review (AR) is the specification compliance control gate or check point at
which adherence to expectations of the service or deliverables is verified. This may be
performed at any level of the system or process.

6.2.9. Operational Readiness Review (ORR)


The Operational Readiness Review (ORR) examines the actual system characteristics and
the procedures used in the operation of the system or end product, and ensures that all
system and support hardware, software, personnel, procedures, and user documentation
accurately reflect the deployed state of the system.

6.2.10. Peer Reviews


A Peer Review is a review of a colleague’s work by another person with similar knowledge
or experience in the same or a closely related field. It is a review of products or services,
following defined procedures, conducted by peers for the purpose of identifying deficiencies
and improvements.

7. Software Testing
The testing activity for the Watercooler project will be divided into four parts, namely:

 Unit-level testing – This is to be done by the software developer in order to ensure
that his/her assigned module meets its design and runs as intended.
 Integration testing – This is handled by the assigned software tester to assure that
the software application meets the expected design when the individual project
modules are combined and tested in groups.
 Performance testing – This is also handled by the assigned software tester, to verify
that the application runs at an acceptable speed and does not consume too many
system resources.
 User acceptance testing (UAT) – This is the part where the end users themselves
test the software application. This can be further divided into two parts, namely:
o Alpha Testing – tests are conducted in a test environment partially controlled
by the development team.
o Beta Testing – tests are conducted in an uncontrolled test environment.
The general testing process flow is illustrated in the diagram below:

[Figure: General Testing Process Flow]
The SQ personnel will assure that the testing activity is implemented in accordance with the
Software Management Plan and/or the Software Test Plan. They will also monitor the
testing effort to make sure that the testing activity stays on schedule, and will assure that all
constraints and assumptions made during the testing process, as well as the test results,
are recorded accurately.

Lastly, they will review post-test documents such as test reports, test results, etc.
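The project’s unit-level tests will be written in PHPUnit (see Section 9.1). Purely as an illustrative sketch of the same xUnit pattern, the following Python unittest exercises a hypothetical rating function derived from the rating rule described in Appendix D (percentage of vote-ups over total votes); the function and its zero-vote behaviour are assumptions for the example:

```python
import unittest


def rating(vote_ups: int, total_votes: int) -> float:
    """Hypothetical module under test: thread rating as the percentage
    of vote-ups over the total number of votes (Appendix D rule)."""
    if total_votes == 0:
        return 0.0  # assumed convention for a thread with no votes
    return 100.0 * vote_ups / total_votes


class RatingTest(unittest.TestCase):
    """Unit-level tests: each case checks one behaviour of the module."""

    def test_all_vote_ups(self):
        self.assertEqual(rating(4, 4), 100.0)

    def test_no_votes(self):
        self.assertEqual(rating(0, 0), 0.0)

    def test_mixed_votes(self):
        self.assertEqual(rating(3, 4), 75.0)
```

The PHPUnit equivalent follows the same structure: one test class per module, one assertion-bearing method per behaviour.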

8. Problem Reporting and Corrective Action


This section describes the reporting and control system used by SQA to record and analyze
discrepancies and to monitor the implementation of corrective action. The forms utilized by
SQA for reporting are the Internal Audit Report, Corrective Action Report, Preventive Action
Report, Software Tool Evaluation Report, Facilities Evaluation Report, and the monthly SQA
Status Report. Each of these forms and its use is discussed in the following sections.

8.1. Internal Audit Report


The following are the pertinent information that should be included in the internal audit report:
 Classification of findings. Impact on the quality of service, business risks and
impact on the integrity of the QMS must be considered when classifying all problems
found during the SQA activity. The following are the classifications that must be used:
o Minor - an isolated observed lapse in the fulfilment of a specified requirement
that has no significant impact on the achievement of customer satisfaction.
o Major - absence or total breakdown of the system to meet a specified
requirement of a clause of ISO 9001:2008, or other reference document
including regulatory requirements, causing significant business risk.
o Observation - activities that are moving towards a non-conformance, or
activities that, if addressed, would improve the present practices.
 Non-conformance statements must:
o Be factual and objective.
o Cite the relevant clause of ISO 9001:2008.
o Be agreed with the auditee.
 Formulation of corrective and preventive actions.
o Corrective actions - are actions taken to eliminate the root cause of an
existing non-conformity, or other undesirable condition to prevent recurrence.
o Preventive actions - are actions taken to eliminate the root cause of a
potential non-conformity or other undesirable condition to prevent its
occurrence.
o Corrective and preventive actions must:
 Address the non-conformance.
 Address the root causes.
 Be timely - with practical and realistic time frames for implementation.
 Be monitored to assess their effectiveness.
(Please check Form-001: Internal Audit Report located in Appendix F.)

8.2. Corrective Action Report


The objective of a Corrective Action Report is to ensure all non-conformances identified
during the implementation of the management system are analyzed and actions formulated
to eliminate their root causes. This is to prevent recurrence. This covers formulation and
implementation of corrective actions for all types of non-conformances identified.

In the Corrective Action Report, all non-conformances identified during each process in
operations should be recorded in detail. Non-conformances may come from the voice of the
customer (customer feedback) or the voice of the system. Root causes of the non-
conformance should be analyzed and identified, and the result of the analysis should be
recorded in the Corrective Action Report. Thereafter, formulate and recommend the actions
to be taken, ensuring that time frames and responsibilities for the actions are clearly defined.

The corrective action is considered effective if, after one (1) month of implementation, the
non-conformance has not recurred.

The assigned person should monitor implementation of the recommended actions following
defined time frames. Identify interfaces with other processes of the management system and
determine effects. If actions taken are effective (the same problem did not recur), formalize
the changes.

(Please check Form-002: Corrective Action Report located in Appendix F.)

8.3. Preventive Action Report


The objective of a Preventive Action Report is to ensure that all potential non-conformances
and improvements identified during the implementation of the management system are
analyzed and that actions are formulated to eliminate their root causes. This is to prevent
occurrence. This procedure covers the formulation and implementation of preventive
actions for all types of potential non-conformances and improvements identified.

In the Preventive Action Report, all potential non-conformances and/or improvements during
each process should be identified and recorded in detail. Suggestions for improvements are
encouraged from all employees and should be justified in terms of the costs involved and
the benefits to be derived. Root causes of the potential non-conformance should be
analyzed and identified, and the results of the analysis should be recorded in the Preventive
Action Report. Thereafter, formulate and recommend actions, ensuring that time frames and
responsibilities for the actions are clearly defined. The report assists in monitoring the
implementation of the recommended actions within the defined time frames. Identify
interfaces with other processes of the management system and determine their effects. If
the actions taken are effective (the potential problem did not occur), evaluate the results.

(Please check Form-003: Preventive Action Report located in Appendix F.)

8.4. Software Tool Evaluation Report


The Software Tool Evaluation Report provides the format for evaluating software tools.

(Please check Form-004: Software Tool Evaluation Report located in Appendix F.)

8.5. Facilities Evaluation Report


The Facilities Evaluation Report provides the format for evaluating existing and planned
Watercooler facilities.

(Please check Form-005: Facilities Evaluation Report located in Appendix F.)

9. Tools, Techniques, and Methodologies

9.1. Software Quality Tools


Listed below are the SQA tools to be used during the development of the Watercooler
project:

 Microsoft Office applications – these will be used in creating the SQA documents
throughout the project.
 Bugzilla – a Web-based general-purpose bug tracker and testing tool. This will be
used by the Watercooler project for tracking bugs/requests during the development
and testing phases.
 PHPUnit – a unit-testing framework (based on Java’s JUnit framework) used for PHP
applications. This can be integrated into the IDE as an external tool.

9.2. Project Tools


The development team uses Eclipse for PHP Developers as the IDE for the Watercooler
project. The IDE allows for faster development by providing the commonly needed tools
that maximize the developer’s productivity, such as a good source code editor and a
debugger. The IDE is also configured so that the following are integrated as well:
 Concurrent Versioning System (CVS)
 Apache Server

9.3. Software Quality Techniques and Methodologies


Every scheduled SQA meeting must include all members of the SQA team. If a member is
not able to attend a meeting, he/she must be briefed by the QAM on what transpired during
the course of the meeting.

Any enhancements or defects raised must be analyzed by the SQA team. The team must be
able to assign a priority level to the said enhancement or fix depending on its complexity and
the impact on the system.

The development team must always be aware of the priorities of the issues raised. All
high/critical priority issues must be addressed first.

After a defect has been fixed or an enhancement has been added, the development team
must notify the QAM at the next SQA review, in which case the changes will be noted.

If a bug has been fixed or a change has been made, the software tester must re-execute the
testing process to verify the correctness of the fix/change and to make sure no new bug has
been introduced in the process.

10. Media Control


Media control includes:

 regularly scheduled backup of the media


 labelled and inventoried media filed in a storage area in accordance with security
requirements and in a controlled environment that prevents degradation or damage
to the media
 adequate protection from unauthorized access
The software media control methods and facilities are described in the Watercooler SCMP.
SQA will conduct ongoing evaluations of the software media control process to ensure that
the process of controlling the software media is effective and in compliance with the
Watercooler SCMP.
11. Supplier Control
Prior to any purchase of software to support the development effort, SQA and project
members will define and provide complete requirements to the supplier/vendor. The
Software Tool Evaluation process will be followed. As part of the evaluation process, the
supplier/vendor will be required to describe their technical support, handling of user
questions and problems, and software product upgrades.

All supplier software has been operationally tested in the target system. No future supplier
software is planned.

12. Record Collection, Maintenance, and Retention


SQ personnel will be in charge of the collection, maintenance and organization of all
document assessments done during and after the project. Document records include
process/product assessment reports, preventive/correction action reports, the SQ Activity
Schedule, progress reports, etc. Keeping these records provides traceability of assessments
performed during and after the project’s development.

13. Training
Software Quality personnel shall have fundamental knowledge in the following
areas/disciplines through prior experience, training, or certification in methodologies,
processes, and standards.

 Software Quality Assurance


 Audits and Reviews
 Risk Management
 Configuration Management
 ISO 9001:2008
Appropriate training has been provided to the Quality Assurance Management team. It is,
however, the responsibility of the Quality Assurance Management team to keep their
knowledge up to date during periods when no training is available.

14. Risk Management


Risk management is the identification, assessment, and prioritization of risks followed by
coordinated and economical application of resources to minimize, monitor, and control the
probability and/or impact of unfortunate events or to maximize the realization of opportunities.
The strategies to manage risk include transferring the risk to another party, avoiding the risk,
reducing the negative effect of the risk, and accepting some or all of the consequences of a
particular risk.

Risk management includes the following activities:

 Planning how risk will be managed in the particular project. The plan should include
risk management tasks, responsibilities, activities, and budget.
 Assigning a risk officer - a team member other than the project manager who is
responsible for foreseeing potential project problems.
 Maintaining a live project risk database. Each risk should have the following attributes:
opening date, title, short description, probability, and importance. Optionally, a risk
may have an assigned person responsible for its resolution and a date by which the
risk must be resolved.
 Creating an anonymous risk-reporting channel. Each team member should be able
to report any risk that he/she foresees in the project.
 Preparing mitigation plans for risks that are chosen to be mitigated. The purpose of
the mitigation plan is to describe how the particular risk will be handled – what will be
done, when, and by whom, to avoid the risk or to minimize its consequences if it
becomes a liability.
 Summarizing planned and encountered risks, the effectiveness of mitigation activities,
and the effort spent on risk management.
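The risk-database attributes listed above map naturally onto a simple record. The following sketch is illustrative only: the `exposure()` heuristic (probability times importance) is an assumption for prioritization, not a requirement of this plan.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class RiskRecord:
    """One entry in the live project risk database. Required attribute
    names follow the list above; owner and resolve_by are the optional
    attributes the plan mentions."""
    opening_date: date
    title: str
    short_description: str
    probability: float            # likelihood of occurrence, 0.0 to 1.0
    importance: int               # impact rating, e.g. 1 (low) to 5 (high)
    owner: Optional[str] = None   # person responsible for resolution
    resolve_by: Optional[date] = None

    def exposure(self) -> float:
        # Assumed prioritization heuristic: probability times importance.
        return self.probability * self.importance
```

Sorting open records by `exposure()` gives the risk officer a first-cut priority order for mitigation planning.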

15. SQA Plan Change Procedure and History


SQ personnel will be in charge of the maintenance of this SQA plan. Changes in SQ
activities and other related processes must be reflected in this plan, and the Record of
Changes must be updated when these changes are applied. Proposed changes shall be
submitted to the QAM along with supporting material justifying the proposed change.
Table of Appendices

Appendix A – List of Abbreviations and Acronyms


Abbreviation/Acronym | Definition
AR Acceptance Review
CEI Contract End Item
CDR Critical Design Review
CVS Concurrent Versioning System
FQR Formal Qualification Review
ICD Interface Control Document
IDE Integrated Development Environment
IRS Interface Requirements Specification
ISO International Organization for Standardization
ORR Operational Readiness Review
PDR Preliminary Design Review
PM Project Manager
PPQA Process and Product Quality Assurance
PRR Production Readiness Review
QA Quality Assurance
QAM Quality Assurance Manager
QAP Quality Assurance Personnel
SCM Software Configuration Management
SCMP Software Configuration Management Plan
SCR Software Concept Review
SDP Software Development Plan
SQA Software Quality Assurance
SQAP Software Quality Assurance Plan
SRD System Requirements Document
SRS Software Requirements Specification
SSR Software Specification Review
SVD Software Version Description
TRR Test Readiness Review
UAT User Acceptance Testing
Appendix B – Business Process / System Flowchart
Appendix C – Sequence Diagrams
Appendix D – Test Procedures

Test Scenario: Verify system registration of new user


Test Case Definition: This test case verifies that the system registration allows for the
creation of new Watercooler users with access to the system.
Preconditions: Nil
Step Action Expected Results
1. Click on the Login link. The System Login page must be
displayed. To the right, link and button
for sign-up/registration must be
displayed.
2. Click on the Register button. The Watercooler Sign Up page must be
displayed.
3. Fill up the registration form, specifying The Acknowledgement page must be
an unregistered Email. Click on the displayed indicating a successful
Sign Up button. registration.

An email must be sent to the registered
email address with an activation link
provided.
4. In the registration confirmation email, The account is activated, and the user
click on the provided activation link. is automatically logged in to
Watercooler.

Name of the logged in user must be
displayed in the header.

Upon log in, the Followed Threads
page must be displayed.

Test Scenario: Verify profile update of existing user


Test Case Definition: This test case verifies that the system allows the registered user to
modify his/her account details.
Preconditions: User is logged in.
Step Action Expected Results
1. Click on the header link with the user’s The User Profile > Info page must be
name. displayed with details shown as read-
only.
2. Click on the Change Password link. The Change Password page must be
displayed with editable fields.
3. Enter the old and new passwords. Entered values must be masked. Copy
or paste to and from the password
fields must not be allowed.
4. Click on the Submit button. Password change must be stored into
the database in encrypted (not plain
text) format.
5. Click on the Edit Profile link. The Edit Profile page must be displayed
with editable fields.
6. Modify the details displayed, specifying Changes to the fields must be stored
values for all provided fields, then click into the database.
on the Submit button.
Returning to the User Profile > Info
page must display the values as
entered in the Edit Profile page.
7. Click on the Contact Settings link. The Contact Settings page must be
displayed with editable checkboxes.
8. Modify the details displayed such that Changes to the fields must be stored
there are some items checked and into the database.
some that are unchecked. Click on the
Submit button.
9. Log out of Watercooler. The System Login page must be
displayed.
10. Attempt to login using the original A system error must be displayed
password. indicating that an invalid email or
password has been provided.
11. Search for the Watercooler account to Displayed details must be according to
view the profile. what was specified in the Contact
Settings page.
 Fields with the corresponding
checkbox marked must be
displayed.
 Fields with the corresponding
checkbox unmarked must be
hidden.
12. Attempt to login using the new User must be successfully logged in to
password. the system.

Test Scenario: Verify creation of new threads and tags


Test Case Definition: This test case verifies that the registered user can create new
threads and tags.
Preconditions: User is logged in.
Step Action Expected Results
1. Click on the Ask Now! link from the The Ask Question page must be
header. displayed.
2. Enter the Title, Message and Tags, The page must refresh such that the
then click on the Submit button. created thread is displayed.

The Vote Up/Down icons must be
disabled for threads created by the
logged in user.
3. View the user profile, and go to the The thread created in the previous step
Created Threads tab. must be included in the list.
4. Click on the Tags link from the header. The tags used in the created thread
must be included in the list, under the
corresponding letter of the tag’s initial
letter.
Test Scenario: Verify posting of a reply to forum threads
Test Case Definition: This test case verifies that the registered user can reply to existing
forum threads.
Preconditions:
- User is logged in.
- There must be forum threads already created.
Step Action Expected Results
1. Click on the Forum link from the The Forum page must be displayed
header. with the threads sorted according to the
Date Posted with the newest thread
displayed first.
2. Click on the title of one of the threads. The Thread page must be displayed
with the selected thread displayed.
3. Input values into the Post Reply memo The page must be refreshed with the
box, then click on the Reply button. added reply displayed as read-only.
4. Click on the Forum link to return to the The corresponding number of replies
Forum page. for the selected thread must increment
by 1.
5. Click on the title link of the previously The Thread page must be displayed
selected thread. with the selected thread displayed.
6. Click on the Quote link of the thread or The page must be refreshed with the
of any existing reply. Post Reply memo box containing the
quoted post.
7. Click on the Reply button. The corresponding number of replies
for the selected thread must increment
by 1.

Test Scenario: Verify voting up/down of an existing thread


Test Case Definition: This test case verifies that the registered user can vote up/down an
existing thread.
Preconditions:
- User is logged in.
- There must be forum threads already created by another user.
Step Action Expected Results
1. Click on the Forum link from the The Forum page must be displayed
header. with the threads sorted according to the
Date Posted with the newest thread
displayed first.
2. Click on the title of one of the threads. The Thread page must be displayed
with the selected thread displayed.
3. Click on the Vote Up icon. The voting icons must then be disabled.
The Rating value must then be updated
with the recalculated value.

The Rating value is the percentage of
vote ups over the total number of votes.
4. Click on the Forum link to return to the The corresponding rating of the
Forum page. selected thread must show the newly
recalculated value.
Test Scenario: Verify flagging of forum posts
Test Case Definition: This test case verifies that the registered user can flag an existing
thread, and that the admin user can unflag or delete the flagged post.
Preconditions:
- User is logged in.
- There must be forum threads already created with replies.
Step Action Expected Results
1. Login as a non-Admin user. Upon successful login, the Followed
Threads page must be displayed.
2. Click on the Forum link from the The Forum page must be displayed
header. with the threads sorted according to the
Date Posted with the newest thread
displayed first.
3. Click on the title of one of the threads. The Thread page must be displayed
with the selected thread displayed.
4. Click on the Flag link for the initial post. Upon confirming the action, the page
must be refreshed. A red flagged icon
must be displayed to indicate that the
thread has been flagged.

An email notification containing a link to
the thread must then be sent to the
Admin user.
5. Click on the Forum link to return to the The flagged thread must then have a
Forum page. red flagged icon displayed.
6. Click on the title of one of the threads The Thread page must be displayed
that already have replies. with the selected thread displayed.
7. Click on the Flag link for one of the Upon confirming the action, the page
replies. must be refreshed. A red flagged icon
must be displayed corresponding to the
flagged reply.

An email notification containing a link to
the thread must then be sent to the
Admin user.
8. Log out from the system, then log in as Upon successful login, the Followed
an Admin user. Threads page must be displayed.
9. Open email client to retrieve the email The thread that had been flagged or
notifications. In one of the email containing a reply that had been
notifications for flagged posts, click on flagged must be displayed.
the provided link.
10. Click on the Unflag link. The corresponding reply or thread
becomes unflagged.
11. Click on the Delete link. The corresponding reply or thread must
then be deleted.
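The flag, unflag, and delete behaviour exercised by this scenario can be summarised as a small state model. Everything below (the `Post` class, its fields, and the notification list) is hypothetical and only illustrates the expected transitions; it is not the Watercooler implementation.

```python
from dataclasses import dataclass

@dataclass
class Post:
    # Hypothetical model of a forum thread post or reply, for illustration only.
    text: str
    flagged: bool = False
    deleted: bool = False

def flag(post: Post, notifications: list) -> None:
    """A user flags a post; the Admin is notified (expected results of steps 4 and 7)."""
    post.flagged = True
    # The appended string stands in for the email notification sent to the Admin user.
    notifications.append(f"Flagged post: {post.text!r}")

def unflag(post: Post) -> None:
    """The Admin user removes the flag (step 10)."""
    post.flagged = False

def delete(post: Post) -> None:
    """The Admin user deletes the flagged post (step 11)."""
    post.deleted = True
```

Each transition in the model corresponds to one expected result in the table: flagging sets the red flag icon and triggers a notification, unflagging clears the flag, and deletion removes the post.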
Test Scenario: Verify user moderation by Admin user
Test Case Definition: This test case verifies that the Admin user can view the list of users,
promote users to moderators, and disable user accounts.
Preconditions: None.
Step Action and Expected Results
1. Action: Log in as an Admin user.
   Expected Results: Upon successful login, the Followed Threads page must be displayed. For Admin users, there must be an additional tab labelled “Manage Accounts”.
2. Action: Click on the Manage Accounts tab.
   Expected Results: The list of registered non-Admin users must be displayed on the page.
3. Action: For one of the existing user accounts listed, click on the Promote link.
   Expected Results: Upon confirmation, the page must be refreshed and the promoted user must no longer appear in the list.
4. Action: Click on the Moderators link.
   Expected Results: The list of registered users with Admin rights must be displayed on the page. The promoted user from the previous step must appear in this list.
5. Action: Click on the Users link.
   Expected Results: The list of registered non-Admin users must be displayed on the page.
6. Action: Click on the Disable link.
   Expected Results: Upon confirmation, the page must be refreshed. Instead of “Disable”, an “Enable” link must be displayed.
7. Action: Log out of Watercooler.
   Expected Results: The System Login page must be displayed.

Test Scenario: Verify failed login of disabled accounts
Test Case Definition: This test case verifies that a registered user whose account has been
disabled will not be allowed to log in to Watercooler.
Preconditions:
- A user account that has been disabled.
- Another user account that is still enabled.
Step Action and Expected Results
1. Action: Log in using a disabled account.
   Expected Results: An error message must be displayed indicating a failed login.
2. Action: Log in using an enabled account.
   Expected Results: The user must be successfully logged in. By default, the Followed Threads page must be displayed.
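The expected login behaviour for enabled and disabled accounts can be sketched as a single check. The account dictionary, field names, and messages below are assumptions made for illustration; they are not the actual Watercooler authentication code.

```python
def attempt_login(accounts: dict, username: str, password: str) -> str:
    """Return the landing page on success, or an error message on failure.

    Hypothetical sketch: a disabled account must be rejected even with correct
    credentials (step 1 of the scenario), while an enabled account lands on the
    Followed Threads page (step 2).
    """
    account = accounts.get(username)
    if account is None or account["password"] != password:
        return "Error: invalid username or password."
    if not account["enabled"]:
        return "Error: this account has been disabled."
    return "Followed Threads"
```

Note that the disabled-account check runs only after the credentials match, so a wrong password on a disabled account still produces the generic failure message rather than revealing the account's status; this ordering is an assumed design choice.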

Test Scenario: Verify following/unfollowing of threads/tags
Test Case Definition: This test case verifies that the registered user can perform the
following:
- Follow/unfollow an existing tag
- View followed threads
- View threads under an existing tag
Preconditions:
- User is logged in.
- Forum threads must already exist.
Step Action and Expected Results
1. Action: Click on the Tags link from the header.
   Expected Results: The Tags page must be displayed with the index at A by default. Tags beginning with the letter A must be displayed.
2. Action: Click on one of the index links.
   Expected Results: The page must be refreshed with the tags beginning with the selected index displayed.
3. Action: Click on the Follow link beside a tag.
   Expected Results: The page must be refreshed. The tag must then be displayed with an Unfollow link beside it.
4. Action: Go to the User Profile > User’s Tags page.
   Expected Results: The followed tag must be included in the list of tags displayed. The followed tag must also be displayed on the right-hand side under the Followed Tags heading.
5. Action: Go to the User Profile > Followed Threads page.
   Expected Results: Forum threads having tags that the registered user follows must be displayed.
6. Action: Click on the Forum link.
   Expected Results: The Forum page must be displayed.
7. Action: Click on one of the tag buttons.
   Expected Results: The Forum page must be refreshed with the threads having the selected tag displayed.
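The Followed Threads page in step 5 effectively filters the forum by tag membership: a thread is shown if any of its tags is one the user follows. The sketch below illustrates that filter only; the data shapes and function name are assumptions, not the Watercooler implementation.

```python
def followed_threads(threads, followed_tags):
    """Return titles of threads whose tags intersect the user's followed tags.

    Hypothetical sketch of the Followed Threads filter (step 5 of the scenario).
    `threads` is a list of (title, tags) pairs; a thread qualifies if it shares
    at least one tag with `followed_tags`.
    """
    followed = set(followed_tags)
    return [title for title, tags in threads if followed & set(tags)]
```

The same intersection logic, restricted to a single tag, describes the tag-button filter in step 7.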
Appendix E – System Screenshots

Figure 1 - System Registration
Figure 2 - System Registration Acknowledgement
Figure 3 - Activation of Account
Figure 4 - Terms of Service
Figure 5 - System Login Page
Figure 6 - User Profile Page
Figure 7 - Edit User Profile Page
Figure 8 - Change Contact Settings Page
Figure 9 - Change Password Page
Figure 10 - Followed Threads Page
Figure 11 - Followed Tags Page
Figure 12 - Forums Page
Figure 13 - Create Topic Page
Figure 14 - View Topic Page
Figure 15 - Manage Accounts
Figure 16 - Delete / Lock Threads
Appendix F – Forms
FORM-001: INTERNAL AUDIT REPORT
Audit Date(s): IAR No.

Process:
Department:

Details of Non-Conformance: Classification:


Major
Minor
Observation

ISO Clause:

Audited by: Date:


Noted by: Date:

Analysis of root causes:

Corrective Action:

References:
Target Close-Out Date:
Prepared by: Date:
Approved by: Date:

Preventive Action:

References:
Target Close-Out Date:
Prepared by: Date:
Approved by: Date:

Follow-up required? No Yes Date:


Follow-up Conducted by: Date:
Remarks:

Reviewed and confirmed effective by:


Remarks:

Close-Out Date:
FORM-002: CORRECTIVE ACTION REPORT
CAR No. : __________________

PROCESS: DEPARTMENT/SECTION:

DETAILS OF NON-CONFORMANCE:

REFERENCES:

REPORTED BY: DATE:

INVESTIGATION RESULTS/ROOT CAUSE ANALYSIS:

CONDUCTED BY: DATE:

CORRECTIVE ACTIONS: RESPONSIBILITY TIMEFRAME

Target Date for evaluation of effectiveness:

PREPARED BY: DATE:


APPROVED BY: DATE:
CONFIRMED EFFECTIVE BY: DATE:
REMARKS:
FORM-003: PREVENTIVE ACTION REPORT
PAR No. : ______________

PROCESS: DEPARTMENT:

DETAILS OF POTENTIAL NON-CONFORMANCE/SUGGESTION:

REFERENCES/JUSTIFICATIONS:

REPORTED BY: DATE:

INVESTIGATION RESULTS/ROOT CAUSE ANALYSIS:

CONDUCTED BY: DATE:

PREVENTIVE ACTIONS: RESPONSIBILITY TIMEFRAME

Target Date for evaluation of effectiveness:

PREPARED BY: DATE:


APPROVED BY: DATE:
CONFIRMED EFFECTIVE BY: DATE:
REMARKS:
FORM-004: SOFTWARE TOOL EVALUATION REPORT

SOFTWARE TOOL EVALUATION

SQA:_________________________ DATE OF EVALUATION:________

Software Tool Evaluated:

Methods or criteria used in the evaluation:

Evaluation Results:

Recommended Corrective Actions:

Corrective Action Taken:


FORM-005: FACILITIES EVALUATION REPORT

PROJECT FACILITIES EVALUATION

SQA:_________________________ DATE OF EVALUATION:________

Facility Evaluated (Equipment, User/Test/Library Space):

Methods or criteria used in the evaluation:

Evaluation Results:

Recommended Corrective Actions:

Corrective Action Taken:
