
9/21/2020 Computer System Validation(CSV) | Ajay Kulkarni

Ajay Kulkarni

One place to get Software Testing Interview Questions and Answers

Computer System Validation (CSV)


BASIC DEFINITIONS OF VALIDATION ENGINEERING

Audit Trail: An audit trail must be secure, computer-generated, time-stamped, and independent. It records the date and time of operator entries and actions that create, modify, or delete electronic records.
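The requirements above (computer-generated, time-stamped entries for create/modify/delete actions, appended and never edited) can be illustrated with a minimal sketch; the function and field names are invented for illustration, not taken from any particular system:

```python
from datetime import datetime, timezone

# Hypothetical append-only audit trail: each entry is computer-generated
# and time-stamped; entries are only ever appended, never updated or deleted.
audit_trail = []

def record_action(operator, action, record_id):
    """Append a time-stamped entry for a create/modify/delete action."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "operator": operator,
        "action": action,          # e.g. "create", "modify", "delete"
        "record_id": record_id,
    }
    audit_trail.append(entry)
    return entry

record_action("jdoe", "modify", "BATCH-042")
```

A real implementation would also have to make the trail tamper-evident and secure, which a plain in-memory list does not.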

Active Server Pages (ASP): Microsoft's technology that enables HTML pages to be dynamic and interactive by embedding scripts, i.e. either VBScript or JScript, Microsoft's alternative to JavaScript. Since the scripts in ASP pages (suffix .asp) are processed by the server, any browser can work with ASP pages regardless of its support for the scripting language used therein.

CFR : Code of Federal Regulations

Computer system validation: Validation of the entire system: hardware, software, and procedures. (In effect, it is software testing in a pharmaceutical company.)

CLOSED/OPEN system:

CLOSED SYSTEM: An environment in which system access is controlled by persons who are responsible for the content of
electronic records that are on the system.

OPEN SYSTEM: An environment in which system access is NOT controlled by persons who are responsible for the content of
electronic records that are on the system

Computer Validation life cycle: Planning and Specification > Design > Construction > Testing > Installation > Acceptance Testing > Operation.

Computer system: any programmable device, including its software, hardware, peripherals, procedures, users, interconnections, and inputs, for the electronic processing and output of information used for reporting or control.

Core Dossier: Companies choose CoreDossier over other systems because:

It automates the entire regulatory publishing process

It is the only solution that transforms over 100 file formats to PDF while maintaining 100% page fidelity

It is the only solution that integrates with multiple repositories simultaneously

It can be implemented on one server, or multiple servers depending on need and scalability requirements

It supports multiple simultaneous output formats (e.g., PDF, paper, CD-ROM, Web, etc.)

It has successfully published over 5,000 submissions, including some of the largest submissions in the world
https://ajaykulkarnisoftwaretesting.wordpress.com/computer-system-validationcsv/ 1/66
Documentum 4i: Documentum 4i is an enterprise-level document-management platform that can handle any document type but is particularly well-suited to compound documents that have a hierarchical structure. The product features a three-tier architecture:

a choice of relational databases for storage;

the Documentum server providing the object layer;

a Documentum or Web browser client.

Disaster recovery: Disaster Recovery Associates develops and provides business continuity planning software, as well as maintaining the Disaster Recovery Directory. The product range covers all phases, from business impact analysis to testing of the plan itself.

Desktop Client: End user initiates signing through a desktop client module that integrates out-of-the-box with commonly used
applications.

Document cycle:

(1) Project initiation

(2) SRS, FRS, URS

(3) TDD/DS

(4) Coding

(5) IQ/OQ/PQ/USD

(6) Maintenance

(7) Retirement plan

DCM (Documentum Compliance Manager) 4.3: the Web Interface for DCM 4.3 includes:

Document Relationship Management

Manual Promotion of Controlled Content

DCM Reports

Issue To-Be Read Notifications

Issue Change Requests and Change Notices

Create and Edit DCM Routers

Developer Studio: Are compiled; can be written in different languages, such as Visual C++ and Visual Basic; can access local computer resources as well as the Developer Studio object model; can use arbitrary modal dialog boxes; can directly read from or write to files on disk, using the Win32 API; can use early binding for better run-time performance; can control another application (.EXE).

Digital signature: an electronic signature based upon cryptographic methods of originator authentication, computed by using a set of rules and a set of parameters such that the identity of the signer and the integrity of the data can be verified.
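To illustrate the sign-and-verify idea in the definition: Python's standard library has no asymmetric signing, so this sketch uses an HMAC as a stand-in; real digital signatures rely on asymmetric cryptography (e.g. RSA or ECDSA), and the key and record values here are invented:

```python
import hashlib
import hmac

# Illustrative only: real digital signatures use asymmetric cryptography
# (RSA/ECDSA); an HMAC stands in here to show the sign/verify mechanics.
SECRET = b"originator-key"  # hypothetical key material

def sign(record: bytes) -> str:
    """Compute a signature over the record with the originator's key."""
    return hmac.new(SECRET, record, hashlib.sha256).hexdigest()

def verify(record: bytes, signature: str) -> bool:
    """True only if the record is unaltered and the key holder signed it."""
    return hmac.compare_digest(sign(record), signature)

sig = sign(b"Batch 42 released")
assert verify(b"Batch 42 released", sig)       # intact record verifies
assert not verify(b"Batch 42 recalled", sig)   # tampering is detected
```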

EDMS (Engineering Data Management Service): EDMS is a suite of software products that work together to provide an electronic document management system. It has two client modes:

(1) Desktop

(2) WebTop (we use WebTop)

The various applications it supports are:

(1) Document management

(2) Import management

(3) Records manager

(4) Web publisher

(5) Business process manager, etc.

EDMS provides:

(1) Tight integration with authoring tools such as MS Word and PowerPoint.

(2) The ability to easily scan and capture existing documents.

(3) Powerful workflow automation for the processes associated with creating, reviewing, approving, distributing, and archiving documents.

(4) Full-text searching for users.

(5) Unlimited scalability, supporting thousands of users and billions of documents.

(6) Security based on users and roles that protects valuable enterprise content and knowledge documents.

Electronic Signature: A computer data compilation of any symbol or series of symbols executed, adopted, or authorized by an individual to be the legally binding equivalent of the individual's handwritten signature.

End-to-end testing: Testing of a complete application environment in a situation that mimics real-world use, such as interacting with a database, using network communication, or interacting with other hardware, applications, or systems as appropriate.

FDA compliances: The FDA compliance statements refer to the FDA limitations only.

Functional design specification (FDS): It provides the basis of the design of the system and is used to verify and validate the system during testing, ensuring that all the required functions are present and that they operate correctly.

Gap analysis: To conduct a gap analysis, one must identify gaps, holes, and areas of contention where the application under observation does not meet the standard requirements. An identified gap suggests that a functionality or process needs further development, whether by modifying or improving faulty elements or by creating missing ones.
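At its simplest, a gap analysis can be sketched as a set difference between required and implemented functionality; the feature names below are invented for illustration:

```python
# Hypothetical gap analysis: compare the standard requirements against
# what the application under observation actually implements.
required = {"audit trail", "e-signatures", "access control", "backup"}
implemented = {"audit trail", "access control"}

# Functions or processes needing further development (the "gaps").
gaps = required - implemented
assert gaps == {"e-signatures", "backup"}
```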

GCP (Good Clinical Practice):

-Centralized Laboratory

-Data Acquisition and Reporting

-Remote Data Entry

-Case Report Form System

-Clinical Supply System

GLP (Good Laboratory Practice): a management system that describes the organization and conditions of quality, integrity, and validation under which laboratory studies are conducted.

-Laboratory Information Management

-Laboratory Robotics

-Toxicology System

-Stability System

-Environmental Impact

GMP (Good Manufacturing Practice): a management system that describes the organization and conditions of quality, safety, and efficacy for the preparation of finished drug products.

-Manufacturing Execution

-Maintenance Management

-Calibration Management

-Facility management System

-Supply Chain Planning

Installation Qualification (IQ): It confirms that the software was installed successfully according to the manufacturer's specifications for correct and reliable functioning. IQ confirms that the correct system components, products, and software are installed, expected system parameters are set, the needed file structures, directories, and databases are in place, and the proper profiles are set up.

LIMS (Laboratory Information Management System): A LIMS brings lab data management information into a single location to make online searching as quick, simple, and easy as possible. Designed especially for analytical labs, including research and development (R&D) labs, in-process testing labs, quality assurance (QA) labs, and more. Versions/modules: "Limsophy", which is now the standard LIMS of AAC, designed for Windows, Oracle, and MS SQL Server.

Labware LIMS: LabWare LIMS is a full-featured client/server Laboratory Information Management System (LIMS) integrated into the Windows environment. This architecture combines the power and security of a typical server with the ease of use provided by the Microsoft GUI. LabWare LIMS is one of the world's first client-configurable LIMS products, providing an unparalleled degree of client involvement in adapting the software to their specific needs.

Operation Qualification (OQ): It proves that the system complies with the operational process requirements and works as intended throughout its operational range. OQ confirms that the system operates properly when working in its operational environment while applications are running throughout the anticipated range of system inputs and other variables. Required SOPs are ready and users have been trained.

Performance Qualification (PQ): PQ proves the system performs consistently as intended during normal operational use and
remains in compliance with regulatory and user expectations.

PQ /UAT summary report:

Summarizes the following:-

Testing environment

Testing performed

Test results

Limitations that could not be corrected

Unresolved errors

Each test script will be given a status of PASS or FAIL.
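The PASS/FAIL roll-up that feeds the summary report can be sketched as follows; the script IDs and statuses are hypothetical:

```python
# Hypothetical test-script results feeding a PQ/UAT summary report.
results = {
    "TS-001": "PASS",
    "TS-002": "FAIL",
    "TS-003": "PASS",
}

# Unresolved failures must be listed in the summary, not silently dropped.
failed = [tid for tid, status in results.items() if status == "FAIL"]
summary = f"{len(results)} scripts executed, {len(failed)} failed: {failed}"
```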

Perform a Periodic Evaluation:

Regulatory compliance dictates that GxP systems must be validated and remain in a validated state. SPRI-05 states that periodic reviews must be performed every 3 years from the last full validation or the last periodic evaluation. It is done on validated systems already in production.

Periodic Evaluation Process:

Collect/assess available documentation

Prepare periodic review and evaluation reports

PDF Aqua: PDF Aqua puts the power of controlling mission-critical documents back in your hands by eliminating manual procedures and establishing a highly efficient electronic environment with rules-based security. With PDF Aqua you can define view- and/or print-time document access and watermarking, and the overlaying of document metadata onto document headers, footers, and bodies on the fly as the document is released from the document base, ensuring that the document security settings always reflect the user's permissions.

Quick Test Pro: Quick Test Professional is a functional test automation tool that satisfies the needs of both technical and non-technical users. It enables you to deploy higher-quality applications faster, cheaper, and with less risk.

Retirement Plan: A document that describes the overall strategy and responsible parties for moving a computer system from
operation status to inactive status.

Requirements traceability matrix: This is used to verify that all stated and derived requirements are allocated to system components and other deliverables. The matrix is also used to determine the source of requirements. Requirements traceability includes tracing to things other than software that satisfy the requirements, such as capabilities, design elements, manual operations, tests, etc.

Remediation plan: a plan to bring a system back into compliance.

Retrospective Evaluation: To justify that a legacy system has undergone review and that the current system supports data integrity.

SLC (Software Life Cycle): The life cycle begins when an Application is first conceived and ends when it is no longer in use. It
includes aspects such as initial concepts, requirements analysis, functional design, internal design, documentation planning,
test planning, coding, document preparation, integration, testing, maintenance, updates, retesting, phase-out, and other
aspects.

SQL*LIMS: a complete laboratory information management system that aids in sample tracking, laboratory processes, workflow, data access, storage, and regulatory compliance.

Scripting language: A programming language designed specifically for Web site programming. Examples include JavaScript
and VBScript.

SOP (standard operating procedure): SOPs must be written specifically for the computer system to describe restart/recovery, security, source code maintenance, and raw data storage, archive, and retrieval. Among the SOPs to be reviewed are those covering planning, requirements, design, code, test, configuration management, and quality assurance. An SOP is a step-by-step description of how to perform an activity.

Security: The sophisticated use of security restrictions, such as password authorization, allows the system to support:

(1) Different versions of the same document, i.e. previous versions of an updated file link to one another and are available for viewing.

(2) Viewers can add notes to a document, or alter or create versions.

(3) Multiple component creation.

(4) Taking different pieces of different files and putting them together, as required, in one web document.

(5) Audit trail logging.

SDLC(Software Development Life Cycle):

(1) concept and planning

(2) requirement gathering

(3) design

(4) implementation

(5) validation/test

(6) maintenances

(7) retirement

SOPs you have written:

Testing inputs and commands

Security testing

Stress testing

Backup and recovery

Startup shutdown and recovery

Archive process

SQL*LIMS: structured query Language *lab Information management System.

Types of validation: The two types of validation are:

Prospective validation: the validation of a new system as it is developed

Retrospective validation: the validation of an existing system

Test cases: A document that describes an input, action, or event and the expected response, to determine whether a feature of an application is working correctly. It should contain particulars such as the test case identifier, test case name, objective, test conditions, input data requirements, steps, and expected results.
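The particulars listed above map naturally onto a simple record type; the field names and example values below are illustrative, not a prescribed format:

```python
from dataclasses import dataclass, field

# Hypothetical test-case record carrying the particulars listed above.
@dataclass
class TestCase:
    identifier: str
    name: str
    objective: str
    conditions: list = field(default_factory=list)   # test conditions/setup
    input_data: list = field(default_factory=list)   # input data requirements
    steps: list = field(default_factory=list)
    expected_results: list = field(default_factory=list)

tc = TestCase(
    identifier="TC-042",
    name="Login lockout",
    objective="Verify the account locks after 3 failed logins",
    steps=["Enter a wrong password 3 times", "Attempt a 4th login"],
    expected_results=["Account is locked", "Audit trail entry is written"],
)
```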

Test scripts: A document written to give detailed instructions for the setup, operation, and evaluation of results for a given test of a specific module of the target system being tested under the test plan. The following are included in test scripts:

Test description

Acceptance criteria

Expected and actual result

Initials and dates of execution

Name of testers and reviewers

Test Plan: a high-level document that describes the key elements associated with the testing of a computer system. A software project test plan is a document that describes the objectives, scope, approach, and focus of a software testing effort.

Test Plan Execution:

Execute Test Scripts

Record each issue on a separate incident form

Log all issues via an incident log

Make appropriate changes

Re-test all modifications

Prepare PQ/UAT summary report

Test Director: It is a test management tool with many functions, such as bug tracking and reporting, managing all test cases, and storing all cases.

Technical design documentation (TDD): it is a document produced by the supplier prior to software coding.

User Requirement Specification (URS): A URS documents what the automation must or should do. The focus of the URS must be on the requirements themselves; these should be uniquely referenced by individual numbering and clearly highlighted within the document. Requirements must be unambiguous and written in a technical language that is understood by both user and supplier. The URS is the foundation of an automation project and must support the design, testing, installation, and operation of a control and automation system throughout its life cycle.

Validation Master Plan: A high level document that describes how your company validates equipment, processes, computers,
methods and so on. Such a document is typically called Validation Master Plan. It also includes information on who is
responsible for what (by function, not by person’s name). Such a document should be developed for corporate, departments or
sites.

Validation Summary Report: it includes all the test results and summarizes all the reports.

Validation process:

More than just testing

Overall QA Program

DEFINE the system

DESIGN the system

DEMONSTRATE performance of intended function and detection of errors.

DOCUMENT the system and the process

Validation lifecycle: URS (User Requirement Specification) >>> FDS (Functional Design Specification) >>> HDS (Hardware Design Specification) / SDS (Software Design Specification) >>> Engineering >>> IQ (Installation Qualification) >>> OQ (Operational Qualification) >>> PQ (Performance Qualification)

Validation Facets: The validation effort consists of 5 specific facets or processes; each alone would not constitute a validation:

The Validation Master Plan (VMP)

The Project Plan

Installation Qualification (IQ)

Operational Qualification (OQ)

Performance or Process Qualification (PQ)

VSR and VSD:

VSD (Validation Summary Document): it indicates what to do and who is going to do it.

VSR (Validation summary Report): it indicates what was done, who did it and what the results were.

The VSD and VSR documents are considered "Bookend Deliverables".

Workflow manager: The Workflow Manager helps you manage site workflow for your organization. SiteMaker CMS users can
efficiently communicate with one another, assign and review tasks, and read important web page notes while authorized users
can review and publish web pages to the live site. Workflow Manager smoothly integrates disparate systems and structures
complex business processes into a unified framework.

21 CFR Parts 11, 210/211: Highlights of the Final Rule: The final rule provides criteria under which FDA will consider electronic records to be equivalent to paper records, and electronic signatures equivalent to traditional handwritten signatures.

21 CFR part 11:

(1) Security

(2) Calculation

(3) Environmental monitoring

(4) Stability

(5) Instrument interface

(6) Audit

—————————————————————————————————————————————————

Computer System Validation – It’s More Than Just Testing

Computer System Validation is the technical discipline that Life Science companies use to ensure that each Information
Technology application fulfills its intended purpose.

Stringent quality requirements in FDA regulated industries impose the need for specific controls and procedures throughout
the Software Development Life Cycle (SDLC).

Evidence that these controls and procedures have been followed and that they have resulted in quality software (software that
satisfies its requirements) must be documented correctly and completely.

These documents must be capable of standing up to close scrutiny by trained inspectors since the financial penalty for failing
an audit can be extremely high. More importantly, a problem in a Life Science software application that affects the production
environment could result in serious adverse consequences, including possible loss of life.

What is Computer System Validation and Why is it Important?


A key source document providing FDA guidance on the general topic of Validation is “General
Principles of Validation, Food and Drug Administration” from the Center for Drug Evaluation
and Research.

The definition of Validation in this document is:


Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes.

Validation is aimed at manufacturers of pharmaceuticals and medical devices who must demonstrate that their processes produce consistent product quality.

It applies to all processes that fall under FDA regulation, including, but not limited to, computer systems.

For example, Validation applies to pharmaceutical manufacturing processes which include checking, cleaning, and
documenting that all equipment used in manufacturing operates according to predetermined specifications.

Computer System Validation (or Computerized System Validation, as it is sometimes called in the literature) is the result of applying the above definition to a computer system:

Establishing documented evidence which provides a high degree of assurance that a Computer System will consistently produce results that meet its predetermined specifications and quality attributes.

Note: a “Computer System” in the Life Sciences sector is more than computer hardware and software. It also includes the
equipment and instruments linked to the system (if any) as well as the trained staff that operate the system and/or equipment
using Standard Operating Procedures (SOPs) and manuals.

The FDA definition of Validation is an umbrella term that is broader than the way the term validation is commonly used in the IT industry.

In the IT industry, validation usually refers to performing tests of software against its requirements. A related term in the IT world is verification, which usually refers to inspections, walkthroughs, and other reviews and activities aimed at ensuring that the results of successive steps in the software development cycle correctly embrace the intentions of the previous step.

FDA Validation of computer systems includes all of these activities with a key focus on
producing documented evidence that will be readily available for inspection by the FDA. So
testing in the sense of executing the software is only one of multiple techniques used in Computer System Validation.

—————————————————————————————————————————————————

Validation FAQ (Frequently Asked Questions about Validation)

Q: When do I need to validate my systems?


A: Validation is required when your system (computer system, equipment, process, or method) is used in a GxP process or
used to make decisions about the quality of the product. In addition, if the system is used to generate information for
submissions to regulatory bodies like the FDA, the system needs to be validated.

Q: How does validation add value to my system?


A: Validation adds value to systems by demonstrating that the system will perform as expected. Validation also reduces the risk of regulatory non-compliance.

Q: Do I need to validate my computer system?


A: Computer system validation is required for systems used to store electronic records, according to FDA 21 CFR Part 11.10(a)
and Annex 11 Paragraph 4.

Q: Where do I find the rules for validating pharmaceutical manufacturing processes and equipment?
A: Guidelines for validation for pharmaceutical manufacturing are in FDA 21 CFR 211.

Q: What federal rules are in place regulating Quality Systems?


A: Quality System regulation is located in FDA 21 CFR 820.

Q: Why are there so many documents?


A: Proper documentation is required to demonstrate that the system was tested, including validation planning, protocol
execution, and quality review. From a regulatory auditor’s point of view, if you don’t document what you did, you didn’t do it.

Q: Am I allowed to change a validated system?


A: Changing validated systems requires Change Control to ensure that there are no unexpected or unrecorded changes to the
system.

Q: What is GAMP?
A: GAMP is an acronym for Good Automated Manufacturing Practice. GAMP contains a collection of industry best practices for validation.

Q: What is ICH?
A: ICH is an acronym for the International Conference on Harmonization of Technical Requirements for Registration of
Pharmaceuticals for Human Use. ICH is a collaboration of regulatory authorities from the United States, Europe, Japan, and
members of the pharmaceutical industry. ICH also issues industry best practices for validation.

Q: Can you tell me if my systems need to be validated?


A: Yes. Ofni Systems performs compliance assessments, or we can train your staff to do your own gap analysis.

Q: Do you do validations?
A: Yes. Ofni Systems is an industry-recognized leader in computer validation.

Q: Do you have tools to facilitate our validation process?


A: Yes. The FastVal Validation Document Generator can improve the quality of your validation documentation and help you
complete validation projects in 70% less time than traditional validation methods.

—————————————————————————————————————————————————

Validation Master Plans

Validation Master Plans discuss validation activities across an entire site or within an organization. The Validation Master
Plan is a summary of validation strategy. The purpose of the Validation Master Plan is to document the compliance
requirements for the site and to ensure that sufficient resources are available for validation projects.

Sometimes Validation Master Plans are written to cover specific departmental validation activities or the validation process for a specific type of system (for example, all programmable logic controllers (PLCs) within a manufacturing process). These master plans describe the specific validation process for that group or system type. Master plans are written to assist an organization with validation strategy or to provide control over a specific process.

The Validation Master Plan is different from a validation procedure (SOP), which describes the specific process for performing validation activities. When plans are written specifically for a single validation project, they are referred to as Validation Plans (http://www.ofnisystems.com/services/validation/validation-plans/). Sometimes master plans are named for their functional area, such as a Site Validation Master Plan, Pharmaceutical Validation Master Plan, or Software Master Plan.

Validation Master Plan Examples

The Validation Master Plan includes:

Systems, equipment, methods, facilities, etc., that are in the scope of the plan
Current validation status for the systems within the project scope
Compliance requirements for validation, including how the validated state will be maintained
Schedule of validation activities

Validation Master Plans can also include:

Required validation deliverables


Validation documentation format
Current validation procedures and policies
General validation risk mitigation strategy

Validation Master Plans should be approved by the head of Site Quality, plus other senior department heads as appropriate.
Senior management approval is necessary for Validation Master Plans because their support is essential for the success of the
plan.

—————————————————————————————————————————————————-

Validation Plans (VP)

Validation Plans define the scope and goals of a validation project. The Validation Plan is written at the start of the validation project (sometimes concurrently with the user requirement specification) and is usually specific to a single validation project.

The collection of documents produced during a validation project is called a Validation Package. Once the validation project is
complete, all documents in the validation package should be stored according to your site document control procedures.

Validation Plans are different than Validation Master Plans (http://www.ofnisystems.com/validation-master-plans/). Validation Plans are usually project specific; Validation Master Plans govern validation activities for an entire organization or site. Sometimes plans are also named for the applicable subject area, such as a Software Validation Plan.

Validation Plan Examples

A Validation Plan should include:

Deliverables (documents) to be generated during the validation process


Resources, departments, and personnel to participate in the validation project
Time-lines for completing the validation project
Acceptance criteria to confirm that the system meets defined requirements
Compliance requirements for the system, including how the system will meet these requirements

The plan should be written with an amount of detail that reflects system complexity.

The plans should be approved, at a minimum, by the System Owner (http://www.ofnisystems.com/services/validation/validation-terminology/#System_Owner) and Quality Assurance (http://www.ofnisystems.com/services/validation/validation-terminology/#Quality_Assurance). Once approved, the plan should be retained according to your site document control procedures.

Frequently Asked Questions about Validation Plans

Q: What is the definition of Validation Plan?


A: The FDA uses the NIST definition: A management document describing the approach taken for a project. The plan typically
describes work to be done, resources required, methods to be used, configuration management and quality assurance
procedures to be followed, schedules to be met, project organization, etc. Project in this context is a generic term. Some projects
may also need integration plans, security plans, test plans, quality assurance plans, etc. (See: documentation plan, software
development plan, test plan, software engineering.) In practice, the validation plan describes how the validation project is
going to be performed.

Q: Can I see an example of a validation plan?


A: We have a sample validation plan (http://www.ofnisystems.com/products/fastval/details/validation-plan-template/) available for download.
—————————————————————————————————————————————————-

Risk Assessments

In validation, a Risk Assessment documents the potential business and compliance risks associated with a system and the strategies that will be used to mitigate those risks. Risk Assessments justify the allocation of validation resources and can streamline the testing process. They also serve as a forum for users, developers, system owners, and Quality to discuss the system, which can have other intangible benefits. 21 CFR 11 does not require risk assessments, but Annex 11 does require a risk-management strategy.

Assigning risk should be a multi-disciplinary function. System owners, key end-users, system developers, information
technology, engineers, and Quality should all participate if they are involved with the system. The Risk Assessment should be
signed by the personnel who participated in the assessment.

Risk Assessment Examples

There are many methods for Risk Assessment, but they generally all include rating risk for each requirement in at least three
specific categories:

Criticality – How important a function is to system functionality. Low criticality means that the system can continue to function relatively normally even if the function is completely compromised. High criticality means that if the function is damaged, one of the primary functions of the system cannot be accomplished.
Detectability – The ease of detecting an issue arising with a particular function. It is riskier if there is a low chance of detection; high chances of detection correspond to lower risk.
Probability – The probability of an issue arising with a particular function. Low probability means there is little chance that the function will fail; high probability means there is a high chance that the function will fail.
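One common way to combine the three ratings into a single number is a risk priority score; the 1-3 scales and the multiplication rule below are an assumed convention for illustration, not something the text prescribes:

```python
# Hypothetical 1-3 scales (1 = low risk contribution, 3 = high).
# Note detectability is inverted: hard-to-detect issues score higher.
def risk_score(criticality: int, detectability: int, probability: int) -> int:
    """Multiply the three ratings into a single risk priority number."""
    return criticality * detectability * probability

# A critical, hard-to-detect, likely failure outranks a minor unlikely one,
# so it should receive more validation and testing attention.
assert risk_score(3, 3, 3) > risk_score(1, 1, 2)
```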

—————————————————————————————————————————————————-

User Requirements Specification

The User Requirements Specification describes the business needs for what users require from the system. User Requirements
Specifications are written early in the validation process, typically before the system is created. They are written by the system
owner and end-users, with input from Quality Assurance. Requirements outlined in the URS are usually tested in the
Performance Qualification or User Acceptance Testing. User Requirements Specifications are not intended to be a technical
document; readers with only a general knowledge of the system should be able to understand the requirements outlined in the
URS.

The URS is generally a planning document, created when a business is planning on acquiring a system and is trying to
determine specific needs. When a system has already been created or acquired, or for less complex systems, the user
requirement specification can be combined with the functional requirements document
(http://www.ofnisystems.com/services/validation/functional-requirements/).

User Requirements Examples

Good requirements are objective and testable. For example:

Screen A accepts production information, including Lot, Product Number, and Date.
System B produces the Lab Summary Report.
Twenty users can use System C concurrently without noticeable system delays.

Screen D can print on-screen data to the printer.
System E will be compliant with 21 CFR 11 (http://www.ofnisystems.com/21-cfr-part-11/).

The URS should include:

Introduction – including the scope of the system, key objectives for the project, and the applicable regulatory concerns
Program Requirements – the functions and workflow that the system must be able to perform
Data Requirements – the type of information that a system must be able to process
Life Cycle Requirements – including how the system will be maintained and how users will be trained

For more examples and templates, see the User Requirements Specification Template
(http://www.ofnisystems.com/products/fastval/details/fastval-user-requirements-template/).

Requirements are usually provided with a unique identifier, such as an ID#, to aid in traceability throughout the validation
process.

User Requirements Specifications should be signed by the system owner, key end-users, and Quality. Once approved, the URS
is retained according to your organization’s practices for document retention.

Frequently Asked Questions

Q: Are User Requirements Specifications always required for validation?


A: When a system is being created, User Requirements Specifications are a valuable tool for ensuring the system will do what
users need it to do. In Retrospective Validation, where an existing system is being validated, user requirements are equivalent
to the Functional Requirements: the two documents can be combined into a single document.

—————————————————————————————————————————————————-

Functional Requirements

The Functional Requirements Specification documents the operations and activities that a system must be able to perform.

Functional Requirements should include:

Descriptions of data to be entered into the system


Descriptions of operations performed by each screen
Descriptions of work-flows performed by the system
Descriptions of system reports or other outputs
Who can enter the data into the system
How the system meets applicable regulatory requirements

The Functional Requirements Specification is designed to be read by a general audience. Readers should understand the
system, but no particular technical knowledge should be required to understand the document.

Examples of Functional Requirements

Functional requirements should include functions performed by specific screens, outlines of work-flows performed by the
system, and other business or compliance requirements the system must meet. Download an example functional requirements
specification (http://www.ofnisystems.com/products/fastval/details/fastval-functional-requirements-template/) or use these
quick examples below.

Interface requirements

Field 1 accepts numeric data entry.


Field 2 only accepts dates before the current date.
Screen 1 can print on-screen data to the printer.
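Interface requirements written this way translate directly into testable validation logic. A minimal sketch of the two field rules above (the function names, and any behavior beyond the stated requirements, are illustrative assumptions):

```python
from datetime import date

def validate_field_1(value):
    """Field 1 accepts numeric data entry."""
    try:
        float(value)
        return True
    except (TypeError, ValueError):
        return False

def validate_field_2(value):
    """Field 2 only accepts dates before the current date."""
    return isinstance(value, date) and value < date.today()
```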

Business Requirements

Data must be entered before a request can be approved.


Clicking the Approve button moves the request to the Approval Workflow.
All personnel using the system will be trained according to internal SOP AA-101.

Regulatory/Compliance Requirements

The database will have a functional audit trail.


The system will limit access to authorized users.
The spreadsheet can secure data with electronic signatures.

Security Requirements

Members of the Data Entry group can enter requests but cannot approve or delete requests.
Members of the Managers group can enter or approve a request but cannot delete requests.
Members of the Administrators group cannot enter or approve requests but can delete requests.
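The three security requirements above amount to a permission matrix, which a test protocol can check exhaustively. A hedged sketch (the group names come from the examples above; the data structure is illustrative):

```python
# Hypothetical permission matrix mirroring the three groups above.
PERMISSIONS = {
    "Data Entry":     {"enter"},
    "Managers":       {"enter", "approve"},
    "Administrators": {"delete"},
}

def can(group, action):
    """Return True if members of `group` may perform `action`."""
    return action in PERMISSIONS.get(group, set())
```

A security test case would then assert every cell of this matrix, including the negative cases, rather than only the allowed actions.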

Depending on the system being described, different categories of requirements are appropriate. System Owners, Key End-
Users, Developers, Engineers, and Quality Assurance should all participate in the requirement gathering process, as
appropriate to the system.

Requirements outlined in the Functional Requirements Specification are usually tested in the Operational Qualification.

Additional Comments

The Functional Requirements Specification describes what the system must do; how the system does it is described in the
Design Specification.

If a User Requirement Specification was written, all requirements outlined in the User Requirement Specification should be
addressed in the Functional Requirements Specification.

The Functional Requirements Specification should be signed by the System Owner and Quality Assurance. If key end-users,
developers, or engineers were involved with developing the requirements, it may be appropriate to have them sign and
approve the document as well.

Depending on the size and complexity of the program, the Functional Requirements Specification document can be combined
with either the user requirements specification or the design specification.

Frequently Asked Questions about Functional Requirements

Q: What is the difference between a User Requirement Specification and the Functional Requirement Specification?
A: User Requirements describe the end-user requirements for a system. Functional Requirements describe what the system
must do.

Q: Can I see an example of a functional specification?


A: We have a sample functional specification for an Excel spreadsheet available for download
(http://www.ofnisystems.com/products/fastval/details/fastval-functional-requirements-template/).

—————————————————————————————————————————————————-


Design Specification

Design Specifications describe how a system performs the requirements outlined in the Functional Requirements. Depending
on the system, this can include instructions on testing specific requirements, configuration settings, or review of functions or
code. All requirements outlined in the functional specification should be addressed; linking requirements between the
functional requirements and design specification is performed via the Traceability Matrix
(http://www.ofnisystems.com/services/validation/traceability-matrix/).

Design Specification Examples

Good requirements are objective and testable. Design Specifications may include:

Specific inputs, including data types, to be entered into the system


Calculations/code used to accomplish defined requirements
Outputs generated from the system
Technical measures used to ensure system security
How the system meets applicable regulatory requirements

For more examples and templates, see the FastVal Design Specification Template
(http://www.ofnisystems.com/products/fastval/details/fastval-design-specification-template/).

System Requirements and verification of the installation process are usually tested in the Installation Qualification. Input,
Processing, Output, and Security testing are usually tested in the Operational Qualification.

Due to the extremely technical nature of most design documents, there is currently some discussion in the industry about who
needs to review the Design Specification. The Design Specification is reviewed and approved, at minimum, by the System
Owner, System Developer, and Quality Assurance. Quality Assurance signs to ensure that the document complies with
appropriate regulations and that all requirements were successfully addressed, but they do not necessarily need to review
technical information.

Depending on the size and complexity of the program, the design specification may be combined with the functional
requirements document.

Frequently Asked Questions

Q: Can I see an example of a Design Specification?


A: We have a sample design specification for an Excel spreadsheet available for download
(http://www.ofnisystems.com/products/fastval/details/fastval-design-specification-template/).

—————————————————————————————————————————————————-

Test Protocols and Test Plans

In a validation project, Tests Plans or Test Protocols are used to demonstrate that a system meets requirements previously
established in specification, design, and configuration documents. Test Plans document the general testing strategy; Test
Protocols are the actual testing documents. In many cases, the Test Plan and Test Protocol are combined into a single
document.

The Test Plan outlines the testing requirements and strategy. It should include the general process for performing the testing,
documenting evidence of testing and the process for handling testing failures. The Test Plan may also include the types of
testing, descriptions of environments where testing will be performed, who is responsible for testing, equipment or testing that

will be used in testing, or other organizational requirements for testing.

Test Protocols describe the specific testing. Test Protocols are collections of Test Cases which check a specific element of the
system. Each test case should include the purpose of the test, any pre-requisites that need to be done before testing, and the
acceptance criteria for the test.

Each test case is made up of a series of test steps. Each step should include an instruction, an expected result, and the actual
result. The instructions should include enough detail so that a tester can consistently perform the required testing activity.
There should also be a place for the tester to assess whether each step passes or fails.
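The step and case structure described above maps naturally onto a simple record layout. A sketch of one possible representation (the field names are illustrative, not taken from any standard):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestStep:
    instruction: str               # what the tester does
    expected_result: str           # what should happen
    actual_result: str = ""        # recorded during execution
    passed: Optional[bool] = None  # tester's pass/fail assessment

@dataclass
class TestCase:
    purpose: str
    prerequisites: List[str]
    acceptance_criteria: str
    steps: List[TestStep] = field(default_factory=list)

    def deviations(self):
        """Steps whose actual result did not match the expected result."""
        return [s for s in self.steps if s.passed is False]
```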

The process of following the instructions and recording the results is called “executing” the protocol. When executing test
protocols, the tester should follow established Good Documentation Practices. This includes using a compliant computer
system to record the testing results or documenting the results with pen and paper. Any discrepancy between the expected result
and the actual result should be tracked as a deviation. Deviations should be resolved before validation is complete.

Software validation usually uses three specific testing protocols:

Installation Qualifications (http://www.ofnisystems.com/services/validation/installation-qualification/) (IQ) verify that
systems are on machines suited to run the software, that the system has been properly installed, and that the configuration is
correct. These requirements are outlined in the Design Specification.
Operational Qualifications (http://www.ofnisystems.com/services/validation/operational-qualification/) (OQ) verify that
systems perform as expected. The OQ tests requirements outlined in the Functional Requirements.
Performance Qualifications (http://www.ofnisystems.com/services/validation/performance-qualification/) (PQ) verify
that systems perform tasks in real-world conditions. The PQ tests requirements outlined in the User Requirement
Specification.

Engineering Validations sometimes use two additional testing protocols:

Factory Acceptance Test (FAT) – Factory acceptance tests attempt to verify that the equipment meets requirements
outlined in the User Requirement Specification or Functional Requirements. FATs are performed at the point of assembly.
Customers will often ask to be present for the FAT, though the tests are usually performed by the manufacturer. Many
customers do not allow the manufacturer to ship the item until it passes the factory acceptance test, and some contractual
payments are dependent upon the item passing the FAT.
User Acceptance Test (UAT) or Site Acceptance Test (SAT) – User and site acceptance tests verify that the item performs as
required by the User Requirement Specification or Functional Requirements. Once an item passes UAT/SAT, it is ready for
use, unless other contractual arrangements are made between the user and the vendor.

Test Protocols should be approved before protocol execution. A copy of the unexecuted protocol should be kept in the
validation package. The unexecuted protocol should be approved by the System Owner and Quality Assurance. The executed
protocol should be signed by the tester and reviewed by the system owner and Quality.

Frequently Asked Questions

Q: Can I document test cases using MS Word or MS Excel?


A: When electronic systems are used to perform regulated processes (like the verification of validation test protocols), they
need to be compliant with 21 CFR 11. MS Word and MS Excel do not, in their out-of-the-box state, have the necessary
technological controls, like individual user passwords or audit trails, required to be compliant with electronic records
requirements such as 21 CFR 11 or Annex 11.

Q: How does Ofni Systems document validation testing?


A: At Ofni Systems, we use FastVal to execute test protocols electronically
(http://www.ofnisystems.com/products/fastval/details/protocol-execution/). This allows us to execute protocols to ensure
requirement traceability and to generate the actual requirement traceability document. Other organizations might use Excel
spreadsheets to keep a table of requirements, despite this being extremely difficult to maintain manually.

—————————————————————————————————————————————————

Installation Qualification

The Installation Qualification Protocol verifies the proper installation and configuration of a System. This can include ensuring
that necessary files have been loaded, equipment has been installed, the necessary procedures have been approved, or the
appropriate personnel have been trained. The requirements to properly install the system were defined in the Design
Specification. Installation Qualification must be performed before completing the Operational Qualification
(http://www.ofnisystems.com/services/validation/operational-qualification/) or Performance Qualification
(http://www.ofnisystems.com/services/validation/performance-qualification/).

Depending on your needs and the complexity of the system, Installation Qualification can be combined with Operational
Qualification or Performance Qualification.

Installation Qualification protocols should be approved before protocol execution. A copy of the unexecuted protocol should
be kept in the validation package. The unexecuted protocol should be approved by the System Owner and Quality Assurance.
The executed protocol should be signed by the tester and reviewed by the system owner and Quality.

Installation Qualification Examples

Installation Qualification might test:

That the host machine has the appropriate operating system, processor, RAM, etc.
That all files required to run the system are present
That all documentation required to train system personnel has been approved

Each step of the qualification should include an instruction, an expected result, and the actual result. Any discrepancy between
the expected result and the actual result should be tracked as a deviation. Deviations should be resolved before validation is
complete.

For more examples, see our installation qualification template
(http://www.ofnisystems.com/products/fastval/details/installation-qualification-template/).

For an example of protocol execution, see our FastVal Electronic Protocol Execution
(http://www.ofnisystems.com/products/fastval/details/protocol-execution/).

Frequently Asked Questions

Q: What is the definition of Installation Qualification?


A: The FDA definition of installation qualification is: Establishing confidence that process equipment and ancillary systems are
compliant with appropriate codes and approved design intentions, and that manufacturer recommendations are suitably
considered. In practice, the installation qualification is the executed test protocol documenting that a system has the necessary
prerequisite conditions to function as expected.

Q: Can I see an example of an installation qualification?


A: We have a sample installation/operational qualification for an Excel spreadsheet available for download
(http://www.ofnisystems.com/products/fastval/details/installation-qualification-template/).

Q: Can I execute installation qualification test cases using MS Word or MS Excel?


A: When electronic systems are used to perform regulated processes (like the verification of validation test protocols), they
need to be compliant with 21 CFR 11. MS Word and MS Excel do not, in their out-of-the-box state, have the necessary
technological controls, like individual user passwords or audit trails, required to be compliant with electronic records
requirements such as 21 CFR 11 or Annex 11. Ofni Systems recommends that organizations do not perform installation
validation with non-compliant software like MS Word and MS Excel.

Q: How does Ofni Systems document validation testing?


A: At Ofni Systems, we use FastVal to execute test protocols electronically
(http://www.ofnisystems.com/products/fastval/details/protocol-execution/). This allows us to execute protocols to ensure
requirement traceability and to generate the actual requirement traceability document. Other organizations might use Excel
spreadsheets to keep a table of requirements, despite this being extremely difficult to maintain manually.

—————————————————————————————————————————————————-


Operational Qualification

The Operational Qualification Protocol is a collection of test cases used to verify the proper functioning of a system. The
operational qualification test requirements are defined in the Functional Requirements Specification. Operational Qualification
is usually performed before the system is released for use.

Depending on your needs and the complexity of the system, Operational Qualification can be combined with Installation
Qualification or Performance Qualification.

Operational Qualifications should be approved before protocol execution. A copy of the unexecuted protocol should be kept in
the validation package. The unexecuted protocol should be approved by the System Owner and Quality Assurance. The
executed protocol should be signed by the tester and reviewed by the system owner and Quality.

Operational Qualification Examples

For example, the operational qualification might test:

That each screen accepts the appropriate data


That an item can be moved through an entire workflow
That system security has been properly implemented
That all technological controls for compliance with 21 CFR 11 are functioning as expected
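As an illustration of the workflow bullet, an OQ test case can step an item through every state and confirm that illegal transitions are rejected. The states and transition rules below are hypothetical:

```python
# Hypothetical approval workflow: Draft -> Submitted -> Approved.
TRANSITIONS = {
    "Draft": {"Submitted"},
    "Submitted": {"Approved", "Draft"},  # approve, or send back for rework
    "Approved": set(),                   # terminal state
}

def move(state, new_state):
    """Advance an item, raising if the transition is not allowed."""
    if new_state not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state

def oq_full_workflow():
    """OQ check: an item can be moved through the entire workflow."""
    state = "Draft"
    state = move(state, "Submitted")
    state = move(state, "Approved")
    return state == "Approved"
```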

Each step of the qualification should include an instruction, an expected result, and the actual result. Any discrepancy between
the expected result and the actual result should be tracked as a deviation. Deviations should be resolved before validation is
complete.

For more examples, see our operational qualification template
(http://www.ofnisystems.com/products/fastval/details/fastval-operational-qualification-template/).

For an example of protocol execution, see our FastVal Electronic Protocol Execution
(http://www.ofnisystems.com/products/fastval/details/protocol-execution/).

Frequently Asked Questions

Q: What is the definition of Operational Qualification?


A: The FDA definition of operational qualification is: Establishing confidence that process equipment and sub-systems are
capable of consistently operating within stated limits and tolerances. In practice, the operational qualification is the executed
test protocol documenting that a system meets the defined functional requirements, or that the system does what it’s supposed
to do.

Q: Can I see an example of an operational qualification?


A: We have a sample installation/operational qualification for an Excel spreadsheet available for download
(http://www.ofnisystems.com/products/fastval/details/fastval-operational-qualification-template/).

Q: Can I execute operational qualification test cases using MS Word or MS Excel?


A: When electronic systems are used to perform regulated processes (like the verification of validation test protocols), they
need to be compliant with 21 CFR 11. MS Word and MS Excel do not, in their out-of-the-box state, have the necessary
technological controls, like individual user passwords or audit trails, required to be compliant with electronic records
requirements such as 21 CFR 11 or Annex 11. Ofni Systems recommends that organizations do not perform operational
validation with non-compliant software like MS Word and MS Excel.

Q: How does Ofni Systems document validation testing?


A: At Ofni Systems, we use FastVal to execute test protocols electronically
(http://www.ofnisystems.com/products/fastval/details/protocol-execution/). This allows us to execute protocols to ensure

requirement traceability and to generate the actual requirement traceability document. Other organizations might use Excel
spreadsheets to keep a table of requirements, despite this being extremely difficult to maintain manually.

—————————————————————————————————————————————————

Performance Qualification

Performance Qualifications are a collection of test cases used to verify that a system performs as expected under simulated real-
world conditions. The performance qualification tests requirements defined in the User Requirements Specification
(http://www.ofnisystems.com/services/validation/user-requirement-specifications/) (or possibly the Functional Requirements
Specification (http://www.ofnisystems.com/services/validation/functional-requirements/)). Sometimes the performance
qualification is performed by power users as the system is being released.

Depending on your needs and the complexity of the system, Performance Qualification can be combined with Installation
Qualification (http://www.ofnisystems.com/services/validation/installation-qualification/) or Operational Qualification
(http://www.ofnisystems.com/services/validation/operational-qualification/).

Performance Qualifications should be approved before protocol execution. A copy of the unexecuted protocol should be kept
in the validation package. The unexecuted protocol should be approved by the System Owner and Quality Assurance. The
executed protocol should be signed by the tester and reviewed by the system owner and Quality.

Performance Qualification Examples

For example, a performance qualification might demonstrate:

That a system can handle multiple users without significant system lag
That when the system contains large quantities of data, queries are returned in a certain (short) period of time
That concurrent independent work-flows do not affect each other
That a laboratory test correctly identifies a known material
That a process was completed within defined system requirements
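Criteria like the first two bullets become objective when the protocol times operations under simulated load. A sketch using only the standard library; the user count and time limit echo the earlier URS examples but are assumptions, not requirements from any regulation:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def pq_concurrent_queries(query, n_users=20, max_seconds=2.0):
    """PQ check: n_users concurrent queries each finish within max_seconds.

    Returns (passed, worst_duration_in_seconds).
    """
    def timed(_):
        start = time.perf_counter()
        query()
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=n_users) as pool:
        durations = list(pool.map(timed, range(n_users)))
    return all(d <= max_seconds for d in durations), max(durations)
```

In a real PQ, `query` would exercise the production-like system (e.g. run a report against a representatively large dataset), and the measured worst case would be recorded as the actual result of the test step.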

Each step of the qualification should include an instruction, an expected result, and the actual result. Any discrepancy between
the expected result and the actual result should be tracked as a deviation. Deviations should be resolved before validation is
complete.

For more examples, see our performance qualification template
(http://www.ofnisystems.com/products/fastval/details/fastval-performance-qualification-template/).

For an example of protocol execution, see our FastVal Electronic Protocol Execution
(http://www.ofnisystems.com/products/fastval/details/protocol-execution/).

Frequently Asked Questions

Q: What is the definition of Performance Qualification?


A: The FDA definition of performance qualification is: Establishing confidence through appropriate testing that the finished
product or process produced by a specified process meets all release requirements for functionality and safety and that
procedures are effective and reproducible. In practice, the performance qualification is the executed test protocol documenting
that a system meets the defined requirements to function in the production environment.

Q: Can I execute performance qualification or combined qualification testing cases using MS Word or MS Excel?
A: When electronic systems are used to perform regulated processes (like the verification of validation test protocols), they
need to be compliant with 21 CFR 11. MS Word and MS Excel do not, in their out-of-the-box state, have the necessary
technological controls, like individual user passwords or audit trails, required to be compliant with electronic records
requirements such as 21 CFR 11 (http://www.ofnisystems.com/21-cfr-part-11/) or Annex 11. Ofni Systems recommends that
organizations do not perform performance validation with non-compliant software like MS Word.

Q: How does Ofni Systems document validation testing?
A: At Ofni Systems, we use FastVal to execute test protocols electronically
(http://www.ofnisystems.com/products/fastval/details/protocol-execution/). This allows us to execute protocols to ensure
requirement traceability and to generate the actual requirement traceability document. Other organizations might use Excel
spreadsheets to keep a table of requirements, despite this being extremely difficult to maintain manually.

—————————————————————————————————————————————————-

Requirements Traceability Matrix

The Requirements Traceability Matrix (RTM) is a document that links requirements throughout the validation process. The
purpose of the Requirements Traceability Matrix is to ensure that all requirements defined for a system are tested in the test
protocols (http://www.ofnisystems.com/services/validation/test-protocols/). The traceability matrix is a tool both for the
validation team, to ensure that requirements are not lost during the validation project, and for auditors, to review the
validation documentation.

The requirements traceability matrix is usually developed in concurrence with the initial list of requirements (either the User
Requirements Specification (http://www.ofnisystems.com/services/validation/user-requirement-specifications/) or Functional
Requirements Specification (http://www.ofnisystems.com/services/validation/functional-requirements/)). As the Design
Specifications and Test Protocols are developed, the traceability matrix is updated to include the updated documents. Ideally,
requirements should be traced to the specific test step in the testing protocol in which they are tested.

Requirements Traceability Matrix Example

The traceability matrix can either reference the requirement identifiers (unique numbers for each requirement) or the actual
requirement itself. In the example shown below, requirements are traced between a Functional Requirements Specification,
Design Specification, and Operational Qualification.

Functional Requirement: The program will have a functional audit trail.
Design Specification: Each form will use fxn_Audit_Trail in the OnUpdate event procedure.
Test Case: OQ, Test Case 3, Step 52: Audit Trail Verification

For more examples, see our FastVal Traceability Matrix (http://www.ofnisystems.com/products/fastval/details/traceability-matrix/)
template.
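The core value of the matrix, catching requirements that never reach a test step, reduces to a simple coverage check. A sketch (the requirement and test-case IDs are illustrative):

```python
def untested_requirements(matrix):
    """Given {requirement_id: [test case references]}, return the
    requirement IDs that have no test coverage at all."""
    return sorted(req for req, tests in matrix.items() if not tests)

# Illustrative matrix entries, echoing the audit-trail example above.
matrix = {
    "FR-001": ["OQ, Test Case 3, Step 52"],  # functional audit trail
    "FR-002": ["OQ, Test Case 4, Step 10"],
    "FR-003": [],                            # a gap the RTM exists to catch
}
```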

In more complicated systems, the traceability matrix may include references to additional documentation, including user
requirements, risk assessments, etc.

The traceability matrix can be created and maintained in an automated tool
(http://www.ofnisystems.com/products/fastval/details/traceability-matrix/), in an Excel spreadsheet, or in an MS Word table.

Frequently Asked Questions

Q: Are traceability matrices required by the FDA and other regulatory bodies?
A: The United States Code of Federal Regulations does not specifically require a traceability matrix, but creating a traceability
matrix is recognized as a validation best practice. The FDA General Principles of Software Validation state, “Software
validation includes confirmation of conformance to all software specifications and confirmation that all software requirements
are traceable to the system specifications (Page 7, Section 3.2).” Traceability matrices are also mentioned in Annex 11, 4.4, which
states, “User requirements should be traceable through the life-cycle.” Not having a traceability matrix should not, in itself,
cause an observation from an FDA auditor, but would likely be cited as indication that a validation process did not follow
recognized industry practices for validation.

Q: How does Ofni Systems create their traceability matrices?


A: We use an automated traceability matrix utility (http://www.ofnisystems.com/products/fastval/details/traceability-matrix/)
to ensure requirement traceability and generate the actual requirement traceability document. Other organizations might use
Excel spreadsheets to keep a table of requirements, despite this being extremely difficult to maintain manually.

—————————————————————————————————————————————————-

Test Protocol Deviations

When the actual results of a test step in a Test Protocol do not match the expected results, this is called a Deviation.

Protocol Deviation Reporting

Deviation reports should include:

Description – How the actual results differ from the expected results
Root Cause – What caused the deviation
Corrective Action – What changes were made to the testing protocol or the system to correct the deviation

Deviations should be reviewed and the solution approved by the System Owner and Quality Assurance. Deviations do not
necessarily need to be separate documents, but a system should be in place to ensure that all deviations are addressed before a
validation project is closed. An organization's control over its deviation process is often reflective of its Quality
organization as a whole; thus, regulatory auditors will often focus on the deviation process.
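A deviation record carrying the three elements listed above can be represented very simply. A sketch (the field names are illustrative):

```python
from dataclasses import dataclass

@dataclass
class Deviation:
    description: str             # how actual differed from expected results
    root_cause: str = ""         # filled in during investigation
    corrective_action: str = ""  # change made to the protocol or system
    resolved: bool = False

def open_deviations(deviations):
    """Deviations still blocking validation sign-off."""
    return [d for d in deviations if not d.resolved]
```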

Deviation Management

Deviation Management (http://www.ofnisystems.com/products/fastval/details/deviation-tracking/) is a central feature of the
FastVal software. Deviations are captured in real time, with associated screenshots and tester notes added to the record.
Additional information, such as root cause and corrective actions, can be added as it is identified. Deviation summaries
are generated automatically, and tracking and metrics tools allow for continual process improvement and enhanced Quality
Control.

—————————————————————————————————————————————————-

Validation Summary Report

Validation Summary Reports provide an overview of the entire validation project. Once the summary report is signed, the
validation project is considered to be complete. When regulatory auditors review validation projects, they typically begin by
reviewing the summary report.

When validation projects use multiple testing systems, some organizations will produce a testing summary report for each test
protocol, then summarize the project with a final Summary Report.

The amount of detail in the reports should reflect the relative complexity, business use, and regulatory risk of the system. The report is often structured to mirror the validation plan (http://www.ofnisystems.com/services/validation/validation-plans/) that initiated the project.

The report is reviewed and signed by the system owner and Quality.

The collection of documents produced during a validation project is called a Validation Package. Once the validation project is complete, all validation packages should be stored according to your site document control procedures. Summary reports should be approved by the System Owner (http://www.ofnisystems.com/services/validation/validation-terminology/#System_Owner) and Quality Assurance (http://www.ofnisystems.com/services/validation/validation-terminology/#Quality_Assurance).


Validation Summary Examples

The validation summary report should include:

A description of the validation project, including the project scope
All test cases performed, including whether those test cases passed without issue
All deviations reported, including how those deviations were resolved
A statement of whether the system met the defined requirements

For more examples, see our FastVal Validation Summary Report Template (http://www.ofnisystems.com/products/fastval/details/summary-report-template/).
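The summary figures listed above are simple roll-ups over the protocol results. Here is an illustrative sketch (test-case IDs, statuses, and deviation IDs are invented for the example) of producing them, including the final statement of whether the system met its requirements:

```python
# Invented protocol results for illustration.
test_cases = [
    {"id": "TC-01", "passed": True,  "deviations": []},
    {"id": "TC-02", "passed": False, "deviations": ["DEV-01"]},
    {"id": "TC-03", "passed": True,  "deviations": []},
]
resolved_deviations = {"DEV-01"}   # deviations closed with approved fixes

total = len(test_cases)
passed = sum(1 for tc in test_cases if tc["passed"])
all_deviations = [d for tc in test_cases for d in tc["deviations"]]
unresolved = [d for d in all_deviations if d not in resolved_deviations]

# The system meets its defined requirements only if every deviation
# raised during testing was resolved.
meets_requirements = not unresolved

print(f"{passed}/{total} test cases passed without issue")
print(f"{len(all_deviations)} deviation(s) reported, {len(unresolved)} unresolved")
print("System met defined requirements:", meets_requirements)
```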

Frequently Asked Questions about Validation Summary Reports

Q: What is the definition of a Validation Summary Report?


A: The National Cancer Institute's validation summary report definition is: "A summary of all planned activities, their success or failure, and any deviations from the expected results or plans encountered. A satisfactory resolution should be provided to explain and resolve any deviations encountered. This test summary report may be optional. Results of all testing activities may be summarized in the Validation Summary Report rather than a separate summary for each testing phase." In practice, the validation summary report describes how the activities described in the validation plan (http://www.ofnisystems.com/services/validation/validation-plans/) were (or were not) accomplished.

Q: Can I see an example of a validation summary report?


A: We have a sample validation summary report (http://www.ofnisystems.com/products/fastval/details/summary-report-template/) available for download.

—————————————————————————————————————————————————-

Change Control for Validated Systems

Change Control is a general term describing the process of managing how changes are introduced into a controlled System.
Change control demonstrates to regulatory authorities that validated systems remain under control during and after system
changes. Change Control systems are a favorite target of regulatory auditors because they vividly demonstrate an
organization’s capacity to control its systems.

Organizations need to explicitly define their processes for evaluating changes to validated systems. There should be a well-defined, multidisciplinary approach to considering the effects of proposed changes. Some changes, such as adding a data field to a form or report, may be very minor; other changes, such as altering how a program stores and organizes data, can be quite extensive. Before changes are implemented, organizations should document the expected outcomes of the changes and have an established plan to implement and test the change and update any existing validation documentation. Defining the change control process should include setting the requirements for implementing minor, major, and critical changes. This allows the organization to apply proportionate validation resources to the change effort.

One useful tool for determining the extent of revalidation is Risk Assessment. By reviewing the original validation requirements and evaluating the new risks introduced through the changes to the system, the risk assessment process can help determine which sections of the system need re-testing. If the risk assessment determines that the change is minor or does not affect the system requirements, only limited testing, focused on the affected system object, would be required to demonstrate that the system has maintained its validated state. Major changes will require additional re-validation, and critical changes could trigger an entire re-validation of a system.
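The risk-based decision above can be expressed as a small lookup. This is a hedged sketch, not a regulatory prescription: the risk categories and the re-test scopes are illustrative of the minor/major/critical split described in the text.

```python
def revalidation_scope(risk_level, affected_objects):
    """Return the extent of re-testing for a proposed change,
    given its assessed risk level (illustrative categories)."""
    if risk_level == "minor":
        # Limited testing focused on the affected system objects only.
        return {"retest": list(affected_objects), "full_revalidation": False}
    if risk_level == "major":
        # Re-test affected objects plus anything interfacing with them.
        return {"retest": list(affected_objects) + ["interfacing modules"],
                "full_revalidation": False}
    if risk_level == "critical":
        # Critical changes can trigger an entire re-validation.
        return {"retest": ["entire system"], "full_revalidation": True}
    raise ValueError(f"unknown risk level: {risk_level!r}")

print(revalidation_scope("minor", ["report R-12"]))
```

In practice the mapping from change to risk level is the judgment call made by the system owner and Quality during impact assessment; the function only encodes the agreed policy.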

Typical Steps in a Change Control project are:

Request the Change – The System Owner formally requests a change to the system.
Assess the Impact of the Change – Before the change is made, the system owner and other key stakeholders, including
Quality, determine how the change will affect the system.

System Development in a Safe Environment – Changes should be initially made away from the validated system. For
computer systems, this can mean testing in a Sandbox environment. For equipment, process or method validations, this
usually means implementing the change during a period when manufacturing has shut down.
System Testing/Re-Validation – Before changes are accepted, the system is validated to ensure system accuracy, reliability
and consistent intended performance.
Implementation of the Change – The changed system is released to the site and users are trained on changes to the system.
For computer systems, this means pushing the changes out to general users. For equipment, process or method validation,
this means introducing the system into the larger production process.

Frequently Asked Questions

Q: Do I need to revalidate a system every time I make a change?


A: It depends on the scope of the change, the structure of the system, and any new risks introduced into the system. Changes to critical components of a system might require a complete revalidation, but smaller changes might only require testing of the changes themselves.

—————————————————————————————————————————————————-

Validation Terminology

A list of common validation terminology. A list of Frequently Asked Questions (http://www.ofnisystems.com/services/validation/validation-faq) about validation is also available.

Actual Result – What a system does when a particular action is performed

Deliverable – A tangible or intangible object produced as a result of project execution, as part of an obligation. In validation
projects, deliverables are usually documents.

Deviation – When a system does not act as expected

End-User – A person who uses the validated system

Expected Result – What a system should do when a particular action is performed

Protocol – A collection of Test Cases, used to document the testing of a system

Qualification – A testing protocol which designates that a system meets a particular collection of requirements. An Installation
Qualification ensures that a system has been properly installed. An Operational Qualification demonstrates that a system
functions as expected in a controlled environment. A Performance Qualification verifies that a system works under real-life
conditions.

Quality Assurance – Members of the organization who are tasked with ensuring the quality of materials produced at that
organization. GxP organizations are required to have robust and independent Quality Assurance operations. Depending on the
organization, this group may be titled Quality Control or Quality Organization; other organizations have multiple groups
dedicated to quality with their own distinct missions.

Requirement – Something a system must be able to do

Retrospective Validation – Validation of an existing system. Retrospective validations are usually performed in response to a
new need for a system to be compliant or an identified gap in GxP compliance.

Specification – A document outlining the requirements for a system. Specifications are usually sub-divided into User
Requirements Specifications, Functional Requirements, and Design Specifications.

System – Object or process undergoing validation. In these pages, system is intended to be a generic term, meaning computer
system, equipment, method or process to be validated.

System Owner – The individual who is ultimately responsible for a system

Test Case – A documented procedure, used to test that a system meets a particular requirement or collection of requirements

Test Plan – A general testing methodology established to ensure that a system meets requirements. A Test Plan can also refer to
the collection of protocols or qualifications used to test and document that a system meets requirements.

Test Step – An individual line of a Test Case. Each Test Step should include instructions, an expected result, and an actual
result.

Traceability – The ability to ensure that requirements outlined in the specifications have been tested. This is usually recorded
in a Requirements Traceability Matrix.

Validation – A documented process, testing a system to demonstrate and ensure its accuracy, reliability, and consistent
intended performance

Validation Package – A collection of documents produced during a validation project

Common Validation Acronyms

CC – Change Control
DS – Design Specification
FAT – Factory Acceptance Testing
FS – Functional Specification
FRS – Functional Requirement Specification (See Functional Specification)
GCP – Good Clinical Practice, a collection of quality guidelines for clinical operations
GLP – Good Laboratory Practice, a collection of quality guidelines for pharmaceutical laboratory operations
GMP – Good Manufacturing Practice, a collection of quality guidelines for pharmaceutical manufacturing operations
GxP – An abbreviation combining GCP, GLP, and GMP. Sometimes also called cGxP, Current Good Practices
IQ – Installation Qualification
IOPQ – Installation/Operational/Performance Qualification
IOQ – Installation/Operational Qualification
PQ – Performance Qualification
OPQ – Operational/Performance Qualification
OQ – Operational Qualification
RTM – Requirement Traceability Matrix
SAT – Site Acceptance Testing
SDS – Software Design Specification (See Design Specification)
Spec – Specification
TM – Traceability Matrix
UAT – User Acceptance Testing
URS – User Requirement Specification
VMP – Validation Master Plan
VP – Validation Plan

—————————————————————————————————————————————————-

Computer System Validation (CSV)

Computer system validation (sometimes called computer validation or CSV) is the process of documenting that a computer
system meets a set of defined system requirements. Validation of computer systems to ensure accuracy, reliability, consistent
intended performance, and the ability to discern invalid or altered records is a critical requirement of electronic record
compliance, as described in the FDA 21 CFR 11.10(a) and EMA Annex 11, Section 4.

Computer System Validation Services

Our computer system validation experts have validated computer programs for all types of FDA-regulated businesses,
including pharmaceutical and biologics manufacturers, medical device manufacturers, clinical research organizations, and
GLP laboratories.
Software Validation (http://www.ofnisystems.com/services/validation/software-validation/) – Computer systems validation services for SAP, LIMS, Salesforce, Trackwise, and other business and laboratory data management systems
Web Based Applications (http://www.ofnisystems.com/services/validation/computer-systems/web-applications/) – Specialized validation services for web, cloud, and mobile applications (http://www.ofnisystems.com/fda-mobile-medical-applications-guidance/)
MS Excel Spreadsheets (http://www.ofnisystems.com/services/validation/spreadsheet-validation/) – Spreadsheet security and compliance with 21 CFR 11
MS Access Databases (http://www.ofnisystems.com/services/validation/computer-systems/access-databases/) – Your internally-created databases made compliant with FDA requirements

Computer System Validation Training

Ofni Systems has given computer validation presentations and training classes to organizations like FOI Services, ISPE, IVT,
and PDA.

Computer System Validation (http://www.ofnisystems.com/services/compliance-training/) – Scalable training classes on computer validation. Learn computer system validation principles and ensure compliance.
Spreadsheet Compliance and Validation (http://www.ofnisystems.com/services/compliance-training/) – Online training classes that discuss specific validation requirements for spreadsheet validation
Review of Computer System Validation Documentation and Techniques (http://www.ofnisystems.com/services/compliance-training/) – Web-based classes that provide an overview of the validation process and best practices

Computer System Validation Resources

Additional computer system validation guidance and resources from Ofni Systems

Validation Documents (http://www.ofnisystems.com/services/validation/validation-resources/) – A library of information about computer system validation plans, functional specifications, and other validation documentation
21 CFR 11.10(a) (http://www.ofnisystems.com/information/resources/introduction-to-21-cfr-11/21-cfr-11-10a-validation-of-systems/) – Read about FDA computer system validation requirements, with additional commentary from Ofni Systems validation experts.
FastVal (http://www.ofnisystems.com/products/fastval/) – Control your validation process with Ofni Systems' validation management system.

Additional Electronic Record Compliance Services

Part 11 and Validation Assessments (http://www.ofnisystems.com/services/gxp-compliance-auditing/) – Ofni Systems can review your electronic record compliance or create audit checklists for your organization.
Software Testing (http://www.ofnisystems.com/services/software-testing/) – Stress testing, challenge testing, load testing, and other specialized software testing services for FDA-regulated businesses
Data Migration (http://www.ofnisystems.com/services/data-migration/) – Migrate data from legacy systems and ensure accurate data transfer following FDA guidelines.

Maximize the Benefits of Computer System Validation

Computer validation is more than a compliance requirement. Pharmaceutical computer system validation is a unique opportunity for a business to examine its computer systems to maximize effectiveness and enhance quality. Ofni Systems ensures that your validation project clearly documents why your customers should share the high degree of confidence you hold in your company and your systems, while scaling the project to your organizational validation requirements and budget.

—————————————————————————————————————————————————-

GxP and Compliance Auditing Services

As the FDA increases regulatory enforcement of 21 CFR 11 (http://www.ofnisystems.com/21-cfr-part-11/), one of the most difficult things for many companies is to know what their computer systems (http://www.ofnisystems.com/services/validation/software-validation/) require to be in compliance. Ofni Systems can quickly assess all of your software, databases, and computer systems and identify which issues need to be addressed for compliance.

At Ofni Systems, we recognize that an FDA audit can be quite unsettling. However, we like to think of an audit as another opportunity to add value to our clients' business. Auditing allows companies to review their internal and external compliance issues in an open and honest manner. Ofni Systems can help you with all phases of auditing (http://www.ofnisystems.com/services/part-11-assessments/) and ensure that you get the most reward out of this valuable tool. Our auditors have worked with large and small companies, in all aspects of FDA-regulated fields, from clinical trials to medical devices to biological products to medical imaging to generics.

Be Prepared for Compliance Audits

Audit Checklists – Ofni Systems can develop tools to facilitate your system review and ensure that your systems meet all
applicable regulations.
Gap Analysis – Before an FDA inspection or regulatory compliance audit, Ofni Systems can review your internal and
external quality systems against established industry and regulatory standards and assess if your company’s actual
practices meet those standards.
Remediation Plans – If the gap analysis identifies gaps, we can create a remediation plan to address the findings.
Compliance Training – Ofni Systems can also help train your employees to be proactive in dealing with compliance
issues.

Manage FDA Inspections and Compliance Audits

A well-managed audit allows you to maximize the value of the audit, as well as project control over all of your Quality
systems. Ofni Systems uses proprietary software to help manage audits. Ofni Systems can:

Track auditor requests and observations, including the status of auditor requests, responses to observations and corrective
actions.
Facilitate communication between members of the audit team
Produce status reports to monitor the on-going audit and your response to all audit observations.

Audit Responses and Corrective Actions

Ofni Systems can recommend the appropriate responses to audit observations. We can help you determine the appropriate corrective actions (http://www.ofnisystems.com/services/compliance-training/) and help your company implement changes to your quality systems.

Supplier Audits

Increasingly, FDA-regulated companies are being asked to verify the compliance of their suppliers. Ofni Systems can assist you
with all phases of supplier compliance audits.

If you supply FDA-regulated businesses, we can assess your systems and help you prepare for an audit. Ofni Systems can
review your systems and train your personnel to work as suppliers to FDA-regulated companies. Ofni Systems will work to
find the best way for your company to meet all regulations applicable to your situation.

If you need your vendors audited, Ofni Systems can provide the expertise to conduct the audits (http://www.ofnisystems.com/services/part-11-assessments/). We can conduct the entire audit independently or include your staff as subject matter experts. We can provide any level of support required, from conducting a single audit to managing your entire audit program.

Benefits of Using Ofni Systems for GxP Audits

Experience with FDA and GxP requirements – Ofni Systems provides services to companies subject to FDA regulations and is experienced with issues relating to compliance. We can quickly and efficiently review your Quality and regulatory systems and determine whether they meet all of your business and regulatory requirements.
Audit Management Tools designed for GxP environments – Ofni Systems has proprietary tools designed to manage FDA inspections and audits.
Improved Value to Your Quality Systems – Auditing increases the value of your systems and improves the software development process.

—————————————————————————————————————————————————-

Question[1]: What are the main job duties and responsibilities of a validation engineer?

Answer[1]: A validation engineer's responsibilities are to:

Analyze validation test data to determine whether systems or processes have met validation criteria, or to identify root causes of production problems
Develop validation master plans, process flow diagrams, test cases, or standard operating procedures
Identify deviations from established product or process standards and provide recommendations for resolving deviations
Participate in internal or external training programs to maintain knowledge of validation principles, industry trends, or novel technologies
Prepare, maintain, or review validation and compliance documentation, such as engineering change notices, schematics, or protocols
Study product characteristics or customer requirements and confer with management to determine validation objectives and standards
Assist in training equipment operators or other staff on validation protocols and standard operating procedures
Conduct validation or qualification tests of new or existing processes, equipment, or software in accordance with internal protocols or external standards
Coordinate the implementation or scheduling of validation testing with affected departments and personnel
Create, populate, or maintain databases for tracking validation activities, test results, or validated systems
Design validation study features, such as sampling, testing, or analytical methodologies
Draw samples of raw materials or intermediate and finished products for validation testing
Prepare detailed reports or design statements based on the results of validation and qualification tests or reviews of procedures and protocols
Resolve testing problems by modifying testing methods or revising test objectives and standards
Conduct audits of validation or performance qualification processes to ensure compliance with internal or regulatory requirements
Direct validation activities, such as protocol creation or testing
Prepare validation or performance qualification protocols for new or modified manufacturing processes, systems, or equipment for pharmaceutical, electronics, or other types of production
Procure or devise automated lab validation test stations or other test fixtures and equipment
Communicate with regulatory agencies regarding compliance documentation or validation results
Maintain validation test equipment


Question[2]: What skills are required for a validation engineer in order to succeed in this work?

Answer[2]: Giving full attention to what other people are saying, taking time to understand the points being made, asking questions as appropriate, and not interrupting at inappropriate times; understanding written sentences and paragraphs in work-related documents; using scientific rules and methods to solve problems; using logic and reasoning to identify the strengths and weaknesses of alternative solutions, conclusions, or approaches to problems; and talking to others to convey information effectively.

Question[3]: Describe the abilities you have in order to work with us as a validation engineer.

Answer[3]: I have the ability to listen to and understand information and ideas presented through spoken words and sentences; read and understand information and ideas presented in writing; apply general rules to specific problems to produce answers that make sense; communicate information and ideas in speaking so others will understand; and communicate information and ideas in writing so others will understand.

Question[4]: What knowledge elements obtained from your education, training, and work experience would support your validation engineer career?

Answer[4]: Knowledge of the practical application of engineering science and technology. This includes applying principles, techniques, procedures, and equipment to the design and production of various goods and services; raw materials, production processes, quality control, costs, and other techniques for maximizing the effective manufacture and distribution of goods; arithmetic, algebra, geometry, calculus, statistics, and their applications; the structure and content of the English language, including the meaning and spelling of words, rules of composition, and grammar; and circuit boards, processors, chips, electronic equipment, and computer hardware and software, including applications and programming.

Question[5]: How would you describe the (needed validation engineer, or your) work style?

Answer[5]: My work style matches exactly what the validation engineer job requires: being honest and ethical; being careful about detail and thorough in completing work tasks; analyzing information and using logic to address work-related issues and problems; being reliable, responsible, and dependable, and fulfilling obligations; and being pleasant with others on the job and displaying a good-natured, cooperative attitude.
—————————————————————————————————————————————————-

Process Validation – Interview Questions & Answers



Q: What is process validation?

A: EMA definition: "documented evidence that the process, operated within established parameters, can perform effectively and reproducibly to produce a medicinal product meeting its predetermined specifications and quality attributes."

USFDA definition: "The collection and evaluation of data, from the process design stage throughout production, which establishes scientific evidence that a process is capable of consistently delivering quality product."

Q: Which is the latest guidance document for process validation published by USFDA?

A: Process Validation: General Principles and Practices (published January 2011).

Q: According to regulatory guidelines (USFDA), what are the stages of process validation?

A: Process validation involves a series of activities taking place over the lifecycle of the product and process. There are three stages of process validation activities:

Stage 1 – Process Design: The commercial manufacturing process is defined during this stage based on knowledge gained through development and scale-up activities.
Stage 2 – Process Qualification: During this stage, the process design is evaluated to determine whether the process is capable of reproducible commercial manufacturing.
Stage 3 – Continued Process Verification: Ongoing assurance is gained during routine production that the process remains in a state of control.


Q: How many batches should be considered for process validation?

A: The EMA draft guideline states "a minimum of three consecutive batches", with justification to be provided (there are some exceptions to this statement). The US FDA guidance states that the number of batches must be sufficient to provide statistical confidence in the process. It is a subtle but important distinction between the approaches.

Q: Explain the strategy for industrial process validation of solid dosage forms.

A:

Different lots of raw materials should be included, i.e., active drug substance and major excipients.
Batches should be run in succession and on different days and shifts (the latter condition, if appropriate).
Batches should be manufactured in the equipment and facilities designated for eventual commercial production.
Critical process variables should be set within their operating ranges and should not exceed their upper and lower control limits during process operation. Output responses should be well within finished product specifications.
Failure to meet the requirements of the validation protocol with respect to process input and output control should be subject to process requalification.

Q: What is a Validation Protocol?

A: A written plan of action stating how process validation will be conducted. It specifies who will conduct the various tasks; defines testing parameters, sampling plans, testing methods, and specifications; and specifies product characteristics and the equipment to be used. It must specify the minimum number of batches to be used for validation studies, the acceptance criteria, and who will sign/approve or disapprove the conclusions derived from such a scientific study.


Q: What should be the content of a process validation protocol?

A:

1. General information
2. Objective
3. Background/pre-validation activities: summary of development and tech transfer (from R&D or another site) activities to justify in-process testing and controls; any previous validations
4. List of equipment and their qualification status
5. Facilities qualification
6. Process flow charts
7. Manufacturing procedure narrative
8. List of critical processing parameters and critical excipients
9. Sampling, tests, and specifications
10. Acceptance criteria

Q: In process validation studies, what should be the blend sample size?

A: A 1x – 3x dosage-unit range, on a case-by-case basis. As per USFDA guidance, the sampling size can be increased from 1x to 10x with adequate scientific justification.

Q: According to USFDA guidance, how many sampling points should be considered for collecting blend samples?

A: At least 10 sampling locations should be considered, to represent potential areas of poor blending. In tumbling blenders (e.g., V-blenders, double cones, or drum mixers), samples should be selected from at least two depths along the axis of the blender. At least 20 locations are recommended to adequately validate convective blenders (e.g., ribbon blenders).

Q: What can cause within-location variance in blend data?

A: Inadequate blend mixing, sampling error, or agglomeration.
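The within- versus between-location distinction above is easy to see numerically. The following sketch uses invented assay values (% of label claim) for replicate samples at each blender location; high relative standard deviation (RSD) within one location suggests local mixing problems, sampling error, or agglomeration, while high RSD between location means suggests incomplete blending overall.

```python
from statistics import mean, stdev

blend_data = {          # location -> replicate assay results (% of label claim)
    "top-left":     [99.1, 98.7, 99.4],
    "top-right":    [100.2, 99.8, 100.5],
    "middle":       [99.9, 100.1, 99.6],
    "bottom-left":  [98.5, 104.0, 95.2],   # high spread within one location
    "bottom-right": [100.0, 99.5, 100.3],
}

location_means = {loc: mean(vals) for loc, vals in blend_data.items()}
within_rsd = {loc: 100 * stdev(vals) / mean(vals)
              for loc, vals in blend_data.items()}
between_rsd = 100 * stdev(location_means.values()) / mean(location_means.values())

for loc in blend_data:
    print(f"{loc}: mean {location_means[loc]:.1f}%, within-location RSD {within_rsd[loc]:.2f}%")
print(f"between-location RSD: {between_rsd:.2f}%")
```

In this invented data set, the "bottom-left" location stands out with a much larger within-location RSD than its neighbors, which is the signature of the causes listed in the answer above; acceptance limits for RSD would come from the applicable protocol, not from this sketch.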


Q: What is the difference between the EMA and US FDA guidelines on process validation?

A:

Definition – EMA: "documented evidence that the process, operated within established parameters, can perform effectively and reproducibly to produce a medicinal product meeting its predetermined specifications and quality attributes." US FDA: "the collection and evaluation of data, from the process design stage throughout production, which establishes scientific evidence that a process is capable of consistently delivering quality product."

Number of batches – The EMA draft guideline states "a minimum of three consecutive batches", with justification to be provided (there are some exceptions to this statement). The US FDA guidance states that the number of batches must be sufficient to provide statistical confidence in the process. It is a subtle but important distinction in the approaches.

Development phase – The EMA draft encourages the use of the product development phase as part of PV, but is less prescriptive on requirements. The US FDA guidance emphasizes documenting the development activities.

Continued process verification – The EMA guideline specifically allows the use of CPV to replace traditional validation efforts. The US FDA approach does not place high emphasis on CPV, and requires all three stages of process validation to be fully addressed, regardless of whether contemporary or traditional methods are utilised.

Equipment and facility – The EMA guideline sees process as independent from equipment and facility; currently, the EMA still relies on Annex 15 of the GMP guide for instruction on equipment qualification. The US FDA guidance considers equipment and process design, as well as equipment qualification, as part of the overall process validation effort.

11. Why is a hopper challenge study performed during process validation?

A hopper study shall be carried out to evaluate the effect of vibrations during compression on blend uniformity.


12. What are the critical process variables in coating?

Pan RPM, inlet & exhaust temperature, spray rate, gun distance and air pressure.

13. Why is blending a critical parameter in tablet manufacturing?

Too little blending will result in non-uniform distribution of the drug and poor flow, whereas excessive blending will result in de-mixing, again leading to non-uniform distribution of the drug and an increase in disintegration time.

14. What are the critical parameters to be checked during dry mixing?

Mixing time and mixing speed.

15. What are the critical parameters to be checked during binder preparation and addition?

Amount of binder solution and mixing time.

16. What are the major variables in tablet compression?

Machine speed and hopper level are the major variables.

17. What are the revalidation criteria for process validation?

1. Change in formulation, procedure or quality of pharmaceutical ingredients.

2. Change of equipment, addition of new equipment, and major breakdowns/maintenance that affect the performance of the equipment.

3. Major change of process parameters.

4. Change in manufacturing site.

5. Appearance of negative quality trends.

6. Appearance of new findings based on current knowledge.

7. Batch size change.


18. What are the benefits of process validation?

· Consistent throughput.

· Reduction in rejections and reworks.

· Reduction in utility costs.

· Avoidance of capital expenditures.

· Fewer complaints about process-related failures.

· Reduced testing of in-process materials and finished goods.

· More rapid and accurate investigations into process deviations.

· More rapid and reliable start-up of new equipment.

· Easier scale-up from development work.

· Easier maintenance of equipment.

· Improved employee awareness of processes.

19. What are the common variables in the manufacturing of tablets?

· Particle size of the drug substance
· Bulk density of drug substance/excipients
· Powder load in granulator
· Amount & concentration of binder
· Mixer speed & mixing timings
· Granulation moisture content
· Milling conditions
· Lubricant blending times
· Tablet hardness
· Coating solution spray rate


20. What is the action plan if a test failure is observed during process validation?

Any test failure during process validation shall be investigated to determine the cause of failure. Where the cause of failure is not obvious, it may be useful to use an investigation procedure to ensure that all possible areas of potential failure are covered. Once the cause of the process validation failure has been identified, the failure shall be classified into the following categories.

Type I: where the failure can be attributed to an occurrence which is not intrinsic to the process (for example, an equipment failure or a raw material defect), so that it can be agreed to complete the validation exercise by substituting another batch for the one that failed. This investigation and the subsequent action shall be included in the validation report.

Type II: where the failure may be attributed to the process, or where the investigation is inconclusive, the validation exercise has failed. In this case the validation team shall decide and justify the course of action to be taken, recording its justification and recommendations.

This decision shall consider:

· Re-testing, if investigation of the analytical results supports the decision.

· Introduction of a change in operation parameters or process steps.

· Changing the process equipment or the procedure for using the equipment.

· Suspension of the process validation exercise until further technical evaluation and/or development has been carried out.

· Changing the sampling regime.

· Review of historical data.

· Change of the process validation acceptance criteria.

· Change to an analytical procedure.

Computer System Validation


(http://pharmaceuticalvalidation.blogspot.in/2010/01/computer-system-
validation.html)

Introduction and Regulatory Requirements

Computers are widely used during development and manufacturing of drugs and medical devices. Proper functioning and performance of software and computer systems play a major role in obtaining consistency, reliability and accuracy of data. Therefore, computer system validation (CSV) should be part of any good development and manufacturing practice. It is also required by FDA regulations and guidelines through the overall requirement that “equipment must be suitable for its intended use”.

Specific requirements for computers can be found in section 211.68 of the US cGMP regulations:

Automatic, mechanical, or electronic equipment or other types of equipment, including computers, or related systems that will perform a function satisfactorily, may be used in the manufacture, processing, packing, and holding of a drug product. If such equipment is so used, it shall be routinely calibrated, inspected, or checked according to a written program designed to assure proper performance. Written records of those calibration checks and inspections shall be maintained.
Appropriate controls shall be exercised over computer or related systems to assure that changes in master production and control records or other records are instituted only by authorized personnel.
Input to and output from the computer or related system of formulas or other records or data shall be checked for accuracy. The degree and frequency of input/output verification shall be based on the complexity and reliability of the computer or related system.
A backup file of data entered into the computer or related system shall be maintained except where certain data, such as calculations performed in connection with laboratory analysis, are eliminated by computerization or other automated processes. In such instances a written record of the program shall be maintained along with appropriate validation data.
Hard copy or alternative systems, such as duplicates, tapes, or microfilm, designed to assure that backup data are exact and complete and that they are secure from alteration, inadvertent erasures, or loss, shall be maintained.

The FDA has developed several specific guidance documents on using computers for other FDA regulated areas. Most detailed is the industry guide General Principles of Software Validation (2). It deals with development and validation of software used in medical devices. More recently the FDA has released a draft guidance on using computers in clinical studies (3). The guidance states FDA’s expectations related to computer systems and to electronic records generated during clinical studies.

Specific requirements for computers and electronic records and signatures are also defined in FDA’s regulation 21 CFR Part 11 on Electronic Records and Signatures (4). This regulation applies to all FDA regulated areas and has specific requirements to ensure the trustworthiness, integrity and reliability of records generated, evaluated, transmitted and archived by computer systems. In 2003 the FDA published a guidance on the scope and application of 21 CFR Part 11 (5). In this document the FDA promoted the concept of risk-based validation.

By far the most detailed and most specific official document that has ever been developed on using computers in regulated areas is the “Good Practices Guide on Using Computers in GxP Environments” (6). It has been developed by inspectors for inspectors of the Pharmaceutical Inspection Co-operation Scheme (PIC/S), but it is also quite useful for industry. It has more than 50 pages and includes a six-page checklist recommended for use by inspectors.

Because of their importance, computer validation issues have been addressed by several industry organizations and private
authors:

The Good Automated Manufacturing Practices (GAMP) Forum has developed guidelines for computer validation (7).
Huber has published validation reference books for the validation of computerized analytical and networked systems (8).
The Parenteral Drug Association (PDA) has developed a technical paper on the validation of laboratory data acquisition systems (9).

All these guidelines and publications follow a couple of principles:

Validation of computer systems is not a one-time event. It starts with the definition of the product or project and the setting of user requirement specifications, and covers the vendor selection process, installation, initial operation, ongoing use, change control and system retirement.
All publications refer to some kind of life cycle model, with a formal change control procedure being an important part of the whole process.
There are no detailed instructions on what should be tested. All guidelines refer to risk assessment for the extent of validation.

While in the past computer validation was more focused on functions of single-user computer systems, recently the focus is on network infrastructure, networked systems and on security, authenticity and integrity of data acquired and evaluated by computer systems (10). With the increasing use of Internet and e-mail communications, the validation of web-based applications also becomes more important. Labcompliance recently published a package entitled Internet Quality and Compliance.

Scope of the Tutorial


This tutorial will guide IT personnel, QA managers, operational managers and users of computer hardware and software through the entire high-level validation process, from writing specifications and vendor qualification to installation and initial and on-going operation.

It covers

Qualification of computer hardware with peripherals and accessories like printers and disk drives.
Validation of software loaded on a computer, which is used to control equipment, to capture raw data, to process the data and to print and store it. Software typically includes operating systems, standard application software and software written by or for a specific user.
Development of documentation as required by regulations.

Risk assessment and risk-based validation will be discussed for all validation phases to optimize validation effort vs. cost for systems with different impact and risk on product quality. This is especially important since the FDA has been using and supporting risk-based approaches for compliance as part of the 21st century drug cGMP initiative.

One of the main purposes of this primer is to answer the key question regarding validation: how much validation is needed, and how much is sufficient, for a specific computer system? This primer gives a good overview and lists major validation steps and tasks, but for an in-depth understanding and for easy implementation readers are recommended to read further references, for example the SOPs and validation examples included in the Computer System Validation Package from Labcompliance.

(http://www.labcompliance.com/books/part11/default.aspx)

Validation Overview

Validation of computer systems is not a one-off event. Annex 11 of the European GMP directive is very clear about this: “Validation should be considered as part of the complete life cycle of a computer system. This cycle includes the stages of planning, specification, programming, testing, commissioning, documentation, operation, monitoring and modifying.”

For new systems validation starts when a user department has a need for a new computer system and thinks about how the
system can solve an existing problem. For an existing system it starts when the system owner gets the task of bringing the
system into a validated state. Validation ends when the system is retired and all-important quality data is successfully migrated
to the new system. Important steps in between are validation planning, defining user requirements, functional specifications,
design specifications, validation during development, vendor assessment for purchased systems, installation, initial and
ongoing testing and change control. In other words, computer systems should be validated during the entire life of the system.

Because of the complexity and the long time span of computer validation the process is typically broken down into life cycle
phases. Several life cycle models have been described in literature. One model that is frequently used is the V-model as shown
in figure 1.

Figure 1. V-Lifecycle model

This model comprises User Requirement Specifications (URS), Functional Specifications (FS), Design Specifications (DS), development and testing of code, Installation Qualification (IQ), Operational Qualification (OQ) and Performance Qualification (PQ).

The V-Model as described above is quite good if the validation process also includes software development. However, it does not address some very important steps, for example vendor assessment. It also looks quite complex for a true commercial off-the-shelf system with no code development for customization. Phases like design specification or code development and code testing are not necessary. For such systems the 4Q model is recommended, with just four phases: design qualification (DQ), installation qualification (IQ), operational qualification (OQ) and performance qualification (PQ). The process is illustrated in Figure 2.

Figure 2. 4Q Lifecycle model

Neither the 4Q model nor the V-model addresses the retirement phase. The 4Q model is also not suitable when systems need to be configured for specific applications, or when additional software is required that is not included in the standard product and is developed by the user’s firm or by a 3rd party. In this case a life cycle model that combines system development and system integration is preferred. An example is shown in figure 3.

Figure 3. System Integration combined with system development

User representatives define User or System Requirement Specifications (URS, SRS). If there is no vendor that offers a
commercial system the software needs to be developed and validated by following the steps on the left side of the diagram.
Programmers develop functional specifications, design specifications and the code and perform testing in all development
phases under supervision of the quality assurance.

When commercial systems are available, either the SRS or a special Request for Proposal (RFP) is sent to one or more vendors (see right side of the diagram). Vendors respond either to each requirement or with a set of functional specifications of a system that is most suitable for the user’s requirements. Users compare the vendors’ responses with their own requirements. If none of the vendors meets all user requirements, the requirements may be adjusted to the best fit, or additional software is written to fulfill the user requirements following the development cycle on the left side of the diagram. The vendor that best meets the user’s technical and business requirements is selected and qualified.

The extent of validation depends on the complexity of the computer system. The extent of validation at the user’s site also depends on how widespread the use of the same software product and version is. The more widely a standard software product is used, and the less customization is made to it, the less testing is required by individual users. GAMP has developed software categories based on the level of customization. In total there are five categories. Categories one and two cover operating systems and firmware of automated systems. In the context of this primer only categories three to five are of interest. They are described in Table 1. Each computer system should be assigned to one of these three categories.

Table 1. GAMP software categories:

GAMP 3 — Standard software package. No customization.
Examples: MS Word (without VBA scripts); computer-controlled spectrophotometers.

GAMP 4 — Standard software package. Customization of configuration.
Examples: LIMS; Excel spreadsheet applications where formulae and/or input data are linked to specific cells; networked data systems.

GAMP 5 — Custom software package. Either all of the software, or a part of the complete package, has been developed for a specific user and application.
Examples: add-ons to GAMP Categories 3 and 4; Excel® with VBA scripts.

Validation Master Plan and Project Plan

All validation activities should be described in a validation master plan which should provide a framework for thorough and
consistent validation. A validation master plan is officially required by Annex 15 to the European GMP directive. FDA regulations and guidelines don’t mandate a validation master plan; however, inspectors want to know what the company’s
approach towards validation is. The validation master plan is an ideal tool to communicate this approach both internally and to
inspectors. It also ensures consistent implementation of validation practices and makes validation activities much more
efficient. In case there are any questions as to why things have been done or not done, the validation master plan should give
the answer.

Within an organization a validation master plan can be developed for

multiple sites
single sites
single locations
single system categories
department categories, e.g., for development departments

Computer Validation master plans should include:

1. Introduction with a scope of the plan, e.g., sites, systems, processes


2. Responsibilities by function
3. Related documents, e.g., risk management plans
4. Products/processes to be validated and/or qualified
5. Validation approach, e.g., system life cycle approach
6. Risk management approach with examples of risk categories and recommended validation tasks for different categories
7. Vendor management
8. Steps for Computer System Validation with examples on type and extent of testing, for example, for IQ, OQ and PQ
9. Handling existing computer systems
10. Validation of Macros and spreadsheet calculations
11. Qualification of network infrastructure
12. Configuration management and change control procedures and templates
13. Back-up and recovery
14. Error handling and corrective actions
15. Requalification criteria
16. Contingency planning and disaster recovery
17. Maintenance and support
18. System retirement
19. Training plans (e.g., system operation, compliance)
20. Validation deliverables and other documentation
21. Templates and references to SOPs
22. Glossary

For larger projects a detailed individual validation project plan should be developed. An example would be implementing a
Laboratory Information Management (LIMS) System or networked chromatographic data system. This plan is derived from the
validation master plan using the principles and templates of the master plan. It formalizes qualification and validation and

outlines what is to be done in order to get a specific system into compliance. For inspectors it is a first indication of the control a department has over a specific computer system, and it also gives a first impression of the validation quality.

A validation project plan should include sections on

Scope of the system, what it includes, what it doesn’t include.


System description
Validation approach
Assumptions, limitations and exclusions
Responsibilities
Risk assessment
Risk based test strategy and approach for validation steps, e.g., DQ, IQ,OQ, PQ
Ongoing performance control
Configuration management and change control
Handling system security
Data back-up and recovery
Contingency planning
Error handling
References to other documents
Timeline and deliverables for each phase

Design Qualification and Specifications

“Design qualification (DQ) defines the functional and operational specifications of the instrument and details the conscious decisions in the selection of the supplier” (8). DQ should ensure that computer systems have all the necessary functions and performance criteria that will enable them to be successfully implemented for the intended application and to meet business requirements. Errors in DQ can have a tremendous technical and business impact, and therefore a sufficient amount of time and resources should be invested in the DQ phase. For example, setting wrong functional specifications can substantially increase the workload for OQ testing; adding missing functions at a later stage will be much more expensive than including them in the initial specifications; and selecting a vendor with insufficient support capability can decrease instrument up-time with a negative business impact.

Steps for design specification normally include:

Description of the task the computer system is expected to perform


Description of the intended use of the system
Description of the intended environment (including the network environment)
Preliminary selection of the system requirement specifications, functional specifications and vendor
Vendor assessment
Final selection of the system requirement specifications and functional specifications
Final selection of the supplier
Development and documentation of final system specifications

System requirement specifications (SRS) or user requirement specifications (URS) are usually written by user representatives. The vendor’s specification sheets can be used as guidelines. However, it is not recommended to simply copy the vendor’s specifications, because typically commercial software has more functions than the user will ever need. On the other hand, there should be documented evidence that the system performs all specified functions, and compliance to the specifications must be verified later on in the process during operational qualification and performance qualification. Specifying too many functions will significantly increase the workload for OQ. The development of requirement specifications should follow a well-documented procedure. Most important is to involve representatives of all user departments in this process.

User requirements should have a couple of key attributes. They should be:

Necessary. Unnecessary functions will increase development, validation, support and maintenance costs.
Complete. Adding missing functions at a later stage will be much more expensive than including them initially.
Feasible. Specified functions that cannot be implemented will delay the project.
Accurate. Inaccurately specified functions will not solve the application’s problem.
Unambiguous to avoid guessing and wrong interpretation by the developer.
Specific to avoid wrong interpretation by the developer.
Testable. Functions that are not testable cannot be validated.
Uniquely identified. This helps to link specifications to test cases.
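Because each requirement is uniquely identified, specifications can be linked to test cases in a traceability matrix. A minimal sketch in Python — all identifiers, requirement texts and test-case IDs below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str                                       # unique ID, e.g. "URS-012"
    text: str                                         # the requirement statement
    test_cases: list = field(default_factory=list)    # linked OQ test case IDs

requirements = [
    Requirement("URS-001",
                "Record an audit trail entry for every create/modify/delete",
                ["OQ-TC-07", "OQ-TC-08"]),
    Requirement("URS-002",
                "Lock the account after three failed login attempts"),
]

# A requirement without a linked test case cannot be verified in OQ --
# flag it before test execution begins.
untested = [r.req_id for r in requirements if not r.test_cases]
print("Requirements without linked test cases:", untested)
```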

Functional specifications answer the question: what functions does the system need to comply with the user’s requirements? They are normally written by the developer of the system and should be reviewed by the user.
Design specifications are also written by the developer. They answer the question: how does the system implement the specified functions? They should be formally reviewed by a team of developers under the supervision of QA.

(http://www.labcompliance.com/usersclub/default.aspx)

Vendor Assessment

Validation of software and computerized systems covers the complete lifecycle of the products which includes validation
during design and development. When software and computer systems are purchased from vendors, the user is still
responsible for the overall validation.

FDA’s guide on Principles of Software Validation states this very clearly: “Where the software is developed by someone other
than the device manufacturer (e.g., off-the-shelf software) the software developer may not be directly responsible for
compliance with FDA regulations. In that case, the party with regulatory responsibility (i.e., the device manufacturer) needs to
assess the adequacy of the off-the-shelf software developer’s activities and determine what additional efforts are needed to
establish that the software is validated for the device manufacturer’s intended use”.

The objective of vendor qualification is to get assurance that the vendor’s product development and manufacturing practices meet the user firm’s quality requirements. For software development this usually means that the software is developed and validated following documented procedures.

Vendor assessment should answer questions such as: “What type of assurance do you have that the software has been validated during development?” or “How can you be sure that the software vendor followed a quality assurance program?” Depending on the risk and impact on (drug) product quality, answers can be derived from:

1. Documentation of experience with the vendor. Experience may come from the product under consideration or from other products.
2. External references. Useful if there is no experience with the vendor within your company.
3. Assessment checklists (mail audits). Use checklists available within your company, through public organizations, e.g., PDA, and from private authors.
4. 3rd party audits. Give an independent assessment of the quality system and/or product development.
5. Direct vendor audits. Give a good picture of the vendor’s quality system and software development and validation practices.

Assessment costs increase from 1 to 5, and the final procedure should be based on a justified and documented risk assessment. Such a risk assessment includes two parts:

1. Product risk
2. Vendor risk

Factors for product risk include

System complexity
Number of systems to be purchased
Maturity of the system
Level of networking
Influence on other systems, e.g., through networks
Impact of the system on drug quality
Impact of the system on business continuity
Level of customization

Factors for vendor risk include

Size of company
Company history
Future outlook
Representation in target industry, e.g., Pharma
Experience with the vendor

Risk factors are estimated for the computer system (product) and the vendor and entered in a table as shown in figure 4.

Figure 4. Vendor Risk vs. Product Risk

Most critical is the red area with high product and high vendor risk. This scenario would require a vendor audit, either by the user firm or by a trusted 3rd party. On the other hand, the green areas could be handled by a one-to-two page document describing who the vendor is and why you selected that vendor.

Vendors in the yellow area could be assessed through mail audits supported by good internal or external references. Results of
the vendor audits should be documented following a standardized ranking scheme. An example is shown in Table 2.
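As an illustration of the figure-4 logic, the red/yellow/green decision could be coded as a small lookup. The thresholds and return strings below are invented for this sketch, not taken from any guideline:

```python
def assessment_method(product_risk: str, vendor_risk: str) -> str:
    """Map product/vendor risk ('low'|'medium'|'high') to a suggested
    vendor assessment depth, following the red/yellow/green areas."""
    levels = {"low": 0, "medium": 1, "high": 2}
    p, v = levels[product_risk], levels[vendor_risk]
    if p == 2 and v == 2:
        # red area: high product AND high vendor risk
        return "direct or trusted 3rd-party vendor audit"
    if p + v >= 2:
        # yellow area: assessed through mail audits plus references
        return "mail audit (checklist) plus internal/external references"
    # green area: brief justification of the vendor selection
    return "one-to-two page document justifying the vendor selection"

print(assessment_method("high", "high"))
print(assessment_method("low", "low"))
```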

The results of the vendor assessment and any vendor audit should be well communicated within a company to avoid
duplication of audits of the same vendor by different departments or sites. This can be achieved by developing a company
wide repository with entries of all vendor assessment activities. The whole process of vendor assessment and audits should be
controlled by documented procedures.

Rating Meaning Interpretation


3 Excellent Vendor procedures and practices are above average
2 Adequate Vendor procedures and practices are about average
1 Poor Vendor procedures and practices are below average and need to be improved
0 Unsatisfactory Vendor procedures and practices are unacceptable
N/A Not Applicable Question is not applicable to the type of function or service

Installation Qualification

Installation qualification establishes that the computer system is received as designed and specified, that it is properly installed
in the selected environment, and that this environment is suitable for the operation and use of the instrument. The list below
includes steps as recommended before and during installation.

Before installation

Obtain manufacturer’s recommendations for installation site requirements.


Check the site for the fulfillment of the manufacturer’s recommendations (utilities such as electricity, water and gases and
environmental conditions such as humidity, temperature, vibration level and dust).

During installation

Compare computer hardware and software, as received, with purchase order (including software, accessories, spare parts)

Check documentation for completeness (operating manuals, maintenance instructions, standard operating procedures for
testing, safety and validation certificates)
Check computer hardware and peripherals for any damage
Install hardware (computer, peripherals, network devices, cables)
Install software on computer following the manufacturer’s recommendation
Verify correct software installation, e.g., are all files accurately copied onto the computer hard disk? Utilities to do this should be included in the software itself.
Make a back-up copy of the software
Configure network devices and peripherals, e.g., printers and equipment modules
Identify and make a list with a description of all hardware, including drawings where appropriate, e.g., for networked data systems.
Make a list with a description of all software installed on the computer
Store configuration settings either electronically or on paper
List equipment manuals and SOPs
Prepare an installation report
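The “verify correct software installation” step — checking that all files were accurately copied — is often done by comparing checksums against a vendor-supplied manifest. A hypothetical sketch (the manifest format and function names are invented here):

```python
import hashlib
from pathlib import Path

def file_sha256(path: Path) -> str:
    """Compute the SHA-256 checksum of one installed file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_installation(manifest: dict, install_dir: Path) -> list:
    """Return files whose checksum differs from the vendor manifest,
    or which are missing. An empty list means the copy verified."""
    failures = []
    for rel_name, expected in manifest.items():
        target = install_dir / rel_name
        if not target.is_file() or file_sha256(target) != expected:
            failures.append(rel_name)
    return failures
```

The returned list of mismatches can be attached to the installation report as objective evidence of the verification step.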

Installation and installation qualification (IQ) of larger commercial systems is normally performed by a supplier’s representative. Both the supplier’s representative and a representative of the user’s firm should sign off the IQ documents.

(http://www.labcompliance.com/books/computers/default.aspx)

Operational Qualification

“Operational qualification (OQ) is the process of demonstrating that a computer system will function according to its functional specifications in the selected environment.”

Before OQ testing is done, one should always consider what the computer system will be used for. There must be a clear link between testing as part of OQ and the requirement specifications developed in the DQ phase. Testing may be quite extensive if the computer system is complex and if there is little or no information from the supplier on what tests have been performed at the supplier’s site. The extent of testing should be based on a justified and documented risk assessment. Criteria are:

Impact on product quality
Impact on business continuity
Complexity of system
Information from the vendor on type of tests and test environment
Level of customization

The most extensive tests are necessary if the system has been developed for a specific user. In this case the user should test all
functions. For commercial off-the-shelf systems that come with a validation certificate, only those functions should be tested
that are highly critical for the operation or that can be influenced by the environment. An example is data acquisition over a
relatively long distance from analytical instruments at a high acquisition rate. Specific user configurations should also be tested;
for example, correct settings of IP addresses of network devices should be verified through connectivity testing.

Based on the risk factors above, a system risk factor should be estimated. The extent of testing should be defined for each risk level
in a risk management master plan or in the ‘risk’ section of the validation master plan. An example is shown in the table below.
The level of customization is expressed through GAMP Categories 3, 4, and 5: Category 3 is standard software without
customization or configuration settings, Category 4 is a configurable system, and Category 5 is a fully customized system. The
extent of testing increases from the lower left (low risk, standard system) to the upper right (high risk, full customization).

Extent of testing by system risk and GAMP category:

High risk
GAMP 3: Test critical functions. Link tests to requirements.
GAMP 4: Test critical standard functions. Test all non-standard functions. Link tests to requirements.
GAMP 5: Test critical standard functions. Test all non-standard functions. Link tests to requirements.

Medium risk
GAMP 3: Test critical functions.
GAMP 4: Test all critical standard and non-standard functions. Link tests to requirements.
GAMP 5: Test critical standard functions. Test all non-standard functions. Link tests to requirements.

Low risk
GAMP 3: No testing.
GAMP 4: Test critical non-standard functions.
GAMP 5: Test critical non-standard functions.
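The table above amounts to a lookup from (risk level, GAMP category) to required test extent, which is easy to keep machine-readable alongside the validation master plan. The dictionary below simply transcribes the table; the function name is illustrative, not part of GAMP:

```python
# Extent of OQ testing by system risk and GAMP software category,
# transcribed from the risk table above (illustrative, not normative).
TEST_EXTENT = {
    ("high", 3): "Test critical functions. Link tests to requirements.",
    ("high", 4): "Test critical standard functions. Test all non-standard "
                 "functions. Link tests to requirements.",
    ("high", 5): "Test critical standard functions. Test all non-standard "
                 "functions. Link tests to requirements.",
    ("medium", 3): "Test critical functions.",
    ("medium", 4): "Test all critical standard and non-standard functions. "
                   "Link tests to requirements.",
    ("medium", 5): "Test critical standard functions. Test all non-standard "
                   "functions. Link tests to requirements.",
    ("low", 3): "No testing.",
    ("low", 4): "Test critical non-standard functions.",
    ("low", 5): "Test critical non-standard functions.",
}

def required_testing(risk: str, gamp_category: int) -> str:
    """Look up the documented extent of testing for a system."""
    try:
        return TEST_EXTENT[(risk.lower(), gamp_category)]
    except KeyError:
        raise ValueError(f"No rule for risk={risk!r}, GAMP {gamp_category}")
```

For example, `required_testing("Low", 3)` returns "No testing.", matching the table cell for a low-risk standard system.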

Proper functioning of back-up and recovery, and of security functions such as access control to the computer system and to data,
should also be tested. A full OQ test should be performed before the system is used initially and at regular intervals, e.g., for
chromatographic data systems about once a year and after major system updates. Partial OQ tests should be performed after
minor system updates.
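Back-up and recovery lends itself to a scripted round-trip check: back up a known data set, restore it elsewhere, and compare. A minimal sketch, with plain directory copies standing in for the real back-up tool (an actual OQ script would drive the system's own back-up utility):

```python
import filecmp
import shutil
from pathlib import Path

def backup(source: Path, archive: Path) -> None:
    """Stand-in back-up: copy the data directory to the archive location."""
    shutil.copytree(source, archive)

def restore(archive: Path, target: Path) -> None:
    """Stand-in restore: copy the archived directory back."""
    shutil.copytree(archive, target)

def verify_backup_restore(data_dir: Path, work_dir: Path) -> bool:
    """OQ-style round trip: back up, restore elsewhere, compare contents."""
    archive = work_dir / "archive"
    restored = work_dir / "restored"
    backup(data_dir, archive)
    restore(archive, restored)
    cmp = filecmp.dircmp(data_dir, restored)
    # Pass only if nothing is missing on either side and no file differs.
    return not (cmp.left_only or cmp.right_only or cmp.diff_files)
```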

Tests should be quantitative. This means inspectors would expect not only a test protocol with test items and pass/fail
information, but also expected results, acceptance criteria, and actual results. An example of a test protocol template is shown
in figure 8.
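A quantitative test record of this kind can be modeled so that the pass/fail verdict is derived from the acceptance criterion rather than typed in by hand. The field names below are illustrative, not taken from any official template:

```python
from dataclasses import dataclass, field

@dataclass
class TestRecord:
    """One line of a quantitative OQ test protocol (illustrative fields)."""
    test_id: str
    description: str
    expected: float
    tolerance: float   # acceptance criterion: |actual - expected| <= tolerance
    actual: float
    passed: bool = field(init=False)

    def __post_init__(self):
        # Verdict is computed, never entered manually.
        self.passed = abs(self.actual - self.expected) <= self.tolerance
```

A record such as `TestRecord("OQ-4.1", "Detector linearity slope", 1.00, 0.02, 1.01)` then carries expected result, acceptance criterion, actual result and a derived pass/fail, exactly the elements an inspector would look for.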

Tests should be linked to requirement specifications through a test traceability matrix. A template for such a matrix, shown in
the table below, should help to easily find the test protocol for a specific requirement.

The matrix can be documented on paper, but for larger projects it is recommended to use electronic document management
systems. These can range from simple Word tables to databases and software specifically developed for managing
traceability matrices.

Requirement Number    Requirement    Test ID
1.1                   Example 1      4.1, 4.3
1.2                   Example 2      1.2
1.3                   Example 3      3.1
1.4                   Example 4      3.1, 4.1

Performance Qualification

“Performance Qualification (PQ) is the process of demonstrating that a system consistently performs according to a
specification appropriate for its routine use.” Important here is the word ‘consistently’. Consistent computer system
performance depends on regular preventive maintenance (e.g., removal of temporary files), making changes to a system in a
controlled manner, and regular testing.

In practice, PQ can mean testing the system with the entire application. For a computerized analytical system this can mean, for
example, running system suitability tests, where key system performance characteristics are measured and compared
with documented, preset limits.

PQ activities normally include:

A complete system test to prove that the application works as intended. For a computerized analytical system this
can mean running a well-characterized sample through the system and comparing the results with results previously
obtained.
Regression testing: reprocessing data files and comparing the results with previous results
Regular removal of temporary files
Regular virus scans
Auditing computer systems

Most efficient is to use software for automated regression testing. The software runs typical data sets through a series of
applications and calculates and stores the final result using processing parameters defined by the user. During regression
testing the data are processed again and the results are compared with the previously recorded results. Normally such tests take
no more than five minutes but give assurance that the key functions of the system work as intended.
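Such a regression run reduces to: reprocess a reference data set and compare each result with the stored baseline within a tolerance. A minimal sketch, with a hypothetical `process()` function standing in for the real data system's processing step:

```python
import json
from pathlib import Path

def process(raw: list[float]) -> dict[str, float]:
    """Stand-in for the data system's processing step (illustrative)."""
    return {"mean": sum(raw) / len(raw), "max": max(raw)}

def regression_check(raw: list[float], baseline_file: Path,
                     tolerance: float = 1e-9) -> list[str]:
    """Reprocess the reference data and report results that drifted
    from the previously recorded baseline."""
    baseline = json.loads(baseline_file.read_text())
    current = process(raw)
    return [key for key in baseline
            if abs(current.get(key, float("inf")) - baseline[key]) > tolerance]
```

An empty list means the key calculations still reproduce the recorded baseline; any entries would be investigated before the system is released for routine use.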

(http://www.labcompliance.com/books/macros/default.aspx)

Configuration Management and Change Control

Any changes to specifications, programming code, or computer hardware should follow written procedures and be
documented. Changes may be initiated because errors have been found in the program or because additional or different
software functions or hardware may be desirable. Requests for changes should be submitted by users and authorized by the
user’s supervisor or department manager. Forms should be used for the initiation, authorization, and documentation of changes.
An example is shown in figure 5.

Figure 5: Change Request Form

Most important is that changes should follow standard procedures for initiation, authorization, implementing, testing and
documenting. All activities should be planned in the validation project plan and documented in the validation report.

After any change the program should be tested: full testing for the part of the program that has been changed, and regression
testing for the entire program.
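The initiation, authorization, implementation, testing and documentation steps can be captured as an ordered set of stages, so that a change cannot be closed before each transition is recorded. The stage and field names here are illustrative, not taken from any regulation:

```python
from dataclasses import dataclass, field

# Ordered change-control stages; a change may only advance one stage at a time.
STAGES = ["initiated", "authorized", "implemented", "tested", "documented"]

@dataclass
class ChangeRequest:
    """A change request moving through the standard change-control stages."""
    change_id: str
    description: str
    requested_by: str
    stage: str = "initiated"
    history: list = field(default_factory=list)

    def advance(self, approver: str) -> str:
        """Move to the next stage, recording who approved the transition."""
        idx = STAGES.index(self.stage)
        if idx == len(STAGES) - 1:
            raise ValueError(f"{self.change_id} is already fully documented")
        self.stage = STAGES[idx + 1]
        self.history.append((self.stage, approver))
        return self.stage
```

The `history` list plays the role of the signed change request form: every transition carries the name of the person who approved it.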

Validation Report and other Documents

Validation Report

When the validation project is completed, a validation summary report should be generated by the system owner. The report
documents the outcome of the validation project. The validation report should mirror the validation project plan and should
include:

A brief description of the system.
Identification of the system and all software versions that were tested.
Description of hardware used.
Major project activities.
Listing of test protocols, test results and conclusions.
Statement on system status prior to release.
List of all major or critical issues and deviations with risk assessment and corrective actions.
Statement that all tasks have been performed as defined in the project plan.
Statement that validation has been performed according to the documented procedures.
Listing of all deliverables.
Final approval or rejection statement.

The validation report should be reviewed, approved and signed by QA and the system owner.

Standard Operating Procedures

Validation activities should be performed according to written procedures. Generic procedures should be taken from the
corporate SOP list. System-specific procedures should be developed for the system to be validated. Labcompliance has
examples for most of the procedures. They are indicated by S-numbers (S-xxx) in the list below and are either included in the
Computer System Validation Package (http://www.labcompliance.com/books/computers), or can be ordered from the
labcompliance SOP website (http://www.labcompliance.com/solutions/sops/).

Procedures should be available under the same or a similar title as follows:

1. Training for GxP, 21 CFR Part 11 and Computer Validation (S-125).
2. Risk Assessment for Systems Used in GxP Environments (S-134).
3. Validation of Commercial Off-the-Shelf (COTS) Computer Systems (S-271).
4. Validation of Macro Programs and Other Application Software (S-263).
5. Risk-Based Validation of Computer Systems (S-252).
6. Development of User Requirement Specifications for Computers (S-253).
7. Quality Assessment of Software and Computer System Suppliers (S-274).
8. Auditing Software Suppliers: Preparation, Conduct, Follow-up (S-273).
9. Development and Maintenance of Test Scripts for Equipment Hardware, Software and Systems (S-237).
10. Handling of Problems with Software and Computer Systems.
11. Data Back-Up and Restore (S-317).
12. Disaster Recovery of Computer Systems (S-319).
13. Archiving and Retrieval of GMP Data and Other Documents (S-162).
14. Access Control to Computer Systems and Data (S-320).
15. Configuration Management and Version Control of Software (S-259).
16. Change Control of Software and Computer Systems (S-262).
17. Revalidation of Software and Computer Systems (S-260).
18. Retention and Archiving of Electronic Records (S-315).
19. Qualification of PC Clients (S-289).
20. Retirement of Computer Systems (S-261).
21. Review of Computer Systems.
22. Auditing Computer Systems (S-272).

Checklists

Checklists help to verify that validation tasks are identified and performed. However, some validation tasks are specific
to individual systems. Going through a checklist therefore does not mean that everything is covered for each system, nor
that all checklist items are applicable to every system. Labcompliance has examples of checklists related to computer
system validation. They are indicated by E-numbers (E-xxx) in the list below and are either included in the Computer System
Validation Package (http://www.labcompliance.com/books/computers), or can be ordered from the labcompliance Examples
website (http://www.labcompliance.com/solutions/examples/).

Examples are checklists for:

1. Commercial Off-the-Shelf Computer Systems (E-160).
2. Assessment of Software Vendors (E-255).
3. User Requirement Specifications for Software and Computer Systems (E-153).

Templates and Validation Examples

Templates are useful to effectively follow and document validation tasks and results. Validation examples help to get adequate
information on how to conduct validation and to prepare deliverables. Labcompliance has templates and examples for
validation tasks. They are indicated by E-numbers (E-xxx) in the list below and are either included in the Computer System
Validation Package (http://www.labcompliance.com/books/computers), or can be ordered from the labcompliance Examples
website (http://www.labcompliance.com/solutions/examples/).

Such documentation can include templates/examples for:

1. Requirement Specifications for Chromatographic Data Systems (E-255).
2. Requirement Specifications for Excel Applications (E-268).
3. User Requirement Specifications – 20 Good/Bad Examples (E-308).
4. Computer System and Network Identification (E-326).
5. Template/Examples: Test Protocol For Excel™ Spreadsheet Application (with traceability matrix): Includes 12 test scripts
examples for functional testing, boundary testing, out of range testing and test traceability matrices: tests vs. specifications,
specifications vs. test cases and test summary sheet (E-358).
6. Testing of Authorized System Access (E-362).
7. MD5 Checksum File Integrity Check Software with Validation Documentation: DQ, IQ, OQ, PQ (E-306).
—————————————————————————————————————————————————-

General Principles of Software Validation

SECTION 1. PURPOSE

This guidance outlines general validation principles that the Food and Drug Administration (FDA) considers to be applicable
to the validation of medical device software or the validation of software used to design, develop, or manufacture medical
devices. This final guidance document, Version 2.0, supersedes the draft document, General Principles of Software Validation,
Version 1.1, dated June 9, 1997.

SECTION 2. SCOPE

This guidance describes how certain provisions of the medical device Quality System regulation apply to software and the
agency’s current approach to evaluating a software validation system. For example, this document lists elements that are
acceptable to the FDA for the validation of software; however, it does not list all of the activities and tasks that must, in all
instances, be used to comply with the law.

The scope of this guidance is somewhat broader than the scope of validation in the strictest definition of that term. Planning,
verification, testing, traceability, configuration management, and many other aspects of good software engineering discussed
in this guidance are important activities that together help to support a final conclusion that software is validated.

This guidance recommends an integration of software life cycle management and risk management activities. Based on the
intended use and the safety risk associated with the software to be developed, the software developer should determine the
specific approach, the combination of techniques to be used, and the level of effort to be applied. While this guidance does not
recommend any specific life cycle model or any specific technique or method, it does recommend that software validation and
verification activities be conducted throughout the entire software life cycle.

Where the software is developed by someone other than the device manufacturer (e.g., off-the-shelf software) the software
developer may not be directly responsible for compliance with FDA regulations. In that case, the party with regulatory
responsibility (i.e., the device manufacturer) needs to assess the adequacy of the off-the-shelf software developer’s activities
and determine what additional efforts are needed to establish that the software is validated for the device manufacturer’s
intended use.

2.1. APPLICABILITY

This guidance applies to:

Software used as a component, part, or accessory of a medical device;
Software that is itself a medical device (e.g., blood establishment software);
Software used in the production of a device (e.g., programmable logic controllers in manufacturing equipment); and
Software used in implementation of the device manufacturer’s quality system (e.g., software that records and maintains the
device history record).

This document is based on generally recognized software validation principles and, therefore, can be applied to any software.
For FDA purposes, this guidance applies to any software related to a regulated medical device, as defined by Section 201(h) of
the Federal Food, Drug, and Cosmetic Act (the Act) and by current FDA software and regulatory policy. This document does
not specifically identify which software is or is not regulated.

2.2. AUDIENCE

This guidance provides useful information and recommendations to the following individuals:

Persons subject to the medical device Quality System regulation
Persons responsible for the design, development, or production of medical device software
Persons responsible for the design, development, production, or procurement of automated tools used for the design,
development, or manufacture of medical devices or software tools used to implement the quality system itself
FDA Investigators

FDA Compliance Officers
FDA Scientific Reviewers

2.3. THE LEAST BURDENSOME APPROACH

We believe we should consider the least burdensome approach in all areas of medical device regulation. This guidance reflects
our careful review of the relevant scientific and legal requirements and what we believe is the least burdensome way for you to
comply with those requirements. However, if you believe that an alternative approach would be less burdensome, please
contact us so we can consider your point of view. You may send your written comments to the contact person listed in the
preface to this guidance or to the CDRH Ombudsman. Comprehensive information on CDRH’s Ombudsman
(http://www.fda.gov/AboutFDA/CentersOffices/OfficeofMedicalProductsandTobacco/CDRH/CDRHOmbudsman/default.htm),
including ways to contact him, can be found on the Internet.

2.4. REGULATORY REQUIREMENTS FOR SOFTWARE VALIDATION

The FDA’s analysis of 3140 medical device recalls conducted between 1992 and 1998 reveals that 242 of them (7.7%) are
attributable to software failures. Of those software-related recalls, 192 (or 79%) were caused by software defects that were
introduced when changes were made to the software after its initial production and distribution. Software validation and other
related good software engineering practices discussed in this guidance are a principal means of avoiding such defects and
resultant recalls.

Software validation is a requirement of the Quality System regulation, which was published in the Federal Register on October
7, 1996 and took effect on June 1, 1997. (See Title 21 Code of Federal Regulations (CFR) Part 820, and 61 Federal Register (FR)
52602, respectively.) Validation requirements apply to software used as components in medical devices, to software that is itself
a medical device, and to software used in production of the device or in implementation of the device manufacturer’s quality
system.

Unless specifically exempted in a classification regulation, any medical device software product developed after June 1, 1997,
regardless of its device class, is subject to applicable design control provisions. (See 21 CFR §820.30.) This requirement
includes the completion of current development projects, all new development projects, and all changes made to existing
medical device software. Specific requirements for validation of device software are found in 21 CFR §820.30(g). Other design
controls, such as planning, input, verification, and reviews, are required for medical device software. (See 21 CFR §820.30.) The
corresponding documented results from these activities can provide additional support for a conclusion that medical device
software is validated.

Any software used to automate any part of the device production process or any part of the quality system must be validated
for its intended use, as required by 21 CFR §820.70(i). This requirement applies to any software used to automate device
design, testing, component acceptance, manufacturing, labeling, packaging, distribution, complaint handling, or to automate
any other aspect of the quality system.

In addition, computer systems used to create, modify, and maintain electronic records and to manage electronic signatures are
also subject to the validation requirements. (See 21 CFR §11.10(a).) Such computer systems must be validated to ensure
accuracy, reliability, consistent intended performance, and the ability to discern invalid or altered records.

Software for the above applications may be developed in-house or under contract. However, software is frequently purchased
off-the-shelf for a particular intended use. All production and/or quality system software, even if purchased off-the-shelf,
should have documented requirements that fully define its intended use, and information against which testing results and
other evidence can be compared, to show that the software is validated for its intended use.

The use of off-the-shelf software in automated medical devices and in automated manufacturing and quality system operations
is increasing. Off-the-shelf software may have many capabilities, only a few of which are needed by the device manufacturer.
Device manufacturers are responsible for the adequacy of the software used in their devices, and used to produce devices.
When device manufacturers purchase “off-the-shelf” software, they must ensure that it will perform as intended in their
chosen application. For off-the-shelf software used in manufacturing or in the quality system, additional guidance is included
in Section 6.3 (http://www.fda.gov/RegulatoryInformation/Guidances/ucm085281.htm#_Toc517237968) of this document. For
device software, additional useful information may be found in FDA’s Guidance for Industry, FDA Reviewers, and Compliance on
Off-The-Shelf Software Use in Medical Devices. (http://www.fda.gov/RegulatoryInformation/Guidances/ssLINK/ucm073778.htm)

2.5. QUALITY SYSTEM REGULATION VS PRE-MARKET SUBMISSIONS

This document addresses Quality System regulation issues that involve the implementation of software validation. It provides
guidance for the management and control of the software validation process. The management and control of the software
validation process should not be confused with any other validation requirements, such as process validation for an automated
manufacturing process.

Device manufacturers may use the same procedures and records for compliance with quality system and design control
requirements, as well as for pre-market submissions to FDA. This document does not cover any specific safety or efficacy issues
related to software validation. Design issues and documentation requirements for pre-market submissions of regulated
software are not addressed by this document. Specific issues related to safety and efficacy, and the documentation required in
pre-market submissions, should be addressed to the Office of Device Evaluation (ODE), Center for Devices and Radiological
Health (CDRH) or to the Office of Blood Research and Review, Center for Biologics Evaluation and Research (CBER). See the
references in Appendix A (http://www.fda.gov/RegulatoryInformation/Guidances/ucm085281.htm#_Toc517237969) for
applicable FDA guidance documents for pre-market submissions.

SECTION 3. CONTEXT FOR SOFTWARE VALIDATION

Many people have asked for specific guidance on what FDA expects them to do to ensure compliance with the Quality System
regulation with regard to software validation. Information on software validation presented in this document is not new.
Validation of software, using the principles and tasks listed in Sections 4
(http://www.fda.gov/RegulatoryInformation/Guidances/ucm085281.htm#_Toc517237944) and 5
(http://www.fda.gov/RegulatoryInformation/Guidances/ucm085281.htm#_Toc517237955), has been conducted in many
segments of the software industry for well over 20 years.

Due to the great variety of medical devices, processes, and manufacturing facilities, it is not possible to state in one document
all of the specific validation elements that are applicable. However, a general application of several broad concepts can be used
successfully as guidance for software validation. These broad concepts provide an acceptable framework for building a
comprehensive approach to software validation. Additional specific information is available from many of the references listed
in Appendix A (http://www.fda.gov/RegulatoryInformation/Guidances/ucm085281.htm#_Toc517237969).

3.1. DEFINITIONS AND TERMINOLOGY

Unless defined in the Quality System regulation, or otherwise specified below, all other terms used in this guidance are as
defined in the current edition of the FDA Glossary of Computerized System and Software Development Terminology
(http://www.fda.gov/ICECI/Inspections/InspectionGuides/ucm074875.htm).

The medical device Quality System regulation (21 CFR 820.3(k)) defines “establish” to mean “define, document, and
implement.” Where it appears in this guidance, the words “establish” and “established” should be interpreted to have this
same meaning.

Some definitions found in the medical device Quality System regulation can be confusing when compared to commonly used
terminology in the software industry. Examples are requirements, specification, verification, and validation.

3.1.1 Requirements and Specifications

While the Quality System regulation states that design input requirements must be documented, and that specified
requirements must be verified, the regulation does not further clarify the distinction between the terms “requirement” and
“specification.” A requirement can be any need or expectation for a system or for its software. Requirements reflect the stated
or implied needs of the customer, and may be market-based, contractual, or statutory, as well as an organization’s internal
requirements. There can be many different kinds of requirements (e.g., design, functional, implementation, interface,
performance, or physical requirements). Software requirements are typically derived from the system requirements for those
aspects of system functionality that have been allocated to software. Software requirements are typically stated in functional
terms and are defined, refined, and updated as a development project progresses. Success in accurately and completely
documenting software requirements is a crucial factor in successful validation of the resulting software.

A specification is defined as “a document that states requirements.” (See 21 CFR §820.3(y).) It may refer to or include
drawings, patterns, or other relevant documents and usually indicates the means and the criteria whereby conformity with the
requirement can be checked. There are many different kinds of written specifications, e.g., system requirements specification,
software requirements specification, software design specification, software test specification, software integration
specification, etc. All of these documents establish “specified requirements” and are design outputs for which various forms of
verification are necessary.

3.1.2 Verification and Validation

The Quality System regulation is harmonized with ISO 8402:1994, which treats “verification” and “validation” as separate and
distinct terms. On the other hand, many software engineering journal articles and textbooks use the terms “verification” and
“validation” interchangeably, or in some cases refer to software “verification, validation, and testing (VV&T)” as if it is a single
concept, with no distinction among the three terms.

Software verification provides objective evidence that the design outputs of a particular phase of the software development
life cycle meet all of the specified requirements for that phase. Software verification looks for consistency, completeness, and
correctness of the software and its supporting documentation, as it is being developed, and provides support for a subsequent
conclusion that software is validated. Software testing is one of many verification activities intended to confirm that software
development output meets its input requirements. Other verification activities include various static and dynamic analyses,
code and document inspections, walkthroughs, and other techniques.

Software validation is a part of the design validation for a finished device, but is not separately defined in the Quality System
regulation. For purposes of this guidance, FDA considers software validation to be “confirmation by examination and
provision of objective evidence that software specifications conform to user needs and intended uses, and that the
particular requirements implemented through software can be consistently fulfilled.” In practice, software validation
activities may occur both during, as well as at the end of the software development life cycle to ensure that all requirements
have been fulfilled. Since software is usually part of a larger hardware system, the validation of software typically includes
evidence that all software requirements have been implemented correctly and completely and are traceable to system
requirements. A conclusion that software is validated is highly dependent upon comprehensive software testing, inspections,
analyses, and other verification tasks performed at each stage of the software development life cycle. Testing of device software
functionality in a simulated use environment, and user site testing are typically included as components of an overall design
validation program for a software automated device.

Software verification and validation are difficult because a developer cannot test forever, and it is hard to know how much
evidence is enough. In large measure, software validation is a matter of developing a “level of confidence” that the device
meets all requirements and user expectations for the software automated functions and features of the device. Measures such
as defects found in specifications documents, estimates of defects remaining, testing coverage, and other techniques are all
used to develop an acceptable level of confidence before shipping the product. The level of confidence, and therefore the level
of software validation, verification, and testing effort needed, will vary depending upon the safety risk (hazard) posed by the
automated functions of the device. Additional guidance regarding safety risk management for software may be found in
Section 4 of FDA’s Guidance for the Content of Pre-market Submissions for Software Contained in Medical Devices, and in the
international standards ISO/IEC 14971-1 and IEC 60601-1-4 referenced in Appendix A
(http://www.fda.gov/RegulatoryInformation/Guidances/ucm085281.htm#_Toc517237969).

3.1.3 IQ/OQ/PQ

For many years, both FDA and regulated industry have attempted to understand and define software validation within the
context of process validation terminology. For example, industry documents and other FDA validation guidance sometimes
describe user site software validation in terms of installation qualification (IQ), operational qualification (OQ) and performance
qualification (PQ). Definitions of these terms and additional information regarding IQ/OQ/PQ may be found in FDA’s Guideline
on General Principles of Process Validation, dated May 11, 1987, and in FDA’s Glossary of Computerized System and Software
Development Terminology (http://www.fda.gov/ICECI/Inspections/InspectionGuides/ucm074875.htm), dated August 1995.

While IQ/OQ/PQ terminology has served its purpose well and is one of many legitimate ways to organize software validation
tasks at the user site, this terminology may not be well understood among many software professionals, and it is not used
elsewhere in this document. However, both FDA personnel and device manufacturers need to be aware of these differences in
terminology as they ask for and provide information regarding software validation.

3.2. SOFTWARE DEVELOPMENT AS PART OF SYSTEM DESIGN

The decision to implement system functionality using software is one that is typically made during system design. Software
requirements are typically derived from the overall system requirements and design for those aspects in the system that are to
be implemented using software. There are user needs and intended uses for a finished device, but users typically do not specify
whether those requirements are to be met by hardware, software, or some combination of both. Therefore, software validation
must be considered within the context of the overall design validation for the system.

A documented requirements specification represents the user’s needs and intended uses from which the product is developed.
A primary goal of software validation is to then demonstrate that all completed software products comply with all documented
software and system requirements. The correctness and completeness of both the system requirements and the software
requirements should be addressed as part of the design validation process for the device. Software validation includes
confirmation of conformance to all software specifications and confirmation that all software requirements are traceable to the
system specifications. Confirmation is an important part of the overall design validation to ensure that all aspects of the
medical device conform to user needs and intended uses.

3.3. SOFTWARE IS DIFFERENT FROM HARDWARE

https://ajaykulkarnisoftwaretesting.wordpress.com/computer-system-validationcsv/ 49/66
9/21/2020 Computer System Validation(CSV) | Ajay Kulkarni

While software shares many of the same engineering tasks as hardware, it has some very important differences. For example:

The vast majority of software problems are traceable to errors made during the design and development process. While the
quality of a hardware product is highly dependent on design, development and manufacture, the quality of a software
product is dependent primarily on design and development with a minimum concern for software manufacture. Software
manufacturing consists of reproduction that can be easily verified. It is not difficult to manufacture thousands of program
copies that function exactly the same as the original; the difficulty comes in getting the original program to meet all
specifications.
One of the most significant features of software is branching, i.e., the ability to execute alternative series of
commands, based on differing inputs. This feature is a major contributing factor for another characteristic of software – its
complexity. Even short programs can be very complex and difficult to fully understand.
Typically, testing alone cannot fully verify that software is complete and correct. In addition to testing, other verification
techniques and a structured and documented development process should be combined to ensure a comprehensive
validation approach.
Unlike hardware, software is not a physical entity and does not wear out. In fact, software may improve with age, as latent
defects are discovered and removed. However, as software is constantly updated and changed, such improvements are
sometimes countered by new defects introduced into the software during the change.
Unlike some hardware failures, software failures occur without advance warning. The software's branching, which allows it
to follow differing paths during execution, may hide some latent defects until long after a software product has been
introduced into the marketplace.
Another related characteristic of software is the speed and ease with which it can be changed. This factor can cause both
software and non-software professionals to believe that software problems can be corrected easily. Combined with a lack of
understanding of software, it can lead managers to believe that tightly controlled engineering is not needed as much for
software as it is for hardware. In fact, the opposite is true. Because of its complexity, the development process for software
should be even more tightly controlled than for hardware, in order to prevent problems that cannot be easily detected
later in the development process.
Seemingly insignificant changes in software code can create unexpected and very significant problems elsewhere in the
software program. The software development process should be sufficiently well planned, controlled, and documented to
detect and correct unexpected results from software changes.
Given the high demand for software professionals and the highly mobile workforce, the software personnel who make
maintenance changes to software may not have been involved in the original software development. Therefore, accurate
and thorough documentation is essential.
Historically, software components have not been as frequently standardized and interchangeable as hardware components.
However, medical device software developers are beginning to use component-based development tools and techniques.
Object-oriented methodologies and the use of off-the-shelf software components hold promise for faster and less expensive
software development. However, component-based approaches require very careful attention during integration. Prior to
integration, time is needed to fully define and develop reusable software code and to fully understand the behavior of off-
the-shelf components.

For these and other reasons, software engineering needs an even greater level of managerial scrutiny and control than does
hardware engineering.
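The branching point above can be made concrete with a small sketch (the function and its inputs are hypothetical, invented purely for illustration): even three independent branch points yield 2**3 = 8 distinct execution paths, each of which a verification effort must account for.

```python
from itertools import product

def classify(temp_c: float, humidity: float, fan_on: bool) -> str:
    # Three independent branch points -> up to 2**3 = 8 distinct paths.
    state = []
    if temp_c > 30.0:   # branch 1
        state.append("hot")
    if humidity > 0.8:  # branch 2
        state.append("humid")
    if fan_on:          # branch 3
        state.append("fan")
    return "+".join(state) or "nominal"

# Enumerating one input on each side of every branch exercises all eight paths.
paths = {classify(t, h, f)
         for t, h, f in product((20.0, 35.0), (0.5, 0.9), (False, True))}
assert len(paths) == 8
```

Path count grows exponentially with branch points, which is why testing alone cannot fully verify non-trivial software.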

3.4. BENEFITS OF SOFTWARE VALIDATION

Software validation is a critical tool used to assure the quality of device software and software automated operations. Software
validation can increase the usability and reliability of the device, resulting in decreased failure rates, fewer recalls and
corrective actions, less risk to patients and users, and reduced liability to device manufacturers. Software validation can also
reduce long term costs by making it easier and less costly to reliably modify software and revalidate software changes.
Software maintenance can represent a very large percentage of the total cost of software over its entire life cycle. An established
comprehensive software validation process helps to reduce the long-term cost of software by reducing the cost of validation for
each subsequent release of the software.

3.5 DESIGN REVIEW

Design reviews are documented, comprehensive, and systematic examinations of a design to evaluate the adequacy of the
design requirements, to evaluate the capability of the design to meet these requirements, and to identify problems. While there
may be many informal technical reviews that occur within the development team during a software project, a formal design
review is more structured and includes participation from others outside the development team. Formal design reviews may
reference or include results from other formal and informal reviews. Design reviews may be conducted separately for the
software, after the software is integrated with the hardware into the system, or both. Design reviews should include

examination of development plans, requirements specifications, design specifications, testing plans and procedures, all other
documents and activities associated with the project, verification results from each stage of the defined life cycle, and
validation results for the overall device.

Design review is a primary tool for managing and evaluating development projects. For example, formal design reviews allow
management to confirm that all goals defined in the software validation plan have been achieved. The Quality System
regulation requires that at least one formal design review be conducted during the device design process. However, it is
recommended that multiple design reviews be conducted (e.g., at the end of each software life cycle activity, in preparation for
proceeding to the next activity). Formal design review is especially important at or near the end of the requirements activity,
before major resources have been committed to specific design solutions. Problems found at this point can be resolved more
easily, save time and money, and reduce the likelihood of missing a critical issue.

Answers to some key questions should be documented during formal design reviews. These include:

Have the appropriate tasks and expected results, outputs, or products been established for each software life cycle activity?
Do the tasks and expected results, outputs, or products of each software life cycle activity:
Comply with the requirements of other software life cycle activities in terms of correctness, completeness, consistency,
and accuracy?
Satisfy the standards, practices, and conventions of that activity?
Establish a proper basis for initiating tasks for the next software life cycle activity?

SECTION 4. PRINCIPLES OF SOFTWARE VALIDATION

This section lists the general principles that should be considered for the validation of software.

4.1. REQUIREMENTS

A documented software requirements specification provides a baseline for both validation and verification. The software
validation process cannot be completed without an established software requirements specification (Ref: 21 CFR 820.3(z) and
(aa) and 820.30(f) and (g)).

4.2. DEFECT PREVENTION

Software quality assurance needs to focus on preventing the introduction of defects into the software development process and
not on trying to “test quality into” the software code after it is written. Software testing is very limited in its ability to surface all
latent defects in software code. For example, the complexity of most software prevents it from being exhaustively tested.
Software testing is a necessary activity. However, in most cases software testing by itself is not sufficient to establish
confidence that the software is fit for its intended use. In order to establish that confidence, software developers should use a
mixture of methods and techniques to prevent software errors and to detect software errors that do occur. The “best mix” of
methods depends on many factors including the development environment, application, size of project, language, and risk.

4.3. TIME AND EFFORT

To build a case that the software is validated requires time and effort. Preparation for software validation should begin early,
i.e., during design and development planning and design input. The final conclusion that the software is validated should be
based on evidence collected from planned efforts conducted throughout the software lifecycle.

4.4. SOFTWARE LIFE CYCLE

Software validation takes place within the environment of an established software life cycle. The software life cycle contains
software engineering tasks and documentation necessary to support the software validation effort. In addition, the software life
cycle contains specific verification and validation tasks that are appropriate for the intended use of the software. This guidance
does not recommend any particular life cycle model – only that a life cycle model should be selected and used for a software
development project.

4.5. PLANS

The software validation process is defined and controlled through the use of a plan. The software validation plan defines
“what” is to be accomplished through the software validation effort. Software validation plans are a significant quality system
tool. Software validation plans specify areas such as scope, approach, resources, schedules and the types and extent of
activities, tasks, and work items.

4.6. PROCEDURES

The software validation process is executed through the use of procedures. These procedures establish “how” to conduct the
software validation effort. The procedures should identify the specific actions or sequence of actions that must be taken to
complete individual validation activities, tasks, and work items.

4.7. SOFTWARE VALIDATION AFTER A CHANGE

Due to the complexity of software, a seemingly small local change may have a significant global system impact. When any
change (even a small change) is made to the software, the validation status of the software needs to be re-established.
Whenever software is changed, a validation analysis should be conducted not just for validation of the individual change,
but also to determine the extent and impact of that change on the entire software system. Based on this analysis, the software
developer should then conduct an appropriate level of software regression testing to show that unchanged but vulnerable
portions of the system have not been adversely affected. Design controls and appropriate regression testing provide the
confidence that the software is validated after a software change.
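The regression principle can be sketched as follows (the function, its cases, and its units are hypothetical examples, not from this guidance): previously verified behavior is captured as a fixed set of cases, and the full set is re-run after any change to the module, however small and however distant from the changed code.

```python
def dose_volume(weight_kg, mg_per_kg, concentration_mg_per_ml):
    """Return the dose volume in mL for a weight-based dose."""
    if weight_kg <= 0 or mg_per_kg <= 0 or concentration_mg_per_ml <= 0:
        raise ValueError("all inputs must be positive")
    return (weight_kg * mg_per_kg) / concentration_mg_per_ml

# Regression cases record previously verified behavior. After every change to
# this module, the entire set is re-run, not only tests for the change itself.
REGRESSION_CASES = [
    ((70.0, 2.0, 10.0), 14.0),  # (inputs), expected mL
    ((0.5, 1.0, 1.0), 0.5),
]

for args, expected in REGRESSION_CASES:
    assert abs(dose_volume(*args) - expected) < 1e-9, args
```

A failing case flags a vulnerable portion of the system that the change has adversely affected, triggering the impact analysis described above.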

4.8. VALIDATION COVERAGE

Validation coverage should be based on the software’s complexity and safety risk – not on firm size or resource constraints.
The selection of validation activities, tasks, and work items should be commensurate with the complexity of the software
design and the risk associated with the use of the software for the specified intended use. For lower risk devices, only baseline
validation activities may be conducted. As the risk increases, additional validation activities should be added to cover the
additional risk. Validation documentation should be sufficient to demonstrate that all software validation plans and procedures
have been completed successfully.

4.9. INDEPENDENCE OF REVIEW

Validation activities should be conducted using the basic quality assurance precept of “independence of review.” Self-
validation is extremely difficult. When possible, an independent evaluation is always better, especially for higher risk
applications. Some firms contract out for a third-party independent verification and validation, but this solution may not
always be feasible. Another approach is to assign internal staff members that are not involved in a particular design or its
implementation, but who have sufficient knowledge to evaluate the project and conduct the verification and validation
activities. Smaller firms may need to be creative in how tasks are organized and assigned in order to maintain internal
independence of review.

4.10. FLEXIBILITY AND RESPONSIBILITY

Specific implementation of these software validation principles may be quite different from one application to another. The
device manufacturer has flexibility in choosing how to apply these validation principles, but retains ultimate responsibility for
demonstrating that the software has been validated.

Software is designed, developed, validated, and regulated in a wide spectrum of environments, and for a wide variety of
devices with varying levels of risk. FDA regulated medical device applications include software that:

Is a component, part, or accessory of a medical device;
Is itself a medical device; or
Is used in manufacturing, design and development, or other parts of the quality system.

In each environment, software components from many sources may be used to create the application (e.g., in-house developed
software, off-the-shelf software, contract software, shareware). In addition, software components come in many different forms
(e.g., application software, operating systems, compilers, debuggers, configuration management tools, and many more). The
validation of software in these environments can be a complex undertaking; therefore, it is appropriate that all of these
software validation principles be considered when designing the software validation process. The resultant software validation
process should be commensurate with the safety risk associated with the system, device, or process.

Software validation activities and tasks may be dispersed, occurring at different locations and being conducted by different
organizations. However, regardless of the distribution of tasks, contractual relations, source of components, or the
development environment, the device manufacturer or specification developer retains ultimate responsibility for ensuring that
the software is validated.

SECTION 5. ACTIVITIES AND TASKS

Software validation is accomplished through a series of activities and tasks that are planned and executed at various stages of
the software development life cycle. These tasks may be one-time occurrences or may be iterated many times, depending on the
life cycle model used and the scope of changes made as the software project progresses.

5.1. SOFTWARE LIFE CYCLE ACTIVITIES

This guidance does not recommend the use of any specific software life cycle model. Software developers should establish a
software life cycle model that is appropriate for their product and organization. The software life cycle model that is selected
should cover the software from its birth to its retirement. Activities in a typical software life cycle model include the following:

Quality Planning
System Requirements Definition
Detailed Software Requirements Specification
Software Design Specification
Construction or Coding
Testing
Installation
Operation and Support
Maintenance
Retirement

Verification, testing, and other tasks that support software validation occur during each of these activities. A life cycle model
organizes these software development activities in various ways and provides a framework for monitoring and controlling the
software development project. Several software life cycle models (e.g., waterfall, spiral, rapid prototyping, incremental
development, etc.) are defined in FDA’s Glossary of Computerized System and Software Development Terminology
(http://www.fda.gov/ICECI/Inspections/InspectionGuides/ucm074875.htm), dated August 1995. These and many other life
cycle models are described in various references listed in Appendix A
(http://www.fda.gov/RegulatoryInformation/Guidances/ucm085281.htm#_Toc517237969).

5.2. TYPICAL TASKS SUPPORTING VALIDATION

For each of the software life cycle activities, there are certain “typical” tasks that support a conclusion that the software is
validated. However, the specific tasks to be performed, their order of performance, and the iteration and timing of their
performance will be dictated by the specific software life cycle model that is selected and the safety risk associated with the
software application. For very low risk applications, certain tasks may not be needed at all. However, the software developer
should at least consider each of these tasks and should define and document which tasks are or are not appropriate for their
specific application. The following discussion is generic and is not intended to prescribe any particular software life cycle
model or any particular order in which tasks are to be performed.

5.2.1. Quality Planning

Design and development planning should culminate in a plan that identifies necessary tasks, procedures for anomaly reporting
and resolution, necessary resources, and management review requirements, including formal design reviews. A software life
cycle model and associated activities should be identified, as well as those tasks necessary for each software life cycle activity.
The plan should include:

The specific tasks for each life cycle activity;
Enumeration of important quality factors (e.g., reliability, maintainability, and usability);
Methods and procedures for each task;
Task acceptance criteria;
Criteria for defining and documenting outputs in terms that will allow evaluation of their conformance to input
requirements;
Inputs for each task;
Outputs from each task;
Roles, resources, and responsibilities for each task;
Risks and assumptions; and
Documentation of user needs.

Management must identify and provide the appropriate software development environment and resources. (See 21 CFR
§820.20(b)(1) and (2).) Typically, each task requires personnel as well as physical resources. The plan should identify the
personnel, the facility and equipment resources for each task, and the role that risk (hazard) management will play. A
configuration management plan should be developed that will guide and control multiple parallel development activities and
ensure proper communications and documentation. Controls are necessary to ensure positive and correct correspondence
among all approved versions of the specifications documents, source code, object code, and test suites that comprise a software
system. The controls also should ensure accurate identification of, and access to, the currently approved versions.

Procedures should be created for reporting and resolving software anomalies found through validation or other activities.
Management should identify the reports and specify the contents, format, and responsible organizational elements for each
report. Procedures also are necessary for the review and approval of software development results, including the responsible
organizational elements for such reviews and approvals.

Typical Tasks – Quality Planning

Risk (Hazard) Management Plan
Configuration Management Plan
Software Quality Assurance Plan
– Software Verification and Validation Plan
Verification and Validation Tasks, and Acceptance Criteria
Schedule and Resource Allocation (for software verification and validation activities)
Reporting Requirements
– Formal Design Review Requirements
– Other Technical Review Requirements
Problem Reporting and Resolution Procedures
Other Support Activities

5.2.2. Requirements

Requirements development includes the identification, analysis, and documentation of information about the device and its
intended use. Areas of special importance include allocation of system functions to hardware/software, operating conditions,
user characteristics, potential hazards, and anticipated tasks. In addition, the requirements should state clearly the intended
use of the software.

The software requirements specification document should contain a written definition of the software functions. It is not
possible to validate software without predetermined and documented software requirements. Typical software requirements
specify the following:

All software system inputs;
All software system outputs;
All functions that the software system will perform;
All performance requirements that the software will meet (e.g., data throughput, reliability, and timing);
The definition of all external and user interfaces, as well as any internal software-to-system interfaces;
How users will interact with the system;
What constitutes an error and how errors should be handled;
Required response times;
The intended operating environment for the software, if this is a design constraint (e.g., hardware platform, operating
system);
All ranges, limits, defaults, and specific values that the software will accept; and
All safety related requirements, specifications, features, or functions that will be implemented in software.

Software safety requirements are derived from a technical risk management process that is closely integrated with the system
requirements development process. Software requirement specifications should identify clearly the potential hazards that can
result from a software failure in the system as well as any safety requirements to be implemented in software. The

consequences of software failure should be evaluated, along with means of mitigating such failures (e.g., hardware mitigation,
defensive programming, etc.). From this analysis, it should be possible to identify the most appropriate measures necessary to
prevent harm.

The Quality System regulation requires a mechanism for addressing incomplete, ambiguous, or conflicting requirements. (See
21 CFR 820.30(c).) Each requirement (e.g., hardware, software, user, operator interface, and safety) identified in the software
requirements specification should be evaluated for accuracy, completeness, consistency, testability, correctness, and clarity. For
example, software requirements should be evaluated to verify that:

There are no internal inconsistencies among requirements;
All of the performance requirements for the system have been spelled out;
Fault tolerance, safety, and security requirements are complete and correct;
Allocation of software functions is accurate and complete;
Software requirements are appropriate for the system hazards; and
All requirements are expressed in terms that are measurable or objectively verifiable.
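The last point, that requirements be measurable or objectively verifiable, can be illustrated with a hedged sketch (the limit, function, and input are hypothetical): "the system shall respond quickly" cannot be tested, whereas "the system shall respond within 500 ms" yields a pass/fail check.

```python
import time

RESPONSE_LIMIT_S = 0.5  # the quantified requirement: respond within 500 ms

def process(request: str) -> str:
    # Stand-in for the operation the requirement constrains.
    return request.upper()

start = time.perf_counter()
result = process("sample request")
elapsed = time.perf_counter() - start
assert elapsed < RESPONSE_LIMIT_S  # objectively verifiable pass/fail criterion
```

Stating the limit numerically in the requirements specification is what makes this check possible during later test activities.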

A software requirements traceability analysis should be conducted to trace software requirements to (and from) system
requirements and to risk analysis results. In addition to any other analyses and documentation used to verify software
requirements, a formal design review is recommended to confirm that requirements are fully specified and appropriate before
extensive software design efforts begin. Requirements can be approved and released incrementally, but care should be taken
that interactions and interfaces among software (and hardware) requirements are properly reviewed, analyzed, and controlled.
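A traceability analysis of this kind can be sketched as a simple bidirectional check (all identifiers SYS-*, SRS-*, and HAZ-* are invented for illustration): every software requirement must trace forward to a system requirement and a risk-analysis item, and every system requirement allocated to software must be covered by at least one software requirement.

```python
system_reqs = {"SYS-1", "SYS-2"}  # system requirements allocated to software
risk_items = {"HAZ-1", "HAZ-3"}   # risk analysis results

# Each software requirement records its trace links.
trace = {
    "SRS-1": {"system": "SYS-1", "risk": "HAZ-3"},
    "SRS-2": {"system": "SYS-2", "risk": "HAZ-1"},
}

# Forward check: every software requirement points at real system and risk items.
untraced = [r for r, t in trace.items()
            if t["system"] not in system_reqs or t["risk"] not in risk_items]
# Backward check: no allocated system requirement is left uncovered.
uncovered = system_reqs - {t["system"] for t in trace.values()}

assert untraced == [] and uncovered == set()
```

Any entry in `untraced` or `uncovered` marks a requirement that needs review before design work proceeds.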

Typical Tasks – Requirements

Preliminary Risk Analysis
Traceability Analysis
– Software Requirements to System Requirements (and vice versa)
– Software Requirements to Risk Analysis

Description of User Characteristics
Listing of Characteristics and Limitations of Primary and Secondary Memory
Software Requirements Evaluation
Software User Interface Requirements Analysis
System Test Plan Generation
Acceptance Test Plan Generation
Ambiguity Review or Analysis

5.2.3. Design

In the design process, the software requirements specification is translated into a logical and physical representation of the
software to be implemented. The software design specification is a description of what the software should do and how it
should do it. Due to complexity of the project or to enable persons with varying levels of technical responsibilities to clearly
understand design information, the design specification may contain both a high level summary of the design and detailed
design information. The completed software design specification constrains the programmer/coder to stay within the intent of
the agreed upon requirements and design. A complete software design specification will relieve the programmer from the need
to make ad hoc design decisions.

The software design needs to address human factors. Use error caused by designs that are either overly complex or contrary to
users’ intuitive expectations for operation is one of the most persistent and critical problems encountered by FDA. Frequently,
the design of the software is a factor in such use errors. Human factors engineering should be woven into the entire design and
development process, including the device design requirements, analyses, and tests. Device safety and usability issues should
be considered when developing flowcharts, state diagrams, prototyping tools, and test plans. Also, task and function analyses,
risk analyses, prototype tests and reviews, and full usability tests should be performed. Participants from the user population
should be included when applying these methodologies.

The software design specification should include:

Software requirements specification, including predetermined criteria for acceptance of the software;
Software risk analysis;
Development procedures and coding guidelines (or other programming procedures);
Systems documentation (e.g., a narrative or a context diagram) that describes the systems context in which the program is
intended to function, including the relationship of hardware, software, and the physical environment;
Hardware to be used;
Parameters to be measured or recorded;
Logical structure (including control logic) and logical processing steps (e.g., algorithms);
Data structures and data flow diagrams;
Definitions of variables (control and data) and description of where they are used;

Error, alarm, and warning messages;
Supporting software (e.g., operating systems, drivers, other application software);
Communication links (links among internal modules of the software, links with the supporting software, links with the
hardware, and links with the user);
Security measures (both physical and logical security); and
Any additional constraints not identified in the above elements.

The first four of the elements noted above usually are separate pre-existing documents that are included by reference in the
software design specification. Software requirements specification was discussed in the preceding section, as was software risk
analysis. Written development procedures serve as a guide to the organization, and written programming procedures serve as
a guide to individual programmers. As software cannot be validated without knowledge of the context in which it is intended
to function, systems documentation is referenced. If some of the above elements are not included in the software, it may be
helpful to future reviewers and maintainers of the software if that is clearly stated (e.g., There are no error messages in this
program).

The activities that occur during software design have several purposes. Software design evaluations are conducted to
determine if the design is complete, correct, consistent, unambiguous, feasible, and maintainable. Appropriate consideration of
software architecture (e.g., modular structure) during design can reduce the magnitude of future validation efforts when
software changes are needed. Software design evaluations may include analyses of control flow, data flow, complexity, timing,
sizing, memory allocation, criticality analysis, and many other aspects of the design. A traceability analysis should be
conducted to verify that the software design implements all of the software requirements. As a technique for identifying where
requirements are not sufficient, the traceability analysis should also verify that all aspects of the design are traceable to
software requirements. An analysis of communication links should be conducted to evaluate the proposed design with respect
to hardware, user, and related software requirements. The software risk analysis should be re-examined to determine whether
any additional hazards have been identified and whether any new hazards have been introduced by the design.

At the end of the software design activity, a Formal Design Review should be conducted to verify that the design is correct,
consistent, complete, accurate, and testable, before moving to implement the design. Portions of the design can be approved
and released incrementally for implementation; but care should be taken that interactions and communication links among
various elements are properly reviewed, analyzed, and controlled.

Most software development models will be iterative. This is likely to result in several versions of both the software requirement
specification and the software design specification. All approved versions should be archived and controlled in accordance
with established configuration management procedures.

Typical Tasks – Design

Updated Software Risk Analysis
Traceability Analysis – Design Specification to Software Requirements (and vice versa)
Software Design Evaluation
Design Communication Link Analysis
Module Test Plan Generation
Integration Test Plan Generation
Test Design Generation (module, integration, system, and acceptance)

5.2.4. Construction or Coding

Software may be constructed either by coding (i.e., programming) or by assembling together previously coded software
components (e.g., from code libraries, off-the-shelf software, etc.) for use in a new application. Coding is the software activity
where the detailed design specification is implemented as source code. Coding is the lowest level of abstraction for the
software development process. It is the last stage in decomposition of the software requirements where module specifications
are translated into a programming language.

Coding usually involves the use of a high-level programming language, but may also entail the use of assembly language (or
microcode) for time-critical operations. The source code may be either compiled or interpreted for use on a target hardware
platform. Decisions on the selection of programming languages and software build tools (assemblers, linkers, and compilers)
should include consideration of the impact on subsequent quality evaluation tasks (e.g., availability of debugging and testing
tools for the chosen language). Some compilers offer optional levels and commands for error checking to assist in debugging
the code. Different levels of error checking may be used throughout the coding process, and warnings or other messages from
the compiler may or may not be recorded. However, at the end of the coding and debugging process, the most rigorous level of
error checking is normally used to document what compilation errors still remain in the software. If the most rigorous level of
error checking is not used for final translation of the source code, then justification for use of the less rigorous translation error
checking should be documented. Also, for the final compilation, there should be documentation of the compilation process and
its outcome, including any warnings or other messages from the compiler and their resolution, or justification for the decision
to leave issues unresolved.

https://ajaykulkarnisoftwaretesting.wordpress.com/computer-system-validationcsv/ 56/66
9/21/2020 Computer System Validation(CSV) | Ajay Kulkarni
Firms frequently adopt specific coding guidelines that establish quality policies and procedures related to the software coding
process. Source code should be evaluated to verify its compliance with specified coding guidelines. Such guidelines should
include coding conventions regarding clarity, style, complexity management, and commenting. Code comments should
provide useful and descriptive information for a module, including expected inputs and outputs, variables referenced,
expected data types, and operations to be performed. Source code should also be evaluated to verify its compliance with the
corresponding detailed design specification. Modules ready for integration and test should have documentation of compliance
with coding guidelines and any other applicable quality policies and procedures.
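As a sketch of the commenting conventions described above, a module might document its expected inputs and outputs, variables referenced, expected data types, and operations performed directly in its header. The function, value ranges, and gain factor here are hypothetical, not taken from any particular coding guideline:

```python
def convert_reading(raw_count: int, gain: float = 0.01) -> float:
    """Convert a raw sensor count to a calibrated value.

    Expected inputs:
        raw_count -- integer ADC count in the range 0..4095 (12-bit converter)
        gain      -- float scale factor applied to the count
    Output:
        float calibrated reading (raw_count * gain)
    Operations:
        Validates the input range, then applies the linear gain.
    """
    if not 0 <= raw_count <= 4095:
        raise ValueError(f"raw_count {raw_count} outside 0..4095")
    return raw_count * gain
```

A reviewer performing a source code evaluation can compare such a header against the detailed design specification for the module without having to reverse-engineer the code body.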

Source code evaluations are often implemented as code inspections and code walkthroughs. Such static analyses provide a
very effective means to detect errors before execution of the code. They allow for examination of each error in isolation and can
also help in focusing later dynamic testing of the software. Firms may use manual (desk) checking with appropriate controls to
ensure consistency and independence. Source code evaluations should be extended to verification of internal linkages between
modules and layers (horizontal and vertical interfaces), and compliance with their design specifications. Documentation of the
procedures used and the results of source code evaluations should be maintained as part of design verification.

A source code traceability analysis is an important tool to verify that all code is linked to established specifications and
established test procedures. A source code traceability analysis should be conducted and documented to verify that:

Each element of the software design specification has been implemented in code;
Modules and functions implemented in code can be traced back to an element in the software design specification and to
the risk analysis;
Tests for modules and functions can be traced back to an element in the software design specification and to the risk
analysis; and
Tests for modules and functions can be traced to source code for the same modules and functions.
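The four checks above amount to comparing sets of mappings in both directions. A minimal sketch of such a traceability analysis follows; the design element identifiers and module names are hypothetical:

```python
# Hypothetical traceability data: which design element each code module
# and each test is traced to.
design_elements = {"DS-1", "DS-2", "DS-3"}
code_to_design = {"parse_input": "DS-1", "check_limits": "DS-2", "log_event": "DS-3"}
test_to_design = {"test_parse_input": "DS-1", "test_check_limits": "DS-2",
                  "test_log_event": "DS-3"}

def trace_gaps(design, code_map, test_map):
    """Return design elements lacking code or tests, and orphan mappings."""
    coded = set(code_map.values())
    tested = set(test_map.values())
    return {
        "design_without_code": design - coded,   # spec elements never implemented
        "design_without_test": design - tested,  # spec elements never tested
        "code_orphans": coded - design,          # code traced to no design element
        "test_orphans": tested - design,         # tests traced to no design element
    }
```

Any non-empty set in the result is a traceability gap to be resolved and documented before the analysis is considered complete.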

Typical Tasks – Construction or Coding

Traceability Analyses
– Source Code to Design Specification (and vice versa)
– Test Cases to Source Code and to Design Specification

Source Code and Source Code Documentation Evaluation
Source Code Interface Analysis
Test Procedure and Test Case Generation (module, integration, system, and acceptance)

5.2.5. Testing by the Software Developer

Software testing entails running software products under known conditions with defined inputs and documented outcomes
that can be compared to their predefined expectations. It is a time-consuming, difficult, and imperfect activity. As such, it
requires early planning in order to be effective and efficient.

Test plans and test cases should be created as early in the software development process as feasible. They should identify the
schedules, environments, resources (personnel, tools, etc.), methodologies, cases (inputs, procedures, outputs, expected
results), documentation, and reporting criteria. The magnitude of effort to be applied throughout the testing process can be
linked to complexity, criticality, reliability, and/or safety issues (e.g., requiring functions or modules that produce critical
outcomes to be challenged with intensive testing of their fault tolerance features). Descriptions of categories of software and
software testing effort appear in the literature, for example:

NIST Special Publication 500-235, Structured Testing: A Testing Methodology Using the Cyclomatic Complexity Metric;
NUREG/CR-6293, Verification and Validation Guidelines for High Integrity Systems; and
IEEE Computer Society Press, Handbook of Software Reliability Engineering.

Software test plans should identify the particular tasks to be conducted at each stage of development and include justification
of the level of effort represented by their corresponding completion criteria.

Software testing has limitations that must be recognized and considered when planning the testing of a particular software
product. Except for the simplest of programs, software cannot be exhaustively tested. Generally it is not feasible to test a
software product with all possible inputs, nor is it possible to test all possible data processing paths that can occur during
program execution. There is no one type of testing or testing methodology that can ensure a particular software product has
been thoroughly tested. Testing of all program functionality does not mean all of the program has been tested. Testing of all of
a program’s code does not mean all necessary functionality is present in the program. Testing of all program functionality and
all program code does not mean the program is 100% correct! Software testing that finds no errors should not be interpreted to
mean that errors do not exist in the software product; it may mean the testing was superficial.

An essential element of a software test case is the expected result. It is the key detail that permits objective evaluation of the
actual test result. This necessary testing information is obtained from the corresponding, predefined definition or specification.
A software specification document must identify what, when, how, why, etc., is to be achieved with an engineering (i.e.,

measurable or objectively verifiable) level of detail in order for it to be confirmed through testing. The real effort of effective
software testing lies in the definition of what is to be tested rather than in the performance of the test.
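To make the role of the predefined expected result concrete, here is a minimal sketch of a test case whose pass/fail criterion comes from the specification, not from observing the code. The function, test case identifier, and tolerance are hypothetical:

```python
# The expected result is defined before the test is run, from the
# specification -- never from what the code happens to return.
def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32.0) * 5.0 / 9.0

# Hypothetical test case TC-07: per the specification, 212 degF shall
# convert to 100 degC within +/- 0.01.
test_input = 212.0
expected = 100.0       # predefined from the specification
tolerance = 0.01
actual = fahrenheit_to_celsius(test_input)
assert abs(actual - expected) <= tolerance, f"TC-07 failed: got {actual}"
```

Because the expected value and tolerance are written down in advance, a later reviewer can independently confirm the pass/fail status from the recorded input, expected result, and actual result.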

A software testing process should be based on principles that foster effective examinations of a software product. Applicable
software testing tenets include:

The expected test outcome is predefined;
A good test case has a high probability of exposing an error;
A successful test is one that finds an error;
There is independence from coding;
Both application (user) and software (programming) expertise are employed;
Testers use different tools from coders;
Examining only the usual case is insufficient;
Test documentation permits its reuse and an independent confirmation of the pass/fail status of a test outcome during
subsequent review.

Once the prerequisite tasks (e.g., code inspection) have been successfully completed, software testing begins. It starts with unit
level testing and concludes with system level testing. There may be a distinct integration level of testing. A software product
should be challenged with test cases based on its internal structure and with test cases based on its external specification. These
tests should provide a thorough and rigorous examination of the software product’s compliance with its functional,
performance, and interface definitions and requirements.

Code-based testing is also known as structural testing or “white-box” testing. It identifies test cases based on knowledge
obtained from the source code, detailed design specification, and other development documents. These test cases challenge the
control decisions made by the program; and the program’s data structures including configuration tables. Structural testing can
identify “dead” code that is never executed when the program is run. Structural testing is accomplished primarily with unit
(module) level testing, but can be extended to other levels of software testing.

The level of structural testing can be evaluated using metrics that are designed to show what percentage of the software
structure has been evaluated during structural testing. These metrics are typically referred to as “coverage” and are a measure
of completeness with respect to test selection criteria. The amount of structural coverage should be commensurate with the
level of risk posed by the software. Use of the term “coverage” usually means 100% coverage. For example, if a testing program
has achieved “statement coverage,” it means that 100% of the statements in the software have been executed at least once.
Common structural coverage metrics include:

Statement Coverage – This criterion requires sufficient test cases for each program statement to be executed at least once;
however, its achievement is insufficient to provide confidence in a software product’s behavior.
Decision (Branch) Coverage – This criterion requires sufficient test cases for each program decision or branch to be executed
so that each possible outcome occurs at least once. It is considered to be a minimum level of coverage for most software
products, but decision coverage alone is insufficient for high-integrity applications.
Condition Coverage – This criterion requires sufficient test cases for each condition in a program decision to take on all
possible outcomes at least once. It differs from branch coverage only when multiple conditions must be evaluated to reach a
decision.
Multi-Condition Coverage – This criterion requires sufficient test cases to exercise all possible combinations of conditions in
a program decision.
Loop Coverage – This criterion requires sufficient test cases for all program loops to be executed for zero, one, two, and
many iterations covering initialization, typical running and termination (boundary) conditions.
Path Coverage – This criterion requires sufficient test cases for each feasible path, basis path, etc., from start to exit of a
defined program segment, to be executed at least once. Because of the very large number of possible paths through a
software program, path coverage is generally not achievable. The amount of path coverage is normally established based on
the risk or criticality of the software under test.
Data Flow Coverage – This criterion requires sufficient test cases for each feasible data flow to be executed at least once. A
number of data flow testing strategies are available.
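The differences between the first four criteria show up clearly on a single two-condition decision. The following sketch uses a hypothetical alarm function to list test suites satisfying each criterion:

```python
# Hypothetical function with one decision containing two conditions.
def alarm(temp: float, override: bool) -> bool:
    if temp > 100.0 and not override:   # decision with two conditions
        return True
    return False

# Statement coverage: one case taking the True branch and one not taking it
# executes every statement at least once.
statement_suite = [(150.0, False), (50.0, False)]

# Branch coverage: the same two cases also make the decision evaluate to
# both True and False, so here branch coverage needs no extra cases.
branch_suite = statement_suite

# Condition coverage additionally needs each condition to take both values:
# temp > 100 both True and False, and override both True and False.
condition_suite = [(150.0, False), (50.0, True)]

# Multi-condition coverage exercises all four combinations of the two
# conditions.
multi_condition_suite = [(150.0, False), (150.0, True),
                         (50.0, False), (50.0, True)]
```

Note that the condition-coverage suite here never triggers the alarm at all, which illustrates why achieving a coverage metric is a measure of test completeness, not of test adequacy on its own.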

Definition-based or specification-based testing is also known as functional testing or “black-box” testing. It identifies test cases
based on the definition of what the software product (whether it be a unit (module) or a complete program) is intended to do.
These test cases challenge the intended use or functionality of a program, and the program’s internal and external interfaces.
Functional testing can be applied at all levels of software testing, from unit to system level testing.

The following types of functional software testing involve generally increasing levels of effort:

Normal Case – Testing with usual inputs is necessary. However, testing a software product only with expected, valid inputs
does not thoroughly test that software product. By itself, normal case testing cannot provide sufficient confidence in the
dependability of the software product.
Output Forcing – Choosing test inputs to ensure that selected (or all) software outputs are generated by testing.
Robustness – Software testing should demonstrate that a software product behaves correctly when given unexpected,
invalid inputs. Methods for identifying a sufficient set of such test cases include Equivalence Class Partitioning, Boundary
Value Analysis, and Special Case Identification (Error Guessing). While important and necessary, these techniques do not
ensure that all of the most appropriate challenges to a software product have been identified for testing.
Combinations of Inputs – The functional testing methods identified above all emphasize individual or single test inputs.
Most software products operate with multiple inputs under their conditions of use. Thorough software product testing
should consider the combinations of inputs a software unit or system may encounter during operation. Error guessing can
be extended to identify combinations of inputs, but it is an ad hoc technique. Cause-effect graphing is one functional
software testing technique that systematically identifies combinations of inputs to a software product for inclusion in test
cases.
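The robustness and combination techniques above can be sketched as input-selection code. The dose range, input classes, and operating modes below are hypothetical, chosen only to show boundary value analysis, equivalence class representatives, error guessing, and systematic input combination:

```python
import itertools

# Hypothetical field accepting integer doses of 1..10 units.
# Boundary value analysis: values at and just beyond each boundary.
boundary_inputs = [0, 1, 2, 9, 10, 11]

# Equivalence class partitioning: one representative per class
# (far below range, valid, far above range).
equivalence_reps = [-50, 5, 999]

# Error guessing (special case identification): wrong types entirely.
special_cases = [None, "ten"]

# Combinations of inputs: pair each boundary candidate with each operating
# mode, as cause-effect or combinatorial techniques would enumerate
# systematically rather than ad hoc.
modes = ["manual", "auto"]
combined_cases = list(itertools.product(boundary_inputs, modes))
```

Each generated pair still needs a predefined expected result (accept, reject with a specific error, etc.) before it becomes a usable test case.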

Functional and structural software test case identification techniques provide specific inputs for testing, rather than random
test inputs. One weakness of these techniques is the difficulty in linking structural and functional test completion criteria to a
software product’s reliability. Advanced software testing methods, such as statistical testing, can be employed to provide
further assurance that a software product is dependable. Statistical testing uses randomly generated test data from defined
distributions based on an operational profile (e.g., expected use, hazardous use, or malicious use of the software product).
Large amounts of test data are generated and can be targeted to cover particular areas or concerns, providing an increased
possibility of identifying individual and multiple rare operating conditions that were not anticipated by either the software
product’s designers or its testers. Statistical testing also provides high structural coverage. It does require a stable software
product. Thus, structural and functional testing are prerequisites for statistical testing of a software product.
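A minimal sketch of statistical test input generation follows. The operational profile categories and their weights are hypothetical; the point is that inputs are drawn at random from a defined distribution, and a fixed seed keeps the run reproducible for later review:

```python
import random

# Hypothetical operational profile: 90% normal use, 9% stress, 1% malformed.
profile = [("normal", 0.90), ("stress", 0.09), ("malformed", 0.01)]

def draw_inputs(n: int, seed: int = 42):
    """Draw n input categories at random, weighted by the profile."""
    rng = random.Random(seed)   # fixed seed makes the test run reproducible
    categories = [c for c, _ in profile]
    weights = [w for _, w in profile]
    return rng.choices(categories, weights=weights, k=n)

sample = draw_inputs(10_000)
```

In practice each drawn category would then be expanded into a concrete input (e.g., a generated record or message), and the large sample size is what gives the rare categories a realistic chance of appearing.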

Another aspect of software testing is the testing of software changes. Changes occur frequently during software development.
These changes are the result of 1) debugging that finds an error and it is corrected, 2) new or changed requirements
(“requirements creep”), and 3) modified designs as more effective or efficient implementations are found. Once a software
product has been baselined (approved), any change to that product should have its own “mini life cycle,” including testing.
Testing of a changed software product requires additional effort. Not only should it demonstrate that the change was
implemented correctly, testing should also demonstrate that the change did not adversely impact other parts of the software
product. Regression analysis and testing are employed to provide assurance that a change has not created problems elsewhere
in the software product. Regression analysis is the determination of the impact of a change based on review of the relevant
documentation (e.g., software requirements specification, software design specification, source code, test plans, test cases, test
scripts, etc.) in order to identify the necessary regression tests to be run. Regression testing is the rerunning of test cases that a
program has previously executed correctly and comparing the current result to the previous result in order to detect
unintended effects of a software change. Regression analysis and regression testing should also be employed when using
integration methods to build a software product to ensure that newly integrated modules do not adversely impact the
operation of previously integrated modules.
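The mechanics of regression testing can be sketched as rerunning recorded cases against the changed software and reporting any result that drifted from the baseline. The pricing function, the change it underwent, and the baseline values are hypothetical:

```python
# Hypothetical changed function: v2 moved the discount threshold from
# 100 units down to 50 units.
def unit_price(quantity: int) -> float:
    return 9.00 if quantity >= 50 else 10.00

# Baseline: (input, result) pairs recorded when v1 last passed testing.
baseline = [(1, 10.00), (49, 10.00), (60, 10.00), (100, 9.00)]

def regression_failures(fn, baseline):
    """Return cases whose current result differs from the recorded one."""
    return [(q, expected, fn(q)) for q, expected in baseline
            if abs(fn(q) - expected) > 1e-9]

# Flags quantity=60, whose behavior changed. Reviewers then decide whether
# each flagged case reflects the intended change (and the baseline is
# updated) or an unintended effect (and the change is corrected).
failures = regression_failures(unit_price, baseline)
```

Regression analysis is what selects which baseline cases are worth rerunning; the rerun-and-compare loop above is the regression testing step itself.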

In order to provide a thorough and rigorous examination of a software product, development testing is typically organized
into levels. As an example, a software product’s testing can be organized into unit, integration, and system levels of testing.

1. Unit (module or component) level testing focuses on the early examination of sub-program functionality and ensures that
functionality not visible at the system level is examined by testing. Unit testing ensures that quality software units are
furnished for integration into the finished software product.
2. Integration level testing focuses on the transfer of data and control across a program’s internal and external interfaces.
External interfaces are those with other software (including operating system software), system hardware, and the users
and can be described as communications links.
3. System level testing demonstrates that all specified functionality exists and that the software product is trustworthy. This
testing verifies the as-built program’s functionality and performance with respect to the requirements for the software
product as exhibited on the specified operating platform(s). System level software testing addresses functional concerns and
the following elements of a device’s software that are related to the intended use(s):

Performance issues (e.g., response times, reliability measurements);
Responses to stress conditions, e.g., behavior under maximum load, continuous use;
Operation of internal and external security features;
Effectiveness of recovery procedures, including disaster recovery;
Usability;
Compatibility with other software products;
Behavior in each of the defined hardware configurations; and
Accuracy of documentation.

Control measures (e.g., a traceability analysis) should be used to ensure that the intended coverage is achieved.

System level testing also exhibits the software product’s behavior in the intended operating environment. The location of such
testing is dependent upon the software developer’s ability to produce the target operating environment(s). Depending upon
the circumstances, simulation and/or testing at (potential) customer locations may be utilized. Test plans should identify the
controls needed to ensure that the intended coverage is achieved and that proper documentation is prepared when planned
system level testing is conducted at sites not directly controlled by the software developer. Also, for a software product that is a
medical device or a component of a medical device that is to be used on humans prior to FDA clearance, testing involving
human subjects may require an Investigational Device Exemption (IDE) or Institutional Review Board (IRB) approval.

Test procedures, test data, and test results should be documented in a manner permitting objective pass/fail decisions to be
reached. They should also be suitable for review and objective decision making subsequent to running the test, and they
should be suitable for use in any subsequent regression testing. Errors detected during testing should be logged, classified,
reviewed, and resolved prior to release of the software. Software error data that is collected and analyzed during a
development life cycle may be used to determine the suitability of the software product for release for commercial distribution.
Test reports should comply with the requirements of the corresponding test plans.

Software products that perform useful functions in medical devices or their production are often complex. Software testing
tools are frequently used to ensure consistency, thoroughness, and efficiency in the testing of such software products and to
fulfill the requirements of the planned testing activities. These tools may include supporting software built in-house to facilitate
unit (module) testing and subsequent integration testing (e.g., drivers and stubs) as well as commercial software testing tools.
Such tools should have a degree of quality no less than the software product they are used to develop. Appropriate
documentation providing evidence of the validation of these software tools for their intended use should be maintained (see
section 6 (http://www.fda.gov/RegulatoryInformation/Guidances/ucm085281.htm#_Toc517237965) of this guidance).

Typical Tasks – Testing by the Software Developer

Test Planning
Structural Test Case Identification
Functional Test Case Identification
Traceability Analysis – Testing
– Unit (Module) Tests to Detailed Design
– Integration Tests to High Level Design
– System Tests to Software Requirements
Unit (Module) Test Execution
Integration Test Execution
Functional Test Execution
System Test Execution
Acceptance Test Execution
Test Results Evaluation
Error Evaluation/Resolution
Final Test Report

5.2.6. User Site Testing

Testing at the user site is an essential part of software validation. The Quality System regulation requires installation and
inspection procedures (including testing where appropriate) as well as documentation of inspection and testing to demonstrate
proper installation. (See 21 CFR §820.170.) Likewise, manufacturing equipment must meet specified requirements, and
automated systems must be validated for their intended use. (See 21 CFR §820.70(g) and 21 CFR §820.70(i) respectively.)

Terminology regarding user site testing can be confusing. Terms such as beta test, site validation, user acceptance test,
installation verification, and installation testing have all been used to describe user site testing. For purposes of this guidance,
the term “user site testing” encompasses all of these and any other testing that takes place outside of the developer’s controlled
environment. This testing should take place at a user’s site with the actual hardware and software that will be part of the
installed system configuration. The testing is accomplished through either actual or simulated use of the software being tested
within the context in which it is intended to function.

Guidance contained here is general in nature and is applicable to any user site testing. However, in some areas (e.g., blood
establishment systems) there may be specific site validation issues that need to be considered in the planning of user site
testing. Test planners should check with the FDA Center(s) with the corresponding product jurisdiction to determine whether
there are any additional regulatory requirements for user site testing.

User site testing should follow a pre-defined written plan with a formal summary of testing and a record of formal acceptance.
Documented evidence of all testing procedures, test input data, and test results should be retained.

There should be evidence that hardware and software are installed and configured as specified. Measures should ensure that
all system components are exercised during the testing and that the versions of these components are those specified. The
testing plan should specify testing throughout the full range of operating conditions and should specify continuation for a
sufficient time to allow the system to encounter a wide spectrum of conditions and events in an effort to detect any latent faults
that are not apparent during more normal activities.

Some of the evaluations that have been performed earlier by the software developer at the developer’s site should be repeated
at the site of actual use. These may include tests for a high volume of data, heavy loads or stresses, security, fault testing
(avoidance, detection, tolerance, and recovery), error messages, and implementation of safety requirements. The developer
may be able to furnish the user with some of the test data sets to be used for this purpose.

In addition to an evaluation of the system’s ability to properly perform its intended functions, there should be an evaluation of
the ability of the users of the system to understand and correctly interface with it. Operators should be able to perform the
intended functions and respond in an appropriate and timely manner to all alarms, warnings, and error messages.

During user site testing, records should be maintained of both proper system performance and any system failures that are
encountered. The revision of the system to compensate for faults detected during this user site testing should follow the same
procedures and controls as for any other software change.

The developers of the software may or may not be involved in the user site testing. If the developers are involved, they may
seamlessly carry over to the user’s site the last portions of design-level systems testing. If the developers are not involved, it is
all the more important that the user have persons who understand the importance of careful test planning, the definition of
expected test results, and the recording of all test outputs.

Typical Tasks – User Site Testing

Acceptance Test Execution
Test Results Evaluation
Error Evaluation/Resolution
Final Test Report

5.2.7. Maintenance and Software Changes

As applied to software, the term maintenance does not mean the same as when applied to hardware. The operational
maintenance of hardware and software are different because their failure/error mechanisms are different. Hardware
maintenance typically includes preventive hardware maintenance actions, component replacement, and corrective changes.
Software maintenance includes corrective, perfective, and adaptive maintenance but does not include preventive maintenance
actions or software component replacement.

Changes made to correct errors and faults in the software are corrective maintenance. Changes made to the software to
improve the performance, maintainability, or other attributes of the software system are perfective maintenance. Software
changes to make the software system usable in a changed environment are adaptive maintenance.

When changes are made to a software system, either during initial development or during post release maintenance, sufficient
regression analysis and testing should be conducted to demonstrate that portions of the software not involved in the change
were not adversely impacted. This is in addition to testing that evaluates the correctness of the implemented change(s).

The specific validation effort necessary for each software change is determined by the type of change, the development
products affected, and the impact of those products on the operation of the software. Careful and complete documentation of
the design structure and interrelationships of various modules, interfaces, etc., can limit the validation effort needed when a
change is made. The level of effort needed to fully validate a change is also dependent upon the degree to which validation of
the original software was documented and archived. For example, test documentation, test cases, and results of previous
verification and validation testing need to be archived if they are to be available for performing subsequent regression testing.
Failure to archive this information for later use can significantly increase the level of effort and expense of revalidating the
software after a change is made.

In addition to software verification and validation tasks that are part of the standard software development process, the
following additional maintenance tasks should be addressed:

Software Validation Plan Revision – For software that was previously validated, the existing software validation plan
should be revised to support the validation of the revised software. If no previous software validation plan exists, such a
plan should be established to support the validation of the revised software.
Anomaly Evaluation – Software organizations frequently maintain documentation, such as software problem reports that
describe software anomalies discovered and the specific corrective action taken to fix each anomaly. Too often, however,
mistakes are repeated because software developers do not take the next step to determine the root causes of problems and
make the process and procedural changes needed to avoid recurrence of the problem. Software anomalies should be
evaluated in terms of their severity and their effects on system operation and safety, but they should also be treated as
symptoms of process deficiencies in the quality system. A root cause analysis of anomalies can identify specific quality
system deficiencies. Where trends are identified (e.g., recurrence of similar software anomalies), appropriate corrective and
preventive actions must be implemented and documented to avoid further recurrence of similar quality problems. (See 21
CFR 820.100.)
Problem Identification and Resolution Tracking – All problems discovered during maintenance of the software should be
documented. The resolution of each problem should be tracked to ensure it is fixed, for historical reference, and for
trending.
Proposed Change Assessment – All proposed modifications, enhancements, or additions should be assessed to determine
the effect each change would have on the system. This information should determine the extent to which verification and/or
validation tasks need to be iterated.

Task Iteration – For approved software changes, all necessary verification and validation tasks should be performed to
ensure that planned changes are implemented correctly, all documentation is complete and up to date, and no unacceptable
changes have occurred in software performance.
Documentation Updating – Documentation should be carefully reviewed to determine which documents have been
impacted by a change. All approved documents (e.g., specifications, test procedures, user manuals, etc.) that have been
affected should be updated in accordance with configuration management procedures. Specifications should be updated
before any maintenance and software changes are made.

SECTION 6. VALIDATION OF AUTOMATED PROCESS EQUIPMENT AND QUALITY SYSTEM SOFTWARE

The Quality System regulation requires that “when computers or automated data processing systems are used as part of
production or the quality system, the [device] manufacturer shall validate computer software for its intended use according to
an established protocol.” (See 21 CFR §820.70(i)). This has been a regulatory requirement of FDA’s medical device Good
Manufacturing Practice (GMP) regulations since 1978.

In addition to the above validation requirement, computer systems that implement part of a device manufacturer’s production
processes or quality system (or that are used to create and maintain records required by any other FDA regulation) are subject
to the Electronic Records; Electronic Signatures regulation. (See 21 CFR Part 11.) This regulation establishes additional security,
data integrity, and validation requirements when records are created or maintained electronically. These additional Part 11
requirements should be carefully considered and included in system requirements and software requirements for any
automated record keeping systems. System validation and software validation should demonstrate that all Part 11
requirements have been met.

Computers and automated equipment are used extensively throughout all aspects of medical device design, laboratory testing
and analysis, product inspection and acceptance, production and process control, environmental controls, packaging, labeling,
traceability, document control, complaint management, and many other aspects of the quality system. Increasingly, automated
plant floor operations can involve extensive use of embedded systems in:

programmable logic controllers;
digital function controllers;
statistical process control;
supervisory control and data acquisition;
robotics;
human-machine interfaces;
input/output devices; and
computer operating systems.

Software tools are frequently used to design, build, and test the software that goes into an automated medical device. Many
other commercial software applications, such as word processors, spreadsheets, databases, and flowcharting software are used
to implement the quality system. All of these applications are subject to the requirement for software validation, but the
validation approach used for each application can vary widely.

Whether production or quality system software is developed in-house by the device manufacturer, developed by a contractor,
or purchased off-the-shelf, it should be developed using the basic principles outlined elsewhere in this guidance. The device
manufacturer has latitude and flexibility in defining how validation of that software will be accomplished, but validation
should be a key consideration in deciding how and by whom the software will be developed or from whom it will be
purchased. The software developer defines a life cycle model. Validation is typically supported by:

verifications of the outputs from each stage of that software development life cycle; and
checking for proper operation of the finished software in the device manufacturer’s intended use environment.

6.1. HOW MUCH VALIDATION EVIDENCE IS NEEDED?

The level of validation effort should be commensurate with the risk posed by the automated operation. In addition to risk, other
factors, such as the complexity of the process software and the degree to which the device manufacturer is dependent upon
that automated process to produce a safe and effective device, determine the nature and extent of testing needed as part of the
validation effort. Documented requirements and risk analysis of the automated process help to define the scope of the evidence
needed to show that the software is validated for its intended use. For example, an automated milling machine may require
very little testing if the device manufacturer can show that the output of the operation is subsequently fully verified against the
specification before release. On the other hand, extensive testing may be needed for:

a plant-wide electronic record and electronic signature system;
an automated controller for a sterilization cycle; or
automated test equipment used for inspection and acceptance of finished circuit boards in a life-sustaining / life-supporting
device.

Numerous commercial software applications may be used as part of the quality system (e.g., a spreadsheet or statistical
package used for quality system calculations, a graphics package used for trend analysis, or a commercial database used for
recording device history records or for complaint management). The extent of validation evidence needed for such software
depends on the device manufacturer’s documented intended use of that software. For example, a device manufacturer who
chooses not to use all the vendor-supplied capabilities of the software only needs to validate those functions that will be used
and for which the device manufacturer is dependent upon the software results as part of production or the quality system.
However, high risk applications should not be running in the same operating environment with non-validated software
functions, even if those software functions are not used. Risk mitigation techniques such as memory partitioning or other
approaches to resource protection may need to be considered when high risk applications and lower risk applications are to be
used in the same operating environment. When software is upgraded or any changes are made to the software, the device
manufacturer should consider how those changes may impact the “used portions” of the software and must reconfirm the
validation of those portions of the software that are used. (See 21 CFR §820.70(i).)
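The point above about validating only the used functions of commercial software can be sketched in code. The example below is a hedged illustration, not a prescribed method: suppose a quality system depends only on the mean and sample standard deviation functions of a statistics package. Those used functions are challenged against independently hand-calculated reference values; the data set, tolerances, and function names are illustrative assumptions.

```python
import statistics

# Hypothetical intended-use validation of a statistics package: only
# the functions actually relied upon in the quality system (here,
# mean and sample standard deviation) are challenged against
# independently calculated reference values.
def validate_used_functions():
    # Reference data set with hand-calculated expected results.
    data = [9.8, 10.1, 10.0, 9.9, 10.2]
    expected_mean = 10.0
    expected_stdev = 0.158113883  # sqrt(0.10 / 4), computed by hand

    return {
        "mean": abs(statistics.mean(data) - expected_mean) < 1e-9,
        "stdev": abs(statistics.stdev(data) - expected_stdev) < 1e-6,
    }
```

A real validation would document such checks as test cases with acceptance criteria, and repeat them whenever the software is upgraded, per the reconfirmation requirement above.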

6.2. DEFINED USER REQUIREMENTS

A very important key to software validation is a documented user requirements specification that defines:

the “intended use” of the software or automated equipment; and
the extent to which the device manufacturer is dependent upon that software or equipment for production of a quality
medical device.

The device manufacturer (user) needs to define the expected operating environment including any required hardware and
software configurations, software versions, utilities, etc. The user also needs to:

document requirements for system performance, quality, error handling, startup, shutdown, security, etc.;
identify any safety related functions or features, such as sensors, alarms, interlocks, logical processing steps, or command
sequences; and
define objective criteria for determining acceptable performance.

The validation must be conducted in accordance with a documented protocol, and the validation results must also be
documented. (See 21 CFR §820.70(i).) Test cases should be documented that will exercise the system to challenge its
performance against the pre-determined criteria, especially for its most critical parameters. Test cases should address error and
alarm conditions, startup, shutdown, all applicable user functions and operator controls, potential operator errors, maximum
and minimum ranges of allowed values, and stress conditions applicable to the intended use of the equipment. The test cases
should be executed and the results should be recorded and evaluated to determine whether the results support a conclusion
that the software is validated for its intended use.
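The test-case guidance above can be illustrated with a minimal sketch. The controller interface (`set_temperature` and its setpoint limits) is an invented example, not a real product; the point is that documented cases exercise nominal operation, the boundary values of the allowed range, and error conditions against pre-determined criteria.

```python
# Hypothetical test cases for an automated process controller,
# following the guidance above: challenge the system against
# pre-determined acceptance criteria, including boundary values
# and error conditions. The interface and limits are assumptions.
TEMP_MIN, TEMP_MAX = 50.0, 135.0  # assumed allowed setpoint range, deg C

def set_temperature(value):
    """Accept a setpoint within the allowed range; reject otherwise."""
    if not (TEMP_MIN <= value <= TEMP_MAX):
        raise ValueError(f"setpoint {value} outside {TEMP_MIN}-{TEMP_MAX}")
    return value

def run_test_cases():
    results = {}
    # Nominal operation plus minimum and maximum allowed values.
    for case_id, value in [("nominal", 121.0), ("min", TEMP_MIN), ("max", TEMP_MAX)]:
        results[case_id] = set_temperature(value) == value
    # Error conditions: out-of-range setpoints must be rejected.
    for case_id, value in [("below_min", TEMP_MIN - 0.1), ("above_max", TEMP_MAX + 0.1)]:
        try:
            set_temperature(value)
            results[case_id] = False  # failure: bad input was accepted
        except ValueError:
            results[case_id] = True
    return results
```

In practice each case would be a documented protocol entry with recorded results, evaluated against the acceptance criteria before concluding the software is validated for its intended use.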

A device manufacturer may conduct a validation using their own personnel or may depend on a third party such as the
equipment/software vendor or a consultant. In any case, the device manufacturer retains the ultimate responsibility for
ensuring that the production and quality system software:

is validated according to a written procedure for the particular intended use; and
will perform as intended in the chosen application.

The device manufacturer should have documentation including:

defined user requirements;
validation protocol used;
acceptance criteria;
test cases and results; and
a validation summary

that objectively confirms that the software is validated for its intended use.

6.3. VALIDATION OF OFF-THE-SHELF SOFTWARE AND AUTOMATED EQUIPMENT

Most of the automated equipment and systems used by device manufacturers are supplied by third-party vendors and are
purchased off-the-shelf (OTS). The device manufacturer is responsible for ensuring that the product development
methodologies used by the OTS software developer are appropriate and sufficient for the device manufacturer’s intended use
of that OTS software. For OTS software and equipment, the device manufacturer may or may not have access to the vendor’s
software validation documentation. If the vendor can provide information about their system requirements, software
requirements, validation process, and the results of their validation, the medical device manufacturer can use that information
as a beginning point for their required validation documentation. The vendor’s life cycle documentation, such as testing
protocols and results, source code, design specification, and requirements specification, can be useful in establishing that the
software has been validated. However, such documentation is frequently not available from commercial equipment vendors, or
the vendor may refuse to share their proprietary information.

Where possible and depending upon the device risk involved, the device manufacturer should consider auditing the vendor’s
design and development methodologies used in the construction of the OTS software and should assess the development and
validation documentation generated for the OTS software. Such audits can be conducted by the device manufacturer or by a
qualified third party. The audit should demonstrate that the vendor’s procedures for and results of the verification and
validation activities performed on the OTS software are appropriate and sufficient for the safety and effectiveness requirements of
the medical device to be produced using that software.

Some vendors who are not accustomed to operating in a regulated environment may not have a documented life cycle process
that can support the device manufacturer’s validation requirement. Other vendors may not permit an audit. Where necessary
validation information is not available from the vendor, the device manufacturer will need to perform sufficient system level
“black box” testing to establish that the software meets their “user needs and intended uses.” For many applications black box
testing alone is not sufficient. Depending upon the risk of the device produced, the role of the OTS software in the process, the
ability to audit the vendor, and the sufficiency of vendor-supplied information, the use of OTS software or equipment may or
may not be appropriate, especially if there are suitable alternatives available. The device manufacturer should also consider the
implications (if any) for continued maintenance and support of the OTS software should the vendor terminate their support.

For some off-the-shelf software development tools, such as software compilers, linkers, editors, and operating systems,
exhaustive black-box testing by the device manufacturer may be impractical. Without such testing – a key element of the
validation effort – it may not be possible to validate these software tools. However, their proper operation may be satisfactorily
inferred by other means. For example, compilers are frequently certified by independent third-party testing, and commercial
software products may have “bug lists”, system requirements and other operational information available from the vendor that
can be compared to the device manufacturer’s intended use to help focus the “black-box” testing effort. Off-the-shelf operating
systems need not be validated as a separate program. However, system-level validation testing of the application software
should address all the operating system services used, including maximum loading conditions, file operations, handling of
system error conditions, and memory constraints that may be applicable to the intended use of the application program.
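The system-level testing of operating system services described above can be sketched as follows. This is a hedged, minimal illustration of "black box" exercise of the file operations and error handling an application actually uses; the file name and contents are invented, and a real effort would also cover loading conditions and memory constraints applicable to the intended use.

```python
import os
import tempfile

# Sketch of system-level "black box" testing of operating system
# services used by an application: file creation, readback, and a
# missing-file error condition. Illustrative only, not an exhaustive
# operating system qualification.
def exercise_file_services():
    results = {}
    with tempfile.TemporaryDirectory() as workdir:
        record = os.path.join(workdir, "batch_record.txt")
        # File operation under intended use: write a record, read it back.
        with open(record, "w") as f:
            f.write("lot=HYPOTHETICAL-001\n")
        with open(record) as f:
            results["readback"] = f.read() == "lot=HYPOTHETICAL-001\n"
        # System error condition: a missing file must raise an error,
        # not silently return bad data.
        try:
            open(os.path.join(workdir, "missing.txt"))
            results["missing_file_error"] = False
        except FileNotFoundError:
            results["missing_file_error"] = True
    return results
```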
