Chapter 20
Siegfried Schmitt
CONTENTS
Introduction
Vendor Auditing Using GAMP 4 Guidance and Templates
   A. Company Overview
   B. Organization and Quality Management
   C. Planning and Product, Project Management
   D. Specifications
   E. Implementation
   F. Testing
   G. Completion and Release
   H. Support and Maintenance
   I. Supporting Procedures and Activities
Training in GAMP 4
Who Are You Talking To?
References
INTRODUCTION
The following case studies have been written from an information management department perspective. Most articles and presentations on practical applications of GAMP 4 cover computerized systems in either the manufacturing or the laboratory area, from the perspective of the user, the vendor, or the validation consultant. It is rare to come across the information technology (IT) or information systems (IS) perspective. Information management (IM), which encompasses IT and IS, plays a dual role: that of the user and that of the provider.
VENDOR AUDITING USING GAMP 4 GUIDANCE AND TEMPLATES
Vendor audits are carried out for a variety of reasons and by a diverse range of departments, on systems that often include automated features. Traditionally, engineering would conduct most audits as part of the vendor evaluation phase, or as part of factory acceptance testing (FAT) during commissioning. Other audits would be conducted by the QA function to review and assess a vendor's quality system. In most cases, these audits would not comprehensively address compliance issues for the computerized system.
It is within the IM QA function that staff have many years of in-depth experience in performing and assisting in vendor audits for compliance with GXPs and the widely accepted principles of GAMP. For applications that are critical from either a business or a regulatory perspective, it is highly advisable to include such an expert in the auditing team. GAMP 4 has become a popular source of information on how to conduct a vendor audit (Appendix M2), and its vendor assessment form templates are a suitable starting point for the development of company-specific forms. Immediately the question arises: should these forms be used as is, or should they be modified, and if so, to what extent? In our experience it is always necessary to modify the original GAMP 4 template in order to capture the specifics of an audit. However, one thing we never change is the original numbering system. We amend or add to the existing entries and extend the existing lists, using consecutive numbers.
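The practice just described, keeping GAMP's original numbering intact and continuing it consecutively for company-specific additions, can be sketched as a small data-handling exercise. The item texts and section letter below are illustrative placeholders, not entries from GAMP 4 itself:

```python
# Sketch: extend a GAMP-style checklist section without renumbering
# the original entries. Item texts are hypothetical examples.

def extend_section(section, original, additions):
    """Append company-specific items after the original GAMP items,
    continuing the numbering consecutively (e.g. A.3, A.4, ...)."""
    numbered = {f"{section}.{i}": text for i, text in enumerate(original, start=1)}
    next_no = len(original) + 1
    for i, text in enumerate(additions, start=next_no):
        numbered[f"{section}.{i}"] = text
    return numbered

checklist = extend_section(
    "A",
    original=["Company background", "Financial status"],    # GAMP entries: kept as-is
    additions=["Experience with 21 CFR Part 11 projects"],  # company-specific
)
# The original numbering survives; the new item continues the sequence.
assert list(checklist) == ["A.1", "A.2", "A.3"]
```

The point of the sketch is the invariant: existing identifiers never change, so cross-references in earlier audit reports remain valid.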
Specialists may be aware of it, but many others are not: GAMP 4 does not address 21 CFR Part 11, the U.S. FDA's rule on electronic records and electronic signatures. This chapter is not about 21 CFR Part 11; it is sufficient to note here that it is good practice to add a company-specific detailed assessment checklist to the audit questionnaire. In the absence of a specifically developed version, it is possible to insert the rule itself, without any interpretation.
The following is an example of an extended audit questionnaire, with additional entries taken from a wide variety of questionnaires the author has come across over the years. Again, it must be stressed that for each audit this list is revised, amended, or extended as necessary. The checklist should be used when auditing suppliers of automated systems. It is intended as a guide only, and there may be other factors that require consideration when auditing suppliers. All headers are as in GAMP 4, and any original text entries from GAMP 4 are in italics, for easy reference.
A. Company Overview
Request references that can be contacted post-audit. Have they been cited for good work? Is the company in any litigation? Does the company hold a recognized quality certification (check carefully to which parts of the company it applies)?
What is the company's experience in the pharmaceutical area (ask for white papers, article reprints, presentation slides, etc.)? Is the pharmaceutical and healthcare related business profitable?
A.4 Summary of product/service under audit
Request product literature. How important is this product to the company?
A.5 Product or service development plans
What is the long term pharmaceutical and healthcare related business plan?
A.6 Tour of facility
Verify housekeeping, the general working environment, and working conditions. It is worth noting how tidy and well organized the computer room is, to get an idea of its actual usage.
• Change control
• Problem resolution
• Back-up
• Disaster recovery
• Customer installation
• Complaint handling, bug fixing
• Quality assurance
• Internal training
• SOP management
D. Specifications
D.1 User Requirement Specifications
design of the system, and how are customer-specific user requirements incorporated into this list?
D.2 Functional Specifications
Are there any narrative descriptions of the system functions?
D.3 Software Design Specifications
Do design considerations cover reliability, maintainability, security, and
safety?
Do they cover standardization or interchangeability?
Do they account for 21 CFR Part 11 requirements?
D.4 Hardware Design Specifications
Do design considerations cover reliability, maintainability, security, and
safety?
Do they cover standardization or interchangeability?
D.5 Relationship between specifications
Together forming a complete specification of the system, which can be tested
objectively.
Were there any minimum performance targets documented in the system
design?
D.6 Traceability through specifications
For example, for a given requirement, can these be traced (preferably using a
matrix) and are these indexed in a controlled manner?
Are design changes proposed, approved, implemented, and properly
controlled (SOP, master index for work requests/punch list)?
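The traceability question in D.6 can be made concrete with a small sketch of such a matrix: each requirement is mapped to the specification items and tests that cover it, so untraced requirements stand out as audit findings. All identifiers below are hypothetical examples, not taken from any real system:

```python
# Sketch of a requirements traceability matrix. Each user requirement
# is linked to the design/specification items and the tests covering it.
# All identifiers are hypothetical.

trace_matrix = {
    "URS-001": {"design": ["FS-1.2", "SDS-3.1"], "tests": ["OQ-7"]},
    "URS-002": {"design": ["FS-1.3"],            "tests": []},       # gap!
}

def untraced(matrix):
    """Return requirements with no covering test -- candidate audit findings."""
    return [req for req, links in matrix.items() if not links["tests"]]

print(untraced(trace_matrix))  # -> ['URS-002']
```

Even this trivial representation shows why an indexed, controlled matrix is preferable to narrative cross-references: gaps become mechanically detectable.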
D.7 Status of specifications
Reviews (documented, minuted, are users invited to attend design review
meetings?), approvals, distribution, maintenance, and update.
D.8 Accuracy of, and conformance to, relevant process definitions/procedures
D.9 Use and control of design methodologies
CASE tools.
E. Implementation
E.4 Use of build files to compile and link individual software configuration items
into a formal release of the software product
Guidelines or standards for hardware assembly
What third party hardware or software is used?
Are software and hardware supplied by reputable firms?
How would changes to third party products affect the user’s end product?
E.5 Use and version logging of development tools
For example, compilers, linkers, debuggers used for each software
configuration item and the formal build and release of the software product.
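The intent behind E.5 is that a formal release can be reproduced and audited later, which requires recording the exact tool versions used for the build. A minimal sketch of such a build record follows; here the only "tool" captured is the Python interpreter itself, whereas a real record would also list compiler, linker, and library versions (the product name and item names are invented for illustration):

```python
# Sketch: a build record capturing configuration items and tool
# versions for a formal release, so the build is reproducible
# and auditable. Names are hypothetical.
import json
import platform

def build_record(product, release, items):
    return {
        "product": product,
        "release": release,
        "configuration_items": sorted(items),  # exactly what went into the build
        "tools": {"python": platform.python_version()},
    }

record = build_record("demo-app", "1.0.0", ["io.py", "core.py"])
print(json.dumps(record, indent=2))
```

Comparing two such records quickly reveals whether a rebuild used the same inputs and tools as the released version.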
E.6 Evidence of source code reviews prior to formal testing
Checking design, adherence to coding standards, logic, redundant code,
identification of critical algorithms.
Is there documented evidence that code walk-throughs have been done on
each module (including information on the outcome)?
E.7 Independence and qualifications of reviewers
E.8 Source code reviews recorded, indexed, followed-up, and closed off (with
supporting evidence)
Evidence of timely management action for reviews which remain open or
where reviews were not closed off satisfactorily.
E.9 Listings and other documents used during source code reviews identified,
controlled, and retained with review reports
F. Testing
TRAINING IN GAMP 4
Adopting the principles of GAMP 4 makes good business sense. It does, however, require the company to train staff in the correct use and interpretation of the book. It is good practice to provide this training to everyone involved in projects concerning automated systems. We found that project managers are particularly interested in the topic, and we now run special training sessions for this group. When putting together the training material, it was convenient to use the review of GAMP's history in Pharmaceutical Engineering's November/December 2001 issue. The redesign of the GAMP website is an incentive to regularly review the links in the presentation for ongoing correctness.
Especially for those operating outside the U.S., it is recommended to provide background information on other interest groups, such as PDA, IEEE, NAMUR, APV, DIN, and ISO.
Participation in the special interest groups (SIGs) is not only encouraged; it also provides a good opportunity to link GAMP to current issues within the company, such as infrastructure validation and electronic data archival, to name just a few. The audience will not only be able to relate better to GAMP; the presented material will also leave a more lasting impression.
Were the electronic version of GAMP 4 more user-friendly, it would be possible to make good use of it when preparing the presentations. As it is, this is not supported, and much of the information has to be retyped.
Project managers deal with systems that have to comply with all areas of GXP. GAMP 4 is strongly biased towards good manufacturing practice (GMP), which has to be stressed and which the presenters must be aware of. Where appropriate, the presentation can be extended by including specific guidance on medical devices, clinical trial systems, etc. Because of this basis in GMP, some of the terminology is unclear or unfamiliar to parts of the audience. It is helpful to pull together a glossary of terms as used in the company, to explain them and to point out where they fit into the lifecycle model. As an example, user acceptance testing is often synonymous with performance qualification.
Most companies base their methodology on the V-model as promoted by GAMP 4. It is not helpful to have seemingly contradictory presentations of this model in the guide. It is suggested that a firm's methodology be explained using the V-model as described in the quality management guide, rather than simply copying the GAMP 4 presentation.
A software classification guide is helpful and generally useful, but it can become an issue when applied rigidly. Most systems are a mix of various classes of software, and it requires expert knowledge to decide on the appropriate level of validation for each type of software encountered.
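The "mix of classes" point can be illustrated with a sketch based on GAMP 4's five software categories (operating systems, firmware, standard packages, configurable packages, and bespoke software). The rule of thumb encoded here, that the overall validation approach is driven by the highest-category component, is a simplification; as stated above, the real decision needs expert judgement:

```python
# Sketch: GAMP 4 software categories, where a higher category number
# broadly implies more validation effort. A real classification decision
# still requires expert judgement; this only captures the "mixed system"
# point that the highest-category component drives the overall approach.

CATEGORIES = {
    1: "operating system",
    2: "firmware",
    3: "standard software package",
    4: "configurable software package",
    5: "custom (bespoke) software",
}

def driving_category(components):
    """components: mapping of component name -> GAMP category number."""
    return max(components.values())

system = {"OS": 1, "LIMS package": 4, "custom interface": 5}
print(CATEGORIES[driving_category(system)])  # -> custom (bespoke) software
```

Even a system built mostly from standard packages is pulled toward full lifecycle validation by a single bespoke component, which is exactly why rigid application of the classification guide misleads.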
Lastly, it is beneficial to outline which parts of GAMP 4 have been incorporated
into the company’s quality system and how they are being interpreted. For example,
GAMP 4 is not designed for legacy systems, and it is up to the company’s experts
to establish how these systems need to be treated, making best use of the elements
in GAMP 4. This will not be limited to the GAMP recommendations, as the healthcare industry also has to adopt the ever-changing interpretations of computer systems validation given by the regulators, such as the U.S. FDA.
WHO ARE YOU TALKING TO?
Whereas former versions of GAMP may have overstressed, by devoting two separate chapters to it, which validation activities were the user's responsibility and which the vendor's, GAMP 4 still offers this distinction. For many simple systems this may be the correct approach, which can, and indeed should, be followed by both parties, user and vendor. It is no longer appropriate for complex systems or projects, however. As mentioned in the introduction, IM acts in both capacities, often dealing with one internal customer and several external vendors and consultants. It is not unusual to find ratios of external to internal staff in IM departments ranging from 1:1 up to 5:1. To complicate matters, the contracts with the external parties are generally drawn up and managed by the user, not by IM.
As a consequence, vendors may not be contractually obliged to adopt or follow GAMP 4 guidance. Their documentation may not have to be aligned with established IM formats, and their procedures may not satisfy the requirements for fully validated systems, as defined by IM's QA. Thus, even if a company's IM and user function have established a validation framework based on the lifecycle model and following GAMP 4 recommendations, validation success may be jeopardized by inadequate or inappropriate third-party contracts and working practices.
Figure 20.1 helps to illustrate the situation.
[Figure 20.1: diagram of the relationship model, showing Information Management and the other parties, each delivering to one another.]
When this relationship model is properly understood by all parties involved, the
first step towards a successful implementation and associated validation activity is
made.
In summary, GAMP 4 is indeed a most welcome and versatile guide, which is likely to become much more of a global standard over the coming years. It is, however, necessary to understand its limitations, and it is imperative to interpret and adapt its contents and spirit to reap optimum benefit for each company's specific circumstances.
REFERENCES
GAMP 4, ISPE, www.ispe.org.
U.S. Food and Drug Administration, Center for Devices and Radiological Health, Quality Systems Audits, http://www.fda.gov/cdrh/qsr/17audit.html.
Wingate, G., Audit Preparation for Suppliers, www.suehorwoodpubltd.com.
IEEE Std 1028-1997, IEEE Standard for Software Reviews, www.ieee.org.