K10. Evaluation of Surveillance Systems
Preben Aavitsland
K.10
Epidemiology
Training
Surveillance
Event → [reporting] → Data → [analysis, interpretation] → Information → [feedback, recommendations] → Action
Importance of evaluation
• Quality
° Often neglected
° Basis for improvements
• Obligation
° Does the system deliver?
° Credibility of public health service
• Learning process
° EPIET training objective
° "Do not create one until you have evaluated one"
General framework
• A. Engagement of stakeholders
• B. Evaluation objective
• C. System description
• D. System performance
• E. Conclusions and recommendations
• F. Communication
A. Engagement of stakeholders
Stakeholders
• The "owners" and the "customers"
• Users of surveillance system information
° Public health workers
° Government
° Data providers
° Clinicians
° etc.
• Steering group?
• A condition for change
B. Evaluation objective
Objective and methods
• Specific purpose
• Scope of evaluation
• Methods
° Document studies
° Interviews
° Direct observations
° Special studies
C. System description
• 2. Objectives (what?)
• 3. Operations (how?)
Event → [reporting] → Data → [analysis, interpretation] → Information → [feedback, recommendations] → Action
Flowchart
[Flowchart of the notification system, showing the reference laboratory, oral information, removal at death or emigration, and a semiannual check]
Components of system
• Population under surveillance
• Period of data collection
• Type of information collected
• Data source
• Data transfer
• Data management and storage
• Confidentiality, security
• Data analysis: how often, by whom, how
• Dissemination: how often, to whom, how
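The data items in the list above can be gathered into a single case record; the following is a minimal sketch, assuming illustrative field names (this is not the actual notification form):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class CaseReport:
    """One notification in a hypothetical surveillance system (illustrative fields only)."""
    case_id: str                 # pseudonymised identifier (confidentiality, security)
    disease: str                 # event under surveillance
    diagnosis_date: date         # when the event was recognised
    report_date: date            # when the data provider reported it
    municipality: Optional[str]  # place of residence; None models a blank/unknown response
    data_source: str             # e.g. clinician or laboratory

# Example record with a missing place item:
case = CaseReport("A-001", "hepatitis A", date(2000, 4, 28),
                  date(2000, 5, 4), None, "clinician")
```

Modelling a blank response as `None` makes the completeness counts under "Data quality" below straightforward to compute.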
4. Resources for system operation
• Funding sources
• Personnel time (= €)
• Other costs
° Training
° Mail
° Forms
° Computers
° ...
Annual resource needs

Personnel costs
  Epidemiologist, NIPH
  Consultant, NIPH       900 hours
  Secretaries, labs       20 hours
  Clinicians              30 hours
  Subtotal                             € 25 000

Other costs
  Forms and postage      168 reports   € 150
  Telephone calls                      € 50

Total costs                            € 25 200
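As a check on the arithmetic, the cost components above can be summed directly (amounts in €, taken from the example table):

```python
# Annual costs from the example table (amounts in €)
personnel_costs = 25_000          # epidemiologist, consultant, secretaries, clinicians
other_costs = {
    "forms_and_postage": 150,     # 168 reports
    "telephone_calls": 50,
}

total_costs = personnel_costs + sum(other_costs.values())
print(total_costs)  # 25200, matching the table's total
```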
D. System performance
System performance
Does it work? Is it useful?

System attributes
• Simplicity
• Flexibility
• Data quality
• Acceptability
• Sensitivity
• Positive predictive value
• Representativeness
• Timeliness
• Stability

Use of information
• Users
• Actions taken
• Link to objectives
Data quality

Completeness
• Proportion of blank / unknown responses
• Simple counting

Validity
• True data?
• Comparison
° Records inspection
° Patient interviews
° ...
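Completeness by "simple counting" amounts to the share of records in which an item is filled in. A minimal sketch, using made-up records and assumed field names:

```python
def completeness(records, item):
    """Proportion of records where `item` is neither missing nor blank."""
    filled = sum(1 for r in records if r.get(item) not in (None, "", "unknown"))
    return filled / len(records)

# Made-up notifications: one record has a blank municipality item
records = [
    {"sex": "M", "municipality": "Oslo"},
    {"sex": "F", "municipality": ""},
    {"sex": "F", "municipality": "Bergen"},
    {"sex": "M", "municipality": "Tromsø"},
]
print(round(completeness(records, "municipality") * 100))  # 75 (% filled in)
```

The same proportion, computed per item, is what the completeness table below reports.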
Completeness of information

                                               AIDS cases             HIV cases without AIDS
Information                                    Total     Item filled  Total     Item filled
                                               records   in, No. (%)  records   in, No. (%)
Person
  Name                                         703       703 (100)    na
  Birth date                                   703       703 (100)    na
  Birth month and year                         703       703 (100)    1491      1489 (100)
  Sex                                          703       703 (100)    1491      1491 (100)
  Municipality of residence at HIV-diagnosis   703       703 (100)    1491      1479 (99)
  Country of birth                             703       703 (100)    1491      1489 (100)
  If not Norway
    Reason for stay in Norway                  109       100 (92)     592       551 (93)
    Length of stay in Norway at HIV-diagnosis  109       62 (57)      592       352 (59)
Place
  Infection acquired in Norway or abroad       703       334 (48)     1491      998 (67)
  Cases acquired abroad
    Country where infection was acquired       196       171 (87)     665       606 (91)
Sensitivity
• = reported true cases / total true cases
• = proportion of true cases detected

Surveillance pyramid: Exposed → Infected → Symptoms → Clinical specimen → Pos. specimen → Report
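With counts from a special study, sensitivity (and the related positive predictive value listed under the system attributes) follow directly from the definition above; the numbers here are invented for illustration:

```python
def sensitivity(reported_true_cases, total_true_cases):
    """Proportion of true cases that the system detected and reported."""
    return reported_true_cases / total_true_cases

def positive_predictive_value(reported_true_cases, total_reported_cases):
    """Proportion of reported cases that are true cases."""
    return reported_true_cases / total_reported_cases

# Invented example: 80 of 100 true cases were reported, plus 5 false reports
print(sensitivity(80, 100))                            # 0.8
print(round(positive_predictive_value(80, 85), 2))     # 0.94
```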
Sensitivity versus specificity
Surveillance pyramid: Exposed → Infected → Symptoms → Clinical specimen → Pos. specimen
Special studies of sensitivity
• 2500 patients with new hepatitis A or B tested (1995-2000)
° no unreported HIV-cases
Occurrence of event → Recognition of event (diagnosis) → Reporting of event → Action taken
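Timeliness can be measured as the delay between steps in this chain, e.g. from recognition (diagnosis) to reporting. A minimal sketch with invented dates:

```python
from datetime import date
from statistics import median

# Invented (diagnosis date, report date) pairs for three notifications
pairs = [
    (date(2000, 3, 1), date(2000, 3, 8)),
    (date(2000, 3, 5), date(2000, 3, 6)),
    (date(2000, 4, 2), date(2000, 4, 30)),
]

delays = [(reported - diagnosed).days for diagnosed, reported in pairs]
print(median(delays))  # median reporting delay in days: 7
```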
Usefulness
[Diagram: events in the health care system are reported as data to the surveillance centre; the centre returns information that leads to action in the health care system]
Meeting objectives?
• Was information produced?
° Trends
° Outbreaks
° Future impact
° Cases for further studies
• Was information used, and by whom?
° Actions: list
° Consequences: list
Usefulness
• Ex 1 (mid 1990s):
° Information: Aid workers infected in Africa
° Action: Revision of recruitment policy
• Ex 2 (1999):
° Information: Men infected in Thailand
° Action: Publication → mass media interest (= public health warning)
E. Conclusions and recommendations
Conclusions
• Proper rationale?
• Attributes
° Balance of attributes and costs
• Fulfilling objectives?
• Recommendations
° Continue
° Revise: specify
° Stop
F. Communication
Communicating findings
• To stakeholders
• To data providers
• To public health community
• Report
• Conference presentation
• Scientific article
Scientific publication
• Introduction
° Evaluation objective (B)
• Material and methods
° Methods of evaluation (B)
• Results
° System description (C)
° System performance (D)
• Discussion
° Sources of error and bias
° Conclusions and recommendations (E)
• Acknowledgments
° Stakeholders (A)
Literature
• CDC. Updated guidelines for evaluating public health surveillance systems. MMWR 2001; 50 (RR-13): 1-35.
• WHO. Protocol for the evaluation of epidemiological surveillance systems. WHO/EMC/DIS/97.2.
• Romaguera RA, German RR, Klaucke DN. Evaluating public health surveillance. In: Teutsch SM, Churchill RE, eds. Principles and practice of public health surveillance, 2nd ed. New York: Oxford University Press, 2000.