
Evaluation of surveillance systems
Preben Aavitsland

K.10 Epidemiology Training
Surveillance

Surveillance is the ongoing systematic collection, collation, analysis and interpretation of data; and the dissemination of information (to those who need to know) in order that action may be taken.

Information for action!
The surveillance loop
[Diagram: the surveillance loop. In the health care system an event generates data, which are reported to the surveillance centre; analysis and interpretation turn the data into information, which flows back as feedback and recommendations so that action can be taken.]
Importance of evaluation
• Quality
° Often neglected
° Basis for improvements
• Obligation
° Does the system deliver?
° Credibility of public health service
• Learning process
° EPIET training objective
° ”Do not create one until you have evaluated one”
General framework

• A. Engagement of stakeholders
• B. Evaluation objective
• C. System description
• D. System performance
• E. Conclusions and recommendations
• F. Communication
A. Engagement of stakeholders
Stakeholders
• The ”owners” and the ”customers”
• Users of surveillance system information
° Public health workers
° Government
° Data providers
° Clinicians
° etc.
• Steering group?
• A condition for change
B. Evaluation objective
Objective and methods
• Specific purpose
• Scope of evaluation
• Methods
° Document studies
° Interviews
° Direct observations
° Special studies
C. System description
C. System description

• 1 Public health rationale (why?)

• 2 Objectives (what?)

• 3 Operations (how?)

• 4 Resources (how much?)

• Extreme learning value!!!!


1. Rationale for surveillance
The disease:
• Severity
• Frequency
• Communicability
• International obligations
• Costs
• Preventability

Society:
• Public and mass media interest
• Will to prevent
• Availability of data
2. Objectives of system
• Documented?
° If not = trouble
• SMART?
° Specific
° Measurable
° Action oriented [information] in order to [action]
° Realistic
° Time frame specified
Possible objectives of surveillance
• Detect outbreaks
• Monitor trends (by time, place, person)
° towards a control objective
° as programme performance
° as intervention evaluation
• Estimate future disease impact
• Collect cases for further studies
….in order to [action]
Objectives

”To have a continuous overview of the spread of the disease in Norway in order to target preventive measures and plan resource needs.”
3. Operations of system
• Health events under surveillance
° Type of event:
exposure -> infection -> disease / outbreaks -> outcome
° Case definitions
• Legal framework
• Organisational framework
• Components
° Flow chart
° Description
The surveillance loop
[Diagram: the surveillance loop. In the health care system an event generates data, which are reported to the surveillance centre; analysis and interpretation turn the data into information, which flows back as feedback and recommendations so that action can be taken.]
Flowchart

[Flowchart: HIV/AIDS reporting in Norway. Actors: patient, primary care physician, hospital physician, reference laboratory and the National Institute of Public Health. Flows include the blood sample for HIV testing, the laboratory report with HIV reporting form part 1, the HIV reporting form part 2, the AIDS reporting form, prompting if necessary, oral information on death or emigration, and semiannual checks.]
Components of system
• Population under surveillance
• Period of data collection
• Type of information collected
• Data source
• Data transfer
• Data management and storage
° Confidentiality, security
• Data analysis: how often, by whom, how
• Dissemination: how often, to whom, how
4. Resources for system operation
• Funding sources
• Personnel time (= €)
• Other costs
° Training
° Mail
° Forms
° Computers
° ...
Annual resource needs

Personnel costs                                  € 25 000
  Epidemiologist, NIPH
  Consultant, NIPH       900 hours
  Secretaries, labs       20 hours
  Clinicians              30 hours
Other costs
  Forms and postage      168 reports               € 150
  Telephone calls                                   € 50
Total costs                                      € 25 200
D. System performance
System performance
Does it work?
System attributes:
• Simplicity
• Flexibility
• Data quality
• Acceptability
• Sensitivity
• Positive predictive value
• Representativeness
• Timeliness
• Stability

Is it useful?
Use of information:
• Users
• Actions taken
• Link to objectives
Data quality
Completeness:
• Proportion of blank / unknown responses
• Simple counting

Validity:
• True data?
• Comparison
° Records inspection
° Patient interviews
° ...
Completeness of information
Information                                      AIDS cases              HIV cases without AIDS
                                                 Total    Filled in      Total    Filled in
                                                 records  No. (%)        records  No. (%)
Person
  Name                                             703    703 (100)        na
  Birth date                                       703    703 (100)        na
  Birth month and year                             703    703 (100)       1491    1489 (100)
  Sex                                              703    703 (100)       1491    1491 (100)
  Municipality of residence at HIV-diagnosis       703    703 (100)       1491    1479 (99)
  Country of birth                                 703    703 (100)       1491    1489 (100)
  If not Norway:
    Reason for stay in Norway                      109    100 (92)         592     551 (93)
    Length of stay in Norway at HIV-diagnosis      109     62 (57)         592     352 (59)
Place
  Infection acquired in Norway or abroad           703    334 (48)        1491     998 (67)
  Cases acquired abroad:
    Country where infection was acquired           196    171 (87)         665     606 (91)
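
As a rough illustration, completeness figures like those in the table can be counted directly from the case records. The sketch below is a minimal example in Python; the field names and example records are hypothetical, not the actual reporting-form variables.

```python
# Minimal sketch: per-field completeness (records with the item filled in).
# Field names and example records are hypothetical.

BLANK = {None, "", "unknown"}

def completeness(records, fields):
    """Return {field: (n_filled, percent_filled)} for the given records."""
    total = len(records)
    result = {}
    for field in fields:
        filled = sum(1 for r in records if r.get(field) not in BLANK)
        result[field] = (filled, round(100 * filled / total) if total else 0)
    return result

# Hypothetical case reports
cases = [
    {"sex": "M", "birth_year": 1970, "country_of_birth": "Norway"},
    {"sex": "F", "birth_year": 1985, "country_of_birth": ""},
    {"sex": "M", "birth_year": None, "country_of_birth": "unknown"},
]

for field, (n, pct) in completeness(cases, ["sex", "birth_year", "country_of_birth"]).items():
    print(f"{field}: {n}/{len(cases)} ({pct}%)")
```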
Sensitivity
• = reported true cases / total true cases
• = proportion of true cases detected

[Diagram: the surveillance pyramid, from bottom to top: exposed -> infected -> symptoms -> seek medical attention -> clinical specimen -> positive specimen -> report.]
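
A minimal sketch of these two proportions, together with the positive predictive value listed among the attributes above; all counts are hypothetical.

```python
# Minimal sketch: sensitivity and positive predictive value of a surveillance system.
# All counts are hypothetical.

def sensitivity(reported_true_cases: int, total_true_cases: int) -> float:
    """Proportion of all true cases that were reported to the system."""
    return reported_true_cases / total_true_cases

def positive_predictive_value(reported_true_cases: int, total_reported_cases: int) -> float:
    """Proportion of reported cases that are true cases."""
    return reported_true_cases / total_reported_cases

# Hypothetical example: 100 true cases in the population, 85 cases reported,
# of which 80 are true cases.
print(f"Sensitivity: {sensitivity(80, 100):.0%}")                              # 80%
print(f"Positive predictive value: {positive_predictive_value(80, 85):.0%}")   # 94%
```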
Sensitivity versus specificity

The tiered system: confirmed, probable, possible


Measuring sensitivity

• Find total true cases from other data sources
° medical records
° disease registers
° special studies
• Capture-recapture study (sketched below)

[Diagram: the surveillance pyramid again, from exposed at the bottom to report at the top.]
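
One common way to approximate the total number of true cases from two overlapping case lists is a two-source capture-recapture (Lincoln-Petersen) estimate. The sketch below uses hypothetical counts and assumes independent sources and a closed population.

```python
# Minimal sketch: two-source capture-recapture (Lincoln-Petersen estimator).
# Assumes independent sources and a closed population; all counts are hypothetical.

def lincoln_petersen(n_source1: int, n_source2: int, n_both: int) -> float:
    """Estimate the total number of true cases from two overlapping case lists."""
    if n_both == 0:
        raise ValueError("No overlap between sources; the estimate is undefined.")
    return n_source1 * n_source2 / n_both

# Hypothetical example: 120 cases notified to the surveillance system,
# 90 cases found in laboratory records, 60 cases present in both lists.
total_estimate = lincoln_petersen(120, 90, 60)
print(f"Estimated total true cases: {total_estimate:.0f}")             # 180
print(f"Surveillance system sensitivity: {120 / total_estimate:.0%}")  # 67%
```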
Special studies of sensitivity
• 2 500 patients with new hepatitis A or B tested (1995-2000)
° no unreported HIV cases
• 70 000 pregnant women tested annually
° 3-8 undiagnosed HIV cases (immigrants)
Timeliness

Occurrence of event -> Recognition of event (diagnosis) -> Reporting of event -> Action taken
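
A minimal sketch of how delays along this timeline could be summarised, for example the median interval from diagnosis to report; the dates are hypothetical.

```python
# Minimal sketch: median delay from diagnosis to report; dates are hypothetical.
from datetime import date
from statistics import median

cases = [
    {"diagnosed": date(2024, 1, 3),  "reported": date(2024, 1, 10)},
    {"diagnosed": date(2024, 2, 14), "reported": date(2024, 2, 16)},
    {"diagnosed": date(2024, 3, 1),  "reported": date(2024, 3, 20)},
]

delays_days = [(c["reported"] - c["diagnosed"]).days for c in cases]
print(f"Median diagnosis-to-report delay: {median(delays_days)} days")  # 7 days
```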
Usefulness
[Diagram: the surveillance loop again: event -> data -> information -> action, between the health care system and the surveillance centre.]
Meeting objectives?
• Was information produced?
° Trends
° Outbreaks
° Future impact
° Cases for further studies
• Was information used, and by whom?
° Actions: list
° Consequences: list
Usefulness
• Ex 1 (mid 1990s):
° Information: Aid workers infected in Africa
° Action: Revision of recruitment policy

• Ex 2 (1999):
° Information: Men infected in Thailand
° Action: Publication -> mass media interest -> in effect a public health warning
E. Conclusions and recommendations
Conclusions
• Proper rationale?
• Attributes
° Balance of attributes and costs
• Fulfilling objectives?
• Recommendations
° Continue
° Revise: specify
° Stop
F. Communication
Communicating findings
• To stakeholders
• To data providers
• To public health community

• Report
• Conference presentation
• Scientific article
Scientific publication
• Introduction
° Evaluation objective (B)
• Material and methods
° Methods of evaluation (B)
• Results
° System description (C)
° System performance (D)
• Discussion
° Sources of error and bias
° Conclusions and recommendations (E)
• Acknowledgments
° Stakeholders (A)
Literature
• CDC. Updated guidelines for evaluating public health
surveillance systems. MMWR 2001; 50 (RR-13): 1-35
• WHO. Protocol for the evaluation of epidemiological
surveillance systems. WHO/EMC/DIS/97.2.
• Romaguera RA, German RR, Klaucke DN.
Evaluating public health surveillance. In: Teutsch SM,
Churchill RE, eds. Principles and practice of public
health surveillance, 2nd ed. New York: Oxford
University Press, 2000.
