
Bias in epidemiology

• The role of the “social” in science
• Construct validity
• Investigator/confirmation bias

1
Construct validity
• The extent to which an instrument specifically measures what it is
intended to measure, and avoids measuring other things
• Example: a measure of intelligence should only assess
factors relevant to intelligence and not, for instance,
whether someone is a hard worker (see the sketch at the end of this slide)

• We often use “race” and “gender” variables to “explain” differences in
health outcomes, but don’t critically examine why those differences
exist. This is bad epi.
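
Returning to the intelligence example above: a minimal sketch, with made-up data and illustrative variable names, of how one might probe construct validity by checking that a new instrument tracks an established measure of the same construct (convergent validity) and does not track a construct it is not meant to measure (discriminant validity):

```python
import numpy as np

# Hypothetical scores for the same respondents on three instruments.
rng = np.random.default_rng(0)
n = 200
reasoning_score = rng.normal(size=n)                            # established cognitive test
new_iq_score = reasoning_score + rng.normal(scale=0.5, size=n)  # instrument being validated
hard_worker_score = rng.normal(size=n)                          # unrelated trait (work ethic)

# Convergent validity: the new instrument should correlate with an
# established measure of the construct it claims to capture.
convergent_r = np.corrcoef(new_iq_score, reasoning_score)[0, 1]

# Discriminant validity: it should NOT correlate with constructs it is
# not intended to measure (here, being a hard worker).
discriminant_r = np.corrcoef(new_iq_score, hard_worker_score)[0, 1]

print(f"convergent r = {convergent_r:.2f}   (want: high)")
print(f"discriminant r = {discriminant_r:.2f}  (want: near zero)")
```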

2
Let’s talk about construct validity with
regards to race
• Race is a social construct, not a biological one
• No reality apart from that conferred by group and time
• “Race” means something completely different in different
places
• However, it has real consequences as an assigned, socially
recognized status
• So, what are we measuring?
• NOT biology/genetics
• Culture?
• SES?
• Racism/discrimination?

3
Let’s talk about construct validity with
regards to race
• Define race during the experimental design, and specify the
reason for its use in the study.
• Should be couched within a sociopolitical framework, not a
biological one
• Race is a proxy measure for social, environmental, and structural
factors
• Name racism, identify the form, the mechanism by which it may be
operating, and other intersecting forms of oppression (such as based
on sex, sexual orientation, age, regionality, nationality, religion, or
income) that may compound its effects.
• Naming racism explicitly helps authors avoid incorrectly assigning
race as a risk factor, when racism is the risk factor for racially
disparate outcomes.
• Never offer genetic interpretations of race 
• Not grounded in science
Source: Boyd et al., 2020
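
A hedged sketch of what this guidance could look like in an analysis script. The data, variable names (redlining_index, uninsured, poor_outcome), and model are hypothetical, not from Boyd et al.; the point is that the hypothesized structural mechanisms are measured and named, rather than leaving a lone “race” coefficient to be read as biology:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cohort: "race" is recorded as a socially assigned category,
# alongside measured structural exposures that it is meant to proxy for.
rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({"race": rng.choice(["Black", "White"], size=n)})

# Structural exposures (simulated): unequally distributed by assigned race.
df["redlining_index"] = rng.normal(loc=(df["race"] == "Black") * 1.0, scale=1.0)
df["uninsured"] = rng.binomial(1, np.where(df["race"] == "Black", 0.25, 0.10))

# In this simulation the outcome depends on the structural exposures,
# not on the race label itself.
logit_p = -2 + 0.8 * df["redlining_index"] + 0.7 * df["uninsured"]
df["poor_outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Naive model: a lone "race" coefficient invites a biological misreading.
naive = smf.logit("poor_outcome ~ C(race)", data=df).fit(disp=0)

# Model following the slide's guidance: race kept as a marker of exposure
# to racism, with the hypothesized structural mechanisms modeled explicitly.
structural = smf.logit(
    "poor_outcome ~ C(race) + redlining_index + uninsured", data=df
).fit(disp=0)

print(naive.params)       # race gap absorbs the unmeasured structural exposures
print(structural.params)  # race coefficient shrinks once mechanisms are named
```

In this toy simulation the “race” coefficient shrinks toward zero once the structural exposures enter the model, which is the pattern the slide’s framing predicts; a real analysis would need measured exposures and an explicit statement of which form of racism they are meant to capture.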
4
Investigator/confirmation bias
• Confirmation bias definition: a type of psychological bias in which decisions
are made according to the investigator’s preconceptions, beliefs, or
preferences
• Note that this is often unavoidable: we all operate off of certain
assumptions
• Be as clear as possible about what those assumptions are!

• What assumptions are investigators making that interfere with their ability
to ask pertinent questions and collect good data to answer those
questions?
• E.g.: ascribing certain behaviors to certain groups of people
• When asking a participant questions during a study
• When developing study questions and deciding what kinds of data
to collect before the start of a study

5
Example from midterm
Q: Penicillin treatment is extremely effective in curing the disease,
but patients may become re-infected if their sexual partners are not
also treated. If funding were made available for a widespread campaign
of syphilis prevention focused solely on the diagnosis and treatment
of syphilis in pregnant women (but not their sexual partners), what
would we expect to observe over the next five years?
 
a. Prevalence of primary and secondary syphilis would be significantly
affected, but incidence of congenital syphilis would increase
b. Prevalence of primary and secondary syphilis would be unaffected,
but incidence of congenital syphilis would decrease
c. Incidence of congenital syphilis would significantly decrease
d. Incidence of congenital syphilis would not be significantly affected

6
Example from midterm (continued)
[slides 7–9: figures]
How do you make sure that the research you are doing is valid?
• In general, know the body of work
• Cite the experts
• In this example, if you are including a race variable, read and
cite scholars of color whose work forms the basis of the
field’s knowledge on racism and its effects
• A plug for field epi: know the communities you work in and get your
data from
• Solicit patient input 
• Use community review boards or form patient panels to
ensure the outcomes of research reflect the priorities of the
populations studied
• Identify the stakes and the implications of your research
• If there are none, maybe change what you are doing

10
