
Problem Formulation

Jeffrey C. Valentine, University of Louisville
Center for Evidence-Based Crime Policy / Campbell Collaboration Joint Symposium

George Mason University


August 15, 2011

Overview of Today
- Brief overview of systematic reviews
- Problem formulation
  - Broad vs. narrow questions
  - PICOS
- Characteristics of a good question (SAMPLE)
- Exercise


Brief Overview of Systematic Reviews


- Systematic reviews are a form of secondary research in which studies are the unit of analysis
- They follow the basic steps of the research process
- They aim to minimize bias and error
  - But SRs are not immune to bias and error (they are not a panacea)

Stages of a Research Synthesis (Cooper, 1982)


The seminal article outlining the five stages of a research review is by Harris Cooper:
- Cooper, H. M. (1982). Scientific guidelines for conducting integrative research reviews. Review of Educational Research, 52, 291-302.
- Cooper, H. M. (2009). Research synthesis and meta-analysis: A step-by-step approach. Thousand Oaks, CA: Sage.


Stages of a Research Synthesis (Cont.)


- Problem formulation
  - Clarifying your questions and writing a protocol
  - Setting explicit inclusion/exclusion criteria
- Data collection
  - Literature search
  - Information gathering from studies
- Data evaluation
  - Criteria for including and excluding studies
  - Assessing study quality

Stages of a Research Synthesis (Cont.)


- Data analysis and interpretation
  - Integrating the effects from collected studies
  - Interpreting analysis results
- Report preparation
  - Narrative, statistics, graphs, tables


What's Required to Do a Good SR?


A team with:
- Substantive expertise
- Methodological expertise
- Statistical expertise
- Information retrieval expertise

Time and money:
- SRs are labor intensive
- $50K-$150K+, depending on scope, complexity, and number of studies


SRs Should be Governed by a Protocol


- A detailed protocol (plan) for the SR should be developed and made available to readers (Higgins & Green, 2008; Moher et al., 2009)
- Protocols increase transparency and limit ad hoc decisions
- The review process is iterative, and plans may change during the process
- The final report should document and explain any changes made (deviations from the protocol)

SRs Should be Thoroughly Reported


- One goal of an SR is to increase transparency
  - Therefore, a lot of methodological detail needs to be reported
- Systems exist that provide suggestions, e.g., PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses)
- The need to document the research process creates the need for more up-front decision making and explication


Final Notes
- Most of the work involved in conducting a review is not spent in statistical analysis.
- The scientific contribution of the final product depends on all stages of the review, not just the statistical analysis stage.


Problem Formulation
Two primary questions:
- What is the specific hypothesis of interest in this synthesis?
- What evidence is relevant to this hypothesis?

A well-formulated problem will define the variables of interest so that relevant and irrelevant studies can be distinguished from one another.


SRs Vary in Scope


- Specific, narrow questions
  - Useful for testing effects of specific treatments
- Broad, global questions
  - Useful for generating new knowledge
  - Identify common elements of effective programs (Lipsey, 2008)
  - Build better intervention theories to guide program development and evaluation design (Lipsey, 1997)
- Differences in scope can lead to differences in conclusions across reviews


Scope of SRs
- Not limited to questions about effects of interventions
  - Can address trends, epidemiology, accuracy of diagnostic and prognostic tests, and more
- Not limited to randomized controlled trials or quantitative data
  - Qualitative synthesis (e.g., meta-ethnography, narrative analysis of qualitative research reports)
  - Mixed/multiple methods synthesis (e.g., Thomas, Harden, et al. on programs to combat childhood obesity)

What Kinds of Research Questions?

Questions about intervention effects:
- What are the effects of intervention x on outcomes y for populations/problem z?
- Variations on this theme (e.g., differences in the effects of interventions x1 vs. x2)

Questions (cont'd)

Questions about associations:
- How does x1 relate to x2 for population z? (direction and strength of the correlation)
- Variations on this theme (e.g., differences in the relation of x1 and x2 between populations z1 and z2)

Diagnostic/prognostic questions:
- Which test (A vs. B) is a better predictor of y?
- Which test (A vs. B) is a better predictor of y for population z1 vs. z2?


Example Research Area: Mentoring


To guide the discussion, we will use DuBois et al.'s (in press) work on mentoring:
- "How effective are mentoring programs for youth? A systematic review of the evidence" (to appear in Psychological Science in the Public Interest)


Steps in Problem Formulation


- Determine the conceptual definitions that are relevant to the research
- Determine the operational definitions that are relevant to the research
- Set the review parameters in terms of PICOS (see the sketch after this list):
  - Populations/Participants
  - Interventions (if applicable)
  - Comparison group (counterfactual condition)
  - Outcomes (what classes of outcomes? what specific operations?)
  - Study designs (should fit the purpose)
- Note that the conceptual and operational definitions will also inform the literature search (next session)
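To make the PICOS frame concrete, here is a minimal sketch (not from the slides) of recording the review parameters as a structured, explicit record. The field names and the example values for the mentoring review are illustrative assumptions, loosely drawn from the slides that follow.

```python
# Hypothetical PICOS record; fields mirror the five review parameters.
from dataclasses import dataclass

@dataclass
class PICOS:
    populations: str          # who the review covers
    interventions: str        # what is being evaluated (if applicable)
    comparison: str           # the counterfactual condition
    outcomes: list[str]       # classes of outcomes of interest
    study_designs: list[str]  # designs that fit the review's purpose

# Illustrative values only, loosely based on the mentoring example below.
mentoring_review = PICOS(
    populations="Youth under age 19",
    interventions="One-to-one mentoring by a community volunteer",
    comparison="No-treatment or wait-list control",
    outcomes=["academic achievement", "socio-emotional", "conduct problems"],
    study_designs=["randomized experiment", "quasi-experiment"],
)
print(mentoring_review)
```

Writing the parameters down this explicitly, in whatever form, forces the up-front decision making that the protocol requires.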


Conceptual and Operational Definitions


- It is critical that operational definitions fit their intended constructs (or concepts)
  - Sometimes this is easy to spot; sometimes it is not
- One difficulty is that different fields may use different terminology to refer to the same thing
- Other times, different fields can use the same term to refer to different things
- As such, one has to be careful about relying on the labels provided in the studies, and refer instead to their operational definitions (if given!)


Example
A study used the term "mentoring" to describe the following activity:
- An adult was paired with 2-4 children
- They met one hour a week for four weeks
- They met in school
- Sessions focused on improving math achievement

This probably fits the definition of tutoring better than the definition of mentoring.


More on Concept to Operations Fit


- Systematic reviews are sometimes subject to the "apples and oranges" critique
  - Combining things that are actually different from one another
  - This parallels problems in primary studies (there is no "average person"; all people are different)
- The strong version of this argument, and the strong response to it, rests on identifying whether things that are different are conceptually similar enough to be combined
  - In other words, if you think that two measures are tapping the same underlying construct, it is OK to combine them; if they are tapping different underlying constructs, then don't combine them


The PICOS Elements


"How effective are mentoring programs for youth? A systematic review of the evidence"

What can we learn from the title?
- Participants = youth. Operationally defined as ???
- Interventions = mentoring programs. Operationally defined as ???
- Comparison group = ???
- Outcomes = nothing specified, so ???
- Study design = "effective," therefore experiments

We can see some of the concepts of interest just from the title. Other concepts, and all of the operational definitions, need to be specified.

Examining the PICOS Elements


Population/Participants
- Clearly interested in youth
- Included studies of youth under age 19

Study Design
- Because the interest is in effects, experimental evidence is clearly needed. DuBois et al. included both randomized and quasi-experiments.


Examining the PICOS Elements (cont'd)

Interventions
- Mentoring (conceptual): an ongoing, positive relationship with an extrafamilial adult, with or without specific instrumental goals
- Mentoring (more operational): "In the typical program, each youth is paired on a one-to-one basis with a volunteer from the community with the aim of cultivating a relationship that will foster the young person's positive development and well-being."

What can we learn about the interests of DuBois et al. from this statement?

Types of Interventions to Include

"In the typical program, each youth is paired on a one-to-one basis with a volunteer from the community with the aim of cultivating a relationship that will foster the young person's positive development and well-being."

- One-to-one: excludes studies in which multiple youth are matched with a single mentor
- Volunteer: excludes studies that used paid professionals (e.g., therapists)
- Cultivating a relationship... positive development and well-being: excludes studies that have instrumental goals (e.g., tutoring to improve achievement)

Examining the PICOS Elements (cont'd)

Comparison group
- For studies of interventions, you have to decide what you want to compare to:
  - Other presumably effective interventions? (mentoring vs. tutoring)
  - Other presumably ineffective interventions? (mentoring vs. bibliotherapy)
  - Usual treatment? (not really applicable here, but it often is, and often cannot be avoided)
  - No treatment? (wait-list control, no-treatment group, etc.)
- Comparison to another (possibly effective) intervention is quite different from the other types of comparisons, and this has implications for interpreting the results of the review


Outcomes
- Use theoretical and practical considerations to determine which outcomes should be included in the review
  - Usually only include outcomes that the intervention ought to impact
- It is important to have both conceptual and operational definitions
- Example: academic achievement (see the sketch after this list)
  - Conceptual: a person's level of knowledge in a specific academic domain
  - Operational: many, including GPA, achievement test scores, homework scores, project grades, graduation, etc.
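As a minimal sketch (not from the slides), a review team might record the construct-to-operations mapping in a form like the following; the structure, names, and helper function are illustrative assumptions.

```python
# Hypothetical record tying one conceptual definition to the operational
# definitions (measures) that will count as that construct in the review.
outcome_definitions = {
    "academic achievement": {
        "conceptual": "a person's level of knowledge in a specific academic domain",
        "operational": ["GPA", "achievement test scores", "homework scores",
                        "project grades", "graduation"],
    },
}

def measures_construct(measure: str, construct: str) -> bool:
    """Check whether a reported measure counts as the named construct."""
    return measure in outcome_definitions[construct]["operational"]

print(measures_construct("GPA", "academic achievement"))  # True
```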


Outcomes in DuBois et al.

DuBois et al. examined a broad array of youth outcomes:
- Attitudes toward school (including academic motivation)
- Academic achievement
- Socio-emotional outcomes (e.g., self-esteem, levels of depression)
- Conduct problems
- Physical activity levels


Inclusion / Exclusion Criteria


- All systematic reviews should have explicit, operational inclusion/exclusion criteria
- Most of these are addressed in the PICOS
  - E.g., if the study is about at-risk students, then population = at-risk students
  - Note that you still have to define what you mean by "at-risk," though
- Others include contextual factors
  - Geographic/political boundaries, language restrictions, the time period in which the study was conducted
  - These should be justified
- The coding session will show you how to screen studies based on inclusion/exclusion criteria (a sketch of such screening follows this list)
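As a preview, here is a minimal sketch (not from the slides) of what screening a study record against explicit criteria can look like. The record fields, criterion values, and the language restriction are hypothetical illustrations, loosely based on the DuBois et al. parameters discussed above.

```python
# Hypothetical screening function: each criterion is explicit and operational,
# and every exclusion is recorded with a reason (supporting transparency).
def include_study(study: dict) -> tuple[bool, list[str]]:
    """Return (include?, list of reasons for exclusion)."""
    reasons = []
    if study.get("max_participant_age", 0) > 18:
        reasons.append("participants not all under age 19")
    if study.get("design") not in {"randomized", "quasi-experimental"}:
        reasons.append("not an experimental or quasi-experimental design")
    if not study.get("one_to_one_mentoring", False):
        reasons.append("intervention is not one-to-one mentoring")
    if study.get("language") != "English":  # contextual criterion; must be justified
        reasons.append("outside language restriction")
    return (len(reasons) == 0, reasons)

# Example usage with a hypothetical study record
candidate = {"max_participant_age": 17, "design": "randomized",
             "one_to_one_mentoring": True, "language": "English"}
print(include_study(candidate))  # (True, [])
```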

Examples of How DuBois et al. Could Have Been Narrower in Scope

Restrict the types of interventions/comparisons:
- A narrower definition of mentoring
- Limited to a specific program (e.g., Big Brothers Big Sisters)
- Compared mentoring to another intervention (e.g., tutoring)
- Limited to specific populations (e.g., children who are perceived to be at risk)
- Focused on particular outcomes
- Included only randomized experiments
- Etc.

Generally, there are no right answers to questions of scope. The onus is on the reviewer to be explicit about the choices and to defend them.

Using Logic Models to Help Problem Formulation


- Logic models (see Anderson et al., 2011) describe the connections between different determinants of outcomes and how these are related to a specific intervention
- Developing your own logic model, or carefully studying someone else's, can help formulate a problem for a systematic review


Zief et al. (2006) Logic Model for After School Programs (Partial)

[Flow diagram, reconstructed from the original figure:]
Intervention: program activities (implementation, resources, staff, etc.)
→ Immediate change: supervision, academic support, enriching activities
→ Intermediate outcomes: behavior (homework, television, attendance); socio-emotional (safety, self-esteem)
→ Longer-term outcomes: grades, test scores, educational attainment
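For readers who find it easier to reason about the causal chain programmatically, here is a minimal sketch (not from the slides) expressing the reconstructed stages as an ordered structure. The groupings follow the figure above; the representation itself is an illustrative assumption.

```python
# Hypothetical ordered chain of stages, each mapping to its elements.
logic_model = [
    ("intervention", ["program activities", "implementation", "resources", "staff"]),
    ("immediate change", ["supervision", "academic support", "enriching activities"]),
    ("intermediate outcomes", ["behavior: homework, television, attendance",
                               "socio-emotional: safety, self-esteem"]),
    ("longer-term outcomes", ["grades", "test scores", "educational attainment"]),
]

# A review question can target any link in the chain, e.g., the first arrow:
# do after-school programs increase supervised time?
for stage, elements in logic_model:
    print(f"{stage}: {'; '.join(elements)}")
```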


Using Logic Models to Formulate a Problem


- The logic model can be used to narrow the question
  - Could focus on academic outcomes and not socio-emotional ones
  - Could focus on the more immediate outcomes (e.g., do students in after-school programs have more supervised time than students not in after-school programs?)
- The logic model can also be used to diagnose a failure to find an effect on longer-term outcomes


Elements of a Good Research Question: SAMPLE


To determine if you have a good research question, ask yourself:
- Specific: is the question specific?
- Answerable: can it be answered?
- Measurable: are there measurable constructs?
- Practical: is it relevant for policy/practice?
- Logical: is it based on a theory/logic model?
- Empirical: can answers be attained using observable evidence?

When Should a Research Question Be Subject to a Systematic Review?


- Generally, scholars undertaking a systematic review ultimately expect to find multiple studies that will meet their inclusion criteria
  - If a good review already exists on the topic, you might be able to (a) update the literature search, (b) test new relations, and/or (c) improve the methodology
  - If a state-of-the-art review does not exist, you can provide it!
- It can sometimes be informative to conduct an SR when it is believed that no studies exist
  - A thorough literature search might find some!
  - If a practice is widespread but has never been tested, it might be good to make this point (e.g., drug testing in high schools for after-school activity participation)


Exercise
With those around you, come up with a specific research question that can be the focus of an SR:
- What intervention(s) are being considered?
- What outcomes are of importance?
- What is the key population or context?
- What purposes could be served by synthesizing the knowledge in this area (the objectives of the review)?

Remember SAMPLE. One person reports for the group.

P.O. Box 7004, St. Olavs plass, 0130 Oslo, Norway
E-mail: info@c2admin.org
http://www.campbellcollaboration.org
