

Journal of Organizational Behavior Management
ISSN: 0160-8061 (Print) 1540-8604 (Online) Journal homepage: http://www.tandfonline.com/loi/worg20
Published online: 05 Apr 2018.
JOURNAL OF ORGANIZATIONAL BEHAVIOR MANAGEMENT
https://doi.org/10.1080/01608061.2018.1454872

A Literature Review of Organizational Behavior


Management Interventions in Human Service Settings
from 1990 to 2016
Nicole Gravina , Jamie Villacorta, Kristin Albert, Ronald Clark, Scott Curry,
and David Wilder
School of Psychology, Florida Institute of Technology, Melbourne, Florida, USA

ABSTRACT
We reviewed the Journal of Applied Behavior Analysis (JABA), Journal of Organizational Behavior Management (JOBM), and Behavior Analysis in Practice (BAP) from 1990 to 2016 to identify articles that evaluated organizational behavior management interventions in a human service setting. Of those articles, 73 met the inclusion criteria for the review: 44 from JABA (1990 to 2016), 22 from JOBM (1990 to 2016), and 7 from BAP (2008 to 2016). We categorized each selected article by setting, employee population, client population, assessment, dependent variable, independent variable, and outcome measures. Results from the review are discussed for all three journals. Recommendations are made to broaden the scope of population and dependent variable targets, include more assessments, and include outcome data when applicable.

ARTICLE HISTORY
Received 19 July 2017; Revised 9 November 2017; Accepted 4 January 2018

KEYWORDS
organizational behavior management; human service; review; staff performance

Human service organizations provide treatment and support services to a


variety of populations, often with the goal of improving functioning, inde-
pendence, and quality of life for consumers and their families (Reid &
Parsons, 2000). Advances in research and practice have dramatically
improved service delivery over the years. For example, advances in treat-
ments based on applied behavior analysis (ABA) have substantially improved
the lives of individuals diagnosed with autism and other intellectual disabil-
ities (Peters-Scheffer, Didden, Korzilius, & Sturmey, 2011; Virues-Ortega,
2010). Yet, human service organizations often come under scrutiny for a
variety of reasons, including inefficient use of resources, poor quality of
service delivery, poor record keeping, client and worker safety concerns,
and in rare cases, abuse and neglect (Este, 2007). These issues substantiate
the need for quality management practices to ensure that employees are
following procedures and delivering high-quality, evidence-based treatments in a cost-effective and efficient manner.

CONTACT Nicole Gravina nicole.gravina@me.com Florida Institute of Technology, 150 W. University Blvd.,
Melbourne, FL 32901, USA
Color versions of one or more of the figures in the article can be found online at www.tandfonline.com/worg.
© 2018 Taylor & Francis

Human service organizations have their own unique challenges for deli-
vering high-quality services (Leblanc, Gravina, & Carr, 2009). For example,
because payment usually comes from a government agency, third-party source, or
some other regulated system, there are limits on income. Thus, instead of
increasing the number of clients served, the focus is often on controlling
costs and maximizing resources (Riley & Frederiksen, 1984). In addition,
delivery of services, especially ABA services, can require precision and the
ability to make decisions quickly, yet direct care staff often enter the field
with very little training (Leblanc et al., 2009; Reid & Parsons, 2000). Staff
adherence to procedures is an important determinant of treatment effective-
ness (Riley & Frederiksen, 1984). This necessitates quality training and
performance management systems, yet high turnover can make these inter-
ventions costly over time. Practitioners need strategies for maximizing per-
formance, safety, and quality in a cost-effective way.
One approach that can be used to improve effectiveness and efficiency in
human service organizations is organizational behavior management (OBM).
OBM is a sub-discipline of ABA that applies operant and, to a lesser degree,
respondent principles to positively impact socially relevant behaviors and
outcomes. OBM uses a science-driven, behavioral
approach to strengthen individual and systems-level performance in organi-
zations. Applications of OBM have demonstrated success in a variety of
human service settings (HSS; Reid & Parsons, 2000). Yet, OBM has not
been widely adopted in HSS and more research is needed to improve
organization-wide adoption and application.
Researchers have offered several suggestions for advancing the literature
on OBM in human services (Reid & Parsons, 2000; Riley & Frederiksen,
1984; Sturmey, 1998). One of the primary suggestions was to conduct
research in settings other than residential centers and schools (Reid &
Parsons, 2000; Sturmey, 1998). Other settings could include nursing homes
and healthcare facilities, which may present different challenges.
Researchers have also suggested conducting research with professionals in
supervisory and managerial roles rather than with direct care staff only
(Methot, Williams, Cummings, & Bradshaw, 1996). Individuals in leadership
roles have the opportunity to impact a large number of employees who could,
in turn, improve their impact on the client. Lastly, these researchers suggested broadening
the range of staff issues addressed with OBM techniques. For example,
Sturmey (1998) asserted that research on punishment procedures with staff
is needed, since it occurs in most human service organizations. He also
recommended that OBM researchers develop strategies to assess staff com-
petencies and the supporting environment and then create efficient training
strategies and other interventions to address deficits.
Several quantitative reviews of OBM research exist including reviews on
trends in OBM research (VanStelle et al., 2012), characteristics of feedback
(Alvero, Bucklin, & Austin, 2001), and assessment in OBM (Wilder,
Lipschultz, King, Driscoll, & Sigurdsson, 2018). Yet, to date, no comprehen-
sive review of OBM research specific to HSS has been conducted.
Understanding the current state of research related to OBM in human
services can provide more information about techniques with strong support
and opportunities for future research.
The purpose of this paper is to (a) compare trends/patterns of research
studies utilizing OBM interventions to improve staff performance across
different HSS, staff (participant) roles, and client populations; (b) reveal
opportunities for greater contribution in the OBM in human services litera-
ture; and (c) evaluate whether OBM research adequately addresses the
relevant concerns and priorities faced by practitioners in HSS.

Method
Research articles were selected from three behavior analytic journals between
1990 and 2016. A scan of these journals revealed that prior to 1990, few
research articles examined OBM in HSS. These journals were (a) The Journal
of Applied Behavior Analysis (JABA), (b) The Journal of Organizational
Behavior Management (JOBM), and (c) Behavior Analysis in Practice
(BAP). BAP was established in 2008 and therefore data were only available
from 2008 to 2016 for this journal. JOBM and JABA were included because
they serve as the flagship journals for OBM and ABA research, respectively.
BAP was included because it is devoted to practitioner issues in ABA; many
practitioner issues involve OBM.

Article selection and inclusion criteria


The authors conducted a manual search of these three journals for all issues
that were published from 1990 to 2016. The title, abstract, and method
sections of every article in these issues were reviewed to determine whether
the article met the following inclusion criteria. First, articles had to describe data-
based studies that were experimental in nature, meaning that an intervention was
added in an attempt to change behavior. Nonexperimental designs (e.g.,
correlational studies) and conceptual articles were excluded.
Second, it was required that the OBM intervention was applied with the
goal of improving staff and/or supervisor performance and that it was
implemented in an applied HSS. This meant that researchers had to record
dependent measures of staff behavior in the actual clinical or applied setting
in which human services were provided, as opposed to a simulated or office
setting (e.g., analogue study).
The final inclusion criterion required that the human service organization served individuals with disabilities or special needs. We defined “special needs” as requiring additional assistance for basic communication skills and daily living activities beyond what is required of typically developing peers.
Therefore, research conducted in school settings where staff worked only with typically developing children was excluded from the review. In addition, research conducted in healthcare settings such as emergency departments, with patients presenting with acute concerns, was also excluded from the review.
After review, 82 articles were initially selected. However, upon more in-depth analysis, 9 articles were excluded, resulting in a total of 73 articles for review: 44 from JABA, 22 from JOBM, and 7 from BAP. The majority of articles excluded after this more in-depth analysis were preference assessment studies that did not evaluate the impact of the reinforcers identified through the assessment in the actual work environment.

Categories and definitions


To allow for ease of comparison across different literature reviews, we used
categories and definitions from previously published OBM reviews whenever
possible (Bucklin, Alvero, Dickinson, Austin, & Jackson, 2000; Nolan, Jarema,
& Austin, 1999; VanStelle et al., 2012). However, due to the small number of
OBM reviews specific to clinical and HSS, we also developed additional
definitions. Definitions for each category follow.

Client population
The clients receiving services were categorized across the following popula-
tions: (a) Brain Injury; (b) Intellectual Disabilities (e.g., Autism and mental
retardation); (c) Elders and Dementia (e.g., dementia and Alzheimer’s); (d)
Mental Illness (e.g., schizophrenia, depression, and mood disorders); and (e)
Other (e.g., Williams syndrome). When a study’s participants fit more than
one category, all relevant categories were included.

Setting of client services


We collected data on the setting or location where the participants’ clients
received clinical services. Setting was divided into eight categories: (a) Group
Home (family- or home-based independent living with minimal support), (b)
Residential Facility (nonmedical institution where clients lived at the facility),
(c) Nursing Home (assisted living for elder care), (d) School (services of a
solely educational purpose), (e) Healthcare Setting (institution in which clients required severe and ongoing medical attention/services), (f) Day Treatment Center (nonresidential outside facility where part-time services were provided to teach daily activities or address other behavioral excesses or deficits), (g) Other (did not fit into a predefined category, such as an employment training center), and (h) Unclear (article did not provide enough information to determine). When a study’s setting included more than one category, all relevant categories were included.

Employee population
Data on the roles served by study participants were coded across the follow-
ing categories: (a) Direct Care Staff (directly implemented nonacademic behavioral procedures; included behavior therapists/technicians and front-line staff); (b) Supervisors (oversaw and provided supervision to direct care staff); (c) Managers (oversaw supervisors, team leaders, etc. at an organization- or facility-wide level); (d) Students (enrolled in a university program); (e) Non-ABA Professionals (included speech pathologists, mental health counselors, etc.); (f) Teachers (provided classroom-based instruction to children or adults targeting academic skills such as math, reading, etc.); (g) Nurses (provided medical care for sick or injured patients in hospitals, nursing homes, patient homes, etc.); and (h) Other (did not fit a predefined category, such as consultants). When a study’s participants fit more than one category,
all relevant categories were included. It should be noted that behavior
analysts were not included as a category because some of the studies reviewed
predated the establishment of certification.

Preintervention behavioral assessment


Data were collected on whether studies utilized indirect or direct behavioral
assessments prior to the development and implementation of OBM interven-
tions. Specific behavioral assessments that were utilized include: Performance
Diagnostic Checklist (Austin, 2000), Performance Diagnostic Checklist-Human
Services (Carr, Wilder, Majdalany, Mathisen, & Strain, 2013), PIC/NIC Analysis
(Daniels & Bailey, 2014), Systems Analysis (Diener, McGee, & Miguel, 2009),
and some descriptive assessment procedures.

Types of independent variables


Data were collected on all OBM-based interventions (independent variables)
used within each study, using definitions and categories from Bucklin et al.
(2000) and VanStelle et al. (2012). Two pairs of categories were combined
(feedback and praise; training and antecedents) because some intervention
descriptions made it difficult to delineate between them.
In addition, two categories were added, Systems Re-Design and Monitoring
or Observing Self or Others.

The following categories were used: (a) Training and/or Antecedents (instruc-
tions, prompts, or changes in resource availability meant to teach, encourage, or
enable the performance); (b) Feedback and/or Praise (information or positive
statements based on performance); (c) Monitoring or Observing Self or Others
(the ongoing monitoring or tracking of work behaviors by oneself, coworker,
colleague, or peer, excluding typical supervision measures by supervisors); (d)
Goal-Setting (establishment of goals specific to the performance); (e) Monetary
Rewards (money or gift cards); (f) Non-Monetary Rewards (rewards without a
specified monetary value such as choice of task or a coffee mug); (g) Systems Re-
Design (any modification to a previously-established staff management, reinforce-
ment, or incentive system/procedure); (h) Punishment or Negative Reinforcement
(interventions that included potential aversive properties for not performing to
standard); and (i) Other (when categorized as other, the intervention is listed in
Appendix A, such as biweekly meetings or familiar vs. unfamiliar staff).

Types of dependent variables


Data were collected on the types of dependent variables used to monitor and
improve service staff performance. Categories common to behavioral mea-
sures utilized among a variety of HSS included (a) Treatment Integrity of
Behavior Intervention Programs (integrity on staff implementation of skill
acquisition programs, daily living or self-care skills routines, problem beha-
vior protocols, data collection); (b) Safety Procedures (behaviors aimed at
reducing the potential threats or harm to the client or others; including
physical restraint procedures, glove use, correct use of equipment, etc.); (c)
Preparation and Cleanliness (training or integrity measures on pre- or post-
session preparation or manipulation of materials not related to safety); (d)
Administration and Staff Management (any measure that assessed the ability
to more accurately or efficiently manage the performance of staff or other
administrative systems); (e) Attendance/Turnover (any measure that assessed
the occurrence of absences, leave-time, or workforce instability); and (f)
Social Validity Measures of Engagement with Clients and Job Enjoyment
(any measure that assessed the quality of engagement/interaction between the
staff and the target client/consumers or the staff member’s job satisfaction).

Client behaviors and outcome


Finally, data were collected on whether or not the studies reported quanti-
tative measures of client behavior directly associated with the intervention
when the primary dependent variable was treatment integrity or engagement
with the client. We also analyzed whether studies that directly measured
client behavior (e.g., rate of problem behavior) also measured overall client
outcome (e.g., achieving client’s ultimate goals).

Interrater agreement
Interrater agreement (IRA) for article inclusion was evaluated for 30% of the
articles in each journal by having a second reviewer independently examine
every article in an issue and identify those that met inclusion criteria.
Agreements and disagreements were compared across the two reviewers.
Inclusion IRA was calculated by dividing the number of agreements by the
number of agreements plus disagreements and then multiplying by 100 for
percentage. All disagreements were discussed between the reviewers until an
agreement was reached. Initial IRA averaged 99% (range, 97 to 99%) across the
three journals. After further discussion and agreement was reached among the
reviewers, IRA for inclusion was adjusted to 100% for the three journals.
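The percentage-agreement calculation described above can be sketched as a short script. This is an illustration only; the function name and the example counts are hypothetical, but the formula is the one the authors report (agreements divided by agreements plus disagreements, multiplied by 100).

```python
def interrater_agreement(agreements: int, disagreements: int) -> float:
    """Return percentage agreement between two independent reviewers."""
    total = agreements + disagreements
    if total == 0:
        raise ValueError("No items were compared.")
    # Agreements over total compared items, expressed as a percentage.
    return 100.0 * agreements / total
```

For instance, 97 agreements and 3 disagreements across 100 compared items yield an IRA of 97.0%.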
IRA was also assessed for the classification of categories across all mea-
sures compared. Identical, but separate, Microsoft Excel™ spreadsheets were
used by primary and secondary reviewers. Once the data were entered by
both reviewers, item-by-item comparisons across the different measures were
analyzed. Of the 73 total articles included, 25 (34%) were independently
reviewed by a second reviewer. Initial IRA averaged 94% (range, 84 to
100%). For all disagreements the reviewers had further discussion until an
adjusted IRA of 100% was obtained.

Results
From 1990 to 2016, 2,104 articles were published in JABA and 488 articles
were published in JOBM. From 2008 to 2016, 250 articles were published in
BAP. Of those articles, 73 articles met the inclusion criteria for this review, 44
from JABA (2.1% of total JABA articles), 22 from JOBM (4.5% of total JOBM
articles), and 7 from BAP (2.8% of total BAP articles). Figure 1 depicts a
cumulative graph of the OBM in human services articles included from each
journal. JABA averaged 1.6 articles on OBM in human services published per
year, JOBM averaged 0.81 per year, and BAP averaged 0.78 per year. The
articles included in the review can be found in Appendix A.

Client population, setting of client services, and employee population


Table 1 displays the client population, setting of client services, and employee
population targeted for the studies included in the review.

Client population
The top panel of Table 1 displays the results for all three journals reviewed for the
client population in the study. Individuals diagnosed with intellectual disabilities
(e.g., autism) were the most common client population for all three journals,
found in 35 out of 44 (79.5%) JABA studies, 19 out of 22 (86.4%) JOBM studies,

[Figure 1: cumulative publication counts by year (1990 to 2016) for JABA, JOBM, and BAP.]

Figure 1. Cumulative number of articles on organizational behavior management in human service settings that met the inclusion criteria for this review from 1990 to 2016.

Table 1. Setting of Client Services, Client Population, and Staff for Articles Reviewed
                              JABA (44)        JOBM (22)        BAP (7)
                              Percent  Num.    Percent  Num.    Percent  Num.
Client Population
Intellectual Disability       79.5%    35      86.4%    19      100%     7
Brain Injury                  9.1%     4       9.1%     2       0.0%     0
Elders & Dementia             2.3%     1       9.1%     2       14.3%    1
Mental Illness                4.5%     2       4.5%     1       0.0%     0
Other                         15.9%    7       4.5%     1       0.0%     0
Setting of Client Services
School                        47.7%    21      13.6%    3       28.6%    2
Day Treatment Center          18.2%    8       18.2%    4       71.4%    5
Group Home                    18.2%    8       45.5%    10      0.0%     0
Residential Facility          18.2%    8       50.0%    11      0.0%     0
Healthcare Setting            4.5%     2       4.5%     1       0.0%     0
Nursing Home                  0.0%     0       4.5%     1       0.0%     0
Other                         4.5%     2       4.5%     1       0.0%     0
Unclear                       2.3%     1       0.0%     0       0.0%     0
Employee Population
Direct Care Staff             40.9%    18      63.6%    14      57.1%    4
Supervisors                   13.6%    6       31.8%    7       14.3%    1
Managers                      0.0%     0       18.2%    4       0.0%     0
Students                      4.5%     2       9.1%     2       28.6%    2
Non-ABA Professionals         2.3%     1       18.2%    4       0.0%     0
Teachers                      45.5%    20      9.1%     2       28.6%    2
Nurses                        4.5%     2       9.1%     2       0.0%     0
Other                         4.5%     2       4.5%     1       14.3%    1

and 7 out of 7 (100%) BAP studies. JABA also included four studies with clients
with traumatic brain injury (9.1%), one with older adults (2.3%), and two with
individuals diagnosed with mental illness (4.5%). Seven studies included indivi-
duals with other diagnoses (15.9%), for example, attention deficit hyperactivity
disorder (ADHD) or severe physical disabilities. JOBM included two studies with clients with traumatic brain injuries (9.1%), two with older adults (9.1%), one with individuals diagnosed with mental illness (4.5%), and one categorized as other (4.5%). BAP included one study with older adults (14.3%).

Setting of client services


The middle panel of Table 1 depicts the results for client setting for the three
journals reviewed. Schools were the most common setting for studies pub-
lished in JABA (21 of 44, 47.7%), followed by residential centers (18.2%), day
treatment centers (18.2%), and group homes (18.2%), each found in eight
studies. Two studies were conducted in healthcare settings (4.5%), two were
categorized as other (4.5%), and one was unclear (2.3%). Residential centers
were the most common setting in JOBM (11 of 22, 50.0%), followed closely by
group homes (10 of 22, 45.5%). Four studies were conducted in day treat-
ment centers (18.2%), three in schools (13.6%), one in a healthcare setting
(4.5%), one in a nursing home (4.5%), and one in a setting categorized as
other (4.5%). Some studies included more than one setting. For example,
Williams, Vittorio, and Hausherr (2003) conducted two experiments across
three settings (i.e., group homes, a residential facility, and a day treatment
center). Five of the studies in BAP were conducted in day treatment centers
(71.4%) and two in schools (28.6%).

Employee population
The bottom panel of Table 1 displays the results for the employees targeted in
the studies for all three journals. Twenty studies in JABA targeted teachers
(45.5%) and 18 targeted direct care staff (40.9%). Six studies targeted super-
visors (13.6%), two targeted nurses (4.5%), two targeted students (4.5%), one
targeted non-ABA professionals like speech pathologists (2.3%), and two
targeted participants that did not fit the defined categories (4.5%). Fourteen
studies in JOBM targeted direct care staff (63.6%), seven targeted supervisors
(31.8%), and four targeted managers (18.2%). Two JOBM studies focused on
teacher behaviors (9.1%) and two measured nurse behaviors (9.1%). Four
studies were categorized as non-ABA professionals such as occupational
therapists and psychologists (18.2%), two as students (9.1%), and one as
other (4.5%). Four studies in BAP targeted direct care staff (57.1%), two
targeted teachers (28.6%), two targeted students (28.6%), and one also
included supervisors (14.3%). One study targeted participants categorized
as other (14.3%).

Preintervention behavioral assessment


Out of 44 JABA studies reviewed, three included a preintervention behavioral
assessment (6.8%). One study used the Performance Diagnostic Checklist–
Human Services (PDC–HS), one used a descriptive assessment, and one used
a descriptive assessment along with other assessment tools such as interviews.

Four out of 22 (18.2%) studies reviewed in JOBM included a preintervention behavioral assessment; the assessments included the Performance Diagnostic Checklist (PDC), the PDC–HS, surveys and interviews, and the PIC/NIC Analysis. Two out of 7 (28.6%) studies reviewed in BAP included assessments, which were the PDC and the PDC–HS.

Dependent variable, independent variable, and client outcome measures


Tables 2 and 3, respectively, display the data for dependent and independent
variables as well as client behavior and outcome measures for all three
journals.

Dependent variable
Treatment integrity was the most common dependent variable targeted in all
three journals, addressed in 35 out of 44 (79.5%) studies in JABA, 8 out of 22
(36.4%) studies in JOBM, and 3 out of 7 (42.9%) studies in BAP.
Administrative and staff management was tied with treatment integrity in
JOBM with 8 out of 22 (36.4%) studies with that target. Only three studies in
JABA (6.8%) and no studies in BAP (0%) targeted administrative and staff
management. Safety procedures were the second most common target in
JABA (7 out of 44, 15.9%), but were only studied once each in JOBM (4.5%)
and BAP (14.3%). Engagement and job enjoyment was the second most
common target in BAP (2 out of 7, 28.6%) and was targeted five times
each in JABA (11.4%) and JOBM (22.7%). Articles reported research related
to preparation tasks twice in JABA (4.5%), three times in JOBM (13.6%), and

Table 2. Dependent Variables and Independent Variables for Articles Reviewed


JABA (44) JOBM (22) BAP (7)
Percent Num. Percent Num. Percent Num.
DVs
Treatment Integrity 79.5% 35 36.4% 8 42.9% 3
Safety 15.9% 7 4.5% 1 14.3% 1
Engagement 11.4% 5 22.7% 5 28.6% 2
Administrative and Staff Management 6.8% 3 36.4% 8 0.0% 0
Preparation 4.5% 2 13.6% 3 14.3% 1
Other 0.0% 0 0.0% 0 14.3% 1
Attendance/Turnover 0.0% 0 9.1% 2 0.0% 0
IVs
Training and/or Antecedents 97.7% 43 63.6% 14 100% 7
Feedback and/or Praise 75% 33 81.8% 18 85.7% 6
Monitoring or Observations by Self/Others 27.3% 12 36.4% 8 14.3% 1
Goal-Setting 15.9% 7 45.5% 10 0.0% 0
Monetary Rewards 2.3% 1 13.6% 3 0.0% 0
Non-Monetary Rewards 0.0% 0 13.6% 3 0.0% 0
Systems re-design 2.3% 1 22.7% 5 0.0% 0
Punishment or Negative Reinforcement 6.8% 3 0.0% 0 0.0% 0
Other 2.3% 1 0.0% 0 0.0% 0

Table 3. Percentage of Treatment Integrity and Engagement Studies That Included Outcome Measures
                                          JABA             JOBM             BAP
Dependent Variable                        Percent  Num.    Percent  Num.    Percent  Num.
Treatment Integrity                       (Total of 35)    (Total of 8)     (Total of 3)
  Client Behaviors Measured               34.3%    12      37.5%    3       0%       0
Engagement                                (Total of 5)     (Total of 5)     (Total of 2)
  Client Behaviors Measured               100%     5       60%      3       100%     2
Studies with Behavioral Measures of       (Total of 17)    (Total of 6)     (Total of 2)
  Treatment Integrity or Engagement
  that Also Measured Client Outcomes      17.6%    3       33.3%    2       50%      1

once in BAP (14.3%). Two articles focused on attendance/turnover in JOBM


(9.1%) and there were no articles on attendance/turnover in JABA (0%) or
BAP (0%).

Independent variable
Forty-three of the studies published in JABA included antecedent and train-
ing interventions (97.7%) and 33 included feedback and praise (75%). Twelve
studies included observation as an intervention (27.3%), seven included goal-
setting (15.9%), three included punishment or negative reinforcement (6.8%),
and monetary incentives (2.3%) and systems analysis (2.3%) were each used
once. In JOBM, 18 studies included feedback or praise (81.8%), 14 included
antecedents and training (63.6%), and eight included observations (36.4%).
Five studies included systems redesign (22.7%), three included monetary
rewards (13.6%), and three included nonmonetary rewards (13.6%). In
BAP, all seven studies included training and antecedents (100%), six included
feedback and praise (85.7%), and one included peer- or self-observa-
tion (14.3%).

Client behavior and outcome measures


In the 35 studies that targeted treatment integrity in JABA, 12 reported
quantitative client behavior measures (34.3%). Three out of eight (37.5%)
studies on treatment integrity in JOBM reported client behavior measures.
No studies that targeted treatment integrity in BAP reported quantitative
client behavior measures.
In JABA, all five of the studies on engagement included a measure of client
behavior (100%). In JOBM, three of the five studies that evaluated an inter-
vention to improve engagement also included a measure of client behavior
(60%). In BAP, both studies that measured engagement also measured client
behavior (100%).
Of the JABA, JOBM, and BAP studies that targeted treatment integrity and/or engagement and reported measures of client behavior (e.g., rate of problem behavior), 3 of 17, 2 of 6, and 1 of 2, respectively, also reported client outcome measures (e.g., achieving the client’s ultimate goals).

Discussion
This literature review demonstrates that a significant amount of research has
been conducted on using OBM in HSS. Much of the research has focused on
improving treatment integrity in schools, day treatment centers, and residential
facilities. Training and feedback were commonly used interventions across all
three journals. A low percentage of studies included preintervention assessments and client outcome data, when appropriate. In addition, a small percen-
tage of studies targeted improvements with managers or supervisors while a
higher percentage targeted direct care staff and teachers.
The outcomes of this review supported several recommendations previously
made for OBM in human service research. First, OBM interventions in HSS
should focus on changing the behavior of supervisors and managers rather than
direct care workers (Methot, Williams, Cummings, & Bradshaw, 1996). In the
current review, close to 50% of the studies published in JOBM were in line with
this recommendation, while considerably fewer in JABA or BAP targeted these
employee populations. Intervening with staff who directly implement behavioral
programs may limit the maintenance and generalization of these interventions
over the long term at a company-wide level, especially given the high turnover
rate of direct service providers in HSS (Leblanc et al., 2009). Therefore, research-
ers are encouraged to incorporate supervisors and managers into their studies.
Methot et al. (1996) provide a useful example of
incorporating leaders into OBM research in HSS.
Second, researchers have previously recommended that research should
expand upon the specific staff behaviors addressed (Reid & Parsons, 2000;
Sturmey, 1998). We found that research most often focused on treatment fidelity
or client engagement. Administration, staff management, and safety were
each addressed moderately often in one journal, but rarely in the other
journals. Studies on attendance/turnover and preparation were largely absent
across all journals. Although poor treatment fidelity is a major barrier to
providing effective services, so too is the high turnover rate of staff in HSS
and the cascading effect of inefficient administrative practices. Moreover,
ineffective staff management procedures may directly contribute to poor treatment
fidelity. Therefore, researchers should consider addressing these more sys-
temic issues and evaluating the impact on treatment fidelity and outcomes.
Sturmey (1998) also called for expansion of OBM in human service research
to settings other than residential centers and schools. While these two settings
were the most researched, day treatment centers and group homes were not far
behind. Given the emphasis upon these locations, it was not surprising that the
most-studied client population, by far, was individuals with intellectual delay or
disabilities (e.g., autism).

JOURNAL OF ORGANIZATIONAL BEHAVIOR MANAGEMENT 13

While many behavior analysts do work with this
population, there are numerous other populations served through human ser-
vice organizations. For example, researchers could expand into healthcare set-
tings, nursing homes, and job-training settings, which are likely to serve clients
suffering from dementia, traumatic brain injury, or mental illness. Agencies
serving these populations may attract staff who are very different from those
working with children with autism. For example, many staff working with
children with autism have formal education in ABA; this is far less likely to be
the case in other human service settings. This may, in turn, necessitate targeting
a wider variety of employee behaviors or incorporating less frequently studied
OBM interventions.
We also have several new recommendations based on the current review.
First, technological imprecision was a common theme across the articles
reviewed. In some cases, procedures were described but not named. Other
times, the same name was applied to different procedural variations. For
example, most studies described the form of feedback as verbal or visual.
However, the type of feedback (corrective alone, praise alone, or a
combination of the two) was often unclear. When training packages were used,
the specific components were often difficult to decipher (e.g., were checklists
used as antecedent interventions or as part of a comprehensive training
package?). Often, the description of components in a training package
resembled behavioral skills training (BST; Reid & Parsons, 2000), but this
method was rarely named as an intervention. This made it unclear if the
evidence-based technology or its individual components were used, an
important distinction given their differences in effectiveness. Finally, it was
often unclear if feedback was used as a stand-alone intervention or as part of
a comprehensive training package. From a research perspective, these incon-
sistencies made coding the interventions difficult. From a practical perspec-
tive, this could lead to inaccurate replication of the procedures, which could
seriously impact treatment outcomes. Greater precision in describing
research procedures could result in easier, more effective replication.
Finally, there were two important areas that were not addressed in many of
the studies reviewed. First, in each journal only 6.7 to 28.6% of the studies used
preintervention assessments to match the interventions to the function of the
challenging staff behavior. A functional-analytic approach to improving
problematic behaviors with clinical clients is not only common, but is
considered best practice; the same approach for improving problematic
behaviors with staff was rarely used. This may be, in part, due to the fact
that the PDC and the PDC–HS
were established in the 2000s and this review started with studies published in
1990. However, a scan of the research conducted after the establishment of these
tools reveals that assessment is still not common practice. Better tools and more
research on when and how to use these tools could strengthen the use of
assessment in OBM in HSS research. Second, comparably low percentages of
studies reported simultaneously measuring client behaviors and/or outcomes
when targeting treatment fidelity or engagement. The main reason to improve
staff performance should be to improve outcomes for consumers or customers;
in human service settings, this means improving client behaviors and
outcomes. If these client
variables are not simultaneously measured, we cannot determine the social
significance of the intervention nor its impact upon valued business results.
In summary, considering previous suggestions (e.g., Reid & Parsons, 2000;
Sturmey, 1998) in combination with the present results, the following recom-
mendations are made to researchers conducting studies to improve staff
behavior in HSS:

(1) Use a top-down approach when selecting employee participants and
target behaviors. Training supervisors and managers, who can in turn
train front-line staff, may be more efficient and promote generalized,
sustained change.
(2) Address the full range of problematic staff behaviors that are relevant
to improving business and client outcomes in HSS (e.g., attendance/
turnover, preparation).
(3) Although autism service agencies may be numerous and convenient,
conduct studies across the full range of HSS that serve clients of
varying populations.
(4) Provide the name of established treatment packages (e.g., BST) and/or
describe individual treatment components with enough technological
precision that human service professionals can replicate them without
guidance from an experienced researcher. Consider using flowcharts
or supplemental online materials like videos whenever possible to
enhance clarity.
(5) Consistently use preintervention assessments to drive a function-
based approach to selecting intervention techniques. This may lead
to a wider range of intervention procedures examined in research.
(6) Develop more preintervention assessment tools and conduct research
to clarify best practices in assessment (e.g., who should be interviewed
when using the PDC).
(7) Regularly report client measures and outcomes in addition to depen-
dent measures of staff behavior to validate the intervention’s social
significance and impact upon business results.

Limitations
One limitation of this review is that only three journals (JABA, JOBM, and
BAP) were included and, therefore, it is not exhaustive. It is possible that the
preferences of the editors and reviewers in each of the journals impacted the
findings. Other journals, including Behavioral Interventions, Behavior
Modification, and Research in Developmental Disabilities, should also be
reviewed to build upon these findings. In addition, research that targeted
typically developing individuals in HSS was excluded. Further, a review of
OBM research in schools could enhance the literature.
A second limitation is that although 27 years of research were included, no
trends across time were reported, except the cumulative number of publications.
It is possible that some of the suggestions for future research are already
being addressed more frequently in the most recent OBM in HSS research. For example,
the PDC was published in 2000 and the PDC–HS was published in 2013.
These were the most commonly employed assessment tools and therefore,
likely increased the use of assessment in these settings. However, anecdotally,
we still observed room for improvement in each of these areas in the most
recent research. Finally, because BAP began in 2008, the data from BAP
represent only eight years of research instead of 27. This may skew comparisons
across journals, since BAP will only include more contemporary research and
procedures. We decided to include BAP because its mission is arguably the
most aligned with this area of research.
As with any review, the findings may be skewed due to publication bias. In
human service settings, it can be challenging to conduct OBM research with
enough experimental rigor to warrant publication in more selective journals
like the ones included in this review. In addition, journals often only publish
positive results. Reviewing journals that do not require the same rigor as
those selected for this review may help provide more balanced findings. In
addition, including theses and dissertations may allow research with null
effects to be incorporated into the findings. Nonetheless, this review offers a
starting point for synthesizing the literature.
Lastly, this review did not include measures of quality in terms of the measure-
ment procedures, experimental designs, or intervention effectiveness. It also did
not include information about social validity or maintenance. All of these are
important factors in understanding OBM in HSS interventions and future
reviews should examine them. Going forward, reporting effect sizes in research
should also be encouraged to facilitate evaluations of intervention effectiveness.

Conclusion
In conclusion, a large literature examining OBM techniques in HSS exists.
While this research area is robust, there is a need to expand the populations
and settings included, the use of assessments to determine interventions,
intervention strategies, and dependent variables targeted. We hope this
review will encourage researchers to expand the scope of their OBM research
in HSS.

ORCID
Nicole Gravina http://orcid.org/0000-0001-8210-7159

References
Alavosius, M. P., & Sulzer-Azaroff, B. (1990). Acquisition and maintenance of health-care
routines as a function of feedback density. Journal of Applied Behavior Analysis, 23, 151–
162. doi:10.1901/jaba.1990.23-151
Alvero, A. M., Bucklin, B. R., & Austin, J. (2001). An objective review of the effectiveness and
essential characteristics of performance feedback in organizational settings. Journal of
Organizational Behavior Management, 21, 3–29. doi:10.1300/J075v21n01_02
Arco, L. (1997). Improving program outcome with process-based performance feedback.
Journal of Organizational Behavior Management, 17, 37–64. doi:10.1300/
J075v17n01_03
Austin, J. (2000). Performance analysis and performance diagnostics. In J. Austin & J.
E. Carr (Eds.), Handbook of applied behavior analysis (pp. 321–350). Reno, NV: Context
Press.
Babcock, R. A., Sulzer-Azaroff, B., & Sanderson, M. (1992). Increasing nurses’ use of feedback
to promote infection control practices in a head-injury treatment center. Journal of Applied
Behavior Analysis, 25, 621–627. doi:10.1901/jaba.1992.25-621
Belisle, J., Rowsey, K. E., & Dixon, M. R. (2016). The use of in situ behavioral skills training to
improve staff implementation of the PEAK relational training system. Journal of
Organizational Behavior Management, 36, 71–79. doi:10.1080/01608061.2016.1152210
Boudreau, C. A., Christian, W. P., & Thibadeau, S. (1993). Reducing absenteeism in a human
service setting. Journal of Organizational Behavior Management, 13, 37–50. doi:10.1300/
J075v13n02_04
Bucklin, B. R., Alvero, A. M., Dickinson, A. M., Austin, J., & Jackson, A. K. (2000). Industrial-
organizational psychology and organizational behavior management: An objective com-
parison. Journal of Organizational Behavior Management, 20, 27–75.
Carr, J. E., Wilder, D. A., Majdalany, L., Mathisen, D., & Strain, L. A. (2013). An assessment-
based solution to a human-service employee performance problem: An initial evaluation of
the performance diagnostic checklist–human services. Behavior Analysis in Practice, 6,
16–32. doi:10.1007/BF03391789
Casella, S. E., Wilder, D. A., Neidert, P., Rey, C., Compton, M., & Chong, I. (2010). The
effects of response effort on safe performance by therapists at an autism treatment facility.
Journal of Applied Behavior Analysis, 43, 729–734. doi:10.1901/jaba.2010.43-729
Casey, A. M., & McWilliam, R. A. (2011). The impact of checklist-based training on teachers’
use of the zone defense schedule. Journal of Applied Behavior Analysis, 44, 397–401.
doi:10.1901/jaba.2011.44-397
Catania, C. N., Almeida, D., Liu-Constant, B., & DiGennaro-Reed, F. D. (2009). Video modeling to
train staff to implement discrete-trial instruction. Journal of Applied Behavior Analysis, 42,
387–392. doi:10.1901/jaba.2009.42-387
Chok, J. T., Shlesinger, A., Studer, L., & Bird, F. L. (2012). Description of a practitioner
training program on functional analysis and treatment development. Behavior Analysis in
Practice, 5, 25–36. doi:10.1007/BF03391821
Codding, R. S., Feinberg, A. B., Dunn, E. K., & Page, G. M. (2005). Effects of immediate
feedback on implementation of behavior support plans. Journal of Applied Behavior
Analysis, 38, 205–219. doi:10.1901/jaba.2005.98-04
Codding, R. S., Livanis, A., Pace, G. M., & Vaca, L. (2008). Using performance feedback to
improve treatment integrity of class wide behavior plans: An investigation of observer
reactivity. Journal of Applied Behavior Analysis, 41, 417–422. doi:10.1901/jaba.2008.41-417
Collins, S., Higbee, T. H., & Salzberg, C. L. (2009). The effects of video modeling on staff
implementation of a problem-solving intervention with adults with developmental disabil-
ities. Journal of Applied Behavior Analysis, 42, 849–854. doi:10.1901/jaba.2009.42-849
Cook, T., & Dixon, M. R. (2006). Performance feedback and probabilistic bonus contingen-
cies among employees in a human service organization. Journal of Organizational Behavior
Management, 25, 45–63. doi:10.1300/J075v25n03_04
Daniels, A. C., & Bailey, J. S. (2014). Performance management: Changing behavior that drives
organizational effectiveness. Atlanta, Georgia: Aubrey Daniels International Inc.
Dib, N., & Sturmey, P. (2007). Reducing student stereotypy by improving teachers' imple-
mentation of discrete-trial teaching. Journal of Applied Behavior Analysis, 40, 339–343.
doi:10.1901/jaba.2007.52-06
Diener, L. H., McGee, H. M., & Miguel, C. F. (2009). An integrated approach for conducting
a behavioral systems analysis. Journal of Organizational Behavior Management, 29, 108–
135. doi:10.1080/01608060902874534
DiGennaro-Reed, F. D., Codding, R., Catania, C. N., & Maguire, H. (2010). Effects of video
modeling on treatment integrity of behavioral interventions. Journal of Applied Behavior
Analysis, 43, 291–295. doi:10.1901/jaba.2010.43-291
DiGennaro-Reed, F. D., Martens, B. K., & Kleinmann, A. E. (2007). A comparison of
performance feedback procedures on teachers’ treatment implementation integrity and
students’ inappropriate behavior in special education classrooms. Journal of Applied
Behavior Analysis, 40, 447–461. doi:10.1901/jaba.2007.40-447
Ditzian, K., King, A., Tanz, J., & Wilder, D. (2015). An evaluation of the performance
diagnostic checklist-human services to assess an employee performance problem in a
center-based autism treatment facility. Journal of Applied Behavior Analysis, 48, 199–
203. doi:10.1002/jaba.171.
Ducharme, J. M., & Feldman, M. A. (1992). Comparison of staff training strategies to
promote generalized teaching skills. Journal of Applied Behavior Analysis, 25, 165–179.
doi:10.1901/jaba.1992.25-165
Engleman, K. K., Altus, D. E., Mosier, M. C., & Mathews, R. M. (2003). Brief training to
promote the use of less intrusive prompts by nursing assistants in a dementia care unit.
Journal of Applied Behavior Analysis, 36, 129–132. doi:10.1901/jaba.2003.36-129
Este, S. (2007). The challenges of accountability in human services: Performance management
in the adult protective services program of Texas. Retrieved from https://digital.library.txstate.edu/handle/10877/3527
Fleming, R., & Sulzer-Azaroff, B. (1992). Reciprocal peer management: Improving staff
instruction in a vocational training program. Journal of Applied Behavior Analysis, 25,
611–620. doi:10.1901/jaba.1992.25-611
Fleming, R. K., Oliver, J. R., & Bolton, D. M. (1996). Training supervisors to train staff.
Journal of Organizational Behavior Management, 16, 3–25. doi:10.1300/J075v16n01_02
Giannakakos, A. R., Vladescu, J. C., Kisamore, A. N., & Reeve, S. A. (2016). Using video
modeling with voiceover instructions plus feedback to train staff to implement direct
teaching procedures. Behavior Analysis in Practice, 9, 126–134. doi:10.1007/s40617-015-
0097-5
Gil, P. J., & Carter, S. L. (2016). Graphic feedback, performance feedback, and goal setting
increased staff compliance with a data collection task at a large residential facility. Journal
of Organizational Behavior Management, 36, 56–70. doi:10.1080/01608061.2016.1152207
Graff, R. B., & Karsten, A. M. (2012). Evaluation of a self-instruction package for conducting
stimulus preference assessments. Journal of Applied Behavior Analysis, 45, 69–82.
doi:10.1901/jaba.2012.45-69
Gravina, N., VanWagner, M., & Austin, J. (2008). Increasing physical therapy equipment
preparation using task clarification, feedback, and environmental manipulations. Journal of
Organizational Behavior Management, 28, 110–122. doi:10.1080/01608060802100931
Green, C. W., Reid, D. H., Passante, S., & Canipe, V. (2008). Changing less-preferred duties to
more-preferred: A potential strategy for improving supervisor work enjoyment. Journal of
Organizational Behavior Management, 28, 90–109. doi:10.1080/01608060802100899
Green, C. W., Rollyson, J. H., & Passante, S. C. (2002). Maintaining proficient supervisor
performance with direct support personnel: An analysis of two management approaches.
Journal of Applied Behavior Analysis, 35, 205–208. doi:10.1901/jaba.2002.35-205
Guercio, J. M., & Dixon, M. R. (2010). Improving the quality of staff and participant
interaction in an acquired brain injury organization. Journal of Organizational Behavior
Management, 30, 49–56. doi:10.1080/01608060903529780
Guercio, J. M., & Dixon, M. R. (2011). The observer effect and its impact on staff behavior in
an acquired brain injury neurobehavioral treatment setting. Journal of Organizational
Behavior Management, 31, 43–54. doi:10.1080/01608061.2010.520142
Harchik, A. E., Sherman, J. A., Sheldon, J. B., & Strousse, M. C. (1992). Ongoing consulta-
tions as a method of improving performance of staff members in a group home. Journal of
Applied Behavior Analysis, 25, 599–610. doi:10.1901/jaba.1992.25-599
Hawkins, A. M., Burgio, L. D., Langford, A., & Engel, B. T. (1993). The effects of verbal and
written supervisory feedback on staff compliance with assigned prompted voiding in a
nursing home. Journal of Organizational Behavior Management, 13, 137–150. doi:10.1300/
J075v13n01_09
Hendrickson, J. M., Gardner, N., Kaiser, A., & Riley, A. (1993). Evaluation of a social
interaction coaching program in an integrated day-care setting. Journal of Applied
Behavior Analysis, 26, 213–225. doi:10.1901/jaba.1993.26-213
Huberman, W. L., & O’Brien, R. M. (1999). Improving therapist and patient performance in
chronic psychiatric group homes through goal-setting, feedback, and positive reinforce-
ment. Journal of Organizational Behavior Management, 19, 13–36. doi:10.1300/
J075v19n01_04
Hundert, J., & Hopkins, B. (1992). Training supervisors in a collaborative team approach to
promote peer interactions of children with disabilities in integrated preschools. Journal of
Applied Behavior Analysis, 25, 385–400. doi:10.1901/jaba.1992.25-385
Kneringer, M., & Page, T. J. (1999). Improving staff nutritional practices in community-based
group homes: Evaluation, training, and management. Journal of Applied Behavior Analysis,
32, 221–224. doi:10.1901/jaba.1999.32-221
Lalli, J. S., Browder, D. M., Mace, F. C., & Brown, D. K. (1993). Teacher use of descriptive
analysis data to implement interventions to decrease students’ problem behaviors. Journal
of Applied Behavior Analysis, 26, 227–238. doi:10.1901/jaba.1993.26-227
Lambert, J. M., Bloom, S. E., Kunnavatana, S. S., Collins, S. D., & Clay, C. J. (2013).
Training residential staff to conduct trial-based functional analyses. Journal of Applied
Behavior Analysis, 46, 296–300. doi:10.1002/jaba.17
Langeland, K. L., Johnson, C. M., & Mawhinney, T. C. (1997). Improving staff performance
in a community mental health setting: Job analysis, training, goal setting, feedback, and
years of data. Journal of Organizational Behavior Management, 18, 21–43. doi:10.1300/
J075v18n01_03
Lavie, T., & Sturmey, P. (2002). Training staff to conduct a paired-stimulus preference
assessment. Journal of Applied Behavior Analysis, 35, 209–211. doi:10.1901/jaba.2002.35-
209
Lebbon, A., Austin, J., Rost, K., & Stanley, L. (2011). Improving safe consumer transfers in a
day treatment setting using training and feedback. Behavior Analysis in Practice, 4, 35–
43. doi:10.1007/BF03391782
Leblanc, L. A., Gravina, N., & Carr, J. E. (2009). Training issues unique to autism spectrum
disorder. In J. Matson (Ed.), Practitioner’s guide to applied behavior analysis for children
with autism spectrum disorders (pp. 225–235). New York, NY: Springer.
Lerman, D. C., Tetreault, A., Hovanetz, A., Strobel, M., & Garro, J. (2008). Further evaluation
of a brief, intensive, teacher-training model. Journal of Applied Behavior Analysis, 41, 243–
248. doi:10.1901/jaba.2008.41-243
Mayer, K. L., & DiGennaro-Reed, F. D. (2013). Effects of a training package to improve the
accuracy of descriptive analysis data recording. Journal of Organizational Behavior
Management, 33, 226–243. doi:10.1080/01608061.2013.843431
McGimsey, J. F., Greene, B. F., & Lutzker, J. R. (1995). Competence in aspects of behavioral
treatments and consultation: Implications for service delivery and graduate training.
Journal of Applied Behavior Analysis, 28, 301–315. doi:10.1901/jaba.1995.28-301
Methot, L. L., Williams, W. L., Cummings, A., & Bradshaw, B. (1996). Measuring the effects
of a manager-supervisor training program through the generalized performance of man-
agers, supervisors, front-line staff and clients in a human service setting. Journal of
Organizational Behavior Management, 16, 3–34. doi:10.1300/J075v16n02_02
Miller, M. V., Clarson, J., & Sigurdsson, S. O. (2014). Improving treatment integrity in a
human service setting using lottery-based incentives. Journal of Organizational Behavior
Management, 34, 29–38. doi:10.1080/01608061.2013.873381
Moore, J. W., Edwards, R. P., Sterling-Turner, H. E., Riley, J., DuBard, M., & McGeorge, A.
(2002). Teacher acquisition of functional analysis methodology. Journal of Applied
Behavior Analysis, 35, 73–77. doi:10.1901/jaba.2002.35-73
Moore, J. W., & Fisher, W. W. (2007). The effects of videotape modeling on staff acquisition
of functional analysis methodology. Journal of Applied Behavior Analysis, 40, 197–202.
doi:10.1901/jaba.2007.24-06
Mozingo, D. B., Smith, T., Riordan, M. R., & Bailey, J. S. (2006). Enhancing frequency
recording by developmental disabilities treatment staff. Journal of Applied Behavior
Analysis, 39, 253–256. doi:10.1901/jaba.2006.55-05
Nabeyama, B., & Sturmey, P. (2010). Using behavioral skills training to promote safe and
correct staff guarding and ambulation distance of students with multiple physical
disabilities. Journal of Applied Behavior Analysis, 43, 341–345. doi:10.1901/
jaba.2010.43-341
Neef, N. A., Trachtenberg, S., Loeb, J., & Sterner, K. (1991). Video-based training of respite
care providers: An interactional analysis of presentation format. Journal of Applied
Behavior Analysis, 24, 473–486. doi:10.1901/jaba.1991.24-473
Nigro-Bruzzi, D., & Sturmey, P. (2010). The effects of behavioral skills training on mand
training by staff and unprompted vocal mands by children. Journal of Applied Behavior
Analysis, 43, 757–761. doi:10.1901/jaba.2010.43-757
Nolan, T. V., Jarema, K. A., & Austin, J. (1999). An objective review of the Journal of
Organizational Behavior Management. Journal of Organizational Behavior Management,
20, 69–90.
Northup, J., Wacker, D. P., Berg, W. K., Kelley, L., Sasso, G., & DeRaad, A. (1994). The
treatment of severe behavior problems in school setting using a technical assistance model.
Journal of Applied Behavior Analysis, 27, 33–47. doi:10.1901/jaba.1994.27-33
Parsons, M. B., Bentley, E., Solari, T., & Reid, D. H. (2016). Familiarizing new staff for
working with adults with severe disabilities: A case for relationship building. Behavior
Analysis in Practice, 9, 211–222. doi:10.1007/s40617-016-0129-9
Parsons, M. B., & Reid, D. H. (1995). Comparing choice and questionnaire measures of the
acceptability of a staff training procedure. Journal of Applied Behavior Analysis, 28, 95–96.
doi:10.1901/jaba.1995.28-95
Parsons, M. B., Rollyson, J. H., & Reid, D. H. (2004). Improving day-treatment services for
adults with severe disabilities: A norm-referenced application of outcome management.
Journal of Applied Behavior Analysis, 37, 365–377. doi:10.1901/jaba.2004.37-365
Parsons, M. B., Rollyson, J. H., & Reid, D. H. (2012). Evidence-based staff training: A guide
for practitioners. Behavior Analysis in Practice, 5, 2–11. doi:10.1007/BF03391819
Peters-Scheffer, N., Didden, R., Korzilius, H., & Sturmey, P. (2011). A meta-analytic study on
the effectiveness of comprehensive ABA-based early intervention programs for children
with Autism Spectrum Disorder. Research in Autism Spectrum Disorder, 5(1), 60–69.
doi:10.1016/j.rasd.2010.03.011
Petscher, E. S., & Bailey, J. S. (2006). Effects of training, prompting, and self-monitoring on
staff behavior in a classroom for students with disabilities. Journal of Applied Behavior
Analysis, 39, 215–226. doi:10.1901/jaba.2006.02-05
Plavnick, J. B., Ferreri, S. J., & Maupin, A. N. (2010). The effects of self-monitoring on the
procedural integrity of a behavioral intervention for young children with developmental
disabilities. Journal of Applied Behavior Analysis, 43, 315–320. doi:10.1901/jaba.2010.43-
315
Reid, D. H., & Parsons, M. B. (2000). Organizational behavior management in human service
settings. In J. Austin & J. E. Carr (Eds.), Handbook of applied behavior analysis (pp.
275–294). Reno, NV: Context Press.
Riley, A. W., & Frederiksen, L. W. (1984). Organizational behavior management in human
service settings: Problems and prospects. Journal of Organizational Behavior Management,
5, 3–16. doi:10.1300/J075v05n03_01
Rosales, R., Stone, K., & Rehfeldt, R. A. (2009). The effects of behavioral skills training on
implementation of the picture exchange communication system. Journal of Applied
Behavior Analysis, 42, 541–549. doi:10.1901/jaba.2009.42-541
Roscoe, E. M., Fisher, W. W., Glover, A. C., & Volkert, V. M. (2006). Evaluating the relative
effects of feedback and contingent money for staff training on stimulus preference assess-
ments. Journal of Applied Behavior Analysis, 39, 63–77. doi:10.1901/jaba.2006.7-05
Sarokoff, R. A., & Sturmey, P. (2004). The effects of behavioral skills training on staff
implementation of discrete-trial teaching. Journal of Applied Behavior Analysis, 37, 535–
538. doi:10.1901/jaba.2004.37-535
Schepis, M. H., Reid, D. H., Ownbey, J., & Parsons, M. B. (2001). Training support staff to
embed teaching within natural routines of young children with disabilities in an inclusive
preschool. Journal of Applied Behavior Analysis, 34, 313–327. doi:10.1901/jaba.2001.34-313
Schepis, M. M., & Reid, D. H. (1995). Effects of a voice output communication aid on
interactions between support personnel and an individual with multiple disabilities.
Journal of Applied Behavior Analysis, 28, 73–77. doi:10.1901/jaba.1995.28-73
Shore, B. A., Iwata, B. A., Vollmer, T. R., Lerman, D. C., & Zarcone, J. R. (1995). Pyramidal
staff training in the extension of treatment for severe behavior disorders. Journal of Applied
Behavior Analysis, 28, 323–332. doi:10.1901/jaba.1995.28-323
Squires, A., & Wilder, D. A. (2010). A preliminary investigation of the effect of rules on
employee performance. Journal of Organizational Behavior Management, 30, 57–69.
doi:10.1080/01608060903529756
Strouse, M. C., Carroll-Hernandez, T. A., Sherman, J. A., & Sheldon, J. B. (2004). Turning
over turnover. Journal of Organizational Behavior Management, 23, 45–63. doi:10.1300/
J075v23n02_04
Sturmey, P. (1998). History and contribution of organizational behavior management to
service for persons with developmental disabilities. Journal of Organizational Behavior
Management, 18, 7–32. doi:10.1300/J075v18n02_02
Szabo, T. G., Williams, W. L., Rafacz, S. D., Newsome, W., & Lydon, C. A. (2012). Evaluation
of the service review model with performance scorecards. Journal of Organizational
Behavior Management, 32, 274–296. doi:10.1080/01608061.2012.729408
Towery, D., Parsons, M. B., & Reid, D. H. (2014). Increasing independence within adult
services: A program for reducing staff completion of daily routines for consumers with
developmental disabilities. Behavior Analysis in Practice, 7, 61–69. doi:10.1007/s40617-
014-0013-4
VanStelle, S. E., Vicars, S. M., Harr, V., Miguel, C. F., Koerber, J. L., Kazbour, R., & Austin, J.
(2012). The publication history of the Journal of Organizational Behavior Management: An
objective review and analysis: 1998–2009. Journal of Organizational Behavior
Management, 32, 93–123. doi:10.1080/01608061.2012.675864
Virues-Ortega, J. (2010). Applied behavior analytic intervention for autism in early child-
hood: Meta-analysis, meta-regression and dose-response meta-analysis of multiple out-
comes. Clinical Psychology Review, 30(4), 387–399. doi:10.1016/j.cpr.2010.01.008
Vladescu, J. C., Carroll, R., Paden, A., & Kodak, T. M. (2012). The effects of video modeling
with voiceover instruction on accurate implementation of discrete-trial instruction. Journal
of Applied Behavior Analysis, 45, 419–423. doi:10.1901/jaba.2012.45-419
Weldy, C. R., Rapp, J. T., & Capocasa, K. (2014). Training staff to implement brief stimulus
preference assessments. Journal of Applied Behavior Analysis, 47, 214–218. doi:10.1002/
jaba.98
Whiting, S. W., Miller, J. M., Hensel, A. M., Dixon, M. R., & Szekely, S. (2014). Increasing the
accuracy of EpiPen administration with a brief behavioral skills training package in a
school for autism. Journal of Organizational Behavior Management, 34, 265–278.
doi:10.1080/01608061.2014.973632
Wilder, D. A., Lipschultz, J. L., King, A., Driscoll, S., & Sigurdsson, S. (2018). An analysis of
the commonality and type of pre-intervention assessment procedures in the Journal of
Organizational Behavior Management (2000-2015). Journal of Organizational Behavior
Management, 38, 5–17.
Williams, W. L., & Gallinat, J. (2011). The effects of evaluating video examples of staffs’ own
versus others’ performance on discrete-trial training skills in a human service setting.
Journal of Organizational Behavior Management, 31, 97–116. doi:10.1080/
01608061.2011.570099
Williams, W. L., Vittorio, T. D., & Hausherr, L. (2003). A description and extension of a
human services management model. Journal of Organizational Behavior Management, 22,
47–71. doi:10.1300/J075v22n01_04
Appendix A

Columns: Authors | Setting of Services | Client Diagnosis | Participants (Employees Targeted) | Pre-Assessment | Types of IVs | Behaviors Measured (DVs) | Client Outcome Measures Taken

JABA

Alavosius & Sulzer-Azaroff (1990): Residential facility | Autism, developmental delay & severe problem behavior | Direct care staff | No | Feedback & praise, training & antecedents | Treatment integrity, safety | No

Babcock, Sulzer-Azaroff, & Sanderson (1992): Health care setting | Unclear | Nurses | No | Feedback & praise, goal-setting, training & antecedents | Safety | No

Casella, Wilder, Neidert, Rey, Compton, & Chong (2010): Day treatment center | Autism, developmental delay & severe problem behavior | Direct care staff | No | Training & antecedents | Safety | No

Casey & McWilliam (2011): School | Other (disabilities, not specified) | Teachers | No | Feedback & praise, training & antecedents | Treatment integrity | No

Catania, Almeida, Liu-Constant, & DiGennaro-Reed (2009): School | Autism, developmental delay & severe problem behavior | Direct care staff | No | Training & antecedents | Treatment integrity | No

Codding, Livanis, Pace, & Vaca (2008): School | Mental illness (ADHD, conduct disorder, anxiety disorder) | Teachers | No | Feedback & praise, monitoring or observations by self/others | Treatment integrity | No

Codding, Feinberg, Dunn, & Page (2005): School | Brain injury | Teachers | No | Feedback & praise | Treatment integrity | No

Collins, Higbee, & Salzberg (2009): Group home, residential facility | Autism, developmental delay & severe problem behavior | Direct care staff | No | Training & antecedents, other | Treatment integrity | No

Dib & Sturmey (2007): School | Autism, developmental delay & severe problem behavior | Teachers | Descriptive assessment | Feedback & praise, training & antecedents, monitoring or observations by self/others | Treatment integrity | No

DiGennaro-Reed, Codding, Catania, & Maguire (2010): Group home, school | Brain injury; Autism, developmental delay & severe problem behavior | Teachers | No | Feedback & praise, training & antecedents | Treatment integrity | No
problem behavior
DiGennaro-Reed, Martens, School Brain injury Teachers No Feedback & praise, goal-setting, Treatment integrity No
& Kleinmann (2007) training & antecedents, punishment
& negative reinforcement
Ditzian, Wilder, King, & Day Autism, Supervisors PDC-HS Feedback & praise, training & Safety No
Tanz (2015) treatment developmental antecedents
center delay & severe
problem behavior
Ducharme & Feldman Group home Autism, Direct care No Feedback & praise, training & Treatment integrity No
(1992) Developmental staff antecedents, monitoring or
delay & severe observation by self/others
problem behavior
Engleman, Altus, Mosier, & Nursing Elderly & dementia Nurses No Feedback & praise, training & Treatment integrity No
JOURNAL OF ORGANIZATIONAL BEHAVIOR MANAGEMENT

Mathews (2003) home antecedents

(Continued )
23
24

(Continued).
Client
Participants Outcome
Setting of (Employees Participant Behaviors Measures
Authors Services Client Diagnosis Targeted) Pre-Assessment Types of IV’s Measured (DV’s) Taken
Fleming & Sulzar-Azaroff Residential Autism, Direct care No Feedback & praise, goal-setting, Treatment integrity No
(1992) facility developmental staff training & antecedents, monitoring
delay & severe or observations by self/others
N. GRAVINA ET AL.

problem behavior
Graff & Karsten (2012) School Autism, Teachers No Training & antecedents Treatment integrity No
developmental
delay & severe
problem behavior
Green, Rollyson, & Passante Residential Other (severe Supervisors No Feedback & praise, training and Staff Management No
(2002) facility disabilities) antecedents,
monitoring or observation by self/
others
Harchik, Sherman, Sheldon, Group home Autism, Direct care No Feedback & praise, training & Treatment integrity No
& Strousse (1992) developmental staff antecedents
delay & severe
problem behavior
Hendrickson, Gardner, Day Autism, Teachers No Feedback & praise, goal-setting, Engagement No
Kaiser, & Riley (1993) treatment developmental training & antecedents, monitoring
center delay & severe or observations by self/others
problem behavior,
ADHD, cerebral
palsy
Hundert & Hopkins (1992) School Autism, Supervisors, No Feedback & praise, goal-setting, Staff management, No
developmental teachers training & antecedents, monitoring engagement
delay & severe or observations by self/others
problem behavior

(Continued )
(Continued).
Client
Participants Outcome
Setting of (Employees Participant Behaviors Measures
Authors Services Client Diagnosis Targeted) Pre-Assessment Types of IV’s Measured (DV’s) Taken
Kneringer & Page (1999) Group Home Autism, Direct care No Feedback & praise, training & Safety, Preparation No
developmental staff antecedents
delay & severe
problem behavior
Lalli, Browder, Mace, & School Autism, Teachers Problem- Feedback & praise, training & Treatment integrity Yes
Brown (1993) developmental Identification antecedents, other (error correction
delay & severe interview, Scatter procedure)
problem behavior Plot Analysis,
Narrative
Recordings,
Descriptive
Analysis
Lambert, Blooms, Group home Autism, Supervisors No Feedback & praise, training & Treatment integrity No
Kunnavantana, Collins, & developmental antecedents
Clay (2013) delay & severe
problem behavior
Lavie & Sturmey (2002) School Autism, Teachers No Feedback & praise training & Treatment integrity No
developmental antecedents
delay & severe
problem behavior
Lerman, Tetreault, School Autism, Teachers No Training & antecedents Treatment integrity No
Hovanetz, Strobel, & developmental
Garro (2008) delay & severe
problem behavior
Moore, Edwards, Sterling- School Other (learning Teachers No Feedback & praise, training & Treatment integrity No
JOURNAL OF ORGANIZATIONAL BEHAVIOR MANAGEMENT

Turner, Riley, DuBard, & disorders & typically antecedents


McGeorge (2002) developing)
25

(Continued )
26

(Continued).
Client
Participants Outcome
Setting of (Employees Participant Behaviors Measures
Authors Services Client Diagnosis Targeted) Pre-Assessment Types of IV’s Measured (DV’s) Taken
Moore & Fisher (2007) Day Autism, Other No Training & antecedents Treatment integrity No
treatment developmental (trainees w/
center delay & severe BA degrees in
N. GRAVINA ET AL.

problem behavior Psych)


Mozingo, Smith, Riordan, Residential Autism, Direct care No Feedback & praise, training & Other (events recorded No
Reiss, & Bailey (2006) facility developmental staff antecedents appropriately)
delay & severe
problem behavior
Nabeyama & Sturmey School Autism, Direct care No Feedback & praise, training & Treatment integrity No
(2010) developmental staff antecedents, punishment &
delay & severe negative reinforcement, monitoring
problem behavior, or observations by self/others
other (students
with physical
disabilities)
Neef, Trachtenberg, Loeb, & Other (room Autism, Non-ABA No Feedback & praise training & Treatment integrity, No
Sterner (1991) of the service developmental Professionals antecedents safety, preparation
agency) delay & severe
problem behavior
Nigro-Bruzzi & Sturmey School, other Autism, Teachers, No Feedback & praise, training & Treatment integrity No
(2010) (individual developmental Non-ABA antecedents
family home) delay & severe Professionals
problem behavior
Northup, Wacker, Berg, School Autism, Teachers No Training & antecedents Treatment integrity Yes
Kelley, Sasso, & DeRaad developmental
(1994) delay, & severe
problem behavior

(Continued )
(Continued).
Client
Participants Outcome
Setting of (Employees Participant Behaviors Measures
Authors Services Client Diagnosis Targeted) Pre-Assessment Types of IV’s Measured (DV’s) Taken
Parsons & Reid (1995) Residential Other (severe Direct care No Feedback & praise, training & Treatment integrity No
facility disabilities; not staff, antecedents
specified) supervisors
Parsons, Rollyson, & Reid School, other Autism, Direct care No Feedback & praise, goal-setting, Engagement No
(2004) (various work developmental staff, teachers training & antecedents, system re-
settings) delay & severe design
problem behavior
Petschers & Bailey (2006) Autism, Other No Feedback & praise, goal-setting, Treatment integrity No
developmental (instruction training & antecedents, monitoring
delay & severe assistants) or observations by self/others,
problem behavior other (written posttests identifying
antecedents and appropriate
responses)
Plavnick, Ferreri & Maupin School Autism, Teachers No Training & antecedents, Monitoring Treatment integrity, No
(2010) developmental or observation by self/others engagement
delay & severe
problem behavior,
Other
Rosales, Stone & Rehfeldt Other Autism, Students No Feedback & praise, training & Treatment integrity No
(2009) (habilitation developmental antecedents
agency) delay & severe
problem behavior,
Other
Roscoe, Fisher, Glover & Day Autism, Other (adults No Feedback & praise, Monetary Treatment integrity No
Volkert ((2006) Treatment developmental as trainees) rewards, training & antecedents,
JOURNAL OF ORGANIZATIONAL BEHAVIOR MANAGEMENT

Center delay & severe monitoring observation by self/


problem behavior others
27

(Continued )
28

(Continued).
Client
Participants Outcome
Setting of (Employees Participant Behaviors Measures
Authors Services Client Diagnosis Targeted) Pre-Assessment Types of IV’s Measured (DV’s) Taken
Sarokoff & Sturmey Group home Autism, Teachers No Feedback & praise, training & Treatment integrity No
(2004) developmental antecedents
N. GRAVINA ET AL.

delay & severe


problem behavior
Schepis & Reid (1995) Residential Autism, Teachers No Training & antecedents Engagement Yes
facility, and developmental
school delay & severe
problem behavior
Schepis, Reid, Ownbey & Day Autism, Direct care No Feedback & praise, training & Treatment integrity No
Parsons (2001) treatment developmental staff antecedents
center delay & severe
problem behavior
Shore, Iwata, Vollmer, Residential Autism, Direct care No Feedback & praise, Training & Treatment integrity, No
Lerman & Zarcone (1995) facility developmental staff, antecedents staff management
delay & severe Supervisors
problem behavior
Vladescu, Carroll, Paden & Day Autism, Direct care No Training & antecedents Treatment integrity No
Kodak (2012) treatment developmental staff
center delay & severe
problem behavior
Weldy, Rapp & Capocasa Day Autism, Direct care No Training & antecedents Treatment integrity No
(2014) treatment developmental staff
center delay & severe
problem behavior

(Continued )
(Continued).
Client
Participants Outcome
Setting of (Employees Participant Behaviors Measures
Authors Services Client Diagnosis Targeted) Pre-Assessment Types of IV’s Measured (DV’s) Taken
JOBM
Arco (1997) Group home, Autism, Students No Feedback & praise, goal setting, Treatment integrity, Yes
day developmental monitoring or observation of others engagement
treatment delay & severe
center problem behavior
Belisle, Rowsey & Dixon School Autism, Direct care No Feedback & praise, training & Treatment integrity No
(2016) developmental staff antecedents
delay & severe
problem behavior
Boudreau, Christian & Group home, Autism, Direct care No Training & antecedents, systems Attendance & turnover No
Thibadeau (1993) residential developmental staff redesign
facility delay & severe
problem behavior
Cook & Dixon (2006) Group home Autism, Supervisors No Feedback & praise, monetary Treatment integrity No
developmental rewards
delay, & severe
problem behavior
Fleming, Oliver & Bolton Group home Autism, Direct care No Feedback & praise, training & Staff management No
(1996) developmental staff, antecedents
delay, & severe supervisors
problem behavior
Gil & Carter (2016) Group home, Autism, Direct care No Feedback & praise, goal setting Treatment integrity No
residential developmental staff
facility delay, & severe
JOURNAL OF ORGANIZATIONAL BEHAVIOR MANAGEMENT

problem behavior

(Continued )
29
30

(Continued).
Client
Participants Outcome
Setting of (Employees Participant Behaviors Measures
Authors Services Client Diagnosis Targeted) Pre-Assessment Types of IV’s Measured (DV’s) Taken
Gravina, VanWagner & Other Autism, Non-ABA PDC Feedback & praise, training & Preparation No
Austin (2008) (physical and developmental professionals antecedents, Monitoring or
N. GRAVINA ET AL.

occupational delay, & severe observation by self/others


therapy problem behavior
clinic)
Green, Reid, Passante, & Residential Autism, Supervisors No Non-monetary rewards, Engagement No
Canipe (2008) facility developmental antecedents & training, systems
delay, & severe redesign
problem behavior
Guercio & Dixon (2010) Residential Brain injury Direct care No Feedback & praise, monitoring or Engagement No
facility staff observation by self or others, other
(PEARL)
Guercio & Dixon (2011) Residential Autism, Direct care No Training & antecedents, monitoring Engagement No
facility developmental staff or observation by self/others
delay, & severe
problem behavior
Hawkins, Burgio, Langford Nursing Elders & Dementia Nurses No Feedback & praise, goal setting, Staff management No
& Engel (1993) home training & antecedents, systems
redesign, monitoring or
observation by self/others
Huberman & O’Brien (1999) Group home Mental illness Direct care No Feedback & praise, goal setting, Staff management, Yes
staff, Non- monetary rewards, non-monetary engagement
ABA rewards, training & antecedents
professionals

(Continued )
(Continued).
Client
Participants Outcome
Setting of (Employees Participant Behaviors Measures
Authors Services Client Diagnosis Targeted) Pre-Assessment Types of IV’s Measured (DV’s) Taken
Langeland, Johnson & Group home, Autism, Direct care Indirect Feedback & praise, goal setting, Staff management No
Mawhinney (1997) residential developmental staff, Assessment training & antecedents, monitoring
facility delay, & severe supervisors, (Survey, Interviews) or observation by self/others
problem behavior nurses,
managers,
Non-ABA
professionals
Mayer & DiGennaro-Reed Residential Brain injury, Autism, Direct care No Feedback & praise, training & Treatment integrity No
(2013) facility developmental staff antecedents
delay, & severe
problem behavior
Methot, Williams, Residential Autism, Direct care No Feedback & praise, goal setting, Treatment integrity, No
Cummings & Bradshaw facility, other developmental staff, training & antecedents staff management
(1996) (Employee delay, & severe supervisor,
training problem behavior managers
center)
Miller, Clarson & Sigurdsson School Autism, Teachers PDC, PIC/NIC Feedback & praise, goal setting, Treatment integrity, No
(2014) developmental monetary rewards preparation
delay, & severe
problem behavior,
other (special
education)
Squires & Wilder (2010) Residential Autism, Direct care No Feedback & praise, goal setting Staff management No
facility developmental staff
JOURNAL OF ORGANIZATIONAL BEHAVIOR MANAGEMENT

delay, & severe


problem behavior

(Continued )
31
32

(Continued).
Client
Participants Outcome
Setting of (Employees Participant Behaviors Measures
Authors Services Client Diagnosis Targeted) Pre-Assessment Types of IV’s Measured (DV’s) Taken
Strouse, Carroll-Hernandez, Group home Autism, Direct care No Systems redesign Attendance & turnover No
N. GRAVINA ET AL.

Sherman & Sheldon developmental staff


(2004) delay, & severe
problem behavior
Szabo, Williams, Rafacz, Group home, Autism, Direct care No Feedback & praise, goal setting, Administrative and Yes
Newsome & Lydon residential developmental staff, non-monetary rewards, training & staff management
(2012) facility, day delay, & severe supervisors, antecedents, other(scorecards &
treatment problem behavior other biweekly meetings)
center (consultants)
Whiting, Miller, Hensel, School Autism, Teachers, No Feedback & praise, training & Safety procedures No
Dixon & Szekely (2014) developmental managers antecedents
delay, & severe
problem behavior
Williams, DiVittorio & Group home, Autism, Direct care No Feedback & praise, goal setting, Administrative and Yes
Hausherr (2003) residential developmental staff, systems redesign, monitoring or staff management
facility, day delay, & severe supervisors, observation by self/other
treatment problem behavior, managers,
center elders & dementia Non-ABA
professionals
Williams & Gallinat (2011) Day Autism, Direct care No Feedback & praise, training & Treatment integrity, No
treatment developmental staff antecedents, monitoring or preparation and
center delay, & severe observation by self/others cleanliness
problem behavior

(Continued )
(Continued).
Client
Participants Outcome
Setting of (Employees Participant Behaviors Measures
Authors Services Client Diagnosis Targeted) Pre-Assessment Types of IV’s Measured (DV’s) Taken
BAP
Carr, Wilder, Majdalany, Day Autism, Direct care PDC-HS Feedback & praise, training & Preparation and No
Mathisen & Strain (2013) treatment developmental staff antecedents cleanliness
center delay, & severe
problem behavior
Chok, Shlesinger, Studer & School Autism, Direct care No Feedback & praise, training & Treatment integrity, No
Bird (2012) developmental staff antecedents other (interpreting
delay, & severe graphs, describe next
problem behavior steps for
undifferentiated
results and
interventions
Giannakakos, Vladescu, Day Autism, Direct care No Feedback & praise, training & Treatment integrity No
Kisamore & Reeve (2016) treatment developmental staff antecedents, monitoring or
center delay, & severe observation by self/others
problem behavior
Lebbon, Austin, Rost & Day Autism, Direct care PDC Feedback & praise, training & Safety Procedures No
Stanley (2011) treatment developmental staff, antecedents
center delay, & severe supervisors
problem behavior
Parsons, Bentley, Solari & Day Autism, Teachers, No Training & antecedents, other Engagement and job Yes
Reid (2016) treatment developmental other (familiar vs. unfamiliar staff) enjoyment
center delay, & severe (recreation
problem behavior, interns)
JOURNAL OF ORGANIZATIONAL BEHAVIOR MANAGEMENT

elders & dementia

(Continued )
33
34

View publication stats


N. GRAVINA ET AL.

(Continued).
Client
Participants Outcome
Setting of (Employees Participant Behaviors Measures
Authors Services Client Diagnosis Targeted) Pre-Assessment Types of IV’s Measured (DV’s) Taken
Parsons, Rollyson, & Reid School Autism, Teachers No Feedback & praise, training & Treatment integrity No
(2012) developmental antecedents
delay, & severe
problem behavior
Towery, Parsons, & Reid Day Autism, Direct care No Feedback & praise, training, Engagement and job No
(2014) treatment developmental staff antecedents enjoyment
center delay, & severe
problem behavior

You might also like