EUROPEAN JOURNAL OF SPECIAL NEEDS EDUCATION
https://doi.org/10.1080/08856257.2021.1872998

The investigation of pre-service special education teachers’ treatment integrity in the use of response prompting strategies

Nuray Oncul
School of Education, Anadolu University, Eskisehir, Turkey

ABSTRACT
The purpose of this study was to examine pre-service teachers’ treatment integrity (TI) in the use of constant time delay (CTD) and simultaneous prompting (SP) while teaching discrete and chained behaviours. A descriptive research design was used. A total of 28 pre-service special education teachers (16 females and 12 males), whose ages ranged from 20 to 35 years, participated in the study. The data were collected by applying four different observation and self-assessment checklists to video recordings. A descriptive analysis was conducted. The findings reveal that pre-service teachers implemented CTD and SP in discrete behaviours with higher TI than the CTD and SP in chained behaviours. Recommendations for future research and practice are provided.

ARTICLE HISTORY
Received 13 December 2020
Accepted 23 December 2020

KEYWORDS
Treatment integrity; pre-service teachers; simultaneous prompting; constant time delay; descriptive design

One of the main concerns for students with disabilities is ensuring high-quality and
effective instruction (Meyen 1990). Providing high-quality instruction to students with
disabilities requires the use of systematic and evidence-based instruction by effective
teachers (Walker et al. 2020). Effective teachers are characterised as teachers who manage
instructional time and student behaviours, accurately present instructional stimuli and
procedures, monitor student performance and provide instructional feedback (Wolery,
Ault, and Doyle 1992). Effective teachers increase their students’ achievement, and there is
also a significant relationship between teachers’ accurate and consistent instructional
behaviours and student success (Stronge, Ward, and Grant 2011). The Council for
Exceptional Children (CEC) developed standards for initial special education teacher
preparation (CEC 2015). The fifth standard focuses on instructional planning and strate­
gies. The standard states that ‘beginning special education professionals select, adapt,
and use a repertoire of evidence-based instructional strategies to advance the learning of
individuals with exceptionalities’ (CEC 2015). Therefore, one of the essential skills that
special education teachers should have is implementing effective and evidence-based
practices (EBPs) and strategies.
Response prompting strategies (e.g. time delay and least-to-most prompting) are EBPs
for students with autism spectrum disorder (ASD; Steinbrenner et al. 2020) and students
with intellectual disabilities (U.S. Department of Education Institute of Education Sciences
2018). When using response prompting strategies, the goal is to transfer stimulus control
from the prompts to the target stimulus (Wolery, Ault, and Doyle 1992). Constant time
delay (CTD) and simultaneous prompting (SP) are two of several response prompting
strategies used in teaching both discrete and chained behaviours to children with ASD
(e.g. Swain, Lane, and Gast 2015; Tekin-Iftar et al. 2017) and intellectual disabilities (e.g.
Aldosiry 2020; Seward et al. 2014).
CTD is a strategy in which the target stimulus and prompt are delivered to ensure the
student’s correct response and the prompt is faded by inserting a fixed amount of time
between the target stimulus and prompt (Wolery, Ault, and Doyle 1992). In general, CTD
consists of trials with 0-s delay during the initial session(s), and the prompt is delayed by
inserting a fixed amount of time (e.g. 3-s) in the subsequent sessions. CTD starts with
0-s delay trials. In 0-s delay trials, the teacher presents the target stimulus and then
provides the controlling prompt (i.e., a prompt resulting in a correct response) immedi­
ately after the target stimulus; the student is expected to perform the behaviour by
following the controlling prompt. If the student responds correctly, the
teacher provides a consequence for the correct response (e.g. praise) and presents the
next 0-s delay trial. If the student responds incorrectly or does not respond, the teacher
provides a consequence for an incorrect response (e.g. error correction, ignoring) and
presents the next 0-s delay trial. When the 0-s delay trials are completed, fixed/constant
time delay trials (e.g. 3-s delay) start. In fixed/constant time delay trials, the teacher
delivers the target stimulus and waits for the specified delay interval (e.g. 3 s) for the
student’s response. If the student responds correctly before the controlling prompt, the
teacher presents the consequence for the correct response and presents the next con­
stant time delay trial. If the student responds incorrectly or does not respond before
the controlling prompt, the teacher provides a consequence for the error or no response, delivers
the controlling prompt, and waits for the student’s response. If the student responds
correctly after the controlling prompt, the teacher delivers the consequence; however, these
prompted responses are not counted as correct responses towards the criterion (Wolery, Ault, and Doyle
1992; Yuan, Balint-Langel, and Hua 2019).
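Read procedurally, a CTD trial reduces to a small decision routine. The following sketch in Python is offered purely as an illustration of the logic summarised above and is not part of the study or of the cited sources; all function and parameter names (e.g. present_target_stimulus, deliver_prompt, get_student_response, give_consequence, delay_s, response_interval_s) are hypothetical placeholders for the teacher's actions.

def run_ctd_trial(present_target_stimulus, deliver_prompt, get_student_response,
                  give_consequence, delay_s=3, response_interval_s=5):
    """One constant time delay trial, following the sequence described above.
    delay_s=0 reproduces the initial 0-s delay trials; a positive value
    (e.g. 3 s) reproduces the fixed/constant delay trials."""
    present_target_stimulus()
    if delay_s == 0:
        # 0-s delay trial: the controlling prompt immediately follows the target stimulus.
        deliver_prompt()
        response = get_student_response(response_interval_s)
        give_consequence(correct=(response == "correct"))
        return "prompted"
    # Constant delay trial: wait to see whether the student responds before the prompt.
    response = get_student_response(delay_s)
    if response == "correct":
        give_consequence(correct=True)
        return "independent"  # counted towards the criterion
    # Error or no response before the prompt: consequate, prompt, and wait again.
    give_consequence(correct=False)
    deliver_prompt()
    response = get_student_response(response_interval_s)
    give_consequence(correct=(response == "correct"))
    return "prompted"  # not counted towards the criterion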
SP is another strategy in which a prompt is presented immediately after the target
stimulus to assure the student’s correct response, and then the student is probed to
determine whether they emit the correct response (Tekin-Iftar et al. 2017). SP is similar to
0-s delay CTD sessions. In SP trials, the teacher presents the target stimulus and immedi­
ately presents the controlling prompt. After the controlling prompt, the student is
expected to perform the correct response, followed by teacher praise. In SP, the probe/
test trials are needed to assess the student’s independent performance on the target
behaviour because a controlling prompt is delivered during each instructional trial. Probe
trials are conducted prior to the instructional trials until the criterion is met. In probe trials,
the teacher delivers the target stimulus and waits during the response interval for the student’s
response. If the student responds correctly, the teacher presents the consequence (e.g.
praise) and provides the next target stimulus. If the student responds incorrectly or does
not respond, the teacher presents the consequence (e.g. ignoring) and provides the next
target stimulus (Tekin-Iftar et al. 2017).
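By contrast, SP separates prompted instructional trials from unprompted probe trials. A minimal sketch in the same illustrative style as above (all names hypothetical, not software used in the study) might look as follows.

def run_sp_instructional_trial(present_target_stimulus, deliver_prompt,
                               get_student_response, give_consequence,
                               response_interval_s=5):
    """One SP instructional trial: the controlling prompt is delivered immediately
    after the target stimulus, so every instructional trial is a prompted trial."""
    present_target_stimulus()
    deliver_prompt()  # no delay between the target stimulus and the prompt
    response = get_student_response(response_interval_s)
    give_consequence(correct=(response == "correct"))

def run_sp_probe_trial(present_target_stimulus, get_student_response,
                       give_consequence, response_interval_s=5):
    """One SP probe/test trial: the target stimulus only, with no prompt; used to
    assess independent performance and decide whether the criterion has been met."""
    present_target_stimulus()
    response = get_student_response(response_interval_s)
    give_consequence(correct=(response == "correct"))
    return response == "correct"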
Response prompting strategies are among the most frequently studied topics across
different geographical areas around the world. In Turkey, there is also an extensive
literature on the systematic instruction of students with disabilities. There are 20 special
education teacher training programmes where response prompting strategies are taught
across the country (Collins, Tekin-Iftar, and Olçay-Gül 2017). Special education teacher
training programs embody professional knowledge, ability, and moral competencies that
are addressed in a 4-year undergraduate programme within 240 credit hours. Pre-service
special education teachers complete the theoretical courses in the first three years, and
then they need to pass the student teaching course to merit graduation (Cavkaytar,
Uyanik, and Yücesoy-Özkan 2017). Also, pre-service special education teachers with
specialisation in ASD and intellectual disabilities need to develop and implement at least 44
systematic instruction plans, including response prompting strategies, during the student
teaching course (Yücesoy-Özkan et al. 2019).
Although pre-service students in special education training programs successfully
complete the theoretical courses on systematic instruction and instructional strategies,
they face difficulties in transforming the acquired knowledge into practice (e.g. Rakap
2017, 2019). Results of studies conducted with pre-service teachers show that pre-service
teachers need coaching and feedback to use instructional strategies frequently with
fidelity (Coogle, Rahn, and Ottley 2015; Rakap 2017; Yücesoy-Özkan et al. 2019). The
professional development literature reveals that even newly graduated teachers or tea­
chers who participated in professional development activities have challenges and need
support in implementing the practices and strategies in natural settings while working
with students with disabilities (Nougaret, Scruggs, and Mastropieri 2005).
Since teachers’ accurate implementation of EBPs can improve student outcomes,
special education teachers need to be proficient in selecting, using and evaluating the
instructional strategies to teach various behaviours (Hill, Flores, and Kearley 2014; Kretlow
and Bartholomew 2010). The level of accurate implementation of instructional strategies
is also known as treatment integrity (TI; Lakin and Shannon 2015). TI can be defined as
the degree to which a practice is implemented as planned and described (Peterson,
Homer, and Wonderlich 1982). Researchers have ethical and professional responsibilities
to assess and report the level of TI (Wheeler et al. 2006). For research studies, the ideal
level of TI is 90% and above (Wolery, Bailey, and Sugai 1988), while an acceptable level can
be classified as 80 to 89%. Since providing instruction with high TI is difficult in educa­
tional settings where interruptions and other distractions are common occurrences
(Holcombe, Wolery, and Snyder 1994), a TI score of 70% and above can be considered
acceptable as long as the implementation results in positive changes in student outcomes (Tekin-
Iftar, Kurt, and Cetin 2011).
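As a simple illustration of the thresholds reported above (90% or higher ideal, 80–89% acceptable, 70% or higher tolerable when student outcomes still improve), a TI score could be classified as in the sketch below. The sketch is illustrative only; the function name and labels are not drawn from the cited sources.

def classify_ti(ti_percent):
    """Band a treatment integrity percentage using the thresholds summarised above."""
    if ti_percent >= 90:
        return "ideal"
    if ti_percent >= 80:
        return "acceptable"
    if ti_percent >= 70:
        return "tolerable if student outcomes improve"
    return "low"

print(classify_ti(84.5))  # acceptable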
TI is among the variables that impact students’ achievement; however, it is often not
measured in research studies (DiGennaro, Martens, and Kleinmann 2007; Kretlow and
Bartholomew 2010; Stahmer et al. 2015). Ensuring that teachers implement strategies
with high TI is also necessary to justify the cost of instruction for families and insurance
agencies (Brand et al. 2019). There are many studies focused on increasing the TI of pre-
and in-service special education teachers by using practices such as self-monitoring,
performance feedback, and coaching (e.g. Barton et al. 2019; Belfiore, Fritts, and
Herman 2008; Hill, Flores, and Kearley 2014; Rakap 2017). Both pre- and in-service teachers
need to have the opportunity to implement a strategy or practice, be assessed, and
receive performance feedback and coaching support on their performance. There is
a limited number of studies in the literature that investigate the TI level of pre-service
special education teachers before going to natural school settings (DiGennaro, Martens,
and Kleinmann 2007). Thus, we need to determine the extent to which pre-service teachers
properly implement evidence-based instructional strategies.
A system of controlled practice under controlled clinical conditions is required for pre-
service teachers before going into natural settings (Allen and Eve 1968; Cruickshank and
Metcalf 1993). In teacher training programs, although pre-service teachers’ pedagogical
knowledge is essential, ways should be discovered to determine whether they can transfer
their knowledge to real situations. As the academic staff of schools of education, we need
to identify and use strategies to improve the teaching performance of pre-service teachers
through alternative on-campus approaches, including microteaching, reflective teaching, and
simulation. Microteaching is a scaled-down (e.g. class size, time, content) program of
controlled practice that makes it possible to focus on specific teaching behaviours and to
practise teaching for a short period of time under controlled conditions (Allen and Eve
1968; He and Yan 2011). Reflective teaching is an instructional program that is used to make
pre-service teachers more thoughtful and wiser about their practice. In reflective teaching,
pre-service teachers deliver a brief lesson to peers and then evaluate student learning and
satisfaction. Simulation is an instructional program, similar to microteaching, in which
pre-service teachers have the opportunity to practise teaching, make decisions, and assess
those decisions in a structured setting (Cruickshank and Metcalf 1993).
These alternative approaches are ways to evaluate the practice-based performance of
pre-service teachers in several forms of on-campus clinical experiences. These approaches
allow pre-service teachers to teach specific behaviours to peers or adults and receive
performance feedback from the academic staff (Cruickshank and Metcalf 1993). These
approaches have a crucial role in teacher training programs as they are the bridges
between theory and practice and help reduce the complexity of the teaching-learning
process (He and Yan 2011).
The purpose of this descriptive study is to extend the literature by focusing on the
determination of pre-service special education teachers’ TI in implementing response
prompting strategies before teaching in natural school settings. The following questions
were addressed in the study: (a) What are pre-service teachers’ TI levels in implementing
CTD and SP for both discrete and chained behaviours based on faculty/staff observation?
(b) What are pre-service teachers’ TI levels in implementing CTD and SP for both discrete
and chained behaviours based on their self-assessment?

Method
In the current study, a descriptive research design was used to examine pre-service special
education teachers’ TI levels in implementing response prompting strategies.
A descriptive research design determines and reports the way things are. Descriptive
data are mainly collected through questionnaires or surveys, interviews, and observation
(Gay, Mills, and Airasian 2012). In this study, four observation checklists and four matching
self-assessment checklists were used to collect data.

Participants
To recruit the participants, a meeting was arranged in the first week of the fall semester,
and all senior students were invited. The aim of the study, confidentiality, and the
principle of voluntary participation were explained. The voluntary participants were
recruited among the students who had passed the applied behaviour analysis, teaching concepts
and skills, and teaching daily living and social skills courses. The content of these courses
includes mainly the principles of effective instruction; the stages of learning; assessment
and evaluation; preparing lesson and behaviour support plans, the use of target stimuli,
prompts, reinforcements, and error correction procedures; the teaching of skills and
concepts and implementing many teaching strategies and practices including response
prompting strategies. Although all participants had received previous in-classroom training on the
use of response prompting strategies, they had no history of actually teaching with them.
Senior students were chosen as participants because they would start the student teach­
ing course in natural school settings in the next weeks. There was no direct training prior
to the assessment sessions. Written informed consent for voluntary participation and
video recording was obtained from all participants.
A total of 28 pre-service special education teachers participated in the study. All
participants were undergraduate students enrolled in the special education program in
the school of education at a university in Turkey. Participant characteristics are presented
in Table 1.

Instruments
Direct observation and self-assessment techniques were used to collect data. Four differ­
ent observation and self-assessment checklists were developed as data collection tools.

Table 1. Characteristics of participants.


Participants Gender Age Grade-Point Average (%)
1 Female 23 91.13
2 Female 35 85.53
3 Female 21 83.20
4 Female 21 81.10
5 Male 21 75.03
6 Female 25 74.10
7 Female 20 72.46
8 Male 21 72.23
9 Female 21 72.23
10 Female 25 70.60
11 Female 22 70.13
12 Male 21 69.90
13 Female 21 68.73
14 Male 21 67.80
15 Male 23 66.63
16 Female 21 65.23
17 Male 24 64.53
18 Male 21 63.60
19 Female 21 61.73
20 Female 23 59.86
21 Female 22 58.93
22 Female 26 58.70
23 Male 21 58.23
24 Female 23 56.60
25 Male 23 55.90
26 Male 28 53.56
27 Male 21 51.46
28 Male 22 51.00

Before developing the observation and self-assessment checklists, two different behaviours
were identified: a discrete behaviour (i.e. reading community signs) and a chained behaviour
(i.e. setting the table), since teaching discrete and chained behaviours requires different
kinds of instructor behaviours, as seen in Tables 2 and 3. Then, two response prompting
strategies, CTD and SP, were identified (Collins, Tekin-Iftar, and Olçay-Gül 2017).
The checklists took into account the target behaviours and strategies. To create
the checklists, two target behaviours were performed and videotaped using CTD and
SP. Then the videoclips were watched to determine the steps to be included in the
checklists. After the steps in checklists were reviewed and revised, the checklists were
sent to five experts who had doctoral degrees in special education or measurement
and evaluation to examine the content validity. The checklists were finalised based
on the experts’ feedback. Final versions of the observation checklists and self-
assessment checklists were used in the present study (Tables 2 and 3). Two
observation and self-assessment checklists were developed for CTD for discrete
behaviour (19 steps) and chained behaviour (17 steps); two were developed for SP
for discrete behaviour (19 steps), and chained behaviour (18 steps). The items on the
observation and self-assessment checklists were identical.

Table 2. Implementation steps of CTD for discrete and chained behaviours.


CTD Steps for Discrete Behaviour
1 Arranging the setting
2 Checking and having materials ready
3 Getting a pencil and datasheet
4 Recording the descriptive information on the datasheet
5 Getting the student’s attention
6 Introducing the materials
7 Explaining the rules
8 Presenting the target stimulus*
9 Waiting for the student’s response during the time delay (4 s)
For correct response
10 Recording student’s correct response as plus
11 Praising the student’s correct response
12 Waiting for the intertrial time (2 s)
For incorrect response
13 Recording student’s incorrect response as minus
14 Providing controlling prompt
15 Waiting for the student’s response during the response interval (5 s)
16 Praising the student’s correct response differentially
17 Implementing error correction procedure for the second incorrect response
18 Presenting the target stimulus of the next trial
19 Reinforcing the student’s participation
* Implementing the 8–18th steps for each trial

CTD Steps for Chained Behaviour
1 Arranging the setting
2 Checking and having materials ready
3 Getting a pencil and datasheet
4 Recording the descriptive information on the datasheet
5 Getting the student’s attention
6 Introducing the materials
7 Explaining the rules
8 Presenting the target stimulus*
9 Waiting for the student’s response for the first step of the skill during the time delay (4 s)
For correct response
10 Recording student’s correct response as plus
11 Praising the student’s correct response for the first step
For incorrect response
12 Recording student’s incorrect response as minus
13 Providing controlling prompt
14 Waiting for the student’s response during the response interval (5 s) for the first step
15 Praising the student’s correct response differentially
16 Implementing error correction procedure for the second incorrect response
17 Reinforcing the student’s participation
* Implementing the 8–16th steps for each step
CTD: Constant time delay.
SP: Simultaneous prompting.

Table 3. Implementation steps of SP for discrete and chained behaviours.


SP Steps for Discrete Behaviour
1 Arranging the instructional setting
2 Checking and having materials ready
3 Getting a pencil and datasheet
4 Recording the descriptive information on the datasheet
5 Getting the student’s attention
6 Introducing the materials
7 Explaining the rules
8 Conducting probe sessions or explaining the requirement of probe sessions*
9 Presenting the target stimulus**
10 Providing controlling prompt
11 Waiting for the student’s response for the first trial during the response interval (5 s)
For correct response
12 Praising the student’s correct response
13 Waiting for the intertrial time (2 s)
For incorrect response
14 Providing controlling prompt again
15 Waiting for the student’s response during the response interval (5 s)
16 Praising the student’s correct response
17 Implementing error correction procedure for the second incorrect response
18 Waiting for the intertrial time (2 s)
19 Reinforcing the student’s participation
* Each participant should conduct probe trials before/after instructional trials
** Implementing the 9–18th steps for each trial

SP Steps for Chained Behaviour
1 Arranging the instructional setting
2 Checking and having materials ready
3 Getting a pencil and datasheet
4 Recording the descriptive information on the datasheet
5 Getting the student’s attention
6 Introducing the materials
7 Explaining the rules
8 Conducting probe sessions or explaining the requirement of probe sessions*
9 Presenting the target stimulus**
10 Providing controlling prompt for the first step
11 Waiting for the student’s response for the first step of the skill during the response interval (5 s)
For correct response
12 Praising the student’s correct response for the first step
13 Providing controlling prompt for the next step
For incorrect response
14 Providing controlling prompt again
15 Waiting for the student’s response during the response interval (5 s) for the first step
16 Praising the student’s correct response
17 Implementing error correction procedure for the second incorrect response
18 Reinforcing the student’s participation
* Each participant should conduct probe trials before/after instructional trials
** Implementing the 9–17th steps for each step of the skill
CTD: Constant time delay.
SP: Simultaneous prompting.

Data collection
The data collection procedure was carried out in a university research clinic. After the data
collection tools had been finalised, appointments were arranged to collect data.
A simulation setting was created in the clinic. A research assistant whom the participants
had never met or interacted with was assigned to each participant. She assisted in the data
collection procedure, playing the role of a student being taught. She knew when she
would give correct and incorrect responses and what to do after pre-service teachers’
responses. After arranging the environment, the instructional plan, including all compo­
nents (e.g. target stimulus, prompt, time delay, intertrial interval, and positive reinforce­
ment) was developed. Then data collection procedure started in the simulation setting.
All materials (e.g. table, dinner set, water, community sign cards, and datasheets) to be
used to teach both behaviours were in the setting. A camera was placed in the corner of
the room to record the participants’ performances. The researcher was in the room to
record the sessions and to resolve technical problems (e.g. a dead battery or an incorrect
recording angle). There was nobody other than the researcher, the participant, and the
research assistant in the setting. The instructional plan was given to the pre-service
teacher, with time for him/her to read it. The materials were presented, and the
target stimulus was provided by asking the pre-service teacher to teach the behaviours
using CTD and SP. When a pre-service teacher finished reading the plan and felt ready,
s/he carried out the instruction. Each participant performed a total of four instructional
sessions, implementing CTD and SP for both discrete and chained behaviours. The
participants chose the order of the instructional sessions, deciding which strategy to
start with.
The target stimulus was delivered in the same manner for each performance before
the participant’s implementation. When a participant was implementing the strategies,
the researcher did not direct him/her, interfere with the sessions, or reinforce
correct responses. If the participant asked a question, the researcher said, ‘Don’t
worry. Just do it, as you know.’ After the performance was completed, the researcher
thanked the pre-service teacher for her/his participation and cleaned up the classroom.
Each performance was recorded on a separate datasheet.
When all participants had completed their performances, the video recordings and blank
datasheets were given to the pre-service teachers to self-assess their
performance. One-to-one meetings were arranged to compare the researcher’s record­
ings and the participants’ recordings and to provide performance feedback for voluntary
participants. Only 13 of the 28 pre-service teachers chose to self-assess and received
performance feedback. After comparing the datasheet recordings, the researcher and the
participant watched the video clips together. The performance feedback was presented
to the participant about his/her implementation of the strategies. In the performance
feedback, the researcher started with praise and positive feedback on the participant’s
performance and provided information about the steps implemented correctly. If there
was an error, the researcher asked the participant what the error was. After the participant
identified the error, the researcher confirmed and corrected the participant’s error. If
needed, the researcher provided a prompt to the participant to identify the error. If there
was no consensus, the disagreement was noted.

Data analysis
A descriptive analysis was conducted. The numbers of correct and incorrect responses
were determined, and the correct response percentages were calculated using the following
formula: the number of correct responses / (the number of correct responses + incorrect
responses) x 100 (DiGennaro, Martens, and Kleinmann 2007).
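As a worked illustration of this formula with hypothetical numbers (not data from the study), a participant scored as implementing 15 checklist steps correctly and 4 incorrectly would receive 15 / (15 + 4) x 100, or about 78.9%. A minimal helper reproducing the calculation is sketched below; the function name is hypothetical.

def treatment_integrity(correct_responses, incorrect_responses):
    """Percentage of correctly implemented checklist steps, per the formula above."""
    return correct_responses / (correct_responses + incorrect_responses) * 100

# Hypothetical example: 15 of 19 checklist steps implemented correctly.
print(round(treatment_integrity(15, 4), 1))  # 78.9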

Interobserver agreement
Interobserver agreement (IOA) was calculated for 30% of the participants for each strategy
and each behaviour. The videoclips were selected randomly and assigned to a second
observer, who had a PhD degree in special education. The observer watched
these videoclips and collected data independently. The observer’s and the researcher’s
recordings were compared and classified as agreements or disagreements. An agreement
was indicated if both observers recorded the same step in the same way (correct-correct
/incorrect-incorrect); conversely, a disagreement was indicated if the observers
recorded the same step differently (correct-incorrect/incorrect-correct). The IOA
was calculated using the following formula: the number of agreements / (the number of
agreements + disagreements) x 100 (Kazdin 2011). The IOA was 94.3% for discrete
behaviour and 97.4% for chained behaviour for CTD. The IOA was 97.4% for discrete
behaviour and 98.0% for chained behaviour for SP.
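The IOA calculation follows the same point-by-point logic at the level of checklist steps. The short sketch below illustrates it with made-up records; it is not the scoring procedure actually used in the study, and all names are hypothetical.

def interobserver_agreement(observer_a, observer_b):
    """Agreements / (agreements + disagreements) x 100 for two step-by-step records."""
    agreements = sum(a == b for a, b in zip(observer_a, observer_b))
    disagreements = len(observer_a) - agreements
    return agreements / (agreements + disagreements) * 100

# Hypothetical example: the two observers agree on 18 of 19 steps.
first_observer = ["correct"] * 19
second_observer = ["correct"] * 18 + ["incorrect"]
print(round(interobserver_agreement(first_observer, second_observer), 1))  # 94.7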

Results
Treatment integrity levels based on faculty/staff observation
The first aim of the study was to determine pre-service teachers’ TI levels in implementing
CTD and SP to teach both discrete and chained behaviours based on faculty/staff
observation. The TI levels for both behaviours are shown in Table 4. The participants’ TI
percentage was 69.88% (range = 34.80–92.85%) for CTD in discrete behaviour, 29.11%
(range = 7.46–49.25%) for CTD in chained behaviour, 64.54% (range = 38.09–88.63%) for
SP in discrete behaviour, and 55.10% (range = 0.0–85.07%) for SP in chained behaviour. As
seen in Table 4, the highest TI mean (69.88%) and the highest TI percentage (92.85%) were
for CTD in discrete behaviour. Almost all participants (n = 26) implemented CTD in
discrete behaviour with higher TI than CTD in chained behaviour. Eighteen participants
used SP in discrete behaviour with higher TI than SP in chained behaviour. While the
highest TI score was 92.85% for CTD, it was 88.63% for SP. Also, the mean TI across
participants was 54.66%, the highest participant mean was 70.62% (Participant 27), and the lowest TI was
26.38% (Participant 11).

Table 4. Percentages of participants’ treatment integrity levels based on faculty/staff observation.


Participant CTD Discrete (%) CTD Chained (%) SP Discrete (%) SP Chained (%) Mean
1 82.92 49.25 56.09 76.11 66.09
2 72.72 32.83 65.11 85.07 63.93
3 80.95 19.11 76.86 31.88 52.20
4 62.79 43.28 58.53 00.00 41.15
5 77.27 40.19 61.90 31.34 52.67
6 65.85 23.88 71.42 37.14 49.57
7 85.71 19.44 57.5 73.52 59.04
8 75.00 23.88 88.63 62.68 62.54
9 86.04 22.38 65.00 67.16 60.14
10 85.00 20.89 61.90 61.19 57.24
11 48.78 16.41 42.85 23.88 32.98
12 75.00 33.82 65.90 52.23 56.73
13 76.19 15.38 61.90 28.12 45.39
14 57.14 16.41 66.66 67.16 51.84
15 83.33 17.91 86.04 60.29 61.89
16 34.80 43.28 68.18 46.26 48.13
17 57.77 29.85 52.38 6.45 36.61
18 80.95 37.31 61.90 49.25 57.35
19 75.06 37.31 78.57 71.64 65.64
20 38.63 47.76 38.09 77.61 50.52
21 73.80 41.79 61.36 62.68 59.90
22 75.00 07.46 60.00 61.19 50.91
23 46.34 20.28 30.95 73.13 42.67
24 56.09 44.77 62.50 65.57 57.23
25 64.28 23.88 66.66 53.73 52.13
26 92.85 22.38 73.80 64.06 63.27
27 73.80 44.77 83.33 80.59 70.62
28 72.72 19.40 83.33 73.13 62.14
Mean 69.88 29.11 64.54 55.10 54.66
CTD: Constant time delay.
SP: Simultaneous prompting.

Treatment integrity levels based on self-assessment


The other goal of this research was to identify pre-service teachers’ TI levels of CTD and SP
implementation in both discrete and chained behaviours based on participants’ self-
assessment. The TI levels for CTD and SP in both behaviours are presented in Table 5.
Since 13 of the 28 participants self-assessed their performance, only the data for these
participants were analysed. Pre-service teachers’ TI percentage was 75.72% (range =
53.65–95.23%) for CTD in discrete behaviour, 43.29% (range = 5.97–67.16%) for CTD in
chained behaviour, 66.00% (range = 9.50–92.85%) for SP in discrete behaviour, and
58.07% (range = 2.98–92.53%) for SP in chained behaviour. Based on Table 5, the highest
TI mean (75.72%) and the highest TI percentage (95.23%) were for CTD in discrete
behaviour. Most of the participants (n = 12) implemented CTD in discrete behaviour
with higher TI than CTD in chained behaviour. Seven participants used SP in discrete
behaviour with higher TI than SP in chained behaviour. While the highest TI score was
95.23% for CTD, it was 92.85% for SP. Also, the mean TI across participants was 60.26%, the
highest participant mean was 81.36% (Participant 7), and the lowest was 22.46% (Participant 18).

Discussion
The purpose of this study was to examine pre-service teachers’ TI in implementing CTD
and SP based on faculty/staff observation and participants’ self-assessment. The findings
indicate that pre-service teachers implemented CTD and SP in discrete behaviour with
higher TI than CTD and SP in chained behaviour.
The first finding showed that pre-service teachers implemented the strategies in both
behaviours with low TI (under 70%). While about two-thirds of the participants implemented CTD
with 70% or higher TI in discrete behaviour, none of them implemented CTD with 70%
or higher TI in chained behaviour. Also, only eight participants applied SP with 70% or
higher TI in both behaviours. This result is consistent with the findings of previous studies
that aimed to increase pre- and in-service special education teachers’ TI (Barton et al. 2019;
Belfiore, Fritts, and Herman 2008; Hill, Flores, and Kearley 2014; Rakap 2017). Considering

Table 5. Percentage of participants’ treatment integrity levels based on participants’ self-assessment.


Participant CTD Discrete (%) CTD Chained (%) SP Discrete (%) SP Chained (%) Mean
1 87.50 56.71 56.09 76.11 69.17
3 57.14 32.35 65.21 56.52 52.80
6 75.60 25.37 75.60 29.85 51.60
7 85.71 62.50 87.50 89.76 81.36
8 72.00 64.17 86.36 92.53 78.76
10 90.47 64.17 78.57 85.07 79.57
11 53.65 32.83 52.38 47.76 46.65
12 75.00 45.58 86.36 88.05 73.74
15 88.09 11.94 55.81 14.70 42.63
18 71.42 5.97 09.50 2.98 22.46
19 73.17 67.16 92.85 55.22 72.10
20 59.09 65.67 45.23 91.04 65.25
25 95.23 28.35 66.66 25.37 53.90
Mean 75.72 43.29 66.00 58.07 60.77
CTD: Constant time delay.
SP: Simultaneous prompting.
that implementing a strategy with under 70% TI threatens the effectiveness of the
strategy (Odluyurt, Tekin-Iftar, and Adalioglu 2012; Tekin-Iftar, Kurt, and Cetin 2011) and
that instruction with high TI is required to justify the cost of treatment (Brand et al. 2019),
pre-service teachers need to be supported in implementing strategies before going to
authentic settings (Nougaret, Scruggs, and Mastropieri 2005). These performance feed­
back or coaching supports could be provided in teaching practice courses by faculty/staff
during on-campus activities (Yücesoy-Özkan et al. 2019).
Based on faculty/staff observations, pre-service teachers implemented CTD with higher
TI than SP, and they also carried out instruction with higher TI for discrete behaviour than
chained behaviour. Several studies report that instructors found SP instruction easy to
deliver (Odluyurt, Tekin-Iftar, and Adalioglu 2012).
SP can be easier than CTD for some teachers to implement because CTD requires a wider
range of instructor behaviours (Seward et al. 2014). Thus, the current finding of a higher TI
percentage for CTD than for SP is interesting. The most common SP implementation
errors made by these participants were not conducting daily probe trials and not pre­
senting the target stimulus. Although the participants were expected to conduct probe
trials and then move to the instructional trials during the SP, most of the participants did
not carry out the probe trials before the instructional trials, and they directly provided the
controlling prompt without delivering the target stimulus. Also, in the current study,
instruction of discrete behaviour yielded a higher TI percentage than the instruction of
chained behaviour. This finding is not surprising. Based on our subjective experiences, we
could claim that chained behaviours are more complex to teach than discrete behaviours,
and that teaching these complex behaviours is more challenging since they consist of
more steps and require more materials.
The second finding indicates that based on their self-assessment, pre-service teachers
implemented CTD with higher TI than SP and delivered instruction with higher TI in
discrete behaviour than chained behaviour. This result supports the first finding. Measures
by faculty/staff are consistent with the participants’ self-assessment measures. However,
the pre-service teachers’ self-assessment TI percentage is greater than the faculty/staff’s
assessment. Thus, it can be assumed that participants are perhaps not fully aware of their TI
levels of strategy usage, tend to overestimate their performance, and perceive their self-
efficacy as higher than their actual performance. This finding is consistent with previous
studies (e.g. Lakin and Shannon 2015). Self-assessment results may also reflect the difficulty
pre-service teachers have in evaluating their own teaching, which suggests that self-assessment
alone is not an adequate evaluation method. Thus, further studies are required
to compare subjective (e.g. self-assessment, self-monitoring, and self-reporting) and
objective (e.g. observation and peer-assessment) methods of evaluating teachers’
performance.
The study contributes to the literature in several ways. There is a limited number of
studies in the literature that investigate pre-service special education teachers’ TI levels
(DiGennaro, Martens, and Kleinmann 2007). The current findings extend the literature by
showing that these pre-service special education teachers implemented commonly used strategies
with low-to-moderate TI, although they had passed systematic instruction and instruc­
tional strategies courses. The outcomes also suggest that pre-service teachers likely need
substantial and ongoing support during on-campus activities before practice teaching
courses.
In all conditions, the most common errors for most participants were in the steps of
arranging the instructional setting, getting the student’s attention, and recording the
student’s correct/incorrect responses. Together with the SP-specific implementation
errors noted above, these error patterns could guide faculty/staff in teaching response
prompting strategies to pre-service special education teachers.
The main limitation of this study is that the instructional plan was not prepared by the
pre-service teachers themselves; using a pre-developed plan may have negatively
affected the pre-service teachers’ performance. In future studies, pre-service teachers’
instructional plans could be used when investigating their strategy use performance.
Some recommendations can be made for practice and future research. In their first
years, teachers can be supported by experts from the Ministry of National Education
(MoNE) or universities through online performance feedback or e-coaching to improve
their teaching practice capability. Content coverage and the time allotted for applied
courses in special education teacher preparation programs may be increased, and also,
on-campus implementation opportunities can be provided to all students in simulation
settings or natural environments. Performance feedback or coaching could be included in
every teacher training program to increase performance levels of implementing practices
and strategies. Future research should continue to examine pre-service special education
teachers’ TI on implementing various practices or strategies. In further studies, the
patterns of errors made by pre-service teachers when implementing the various strategies
can be analysed. Finally, the efficacy of performance feedback and coaching on the TI
levels of pre-service teachers may be compared in simulation and natural settings.
In conclusion, once the target behaviours to be taught have been decided, the most
important step in the instructional procedure is to determine which practice or strategy
will be used for teaching. It is crucial for pre-service teachers to know effective practices
and strategies and to implement them with high TI. These qualifications are vital to ensure
high-quality instruction. However, in this study, it was observed that pre-service teachers
have difficulty implementing strategies with high TI, and they need to be supported
through performance feedback or coaching.

Acknowledgments
The author would like to thank Serife Yucesoy-Ozkan, Aysun Colak, Fidan Gunes Gurgor-Kilic, Emine
Sema Batu, and Salih Rakap for their support.

Disclosure statement
There is no conflict of interest.

Funding
This study was not supported by any funding agency.

References
Aldosiry, N. 2020. “Comparison of Constant Time Delay and Simultaneous Prompting to Teach Word
Reading Skills to Students with Intellectual Disability.” International Journal of Developmental
Disabilities 64 (1): 1–15. doi:10.1080/20473869.2020.1771513.
Allen, D. W., and A. W. Eve. 1968. “Microteaching.” Theory into Practice 7 (5): 181–185. doi:10.1080/
00405846809542153.
Barton, E. E., M. N. Rigor, E. A. Pokorski, M. Velez, and M. Domingo. 2019. “Using Text Messaging to
Deliver Performance Feedback to Pre-service Early Childhood Teachers.” Topics in Early Childhood
Special Education 39 (2): 88–102. doi:10.1177/0271121418800016.
Belfiore, P. J., K. M. Fritts, and B. C. Herman. 2008. “The Role of Procedural Integrity: Using
Self-monitoring to Enhance Discrete Trial Instruction (DTI).” Focus on Autism and Other
Developmental Disabilities 23 (2): 95–102. doi:10.1177/1088357607311445.
Brand, D., A. J. Henley, F. D. D. Reed, E. Gray, and B. Crabbs. 2019. “A Review of Published Studies
Involving Parametric Manipulations of Treatment Integrity.” Journal of Behavioral Education 28
(1): 1–26. doi:10.1007/s10864-018-09311-8.
Cavkaytar, A., H. Uyanik, and Ş. Yücesoy-Özkan. 2017. “Republic of Turkey.” In The Praeger
International Handbook of Special Education. Volume 3: Asia and Oceania, edited by
M. L. Wehmeyer and J. R. Patton, 251–264, Santa Barbara, CA: ABC-CLIO, LLC.
Collins, B. C., E. Tekin-Iftar, and S. Olçay-Gül. 2017. “International Collaboration and Its Contributions:
Disseminating Knowledge and Supporting Evidence-based Practices across Countries.” Education
and Training in Autism and Developmental Disabilities 52 (3): 227–239. https://www.jstor.org/
stable/26420396
Coogle, C. G., N. L. Rahn, and J. R. Ottley. 2015. “Pre-service Teacher Use of Communication
Strategies upon Receiving Immediate Feedback.” Early Childhood Research Quarterly 32:
105–115. doi:10.1016/j.ecresq.2015.03.003.
Council for Exceptional Children (CEC). 2015. What Every Special Educator Must Know: Professional
Ethics and Standards. Author. Obtained from https://exceptionalchildren.org/standards/initial-
special-education-preparation-standards
Cruickshank, D. R., and K. K. Metcalf. 1993. “Improving Pre-service Teacher Assessment through On-
campus Laboratory Experiences.” Theory into Practice 32 (2): 86–92. doi:10.1080/
00405849309543580.
DiGennaro, F. D., B. K. Martens, and A. E. Kleinmann. 2007. “A Comparison of Performance Feedback
Procedures on Teachers’ Treatment Implementation Integrity and Students’ Inappropriate
Behavior in Special Education Classrooms.” Journal of Applied Behavior Analysis 40 (3): 447–461.
doi:10.1901/jaba.2007.40-447.
Gay, L. R., G. E. Mills, and P. Airasian. 2012. Educational Research: Competencies for Analysis and
Applications. 10th ed. Upper Saddle River, NJ: Pearson Education.
He, C., and C. Yan. 2011. “Exploring Authenticity of Microteaching in Pre-service Teacher Education
Programmes.” Teaching Education 22 (3): 291–302. doi:10.1080/10476210.2011.590588.
Hill, D. A., M. M. Flores, and R. F. Kearley. 2014. “Maximising ESY Services: Teaching Pre-service
Teachers to Assess Communication Skills and Implement Picture Exchange with Students with
Autism Spectrum Disorder and Developmental Disabilities.” Teacher Education and Special
Education 37 (3): 241–254. doi:10.1177/0888406414527117.
Holcombe, A., M. Wolery, and E. Snyder. 1994. “Effects of Two Levels of Procedural Fidelity with
Constant Time Delay on Children’s Learning.” Journal of Behavioral Education 4 (1): 49–73.
doi:10.1007/BF01560509.
Kazdin, A. E. 2011. Single-case Research Designs: Methods for Clinical and Applied Settings. 2nd ed.
New York, NY: Oxford University Press.
Kretlow, A. G., and C. C. Bartholomew. 2010. “Using Coaching to Improve the Fidelity of
Evidence-based Practices: A Review of Studies.” Teacher Education and Special Education 33 (4):
279–299. doi:10.1177/0888406410371643.
Lakin, J. M., and D. M. Shannon. 2015. “The Role of Treatment Acceptability, Effectiveness, and
Understanding in Treatment Fidelity: Predicting Implementation Variation in a Middle School
Science Program.” Studies in Educational Evaluation 47 (4): 28–37. doi:10.1016/j.stueduc.2015.06.002.
Meyen, E. L. 1990. “Quality Instruction for Students with Disabilities.” Teaching Exceptional Children
20 (2): 12–13. doi:10.1177/004005999002200204.
Nougaret, A. A., T. E. Scruggs, and M. A. Mastropieri. 2005. “Does Teacher Education Produce Better
Special Education Teachers?” Exceptional Children 71 (3): 217–229. doi:10.1177/
001440290507100301.
Odluyurt, S., E. Tekin-Iftar, and I. Adalioglu. 2012. “Does Treatment Integrity Matter in Promoting
Learning among Children with Developmental Disabilities?” Topics in Early Childhood Special
Education 32 (3): 143–150. doi:10.1177/0271121410394208.
Peterson, L., A. L. Homer, and S. A. Wonderlich. 1982. “The Integrity of Independent Variables in
Behavior Analysis.” Journal of Applied Behavior Analysis 15 (4): 477–492. doi:10.1901/jaba.1982.15-
477.
Rakap, S. 2017. “Impact of Coaching on Pre-service Teachers’ Use of Embedded Instruction in
Inclusive Preschool Classrooms.” Journal of Teacher Education 68 (2): 125–139. doi:10.1177/
0022487116685753.
Rakap, S. 2019. “Re-visiting Transition-based Teaching: Impact of Pre-service Teacher’s
Implementation on Child Outcomes.” Learning and Instruction 59: 54–64. doi:10.1016/j.
learninstruc.2018.10.001.
Seward, J., J. W. Schuster, M. J. Ault, B. C. Collins, and M. Hall. 2014. “Comparing Simultaneous
Prompting and Constant Time Delay to Teach Leisure Skills to Students with Moderate
Intellectual Disability.” Education and Training in Autism and Developmental Disabilities 32 (3):
381–395. https://www.jstor.org/stable/23881258
Stahmer, A. C., S. Rieth, E. Lee, E. M. Reisinger, D. S. Mandell, and J. E. Connell. 2015. “Training
Teachers to Use Evidence-based Practices for Autism: Examining Procedural Implementation
Fidelity.” Psychology in The Schools 52 (2): 181–195. doi:10.1002/pits.21815.
Steinbrenner, J. R., K. Hume, S. L. Odom, K. L. Morin, S. W. Nowell, B. Tomaszewski, S. Szendrey,
N. S. McIntyre, S. Yucesoy-Ozkan, and M. N. Savage. 2020. Evidence-based Practices for Children,
Youth, and Young Adults with Autism. University of North Carolina at Chapel Hill, Frank Porter
Graham Child Development Institute, National Clearinghouse on Autism Evidence and Practice
Review Team.
Stronge, J. H., T. J. Ward, and L. W. Grant. 2011. “What Makes Good Teachers Good? A Cross-case
Analysis of the Connection between Teacher Effectiveness and Student Achievement.” Journal of
Teacher Education 62 (4): 339–355. doi:10.1177/0022487111404241.
Swain, R., J. D. Lane, and D. L. Gast. 2015. “Comparison of Constant Time Delay and Simultaneous
Prompting Procedures: Teaching Functional Sight Words to Students with Intellectual Disabilities
and Autism Spectrum Disorder.” Journal of Behavioral Education 24 (2): 210–229. doi:10.1007/
s10864-014-9209-5.
Tekin-Iftar, E., B. C. Collins, F. Spooner, and S. Olcay-Gul. 2017. “Coaching Teachers to Use
a Simultaneous Prompting Procedure to Teach Core Content to Students with Autism.” Teacher
Education and Special Education 40 (3): 225–245. doi:10.1177/0888406417703751.
Tekin-Iftar, E., O. Kurt, and O. Cetin. 2011. “A Comparison of Constant Time Delay Instruction with
High and Low Treatment Integrity.” Educational Sciences: Theory and Practice 11 (1): 375–381.
U.S. Department of Education, Institute of Education Sciences, What Works Clearinghouse. 2018.
System of Least Prompts. WWC Intervention Report. https://ies.ed.gov/ncee/wwc/Docs/InterventionReports/wwc_slp_101818.pdf
Walker, V. L., S. N. Douglas, K. H. Douglas, and S. R. D’Agostino. 2020. “Paraprofessional-implemented
Systematic Instruction for Students with Disabilities: A Systematic Literature Review.” Education
and Training in Autism and Developmental Disabilities 55 (3): 303–317. http://www.daddcec.com/
etadd.html
Wheeler, J. J., B. A. Baggett, J. Fox, and L. Blevins. 2006. “Treatment Integrity: A Review of
Intervention Studies Conducted with Children with Autism.” Focus on Autism and Other
Developmental Disabilities 21 (1): 45–54. doi:10.1177/10883576060210010601.
Wolery, M., M. J. Ault, and P. M. Doyle. 1992. Teaching Students with Moderate to Severe Disabilities:
Use of Response Prompting Strategies. Longman, NY: Longman Publishing Group.
Wolery, M. R., D. P. Bailey Jr., and G. Sugai. 1988. Effective Teaching: Principles and Procedures of
Applied Behavior Analysis with Exceptional Students. Boston, MA: Allyn & Bacon.
Yuan, C., K. Balint-Langel, and Y. Hua. 2019. “Effects of Constant Time Delay on Route Planning Using
Google Maps for Young Adults with Intellectual and Developmental Disabilities.” Education and
Training in Autism and Developmental Disabilities 54 (3): 215–224.
Yücesoy-Özkan, Ş., N. Öncül, A. Çolak, Ç. Acar, F. Aksoy, G. Bozkuş-Genç, and S. Çelik. 2019.
“Determination of Pre-service Special Education Teachers’ Expectations Related to Teaching
Practice Course and Practice Schools.” Elementary Education Online 18 (2): 808–836.
doi:10.17051/ilkonline.2019.562062.
