
Robots for social training of autistic children

Empowering the therapists in intensive training programs

Emilia I. Barakova
Faculty of Industrial Design,
Eindhoven University of Technology Den Doleh 2 5612 AZ
Eindhoven The Netherlands
e.i.barakova@tue.nl

Abstract—We apply a participatory co-creation process to empower health researchers and practitioners with robot assistants or mediators in behavioral therapies for children with autism. This process combines (a) a user-centered design of a platform that supports therapists in creating and sharing behavioral training scenarios with robots and (b) acquisition of domain-specific knowledge from the therapists in order to design robot-child interaction scenarios that accomplish specific learning goals. These two aspects of the process are mutually dependent and therefore require an iterative design of a technological platform that makes gradual steps towards creating optimal affordances for therapists to create robot-mediated scenarios to the best of the technical capabilities of the robot, i.e. through co-creation. For this purpose an end-user programming environment augmented with a learning-by-demonstration tool and textual commands is being developed. The initial tests showed that this tool can be used by therapists to create their own training scenarios from existing behavioral components. We conclude that this platform complies with the needs of contemporary practices that focus on personalization of the training programs for every child. In addition, the proposed framework makes it possible to include extensions to the platform for future developments.

Keywords - end-user interfaces; robots for training autistic children; robot-assisted therapy; co-creation design process

I. INTRODUCTION

Finding ways for the uptake and deployment of robots and other unconventional technologies in therapies has the potential to make very intensive therapeutic interventions available to larger groups in society. Autism spectrum disorders (ASD) are conditions for which no curative treatments are available, but intensive behavioral interventions for young children, lasting one year or longer, may bring substantial improvements [1]. Searching for the answer to how this training can benefit from the use of robots, we identified that therapists are a key element of this process: (1) only the domain knowledge of the therapists can lead to the creation of efficient training programs with robots, and (2) the actual acceptance and participation of the therapists in this process will ensure the uptake of robot-mediated training.

Robots with different embodiments and levels of anthropomorphism have been used to train shared gaze and joint attention abilities [2-3]. Attempts have been made to use robots to improve imitation and turn-taking skills [4-9], to teach facial and body emotions [10], and to initiate social interaction [6,11-12]. Even though some of these studies show promising results, they are fragmented and do not systematically train the children towards a sustainable improvement. In addition, it is not clear whether the therapists will actually use these methods in daily practice. There is a range of obstacles, such as busy schedules, organizational difficulties, and, of course, fear of the technical complexity of controlling a robot.

The issue of creating meaningful training sessions with a robot has been addressed previously in [13]. However, in that study the therapist was a knowledge provider and was not meant to take part in the training practice. Differently, we engaged the therapists in a co-creation process for the development of scenarios that they would like to use as an augmentation to their practice. The "Therapist-in-the-loop" approach was proposed by Colton and colleagues in [15], in which the authors attempt to engage the child and facilitate social interactions between the child and a team of therapists. However, they do not consider designing a programming tool that gives affordances to training practices.

We are searching for ways to include robot interventions within established and modern therapies, with the far-reaching goal that in some of the training scenarios parts of the work of the therapists can be taken over by a robot companion. This paper features the technical aspects of this enabling process within the framework of its societal necessity.

In this paper, Section II outlines the framework for a user platform within the user context and the different building parts of the platform. Section III reports the results of the first iteration of the design process, in both technical and usability contexts. Section IV discusses our findings and gives suggestions for future research.

978-1-4673-0126-8/11/$26.00 (c) 2011 IEEE
2011 World Congress on Information and Communication Technologies
II. END-USER PROGRAMMING PLATFORM BASED ON USER REQUIREMENTS

The intention of this work is to make programming of behavior of arbitrary complexity possible for non-technically educated people such as therapists. For this purpose we combine the end-user visual programming environment TiViPE [20] and a learning-by-imitation tool as an interface between a robot and a therapist (Figure 1). With TiViPE, scenarios for specific learning objectives can easily be put together as if they were graphical LEGO-like building blocks. The functionality of the behaviors in the existing building blocks can be fine-tuned by textual commands typed within the blocks, or blocks with new functionality can be created. This platform controls the NAO robot [14].

Figure 1. Framework for an interface between a robot and a therapist: the end-user programming environment TiViPE combines LEGO-like programming by connecting behavioral blocks, a textual language to fine-tune behaviors, and a learning-by-demonstration tool.

A. Learning by Imitation

Imitation learning is a technique for manually programming robotic systems with basic movement skills. The robotic system 'learns' a skill either by observing human demonstrations of the desired behavior, or from the behavior being shown to it by a demonstrator moving the robot's limbs and body parts. In the current project, imitation learning techniques for robotics could help therapists to create their own training programs. A simple demonstration of the desired behavior to the robot could drastically reduce the technical knowledge required of the therapist. Furthermore, demonstrating the desired behavior could result in more human-like behavior of the robot, in contrast to a design made by graphical programming.

Imitation learning differs from simply repeating the demonstrated example behaviors, because in real-life situations the movement often has to be performed with different initial conditions, with avoidance of obstacles, and with consideration of the different embodiments of the robot and the human demonstrator.

In early research on imitation learning, robotic systems were programmed to replay prerecorded trajectories. A human operator designs the trajectory in detail and thereby uses his own knowledge of the task at hand. This static approach enables the high accuracy and operational velocity required in industrial settings. However, it requires a structured and highly dependable operating environment, i.e. it is not suitable for natural environments and interaction with humans. In unstructured environments such systems fail to perform adequately, since they are unable to adapt to a new situation. Furthermore, planning the trajectories requires a robotics expert and can be very time consuming.

A different approach to robot programming is the use of learning techniques. This approach does not prescribe trajectories, but iteratively creates and enhances its own trajectories to fulfill certain optimization requirements. The advantage is that the robot can operate in heavily unstructured environments. However, learning the trajectories can be very time consuming as the number of degrees of freedom (i.e. the search space) increases. Furthermore, the definition of a suitable optimization criterion is a difficult process.

The imitation learning that we use combines the two previous methods. The desired behavior is first demonstrated to the robot, resulting in examples of good trajectories. Subsequently, learning techniques are applied to further enhance and adjust the trajectories. The rationale is to make use of human knowledge of a task that would normally be encoded in prerecorded trajectories. By showing examples to the robot, users effectively reduce the search space, and the learning algorithm thereby avoids unnecessary learning trials [23]. Imitation learning potentially enables non-experts to program robotic systems by a simple demonstration of the correct behavior; therefore, the term programming by demonstration (PbD) is often used.

In the current work, imitation learning techniques could help therapists to show movements and movement sequences to the robot. However, programming the overall behavior and interaction of the robot, which requires parallel movement commands accompanied by speech and guided by interactive responses to human behavior, can practically not rely entirely on this method. So far we work mainly with a graphical programming environment, and the learning-by-demonstration tool is developed as an additional feature.

B. End-user Graphical Environment TiViPE

The robot software framework includes commands that interface with the robot, a language that ensures non-conflicting operation of these commands in the presence of parallel actions and multiple conflicting action possibilities, and a visual programming environment that allows easy construction and visualization of the action stream. From the perspective of the end user, actual programming consists of connecting LEGO-like blocks. In addition, a textual robot language makes it possible to create and execute action behaviors within the TiViPE interface. This graphical language ensures that neither in-depth knowledge of robot hardware or software is required, nor is the designer of behaviors confined to a specific robot.

A minimalistic language in which the robot commands can be executed in a serial or parallel manner is proposed in [21]. The language resembles adding and multiplying of symbols, where the symbols are equivalent to behavioral



primitives, and adding or multiplying defines parallel or serial actions, correspondingly.

Using sensory information is crucial for constructing an interactive system. Issues that need to be solved include how to construct parallel behaviors that have different cycle (or update) speeds. Moreover, there is a discrepancy between the capacities of a robot to perceive (read sensory information) and to act: the capacity to act is around two orders of magnitude smaller than the sensing capacity. The creation of lively behaviors that are robust to external disturbances and to unexpected behavior of the person interacting with the robot is possible through the introduction of a state-space concept. Within this concept a single behavior, which we call a behavioral primitive, can be modeled as a single state in a dynamical system. The state space is a collection of primitives that describes a continuous sequence of actions, and the evolution rule predicts the next state or states. Formally, the state space can be defined as a tuple [N;A], where N is a set of states and A is a set of transitions connecting the states. An example of such a state space is given in Figure 2.

Figure 2. An example of constructing behaviors in state space: a) flow-chart diagram of a state-space based behavior; b) TiViPE implementation of this behavior.

These states can be converted into a flow diagram as given in the left part of this figure. In this diagram every module has two internal parameters: the first gives the identifier of the state, and the second gives a list of transitions (visualized as arcs). Figure 2b) shows a TiViPE implementation of this state space.

III. RESULTS

So far, two groups of results have been achieved. The first group consists of technical developments, and the second of testing the usability of the technology. These results are interdependent and are obtained in an iterative co-creation process. The following steps have to be performed multiple times within this process: (1) a concept for a domain-specific solution has to be developed, followed by (2) technical implementation of this concept and (3) testing of the concept in a clinical setting. The lessons learned are analyzed to define the tuned domain solution for the following design cycle of the process. The main goal of the developed platform is to enable therapists to create custom robot scenarios for children for whom different learning objectives are set. Within the first cycle, the enabling technology is created and its usability is tested. The learning-by-demonstration tool is developed with the intention of facilitating the creation of individual movements and simple instrumental tasks. It is intended to augment the programming environment within which behaviors of higher complexity are created. Examples of such behaviors are using multiple body movements and speech of the robot in parallel, performing sequences of movements and cycles of actions, and taking decisions to act as a result of a sensing and reasoning process.

As said in the previous section, learning by demonstration differs from simply repeating the example behaviors, because in real-life situations the movement often has to be performed with different initial conditions (especially for a robot with the technical accuracy of NAO, which is not meant for precision handling) and with avoidance of obstacles. When the demonstrator shows a behavior to the robot, he or she will leave the robot hand in an arbitrary position before the repetition by the robot takes place; the demonstrator can never succeed in manually positioning the robot hand in the same initial position, even if that were desirable. Therefore, the provided examples should be abstracted to represent a generalized basic skill, which is defined as a movement primitive. The construction of such a movement primitive is illustrated in Figure 3.

Figure 3. Construction of a movement primitive.

Movement primitives contain a mathematical abstraction of the task-space trajectories x_j and are created as a set of nonlinear differential equations, as suggested in [19]. Equations (1) and (2) describe this process:

\ddot{x}_j = \alpha \left( \beta (g_j - x_j) - \dot{x}_j \right) + f_j(s)    (1)

f_j(s) = \frac{\sum_i w_i \psi_i(s)}{\sum_i \psi_i(s)} \, s    (2)

where g_j represents the (possibly changing) goal state and \psi_i are activation functions that become active depending on the normalized time s. The weighting factors w_i are chosen such that the solution to (1) equals the originally demonstrated trajectory. The advantage of using differential equations for path planning is that they represent a flow field of trajectories rather than one single trajectory. The trajectory so created is translated to the joint space of the robot.
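As a rough numerical sketch (not the authors' implementation), Equations (1) and (2) can be integrated as follows. The gain values, the Gaussian activation functions, and the canonical system that generates the normalized time s are assumptions in the spirit of [19]:

```python
import numpy as np

def rollout_primitive(w, g, x0, alpha=25.0, beta=6.25,
                      alpha_s=4.0, dt=0.001, T=1.0):
    """Integrate one movement primitive, Eqs. (1)-(2).

    Assumptions (not stated in the paper): Euler integration, a
    canonical system ds/dt = -alpha_s * s that yields the normalized
    time s, and Gaussian activation functions psi_i(s).
    """
    n = len(w)
    centers = np.exp(-alpha_s * np.linspace(0.0, T, n))  # basis centers in s
    widths = n ** 1.5 / centers                          # heuristic widths
    x, xd, s = float(x0), 0.0, 1.0
    traj = []
    for _ in range(int(round(T / dt))):
        psi = np.exp(-widths * (s - centers) ** 2)       # activations psi_i(s)
        f = (psi @ w) / (psi.sum() + 1e-10) * s          # forcing term, Eq. (2)
        xdd = alpha * (beta * (g - x) - xd) + f          # dynamics, Eq. (1)
        xd += xdd * dt
        x += xd * dt
        s += -alpha_s * s * dt                           # canonical system
        traj.append(x)
    return np.array(traj)
```

With all weights w_i set to zero the forcing term vanishes and the primitive simply converges from x0 to the goal g; nonzero weights, fitted from a demonstration, shape the path while preserving this convergence.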
Safe human-robot interaction requires adaptation of the traditional robot control paradigm. Using high-gain control to minimize feedback errors introduces unacceptable safety risks. Low-gain control decreases the safety risks at the expense of low tracking performance. Furthermore, it puts a strain on the accuracy of the inverse dynamic model that computes the required feedforward forces for a given trajectory. The introduction of active force control can be used to overcome some of the safety problems. However, in the case of 'severe' disturbances such as human interaction, trajectory re-planning is highly desirable. This effectively means the introduction of an additional high-level control loop to the trajectory generator, which re-plans the desired trajectory in case of human interaction.

Figure 4 shows an example in which the robot learns to move a ball on a plane. First, the demonstrator grasps the robot hand and moves it to perform the desired action. Second, the robot repeats this action.

Figure 4. The snapshot on the right shows the robot repeating the movement demonstrated manually in the left snapshot.

Learning by imitation is also constrained by the different embodiments of the demonstrator and the robot. In addition to having the same effect, the demonstrated and repeated trajectories have to be perceptually similar. In this case, the path- and trajectory-planning problem changes to a force-planning problem. This means that the force f_j can be interpreted as the force that should be applied so that the system in Equation (1) has a solution that matches the demonstration trajectory. The time evolution of the force f_j(s) is considered to be a key characteristic of the movement. Figure 5 shows the demonstrated and the executed trajectories that are obtained as a result of the force-planning approach. The two trajectories are perceptually similar.

Figure 5. Two examples of execution trajectories, obtained as a result of the force-planning approach. The demonstrated and the executed trajectories are perceptually similar.

To allow the design of more complex behaviors, TiViPE uses a box-wire approach to create flow-charts with behavioral sequences, parallel behaviors and processes, and repetitive behaviors that depend on the sensory state. This process complies with the common practice in therapeutic institutions to create training exercises by putting together a learning goal and defining the steps and dependencies for its achievement by the children and the therapists. We match the affordances of the robot and the programming environment with the practice of the therapists, as shown in [17]. Figure 6 depicts a part of a robot-child interaction scenario as prepared in a co-creation process with the therapist, and its implementation with TiViPE.

Figure 6. A snapshot of a training scenario created with the TiViPE environment. The extensions show, correspondingly: right upper corner, the network behind an integrated block; right lower corner, a textual command block.
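The state-space concept introduced earlier (a behavior as a tuple [N;A] of states and transitions) can be sketched in code as follows. The class, method names, and example primitives are hypothetical illustrations, not the actual TiViPE implementation, which is a graphical box-wire environment:

```python
# Toy sketch of the state-space concept: behavioral primitives as
# states (N) with sensory-dependent transitions (A). All names here
# are illustrative assumptions, not TiViPE APIs.

class BehaviorStateSpace:
    def __init__(self, states, transitions, start):
        self.states = states            # N: state id -> behavioral primitive (callable)
        self.transitions = transitions  # A: state id -> list of (predicate, next id)
        self.current = start

    def step(self, sensors):
        """Execute the current primitive, then follow the first
        transition whose predicate matches the sensory state."""
        action = self.states[self.current](sensors)
        for predicate, nxt in self.transitions.get(self.current, []):
            if predicate(sensors):
                self.current = nxt
                break
        return action

# Example: a greeting fragment -- wave, then wait for the child's
# response before speaking (purely hypothetical primitives).
machine = BehaviorStateSpace(
    states={
        "wave": lambda s: "robot waves",
        "wait": lambda s: "robot waits",
        "speak": lambda s: "robot says hello",
    },
    transitions={
        "wave": [(lambda s: True, "wait")],
        "wait": [(lambda s: s.get("child_responded", False), "speak")],
    },
    start="wave",
)
```

Each call to step() executes the current behavioral primitive and then evaluates the outgoing transitions against the sensory state, so the machine remains in "wait" until the child responds.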



Each box in Figure 6 represents a behavioral component with a different level of complexity. It can be connected to other components, creating a new network. Several components can be integrated into a single component by just two mouse clicks: select and merge. The properties of a component can also be modified without requiring recompilation of the TiViPE network. Furthermore, networks of components can also be merged into new components, which enables users to build complex, interactive and intelligent behaviors.

Along with these technical achievements, two tests were performed to explore whether the therapists are actually able to create training scenarios with the robot using TiViPE. The results show that at this point of development the graphical environment interface could be used by therapists for the adaptation of existing scenarios.

A pilot test with five participants with a healthcare background and an actual test at the Dr. Leo Kannerhuis clinic for autistic children were performed. The participants were given four different tasks, derived from the Cognitive Dimensions Framework [22]. The tasks were defined as incrementation, modification, transcription and exploratory design. The tasks that were created were, respectively: adding a new component from a library, editing a component that is already integrated in the network, adding a new section to the network, and making an own contribution to the network. For the tasks, a scenario was provided in flowchart form on paper, and a network representing this scenario was also pre-made. The participants had to complete, change and run this scenario to accomplish the four tasks.

In this first evaluation, all participants were new to the programming environment. The first task was described in detail, and every following task with less detail. This gave the participants in the test the opportunity to gradually but quickly become familiar with the functionalities of the program, so that they could make their own contribution in the last task.

The programming environment was not demonstrated beforehand. Instead, a short explanation on paper was provided, describing the components used and how they are connected, along with the task list and the flowchart. Participants were asked to think aloud while performing the tasks, and were free to ask questions when they could not progress.

After the tasks, the experience of the participants was evaluated with a short questionnaire. The questionnaire featured 28 statements related to their experience with components and networks in TiViPE, and 7 statements about how the robot was perceived. There was extra room for comments at the end of the questionnaire. Statements could be evaluated on a five-point Likert scale. They were related to one or more of the cognitive dimensions [22].

The goal of the pilot study, which was performed outside the clinic, was to make sure that the questionnaire and tasks were defined properly. After the pilot, the questionnaire and task list were improved to make the different dimensions in the questionnaire more balanced, and the wording was improved for the actual clinical test.

Both test groups performed the four tasks within the range of 25 minutes. The conclusions made during the tests are taken into account for the current iteration of the technical re-design. These conclusions are directed at the overall design of the graphical interface, the need for external (remote) control, and the overall experimental setting.

IV. DISCUSSION AND FUTURE WORK

The experience so far and the initial results show that the iterative co-creation process is an appropriate direction to follow if advanced technology is to be implemented in clinical practice or in other application domains where non-technical specialists have to be involved. The technical specialists and the therapists from the autism clinic collaborated on a regular basis. Most of the time spent during this first iteration phase was devoted to technology development and domain knowledge creation. After the first iteration cycle we found that the process of co-creation of domain-specific knowledge is more time consuming than expected. Within the experimental evaluation of the initial technical platform it was established that providing the appropriate tools and creating domain-relevant tasks makes it technically possible for high-tech platforms such as robots to be used in clinics. We cannot yet give guidelines on how the implementation of such a platform in clinical institutions is possible with respect to the human factors involved. Based on the findings from the tests and additional discussions, we concluded that therapists need more control over the robot, so a remote control option will be introduced; a remote controller is also needed to fake intelligent behavior when the creation of such behavior is beyond the achievements of state-of-the-art research.

The developments with respect to the technical platform were directed towards the creation of affordances that will make possible multimodal interaction between a human and a robot. The robot needs to have or simulate intelligent behavior, such as spontaneous reaction to sensory information, and robustness to disturbances from humans or other environmental agents. These requirements are satisfied by the introduction of a state-space operation concept with continuous account of multimodal sensory information.

Within this project, two user tests with therapists were performed to explore the usability issues of the programming environment for therapists. The results show that the graphical environment could be used by therapists for the personalization of existing scenarios. We attempt to make it possible for the therapists themselves to create training scenarios. For this purpose we aim at the development of a critical mass of interactive behavioral components and scenarios. In addition, we are developing a Wiki platform where therapists from different clinics can share the created scenarios.

In the long term, imitation learning techniques could potentially help therapists to create their own treatment programs. A simple demonstration of the desired behavior to the robot could drastically reduce the technical knowledge required of the therapist. Furthermore, imitation is closely related to playful behavior, which is an important exploratory activity in which known movements can be



combined arbitrarily. This might lead to the exploration of resources unknown up to this point. Training of imitation behavior could be especially helpful for autistic children [4]. Within the clinic that we work with, training of imitation skills is not an essential learning goal, because of the age and IQ level of the children.

ACKNOWLEDGMENT

I gratefully acknowledge the support of the Innovation-Oriented Research Program 'Integral Product Creation and Realization (IOP IPCR)' of the Netherlands Ministry of Economic Affairs, Agriculture and Innovation. Furthermore, many thanks to the Ph.D. students working on this project, Jan Gillesen and Stijn Boere; to professors Panos Markopoulos, Henk Nijmeijer and Loe Feijs, all from Eindhoven University of Technology; to Bibi Huskens and Astrid Van Dijk from the autism clinic Dr. Leo Kannerhuis, The Netherlands; and to Tino Lourens from the TiViPE company for their multiple contributions to this multidisciplinary research.

REFERENCES

[1] O. I. Lovaas, "Behavioral treatment and normal educational and intellectual functioning in young autistic children," Journal of Consulting and Clinical Psychology, vol. 55, pp. 3-9, 1987.
[2] H. Kozima and C. Nakagawa, "Interactive robots as facilitators of children's social development," in Mobile Robots: Toward New Applications, 2006, pp. 269-286.
[3] B. Robins, P. Dickerson, P. Stribling, and K. Dautenhahn, "Robot-mediated joint attention in children with autism: A case study in robot-human interaction," Interaction Studies, vol. 5, no. 2, pp. 161-198, 2004.
[4] G. Bird, J. Leighton, C. Press, and C. Heyes, "Intact automatic imitation of human and robot actions in autism spectrum disorders," Proceedings of the Royal Society B: Biological Sciences, vol. 274, no. 1628, pp. 3027-3031, Dec. 2007.
[5] J. C. J. Brok and E. I. Barakova, "Engaging autistic children in imitation and turn-taking games with multiagent system of interactive lighting blocks," IFIP International Federation for Information Processing, pp. 115-126, 2010.
[6] K. Dautenhahn and I. Werry, "Towards interactive robots in autism therapy," Pragmatics & Cognition, vol. 12, no. 1, pp. 1-35, 2004.
[7] A. Duquette, F. Michaud, and H. Mercier, "Exploring the use of a mobile robot as an imitation agent with children with low-functioning autism," Autonomous Robots, vol. 24, no. 2, pp. 147-157, 2008.
[8] G. Pioggia, R. Igliozzi, M. Sica, M. Ferro, and F. Muratori, "Exploring emotional and imitational android-based interactions in autistic spectrum disorders," Journal of CyberTherapy & Rehabilitation, vol. 1, no. 1, pp. 49-61, 2008.
[9] B. Robins, K. Dautenhahn, R. Te Boekhorst, and A. Billard, "Robotic assistants in therapy and education of children with autism: can a small humanoid robot help encourage social interaction skills?," Universal Access in the Information Society, vol. 4, no. 2, pp. 105-120, 2005.
[10] E. I. Barakova and T. Lourens, "Expressing and interpreting emotional movements in social games with robots," Personal and Ubiquitous Computing, vol. 14, pp. 457-467, 2010.
[11] E. I. Barakova, J. Gillesen, and L. Feijs, "Social training of autistic children with interactive intelligent agents," Journal of Integrative Neuroscience, vol. 8, no. 1, pp. 23-34, 2009.
[12] D. Feil-Seifer and M. Mataric', "Robot-assisted therapy for children with autism spectrum disorders," in Proceedings of the 7th International Conference on Interaction Design and Children (IDC '08), p. 49, 2008.
[13] T. Bernd, G. J. Gelderblom, S. Vanstipelen, and L. De Witte, "Short term effect evaluation of IROMEC involved therapy for children with intellectual disabilities," in Proceedings of the Second International Conference on Social Robotics, 2010, pp. 259-264.
[14] http://www.aldebaran-robotics.com
[15] M. B. Colton, D. J. Ricks, M. A. Goodrich, B. Dariush, K. Fujimura, and M. Fujiki, "Toward therapist-in-the-loop assistive robotics for children with autism and specific language impairment," in AISB New Frontiers in Human-Robot Interaction Symposium, 2009, vol. 24, p. 25.
[16] K. Hume, S. Bellini, and C. Pratt, "The usage and perceived outcomes of early intervention and early childhood programs for young children with autism spectrum disorder," Topics in Early Childhood Special Education, vol. 25, no. 4, pp. 195-207, 2005.
[17] J. C. C. Gillesen, E. I. Barakova, B. E. B. M. Huskens, and L. M. G. Feijs, "From training to robot behavior: Towards custom scenarios for robotics in training programs for ASD," in IEEE International Conference on Rehabilitation Robotics, 2011, pp. 387-393.
[18] R. L. Simpson, "Evidence-based practices and students with autism spectrum disorders," Focus on Autism and Other Developmental Disabilities, vol. 20, no. 3, pp. 140-149, 2005.
[19] A. J. Ijspeert, J. Nakanishi, and S. Schaal, "Movement imitation with nonlinear dynamical systems in humanoid robots," in Proceedings of the 2002 IEEE International Conference on Robotics and Automation, pp. 1398-1403.
[20] T. Lourens, "TiViPE—Tino's Visual Programming Environment," in The 28th Annual International Computer Software & Applications Conference (IEEE COMPSAC), 2004, pp. 10-15.
[21] T. Lourens and E. I. Barakova, "User-friendly robot environment for creation of social scenarios," in Foundations on Natural and Artificial Computation, J. Ferrández et al., Eds., LNCS 6686, 2011, pp. 212-221.
[22] T. Green and A. Blackwell, "Cognitive dimensions of information artefacts: a tutorial," BCS HCI Conference, 1998.
[23] A. Billard, S. Calinon, R. Dillmann, and S. Schaal, "Robot programming by demonstration," in Handbook of Robotics, B. Siciliano and O. Khatib, Eds., Springer, 2008, pp. 1371-1394.

