Robot-child interactions – helping children with autism learn skills

Published on Thursday, 23 May 2013

Human-Computer Interaction research goes far beyond evaluating an office worker at a stationary computer or laptop. Researchers worldwide develop and evaluate robots to learn more about human-robot interaction, robot-robot interaction, and the roles robots can play in many of our daily activities. Not long ago this all seemed too futuristic: who doesn’t know R2-D2 or C-3PO from the Star Wars movies? Now we know that the future has arrived.

Besides being friendly helpers, film robots such as R2-D2 and the robots in WALL-E also interact with people much like another person would. These interactions are important to study: children with autism, for example, often have serious difficulty learning social skills and could potentially benefit from practicing these skills with a robot.

Evaluate interactions objectively

Whatever the HCI or robotics research topic, behavioral observation is a valuable instrument: it allows researchers to evaluate interactions objectively. Recently, Sandra Costa from the University of Minho in Guimarães, Portugal, and her colleagues presented their work at ACHI 2013, the Sixth International Conference on Advances in Computer-Human Interactions.

They explored whether a robot, in this case KASPAR (a humanoid robot with a static body that can move its arms and head to simulate gestures used in social interaction), could help elicit interactions between a child with autism and another person, and whether KASPAR could help the children learn about body parts and identify parts of their own body.

Different research methods

Costa et al. used questionnaires, structured interviews, behavioral analysis, and a comparison between pre- and post-test to gather data. The interactions were recorded on video for later analysis in The Observer XT; two cameras captured the faces of both the child and the experimenter during the tests. The researchers coded the following behaviors: looking, touching, following, pointing, imitation, prompts, and identifying body parts. By accurately coding these behaviors, they gained insight into the behavioral changes between the first and the last session.
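
To illustrate how such coded observations can be compared across sessions, here is a minimal sketch in Python. The event log (session, behavior, duration) is invented for illustration; the actual coding scheme and the export format used by Costa et al. in The Observer XT will differ.

# Hypothetical sketch: summarizing coded behaviors per session.
# The records below are invented for illustration; a real export from
# The Observer XT would contain the study's own coding scheme.
from collections import Counter

# Each record is one coded event: (session, behavior, duration in seconds)
events = [
    ("first", "looking at robot", 12.0),
    ("first", "touching", 4.5),
    ("first", "pointing", 2.0),
    ("last", "looking at experimenter", 9.0),
    ("last", "identifying body parts", 6.5),
    ("last", "imitation", 3.0),
]

def summarize(session):
    """Count events and total duration per behavior for one session."""
    counts, durations = Counter(), Counter()
    for sess, behavior, duration in events:
        if sess == session:
            counts[behavior] += 1
            durations[behavior] += duration
    return counts, durations

# Compare the first and the last session side by side
for session in ("first", "last"):
    counts, durations = summarize(session)
    print(f"Session: {session}")
    for behavior in sorted(counts):
        print(f"  {behavior}: {counts[behavior]} events, {durations[behavior]:.1f} s")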

Choose the location

The tests in which the robot and child worked on body parts were carried out at the child’s school. Because the children were familiar with the room, the test setting added no extra stress.

Teaching skills

To teach the children about body parts, the researchers designed different scenarios with the robot and the child as participants. The children could touch the robot, and thanks to its eight touch sensors the robot could give the child direct feedback. Teaching children with autism about body parts using a humanoid robot can be effective, but more research is needed.
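
The kind of direct feedback described above can be sketched as a simple mapping from touch sensors to spoken responses. The sensor layout and feedback phrases below are assumptions made for illustration, not KASPAR's actual control software.

# Hypothetical sketch of sensor-triggered feedback in a "find the body part"
# scenario. Sensor numbering and phrases are assumptions, not KASPAR's real setup.
BODY_PART_SENSORS = {
    1: "head", 2: "nose", 3: "left arm", 4: "right arm",
    5: "left hand", 6: "right hand", 7: "torso", 8: "foot",
}

def feedback_for_touch(sensor_id, target_part):
    """Return the robot's spoken feedback when a sensor is touched."""
    touched = BODY_PART_SENSORS.get(sensor_id)
    if touched is None:
        return "I did not feel anything."
    if touched == target_part:
        return f"Well done! You found my {touched}."
    return f"That is my {touched}. Can you find my {target_part}?"

# Example: the child is asked to find the robot's nose
print(feedback_for_touch(2, "nose"))   # correct touch: positive reinforcement
print(feedback_for_touch(5, "nose"))   # wrong touch: gentle correction and retry prompt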

Costa et al. found differences in the children’s knowledge of body parts, and the teachers reported a positive behavioral change: the children became better at identifying parts of their own body with their own hands.

Robot - child interaction

Comparing the first and the last test session, the children directed their gaze towards KASPAR less and less, while paying more attention to the experimenter. So what happened? Could the robot have helped to stimulate interaction between a child with autism and another person, in this case the experimenter?

The researchers conclude that the robot may well have helped facilitate interaction between the child and the experimenter, but they found it difficult to draw causal conclusions because other factors could also have contributed to the increased level of interaction.

Reference

Costa, S.; Lehmann, H.; Robins, B.; Dautenhahn, K.; Soares, F. (2013). "Where is your nose?" - Developing body awareness skills among children with autism using a humanoid robot. Proceedings of ACHI 2013: The Sixth International Conference on Advances in Computer-Human Interactions, 117-122.
