Our hands speak: Robot-based intervention to promote gestural communication in children with Autism Spectrum Disorders (ASD)
妙手可言 (Our Hands Can Speak): a program using a social robot to support gestural communication in children with autism — telling you with your hands
[A project led by Prof. So Wing-chee, Catherine]
Gesture, spontaneous hand movement produced with or without speech, is an effective channel of communication. Our previous studies were the first to show that school-aged children with Autism Spectrum Disorders (ASD) exhibit gesture delay in middle and late childhood (So, Lui, Wong, & Sit, 2015a; So, Wong, Lui, & Yip, 2015b; So & Wong, 2016). The difficulty with gesture use is more salient among children with weaker social and communication skills (So, Wong, & Lam, 2016a). In collaboration with the Faculty of Engineering, the Faculty of Arts, and the Faculty of Medicine, our team therefore designs and implements home-based and school-based intervention programs that use a humanoid social robot to promote these children's gestural communication skills. We also work with a number of organizations that serve children with ASD in Hong Kong, including the Hong Chi Association, SAHK, Hong Kong Christian Service, the STEP Center for Child Development, and Hong Kong Sheng Kung Hui. Our programs are funded by the CUHK Knowledge Transfer Fund, the Quality Education Fund, and the Social Innovation Enterprise Fund.
In the first intervention program, more than 90 school-aged children with ASD were taught, in three phases, to recognize 20 iconic and marker gestures produced by a robot animation, to imitate them, and to produce them in appropriate social contexts (So, Wong, Lam, Cabibihan, Chan, & Qian, 2016b). The program was conducted either at the children's homes or at their enrolled schools. The results showed significant improvements from the pretest to both the immediate and the follow-up posttests, and the children generalized their acquired gestural skills to a novel setting with a human researcher. Thus, robot animation is effective in teaching children with ASD to recognize and produce gestures.
Following the success of the gestural training with the animated robot, our team used a real social robot to teach school-aged children with ASD to recognize and produce gestures that express feelings and needs (e.g., NOISY). Compared with students in a control group who had not received the training, students in the intervention group were better able to recognize and produce the eight gestures in both trained and untrained scenarios. Even more promisingly, these students could also recognize the same gestures in human-to-human interactions.
Currently, we are designing new robot-based intervention programs that teach young children with ASD to incorporate gestures into conversation, develop narrative skills, and build social competence through drama therapy.