Introducing the Furhat robot: a revolutionary platform that combines cutting-edge robotics with advanced artificial intelligence to transform the way we interact with and support children with Autism Spectrum Disorder (ASD). Our software for the Furhat robot is designed specifically to meet the unique needs of children with ASD.
We have designed two fully functional arms, and the robot can perform various gestures in response to spoken commands from participants, enabling it to engage with children in a more natural and intuitive manner. We have incorporated innovative conversational design and leveraged the power of OpenAI for interaction, paving the way for a transformative experience in ASD therapy.
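As an illustration of the command-driven gesture design described above, the sketch below maps a recognized utterance to a gesture name. This is a minimal, hypothetical Python sketch: the gesture names, command keywords, and the `choose_gesture` helper are illustrative assumptions, not the actual Furhat skill code (real Furhat skills are written against the Furhat SDK).

```python
# Hypothetical command-to-gesture dispatch, sketching the interaction
# loop described above. All names here are illustrative only.

GESTURE_MAP = {
    "wave": "WaveRightArm",
    "clap": "ClapHands",
    "point": "PointForward",
}

def choose_gesture(utterance: str):
    """Return a gesture name if the utterance contains a known command,
    otherwise None (so the system can fall back to the conversational,
    OpenAI-driven reply instead)."""
    words = utterance.lower().split()
    for keyword, gesture in GESTURE_MAP.items():
        if keyword in words:
            return gesture
    return None
```

In practice, the keyword table would be replaced by the robot's speech-recognition intents, and the `None` branch is where a large-language-model response would be generated.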
By augmenting traditional therapy methods with the potential of technology, the Furhat robot offers new possibilities for intervention and support, providing consistent, patient, and non-judgmental interaction. It serves as a valuable tool in helping ASD children navigate the complexities of social interaction and communication.
Looking ahead, the integration of AI with the Furhat robot holds immense promise for the future of ASD therapy. As we continue to refine and expand upon our software capabilities, we envision a world where children with ASD can access tailored support whenever and wherever they need it, breaking down barriers and empowering them to reach their full potential. Together, we are redefining what is possible in the training and support of ASD children, ushering in a new era of innovation and hope.
The musical training combines emotional music with human-like emotional facial expressions to help autistic children improve their emotion recognition across domains.
The speech training combines emotional speech carrying neutral semantic content with human-like emotional facial expressions to help autistic children improve their emotion recognition across domains.
Phonetics, phonology, computational modelling of linguistic data, and interdisciplinary studies involving human-robot interaction
Our research team focuses mainly on computational modelling of speech perception and production cross-linguistically (Chongming Chinese, Nanjing Chinese, Cantonese, Mandarin, Japanese, Korean and English). Recently, our team has also started research projects in clinical linguistics, such as the production and perception of speech prosody in Cantonese-speaking children with and without autism. We have designed speech and musical training to improve the use of prosody to mark information structure, and we have created robot-assisted training programs for autistic children.
We now have Ph.D. positions open for application in 2023 and 2024. If you are interested, please send your CV to Dr. Si Chen (sarah.chen@polyu.edu.hk).
We now have post-doc positions open for application in 2023 and 2024. If you are interested, please send your CV to Dr. Si Chen (sarah.chen@polyu.edu.hk).
For research assistant applications, please check the website for recent updates: https://jobs.polyu.edu.hk/research.php
1. PI: Early diagnosis and treatment of children with autism spectrum disorder based on an integration of robot and augmented reality technology
Sin Wai Kin Foundation Limited, 2023-2025 (HK$2,500,000).
2. PI: Early diagnosis and treatment of children with autism spectrum disorder based on an integration of robot and augmented reality technology
RGC matching fund (HK$1,250,000)
3. PI: Robot-assisted speech & musical training in improving speech prosody production & processing by Cantonese-speaking autistic children
Project of strategic importance, PolyU 2022-2024 ($2,000,000)
4. PI: Developing a Social Robot for Cantonese & Mandarin Speech Prosody Training in Children with Autism Spectrum Disorder (Application No.: 2122-01), submitted for Research & Development (R&D) Projects 2021-22 of the Standing Committee on Language Education & Research (SCOLAR) ($1,128,760)
5. PI: The effects of robot-assisted speech training in improving speech prosody by trilingual children with autism spectrum disorder. RGC grant 2021-2023 ($200,000)
6. PI: Multimodal and multilingual individualized training in improving empathy and emotion recognition by autistic children combining robotics and AR technology. ITF scheme (HK$2,199,237.50)
Wong, C. H., Wong, M. N., Chen, S., & Lin, W. Y. (accepted). Pitch-variation skills in Cantonese speakers with apraxia of speech after stroke: Preliminary findings of acoustic analyses. Journal of Speech, Language, and Hearing Research.
Zhang, Y. X., Chen, X. (co-first author), Chen, S., Meng, Y. Z., Lee, K. L., Mizuguchi, S., & Tateishi, K. (2023). Visual-Auditory Perception of Prosodic Focus in Japanese by Native and Non-native Speakers. Frontiers in Psychology.
Hong, Y. T., Chen, S., Zhou, F., Chan, A., & Tang, T. (2023). Phonetic Entrainment in Human-Robot Interaction: An Investigation of Children with and without Autism Spectrum Disorder. Frontiers in Psychology.
Chen, S., Hong, Y. T., Li, B., & Chun, E. (2023). The f0 perturbation effects in focus marking: evidence from Korean and Japanese. PLOS ONE.
Chan, W. S. A., Chen, S., Tse, B., Hamdani, S. Z., & Cheng, C. W. (2023). Storytelling in bilingual Urdu-Cantonese ethnic minority children: macrostructure and its relation to microstructural linguistic skills. Frontiers in Psychology: Language Sciences. https://doi.org/10.3389/fpsyg.2023.924056
Chen, S., Zhang, C. C., Lau, P. Y., Yang, Y. K., & Li, B. (2022). Modelling representations in speech normalization of prosodic cues. Scientific Reports, 12(1), 1-21. https://www.nature.com/articles/s41598-022-18838-w
Chen, S., Li, B., He, Y., Chen, S., Yang, Y., & Zhou, F. (2022). The effects of perceptual training on speech production of Mandarin sandhi tones by tonal and non-tonal speakers. Speech Communication, 139, 10-21. https://www.sciencedirect.com/science/article/abs/pii/S0167639322000334
Chen, S., Yang, Y. K., & Wayland, R. (2021). Categorical Perception of Mandarin Pitch Directions by Cantonese-Speaking Musicians & Non-musicians. Front. Psychol. 12: 713949. doi: 10.3389/fpsyg.2021.713949. https://www.frontiersin.org/articles/10.3389/fpsyg.2021.713949/full
In a remarkable feat, Dr. Si Chen, Assistant Professor at CBS, has been honored with the prestigious PolyU Young Innovative Researcher Award (YIRA) for 2022. Dr. Chen's exceptional contributions propelled her to the forefront among 59 contenders spanning diverse faculties and departments, securing her a well-deserved place as one of the six distinguished YIRA awardees.
Dr. Chen's groundbreaking research focus lies in robot-assisted training for children with autism spectrum disorder. This pioneering work seamlessly aligns with the YIRA's core mission—to recognize and celebrate young PolyU researchers under the age of 35, who embody innovation, drive technological advancement, and steer transformative solutions that address societal challenges, paving the way for a brighter future.
To learn more about Dr. Chen's outstanding achievement and the impact of her pioneering research, you can watch the linked video below.