LILI: Lehigh Instrument for Learning Interaction
System Overview
The Lehigh Instrument for Learning Interaction (LILI) is a joint project with the Lehigh WiNS Lab, led by Prof. Mooi Choo Chuah. It is motivated by recent studies showing that children with autism spectrum disorder (ASD) tend to speak and interact more in the presence of an interactive robot. Unfortunately, most robotic experiments to date have been conducted in highly controlled clinical settings or in a small number of selected homes because of high deployment costs. Our goal is to develop a low-cost, interactive robot that can be readily deployed to home environments. LILI interacts with users via gestures, voice commands, and an animated speaking avatar. LILI can recognize users’ faces, and her motion can be controlled by either gesture or voice.
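The project page does not describe LILI's perception software, but as a rough illustration of the face-sensing step, the sketch below detects faces in an RGB stream using OpenCV's stock Haar cascade. The camera index, parameter values, and the use of OpenCV itself are our assumptions rather than the project's actual implementation; recognizing individual users would additionally require a trained identity model.

```python
import cv2

# Load OpenCV's stock frontal-face Haar cascade (ships with opencv-python).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Device index 0 is assumed to be the RGB stream of the depth camera;
# on the actual robot the Xtion would typically be accessed via OpenNI.
camera = cv2.VideoCapture(0)

while True:
    ok, frame = camera.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect faces; scaleFactor and minNeighbors are typical starting values.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("LILI face view", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

camera.release()
cv2.destroyAllWindows()
```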
LILI consists of an iRobot Create base for mobility, an Asus Xtion PRO sensor for perception, a Hokuyo URG-04LX-UG01 scanning laser rangefinder for obstacle avoidance, and a video monitor for avatar interaction. All computation is handled by an onboard computer with an Intel Core i3-3220T 64-bit processor running Windows 7 Enterprise.
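LILI's control software is not documented here, but the following minimal sketch shows, under our own assumptions, how the laser rangefinder and the Create base could be tied together for the obstacle-avoidance role described above. The Create is driven through its documented Open Interface serial protocol (opcodes 128, 131, and 137), while `get_scan` is a hypothetical placeholder for a real Hokuyo SCIP 2.0 driver; the port name, threshold, and speed are illustrative only.

```python
import struct
import time

import serial  # pyserial

# Hypothetical stand-in for the Hokuyo driver. The real URG-04LX-UG01
# speaks the SCIP 2.0 protocol over USB serial; here we simply return
# a placeholder scan of ranges in meters.
def get_scan():
    return [1.5] * 682

# Open the Create's serial link. The port name is an assumption; the
# Create 1 Open Interface defaults to 57600 baud.
create = serial.Serial("COM3", 57600, timeout=1)
create.write(bytes([128, 131]))  # OI opcodes: 128 = Start, 131 = Safe mode

def drive_straight(velocity_mm_s):
    # Opcode 137 (Drive): velocity and radius as big-endian 16-bit values.
    # Radius 0x8000 is the OI's special "drive straight" code.
    create.write(struct.pack(">BhH", 137, velocity_mm_s, 0x8000))

# Creep forward until the laser sees anything closer than 0.4 m.
while min(get_scan()) > 0.4:
    drive_straight(100)  # 100 mm/s
    time.sleep(0.1)
drive_straight(0)        # obstacle detected: stop
```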
Videos
LILI 1.0 Demo – Smart Spaces REU Site, Summer 2014
Publications
- M. Chuah, D. Coombe, C. Garman, C. Guerrero, and J. Spletzer, “Lehigh Instrument for Learning Interaction (LILI): An interactive robot to aid development of social skills for Autistic Children,” 1st NSF REU Workshop on Networked Systems, October 2014.
Sponsors
This work is currently supported by a Lehigh University Faculty Innovation Grant. We are grateful for the support.
People
Present
- Mooi Choo Chuah
- John Spletzer
- Daniel Coombe
Past
- Christopher Garman
- Cassandra Guerrero (Kutztown University)