About

Motivation: Social isolation is a challenge for all of us, but it can be particularly difficult for children. While adults (and teenagers) are accustomed to keeping in contact with friends and family using electronic means (through email, video conferencing, and text messages), elementary school-aged children typically interact with their peers through physical play.
 
Our group has developed an application that allows children to play remotely with their friends and distant family using an inexpensive, commercial robot. Using the software we have developed, a child in one home can use a phone or tablet to take control and “become” the robot in another child’s home. The remote child can see through the robot’s camera, control the robot as it drives around, and play a variety of games with the child who is in the home with the robot. Via the robot, they can play hide and seek, build challenging obstacle courses for the robot to navigate, and invent other simple games. The children can speak directly to each other, allowing the combined physical and social engagement that is essential at this age.
 
Origin: This project was initiated by a team of Yale undergraduate and graduate students enrolled in a robotics lab course (CPSC 473/573 - Intelligent Robotics Laboratory). The class focused on a semester-long project to design a novel piece of intelligent software or an experiment in human-robot interaction. When the campus closed, many student groups were suddenly unable to complete the projects they had spent months designing and implementing. Students from different teams joined together and, with the help of other Yale students, applied what they had learned to help address some of the challenges associated with the global pandemic. This project is the outcome of that effort.
 
Robot and Software: The robot we are using is a commercially available robot named Vector that was developed and sold by Anki until the company’s closure in 2019. There are several hundred thousand of these robots in the US, and many are still available from online retailers for under $120. The software that we have developed will be released as a free download for both Android and iOS devices.
 
Safety and Security: Our software enables a secure point-to-point video and audio link between two devices. Parents will be able to restrict access to only the users they authorize. Audio and video are streamed directly from device to device, just like typical video calls, and no information is transmitted to or recorded on outside servers at Yale or elsewhere.
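
(For the technically curious: the paragraph above does not name the underlying transport, but WebRTC is the standard, widely used mechanism for exactly this kind of direct device-to-device audio/video link. The sketch below is illustrative only and is not the project's actual code; signaling and remoteVideo are hypothetical placeholders for the app's connection-setup channel and its video display.)

    // Illustrative WebRTC sketch (TypeScript), not the project's implementation.
    // `signaling` and `remoteVideo` are hypothetical placeholders.
    declare const signaling: { send: (msg: string) => void };
    declare const remoteVideo: HTMLVideoElement;

    async function startCall(): Promise<void> {
      // A STUN server only helps the two devices discover each other;
      // audio and video then flow directly between them, not through a media server.
      const peer = new RTCPeerConnection({
        iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
      });

      // Capture the local camera and microphone and attach them to the connection.
      const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
      for (const track of stream.getTracks()) peer.addTrack(track, stream);

      // Display the remote side's video when its tracks arrive.
      peer.ontrack = (event) => {
        remoteVideo.srcObject = event.streams[0];
      };

      // The offer/answer handshake travels over a small signaling channel once;
      // after that, media is exchanged peer to peer.
      const offer = await peer.createOffer();
      await peer.setLocalDescription(offer);
      signaling.send(JSON.stringify(offer));
    }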
 
Deployment: Our software will be released through the Apple App Store and Google Play, making it instantly available to tens of thousands of existing robot owners. (Note that any number of authorized users can take turns controlling a single robot, allowing for an even greater network of users.) We also hope to provide robots to select New Haven schools, with robots shipped directly to the schools for distribution along with school lunches.
    

About Us:

Our research (http://scazlab.yale.edu/) focuses on building embodied computational models of human social behavior, especially the developmental progression of early social skills. Our work uses computational modeling and socially interactive robots in different methodological roles to explore questions about social development that are difficult or impossible to address using the methods of other disciplines.

Social robotics research at Yale is directed by Prof. Brian Scassellati.

 

Our group (https://interactive-machines.gitlab.io/) studies fundamental problems at the intersection of Human-Computer Interaction and Robotics. In particular, our current research agenda focuses on enabling situated social interactions between robots and multiple people. Day to day, we build novel computational tools, prototype interactive systems, and run user experiments to both better understand interactions with technology and validate our methods.

The Interactive Machines Group (IMG) is an interdisciplinary research group led by Marynel Vázquez in Yale’s Computer Science Department.