On this page, you will find videos demonstrating this team's capabilities on other robot platforms, along with the corresponding publications and links to software on Github.
Our goal is to port this software to SoftBank Robotics' Pepper robot once the standard platform is received, and to use the existing NAOqi software to create rich and seamless interactions.
In this task, one of the challenges will be for the robot to respond to several guests calling it at the same time. The following videos show our computer vision software, which allows robots to be commanded from afar.
Finding HRI Partners in a Crowd [PDF] [Github]
In this task, the robot will be asked to carry an object and follow a user to a destination. The following two videos show software that can help with this task.
Robot Pushcart [PDF] [Github]
Instead of asking the robot to follow them, a human can virtually push the robot from a safe distance.
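To make the "virtual push" idea concrete, here is a minimal sketch of one way such a controller could work: the tracked person's offset from a nominal handle point behind the robot is turned into a velocity command, as if they were pushing a cart. The gains, distances, and sign conventions below are illustrative assumptions, not the published implementation.

```python
# Illustrative "virtual pushcart" controller (assumed, for illustration only).
# The person walks behind the robot; their offset from a nominal handle point
# is mapped to a forward/turn velocity command.

HANDLE_DISTANCE = 1.0   # nominal person-to-robot gap in metres (assumed)
GAIN_LINEAR = 0.8       # forward speed per metre of "push" (assumed)
GAIN_ANGULAR = 1.2      # turn rate per metre of sideways offset (assumed)
MAX_LINEAR = 0.5        # velocity limits kept low for safety around people
MAX_ANGULAR = 1.0


def clamp(value, limit):
    return max(-limit, min(limit, value))


def pushcart_command(person_x, person_y):
    """Map the tracked person's position in the robot frame (x forward, y left,
    metres; the person is behind the robot, so person_x < 0) to a
    (linear, angular) velocity command."""
    # Stepping closer than the handle distance "pushes" the robot forward;
    # stepping back lets it coast to a stop.
    push_forward = HANDLE_DISTANCE - (-person_x)
    # Sideways offset steers the robot (assumed convention: person drifting
    # left steers the robot left, like pushing on one side of a cart).
    push_sideways = person_y

    linear = clamp(GAIN_LINEAR * max(0.0, push_forward), MAX_LINEAR)
    angular = clamp(GAIN_ANGULAR * push_sideways, MAX_ANGULAR)
    return linear, angular


if __name__ == "__main__":
    # Person has stepped 20 cm closer than the handle point, slightly to the left.
    print(pushcart_command(-0.8, 0.1))   # roughly (0.16, 0.12)
```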
After You Door Negotiation [PDF] [Github]
It's difficult for robots to know what to do when faced with a human in a doorway. Our robots have figured it out.
One of the difficulties in non-verbal communication with robots is fast, robust gesture detection. The detector shown here running on a UAV can be ported to the Pepper platform to enable gesture-based interaction.
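As a sketch of how gesture labels from such a detector could drive Pepper through NAOqi, the snippet below maps a few hypothetical labels to simple behaviours using the qi Python SDK. The robot address, label names, and detector interface are placeholders assumed for illustration; they are not the actual output format of the UAV detector.

```python
import qi

PEPPER_URL = "tcp://pepper.local:9559"   # assumed robot address


def main():
    # Connect to the robot and grab the standard NAOqi services.
    session = qi.Session()
    session.connect(PEPPER_URL)
    tts = session.service("ALTextToSpeech")
    motion = session.service("ALMotion")

    def on_gesture(label):
        # Each recognised gesture triggers a small, safe response.
        # The label names here are hypothetical.
        if label == "wave":
            tts.say("Hello! How can I help you?")
        elif label == "stop":
            motion.stopMove()                    # halt any ongoing base motion
        elif label == "come_here":
            motion.moveToward(0.3, 0.0, 0.0)     # walk forward at 30% of max speed

    # In a real system this callback would be fed by the gesture detector;
    # here we simulate a single detection.
    on_gesture("wave")


if __name__ == "__main__":
    main()
```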