Robots learning from human teachers
published: July 28, 2015, recorded: June 2015, views: 234
In this talk I present recent work from the Socially Intelligent Machines Lab at Georgia Tech. The vision of our research is to enable robots to function in real human environments: service robots helping at home, co-worker robots revolutionizing manufacturing, and assistive robots empowering healthcare workers and enabling aging adults to live longer in their own homes. To do this, we need to build intelligent robots that can be embedded in human environments and interact with everyday people. Many of the successes of robotics to date rely on structured environments and repeatable tasks, but what all of these visions have in common is deploying robots into dynamic human environments where pre-programmed controllers won't be an option. These robots will need to interact with end users in order to learn what they need to do on the job.

Our research aims to computationally model mechanisms of human social learning in order to build robots and other machines that are intuitive for people to teach. We take Machine Learning interactions and redesign the interfaces and algorithms to support the collection of learning input from end users instead of ML experts. This talk covers results on building models of reciprocal interactions for high-level task goal learning, low-level skill learning, and active learning interactions, using humanoid robot platforms.
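As a rough illustration of the active-learning interaction described above, here is a minimal sketch of uncertainty sampling: the learner asks the human teacher about the candidate it is least sure of. This is a generic textbook strategy, not the lab's specific algorithm; the function names (`entropy`, `select_query`) and the `predict_proba` callback are hypothetical placeholders for whatever model the robot actually maintains.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a Bernoulli distribution with success probability p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def select_query(candidates, predict_proba):
    """Pick the candidate the learner is most uncertain about
    (maximum predictive entropy) to ask the human teacher next.

    candidates    -- iterable of states/demonstrations the robot could ask about
    predict_proba -- hypothetical callback returning the model's P(label=1 | candidate)
    """
    return max(candidates, key=lambda c: entropy(predict_proba(c)))
```

For example, if the robot's current skill model is confident about states "a" and "c" but torn about "b" (probabilities 0.95, 0.1, and 0.5 respectively), `select_query` directs the next question to "b", so the teacher's limited attention is spent where it is most informative.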