Computer Science Masters Thesis Presentation

Tuesday, June 23, 2015 - 11:00am


8102 Gates & Hillman Centers


STEVEN KLEE, 5th Year Masters Student

For a robot to perform a task, a person has to instruct it, typically through programming. However, it is not feasible for a general user to program an arbitrary robot. Instead, people use language to instruct one another. In this thesis, we investigate how to provide language-based interactions that get a robot to perform a task. We consider robots equipped with a set of built-in motion and perception primitives known to the user. We are also interested in an approach that depends only on the robot's primitives, and therefore applies to any robot hardware platform. As such, we can instruct multiple robots to coordinate, or use a planner to generate a task given a goal. We further investigate persistent interactions with such robots, where tasks are accumulated in a library and recalled during future interactions to assist the user.

In this thesis, we contribute a task representation that can be automatically created from language instructions and corrected during test executions. Then, we extend this representation to support multi-robot coordination and to allow a planner to provide tasks for the user given a goal. Finally, we consider persistent robots that accumulate task libraries, and present an algorithm that, given the initial steps of a new task, proposes an autocompletion based on a recognized similar past task.

Thesis Committee:
Manuela Veloso (Chair)
M. Bernardine Dias
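The autocompletion idea described above can be illustrated with a minimal sketch. This is not the thesis algorithm: it assumes tasks are flat sequences of primitive names and uses longest-common-prefix length as a stand-in for the more general task-similarity recognition the abstract refers to; all task and primitive names below are hypothetical.

```python
# Hypothetical sketch: propose a completion for a partially specified task
# by finding the most similar task in an accumulated library.
# Assumption: a task is a list of primitive names; similarity is measured
# by longest common prefix (a simplification for illustration only).

def prefix_len(a, b):
    """Length of the common prefix of two primitive sequences."""
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

def autocomplete(initial_steps, library):
    """Return proposed remaining steps from the most similar past task."""
    best = max(library, key=lambda task: prefix_len(initial_steps, task))
    if prefix_len(initial_steps, best) == 0:
        return []  # no past task shares a prefix; nothing to propose
    return best[len(initial_steps):]

library = [
    ["go_to(kitchen)", "pick_up(cup)", "go_to(office)", "put_down(cup)"],
    ["go_to(lab)", "say(hello)"],
]
print(autocomplete(["go_to(kitchen)", "pick_up(cup)"], library))
# -> ['go_to(office)', 'put_down(cup)']
```

In the sketch, recognizing a "similar" past task reduces to picking the library entry with the longest matching prefix; the remaining steps of that entry become the proposed completion for the user to accept or correct.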

For More Information, Contact:

