Generalized Learning Using Inference-Based Graphical Models

This project addresses the challenge of coordinating stable locomotion in a modular, underactuated robot operating in an unstructured, dynamically changing environment, or under changes to the robot's own morphology. The goal is to develop a framework that reasons over and reactively adapts the behaviors of articulated robots using imperfect models of the environment and the desired task objectives. Specifically, we aim to learn probabilistic models that infer distributions over parameterized behaviors; these distributions are then used to select, combine, and transfer useful motions between different platforms based on data collected over multiple time scales.
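
To make the idea concrete, the snippet below is a minimal, illustrative sketch (not the project's actual implementation) of inferring a distribution over parameterized gait behaviors from observed terrain features: a joint Gaussian is fit to hypothetical logged trials and then conditioned on new features. All feature and parameter names, dimensions, and data are assumptions made purely for illustration.

```python
# Minimal sketch: condition a joint Gaussian over (terrain features, gait
# parameters) to obtain a distribution over behaviors for new terrain.
# All names, dimensions, and data below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training log: terrain features x and tuned gait parameters theta.
X = rng.normal(size=(200, 2))                       # e.g. [slope, roughness]
Theta = X @ np.array([[0.5, 0.1], [-0.3, 0.4]]) \
        + 0.05 * rng.normal(size=(200, 2))          # e.g. [stride_length, body_height]

# Fit a joint Gaussian over z = (x, theta).
Z = np.hstack([X, Theta])
mu = Z.mean(axis=0)
Sigma = np.cov(Z, rowvar=False)

def infer_gait_params(x_new, dx=2):
    """Mean and covariance of p(theta | x = x_new) under the joint Gaussian."""
    mu_x, mu_t = mu[:dx], mu[dx:]
    Sxx, Sxt = Sigma[:dx, :dx], Sigma[:dx, dx:]
    Stx, Stt = Sigma[dx:, :dx], Sigma[dx:, dx:]
    gain = Stx @ np.linalg.inv(Sxx)
    return mu_t + gain @ (x_new - mu_x), Stt - gain @ Sxt

mean, cov = infer_gait_params(np.array([0.3, -0.2]))
print("inferred gait parameters:", mean)
```

A full graphical model would combine several such conditionals, but this kind of conditioning is the basic operation behind selecting behaviors online.
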
The proposed framework is intended as a general-purpose tool for a variety of robotic applications. Our milestones are:

  1. Using the Inference Model approach for online, automated gait transitions in an unknown environment.

    Figure 1: Example transition between a walk-in-place gait (top row) and a walk-forward gait (bottom row).

  2. Using the Graphical Model to infer behavior parameters across different platforms in order to generalize learning. In other words, we can transfer learned parameters across robots with different kinematic configurations; a minimal sketch of this transfer step follows the milestone list.

    Figure 2: A simple transfer architecture in which we learn a model for the Snake Monster and use a graphical model to infer the parameters for a humanoid performing the same task.

  3. Using the Inference Model to infer environmental constraints and plan the robot's trajectory.


    This is a collaborative project between Dr. Matthew Travers (Carnegie Mellon University) and Dr. Tom Howard (University of Rochester).

  4. Research Team: Raghu Aditya Kiran Chavali (CMU), Nathan Kent (UR), Mike Napoli (UR)
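
As referenced in milestone 2, the sketch below illustrates the cross-platform transfer step in its simplest assumed form: a linear-Gaussian map from behavior parameters tuned on one platform (e.g., the Snake Monster) to the corresponding parameters of another (e.g., a humanoid), fit from hypothetical paired trials of the same task. It is a stand-in for the actual graphical model; all names, dimensions, and data are illustrative assumptions.

```python
# Minimal sketch of cross-platform parameter transfer (assumed form, not the
# project's actual method): fit p(humanoid_params | snake_params) as a
# linear-Gaussian model from hypothetical paired trials of the same task.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical paired data: each row is one task instance solved on both robots.
snake_params = rng.normal(size=(150, 3))        # e.g. [amplitude, frequency, phase]
true_map = np.array([[0.8, 0.0, 0.2],
                     [0.1, 0.6, 0.0]])
humanoid_params = snake_params @ true_map.T + 0.1 * rng.normal(size=(150, 2))

# Fit the linear map (with bias) by least squares; the residual covariance
# gives the uncertainty of the transferred parameters.
H = np.hstack([snake_params, np.ones((150, 1))])
W, *_ = np.linalg.lstsq(H, humanoid_params, rcond=None)
noise_cov = np.cov(humanoid_params - H @ W, rowvar=False)

def transfer(snake_theta):
    """Predict (mean, covariance) of humanoid parameters from snake parameters."""
    return np.append(snake_theta, 1.0) @ W, noise_cov

mean, cov = transfer(np.array([0.5, -0.2, 0.1]))
print("inferred humanoid parameters:", mean)
```

In a graphical-model view, this learned map would play the role of the link between the two platforms' parameter nodes; in practice it would also be conditioned on the task and the robots' kinematic configurations.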