Learning Image-Conditioned Dynamics Models for Control of Under-actuated Legged Millirobots
Anusha Nagabandi
EECS Department, University of California, Berkeley
Technical Report No. UCB/EECS-2018-43
May 10, 2018
http://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-43.pdf
Millirobots are a promising robotic platform for many applications due to their small size and low manufacturing costs. Legged millirobots, in particular, can provide increased mobility in complex environments and improved scaling of obstacles. However, controlling these small, highly dynamic, and underactuated legged systems is difficult. Hand-engineered controllers can sometimes control these legged millirobots, but they have difficulties with dynamic maneuvers and complex terrains. We present an approach for controlling a real-world legged millirobot that is based on learned neural network models. Using less than 17 minutes of data, our method can learn a predictive model of the robot’s dynamics that can enable effective gaits to be synthesized on the fly for following user-specified waypoints on a given terrain. Furthermore, by leveraging expressive, high-capacity neural network models, our approach allows for these predictions to be directly conditioned on camera images, endowing the robot with the ability to predict how different terrains might affect its dynamics. This enables sample-efficient and effective learning for locomotion of a dynamic legged millirobot on various terrains, including gravel, turf, carpet, and styrofoam.
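For a rough sense of the two ideas named in the abstract, the following Python (PyTorch) sketch shows (1) a neural network dynamics model whose next-state prediction is conditioned on the robot state, the action, and a terrain image, and (2) a simple sampling-based planner that uses the learned model to choose actions on the fly toward a waypoint. All names, network sizes, and the random-shooting planner details here are illustrative assumptions, not the architecture or controller used in the report; consult the report itself for the actual method.

import torch
import torch.nn as nn

class ImageConditionedDynamics(nn.Module):
    """Hypothetical image-conditioned dynamics model: predicts the next state
    from (state, action, terrain image) by outputting a learned state delta."""
    def __init__(self, state_dim, action_dim, img_channels=3):
        super().__init__()
        # Small CNN encoder for the terrain image.
        self.encoder = nn.Sequential(
            nn.Conv2d(img_channels, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # MLP that maps (state, action, image features) to a predicted state change.
        self.mlp = nn.Sequential(
            nn.Linear(state_dim + action_dim + 32, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, state_dim),
        )

    def forward(self, state, action, image):
        feat = self.encoder(image)
        delta = self.mlp(torch.cat([state, action, feat], dim=-1))
        return state + delta  # next state = current state + predicted delta


def plan_action(model, state, image, goal, horizon=10, n_samples=500, action_dim=2):
    """Random-shooting planner (illustrative): sample candidate action sequences,
    roll them out through the learned model, and return the first action of the
    lowest-cost sequence. `state` is a 1-D state tensor, `image` is (C, H, W),
    and `goal` is an (x, y) waypoint."""
    actions = torch.rand(n_samples, horizon, action_dim) * 2 - 1  # actions in [-1, 1]
    s = state.expand(n_samples, -1).clone()
    img = image.expand(n_samples, -1, -1, -1)
    cost = torch.zeros(n_samples)
    with torch.no_grad():
        for t in range(horizon):
            s = model(s, actions[:, t], img)
            # Assumes the first two state entries are the (x, y) position.
            cost += torch.norm(s[:, :2] - goal, dim=-1)
    return actions[cost.argmin(), 0]

# Example usage (hypothetical dimensions):
# model = ImageConditionedDynamics(state_dim=6, action_dim=2)
# a = plan_action(model, state=torch.zeros(6), image=torch.zeros(3, 64, 64),
#                 goal=torch.tensor([1.0, 0.0]))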
Advisors: Ronald S. Fearing and Sergey Levine
BibTeX citation:
@mastersthesis{Nagabandi:EECS-2018-43,
    Author = {Nagabandi, Anusha},
    Title = {Learning Image-Conditioned Dynamics Models for Control of Under-actuated Legged Millirobots},
    School = {EECS Department, University of California, Berkeley},
    Year = {2018},
    Month = {May},
    Url = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-43.html},
    Number = {UCB/EECS-2018-43},
    Abstract = {Millirobots are a promising robotic platform for many applications due to their small size and low manufacturing costs. Legged millirobots, in particular, can provide increased mobility in complex environments and improved scaling of obstacles. However, controlling these small, highly dynamic, and underactuated legged systems is difficult. Hand-engineered controllers can sometimes control these legged millirobots, but they have difficulties with dynamic maneuvers and complex terrains. We present an approach for controlling a real-world legged millirobot that is based on learned neural network models. Using less than 17 minutes of data, our method can learn a predictive model of the robot’s dynamics that can enable effective gaits to be synthesized on the fly for following user-specified waypoints on a given terrain. Furthermore, by leveraging expressive, high-capacity neural network models, our approach allows for these predictions to be directly conditioned on camera images, endowing the robot with the ability to predict how different terrains might affect its dynamics. This enables sample-efficient and effective learning for locomotion of a dynamic legged millirobot on various terrains, including gravel, turf, carpet, and styrofoam.},
}
EndNote citation:
%0 Thesis
%A Nagabandi, Anusha
%T Learning Image-Conditioned Dynamics Models for Control of Under-actuated Legged Millirobots
%I EECS Department, University of California, Berkeley
%D 2018
%8 May 10
%@ UCB/EECS-2018-43
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-43.html
%F Nagabandi:EECS-2018-43