Interactive Perception for Robotic Manipulation of Liquids, Grains, and Doughs
Carolyn Matl
EECS Department, University of California, Berkeley
Technical Report No. UCB/EECS-2021-174
August 9, 2021
http://www2.eecs.berkeley.edu/Pubs/TechRpts/2021/EECS-2021-174.pdf
With the advent of robotic solutions in both home and industrial environments comes the need for robots that can dependably perceive and manipulate unstructured or deformable materials like liquids and doughs. Robust handling of these complex materials often depends upon prior knowledge of key material properties such as viscosity or stiffness, which ultimately affect material dynamics. However, these material properties are difficult to observe with traditional sensing mechanisms, e.g., passive, image-based observations. Furthermore, these materials are generally associated with complex models or high-dimensional representations, which make real-time dynamic predictions intractable.
Interactive perception enables robots to observe and learn from signals that would otherwise not be present. In this work, we develop non-traditional sensing techniques and leverage them within a robotic interactive perception framework to estimate physical properties of liquids, grains, and deformable objects. In particular, this research aims to extract key material parameters through direct interaction and observation of induced signals (e.g., sound or force). The estimated parameters are then used to reason about the materials for dynamic robotic manipulation tasks such as precision pouring and dough shaping. We additionally employ simplified representations of these complex materials for use in real-time applications. We hope that this work will highlight the importance of exploring new sensing modalities, which could enable robots to intelligently interact with and manipulate unknown and unstructured objects and materials.
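To make the perceive-estimate-act loop described above concrete, below is a minimal, self-contained Python sketch. Everything in it (the function names, the exponential damping model tying viscosity to signal decay, and the tilt-rate mapping) is an illustrative assumption for exposition, not the method or code from the thesis: a robot excites the material, records an induced signal, fits a material parameter, and maps it to a manipulation parameter.

import numpy as np

rng = np.random.default_rng(0)

TRUE_VISCOSITY = 0.8  # hidden material property the robot must infer

def perturb_and_record(t: np.ndarray) -> np.ndarray:
    # Stand-in for a robot action that excites the material and records
    # an induced 1-D signal (e.g., sound or force over time). In this toy
    # model, higher viscosity damps the induced oscillation faster.
    decay = np.exp(-5.0 * TRUE_VISCOSITY * t)
    signal = decay * np.sin(2.0 * np.pi * 40.0 * t)
    return signal + 0.001 * rng.standard_normal(t.size)

def estimate_viscosity(signal: np.ndarray, t: np.ndarray) -> float:
    # Recover the envelope decay rate from windowed RMS amplitudes:
    # under the toy model, log(RMS) is linear in t with slope -5 * viscosity.
    win = 50
    n = signal.size // win
    rms = np.sqrt((signal[: n * win].reshape(n, win) ** 2).mean(axis=1))
    t_mid = t[: n * win].reshape(n, win).mean(axis=1)
    slope, _ = np.polyfit(t_mid, np.log(rms), 1)
    return -slope / 5.0

def plan_tilt_rate(viscosity: float) -> float:
    # Map the estimated property to a manipulation parameter: thinner
    # liquids flow faster, so tilt more slowly for precision pouring.
    return 2.0 + 8.0 * viscosity  # deg/s, arbitrary illustrative scaling

t = np.linspace(0.0, 1.0, 1000)
v_hat = estimate_viscosity(perturb_and_record(t), t)
print(f"estimated viscosity: {v_hat:.2f} (true {TRUE_VISCOSITY})")
print(f"planned tilt rate:   {plan_tilt_rate(v_hat):.1f} deg/s")

In the thesis itself, the toy damping model would be replaced by real induced signals (e.g., vibration audio or contact forces) and learned or physics-based mappings from those signals to material parameters; the sketch only shows the shape of the interactive loop.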
Advisor: Ruzena Bajcsy
BibTeX citation:
@phdthesis{Matl:EECS-2021-174,
    Author = {Matl, Carolyn},
    Title = {Interactive Perception for Robotic Manipulation of Liquids, Grains, and Doughs},
    School = {EECS Department, University of California, Berkeley},
    Year = {2021},
    Month = {Aug},
    Url = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2021/EECS-2021-174.html},
    Number = {UCB/EECS-2021-174}
}
EndNote citation:
%0 Thesis
%A Matl, Carolyn
%T Interactive Perception for Robotic Manipulation of Liquids, Grains, and Doughs
%I EECS Department, University of California, Berkeley
%D 2021
%8 August 9
%@ UCB/EECS-2021-174
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2021/EECS-2021-174.html
%F Matl:EECS-2021-174