Perceptive Hexapod Legged Locomotion for Climbing Joist Environments

Zixian Zang and Avideh Zakhor

EECS Department
University of California, Berkeley
Technical Report No. UCB/EECS-2023-89
May 10, 2023

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2023/EECS-2023-89.pdf

Attics are one of the largest sources of energy loss in residential homes, but they are uncomfortable and dangerous places for human workers to conduct air sealing and insulation. Hexapod robots are potentially suitable for carrying out those tasks in tight attic spaces since they are stable, compact, and lightweight. For hexapods to succeed in these tasks, they must be able to navigate inside the tight attic spaces of single-family residential homes in the U.S., which typically contain rows of approximately 6- or 8-inch-tall joists spaced 16 inches apart. Climbing over such obstacles is challenging for autonomous robotic systems. In this work, we develop a perceptive walking model for legged hexapods that can traverse terrain with random joist structures using egocentric vision. Because our method does not require real-time joint state feedback, it can be used on low-cost hardware. We train our model in a teacher-student fashion in two phases: in phase 1, we use reinforcement learning with access to privileged information such as local elevation maps and joint feedback. In phase 2, we use supervised learning to distill the model into one with access to only onboard observations, consisting of egocentric depth images and robot orientation captured by a tracking camera. We demonstrate zero-shot sim-to-real transfer on a Hiwonder SpiderPi robot, equipped with an onboard depth camera, climbing over joist courses we construct to simulate the environment in the field. Our proposed method achieves a nearly 100% success rate climbing over the test courses, significantly outperforming both the model without perception and the controller provided by the manufacturer. Moreover, we develop an interactive visualization tool to improve the measurement and verification process for building retrofits. Given drone-captured RGB and infrared (IR) images of facades, our tool lets the user compare IR images capturing a specified location on a facade before and after the retrofit.
We also design a pipeline to stitch IR images capturing different parts of a facade into a panorama, making it easier to find the metric locations of studs for envelope retrofit projects.
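The two-phase teacher-student training described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the report's actual architecture: all module names, network sizes, the 18-dimensional action space, the 64x64 depth image, and the 3-dimensional orientation input are assumptions chosen for the example.

```python
import torch
import torch.nn as nn

class Teacher(nn.Module):
    """Phase-1 policy: consumes privileged state (e.g. a local elevation
    map plus joint feedback, flattened here into one vector)."""
    def __init__(self, priv_dim=64, act_dim=18):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(priv_dim, 128), nn.ReLU(),
                                 nn.Linear(128, act_dim))

    def forward(self, priv_obs):
        return self.net(priv_obs)

class Student(nn.Module):
    """Phase-2 policy: consumes only onboard observations, i.e. an
    egocentric depth image and the robot orientation."""
    def __init__(self, act_dim=18):
        super().__init__()
        self.encoder = nn.Sequential(  # tiny CNN over a 1x64x64 depth image
            nn.Conv2d(1, 8, 5, stride=2), nn.ReLU(),
            nn.Conv2d(8, 16, 5, stride=2), nn.ReLU(),
            nn.Flatten())
        feat = 16 * 13 * 13  # flattened feature size for 64x64 input
        self.head = nn.Sequential(nn.Linear(feat + 3, 128), nn.ReLU(),
                                  nn.Linear(128, act_dim))

    def forward(self, depth, orientation):
        z = self.encoder(depth)
        return self.head(torch.cat([z, orientation], dim=-1))

# Phase 2: supervised distillation -- the student learns to imitate the
# (frozen) teacher's actions from onboard observations only.
teacher, student = Teacher(), Student()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
for _ in range(3):  # a few toy optimization steps on random data
    priv = torch.randn(8, 64)            # privileged state (teacher only)
    depth = torch.randn(8, 1, 64, 64)    # egocentric depth image
    orient = torch.randn(8, 3)           # orientation from tracking camera
    with torch.no_grad():
        target = teacher(priv)           # teacher actions as labels
    loss = nn.functional.mse_loss(student(depth, orient), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this scheme the teacher is first trained with reinforcement learning in simulation, where privileged information is freely available; the distilled student can then be deployed on the real robot, which only has the onboard sensors.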

Advisor: Avideh Zakhor


BibTeX citation:

@mastersthesis{Zang:EECS-2023-89,
    Author = {Zang, Zixian and Zakhor, Avideh},
    Title = {Perceptive Hexapod Legged Locomotion for Climbing Joist Environments},
    School = {EECS Department, University of California, Berkeley},
    Year = {2023},
    Month = {May},
    URL = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2023/EECS-2023-89.html},
    Number = {UCB/EECS-2023-89},
    Abstract = {Attics are one of the largest sources of energy loss in residential homes, but they are uncomfortable and dangerous places for human workers to conduct air sealing and insulation. Hexapod robots are potentially suitable for carrying out those tasks in tight attic spaces since they are stable, compact, and lightweight. For hexapods to succeed in these tasks, they must be able to navigate inside the tight attic spaces of single-family residential homes in the U.S., which typically contain rows of approximately 6- or 8-inch-tall joists spaced 16 inches apart. Climbing over such obstacles is challenging for autonomous robotic systems. In this work, we develop a perceptive walking model for legged hexapods that can traverse terrain with random joist structures using egocentric vision. Because our method does not require real-time joint state feedback, it can be used on low-cost hardware. We train our model in a teacher-student fashion in two phases: in phase 1, we use reinforcement learning with access to privileged information such as local elevation maps and joint feedback. In phase 2, we use supervised learning to distill the model into one with access to only onboard observations, consisting of egocentric depth images and robot orientation captured by a tracking camera. We demonstrate zero-shot sim-to-real transfer on a Hiwonder SpiderPi robot, equipped with an onboard depth camera, climbing over joist courses we construct to simulate the environment in the field. Our proposed method achieves a nearly 100% success rate climbing over the test courses, significantly outperforming both the model without perception and the controller provided by the manufacturer.
Moreover, we develop an interactive visualization tool to improve the measurement and verification process for building retrofits. Given drone-captured RGB and infrared (IR) images of facades, our tool lets the user compare IR images capturing a specified location on a facade before and after the retrofit. We also design a pipeline to stitch IR images capturing different parts of a facade into a panorama, making it easier to find the metric locations of studs for envelope retrofit projects.}
}

EndNote citation:

%0 Thesis
%A Zang, Zixian
%A Zakhor, Avideh
%T Perceptive Hexapod Legged Locomotion for Climbing Joist Environments
%I EECS Department, University of California, Berkeley
%D 2023
%8 May 10
%@ UCB/EECS-2023-89
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2023/EECS-2023-89.html
%F Zang:EECS-2023-89