David Deng

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2022-228

September 19, 2022

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-228.pdf

With the prevalence of LiDAR and depth sensors, 3D point clouds have become an increasingly common form of visual data, especially in the field of autonomous driving. As such, there is a need for algorithms that can process raw, unlabelled sequences of point clouds. In this report, we present two such methods for processing temporal LiDAR data: (1) a method for multi-body rigid scene flow estimation and segmentation, and (2) a novel neural network for 3D point cloud prediction. Our prediction network is based on FlowNet3D and trained to minimize the Chamfer Distance (CD) and Earth Mover's Distance (EMD) to the next point cloud. Compared to directly using existing state-of-the-art methods such as FlowNet3D, our proposed architectures achieve CD and EMD nearly an order of magnitude lower on the nuScenes dataset. In addition, we show that our predictions yield reasonable scene flow approximations without using any labelled supervision. Our other work exploits the multi-body rigidity present in dynamic scenes encountered in autonomous driving by parameterizing scene flow as the composition of a global ego-motion and a set of bounding boxes, each associated with its own rigid motion. We construct a novel loss function and differentiable bounding box formulation to optimize these parameters. Our approach achieves state-of-the-art accuracy on the KITTI Scene Flow benchmark, outperforming all previous approaches without using any annotated labels. Additionally, we demonstrate the effectiveness of our approach on motion segmentation and ego-motion estimation, and we produce visualizations of our predictions to corroborate our results.
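
As a concrete illustration of the training objectives named in the abstract, the following is a minimal sketch of the symmetric Chamfer Distance and the Earth Mover's Distance between two point sets. The exact formulation in the report (squared versus unsquared distances, sums versus means, and the EMD approximation used in practice) may differ; the function names and the equal-size assumption for EMD are illustrative, not taken from the report.

import numpy as np
from scipy.optimize import linear_sum_assignment

def chamfer_distance(p, q):
    # Pairwise Euclidean distances between p (N, 3) and q (M, 3).
    d = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=-1)
    # Symmetric CD: average nearest-neighbor distance in both directions.
    return d.min(axis=1).mean() + d.min(axis=0).mean()

def earth_movers_distance(p, q):
    # Exact EMD for equal-size point sets via optimal bipartite matching;
    # large-scale implementations typically use a faster approximation.
    d = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(d)
    return d[rows, cols].mean()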

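Similarly, the multi-body parameterization described above can be sketched as a global rigid ego-motion applied to every point, composed with an additional rigid motion for the points falling inside each bounding box. This is a hypothetical illustration under assumed array shapes; the report's differentiable bounding box formulation and loss are not reproduced here.

import numpy as np

def multi_body_flow(points, R_ego, t_ego, boxes):
    # points: (N, 3); R_ego: (3, 3) rotation; t_ego: (3,) translation.
    # boxes: list of (mask, R, t), where mask (N,) selects the points
    # inside a box and (R, t) is that box's own rigid motion.
    moved = points @ R_ego.T + t_ego  # global ego-motion on all points
    for mask, R, t in boxes:
        # Compose the box's rigid motion on top of the ego-motion.
        moved[mask] = moved[mask] @ R.T + t
    return moved - points  # per-point scene flow vectors
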
Advisor: Avideh Zakhor


BibTeX citation:

@mastersthesis{Deng:EECS-2022-228,
    Author= {Deng, David},
    Title= {Rigid Scene Flow Estimation and Prediction on Temporal LiDAR for Autonomous Driving},
    School= {EECS Department, University of California, Berkeley},
    Year= {2022},
    Month= {Sep},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-228.html},
    Number= {UCB/EECS-2022-228},
    Abstract= {With the prevalence of LiDAR and depth sensors, 3D point clouds have become an increasingly common form of visual data, especially in the field of autonomous driving. As such, there is a need for algorithms that can process raw, unlabelled sequences of point clouds. In this report, we present two such methods for processing temporal LiDAR data: (1) a method for multi-body rigid scene flow estimation and segmentation, and (2) a novel neural network for 3D point cloud prediction. Our prediction network is based on FlowNet3D and trained to minimize the Chamfer Distance (CD) and Earth Mover's Distance (EMD) to the next point cloud. Compared to directly using existing state-of-the-art methods such as FlowNet3D, our proposed architectures achieve CD and EMD nearly an order of magnitude lower on the nuScenes dataset. In addition, we show that our predictions yield reasonable scene flow approximations without using any labelled supervision. Our other work exploits the multi-body rigidity present in dynamic scenes encountered in autonomous driving by parameterizing scene flow as the composition of a global ego-motion and a set of bounding boxes, each associated with its own rigid motion. We construct a novel loss function and differentiable bounding box formulation to optimize these parameters. Our approach achieves state-of-the-art accuracy on the KITTI Scene Flow benchmark, outperforming all previous approaches without using any annotated labels. Additionally, we demonstrate the effectiveness of our approach on motion segmentation and ego-motion estimation, and we produce visualizations of our predictions to corroborate our results.},
}

EndNote citation:

%0 Thesis
%A Deng, David 
%T Rigid Scene Flow Estimation and Prediction on Temporal LiDAR for Autonomous Driving
%I EECS Department, University of California, Berkeley
%D 2022
%8 September 19
%@ UCB/EECS-2022-228
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2022/EECS-2022-228.html
%F Deng:EECS-2022-228