Jacky Kwok

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2024-76

May 10, 2024

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-76.pdf

This thesis brings together two reports that focus on optimizing a dataflow programming language for machine learning workloads using the reactor model. The first paper introduces an efficient parallel reinforcement learning framework that outperforms existing solutions, such as Ray, in simulation throughput, multi-agent inference, and training on a single node. The proposed approach achieves this by reducing the work needed for synchronization using the reactor model and by decreasing I/O overhead through optimized coordination of Python worker threads. This work has been accepted as a full paper at the 36th ACM Symposium on Parallelism in Algorithms and Architectures. The second paper presents a High-Performance Robotic Middleware (HPRM), which builds on the reactor model and employs optimizations including an in-memory object store, adaptive serialization, and an eager protocol with real-time sockets to ensure low-latency and deterministic communication for autonomous systems. HPRM demonstrates a substantial latency reduction compared to the Robot Operating System 2 (ROS 2) and achieves higher throughput in CARLA autonomous driving applications. Together, these two papers contribute to the goal of developing high-performance and reliable systems for machine learning by leveraging the benefits of the reactor model and optimized communication mechanisms.
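To make the "in-memory object store" and "adaptive serialization" ideas mentioned above more concrete, the following is a minimal Python sketch of one way such a mechanism could look: large contiguous NumPy arrays bypass pickling and are kept as raw buffers, while other objects fall back to pickle. The names (AdaptiveStore, put, get) and the size threshold are illustrative assumptions for this sketch, not HPRM's actual API.

# Illustrative sketch only: an in-process object store with adaptive
# serialization. Large contiguous NumPy arrays are stored as
# (dtype, shape, raw bytes) to skip pickling overhead; everything else
# falls back to pickle. Not HPRM's actual implementation.
import pickle

import numpy as np


class AdaptiveStore:
    """Hypothetical in-memory store keyed by string IDs."""

    def __init__(self, raw_threshold_bytes=4096):
        self._store = {}  # key -> ("raw" | "pickle", payload)
        self._raw_threshold = raw_threshold_bytes

    def put(self, key, obj):
        # Adaptive choice: big contiguous arrays take the raw-buffer path.
        if (isinstance(obj, np.ndarray) and obj.flags["C_CONTIGUOUS"]
                and obj.nbytes >= self._raw_threshold):
            self._store[key] = ("raw", (obj.dtype, obj.shape, obj.tobytes()))
        else:
            self._store[key] = ("pickle", pickle.dumps(obj))

    def get(self, key):
        kind, payload = self._store[key]
        if kind == "raw":
            dtype, shape, buf = payload
            return np.frombuffer(buf, dtype=dtype).reshape(shape)
        return pickle.loads(payload)


if __name__ == "__main__":
    store = AdaptiveStore()
    store.put("obs", np.zeros((64, 64, 3), dtype=np.float32))   # raw path
    store.put("meta", {"episode": 1, "agent": "car_0"})          # pickle path
    print(store.get("obs").shape, store.get("meta"))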

Advisor: Edward A. Lee


BibTeX citation:

@mastersthesis{Kwok:EECS-2024-76,
    Author= {Kwok, Jacky},
    Editor= {Lee, Edward A. and Stoica, Ion},
    Title= {Towards Efficient and Deterministic Dataflow Systems for Machine Learning},
    School= {EECS Department, University of California, Berkeley},
    Year= {2024},
    Month= {May},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-76.html},
    Number= {UCB/EECS-2024-76},
    Abstract= {This thesis brings together two reports that focus on optimizing a dataflow programming language for machine learning workloads using the reactor model. The first paper introduces an efficient parallel reinforcement learning framework that outperforms existing solutions, such as Ray, in simulation throughput, multi-agent inference, and training on a single node. The proposed approach achieves this by reducing the work needed for synchronization using the reactor model and by decreasing I/O overhead through optimized coordination of Python worker threads. This work has been accepted as a full paper at the 36th ACM Symposium on Parallelism in Algorithms and Architectures. The second paper presents a High-Performance Robotic Middleware (HPRM), which builds on the reactor model and employs optimizations including an in-memory object store, adaptive serialization, and an eager protocol with real-time sockets to ensure low-latency and deterministic communication for autonomous systems. HPRM demonstrates a substantial latency reduction compared to the Robot Operating System 2 (ROS 2) and achieves higher throughput in CARLA autonomous driving applications. Together, these two papers contribute to the goal of developing high-performance and reliable systems for machine learning by leveraging the benefits of the reactor model and optimized communication mechanisms.},
}

EndNote citation:

%0 Thesis
%A Kwok, Jacky 
%E Lee, Edward A. 
%E Stoica, Ion 
%T Towards Efficient and Deterministic Dataflow Systems for Machine Learning
%I EECS Department, University of California, Berkeley
%D 2024
%8 May 10
%@ UCB/EECS-2024-76
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-76.html
%F Kwok:EECS-2024-76