Shrey Aeron

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2024-91

May 10, 2024

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-91.pdf

AlphaGarden

Remote Sensing applies in a variety of situations, ranging from environmental monitoring to the generation of actionable insights to object classification and detection.

AlphaGarden is an automated testbed for indoor polyculture farming with a gantry robot, sensors, plant and leaf phenotyping, a growth-cycle simulator, and custom tools. Image data from an overhead camera, along with soil-sensor readings from the testbed, are analyzed by a Tracking and Phenotyping network that estimates the garden's state at each time step; the estimated state is fed into the simulator, which decides on the next set of pruning and irrigation actions to maximize the garden's coverage and diversity. Results suggest the system can autonomously achieve 94% normalized plant diversity with pruning shears while maintaining an average canopy coverage of 84% by the end of the growth cycles.
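The sense-estimate-act loop described above can be sketched in outline. This is a minimal, hypothetical illustration of the control flow only; the class and function names, thresholds, and placeholder values below are assumptions for exposition, not the actual AlphaGarden code or its trained network.

```python
# Hedged sketch of the AlphaGarden loop: estimate garden state from
# sensing, then let a (stand-in) policy pick pruning/irrigation actions.
from dataclasses import dataclass

@dataclass
class GardenState:
    coverage: float   # fraction of the bed covered by canopy, in [0, 1]
    diversity: float  # normalized plant diversity, in [0, 1]

def estimate_state(image, soil_readings):
    # Stand-in for the Tracking and Phenotyping network: here we just
    # average soil-moisture readings into a mock coverage score.
    coverage = min(1.0, sum(soil_readings) / len(soil_readings))
    diversity = 0.9  # placeholder estimate
    return GardenState(coverage, diversity)

def choose_actions(state):
    # Stand-in for the growth-cycle simulator's decision step:
    # prune when coverage is high, irrigate when it is low.
    actions = []
    if state.coverage > 0.84:
        actions.append("prune")
    if state.coverage < 0.5:
        actions.append("irrigate")
    return actions

def step(image, soil_readings):
    state = estimate_state(image, soil_readings)
    return state, choose_actions(state)

state, actions = step(image=None, soil_readings=[0.9, 0.95, 0.85])
print(actions)
```

In the real system the state estimate comes from a learned network over overhead imagery, and the action choice comes from simulator rollouts rather than fixed thresholds; the sketch only shows how the two stages compose at each time step.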

Push-MOG

Recently, robots have seen rapidly increasing use in homes and warehouses to declutter by collecting objects from a planar surface and placing them into a container. While current techniques grasp objects individually, Multi-Object Grasping (MOG) can improve efficiency by increasing the average number of objects grasped per trip (OpT). However, grasping multiple objects requires the objects to be aligned and in close proximity. In this work, we propose Push-MOG, an algorithm that computes "fork pushing" actions using a parallel-jaw gripper to create graspable object clusters. In physical decluttering experiments, we find that Push-MOG enables multi-object grasps, increasing the average OpT by 34%.
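The idea of pushing objects into graspable clusters can be sketched as follows. This is a simplified, hypothetical illustration, assuming a greedy distance-based clustering and straight-line pushes toward each cluster's centroid; the actual Push-MOG planner and its parameters are not reproduced here.

```python
# Hedged sketch of the Push-MOG idea: group nearby objects into
# clusters, then plan pushes that move each member of a multi-object
# cluster toward the cluster centroid, so a parallel-jaw gripper can
# grasp several objects in one trip. Radius and coordinates are
# illustrative assumptions.
import math

def cluster_objects(positions, radius=0.05):
    """Greedy clustering: an object joins the first cluster whose seed
    is within `radius` (in meters); otherwise it seeds a new cluster."""
    clusters = []
    for p in positions:
        for c in clusters:
            if math.dist(p, c[0]) <= radius:
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

def centroid(cluster):
    n = len(cluster)
    return (sum(p[0] for p in cluster) / n, sum(p[1] for p in cluster) / n)

def plan_pushes(clusters):
    """For each multi-object cluster, emit a (start, target) push that
    moves each member toward the cluster centroid; singletons need none."""
    pushes = []
    for c in clusters:
        if len(c) < 2:
            continue
        cx, cy = centroid(c)
        for (x, y) in c:
            pushes.append(((x, y), (cx, cy)))
    return pushes

objs = [(0.00, 0.00), (0.03, 0.01), (0.50, 0.50)]
clusters = cluster_objects(objs)
pushes = plan_pushes(clusters)
print(len(clusters), len(pushes))
```

Here the two nearby objects form one cluster and receive pushes toward their shared centroid, while the distant object is left alone; the paper's fork-pushing actions additionally account for gripper geometry and contact, which this sketch omits.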

This thesis explores the nexus of these various Remote Actuation and Sensing tasks, from macro to micro scale.

Advisor: Ken Goldberg


BibTeX citation:

@mastersthesis{Aeron:EECS-2024-91,
    Author= {Aeron, Shrey},
    Title= {Industrial Applications of Remote Monitoring, Learning, and Actuation},
    School= {EECS Department, University of California, Berkeley},
    Year= {2024},
    Month= {May},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-91.html},
    Number= {UCB/EECS-2024-91},
    Abstract= {AlphaGarden
Remote Sensing applies in a variety of situations, ranging from environmental monitoring to the generation of actionable insights to object classification and detection.

AlphaGarden is an automated testbed for indoor polyculture farming with a gantry robot, sensors, plant and leaf phenotyping, a growth-cycle simulator, and custom tools. Image data from an overhead camera, along with soil-sensor readings from the testbed, are analyzed by a Tracking and Phenotyping network that estimates the garden's state at each time step; the estimated state is fed into the simulator, which decides on the next set of pruning and irrigation actions to maximize the garden's coverage and diversity. Results suggest the system can autonomously achieve 94% normalized plant diversity with pruning shears while maintaining an average canopy coverage of 84% by the end of the growth cycles.

Push-MOG
Recently, robots have seen rapidly increasing use in homes and warehouses to declutter by collecting objects from a planar surface and placing them into a container. While current techniques grasp objects individually, Multi-Object Grasping (MOG) can improve efficiency by increasing the average number of objects grasped per trip (OpT). However, grasping multiple objects requires the objects to be aligned and in close proximity. In this work, we propose Push-MOG, an algorithm that computes "fork pushing" actions using a parallel-jaw gripper to create graspable object clusters. In physical decluttering experiments, we find that Push-MOG enables multi-object grasps, increasing the average OpT by 34%.

This thesis explores the nexus of these various Remote Actuation and Sensing tasks, from macro to micro scale.},
}

EndNote citation:

%0 Thesis
%A Aeron, Shrey 
%T Industrial Applications of Remote Monitoring, Learning, and Actuation
%I EECS Department, University of California, Berkeley
%D 2024
%8 May 10
%@ UCB/EECS-2024-91
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2024/EECS-2024-91.html
%F Aeron:EECS-2024-91