HindSight: Enhancing Spatial Awareness by Sonifying Detected Objects in Real-Time 360-Degree Video

Eldon Schoop, James Smith, and Björn Hartmann

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2018-190

December 19, 2018

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-190.pdf

Our perception of our surrounding environment is limited by the constraints of human biology. The field of augmented perception asks how our sensory capabilities can be usefully extended through computational means. We argue that spatial awareness can be enhanced by exploiting recent advances in computer vision which make high-accuracy, real-time object detection feasible in everyday settings. We introduce HindSight, a wearable system that increases spatial awareness by detecting relevant objects in live 360-degree video and sonifying their position and class through bone conduction headphones. HindSight uses a deep neural network to locate and attribute semantic information to objects surrounding a user through a head-worn panoramic camera. It then uses bone conduction headphones, which preserve natural auditory acuity, to transmit audio notifications for detected objects of interest. We develop an application using HindSight to warn cyclists of approaching vehicles outside their field of view. To evaluate HindSight, we first conduct an exploratory study with 15 users. We next create a VR platform to simulate realistic traffic scenarios and use it to evaluate HindSight in a controlled user study with 21 participants. Participants using HindSight had fewer collisions, kept greater distance from other vehicles, experienced reduced cognitive load, and reported a perceived increase in awareness.
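The core of the pipeline the abstract describes is a mapping from a detection's position in the panoramic frame to a spatialized audio cue. A minimal sketch of that mapping is below; the frame width, the equirectangular projection, and the constant-power stereo pan are assumptions for illustration, not details of the HindSight implementation:

```python
import math

FRAME_WIDTH = 1920  # assumed width of an equirectangular 360-degree frame, in pixels

def bbox_to_azimuth(x_center, frame_width=FRAME_WIDTH):
    """Map a detection's horizontal center in an equirectangular frame to an
    azimuth in degrees: 0 is straight ahead, positive is to the wearer's right."""
    # Pixel 0 corresponds to -180 degrees; the frame spans the full circle.
    return (x_center / frame_width) * 360.0 - 180.0

def azimuth_to_stereo_gains(azimuth_deg):
    """Constant-power pan: convert an azimuth to (left, right) channel gains
    so a detected object's notification is heard from its direction."""
    # Fold rear azimuths onto the left/right axis for two-channel output.
    pan = math.sin(math.radians(azimuth_deg))   # -1 (full left) .. +1 (full right)
    theta = (pan + 1.0) * math.pi / 4.0         # 0 .. pi/2
    return math.cos(theta), math.sin(theta)     # gains satisfy L^2 + R^2 = 1

# Example: a vehicle detected behind and to the right of the rider.
azimuth = bbox_to_azimuth(1700)
left_gain, right_gain = azimuth_to_stereo_gains(azimuth)
```

Constant-power panning keeps perceived loudness roughly uniform as an object moves around the wearer, which matters when the cue's direction, not its level, carries the spatial information.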

Advisor: Björn Hartmann


BibTeX citation:

@mastersthesis{Schoop:EECS-2018-190,
    Author= {Schoop, Eldon and Smith, James and Hartmann, Björn},
    Title= {HindSight: Enhancing Spatial Awareness by Sonifying Detected Objects in Real-Time 360-Degree Video},
    School= {EECS Department, University of California, Berkeley},
    Year= {2018},
    Month= {Dec},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-190.html},
    Number= {UCB/EECS-2018-190},
    Abstract= {Our perception of our surrounding environment is limited by the constraints of human biology. The field of augmented perception asks how our sensory capabilities can be usefully extended through computational means. We argue that spatial awareness can be enhanced by exploiting recent advances in computer vision which make high-accuracy, real-time object detection feasible in everyday settings. We introduce HindSight, a wearable system that increases spatial awareness by detecting relevant objects in live 360-degree video and sonifying their position and class through bone conduction headphones. HindSight uses a deep neural network to locate and attribute semantic information to objects surrounding a user through a head-worn panoramic camera. It then uses bone conduction headphones, which preserve natural auditory acuity, to transmit audio notifications for detected objects of interest. We develop an application using HindSight to warn cyclists of approaching vehicles outside their field of view. To evaluate HindSight, we first conduct an exploratory study with 15 users. We next create a VR platform to simulate realistic traffic scenarios and use it to evaluate HindSight in a controlled user study with 21 participants. Participants using HindSight had fewer collisions, increased their space to other vehicles, experienced reduced cognitive load, and reported a perceived increase in awareness.},
}

EndNote citation:

%0 Thesis
%A Schoop, Eldon 
%A Smith, James 
%A Hartmann, Björn 
%T HindSight: Enhancing Spatial Awareness by Sonifying Detected Objects in Real-Time 360-Degree Video
%I EECS Department, University of California, Berkeley
%D 2018
%8 December 19
%@ UCB/EECS-2018-190
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2018/EECS-2018-190.html
%F Schoop:EECS-2018-190