Evan Finnigan

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2019-79

May 17, 2019

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2019/EECS-2019-79.pdf

Many webcam-based eye gaze tracking systems are intended for use where eye gaze is limited to a computer screen. While these methods can work well for certain applications, like improving the usability of webpages, they do not work well when gaze tracking is needed in non-screen-based tasks. In our case, we want to track the user’s gaze on a tabletop to allow for eye control of a planar robotic device with just one webcam and no other equipment. To allow for eye gaze tracking in this application, we need to first find the user’s visual axis and then use the visual axis to infer where the user is looking on the tabletop. This report discusses how we can find a user’s visual axis in a way that is flexible to the wide variety of head positions and eye rotations that are present when a user is naturally completing tasks on a tabletop. The method used here has three steps. First, the user’s eyeball centers are located using a system that tracks facial landmarks in 3D. Second, the user’s pupil centers are located in 2D. Third, simple projective geometry is used to find the user’s visual axis from the 3D eye location and 2D pupil location. This method achieves 22° error in the detected visual axis angle, an improvement over OpenFace, a strong prior webcam-based eye tracker that works with any head orientation. Using OpenFace alone, the visual axis angle error is over 28°.
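
To illustrate the third step, below is a minimal sketch of how a gaze direction can be recovered from a 3D eyeball center and a 2D pupil detection under a pinhole camera model. This is not the report's implementation: the function name, the fixed eyeball radius, and the NumPy-based handling of the intrinsic matrix K are illustrative assumptions. Strictly speaking, the center-to-pupil line is the optical axis; any per-user offset to the true visual axis is omitted here.

    import numpy as np

    def gaze_ray_from_pupil(eye_center, pupil_px, K, eye_radius=0.012):
        # eye_center : (3,) eyeball center in camera coordinates, meters
        # pupil_px   : (2,) detected pupil center in image pixels
        # K          : (3, 3) pinhole camera intrinsic matrix
        # eye_radius : assumed eyeball radius in meters (illustrative constant)

        # Back-project the pupil pixel to a unit ray through the camera origin.
        ray = np.linalg.inv(K) @ np.array([pupil_px[0], pupil_px[1], 1.0])
        ray /= np.linalg.norm(ray)

        # Intersect the ray with the eyeball sphere |t*ray - eye_center|^2 = r^2,
        # taking the nearer root, which corresponds to the visible pupil surface.
        b = -2.0 * ray @ eye_center
        disc = b * b - 4.0 * (eye_center @ eye_center - eye_radius ** 2)
        if disc < 0:
            raise ValueError("pupil ray does not intersect the eyeball sphere")
        t = (-b - np.sqrt(disc)) / 2.0
        pupil_3d = t * ray

        # The gaze direction runs from the eyeball center through the 3D pupil.
        axis = pupil_3d - eye_center
        return axis / np.linalg.norm(axis)

Intersecting the returned direction with the known tabletop plane would then give the 2D gaze point described in the abstract.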

Advisor: Ruzena Bajcsy


BibTeX citation:

@mastersthesis{Finnigan:EECS-2019-79,
    Author= {Finnigan, Evan},
    Title= {Eye Gaze Tracking for Assistive Devices},
    School= {EECS Department, University of California, Berkeley},
    Year= {2019},
    Month= {May},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2019/EECS-2019-79.html},
    Number= {UCB/EECS-2019-79},
    Abstract= {Many webcam-based eye gaze tracking systems are intended for use where eye gaze is limited to a computer screen. While these methods can work well for certain applications, like improving the usability of webpages, they do not work well when gaze tracking is needed in non-screen-based tasks. In our case, we want to track the user’s gaze on a tabletop to allow for eye control of a planar robotic device with just one webcam and no other equipment. To allow for eye gaze tracking in this application, we need to first find the user’s visual axis and then use the visual axis to infer where the user is looking on the tabletop. This report discusses how we can find a user’s visual axis in a way that is flexible to the wide variety of head positions and eye rotations that are present when a user is naturally completing tasks on a tabletop. The method used here has three steps. First, the user’s eyeball centers are located using a system that tracks facial landmarks in 3D. Second, the user’s pupil centers are located in 2D. Third, simple projective geometry is used to find the user’s visual axis from the 3D eye location and 2D pupil location. This method achieves 22° error in the detected visual axis angle, an improvement over OpenFace, a strong prior webcam-based eye tracker that works with any head orientation. Using OpenFace alone, the visual axis angle error is over 28°.},
}

EndNote citation:

%0 Thesis
%A Finnigan, Evan 
%T Eye Gaze Tracking for Assistive Devices
%I EECS Department, University of California, Berkeley
%D 2019
%8 May 17
%@ UCB/EECS-2019-79
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2019/EECS-2019-79.html
%F Finnigan:EECS-2019-79