Armin Samii and James O'Brien

EECS Department, University of California, Berkeley

Technical Report No. UCB/EECS-2014-204

December 1, 2014

http://www2.eecs.berkeley.edu/Pubs/TechRpts/2014/EECS-2014-204.pdf

In this paper we describe a perceptual model that accounts for the time-varying changes to perceived color and brightness that occur due to time-varying adaptation and the transition between cone- and rod-mediated vision. Given a multispectral, high-dynamic-range image of a scene and the viewer's current adaptive state, this model produces a low-dynamic-range image that, when viewed photopically, creates a perception similar to what would have been experienced by the viewer in the original scene. When applied to a video sequence, our model produces temporally coherent output that tracks the viewer's adaptation state. A calibrated four-color-channel video camera is used to obtain video-rate data for scenes with moving objects. We describe a new demosaicing algorithm for working with this type of camera, and how the four color channels can be used to estimate independent rod and cone responses.
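The notion of a time-varying adaptation state is often modeled as an exponential blend of the current adaptation level toward the scene luminance. The sketch below is a minimal, generic illustration of that idea only; the function name, the single time constant `tau`, and the symmetric time course are assumptions for illustration and are not the model described in this report (real rod and cone adaptation follow different, asymmetric time courses).

```python
import math

def update_adaptation(L_adapt, L_scene, dt, tau=0.5):
    """Exponentially blend the viewer's adaptation luminance toward the
    current scene luminance.  `tau` (seconds) is a hypothetical time
    constant chosen for illustration; dt is the frame interval."""
    alpha = 1.0 - math.exp(-dt / tau)
    return L_adapt + alpha * (L_scene - L_adapt)

# Example: a viewer adapted to a bright scene (1000 cd/m^2) enters a
# dark scene (1 cd/m^2); adaptation decays gradually over the frames.
L = 1000.0
for _ in range(10):            # 10 frames at dt = 0.1 s
    L = update_adaptation(L, 1.0, dt=0.1)
```

Applied per frame of a video sequence, such a recursive update is what makes the output temporally coherent: each frame's appearance depends on the adaptation history, not just the instantaneous scene.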

Advisor: James O'Brien


BibTeX citation:

@mastersthesis{Samii:EECS-2014-204,
    Author= {Samii, Armin and O'Brien, James},
    Title= {A Perceptually Based Model of Visual Adaptation},
    School= {EECS Department, University of California, Berkeley},
    Year= {2014},
    Month= {Dec},
    Url= {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2014/EECS-2014-204.html},
    Number= {UCB/EECS-2014-204},
    Abstract= {In this paper we describe a perceptual model that accounts for the time-varying changes to perceived color and brightness that occur due to time-varying adaptation and the transition between cone- and rod-mediated vision.
Given a multispectral, high-dynamic-range image of a scene and the viewer's current adaptive state, this model produces a low-dynamic-range image that, when viewed photopically, creates a perception similar to what would have been experienced by the viewer in the original scene.
When applied to a video sequence, our model produces temporally coherent output which models the viewer's adaptation state.
A calibrated four-color-channel video camera is used to obtain video-rate data for scenes with moving objects.
We describe a new demosaicing algorithm for working with this type of camera, and how the four color channels can be used to estimate independent rod and cone responses.},
}

EndNote citation:

%0 Thesis
%A Samii, Armin 
%A O'Brien, James 
%T A Perceptually Based Model of Visual Adaptation
%I EECS Department, University of California, Berkeley
%D 2014
%8 December 1
%@ UCB/EECS-2014-204
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2014/EECS-2014-204.html
%F Samii:EECS-2014-204