Rutishauser, Ueli and Koch, Christof (2007) Probabilistic modeling of eye movement data during conjunction search via feature-based attention. Journal of Vision, 7 (6). Art. No. 5. ISSN 1534-7362. http://resolver.caltech.edu/CaltechAUTHORS:RUTjov07
Where the eyes fixate during search is not random; rather, gaze reflects the combination of information about the target and the visual input. It is not clear, however, what information about a target is used to bias the underlying neuronal responses. Here we engage subjects in a variety of simple conjunction search tasks while tracking their eye movements. We derive a generative model that reproduces these eye movements and calculate the conditional probabilities that observers fixate, given the target, on or near an item in the display sharing a specific feature with the target. We use these probabilities to infer which features were biased by top-down attention: Color seems to be the dominant stimulus dimension for guiding search, followed by object size, and lastly orientation. We use the number of fixations it took to find the target as a measure of task difficulty. We find that only a model that biases multiple feature dimensions in a hierarchical manner can account for the data. Contrary to common assumptions, memory plays almost no role in search performance. Our model can be fit to average data of multiple subjects or to individual subjects. Small variations of a few key parameters account well for the intersubject differences. The model is compatible with neurophysiological findings of V4 and frontal eye fields (FEF) neurons and predicts the gain modulation of these cells.
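To make the abstract's core idea concrete, here is a minimal sketch (not the authors' code) of a generative fixation model for conjunction search. Top-down attention is modeled as multiplicative gains on feature dimensions, applied hierarchically with color weighted most strongly, then size, then orientation; the gain values, feature sets, and display parameters below are illustrative assumptions. Fixations are sampled with probability proportional to gain, and the conditional probability that a fixated item shares the target's color is then estimated from the simulated scanpaths.

```python
import random

# Hypothetical top-down gains (illustrative assumptions, not fitted values):
# color dominates, followed by size, then orientation.
GAINS = {"color": 4.0, "size": 2.0, "orientation": 1.2}

def make_display(n_items=20):
    """Random display of items; a target-matching item is guaranteed present."""
    feats = {"color": ["red", "green"], "size": ["big", "small"],
             "orientation": ["horiz", "vert"]}
    target = {d: random.choice(v) for d, v in feats.items()}
    items = [{d: random.choice(v) for d, v in feats.items()}
             for _ in range(n_items - 1)]
    items.append(dict(target))  # ensure the target is in the display
    random.shuffle(items)
    return items, target

def search(items, target, max_fix=50):
    """Sample fixations with probability proportional to the product of
    feature gains; stop once an item matching the target on every
    dimension is fixated. Sampling is memoryless (no inhibition of
    return), in line with the finding that memory plays almost no role."""
    fixations = []
    for _ in range(max_fix):
        weights = []
        for it in items:
            w = 1.0
            for dim, gain in GAINS.items():
                if it[dim] == target[dim]:
                    w *= gain
            weights.append(w)
        it = random.choices(items, weights=weights, k=1)[0]
        fixations.append(it)
        if it == target:
            break
    return fixations

# Estimate P(fixated item shares the target's color), pooled over trials.
random.seed(0)
share_color = total = 0
for _ in range(200):
    items, target = make_display()
    for fix in search(items, target):
        share_color += fix["color"] == target["color"]
        total += 1
print(f"P(fixation shares target color) ~ {share_color / total:.2f}")
```

Because the color gain exceeds the other gains, the estimated conditional probability of fixating a color-matching item sits well above the 0.5 chance level, which is the kind of feature-specific fixation bias the paper uses to infer which dimensions top-down attention modulates.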
|Additional Information:||© 2007 by The Association for Research in Vision and Ophthalmology. Received July 11, 2006; published April 12, 2007. We would like to acknowledge the insightful comments of the anonymous reviewers that greatly benefited the paper. We would like to thank Ralph Adolphs for providing the Eyetracker equipment we used in the first experiment and Dirk Neumann and Wolfgang Einhaeuser for discussion. This work was supported by the National Geospatial Intelligence Agency (NGA), NIMH, NSF, ONR, and DARPA. Commercial relationships: none.|
|Group:||Koch Laboratory, KLAB|
|Subject Keywords:||visual attention, eye movements, computational model, probabilistic, visual search, V4, FEF|
|Official Citation:||Rutishauser, U., & Koch, C. (2007). Probabilistic modeling of eye movement data during conjunction search via feature-based attention. Journal of Vision, 7(6):5, 1–20, http://journalofvision.org/7/6/5/, doi:10.1167/7.6.5.|
|Usage Policy:||No commercial reproduction, distribution, display or performance rights in this work are provided.|
|Deposited By:||Rutishauser Ueli|
|Deposited On:||25 Jan 2008|
|Last Modified:||20 Mar 2017 22:40|