Gregory Zelinsky, Ph.D.
Ph.D., Brown University, 1994
Associate Professor, Cognitive Science
Joint Associate Professor, Computer Science
Office: Psychology B-240
Office Hours: Flexible, by appointment
Phone Number: (631) 632-7827

e-mail: Gregory.Zelinsky@stonybrook.edu
Website: http://www.psychology.sunysb.edu/gzelinsky-/

Areas of Interest: Visual cognition, visual search, eye movements, visual attention, visual working memory, scene perception.

Current Research:
My work integrates cognitive, computational, and neuroimaging techniques to better understand a broad range of visual cognitive behaviors, including search, object representation, working memory, and scene perception. In one current project, I monitor how people move their eyes as they perform various visual search tasks, then describe this oculomotor behavior in the context of an image-based neurocomputational model. The model "sees" the same stimuli presented to the human observers and outputs a sequence of simulated eye movements as it performs the identical task. This simulated pattern of eye movements is then compared to the human behavior to evaluate and refine the model.

Recent Publications:

Schmidt, J., & Zelinsky, G.J. (2009). Search guidance is proportional to the categorical specificity of a target cue. Quarterly Journal of Experimental Psychology, 62(10), 1904-1914.

Yang, H., Chen, X., & Zelinsky, G.J. (2009). A new look at novelty effects: Guiding search away from old distractors. Attention, Perception, & Psychophysics, 71(3), 554-564.

Yang, H., & Zelinsky, G.J. (2009). Visual search is guided to categorically-defined targets. Vision Research, 49, 2095-2103.

Zelinsky, G.J., & Loschky, L.C. (2009). Using eye movements to study working memory rehearsal for objects in visual scenes. In N. A. Taatgen & H. van Rijn (Eds.), Proceedings of the 31st Annual Conference of the Cognitive Science Society (pp. 1312-1317). Austin, TX: Cognitive Science Society.

Zelinsky, G.J., & Schmidt, J. (2009). An effect of referential scene constraint on search implies scene segmentation. Visual Cognition, 17(6), 1004-1028.

Brennan, S., Chen, X., Dickinson, C., Neider, M., & Zelinsky, G.J. (2008). Coordinating cognition: The costs and benefits of shared gaze during collaborative search. Cognition, 106, 1465-1477.

Neider, M., & Zelinsky, G.J. (2008). Exploring set size effects in scenes: Identifying the objects of search. Visual Cognition, 16(1), 1-10.

Zelinsky, G.J. (2008). A theory of eye movements during target acquisition. Psychological Review, 115(4), 787-835.

Zelinsky, G.J., & Neider, M. (2008). An eye movement analysis of multiple object tracking in a realistic environment. Visual Cognition, 16, 553-566.

Zhang, W., Samaras, D., & Zelinsky, G.J. (2008). Classifying objects based on their visual similarity to target categories. Proceedings of the 30th Annual Conference of the Cognitive Science Society (pp. 1856-1861). Austin, TX: Cognitive Science Society.

Current Research Support:

National Institute of Mental Health (R01 MH063748-06A1), "Eye Movements During Real-world Visual Search: A behavioral and computational study."
4/1/09 - 3/31/14. $1,348,046 (total costs)
Gregory Zelinsky (Principal Investigator)

National Science Foundation (#0527585), "HSD: See Where I'm Looking: Using Shared Gaze to Coordinate Time-Critical Collaborative Tasks."
9/1/05 - 8/31/08. $742,006 (total costs)
Gregory Zelinsky (Principal Investigator)