Greg Zelinsky
Professor and Lab Director
Ph.D., Brown University, 1994
gregory.zelinsky@stonybrook.edu
My goal is to better understand visual cognition by following two interrelated research paths. First, I monitor
and analyze how people move their eyes as they perform various visual search and visual working memory tasks.
I do this in order to obtain an on-line and directly observable measure of how a behavior intimately associated
with the selection and accumulation of information (i.e., eye movements) changes in space and time during a task. Second, I
attempt to describe this oculomotor behavior in the context of image-based neurocomputational models. These models perform
the same task and “see” the same stimuli as human observers, and output a sequence of simulated eye movements that can be
compared to human behavior. These comparisons are then used to generate new hypotheses to further test the representations and
processes underlying task performance.
Robert Alexander
Ph.D. Candidate in Cognitive Science
robert.alexander@stonybrook.edu
My research is focused on the effects of item similarity and object part structure on eye movements during
visual search tasks. Both target-distractor similarity and distractor-distractor similarity are assumed to
determine the difficulty of search tasks, but evidence for these effects has come only from accuracy
and reaction time measures using simple, synthetic stimuli. I am working to extend this research to more realistic search
tasks using complex, photorealistic images and to examine how these forms of item similarity affect eye movements. I am
examining both semantic similarity and visual feature similarity through the use of a variety of similarity measures. I am also
exploring how object part structure affects the encoding of information from search previews and search for objects that can be
encoded in terms of parts.
Justin Maxfield
Ph.D. Candidate in Cognitive Science
justin.maxfield@stonybrook.edu
I research the relationship between visual search and object categorization. I’m currently studying how varying
the hierarchical level at which a target is cued influences search guidance and target verification, as measured
by the time needed to first fixate the target after search display onset and the time needed to decide that it is
the target once it has been fixated, respectively. I am also interested in how other factors known to affect category verification times,
such as object typicality or the feature overlap between categories of objects, might also affect search guidance and target
verification. By investigating these relationships between categorization and search, we can better understand the representation of
targets in categorical search tasks.
Ashley Sherman
Ph.D. Candidate in Cognitive Science
ashley.sherman@stonybrook.edu
I’m interested in various aspects of visual perception, attention, and memory. Recently, I have been focusing on
understanding how motion information is used during multiple object tracking. For example, I am exploring how
the predictability of object motion affects tracking performance (from both low- and high-level perspectives).
Hossein Adeli
Ph.D. Candidate in Cognitive Science
hossein.adelijelodar@stonybrook.edu
I am interested in the problem of recognition and in how people carve up (segment) visual sensory input into
perceptual categories. I am currently working on computational models of saccade programming in the superior
colliculus (and the oculomotor system more broadly), extending these models to accept complex visual stimuli as
input. I am also working on decoding how observers understand and describe complex scenes from their
eye-movement behavior (their sequential sampling of the image).
Lab Alums