CCN Colloquium: "Informational shortcuts in attentional guidance"
Speaker(s): Joy Geng, PhD (UC Davis)
Visual search is a complex task that humans perform many times throughout the day. Examples include searching for the perfect snack in a supermarket, looking for a misplaced phone in a cluttered room, or finding a friend in a busy restaurant. The efficiency with which we perform these tasks has a significant impact on the quality of our lives. When we can find what we're looking for quickly, we move easily through our daily tasks. When it is difficult, however, we waste time or even fail to achieve our final goals of making a meal, getting to an appointment, or socializing with a friend. Successful search relies on attentional mechanisms that organize knowledge of the past and expectations of the future to optimally sample information from the external world and guide behavior. But how do we do this? In this talk, I'll describe work from our lab examining how attention is guided toward targets using coarse but adaptive information, whereas target decisions rely on more precise and complete target information. We suggest that the guidance and decision stages of visual search use different template information to maximize search efficiency.
Duke Institute for Brain Sciences (DIBS)
Center for Cognitive Neuroscience; Psychology and Neuroscience