Neural dynamics of motion grouping: from aperture ambiguity to object speed and direction

Abstract

A neural network model of visual motion perception and speed discrimination is developed to simulate data concerning the conditions under which components of moving stimuli cohere or not into a global direction of motion, as in barberpole and plaid patterns (both type 1 and type 2). The model also simulates how the perceived speed of lines moving in a prescribed direction depends on their orientation, length, duration, and contrast. Motion direction and speed both emerge as part of an interactive motion grouping or segmentation process. The model proposes a solution to the global aperture problem by showing how information from feature tracking points, namely, locations from which unambiguous motion directions can be computed, can propagate to ambiguous motion direction points and capture the motion signals there. The model does this without computing intersections of constraints or parallel Fourier and non-Fourier pathways. Instead, the model uses orientationally unselective cell responses to activate directionally tuned transient cells. These transient cells, in turn, activate spatially short-range filters and competitive mechanisms over multiple spatial scales to generate speed-tuned and directionally tuned cells. Spatially long-range filters and top–down feedback from grouping cells are then used to track motion of featural points and to select and propagate correct motion directions to ambiguous motion points. Top–down grouping can also prime the system to attend a particular motion direction. The model hereby links low-level automatic motion processing with attention-based motion processing. Homologs of model mechanisms have been used in models of other brain systems to simulate data about visual grouping, figure–ground separation, and speech perception. Earlier versions of the model have simulated data about short-range and long-range apparent motion, second-order motion, and the effects of parvocellular and magnocellular lateral geniculate nucleus lesions on motion perception.
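As a rough illustration only, the sketch below walks through a 1-D analog of the pipeline described above in Python/NumPy: frame-pair direction detectors, short-range pooling over several scales with a competitive squashing stage, and a long-range grouping stage whose feedback lets unambiguous feature-tracking points capture the direction signals at ambiguous points. None of the function names, kernels, thresholds, or the toy bar stimulus come from the paper; they are placeholders under the assumption that a 1-D caricature is enough to show the capture idea, and the model's 2-D orientationally unselective transient cells, multiple parallel stages, and attentive priming are not reproduced here.

```python
# Hypothetical toy sketch (not the paper's equations): a 1-D analog of motion
# capture. Interior points of a moving bar excite both direction detectors
# (ambiguous), its ends excite only one (feature-tracking points), and a
# long-range grouping stage propagates the unambiguous direction.
import numpy as np


def directional_transients(frames, shift=1):
    """Crude correlation-type direction detectors on successive frames.
    (In the model these are driven by orientationally unselective transient
    cells; that stage is folded into the frame pairing here.)"""
    prev, curr = frames[:-1], frames[1:]
    right = curr * np.roll(prev, shift, axis=1)    # prefers rightward motion
    left = curr * np.roll(prev, -shift, axis=1)    # prefers leftward motion
    return right, left


def short_range_pool(signal, scales=(1, 2, 4)):
    """Short-range spatial filters at several scales plus a squashing
    nonlinearity standing in for the competitive stage."""
    pooled = []
    for s in scales:
        kernel = np.ones(2 * s + 1) / (2 * s + 1)
        pooled.append(np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), 1, signal))
    pooled = np.mean(pooled, axis=0)
    return pooled ** 2 / (0.01 + pooled ** 2)


def long_range_capture(right, left, n_iter=20):
    """Long-range filter with feedback: unambiguous (feature-tracking)
    locations keep their direction; ambiguous active locations adopt the
    direction carried in by the long-range kernel ("motion capture")."""
    pref = right - left                       # signed direction preference
    active = (right + left) > 0.1             # locations with motion energy
    confident = np.abs(pref) > 0.2            # crude feature-tracking mask
    state = np.where(confident, np.sign(pref), 0.0)
    kernel = np.ones(9) / 9.0                 # long-range spatial filter
    for _ in range(n_iter):
        spread = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), 1, state)
        captured = np.tanh(4.0 * spread)      # feedback-selected direction
        state = np.where(confident, np.sign(pref),
                         np.where(active, captured, 0.0))
    return state


# Toy stimulus: a bright bar drifting rightward by one pixel per frame.
T, N = 12, 64
frames = np.zeros((T, N))
for t in range(T):
    frames[t, 10 + t:18 + t] = 1.0

right, left = directional_transients(frames)
direction = long_range_capture(short_range_pool(right), short_range_pool(left))
# Positive values = rightward; the ambiguous bar interior ends up captured.
print(np.round(direction[-1], 2))
```

Running this prints a direction field for the final frame pair in which the bar's ambiguous interior, initially undecided, ends up assigned the rightward direction propagated inward from its unambiguous ends.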

© 1997 Optical Society of America
