Neural Dynamics of Motion Grouping: From Aperture Ambiguity to Object Speed and Direction
A neural network model of visual motion perception and speed discrimination is developed to simulate data concerning the conditions under which components of moving stimuli do or do not cohere into a global direction of motion, as in barberpole and plaid patterns (both Type 1 and Type 2). The model also simulates how the perceived speed of lines moving in a prescribed direction depends upon their orientation, length, duration, and contrast. Motion direction and speed both emerge as part of an interactive motion grouping or segmentation process. The model proposes a solution to the global aperture problem by showing how information from feature-tracking points, namely locations from which unambiguous motion directions can be computed, can propagate to ambiguous motion direction points and capture the motion signals there. The model does this without computing intersections of constraints or parallel Fourier and non-Fourier pathways. Instead, the model uses orientationally unselective cell responses to activate directionally tuned transient cells. These transient cells, in turn, activate spatially short-range filters and competitive mechanisms over multiple spatial scales to generate speed-tuned and directionally tuned cells. Spatially long-range filters and top-down feedback from grouping cells are then used to track motion of featural points and to select and propagate correct motion directions to ambiguous motion points. Top-down grouping can also prime the system to attend to a particular motion direction. The model hereby links low-level automatic motion processing with attention-based motion processing. Homologs of model mechanisms have been used in models of other brain systems to simulate data about visual grouping, figure-ground separation, and speech perception.
Earlier versions of the model have simulated data about short-range and long-range apparent motion, second-order motion, and the effects of parvocellular and magnocellular LGN lesions on motion perception.

Office of Naval Research (N00014-92-J-4015, N00014-91-J-4100, N00014-95-1-0657, N00014-95-1-0409, N00014-91-J-0597); Air Force Office of Scientific Research (F49620-92-J-0225, F49620-92-J-0499); National Science Foundation (IRI-90-00530)
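The aperture problem this model addresses can be illustrated in a few lines: a local detector viewing a straight edge can recover only the velocity component normal to that edge, which is why unambiguous feature-tracking points (line endpoints, corners) are needed to capture the full 2-D motion at ambiguous points. A minimal sketch, with the geometric conventions (orientation angle, normal direction) assumed for illustration rather than taken from the model:

```python
import numpy as np

def normal_component(true_velocity, edge_orientation_deg):
    """Locally measurable motion at a straight edge: only the component
    normal to the edge's orientation survives (the aperture problem)."""
    theta = np.deg2rad(edge_orientation_deg)
    normal = np.array([np.sin(theta), -np.cos(theta)])  # unit normal to the edge
    speed_along_normal = float(np.dot(true_velocity, normal))
    return speed_along_normal * normal

# A vertical edge (90 deg) translating right and up with velocity (1, 1):
v = np.array([1.0, 1.0])
measured = normal_component(v, 90.0)
# only the horizontal component is measurable; the vertical drift is invisible
```

A feature-tracking point such as the line's endpoint does carry the full vector (1, 1); the model's long-range filters and grouping feedback propagate such unambiguous signals to capture the ambiguous edge responses.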
Neural Dynamics of Motion Perception: Direction Fields, Apertures, and Resonant Grouping
A neural network model of global motion segmentation by visual cortex is described. Called the Motion Boundary Contour System (BCS), the model clarifies how ambiguous local movements on a complex moving shape are actively reorganized into a coherent global motion signal. Unlike many previous researchers, we analyse how a coherent motion signal is imparted to all regions of a moving figure, not only to regions at which unambiguous motion signals exist. The model hereby suggests a solution to the global aperture problem. The Motion BCS describes how preprocessing of motion signals by a Motion Oriented Contrast Filter (MOC Filter) is joined to long-range cooperative grouping mechanisms in a Motion Cooperative-Competitive Loop (MOCC Loop) to control phenomena such as motion capture. The Motion BCS is computed in parallel with the Static BCS of Grossberg and Mingolla (1985a, 1985b, 1987). Homologous properties of the Motion BCS and the Static BCS, specialized to process movement directions and static orientations, respectively, support a unified explanation of many data about static form perception and motion form perception that have heretofore been unexplained or treated separately. Predictions are made about microscopic computational differences between the parallel cortical streams V1 → MT and V1 → V2 → MT, notably the magnocellular thick-stripe and parvocellular interstripe streams. It is shown how the Motion BCS can compute motion directions that may be synthesized from multiple orientations with opposite directions-of-contrast.
Interactions of model simple cells, complex cells, hypercomplex cells, and bipole cells are described, with special emphasis given to new functional roles in direction disambiguation for endstopping at multiple processing stages and to the dynamic interplay of spatially short-range and long-range interactions.

Air Force Office of Scientific Research (90-0175); Defense Advanced Research Projects Agency (90-0083); Office of Naval Research (N00014-91-J-4100)
An unconvincing transformation? Michelson's interferential spectroscopy
Albert Abraham Michelson (1852-1931), the American optical physicist best known for his precise determination of the velocity of light and for his experiments concerning aether drift, is less often acknowledged as the creator of new spectroscopic instrumentation and new spectroscopies. He devised a new method of light analysis relying upon his favourite instrument – a particular configuration of optical interferometer – and published investigations of spectral line separation, Doppler broadening and simple high-resolution spectra (1887-1898). Contemporaries did not pursue his method, and Michelson himself discarded the technique by the end of the decade, promoting a new device, the ‘echelon spectroscope’, as a superior instrument. High-resolution spectroscopy was taken up by others at the turn of the century using the echelon, Fabry-Pérot etalon and similar instruments. Michelson’s ‘Light Wave Analysis’ was largely forgotten, but was rediscovered c. 1950 and developed over the following three decades into a technique rechristened ‘Fourier transform spectroscopy’. This paper presents Michelson’s interferometric work as a continuum of personal interests and historical context, and as an example of ‘research technology’ and ‘peripheral science’.
Rotation Curves of Spiral Galaxies
Rotation curves of spiral galaxies are the major tool for determining the distribution of mass in spiral galaxies. They provide fundamental information for understanding the dynamics, evolution and formation of spiral galaxies. We describe various methods to derive rotation curves, and review the results obtained. We discuss the basic characteristics of observed rotation curves in relation to various galaxy properties, such as Hubble type, structure, activity, and environment.

Comment: 40 pages, 6 gif figures; Ann. Rev. Astron. Astrophys. Vol. 39, p. 137, 2001
Neural Dynamics of Motion Processing and Speed Discrimination
A neural network model of visual motion perception and speed discrimination is presented. The model shows how a distributed population code of speed tuning, one that realizes a size-speed correlation, can be derived from the simplest mechanisms whereby activations of multiple spatially short-range filters of different size are transformed into speed-tuned cell responses. These mechanisms use transient cell responses to moving stimuli, output thresholds that covary with filter size, and competition. These mechanisms are proposed to occur in the V1 → MT cortical processing stream. The model reproduces empirically derived speed discrimination curves and simulates data showing how visual speed perception and discrimination can be affected by stimulus contrast, duration, dot density and spatial frequency. Model motion mechanisms are analogous to mechanisms that have been used to model 3-D form and figure-ground perception. The model forms the front end of a larger motion processing system that has been used to simulate how global motion capture occurs, and how spatial attention is drawn to moving forms. It provides a computational foundation for an emerging neural theory of 3-D form and motion perception.

Office of Naval Research (N00014-92-J-4015, N00014-91-J-4100, N00014-95-1-0657, N00014-95-1-0409, N00014-94-1-0597); Air Force Office of Scientific Research (F49620-92-J-0499); National Science Foundation (IRI-90-00530)
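The core mechanism of this abstract — short-range filters of several sizes, output thresholds that covary with filter size, and competition yielding a speed-tuned population code — can be sketched schematically. Every number and tuning function below is illustrative (a hypothetical stand-in for the model's transient-cell dynamics), not taken from the paper:

```python
import numpy as np

# Hypothetical filter sizes (receptive-field extents) and output
# thresholds that covary with filter size, as the abstract describes.
filter_sizes = np.array([1.0, 2.0, 4.0, 8.0])
thresholds = 0.05 * filter_sizes

def population_response(speed):
    """Speed-tuned population code: each filter is driven most strongly
    when per-frame displacement matches its size (illustrative tuning),
    thresholded, then passed through divisive competition."""
    drive = np.exp(-((speed - filter_sizes) ** 2)
                   / (2.0 * (0.5 * filter_sizes) ** 2))
    rectified = np.maximum(drive - thresholds, 0.0)       # size-covarying thresholds
    return rectified / (0.1 + rectified.sum())            # competition (normalization)

# The most active filter shifts with stimulus speed (size-speed correlation):
slow = filter_sizes[np.argmax(population_response(0.5))]  # smallest filter wins
fast = filter_sizes[np.argmax(population_response(4.0))]  # a larger filter wins
```

The point of the sketch is only the qualitative size-speed correlation: because larger filters carry higher thresholds, the peak of the thresholded, competing population shifts toward larger filters as speed grows, giving a distributed speed code without any cell explicitly measuring velocity.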
Theory of the perceived motion direction of equal-spatial-frequency plaid stimuli.
At an early stage, 3 different systems independently extract visual motion information from visual inputs. At later stages, these systems combine their outputs. Here, we consider a much-studied (>650 publications) class of visual stimuli, plaids, which are combinations of 2 sine waves. Currently, there is no quantitative theory that can account for the perceived motion of plaids. We consider only perceived plaid direction, not speed, and obtain a large set of data exploring the various dimensions in which same-spatial-frequency plaids differ. We find that only 2 of the 3 motion systems are active in plaid processing, and that plaids with temporal frequencies of 10 Hz or greater typically stimulate only the first-order motion system, which combines the plaid components by vector summation: each plaid component is represented by a contrast-strength vector whose length is contrast-squared times a factor representing the relative effectiveness of that component's temporal frequency. The third-order system, which becomes primary at low temporal frequencies, also represents a plaid as 2 vectors that sum according to their contrast strengths: one for a pure plaid in which both components have equal contrast, and one for the residual sine wave. Second-order motion is irrelevant for these plaids. These principles enable a contrast-strength-vector summation theory for the responses of the first-order and third-order motion systems. With zero parameters estimated from the data, the theory captures the essence of the full range of the plaid data and supports the counterintuitive hypothesis that motion direction is processed independently of speed at early stages of visual processing.
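The first-order combination rule stated above — each component contributes a vector along its drift direction whose length is contrast-squared times a temporal-frequency effectiveness factor — reduces to a short computation. The `tf_weights` argument below is a generic stand-in for the paper's temporal-frequency effectiveness term, so treat this as a sketch of the summation rule, not the fitted theory:

```python
import numpy as np

def plaid_direction(dirs_deg, contrasts, tf_weights):
    """Predicted perceived direction under contrast-strength vector
    summation: each component is a vector along its drift direction
    with length contrast**2 * tf_weight; the percept follows the sum."""
    angles = np.deg2rad(np.asarray(dirs_deg, dtype=float))
    lengths = np.asarray(contrasts, dtype=float) ** 2 * np.asarray(tf_weights, dtype=float)
    vx = (lengths * np.cos(angles)).sum()
    vy = (lengths * np.sin(angles)).sum()
    return np.rad2deg(np.arctan2(vy, vx))

# Equal-contrast, equal-weight components at +30 and -30 deg sum to the
# bisector (0 deg); raising one component's contrast pulls the predicted
# direction toward that component.
symmetric = plaid_direction([30, -30], [0.5, 0.5], [1.0, 1.0])
biased = plaid_direction([30, -30], [0.8, 0.4], [1.0, 1.0])
```

Note that because strength goes as contrast squared, a 2:1 contrast ratio produces a 4:1 ratio of vector lengths, so the predicted direction is pulled strongly toward the higher-contrast component.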
Cortical Dynamics of Visual Motion Perception: Short-Range and Long-Range Apparent Motion
This article describes further evidence for a new neural network theory of biological motion perception that is called a Motion Boundary Contour System. This theory clarifies why parallel streams V1 → V2 and V1 → MT exist for static form and motion form processing among the areas V1, V2, and MT of visual cortex. The Motion Boundary Contour System consists of several parallel copies, such that each copy is activated by a different range of receptive field sizes. Each copy is further subdivided into two hierarchically organized subsystems: a Motion Oriented Contrast Filter, or MOC Filter, for preprocessing moving images; and a Cooperative-Competitive Feedback Loop, or CC Loop, for generating emergent boundary segmentations of the filtered signals. The present article uses the MOC Filter to explain a variety of classical and recent data about short-range and long-range apparent motion percepts that have not yet been explained by alternative models. These data include split motion; reverse-contrast gamma motion; delta motion; visual inertia; group motion in response to a reverse-contrast Ternus display at short interstimulus intervals; speed-up of motion velocity as interflash distance increases or flash duration decreases; dependence of the transition from element motion to group motion on stimulus duration and size; various classical dependencies between flash duration, spatial separation, interstimulus interval, and motion threshold known as Korte's Laws; and dependence of motion strength on stimulus orientation and spatial frequency.
These results supplement earlier explanations by the model of apparent motion data that other models have not explained; a recently proposed solution of the global aperture problem, including explanations of motion capture and induced motion; an explanation of how parallel cortical systems for static form perception and motion form perception may develop, including a demonstration that these parallel systems are variations on a common cortical design; an explanation of why the geometries of static form and motion form differ, in particular why opposite orientations differ by 90°, whereas opposite directions differ by 180°, and why a cortical stream V1 → V2 → MT is needed; and a summary of how the main properties of other motion perception models can be assimilated into different parts of the Motion Boundary Contour System design.

Air Force Office of Scientific Research (90-0175); Army Research Office (DAAL-03-88-K0088); Defense Advanced Research Projects Agency (AFOSR-90-0083); Hughes Aircraft Company (S1-903136)