
    The significance of memory in sensory cortex

    Early sensory cortex is typically investigated in response to sensory stimulation, masking the contribution of internal signals. Recently, van Kerkoerle and colleagues reported that attention and memory signals segregate from sensory signals within specific layers of primary visual cortex, providing insight into the role of internal signals in sensory processing.

    Evolution of Symbolisation in Chimpanzees and Neural Nets

    From the Introduction: Animal communication systems and human languages can be characterised by the type of cognitive abilities that they require. If we consider the main semiotic distinction between communication using icons, signals, or symbols (Peirce, 1955; Harnad, 1990; Deacon, 1997), we can identify a different cognitive load for each type of reference. The use and understanding of icons require instinctive behaviour (e.g. emotions) or simple perceptual processes (e.g. visual similarities between an icon and its meaning). Communication systems that use signals are characterised by referential associations between objects and visual or auditory signals; they require the cognitive ability to learn stimulus associations, as in conditional learning. Symbols involve double associations: first, symbolic systems require the establishment of associations between signals and objects; second, further relationships are learned between the signals themselves. The use of rules for the logical combination of symbols is an example of such a symbolic relationship. Symbolisation is the ability to acquire and handle symbols and symbolic relationships.
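
    As a rough illustration of the distinction drawn here (not code from the paper), the sketch below contrasts signal-level reference, a direct pairing of calls with objects as in conditional learning, with symbolic reference, where additional relations are defined over the signals themselves. All names and the combination rule are invented for the example.

```python
# Illustrative sketch (hypothetical tokens, not from the paper):
# signal-level associations vs. symbol-level relations.

# Signal-level reference: each signal is directly paired with an object,
# as acquired through conditional (associative) learning.
signal_to_object = {
    "call_A": "predator",
    "call_B": "food",
}

# Symbolic reference adds a second layer: relations defined over the
# signals themselves (e.g. a combination rule), not over the objects.
symbol_relations = {
    ("call_A", "call_B"): "combined_warning",  # hypothetical rule
}

def interpret(tokens):
    """Resolve a token sequence, preferring symbolic (relational) readings."""
    if tuple(tokens) in symbol_relations:
        return symbol_relations[tuple(tokens)]
    return [signal_to_object.get(t, "unknown") for t in tokens]

print(interpret(["call_A"]))            # signal level: ['predator']
print(interpret(["call_A", "call_B"]))  # symbol level: the rule applies
```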

    Sensitivity to Timing and Order in Human Visual Cortex

    Visual recognition takes a small fraction of a second and relies on the cascade of signals along the ventral visual stream. Given the rapid path through multiple processing steps between photoreceptors and higher visual areas, information must progress from stage to stage very quickly. This rapid progression suggests that fine temporal details of the neural response may be important to how the brain encodes visual signals. We investigated how changes in the relative timing of incoming visual stimulation affect the representation of object information by recording intracranial field potentials along the human ventral visual stream while subjects recognized objects whose parts were presented with varying asynchrony. Visual responses along the ventral stream were sensitive to timing differences between parts as small as 17 ms. In particular, there was a strong dependency on the temporal order of stimulus presentation, even at short asynchronies. This sensitivity to the order of stimulus presentation provides evidence that the brain may use differences in relative timing as a means of representing information. Comment: 10 figures, 1 table
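
    A toy analysis sketch of the kind of comparison described, using synthetic data rather than the paper's intracranial recordings: simulate trial sets for the two presentation orders at a 17 ms onset asynchrony and compare their trial-averaged responses. The response shape, noise level, and sensitivity index are all assumptions for illustration.

```python
# Synthetic sketch: quantify sensitivity to presentation order at a fixed
# part asynchrony by comparing A-before-B with B-before-A trial averages.
import numpy as np

rng = np.random.default_rng(0)
fs = 1000                       # sampling rate (Hz), assumed
soa_ms = 17                     # stimulus onset asynchrony between parts
t = np.arange(0, 0.5, 1 / fs)   # 500 ms epoch

def evoked(onset_ms):
    """Toy evoked response: a damped bump starting at onset_ms."""
    onset = onset_ms / 1000.0
    r = np.where(t > onset, (t - onset) * np.exp(-(t - onset) / 0.05), 0.0)
    return r / r.max()

def trials(order, n=50, noise=0.3):
    """Simulate n noisy trials; 'AB' presents part A first, 'BA' the reverse."""
    lag_a, lag_b = (0, soa_ms) if order == "AB" else (soa_ms, 0)
    clean = evoked(lag_a) + 0.8 * evoked(lag_b)   # parts weighted unequally
    return clean + noise * rng.standard_normal((n, t.size))

ab, ba = trials("AB"), trials("BA")
# Order-sensitivity index: mean absolute difference of the trial-averaged
# responses, normalized by the pooled trial-to-trial standard deviation.
diff = np.abs(ab.mean(0) - ba.mean(0)).mean()
pooled_sd = np.sqrt(0.5 * (ab.std(0) ** 2 + ba.std(0) ** 2)).mean()
print(f"order sensitivity at {soa_ms} ms SOA: {diff / pooled_sd:.3f}")
```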

    Signaling in a polluted world: oxidative stress as an overlooked mechanism linking contaminants to animal communication

    The capacity to communicate effectively with other individuals plays a critical role in the daily life of an animal and can have important fitness consequences. Animals rely on a number of visual and non-visual signals, whose production brings costs to the individual. The theory of honest signaling states that these costs are higher for low-quality than for high-quality individuals, which prevents cheating and makes signals, such as skin and plumage colouration, indicators of an individual's quality or condition. The condition-dependent nature of signals makes them ideally suited as indicators of environmental quality, implying that signal production might be affected by contaminants. In this mini-review article, we have made the point that oxidative stress (OS) is one overlooked mechanism linking exposure to contaminants to signaling because (i) many contaminants can influence the individual's oxidative balance, and (ii) generation of both visual and non-visual signals is sensitive to oxidative stress. To this end, we have provided the first comprehensive review of the ways in which both inorganic (heavy metals, especially mercury) and organic (persistent organic pollutants) contaminants may influence either OS or sexual signaling. We have also paid special attention to emerging classes of pollutants, such as brominated flame retardants and perfluoroalkoxy alkanes, in order to stimulate research in this area. We have finally provided suggestions and warnings for future work on the links among OS, sexual signaling and contaminant exposure.

    Luminous Intensity for Traffic Signals: A Scientific Basis for Performance Specifications

    Human factors experiments on visual responses to simulated traffic signals using incandescent lamps and light-emitting diodes are described.

    Information Optimization in Coupled Audio-Visual Cortical Maps

    Barn owls hunt in the dark by using cues from both sight and sound to locate their prey. This task is facilitated by topographic maps of the external space formed by neurons (e.g., in the optic tectum) that respond to visual or aural signals from a specific direction. Plasticity of these maps has been studied in owls forced to wear prismatic spectacles that shift their visual field. Adaptive behavior in young owls is accompanied by a compensating shift in the response of (mapped) neurons to auditory signals. We model the receptive fields of such neurons by linear filters that sample correlated audio-visual signals, and search for filters that maximize the gathered information, while subject to the costs of rewiring neurons. Assuming a higher fidelity of visual information, we find that the corresponding receptive fields are robust and unchanged by artificial shifts. The shape of the aural receptive field, however, is controlled by correlations between sight and sound. In response to prismatic glasses, the aural receptive fields shift in the compensating direction, although their shape is modified due to the costs of rewiring. Comment: 7 pages, 1 figure
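
    As a rough numerical sketch of the kind of objective described, information gathered by a linear filter minus a rewiring cost, the toy model below uses a scalar Gaussian source, two noisy cues (a more reliable "visual" one and a noisier "auditory" one), and the closed-form information of a Gaussian linear readout. All parameter values are assumptions; this is not the paper's actual formulation.

```python
# Toy model: a linear readout of correlated visual/auditory Gaussian cues,
# optimized to maximize information about source direction minus a
# quadratic rewiring cost.
import numpy as np
from scipy.optimize import minimize

# Source direction s ~ N(0, 1); cues x = g*s + noise. The visual cue
# (index 0) is assumed more reliable than the auditory cue (index 1).
g = np.array([1.0, 1.0])            # coupling of each cue to the source
noise_cov = np.diag([0.1, 0.5])     # visual noise < auditory noise (assumption)
readout_var = 0.05                  # intrinsic noise of the readout neuron
lam = 0.2                           # cost per unit of synaptic rewiring
w_old = np.array([1.0, 0.3])        # pre-existing filter weights, hypothetical

def info(w):
    """Mutual information I(y; s) in nats for y = w.x + readout noise (Gaussian)."""
    signal = (w @ g) ** 2
    noise = w @ noise_cov @ w + readout_var
    return 0.5 * np.log(1.0 + signal / noise)

def objective(w):
    # Negative (information - rewiring cost), for minimization.
    return -(info(w) - lam * np.sum((w - w_old) ** 2))

w_opt = minimize(objective, w_old).x
print("optimal filter weights:", np.round(w_opt, 3))
print("information gathered (nats):", round(info(w_opt), 3))
```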

    Learning Visual Features from Snapshots for Web Search

    When applying learning-to-rank algorithms to Web search, a large number of features are usually designed to capture the relevance signals. Most of these features are computed from the extracted textual elements, link analysis, and user logs. However, Web pages are not solely linked texts; they have structured layouts organizing a large variety of elements in different styles. Such layout itself can convey useful visual information, indicating the relevance of a Web page. For example, the query-independent layout (i.e., raw page layout) can help identify the page quality, while the query-dependent layout (i.e., the page rendered with matched query words) can further reveal rich structural information (e.g., size, position and proximity) of the matching signals. However, such visual layout information has seldom been utilized in Web search in the past. In this work, we propose to learn rich visual features automatically from the layout of Web pages (i.e., Web page snapshots) for relevance ranking. Both query-independent and query-dependent snapshots are considered as the new inputs. We then propose a novel visual perception model inspired by humans' visual search behaviors on page viewing to extract the visual features. This model can be learned end-to-end together with traditional human-crafted features. We also show that such visual features can be efficiently acquired in the online setting with an extended inverted indexing scheme. Experiments on benchmark collections demonstrate that learning visual features from Web page snapshots can significantly improve the performance of relevance ranking in ad-hoc Web retrieval tasks. Comment: CIKM 201
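
    A minimal sketch of the general approach described, combining learned visual features from a page snapshot with traditional hand-crafted ranking features in one scoring model trained end to end. The architecture, input sizes, and pairwise hinge loss below are assumptions for illustration in PyTorch, not the paper's actual model or training objective.

```python
# Assumed architecture: a small CNN over the rendered snapshot, concatenated
# with hand-crafted features, feeding a relevance scoring head.
import torch
import torch.nn as nn

class SnapshotRanker(nn.Module):
    def __init__(self, n_handcrafted=20):
        super().__init__()
        # Small CNN over the snapshot image (3 x 64 x 64 here for brevity).
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Combine visual features with hand-crafted ones (e.g. BM25, PageRank).
        self.score = nn.Sequential(
            nn.Linear(32 + n_handcrafted, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, snapshot, handcrafted):
        visual = self.cnn(snapshot)                       # (batch, 32)
        return self.score(torch.cat([visual, handcrafted], dim=1)).squeeze(-1)

# Pairwise hinge loss on (relevant, irrelevant) document pairs, a common
# learning-to-rank choice; random tensors stand in for real snapshots/features.
model = SnapshotRanker()
pos_snap, neg_snap = torch.rand(8, 3, 64, 64), torch.rand(8, 3, 64, 64)
pos_feat, neg_feat = torch.rand(8, 20), torch.rand(8, 20)
loss = torch.clamp(1.0 - model(pos_snap, pos_feat) + model(neg_snap, neg_feat), min=0).mean()
loss.backward()
print("pairwise hinge loss:", float(loss))
```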