14,208 research outputs found

    Cortical Dynamics of Contextually-Cued Attentive Visual Learning and Search: Spatial and Object Evidence Accumulation

    How do humans use predictive contextual information to facilitate visual search? How are consistently paired scenic objects and positions learned and used to more efficiently guide search in familiar scenes? For example, a certain combination of objects can define a context for a kitchen and trigger a more efficient search for a typical object, such as a sink, in that context. A neural model, ARTSCENE Search, is developed to illustrate the neural mechanisms of such memory-based contextual learning and guidance, and to explain challenging behavioral data on positive/negative, spatial/object, and local/distant global cueing effects during visual search. The model proposes how global scene layout at a first glance rapidly forms a hypothesis about the target location. This hypothesis is then incrementally refined by enhancing target-like objects in space as a scene is scanned with saccadic eye movements. The model clarifies the functional roles of neuroanatomical, neurophysiological, and neuroimaging data in visual search for a desired goal object. In particular, the model simulates the interactive dynamics of spatial and object contextual cueing in the cortical What and Where streams starting from early visual areas through medial temporal lobe to prefrontal cortex. After learning, model dorsolateral prefrontal cortical cells (area 46) prime possible target locations in posterior parietal cortex based on goal-modulated percepts of spatial scene gist represented in parahippocampal cortex, whereas model ventral prefrontal cortical cells (area 47/12) prime possible target object representations in inferior temporal cortex based on the history of viewed objects represented in perirhinal cortex. The model hereby predicts how the cortical What and Where streams cooperate during scene perception, learning, and memory to accumulate evidence over time to drive efficient visual search of familiar scenes.
    Supported by CELEST, an NSF Science of Learning Center (SBE-0354378), and the SyNAPSE program of the Defense Advanced Research Projects Agency (HR0011-09-3-0001, HR0011-09-C-0011).
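
    The spatial-cueing idea in this abstract can be illustrated with a small toy computation: a prior over candidate target locations derived from scene context, sharpened by evidence gathered at each fixation. This is only a generic evidence-accumulation sketch in Python, not the ARTSCENE Search model; the grid size, prior, scan path, and similarity values are invented for illustration.

```python
# Toy sketch (not the ARTSCENE Search model): accumulate spatial evidence for
# a target location by combining a learned "scene gist" prior with
# per-fixation feature evidence. All numbers here are illustrative assumptions.
import numpy as np

GRID = 8  # coarse 8x8 map of candidate target locations

# Assumed learned association: the scene context indexes a prior over locations
# (e.g. "kitchen" -> the sink tends to appear along one wall).
gist_prior = np.full((GRID, GRID), 1.0)
gist_prior[:, 6:] = 5.0                      # context says: target likely on the right
gist_prior /= gist_prior.sum()

posterior = np.log(gist_prior)               # work in log space for stability

def fixate(loc, target_similarity):
    """Accumulate evidence from one saccade: boost the fixated location in
    proportion to how target-like its local features are (similarity in [0, 1])."""
    likelihood = np.full((GRID, GRID), 0.5)
    likelihood[loc] = 0.5 + 0.5 * target_similarity
    return np.log(likelihood)

# Simulate a short scan path; the similarity values are made up for illustration.
for loc, sim in [((3, 2), 0.1), ((5, 6), 0.4), ((6, 7), 0.9)]:
    posterior += fixate(loc, sim)

best = np.unravel_index(np.argmax(posterior), posterior.shape)
print("most likely target location after scanning:", best)
```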

    Data-Driven Shape Analysis and Processing

    Data-driven methods play an increasingly important role in discovering geometric, structural, and semantic relationships between 3D shapes in collections, and applying this analysis to support intelligent modeling, editing, and visualization of geometric data. In contrast to traditional approaches, a key feature of data-driven approaches is that they aggregate information from a collection of shapes to improve the analysis and processing of individual shapes. In addition, they are able to learn models that reason about properties and relationships of shapes without relying on hard-coded rules or explicitly programmed instructions. We provide an overview of the main concepts and components of these techniques, and discuss their application to shape classification, segmentation, matching, reconstruction, modeling and exploration, as well as scene analysis and synthesis, through reviewing the literature and relating the existing works with both qualitative and numerical comparisons. We conclude our report with ideas that can inspire future research in data-driven shape analysis and processing.
    Comment: 10 pages, 19 figures
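
    The core "data-driven" idea, aggregating information from a collection of shapes instead of relying on hard-coded rules, can be sketched with a minimal retrieval example. The descriptors, labels, and k-NN classifier below are illustrative assumptions, not a method taken from the report.

```python
# Minimal sketch of data-driven shape analysis: label a new shape by looking
# at its nearest neighbors in a collection of labelled shape descriptors,
# rather than by applying hand-written geometric rules. Data is synthetic.
import numpy as np

rng = np.random.default_rng(1)

# Pretend collection: 200 shapes, each summarized by a 32-D descriptor
# (e.g. a curvature histogram or a learned embedding) plus a category label.
collection = rng.normal(size=(200, 32))
labels = rng.integers(0, 4, size=200)        # 4 made-up shape categories

def classify(query_descriptor, k=5):
    """Majority vote among the k nearest neighbors (squared Euclidean distance)."""
    d = ((collection - query_descriptor) ** 2).sum(axis=1)
    nearest = np.argsort(d)[:k]
    return np.bincount(labels[nearest]).argmax()

query = rng.normal(size=32)
print("predicted category for the query shape:", classify(query))
```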

    How to improve robustness in Kohonen maps and display additional information in Factorial Analysis: application to text mining

    This article is an extended version of a paper presented at the WSOM'2012 conference [1]. We present a combination of factorial projections, the SOM algorithm, and graph techniques applied to a text mining problem. The corpus contains 8 medieval manuscripts which were used to teach arithmetic techniques to merchants. Among the techniques for Data Analysis, those used for Lexicometry (such as Factorial Analysis) highlight the discrepancies between manuscripts, because they focus on the deviation from independence between words and manuscripts. Still, we also want to discover and characterize the vocabulary common to the whole corpus. Using the properties of stochastic Kohonen maps, which define neighborhoods between inputs in a non-deterministic way, we highlight the words which seem to play a special role in the vocabulary. We call them fickle and use them to improve both the robustness of the Kohonen map and the significance of the FCA visualization. Finally, we use graph algorithms to exploit this fickleness for the classification of words.
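
    One plausible reading of the "fickle word" idea is sketched below: re-run a stochastic SOM several times and flag words whose map neighborhoods keep changing across runs. The best-matching-unit coordinates are synthetic and the instability criterion is an assumption of this sketch; the article's exact definition is not reproduced here.

```python
# Hedged sketch of detecting "fickle" words from several stochastic SOM runs.
# bmus[r, w] holds the (row, col) of word w's best-matching unit in run r;
# here these coordinates are synthesized instead of coming from a real SOM.
import numpy as np

rng = np.random.default_rng(2)
n_words, n_runs, map_side = 40, 10, 6

# Stable words stay near one unit across runs; "fickle" ones jump around.
base = rng.integers(0, map_side, size=(n_words, 2))
jitter = rng.integers(-2, 3, size=(n_runs, n_words, 2)) * (rng.random((1, n_words, 1)) > 0.7)
bmus = np.clip(base[None] + jitter, 0, map_side - 1)

def neighbor_matrix(coords):
    """Two words are neighbors in one run if their BMUs are adjacent on the map."""
    cheby = np.abs(coords[:, None, :] - coords[None, :, :]).max(axis=-1)
    return cheby <= 1

# Fraction of runs in which each word pair ends up as map neighbors.
freq = np.mean([neighbor_matrix(bmus[r]) for r in range(n_runs)], axis=0)

# Call a word fickle (in this sketch) if many of its pairings are inconsistent:
# neighbors in some runs but not in others.
inconsistent = (freq > 0.2) & (freq < 0.8)
fickleness = inconsistent.sum(axis=1)
print("most fickle word indices:", np.argsort(-fickleness)[:5])
```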

    A Study of Self-Organizing Maps (SOM) Neural Network Using MATLAB

    Kohonen's Self-Organizing Map (SOM) is one of the most popular artificial neural network algorithms. Being an unsupervised neural network, Self-Organizing Maps have applications in different fields such as speech recognition and image processing. This project presents a study of the Self-Organizing Map neural network using MATLAB. The structure, characteristics, implementation, applications, and testing of this neural network for one-dimensional and two-dimensional patterns are considered, including the functions used for topology and the distance function of the network, illustrated through examples. The implementation consists of three stages: initialization (creation), training, and simulating. The neighborhood concept is explained, and MATLAB is used to show how a Self-Organizing Map is created, trained, and simulated. Creation consists of choosing the network parameters, plotting the results, and identifying all the functions used to create a Self-Organizing Map. Training consists of weight initialization and weight vector creation. Simulating means testing the neural network using the initialization parameters and the training vectors created in the previous two stages. The study also includes four tests: the first works through a manual calculation of the network mathematically; the other three are procedures for different applications written in MATLAB. It becomes evident from the graphs of the results that the weight vectors of the coordination field must be denser than the input vectors when the network is employed to separate the training patterns into output cells.
    Keywords: Neural Network, Simulating, Self-Organizing Maps, Competitive layer, training
    DOI: 10.7176/RHSS/10-6-01. Publication date: March 31st 2020
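
    The creation, training, and simulation stages described above can be sketched from scratch. The code below uses Python and NumPy rather than the MATLAB Neural Network Toolbox functions the study relies on, and the map size, learning rate, and neighborhood schedule are arbitrary choices made for illustration.

```python
# From-scratch SOM sketch mirroring the three stages named in the abstract:
# creation, training, and simulating. Not the MATLAB toolbox implementation.
import numpy as np

rng = np.random.default_rng(3)

def create_som(rows, cols, dim):
    """Creation: initialize a rows x cols grid of random weight vectors."""
    return rng.random((rows, cols, dim))

def train_som(weights, data, epochs=20, lr0=0.5, sigma0=2.0):
    """Training: for each input, pull the best-matching unit (BMU) and its
    grid neighbors toward the input, shrinking the learning rate and the
    neighborhood radius over time."""
    rows, cols, _ = weights.shape
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 0.5
        for x in rng.permutation(data):
            bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)), (rows, cols))
            dist2 = ((grid - np.array(bmu)) ** 2).sum(-1)
            h = np.exp(-dist2 / (2 * sigma ** 2))          # Gaussian neighborhood function
            weights += lr * h[..., None] * (x - weights)
    return weights

def simulate_som(weights, data):
    """Simulating: map each input to the grid coordinates of its BMU."""
    rows, cols, _ = weights.shape
    return [np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)), (rows, cols)) for x in data]

data = rng.random((100, 3))                  # toy 3-D inputs
som = train_som(create_som(4, 4, 3), data)
print(simulate_som(som, data[:5]))
```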