
    Inter-Guiana geological conference

    Bridging the Gap between Probabilistic and Deterministic Models: A Simulation Study on a Variational Bayes Predictive Coding Recurrent Neural Network Model

    The current paper proposes a novel variational Bayes predictive coding RNN model, which can learn to generate fluctuating temporal patterns from exemplars. The model learns to maximize the lower bound of the weighted sum of the regularization and reconstruction error terms. We examined how this weighting can affect the development of different types of information processing while learning fluctuating temporal patterns. Simulation results show that strong weighting of the reconstruction term causes the development of deterministic chaos for imitating the randomness observed in target sequences, while strong weighting of the regularization term causes the development of stochastic dynamics imitating probabilistic processes observed in targets. Moreover, results indicate that the most generalized learning emerges between these two extremes. The paper concludes with implications in terms of the underlying neuronal mechanisms for autism spectrum disorder and for free action. Comment: This paper has been accepted at the 24th International Conference on Neural Information Processing (ICONIP 2017). The previous submission to arXiv is replaced by this version because there was an error in one of the equations.
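
    One plausible way to write the weighted lower bound sketched in the abstract (the weight w on the regularization term and the exact factorization are our notation, not the paper's):

        \mathcal{L}_w = \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big] - w \, D_{\mathrm{KL}}\big(q_\phi(z \mid x) \,\|\, p(z)\big)

    Under this reading, a small w emphasizes reconstruction and favors the deterministic-chaos regime, while a large w pulls the posterior toward the prior and favors the stochastic regime reported above.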

    Affective value in the predictive mind

    Although affective value is fundamental in explanations of behavior, it is still a somewhat alien concept in cognitive science. It implies a normativity or directionality that mere information processing models cannot seem to provide. In this paper we trace how affective value can emerge from information processing in the brain, as described by predictive processing. We explain the grounding of predictive processing in homeostasis, and articulate the implications this has for the concepts of reward and motivation. However, at first sight, this new conceptualization creates a strong tension with conventional ideas on reward and affective experience. We propose that this tension can be resolved by realizing that valence, a core component of all emotions, might be the reflection of a specific aspect of predictive information processing, namely the dynamics in prediction errors across time and the expectations we, in turn, form about these dynamics. Specifically, positive affect seems to be caused by positive rates of prediction error reduction, while negative affect is induced by a shift from a state with lower prediction errors to one with higher prediction errors (i.e., a negative rate of error reduction). We also consider how intense emotional episodes might be related to unexpected changes in prediction errors, suggesting that we also build (meta)predictions on error reduction rates. Hence in this account emotions appear as the continuous non-conceptual feedback on evolving (increasing or decreasing) uncertainties relative to our predictions. The upshot of this view is that the various emotions, from "basic" ones to the non-typical ones such as humor, curiosity and aesthetic affects, can be shown to follow a single underlying logic. Our analysis takes several cues from existing emotion theories but deviates from them in revealing ways. The account on offer does not just specify the interactions between emotion and cognition; rather, it entails a deep integration of the two.
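
    As a rough illustration of the rate-of-error-reduction idea (a toy sketch under our own simplifying assumptions, not a model from the paper): valence tracks the sign of the change in prediction error over time, and unexpected emotional episodes correspond to deviations of that rate from its recent trend.

        import numpy as np

        def valence(prediction_errors):
            """Toy reading of the rate-of-error-reduction idea: positive valence
            when prediction error falls, negative valence when it rises."""
            pe = np.asarray(prediction_errors, dtype=float)
            return -np.diff(pe)  # positive entries = error is decreasing

        def meta_surprise(prediction_errors):
            """Deviation of the observed error-reduction rate from its recent
            average, a stand-in for the (meta)predictions on rates in the text."""
            rate = -np.diff(np.asarray(prediction_errors, dtype=float))
            expected_rate = np.convolve(rate, np.ones(3) / 3, mode="same")
            return rate - expected_rate

        # Example: error first shrinks (positive valence), then jumps back up.
        errors = [1.0, 0.7, 0.5, 0.4, 0.9]
        print(valence(errors))  # [ 0.3  0.2  0.1 -0.5]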

    MELODI : Semantic Similarity of Words and Compositional Phrases using Latent Vector Weighting

    In this paper we present our system for the SemEval 2013 Task 5a on semantic similarity of words and compositional phrases. Our system uses a dependency-based vector space model, in combination with a technique called latent vector weighting. The system computes the similarity between a particular noun instance and the head noun of a particular noun phrase, which was weighted according to the semantics of the modifier. The system is entirely unsupervised; one single parameter, the similarity threshold, was tuned using the training data.
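
    A schematic of the comparison step (not the actual MELODI system; the vectors, the re-weighting function and the threshold value below are placeholders): the head noun's distributional vector is re-weighted toward the semantics of the modifier before being compared with the noun instance.

        import numpy as np

        def cosine(u, v):
            return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

        def reweight_head(head_vec, modifier_vec, alpha=0.5):
            """Placeholder for latent vector weighting: shift the head noun's
            vector toward dimensions that are prominent for the modifier."""
            weights = 1.0 + alpha * (modifier_vec / (np.abs(modifier_vec).max() + 1e-9))
            return head_vec * weights

        # Toy dependency-based vectors (stand-ins for real corpus counts).
        noun_instance = np.array([0.9, 0.1, 0.3])
        head_noun     = np.array([0.6, 0.4, 0.2])
        modifier      = np.array([0.8, 0.0, 0.1])

        score = cosine(noun_instance, reweight_head(head_noun, modifier))
        is_similar = score >= 0.7  # a single tuned threshold, as in the paper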

    A Tensor-based Factorization Model of Semantic Compositionality

    In this paper, we present a novel method for the computation of compositionality within a distributional framework. The key idea is that compositionality is modeled as a multi-way interaction between latent factors, which are automatically constructed from corpus data. We use our method to model the composition of subject-verb-object triples. The method consists of two steps. First, we compute a latent factor model for nouns from standard co-occurrence data. Next, the latent factors are used to induce a latent model of three-way subject-verb-object interactions. Our model has been evaluated on a similarity task for transitive phrases, in which it exceeds the state of the art.
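
    A minimal numpy sketch of the two-step idea, with made-up dimensions and random factors standing in for the corpus-derived latent model (the per-verb interaction matrix is our simplification of the three-way model, not the paper's exact formulation):

        import numpy as np

        rng = np.random.default_rng(0)
        k = 10  # number of latent factors (arbitrary here)

        # Step 1: latent factor vectors for nouns, normally induced from
        # co-occurrence data; random placeholders here.
        nouns = {w: rng.random(k) for w in ["student", "book", "coffee"]}

        # Step 2: a latent model of the three-way interaction. In this sketch
        # each verb gets a k x k matrix so that score = s^T V o over the
        # latent noun factors.
        verbs = {w: rng.random((k, k)) for w in ["read", "drink"]}

        def svo_score(subj, verb, obj):
            """Multi-way interaction of latent factors for a subject-verb-object triple."""
            return float(nouns[subj] @ verbs[verb] @ nouns[obj])

        print(svo_score("student", "read", "book"))  # higher = more plausible composition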

    MELODI : A Supervised Distributional Approach for Free Paraphrasing of Noun Compounds

    This paper describes the system submitted by the MELODI team for the SemEval-2013 Task 4: Free Paraphrases of Noun Compounds (Hendrickx et al., 2013). Our approach combines the strength of an unsupervised distributional word space model with a supervised maximum-entropy classification model; the distributional model yields a feature representation for a particular compound noun, which is subsequently used by the classifier to induce a number of appropriate paraphrases.
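
    A hedged outline of that pipeline (the feature construction, label set and dimensions are invented for illustration; multinomial logistic regression stands in for the maximum-entropy classifier):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)

        def compound_features(modifier_vec, head_vec):
            """Feature representation for a noun compound: concatenation of the
            constituents' distributional vectors (one simple choice among many)."""
            return np.concatenate([modifier_vec, head_vec])

        # Toy training set: compounds labelled with a paraphrase pattern.
        # Real vectors would come from the unsupervised word space model.
        X = np.stack([compound_features(rng.random(50), rng.random(50)) for _ in range(40)])
        y = rng.choice(["N made of N", "N used for N", "N about N"], size=40)

        # Maximum-entropy classification model (multinomial logistic regression).
        clf = LogisticRegression(max_iter=1000).fit(X, y)
        print(clf.predict(X[:3]))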

    Skal undervisningen i naturfagene fremover IBSE’s?

    Comment on the article "Inquiry-based science education – har naturfagsundervisningen i Danmark brug for det?" by Lars Domino Østergaard, Martin Sillasen, Jens Hagelskjær and Henrik Bavnhøj, published in MONA, 2010(4).
