
    Précis of neuroconstructivism: how the brain constructs cognition

    Neuroconstructivism: How the Brain Constructs Cognition proposes a unifying framework for the study of cognitive development that brings together (1) constructivism (which views development as the progressive elaboration of increasingly complex structures), (2) cognitive neuroscience (which aims to understand the neural mechanisms underlying behavior), and (3) computational modeling (which proposes formal and explicit specifications of information processing). The guiding principle of our approach is context dependence, within and (in contrast to Marr [1982]) between levels of organization. We propose that three mechanisms guide the emergence of representations: competition, cooperation, and chronotopy, which themselves allow for two central processes: proactivity and progressive specialization. We suggest that the main outcome of development is partial representations, distributed across distinct functional circuits. This framework is derived by examining development at the level of single neurons, brain systems, and whole organisms. We use the terms encellment, embrainment, and embodiment to describe the higher-level contextual influences that act at each of these levels of organization. To illustrate these mechanisms in operation we provide case studies in early visual perception, infant habituation, phonological development, and object representations in infancy. Three further case studies are concerned with interactions between levels of explanation: social development, atypical development and, within that, developmental dyslexia. We conclude that cognitive development arises from a dynamic, contextual change in embodied neural structures leading to partial representations across multiple brain regions and timescales, in response to a proactively specified physical and social environment.

    The case for the development and use of "ecologically valid" measures of executive function in experimental and clinical neuropsychology

    This article considers the scientific process whereby new and better clinical tests of executive function might be developed, and what form they might take. We argue that many of the traditional tests of executive function most commonly in use (e.g., the Wisconsin Card Sorting Test; Stroop) are adaptations of procedures that emerged almost coincidentally from conceptual and experimental frameworks far removed from those currently in favour, and that the prolongation of their use has been encouraged by a sustained period of concentration on “construct-driven” experimentation in neuropsychology. This resulted from the special theoretical demands made by the field of executive function, but was not a necessary consequence, and may not even have been a useful one. Whilst useful, these tests may therefore not be optimal for their purpose. We consider as an alternative approach a function-led development programme which in principle could yield tasks better suited to the concerns of the clinician because of the transparency afforded by increased “representativeness” and “generalisability.” We further argue that the requirement of such a programme to represent the interaction between the individual and situational context might also provide useful constraints for purely experimental investigations. We provide an example of such a programme with reference to the Multiple Errands and Six Element tests.

    Application of Computational Intelligence Techniques to Process Industry Problems

    In the last two decades there has been considerable progress in the computational intelligence research field. The fruits of this research effort are powerful techniques for pattern recognition, data mining, data modelling, etc. These techniques achieve high performance on traditional data sets like the UCI machine learning database. Unfortunately, such data sources usually represent clean data, free of the problems common to real-life industrial data, such as outliers, missing values, and feature co-linearity. The presence of faulty data samples can have very harmful effects on the models: for example, if present during training, they can either cause sub-optimal performance of the trained model or, in the worst case, destroy the knowledge the model has learnt so far. For these reasons, the application of current modelling techniques to industrial problems has developed into a research field of its own. Based on a discussion of the properties and issues of the data and of the state-of-the-art modelling techniques in the process industry, this paper presents a novel unified approach to the development of predictive models in the process industry.
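    As a hedged illustration of the data issues raised above, the following Python sketch screens a small synthetic set of process measurements for missing values and gross outliers before any model is trained; the column names, thresholds, and median-imputation choice are illustrative assumptions, not details from the paper.

    import numpy as np
    import pandas as pd

    def clean_process_data(df: pd.DataFrame, z_thresh: float = 3.0) -> pd.DataFrame:
        """Impute missing numeric values and drop rows with gross outliers."""
        cleaned = df.copy()
        for col in cleaned.select_dtypes(include=[np.number]).columns:
            # Fill missing sensor readings with the column median.
            cleaned[col] = cleaned[col].fillna(cleaned[col].median())
            # Drop samples more than z_thresh standard deviations from the mean.
            z = (cleaned[col] - cleaned[col].mean()) / (cleaned[col].std() + 1e-12)
            cleaned = cleaned[z.abs() <= z_thresh]
        return cleaned

    # Hypothetical sensor data with one missing value and one implausible reading;
    # the tiny example uses a tighter threshold than the default.
    raw = pd.DataFrame({"temperature": [70.1, 70.4, np.nan, 69.8, 250.0],
                        "pressure": [1.01, 1.02, 1.00, 1.03, 1.01]})
    print(clean_process_data(raw, z_thresh=1.5))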

    On Neuromechanical Approaches for the Study of Biological Grasp and Manipulation

    Biological and robotic grasp and manipulation are undeniably similar at the level of mechanical task performance. However, their underlying fundamental biological vs. engineering mechanisms are, by definition, dramatically different and can even be antithetical. Even our approach to each is diametrically opposite: inductive science for the study of biological systems vs. engineering synthesis for the design and construction of robotic systems. The past 20 years have seen several conceptual advances in both fields and the quest to unify them. Chief among them is the reluctant recognition that their underlying fundamental mechanisms may actually share limited common ground, while exhibiting many fundamental differences. This recognition is particularly liberating because it allows us to resolve and move beyond multiple paradoxes and contradictions that arose from the initial reasonable assumption of a large common ground. Here, we begin by introducing the perspective of neuromechanics, which emphasizes that real-world behavior emerges from the intimate interactions among the physical structure of the system, the mechanical requirements of a task, the feasible neural control actions to produce it, and the ability of the neuromuscular system to adapt through interactions with the environment. This allows us to articulate a succinct overview of a few salient conceptual paradoxes and contradictions regarding under-determined vs. over-determined mechanics, under- vs. over-actuated control, prescribed vs. emergent function, learning vs. implementation vs. adaptation, prescriptive vs. descriptive synergies, and optimal vs. habitual performance. We conclude by presenting open questions and suggesting directions for future research. We hope this frank assessment of the state-of-the-art will encourage and guide these communities to continue to interact and make progress in these important areas.
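    The under- vs. over-determined distinction can be made concrete with a small numerical sketch that is not taken from the paper: a hypothetical planar finger with two joints driven by four tendons is over-actuated, so infinitely many tendon-tension patterns yield the same joint torques. The moment arms and target torques below are invented for illustration.

    import numpy as np

    # Hypothetical moment-arm matrix R (metres): 2 joints, 4 tendons.
    R = np.array([[0.010, -0.008,  0.012, -0.011],   # joint 1
                  [0.007,  0.009, -0.006, -0.010]])  # joint 2

    tau_desired = np.array([0.05, 0.02])  # desired joint torques (N·m)

    # Minimum-norm tension pattern satisfying tau = R @ f (one of infinitely many).
    f_min_norm = np.linalg.pinv(R) @ tau_desired

    # Any combination of the two null-space directions of R can be added
    # without changing the torques, which is the "over-actuated" redundancy.
    null_basis = np.linalg.svd(R)[2][2:].T            # 4 x 2 null-space basis
    f_alternative = f_min_norm + null_basis @ np.array([5.0, -3.0])

    print(np.allclose(R @ f_min_norm, tau_desired))     # True
    print(np.allclose(R @ f_alternative, tau_desired))  # True, with different tensions
    # Real tendons can only pull (f >= 0), which further constrains the feasible set.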

    Laruelle Qua Stiegler: On Non-Marxism and the Transindividual

    Alexander R. Galloway and Jason R. LaRiviére’s article “Compression in Philosophy” seeks to pose François Laruelle’s engagement with metaphysics against Bernard Stiegler’s epistemological rendering of idealism. Identifying Laruelle as the theorist of genericity, through which mankind and the world are identified through an index of “opacity,” the authors argue that Laruelle does away with all deleterious philosophical “data.” Laruelle’s generic immanence is posed against Stiegler’s process of retention and discretization, as Galloway and LaRiviére argue that Stiegler’s philosophy seeks to reveal an enchanted natural world through the development of noesis. By further developing Laruelle and Stiegler’s Marxian projects, I seek to demonstrate the relation between Stiegler’s artefaction and “compression” while, simultaneously, I also seek to create further bricolage between Laruelle and Stiegler. I also further elaborate on their distinct engagement(s) with Marx, offering the mold of synthesis as an alternative to compression when considering Stiegler’s work on transindividuation. In turn, this paper seeks to survey some of the contemporary theorists drawing from Stiegler (Yuk Hui, Alexander Wilson and Daniel Ross) and Laruelle (Anne-Françoise Schmidt, Gilles Grelet, Ray Brassier, Katerina Kolozova, John Ó Maoilearca and Jonathan Fardy) to examine political discourse regarding the posthuman and non-human, with a particular interest in Kolozova’s unified theory of standard philosophy and Capital.

    Conversion of Artificial Recurrent Neural Networks to Spiking Neural Networks for Low-power Neuromorphic Hardware

    In recent years, the field of neuromorphic systems, which consume orders of magnitude less power than conventional hardware, has gained significant momentum. However, their wider use is still hindered by the lack of algorithms that can harness the strengths of such architectures. While neuromorphic adaptations of representation learning algorithms are now emerging, efficient processing of temporal sequences or variable-length inputs remains difficult. Recurrent neural networks (RNN) are widely used in machine learning to solve a variety of sequence learning tasks. In this work we present a train-and-constrain methodology that enables the mapping of machine-learned (Elman) RNNs onto a substrate of spiking neurons, while being compatible with the capabilities of current and near-future neuromorphic systems. This "train-and-constrain" method consists of first training RNNs using backpropagation through time, then discretizing the weights, and finally converting them to spiking RNNs by matching the responses of artificial neurons with those of the spiking neurons. We demonstrate our approach on a natural language processing task (question classification), mapping the recurrent layer of the network onto IBM's Neurosynaptic System "TrueNorth", a spike-based digital neuromorphic hardware architecture. TrueNorth imposes specific constraints on connectivity and on neural and synaptic parameters. To satisfy these constraints, it was necessary to discretize the synaptic weights and neural activities to 16 levels, and to limit fan-in to 64 inputs. We find that short synaptic delays are sufficient to implement the dynamical (temporal) aspect of the RNN in the question classification task. The hardware-constrained model achieved 74% accuracy in question classification while using less than 0.025% of the cores on one TrueNorth chip, resulting in an estimated power consumption of ~17 µW.
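    The discretization step of the train-and-constrain method can be sketched in a few lines of Python; only the figure of 16 weight levels comes from the abstract, while the symmetric quantization grid and the matrix size below are assumptions rather than the authors' implementation.

    import numpy as np

    def discretize_weights(w: np.ndarray, levels: int = 16) -> np.ndarray:
        """Snap real-valued weights onto `levels` evenly spaced values (assumed scheme)."""
        w_max = np.abs(w).max()
        if w_max == 0:
            return np.zeros_like(w)
        step = 2 * w_max / (levels - 1)               # spacing of the quantization grid
        idx = np.round((w + w_max) / step)            # integer level index in 0..levels-1
        return idx * step - w_max

    rng = np.random.default_rng(0)
    w_trained = rng.normal(scale=0.5, size=(64, 64))  # stand-in for a trained Elman recurrent matrix
    w_quant = discretize_weights(w_trained, levels=16)
    print(len(np.unique(w_quant)))                    # at most 16 distinct values
    # The fan-in limit of 64 inputs mentioned in the abstract is a separate constraint,
    # handled at the connectivity level rather than in this weight-quantization step.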