    On Reverse Engineering in the Cognitive and Brain Sciences

    Various research initiatives try to utilize the operational principles of organisms and brains to develop alternative, biologically inspired computing paradigms and artificial cognitive systems. This paper reviews key features of the standard method applied to complexity in the cognitive and brain sciences, i.e. decompositional analysis or reverse engineering. The indisputable complexity of brain and mind raises the issue of whether they can be understood by applying the standard method. Indeed, recent findings in the experimental and theoretical fields question central assumptions and hypotheses made for reverse engineering. Using the modeling relation as analyzed by Robert Rosen, the scientific analysis method itself is made a subject of discussion. It is concluded that the fundamental assumption of cognitive science, i.e. that complex cognitive systems can be analyzed, understood and duplicated by reverse engineering, must be abandoned. Implications for investigations of organisms and behavior, as well as for engineering artificial cognitive systems, are discussed. Comment: 19 pages, 5 figures

    Brain Modularity Mediates the Relation between Task Complexity and Performance

    Recent work in cognitive neuroscience has focused on analyzing the brain as a network, rather than as a collection of independent regions. Prior studies taking this approach have found that individual differences in the degree of modularity of the brain network relate to performance on cognitive tasks. However, inconsistent results concerning the direction of this relationship have been obtained, with some tasks showing better performance as modularity increases and other tasks showing worse performance. A recent theoretical model (Chen & Deem, 2015) suggests that these inconsistencies may be explained on the grounds that high-modularity networks favor performance on simple tasks whereas low-modularity networks favor performance on more complex tasks. The current study tests these predictions by relating modularity from resting-state fMRI to performance on a set of simple and complex behavioral tasks. Complex and simple tasks were defined on the basis of whether they did or did not draw on executive attention. Consistent with predictions, we found a negative correlation between individuals' modularity and their performance on a composite measure combining scores from the complex tasks, but a positive correlation with performance on a composite measure combining scores from the simple tasks. These results and the theory presented here provide a framework for linking measures of whole-brain organization from network neuroscience to cognitive processing. Comment: 47 pages; 4 figures
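
    To make the kind of analysis described above concrete, the sketch below shows one way a modularity-performance relation could be computed. It is a minimal illustration, not the study's actual pipeline: the binarization threshold, the greedy community detection from networkx, and the input arrays (conn_matrices, complex_scores, simple_scores) are all assumptions introduced for this example.

    # Illustrative Python sketch (not the study's pipeline): estimate Newman modularity Q
    # from each subject's resting-state connectivity matrix and correlate Q with
    # composite behavioral scores. Inputs and parameters are hypothetical.
    import numpy as np
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities, modularity
    from scipy.stats import pearsonr

    def modularity_from_connectivity(conn, threshold=0.3):
        """Binarize a region-by-region connectivity matrix and return modularity Q."""
        adj = (np.abs(conn) >= threshold).astype(int)
        np.fill_diagonal(adj, 0)                       # no self-connections
        graph = nx.from_numpy_array(adj)
        communities = greedy_modularity_communities(graph)
        return modularity(graph, communities)

    def relate_modularity_to_performance(conn_matrices, complex_scores, simple_scores):
        """Correlate per-subject Q with composite scores on complex and simple tasks."""
        q = [modularity_from_connectivity(c) for c in conn_matrices]
        return {
            "complex": pearsonr(q, complex_scores),    # prediction: negative correlation
            "simple": pearsonr(q, simple_scores),      # prediction: positive correlation
        }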

    Mapping Big Data into Knowledge Space with Cognitive Cyber-Infrastructure

    Big data research has attracted great attention in science, technology, industry and society. It is developing alongside the evolving scientific paradigm, the fourth industrial revolution, and the transformational innovation of technologies. However, its nature and fundamental challenges have not yet been clearly recognized, and a methodology of its own has not yet been formed. This paper explores and answers the following questions: What is big data? What are the basic methods for representing, managing and analyzing big data? What is the relationship between big data and knowledge? Can we find a mapping from big data into knowledge space? What kind of infrastructure is required to support not only big data management and analysis but also knowledge discovery, sharing and management? What is the relationship between big data and the scientific paradigm? What is the nature and fundamental challenge of big data computing? A multi-dimensional perspective is presented toward a methodology of big data computing. Comment: 59 pages

    The role of the individual in the coming era of process-based therapy

    For decades, the development of evidence-based therapy has been based on experimental tests of protocols designed to impact psychiatric syndromes. As this paradigm weakens, a more process-based therapy approach is rising in its place, focused on how best to target and change core biopsychosocial processes in specific situations, for given goals, with given clients. This is an inherently more idiographic question than has normally been at issue in evidence-based therapy over the last few decades. In this article we explore methods of assessment and analysis that can integrate idiographic and nomothetic approaches in a process-based era. Accepted manuscript

    Computational physics of the mind

    In the nineteenth century and earlier, physicists such as Newton, Mayer, Hooke, Helmholtz and Mach were actively engaged in research on psychophysics, trying to relate psychological sensations to the intensities of physical stimuli. Computational physics makes it possible to simulate complex neural processes, offering a chance not only to answer the original psychophysical questions but also to create models of mind. In this paper several approaches relevant to the modeling of mind are outlined. Since direct modeling of brain functions is rather limited by the complexity of such models, a number of approximations are introduced. The path from the brain, or computational neuroscience, to the mind, or cognitive science, is sketched, with emphasis on higher cognitive functions such as memory and consciousness. No fundamental problems in understanding the mind seem to arise. From a computational point of view, realistic models require massively parallel architectures.
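
    As a concrete reminder of what the early psychophysicists were after, the short sketch below evaluates the two classical stimulus-sensation relations, the Weber-Fechner logarithmic law and Stevens' power law. The paper itself does not prescribe either form; the constants and the choice of laws here are purely illustrative.

    # Classical psychophysical relations between stimulus intensity I and sensation S.
    # Illustrative constants only; not parameters taken from the paper.
    import numpy as np

    def weber_fechner(intensity, i0=1.0, k=1.0):
        """Weber-Fechner law: S = k * ln(I / I0), sensation grows logarithmically."""
        return k * np.log(intensity / i0)

    def stevens_power_law(intensity, k=1.0, exponent=0.33):
        """Stevens' power law: S = k * I**a (a ~ 0.33 for brightness)."""
        return k * intensity ** exponent

    intensities = np.logspace(0, 3, num=7)     # stimuli spanning three orders of magnitude
    print(weber_fechner(intensities))          # roughly equal increments per decade
    print(stevens_power_law(intensities))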