Being-in-the-world-with: Presence Meets Social And Cognitive Neuroscience
In this chapter we will discuss the concepts of "presence" (Inner Presence) and "social presence" (Co-presence) within a cognitive and ecological perspective. Specifically, we claim that the concepts of "presence" and "social presence" are the possible links between self, action, communication and culture. In the first section we will provide a capsule view of Heidegger's work by examining the two main features of the Heideggerian concept of "being": spatiality and "being with". We argue that different visions from social and cognitive sciences – Situated Cognition, Embodied Cognition, Enactive Approach, Situated Simulation, Covert Imitation – and discoveries from neuroscience – Mirror and Canonical Neurons – have many contact points with this view. In particular, these data suggest that our conceptual system dynamically produces contextualized representations (simulations) that support grounded action in different situations. This is allowed by a common coding – the motor code – shared by perception, action and concepts. This common coding also allows the subject to natively recognize actions done by other selves within the phenomenological contents. In this picture we argue that the role of presence and social presence is to allow the process of self-identification through the separation between "self" and "other," and between "internal" and "external". Finally, implications of this position for communication and media studies are discussed by way of conclusion.
Cognitive Phonetics: The Transduction of Distinctive Features at the Phonology-Phonetics Interface
We propose that the interface between phonology and phonetics is mediated by a transduction process that converts elementary units of phonological computation, features, into temporally coordinated neuromuscular patterns, called "True Phonetic Representations", which are directly interpretable by the motor system of speech production. Our view of the interface is constrained by substance-free generative phonological assumptions and by insights gained from psycholinguistic and phonetic models of speech production. To distinguish transduction of abstract phonological units into planned neuromuscular patterns from the biomechanics of speech production usually associated with physiological phonetics, we have termed this interface theory "Cognitive Phonetics" (CP). The inner workings of CP are described in terms of Marr's (1982/2010) tri-level approach, which we used to construct a linking hypothesis relating formal phonology to neurobiological activity. Potential neurobiological correlates supporting various parts of CP are presented. We also argue that CP augments the study of certain phonetic phenomena, most notably coarticulation, and suggest that some phenomena usually considered phonological (e.g., naturalness and gradience) receive better explanations within CP.
Lenneberg's Contributions to the Biology of Language and Child Aphasiology: Resonation and Brain Rhythmicity as Key Mechanisms
This paper aims to re-evaluate the legacy of Eric Lenneberg's monumental Biological Foundations of Language, with special reference to his biolinguistic framework and view on (child) aphasiology. The argument draws from the following concepts from Lenneberg's work: (i) language (latent structure vs. realized structure) as independent of externalization; (ii) resonance theory; (iii) brain rhythmicity; and (iv) aphasia as temporal dysfunction. Specifically, it will be demonstrated that Lenneberg's original version of the critical period hypothesis and his child aphasiology lend themselves to elucidating a child aphasia of epileptic origin called Landau-Kleffner syndrome (LKS), thereby opening up the possibility of recovery from the disease. Moreover, it will be claimed that, to the extent that the language disorder in LKS can be couched in these terms, it can serve as strong "living" evidence in support of Lenneberg's critical period hypothesis and his view on child aphasiology.
Explaining Schizophrenia: Auditory Verbal Hallucination and Self-Monitoring
Do self-monitoring accounts, a dominant account of the positive symptoms of schizophrenia, explain auditory verbal hallucination? In this essay, I argue that the account fails to answer crucial questions any explanation of auditory verbal hallucination must address. Where the account provides a plausible answer, I make the case for an alternative explanation: auditory verbal hallucination is not the result of a failed control mechanism, namely failed self-monitoring, but, rather, of the persistent automaticity of auditory experience of a voice. My argument emphasizes the importance of careful examination of phenomenology as providing substantive constraints on causal models of the positive symptoms in schizophrenia.
Length and orientation constancy learning in 2-dimensions with auditory sensory substitution: the importance of self-initiated movement
A subset of sensory substitution (SS) devices translate images into sounds in real time using a portable computer, camera, and headphones. Perceptual constancy is the key to understanding both functional and phenomenological aspects of perception with SS. In particular, constancies enable object externalization, which is critical to the performance of daily tasks such as obstacle avoidance and locating dropped objects. In order to improve daily task performance by the blind, and determine if constancies can be learned with SS, we trained blind (N = 4) and sighted (N = 10) individuals on length and orientation constancy tasks for 8 days at about 1 h per day with an auditory SS device. We found that blind and sighted performance at the constancy tasks significantly improved, and attained constancy performance that was above chance. Furthermore, dynamic interactions with stimuli were critical to constancy learning with the SS device. In particular, improved task learning significantly correlated with the number of spontaneous left-right head-tilting movements while learning length constancy. The improvement from previous head-tilting trials even transferred to a no-head-tilt condition. Therefore, not only can SS learning be improved by encouraging head movement while learning, but head movement may also play an important role in learning constancies in the sighted. In addition, the learning of constancies by the blind and sighted with SS provides evidence that SS may be able to restore vision-like functionality to the blind in daily tasks.
The mind-body problem; three equations and one solution represented by immaterial-material data
Human life occurs within a complex bio-psycho-social milieu, a heterogeneous system integrated by multiple bidirectional interrelations between abstract-intangible ideas and the physical-chemical substrate of the environment. The mind is thus placed between abstract ideas/concepts and the neurobiological brain, which is in turn connected to the environment. In other words, the mind acts as an interface between immaterial (abstract/intangible) data and material (biological) support. Science is unable to conceive of and explain an interaction between the immaterial and material domains (that is, to understand the nature of the mind), a question that generates in the literature the mind-body problem. We have published in the past a succession of articles related to the mind-body problem, in order to demonstrate that this question is actually a false issue. The phenomenon of immaterial-material interaction is impossible to explain because it never occurs, which means that there is no need to explain the immaterial-material interaction. Our mind implies only a temporal association between immaterial data and material support, and this dynamic interrelation is presented and argued here as a solution to the mind-body problem. The limited psycho-biological approach to the mind-body problem is expanded here into a more comprehensive and feasible bio-psycho-social perspective, generating three distinct equations (bio-psychological, bio-social, and psycho-social). These three equations can be solved through a solution represented by a dynamic cerebral system (two distinct and interconnected subunits of the brain) which presumably has the capability of receiving and processing abstract data through association (with no interaction) between immaterial and material data.
Auditory spatial processing in Alzheimer's disease.
The location and motion of sounds in space are important cues for encoding the auditory world. Spatial processing is a core component of auditory scene analysis, a cognitively demanding function that is vulnerable in Alzheimer's disease. Here we designed a novel neuropsychological battery based on a virtual space paradigm to assess auditory spatial processing in patient cohorts with clinically typical Alzheimer's disease (n = 20) and its major variant syndrome, posterior cortical atrophy (n = 12), in relation to healthy older controls (n = 26). We assessed three dimensions of auditory spatial function: externalized versus non-externalized sound discrimination, moving versus stationary sound discrimination and stationary auditory spatial position discrimination, together with non-spatial auditory and visual spatial control tasks. Neuroanatomical correlates of auditory spatial processing were assessed using voxel-based morphometry. Relative to healthy older controls, both patient groups exhibited impairments in detection of auditory motion and stationary sound position discrimination. The posterior cortical atrophy group showed greater impairment for auditory motion processing and the processing of a non-spatial control complex auditory property (timbre) than the typical Alzheimer's disease group. Voxel-based morphometry in the patient cohort revealed grey matter correlates of auditory motion detection and spatial position discrimination in right inferior parietal cortex and precuneus, respectively. These findings delineate auditory spatial processing deficits in typical and posterior Alzheimer's disease phenotypes that are related to posterior cortical regions involved in both syndromic variants and modulated by the syndromic profile of brain degeneration.
Auditory spatial deficits contribute to impaired spatial awareness in Alzheimer's disease and may constitute a novel perceptual model for probing brain network disintegration across the Alzheimer's disease syndromic spectrum.
Auditory distance perception in humans: a review of cues, development, neuronal bases, and effects of sensory loss.
Auditory distance perception plays a major role in spatial awareness, enabling location of objects and avoidance of obstacles in the environment. However, it remains under-researched relative to studies of the directional aspect of sound localization. This review focuses on the following four aspects of auditory distance perception: cue processing, development, consequences of visual and auditory loss, and neurological bases. The several auditory distance cues vary in their effective ranges in peripersonal and extrapersonal space. The primary cues are sound level, reverberation, and frequency. Nonperceptual factors, including the importance of the auditory event to the listener, also can affect perceived distance. Basic internal representations of auditory distance emerge at approximately 6 months of age in humans. Although visual information plays an important role in calibrating auditory space, sensorimotor contingencies can be used for calibration when vision is unavailable. Blind individuals often manifest supranormal abilities to judge relative distance but show a deficit in absolute distance judgments. Following hearing loss, the use of auditory level as a distance cue remains robust, while the reverberation cue becomes less effective. Previous studies have not found evidence that hearing-aid processing affects perceived auditory distance. Studies investigating the brain areas involved in processing different acoustic distance cues are described. Finally, suggestions are given for further research on auditory distance perception, including broader investigation of how background noise and multiple sound sources affect perceived auditory distance for those with sensory loss. The research was supported by MRC grant G0701870 and the Vision and Eye Research Unit (VERU), Postgraduate Medical Institute at Anglia Ruskin University. This is the final version of the article. It first appeared from Springer via http://dx.doi.org/10.3758/s13414-015-1015-