
    The standard posture of the hand

    Perceived limb position is known to rely on sensory signals and motor commands. Another potential source of input is a standard representation of body posture, which may bias perceived limb position towards more stereotyped positions. Recent results show that tactile stimuli are processed more efficiently when delivered to a thumb in a relatively low position or to an index finger in a relatively high position. This observation suggests that we may have a standard posture of the body that promotes more efficient interaction with the environment. In this study, we mapped the standard posture of the entire hand by characterizing the spatial associations of all five digits. Moreover, we show that the effect is not an artefact of intermanual integration. Results showed that the thumb is associated with low positions, while the other fingers are associated with higher positions.

    Linguistic profile automated characterisation in pluripotential clinical high-risk mental state (CHARMS) conditions: methodology of a multicentre observational study

    Introduction: Language is usually considered the social vehicle of thought in intersubjective communications. However, the relationship between language and high-order cognition seems to evade this canonical and unidirectional description (i.e., the notion of language as a simple means of thought communication). In recent years, clinical high at-risk mental state (CHARMS) criteria (evolved from the Ultra-High-Risk paradigm) and the introduction of the Clinical Staging system have been proposed to address the dynamicity of early psychopathology. At the same time, natural language processing (NLP) techniques have greatly evolved and have been successfully applied to investigate different neuropsychiatric conditions. The combination of the at-risk mental state paradigm, the clinical staging system, and automated NLP methods, the latter applied to spoken-language transcripts, could represent a useful and convenient approach to the problem of early psychopathological distress within a transdiagnostic risk paradigm. Methods and analysis: Help-seeking young people presenting psychological distress (CHARMS+/− and Clinical Stage 1a or 1b; target sample size for both groups n=90) will be assessed through several psychometric tools and multiple speech analyses during an observational period of 1 year, in the context of an Italian multicentre study. Subjects will be enrolled in different contexts: Department of Neuroscience, Rehabilitation, Ophthalmology, Genetics, Maternal and Child Health (DINOGMI), Section of Psychiatry, University of Genoa—IRCCS Ospedale Policlinico San Martino, Genoa, Italy; Mental Health Department—territorial mental services (ASL 3—Genoa), Genoa, Italy; and Mental Health Department—territorial mental services (AUSL—Piacenza), Piacenza, Italy.
The conversion rate to full-blown psychopathology (CS 2) will be evaluated over 2 years of clinical observation, to further confirm the predictive and discriminative value of the CHARMS criteria and to verify the possibility of enriching them with several linguistic features derived from a fine-grained automated linguistic analysis of speech. Ethics and dissemination: The methodology described in this study adheres to ethical principles as formulated in the Declaration of Helsinki and is compatible with International Conference on Harmonization (ICH) good clinical practice. The research protocol was reviewed and approved by two different ethics committees (CER Liguria approval code: 591/2020—id.10993; Comitato Etico dell’Area Vasta Emilia Nord approval code: 2022/0071963). Participants will provide their written informed consent prior to study enrolment, and parental consent will be needed for participants younger than 18 years. Experimental results will be carefully shared through publication in peer-reviewed journals, to ensure proper data reproducibility. Trial registration number: DOI 10.17605/OSF.IO/BQZTN.

    The role of the right temporoparietal junction in perceptual conflict: detection or resolution?

    The right temporoparietal junction (rTPJ) is a polysensory cortical area that plays a key role in perception and awareness. Neuroimaging evidence shows activation of rTPJ in intersensory and sensorimotor conflict situations, but it remains unclear whether this activity reflects detection or resolution of such conflicts. To address this question, we manipulated the relationship between touch and vision using the so-called mirror-box illusion. Participants' hands lay on either side of a mirror, which occluded their left hand and reflected their right hand, but created the illusion that they were looking directly at their left hand. The experimenter simultaneously touched either the middle (D3) or the ring finger (D4) of each hand. Participants judged which finger was touched on their occluded left hand. The visual stimulus corresponding to the touch on the right hand was therefore either congruent (same finger as touch) or incongruent (different finger from touch) with the task-relevant touch on the left hand. Single-pulse transcranial magnetic stimulation (TMS) was delivered to the rTPJ immediately after touch. Accuracy in localizing the left touch was worse for D4 than for D3, particularly when visual stimulation was incongruent. However, following TMS, accuracy improved selectively for D4 in incongruent trials, suggesting that the effects of the conflicting visual information were reduced. These findings suggest a role of rTPJ in detecting, rather than resolving, intersensory conflict.

    Influence of Motor Planning on Distance Perception within the Peripersonal Space

    We examined whether movement costs, as defined by movement magnitude, have an impact on distance perception in near space. In Experiment 1, participants were given a numerical cue regarding the amplitude of a hand movement to be carried out. Before the movement execution, the length of a visual distance had to be judged. These visual distances were judged to be larger the larger the amplitude of the concurrently prepared hand movement was. In Experiment 2, in which numerical cues were merely memorized without concurrent movement planning, this general increase of judged distance with cue size was not observed. The results of these experiments indicate that visual perception of near space is specifically affected by the costs of planned hand movements.

    Joint angle variability and co-variation in a reaching with a rod task

    The problem at the heart of motor control is how the myriad units of the neuromotor system are coordinated to perform goal-directed movements. Although these numerous degrees of freedom (DOFs) were long considered redundant, recent views emphasize that the DOFs should instead be considered abundant, allowing flexible performance. We studied how variability in the arm joints was employed to stabilize the displaced end-effector in tool use, to examine how the neuromotor system flexibly exploits DOFs in the upper extremity. Participants made pointing movements with the index finger and with the index finger extended by rods of 10, 20, and 30 cm. Using the uncontrolled manifold (UCM) method, the total joint angle variance was decomposed into two parts: the joint angle variance that did not affect the position of the end-effector (VUCM) and the variance that resulted in a deviation of the position of the end-effector from its mean (VORT). Analyses showed that some angles depended on the length of the rod in use. For all rod lengths, VUCM was larger than VORT, and this did not differ over rod lengths, demonstrating that the arm was organized into a synergy. Finally, the variation in the joint angles in the arm as well as the degree of co-variation between these angles did not differ for the rod’s tip and the hand. We concluded that synergies are formed in the arm during reaching with an extended end-effector and that those synergies stabilize different parts of the arm+rod system equally.
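    The UCM decomposition described above can be sketched numerically: joint-angle deviations across trials are projected onto the null space of the task Jacobian (directions that leave the end-effector in place, giving VUCM) and onto its orthogonal complement (directions that move the end-effector, giving VORT), each normalized by its subspace dimensionality. This is a minimal illustration under those standard assumptions, not the authors' implementation; the function name and normalization details are illustrative.

    ```python
    import numpy as np

    def ucm_decompose(joint_angles, jacobian):
        """Split joint-angle variance into VUCM and VORT.

        joint_angles: (trials, n_joints) array of joint configurations.
        jacobian:     (d_task, n_joints) task Jacobian linking small joint
                      changes to end-effector displacement at the mean posture.
        Returns (v_ucm, v_orth): per-DOF variance within the UCM (does not
        move the end-effector) and orthogonal to it (does).
        """
        n_trials, n_joints = joint_angles.shape
        d_task = jacobian.shape[0]
        deviations = joint_angles - joint_angles.mean(axis=0)

        # Orthonormal basis of the Jacobian's null space: joint changes along
        # these directions leave the end-effector position unchanged.
        _, _, vt = np.linalg.svd(jacobian)
        null_basis = vt[d_task:].T           # (n_joints, n_joints - d_task)

        dev_ucm = deviations @ null_basis                 # within-UCM components
        dev_orth = deviations - dev_ucm @ null_basis.T    # orthogonal components

        # Normalize each variance by its subspace dimensionality and trial count.
        v_ucm = np.sum(dev_ucm ** 2) / ((n_joints - d_task) * n_trials)
        v_orth = np.sum(dev_orth ** 2) / (d_task * n_trials)
        return v_ucm, v_orth
    ```

    With variability lying entirely along task-irrelevant joint combinations, VUCM exceeds VORT, which is the signature of a synergy in this framework.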

    Peripersonal space representation develops independently from visual experience

    Our daily-life actions are typically driven by vision. When acting upon an object, we need to represent its visual features (e.g. shape, orientation, etc.) and to map them into our own peripersonal space. But what happens with people who have never had any visual experience? How can they map object features into their own peripersonal space? Do they do it differently from sighted agents? To tackle these questions, we carried out a series of behavioral experiments in sighted and congenitally blind subjects. We took advantage of a spatial alignment effect paradigm, which typically refers to a decrease in reaction times when subjects perform an action (e.g., a reach-to-grasp pantomime) congruent with that afforded by a presented object. To systematically examine peripersonal space mapping, we presented visual or auditory affording objects both within and outside subjects' reach. The results showed that sighted and congenitally blind subjects did not differ in mapping objects into their own peripersonal space. Strikingly, this mapping occurred also when objects were presented outside subjects' reach, but within the peripersonal space of another agent. This suggests that (the lack of) visual experience does not significantly affect the development of both one's own and others' peripersonal space representation.

    Using a Stick Does Not Necessarily Alter Judged Distances or Reachability

    Background: It has been reported that participants judge an object to be closer after a stick has been used to touch it than after touching it with the hand. In this study we try to find out why this is so. Methodology: We showed six participants a cylindrical object on a table. On separate trials (randomly intermixed), participants either estimated verbally how far the object was from their body or touched a remembered location. Touching was done either with the hand or with a stick (in separate blocks). In three different sessions, participants touched either the object location or the location halfway to the object location. Verbal judgments were given either in centimeters or in terms of whether the object would be reachable with the hand. No differences in verbal distance judgments or touching responses were found between the blocks in which the stick or the hand was used. Conclusion: Instead of finding out why judged distance changes when using a tool, we found that using a stick does not necessarily alter judged distances or judgments about the reachability of objects.

    Altered visual feedback from an embodied avatar unconsciously influences movement amplitude and muscle activity

    Evidence suggests that the sense of the position of our body parts can be surreptitiously deceived, for instance through illusory visual inputs. However, whether altered visual feedback during limb movement can induce substantial unconscious motor and muscular adjustments is not known. To address this question, we covertly manipulated virtual body movements in immersive virtual reality. Participants were instructed to flex their elbow to 90° while tensing an elastic band, as their virtual arm reproduced the same, a reduced (75°), or an amplified (105°) movement. We recorded muscle activity using electromyography, and assessed body ownership, agency, and proprioception of the arm. Our results not only show that participants compensated for the avatar’s manipulated arm movement while being completely unaware of it, but also that it is possible to induce unconscious motor adaptations requiring significant changes in muscular activity. Altered visual feedback through body ownership illusions can influence motor performance in a process that bypasses awareness.

    The Impact of Spatial Incongruence on an Auditory-Visual Illusion

    The sound-induced flash illusion is an auditory-visual illusion: when a single flash is presented along with two or more beeps, observers report seeing two or more flashes. Previous research has shown that the illusion gradually disappears as the temporal delay between auditory and visual stimuli increases, suggesting that the illusion is consistent with existing temporal rules of neural activation in the superior colliculus to multisensory stimuli. However, little is known about the effect of spatial incongruence, and whether the illusion follows the corresponding spatial rule. If the illusion occurs less strongly when auditory and visual stimuli are separated, then integrative processes supporting the illusion must be strongly dependent on spatial congruence. In this case, the illusion would be consistent with both the spatial and temporal rules describing response properties of multisensory neurons in the superior colliculus.

    The Remapping of Time by Active Tool-Use

    Multiple, action-based space representations are each based on the extent to which action is possible toward a specific sector of space, such as near/reachable and far/unreachable. Studies on tool-use revealed how the boundaries between these representations are dynamic. Space is not only multidimensional and dynamic, but it is also known for interacting with other dimensions of magnitude, such as time. However, whether time operates on similar action-driven multiple representations and whether it can be modulated by tool-use remains unknown. To address these issues, healthy participants performed a time bisection task in two spatial positions (near and far space) before and after an active tool-use training, which consisted of performing goal-directed actions holding a tool with their right hand (Experiment 1). Before training, perceived stimulus duration was influenced by its spatial position defined by action. Hence, a dissociation emerged between near/reachable and far/unreachable space. Strikingly, this dissociation disappeared after the active tool-use training, since temporal stimuli were now perceived as nearer. The remapping was not found when a passive tool-training was executed (Experiment 2) or when the active tool-training was performed with participants’ left hand (Experiment 3). Moreover, no time remapping was observed following an equivalent active hand-training but without a tool (Experiment 4). Taken together, our findings reveal that time processing is based on action-driven multiple representations. The dynamic nature of these representations is demonstrated by the remapping of time, which is action- and effector-dependent.
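    In a time bisection task, participants classify probe durations as closer to a short or a long anchor, and the analysis typically estimates the bisection point: the duration judged "long" on half the trials. A minimal sketch of that estimate, via linear interpolation of the proportion of "long" responses (the durations and response proportions below are hypothetical, and this is not the authors' analysis pipeline, which would normally fit a full psychometric function):

    ```python
    import numpy as np

    def bisection_point(durations, p_long):
        """Estimate the duration judged 'short' and 'long' equally often.

        durations: tested probe durations (ms), in increasing order.
        p_long:    proportion of 'long' responses at each duration,
                   assumed to rise monotonically with duration.
        """
        durations = np.asarray(durations, dtype=float)
        p_long = np.asarray(p_long, dtype=float)
        # Find the duration at which p_long crosses 0.5 by interpolation.
        return np.interp(0.5, p_long, durations)

    # Hypothetical data: seven probes between 300 ms and 900 ms anchors.
    durs = [300, 400, 500, 600, 700, 800, 900]
    p = [0.02, 0.10, 0.30, 0.55, 0.80, 0.95, 0.99]
    bp = bisection_point(durs, p)  # → 580.0 ms
    ```

    A shift in this bisection point between the near and far positions, before versus after tool-use training, is the kind of remapping effect the study describes.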