    Observation of Andreev Reflection Enhanced Shot Noise

    We have experimentally investigated the quasiparticle shot noise in NbN/MgO/NbN superconductor-insulator-superconductor tunnel junctions. The observed shot noise is significantly larger than theoretically expected. We attribute this to the occurrence of multiple Andreev reflection processes in pinholes present in the MgO barrier. This mechanism causes the current to flow in large charge quanta (Andreev clusters), with a voltage-dependent average value of m = 1 + 2Δ/eV times the electron charge. Because of this charge enhancement effect, the shot noise is increased by the factor m.
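
    The enhancement factor above makes the arithmetic easy to check. The following is a minimal Python sketch, assuming the standard Poissonian shot-noise formula S = 2eI scaled by the factor m; the NbN gap value and the bias points are illustrative assumptions, not figures from the paper.

        # Sketch: Andreev-cluster enhancement of Poissonian shot noise.
        # GAP_EV is an assumed illustrative value for the NbN gap Delta.
        E_CHARGE = 1.602e-19  # electron charge [C]
        GAP_EV = 2.6e-3       # assumed superconducting gap Delta [eV]

        def enhancement_factor(bias_v):
            """Average Andreev-cluster charge in units of e: m = 1 + 2*Delta/(e*V)."""
            return 1.0 + 2.0 * GAP_EV / bias_v

        def shot_noise(current_a, bias_v):
            """Enhanced shot noise S = 2*m*e*I [A^2/Hz]."""
            return 2.0 * enhancement_factor(bias_v) * E_CHARGE * current_a

        for v in (1e-3, 2e-3, 5e-3):
            print(f"V = {v * 1e3:.0f} mV: m = {enhancement_factor(v):.2f}, "
                  f"S = {shot_noise(1e-6, v):.3e} A^2/Hz")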

    Relative finger position influences whether you can localize tactile stimuli

    To investigate whether the relative positions of the fingers influence tactile localization, participants were asked to localize tactile stimuli applied to their fingertips. We measured the location and rate of errors for three finger configurations: fingers stretched out and together so that they touched each other, fingers stretched out and spread apart maximally, and fingers stretched out with the two hands on top of each other so that the fingers were interwoven. We reasoned that when the fingers contact each other, errors to adjacent fingers would be more likely than when the fingers are spread apart, so that localization should improve with the fingers spread. We aimed to assess whether such adjacency was defined in external coordinates (taking proprioception into account) or on the body (in skin coordinates). The results confirmed that the error rate was lower when the fingers were spread. However, there was no decrease in the error rate to neighbouring fingertips in the fingers-spread condition compared with the fingers-together condition. In an additional experiment, we showed that the lower error rate when the fingers were spread was not related to the continuous tactile input from the neighbouring fingers when the fingers were together. The current results suggest that information from proprioception is taken into account in perceiving the location of a stimulus on one of the fingertips.
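
    To make the dependent measures concrete, here is a minimal Python sketch of the error tally such a design implies, assuming each trial is recorded as a (configuration, stimulated finger, reported finger) triple; the trial data and variable names are hypothetical, not taken from the paper.

        # Sketch: per-configuration localization error rates, splitting errors by
        # whether the reported finger is adjacent (on the skin) to the stimulated one.
        from collections import defaultdict

        trials = [  # (configuration, stimulated_finger, reported_finger), fingers 0-4
            ("together", 2, 1), ("together", 2, 2), ("together", 3, 4),
            ("spread",   2, 2), ("spread",   2, 0), ("spread",   3, 3),
        ]

        stats = defaultdict(lambda: {"n": 0, "errors": 0, "adjacent_errors": 0})
        for config, stim, resp in trials:
            s = stats[config]
            s["n"] += 1
            if resp != stim:
                s["errors"] += 1
                if abs(resp - stim) == 1:  # neighbouring fingertip in skin coordinates
                    s["adjacent_errors"] += 1

        for config, s in stats.items():
            print(f"{config}: error rate {s['errors'] / s['n']:.2f}, "
                  f"adjacent share {s['adjacent_errors'] / max(s['errors'], 1):.2f}")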

    Keeping in Touch with One's Self: Multisensory Mechanisms of Self-Consciousness

    BACKGROUND: The spatial unity between self and body can be disrupted by employing conflicting visual-somatosensory bodily input, thereby bringing neurological observations on bodily self-consciousness under scientific scrutiny. Here we designed a novel paradigm linking the study of bodily self-consciousness to the spatial representation of visuo-tactile stimuli by measuring crossmodal congruency effects (CCEs) for the full body. METHODOLOGY/PRINCIPAL FINDINGS: We measured full-body CCEs by attaching four vibrator-light pairs to the trunks (backs) of subjects who viewed their bodies from behind via a camera and a head-mounted display (HMD). Subjects made speeded elevation (up/down) judgments of the tactile stimuli while ignoring the light stimuli. To modulate self-identification with the seen body, subjects were stroked on their backs with a stick, and the felt stroking was either synchronous or asynchronous with the stroking that could be seen via the HMD. We found that (1) tactile stimuli were mislocalized towards the seen body, (2) CCEs were modulated systematically during visual-somatosensory conflict when subjects viewed their body but not when they viewed a body-sized object, i.e. CCEs were larger during synchronous than during asynchronous stroking of the body, and (3) these changes in the mapping of tactile stimuli were induced in the same experimental condition in which predictable changes in bodily self-consciousness occurred. CONCLUSIONS/SIGNIFICANCE: These data reveal that systematic alterations in the mapping of tactile stimuli occur in a full-body illusion and thus establish CCE magnitude as an online performance proxy for subjective changes in global bodily self-consciousness.
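
    CCE magnitude in crossmodal congruency tasks is conventionally scored as the reaction-time cost of incongruent relative to congruent visual distractors; the abstract does not spell out the scoring, so the following minimal Python sketch assumes that convention, with toy reaction times rather than the paper's data.

        # Sketch: crossmodal congruency effect (CCE) as the mean RT difference
        # between incongruent and congruent trials, per stroking condition.
        from statistics import mean

        rts = {
            # (stroking, congruency): reaction times [ms], toy values
            ("synchronous",  "congruent"):   [412, 398, 405],
            ("synchronous",  "incongruent"): [472, 488, 465],
            ("asynchronous", "congruent"):   [420, 409, 415],
            ("asynchronous", "incongruent"): [451, 444, 439],
        }

        def cce(stroking):
            """CCE = mean RT(incongruent) - mean RT(congruent)."""
            return mean(rts[(stroking, "incongruent")]) - mean(rts[(stroking, "congruent")])

        for stroking in ("synchronous", "asynchronous"):
            print(f"{stroking}: CCE = {cce(stroking):.1f} ms")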

    Rubber Hands Feel Touch, but Not in Blind Individuals

    Psychology and neuroscience have a long-standing tradition of studying blind individuals to investigate how visual experience shapes perception of the external world. Here, we study how blind people experience their own body by exposing them to a multisensory body illusion: the somatic rubber hand illusion. In this illusion, healthy blindfolded participants experience that they are touching their own right hand with their left index finger, when in fact they are touching a rubber hand with their left index finger while the experimenter touches their right hand in a synchronized manner (Ehrsson et al. 2005). We compared the strength of this illusion in a group of blind individuals (n = 10), all of whom had experienced severe visual impairment or complete blindness from birth, and a group of age-matched blindfolded sighted participants (n = 12). The illusion was quantified subjectively using questionnaires and behaviorally by asking participants to point to the felt location of the right hand. The results showed that the sighted participants experienced a strong illusion, whereas the blind participants experienced no illusion at all, a difference that was evident in both tests employed. A further experiment testing the participants' basic ability to localize the right hand in space without vision (proprioception) revealed no difference between the two groups. Taken together, these results suggest that blind individuals with impaired visual development have a more veridical percept of self-touch and a less flexible and dynamic representation of their own body in space than sighted individuals. We speculate that the multisensory brain systems that re-map somatosensory signals onto external reference frames are less developed in blind individuals and therefore do not allow efficient fusion of tactile and proprioceptive signals from the two upper limbs into a single illusory experience of self-touch as in sighted individuals.
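
    The behavioural pointing test lends itself to a simple drift score. Below is a minimal Python sketch under the assumption that the illusion is quantified as the shift of pointing responses toward the rubber hand; the exact scoring is not given in this abstract, and the positions are toy values.

        # Sketch: proprioceptive drift -- how far pointing responses shift
        # toward the rubber hand after synchronous stimulation.
        # Positions are toy 1-D coordinates [cm] along the axis between the
        # real and rubber hands; measure and data are assumptions.
        from statistics import mean

        REAL_HAND = 0.0     # actual position of the right hand
        RUBBER_HAND = 15.0  # position of the rubber hand

        pre_pointing  = [0.5, -0.3, 0.8]  # felt hand position before stroking
        post_pointing = [4.2,  3.7, 5.1]  # felt hand position after stroking

        drift = mean(post_pointing) - mean(pre_pointing)
        print(f"proprioceptive drift toward rubber hand: {drift:.1f} cm "
              f"({drift / (RUBBER_HAND - REAL_HAND):.0%} of the distance)")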

    Manipulable Objects Facilitate Cross-Modal Integration in Peripersonal Space

    Previous studies have shown that tool use often modifies one's peripersonal space, i.e. the space directly surrounding our body. Given our profound experience with manipulable objects (e.g. a toothbrush, a comb or a teapot), we hypothesized in the present study that the observation of pictures representing manipulable objects would result in a remapping of peripersonal space as well. Subjects were required to report the location of vibrotactile stimuli delivered to the right hand, while ignoring visual distractors superimposed on pictures representing everyday objects. Pictures could represent objects of high manipulability (e.g. a cell phone), medium manipulability (e.g. a soap dispenser) and low manipulability (e.g. a computer screen). In the first experiment, when subjects attended to the action associated with the objects, a strong cross-modal congruency effect (CCE) was observed for pictures representing medium and high manipulability objects, reflected in faster reaction times if the vibrotactile stimulus and the visual distractor were in the same location, whereas no CCE was observed for low manipulability objects. This finding was replicated in a second experiment in which subjects attended to the visual properties of the objects. These findings suggest that the observation of manipulable objects facilitates cross-modal integration in peripersonal space.
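
    Using the same incongruent-minus-congruent scoring sketched for the full-body CCE above, the manipulability comparison reduces to a grouped difference. A minimal Python sketch follows; the mean reaction times are toy values, not results from the paper.

        # Sketch: CCE magnitude per object-manipulability category,
        # scored as incongruent RT minus congruent RT.
        mean_rt_ms = {
            # manipulability level: (congruent RT, incongruent RT), toy values
            "high":   (430, 492),
            "medium": (435, 489),
            "low":    (433, 441),
        }

        for level, (congruent, incongruent) in mean_rt_ms.items():
            print(f"{level:>6} manipulability: CCE = {incongruent - congruent} ms")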

    Tactile localization biases are modulated by gaze direction

    Identifying the spatial location of touch on the skin surface is a fundamental function of our somatosensory system. Although stimulation of even single mechanoreceptive afferent fibres is sufficient to produce clearly localised percepts, tactile localisation can also be modulated by higher-level processes such as body posture. This suggests that tactile events are coded by multiple representations based on different coordinate systems. Recent reports provide evidence for systematic biases in tactile localisation tasks, which are thought to result from a supramodal representation of the skin surface. While the influence of non-informative vision of the body and of gaze direction on tactile discrimination tasks has been extensively studied, their effects on tactile localisation tasks remain largely unexplored. To address this question, participants performed a tactile localisation task on their left hand under different visual conditions by means of a mirror box: in the mirror condition, a single stimulus was delivered to the participants' left hand while the reflection of the right hand was seen through the mirror; in the object condition, participants looked at a box through the mirror; and in the right-hand condition, participants looked directly at their right hand. Participants reported the location of the tactile stimuli using a silhouette of a hand. Results showed a shift in the localisation of the touches towards the tips of the fingers (distal bias) and towards the thumb (radial bias) across conditions. Critically, distal biases were reduced when participants looked towards the mirror compared to when they looked at their right hand, suggesting that gaze direction reduces the typical proximo-distal biases in tactile localisation. Moreover, vision of the hand modulated the internal configuration of the reported locations by elongating it along the radio-ulnar axis.
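
    A minimal Python sketch of how such biases can be scored, assuming actual and reported stimulus positions are expressed in a hand-centred two-dimensional frame; the axis conventions and coordinates are assumptions, not the paper's data.

        # Sketch: mean localization bias split into a distal component
        # (towards the fingertips) and a radial component (towards the thumb).
        # Coordinates are toy values [mm] as (proximo-distal, radio-ulnar);
        # sign conventions are assumed for illustration.
        from statistics import mean

        trials = [  # (actual position, reported position)
            ((10.0, 5.0), (14.5, 3.2)),
            ((22.0, 8.0), (25.0, 6.1)),
            ((15.0, 2.0), (18.2, 0.4)),
        ]

        distal_bias = mean(rep[0] - act[0] for act, rep in trials)  # +ve = fingertipward
        radial_bias = mean(act[1] - rep[1] for act, rep in trials)  # +ve = thumbward
        print(f"distal bias: {distal_bias:.1f} mm, radial bias: {radial_bias:.1f} mm")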

    The care situation for lung carcinoma in Germany - results of an analysis of nationwide data from clinical cancer registries


    How are the early stages of lung carcinoma treated in Germany?
