73 research outputs found

    Simulation and sensitivities for a phased IceCube-Gen2 deployment

    A next-generation optical sensor for IceCube-Gen2

    Simulation study for the future IceCube-Gen2 surface array

    Optimization of the optical array geometry for IceCube-Gen2

    Concept Study of a Radio Array Embedded in a Deep Gen2-like Optical Array

    Sensitivity studies for the IceCube-Gen2 radio array

    The Surface Array planned for IceCube-Gen2

    IceCube-Gen2, the extension of the IceCube Neutrino Observatory, will feature three main components: an optical array in the deep ice, a large-scale radio array in the shallow ice and firn, and a surface detector above the optical array. IceCube-Gen2 will therefore not only be an excellent detector for PeV neutrinos, but will also constitute a unique setup for measuring cosmic-ray air showers, with the electromagnetic component and low-energy muons measured at the surface and high-energy muons measured in the ice. As for the ongoing enhancement of IceCube's current surface array, IceTop, we foresee a combination of elevated scintillation and radio detectors for the Gen2 surface array, aiming at high measurement accuracy for air showers. The science goals are manifold: the in-situ measurement of the cosmic-ray flux and mass composition, together with more thorough tests of hadronic interaction models, will improve the understanding of muons and atmospheric neutrinos detected in the ice, in particular regarding prompt muons. Moreover, the surface array provides a cosmic-ray veto for the in-ice detector and contributes to the calibration of the optical and radio arrays. Last but not least, the surface array will make major contributions to cosmic-ray science in the energy range of the transition from Galactic to extragalactic sources. The increased sensitivities to photons and to cosmic-ray anisotropies at multi-PeV energies offer a chance to solve the puzzle of the origin of the most energetic Galactic cosmic rays and will serve IceCube's multimessenger mission.

    Me, you, and our object: Peripersonal space recruitment during executed and observed actions depends on object ownership.

    Peripersonal space (PPS) is a spatial representation that codes objects close to one's own and to someone else's body in a multisensory-motor frame of reference to support appropriate motor behavior. Recent theories have framed PPS beyond its original sensorimotor aspects and proposed relating it to social aspects of the self. Here, we manipulated the ownership status of an object ("whose object this is") to test the sensitivity of PPS to such a pervasive aspect of society. To this aim, we assessed PPS through a well-established visuo-tactile task in a novel situation where dyads of participants either grasped, or observed the other grasp, an object whose ownership was experimentally assigned to either participant (individual ownership) or to both participants (shared ownership). When ownership was assigned exclusively ("this belongs to you/the other," Experiment 1), PPS recruitment emerged when grasping one's own object (I grasp my object), as well as when observing the other grasp their own object (you grasp your object). Instead, no PPS effect was found when grasping (or observing the other grasp) an object that was not one's own (I grasp yours, you grasp mine). When ownership was assigned equally ("this belongs to both of you," Experiment 2), a similar PPS recruitment emerged, again both when the action toward the shared object was executed and when it was merely observed. These findings reveal that ownership is critical in shaping relatively low-level aspects of body-object interactions during everyday simple actions, highlighting the deep mark of ownership on social behavior.
    Patané, Ivan; Brozzoli, Claudio; Koun, Eric; Frassinetti, Francesca; Farnè, Alessandro

    Alpha Oscillations Are Involved in Localizing Touch on Handheld Tools

    The sense of touch is not restricted to the body but can also extend to external objects. When we use a handheld tool to contact an object, we feel the touch on the tool, not in the hand holding the tool. The ability to perceive touch on a tool in fact extends along its entire surface, allowing the user to localize where it is touched about as accurately as they would on their own body. Although the neural mechanisms underlying the ability to localize touch on the body have been investigated extensively, those underlying the localization of touch on a tool are still unknown. We aimed to fill this gap by recording the electroencephalography (EEG) signal of participants while they localized tactile stimuli on a handheld rod. We focused on oscillatory activity in the alpha (7–14 Hz) and beta (15–30 Hz) ranges, as these have previously been linked to distinct spatial codes used to localize touch on the body: beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that alpha activity alone was modulated by the location of tactile stimuli applied to the handheld rod. Source reconstruction suggested that this alpha power modulation was localized in a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for processing touch in external space when localizing touch on a tool.
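    As an aside, the band-limited power analysis described in this abstract can be illustrated with a brief sketch. The Python snippet below is not the authors' pipeline; it only shows, on synthetic stand-in data and with assumed parameters (sampling rate, channel count, recording length), how per-channel power in the alpha (7–14 Hz) and beta (15–30 Hz) bands might be estimated with Welch's method.

    # Illustrative sketch only -- not the analysis code of the paper.
    # The data are random noise; sampling rate, channel count, and duration are assumptions.
    import numpy as np
    from scipy.signal import welch

    fs = 500.0                                         # assumed sampling rate (Hz)
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((64, int(10 * fs)))      # 64 channels x 10 s of noise as stand-in data

    # Power spectral density per channel (Welch's method, 2-second segments).
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs), axis=-1)

    def band_power(psd, freqs, lo, hi):
        """Integrate the PSD over [lo, hi] Hz, per channel."""
        mask = (freqs >= lo) & (freqs <= hi)
        return np.trapz(psd[:, mask], freqs[mask], axis=-1)

    alpha_power = band_power(psd, freqs, 7.0, 14.0)    # alpha band, 7-14 Hz as in the abstract
    beta_power = band_power(psd, freqs, 15.0, 30.0)    # beta band, 15-30 Hz as in the abstract
    print("mean alpha power:", alpha_power.mean(), "mean beta power:", beta_power.mean())

    In a study like the one summarized above, such band power would presumably be computed on stimulus-locked epochs and compared across stimulus locations on the rod; the sketch only fixes the band definitions and the per-channel power estimate.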
