
    Local axonal morphology guides the topography of interneuron myelination in mouse and human neocortex

    GABAergic fast-spiking parvalbumin-positive (PV) interneurons are frequently myelinated in the cerebral cortex. However, the factors governing the topography of cortical interneuron myelination remain incompletely understood. Here, we report that segmental myelination along neocortical interneuron axons is strongly predicted by the joint combination of interbranch distance and local axon caliber. Enlargement of PV+ interneurons increased axonal myelination, while reduced cell size led to decreased myelination. Next, we considered regular-spiking SOM+ cells, which normally have relatively shorter interbranch distances and thinner axon diameters than PV+ cells, and are rarely myelinated. Consistent with the importance of axonal morphology for guiding interneuron myelination, enlargement of SOM+ cell size dramatically increased the frequency of myelinated axonal segments. Lastly, we confirm that these findings also extend to human neocortex by quantifying interneuron axonal myelination from ex vivo surgical tissue. Together, these findings establish a predictive model of neocortical GABAergic interneuron myelination determined by local axonal morphology.

    Understanding and manipulating eye height to change the user's experience of perceived space in virtual reality

    Today, virtual reality technology is a multi-purpose tool for diverse applications in various domains. However, research has shown that virtual worlds are often not perceived at the scale the programmer intended, especially regarding egocentric distances. While the main reason for this misperception of distances in virtual environments is still unknown, this dissertation investigates one specific aspect of fundamental importance to distance perception: eye height. In human perception, the ability to determine eye height is essential, because eye height is used to perceive the heights of objects, velocities, affordances, and distances, all of which allow for successful environmental interaction. It is reasonably well understood how eye height is used to determine many of these percepts. Yet, how eye height itself is determined is still unknown. In multiple studies conducted in virtual reality and the real world, this dissertation investigates how eye height might be determined in common virtual reality scenarios. Using manipulations of the virtual eye height and distance perception tasks, the results suggest that humans rely more on body-based information to determine their eye height if they have no possibility for calibration. This has major implications for many existing virtual reality setups. Because humans rely on their body-based eye height, this can be exploited to systematically alter perceived space in immersive virtual environments, which might be sufficient to give every user an experience close to what the programmer intended.

    The importance of postural cues for determining eye height in immersive virtual reality

    In human perception, the ability to determine eye height is essential, because eye height is used to scale the heights of objects, velocities, affordances, and distances, all of which allow for successful environmental interaction. It is well understood that eye height is fundamental to determining many of these percepts. Yet, how eye height itself is provided is still largely unknown. While the sources of information potentially specifying eye height are naturally coincident in a real-world environment with a regular ground surface, they can easily diverge in similar and common virtual reality scenarios. Thus, we conducted virtual reality experiments in which we manipulated the virtual eye height in a distance perception task to investigate how eye height might be determined in such a scenario. We found that humans rely more on postural cues to determine their eye height if there is a conflict between visual and postural information and little opportunity for perceptual-motor calibration, as demonstrated by predictable variations in their distance estimates. Our results suggest that, in such circumstances, eye height is informed by postural cues when estimating egocentric distances in virtual reality and consequently does not depend on an internalized value for eye height.

    Eye height manipulations: a possible solution to reduce underestimation of egocentric distances in head-mounted displays

    Virtual reality technology can be considered a multipurpose tool for diverse applications in various domains, for example, training, prototyping, design, entertainment, and research investigating human perception. However, for many of these applications, it is necessary that the designed and computer-generated virtual environments are perceived as a replica of the real world. Many research studies have shown that this is not necessarily the case. Specifically, egocentric distances are underestimated compared to real-world estimates, regardless of whether the virtual environment is displayed in a head-mounted display or on an immersive large-screen display. While the main reason for this observed distance underestimation is still unknown, we investigate a potential approach to reduce or even eliminate it. Building on the angle-of-declination-below-the-horizon relationship for perceiving egocentric distances, we describe how eye height manipulations in virtual reality should affect perceived distances, and how this relationship could be exploited to reduce distance underestimation for individual users. In a first experiment, we investigate the influence of a manipulated eye height on an action-based measure of egocentric distance perception. We found that eye height manipulations have similar predictable effects on an action-based measure of egocentric distance as we previously observed for a cognitive measure. This might make this approach more useful than other proposed solutions across different scenarios in various domains, for example, for collaborative tasks. In three additional experiments, we investigate the influence of an individualized manipulation of eye height to reduce distance underestimation in a sparse-cue and a rich-cue environment. In these experiments, we demonstrate that a simple eye height manipulation can be used to selectively alter perceived distances on an individual basis, which could be helpful to enable every user to have an experience close to what the content designer intended.
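The angle-of-declination account above implies a simple geometric relation: the renderer fixes the angle below the horizon from the virtual eye height and the rendered distance, while the observer recovers distance by scaling that angle with the eye height they assume (here, their postural one). The following minimal sketch illustrates the predicted effect; the function name, parameter names, and numbers are illustrative assumptions, not taken from the studies.

```python
import math

def perceived_distance(rendered_distance, virtual_eye_height, assumed_eye_height):
    """Predicted egocentric distance under the angle-of-declination model.

    The camera at `virtual_eye_height` and a target at `rendered_distance`
    fix the angle of declination below the horizon. If the observer scales
    that angle with `assumed_eye_height` (e.g. postural eye height), the
    recovered distance is assumed_eye_height / tan(declination).
    """
    declination = math.atan2(virtual_eye_height, rendered_distance)
    return assumed_eye_height / math.tan(declination)

# Illustrative values: postural eye height 1.60 m, target rendered 5 m away.
# Raising the virtual camera compresses perceived distance; lowering it
# expands perceived distance, which could offset underestimation.
for offset in (-0.50, 0.0, +0.50):
    d = perceived_distance(5.0, 1.60 + offset, 1.60)
    print(f"virtual eye height offset {offset:+.2f} m -> perceived {d:.2f} m")
```

Because the model reduces to perceived = rendered × (assumed eye height / virtual eye height), an individualized manipulation would, under these assumptions, pick the virtual eye height that rescales a given user's baseline estimates toward veridical.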

    The effect of a manipulated virtual eye height (-50 cm or +50 cm) on egocentric distances in a standing position in comparison to the respective baseline condition (0 cm).

    Error bars represent ±1 SE. The actual mean participant (postural) eye height in the experiment is depicted in the upper left corner. Note: (a) The predictions are shifted by the observed underestimation in the baseline condition to account for the usually observed distance underestimation in head-mounted displays (in an ideal world, the 0 cm estimates would correspond to veridical performance). (b) If the virtual eye height were used, there should be no differences, and the prediction for visual eye height would apply to all conditions.

    Percent of responding length is larger in both hand width conditions.

    Error bars represent 1 standard error and are calculated on the basis of within-participant error with the method provided by Loftus and Masson [37].

    Illustration of body based scaling measurements across different hand width conditions.


    Experimental setup for Experiment 3 with respect to the participants’ viewpoints in the three different pen size conditions.


    The effect of a manipulated virtual eye height (-50 cm or +50 cm) on egocentric distances in a prone position on a bed (adjusted to be approximately at seated eye height) in comparison to the respective baseline condition (0 cm).

    Error bars represent ±1 SE. The actual mean participant (postural) eye height in the experiment is depicted in the upper left corner. Note: (a) The predictions are shifted by the observed underestimation in the baseline condition to account for the usually observed distance underestimation in head-mounted displays (in an ideal world, the 0 cm estimates would correspond to the prediction, which is veridical performance). (b) If the virtual eye height were used, there should be no differences, and the prediction for visual eye height would apply to all conditions.