
    Engineering data compendium. Human perception and performance. User's guide

    The concept underlying the Engineering Data Compendium was the product of a research and development program (Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability by system designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use

    Inter-hemispheric integration of tactile-motor responses across body parts

    In simple detection tasks, reaction times are faster when stimuli are presented to the visual field or side of the body ipsilateral to the body part used to respond. This advantage, the crossed-uncrossed difference (CUD), is thought to reflect inter-hemispheric interactions needed for sensorimotor information to be integrated between the two cerebral hemispheres. However, it is unknown whether the tactile CUD is invariant when different body parts are stimulated. The most likely structure mediating such processing is thought to be the corpus callosum (CC). Neurophysiological studies have shown that there are denser callosal connections between regions that represent proximal parts of the body near the body midline and more sparse connections for regions representing distal extremities. Therefore, if the information transfer between the two hemispheres is affected by the density of callosal connections, stimuli presented on more distal regions of the body should produce a greater CUD compared to stimuli presented on more proximal regions. This is because interhemispheric transfer of information from regions with sparse callosal connections will be less efficient, and hence slower. Here, we investigated whether the CUD is modulated as a function of the different body parts stimulated by presenting tactile stimuli unpredictably on body parts at different distances from the body midline (i.e., Middle Finger, Forearm, or Forehead of each side of the body). Participants detected the stimulus and responded as fast as possible using either their left or right foot. Results showed that the magnitude of the CUD was larger on the finger (~2.6 ms) and forearm (~1.8 ms) than on the forehead (~-0.9 ms). This result suggests that the interhemispheric transfer of tactile stimuli varies as a function of the strength of callosal connections of the body parts
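The crossed-uncrossed difference described above is simply the mean reaction time on crossed trials (stimulus and responding limb on opposite sides of the body) minus the mean on uncrossed trials (same side). A minimal sketch of that computation, using purely illustrative reaction times rather than data from the study:

```python
# Hypothetical sketch: computing the crossed-uncrossed difference (CUD)
# from simple-detection reaction times. The RT values below are
# illustrative only, not data from the study.

def mean(xs):
    return sum(xs) / len(xs)

def cud(crossed_rts, uncrossed_rts):
    """CUD = mean RT on crossed trials (stimulus and responding limb on
    opposite sides of the body midline) minus mean RT on uncrossed
    trials (same side). A positive CUD indicates slower crossed responses."""
    return mean(crossed_rts) - mean(uncrossed_rts)

# Illustrative RTs in milliseconds for one stimulation site
crossed = [312.0, 305.5, 318.2, 309.8]
uncrossed = [309.1, 303.4, 315.6, 307.2]
print(round(cud(crossed, uncrossed), 2))
```

Under the abstract's logic, this difference should come out larger when the stimulated site (e.g., the finger) has sparse callosal connections and near zero for midline sites such as the forehead.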

    Human operator performance of remotely controlled tasks: Teleoperator research conducted at NASA's George C. Marshall Space Flight Center

    The capabilities within the teleoperator laboratories to perform remote and teleoperated investigations for a wide variety of applications are described. Three major teleoperator issues are addressed: the human operator, the remote control and effecting subsystems, and the human/machine system performance results for specific teleoperated tasks

    THE COUPLING OF PERCEPTION AND ACTION IN REPRESENTATION

    This thesis examines how the objects that we visually perceive in the world are coupled to the actions that we make towards them. For example, a whole hand grasp might be coupled with an object like an apple, but not with an object like a pea. It has been claimed that the coupling of what we see and what we do is not simply associative, but is fundamental to the way the brain represents visual objects. More than association, it is thought that when an object is seen (even if there is no intention to interact with it), there is a partial and automatic activation of the networks in the brain that plan actions (such as reaches and grasps). The central aim of this thesis was to investigate how specific these partial action plans might be, and how specific the properties of objects that automatically activate them might be. In acknowledging that perception and action are dynamically intertwining processes (such that in catching a butterfly the eye and the hand cooperate with a fluid and seamless efficiency), it was supposed that these couplings of perception and action in the brain might be loosely constrained. That is, they should not be rigidly prescribed (such that a highly specific action is always and only coupled with a specific object property) but they should instead involve fairly general components of actions that can adapt to different situations. The experimental work examined the automatic coupling of simplistic left and right actions (e.g. key presses) to pictures of oriented objects. Typically a picture of an object was shown and the viewer responded as fast as possible to some object property that was not associated with action (such as its colour). Of interest was how the performance of these left or right responses related to the task irrelevant left or right orientation of the object. The coupling of a particular response to a particular orientation could be demonstrated by the response performance (speed and accuracy). 
The more tightly coupled a response was to a particular object orientation, the faster and more accurate it was. The results supported the idea of loosely constrained action plans. Thus it appeared that a range of different actions (even foot responses) could be coupled with an object's orientation. These actions were coupled by default to an object's X-Z orientation (e.g. orientation in the depth plane). In further reflecting a loosely constrained perception-action mechanism, these couplings were shown to change in different situations (e.g. when the object moved towards the viewer, or when a key press made the object move in a predictable way). It was concluded that the kinds of components of actions that are automatically activated when viewing an object are not very detailed or fixed, but are initially quite general and can change and become more specific when circumstances demand it

    Engineering data compendium. Human perception and performance, volume 3

    The concept underlying the Engineering Data Compendium was the product of a research and development program (Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability by system designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is Volume 3, containing sections on Human Language Processing, Operator Motion Control, Effects of Environmental Stressors, Display Interfaces, and Control Interfaces (Real/Virtual)

    Tell it to the hand: Attentional modulation in the identification of misoriented chiral objects

    Research in the field of cognitive neuroscience and neuropsychology on spatial cognition and mental imagery has increased considerably over the last few decades. While at the beginning of the 20th century studying imagery was considered an object of derision, dismissed as "sheer bunk" (Watson, 1928), at present imagery researchers have successfully developed models and improved behavioral and neurophysiological measures (e.g., Kosslyn et al., 2006). Mental rotation constituted a major advance in terms of behavioral measures sensitive to imaginative operations executed on visual representations (i.e., Shepard & Cooper, 1982). The linear relation between response times and the angular disparity of the images allowed a quantitative estimate of imagery processes. The experiments described in the present thesis were motivated by the intent to continue and extend the understanding of such fascinating mental phenomena. The evolution of the present work took initial steps from the adoption of a behavioral paradigm, the hand laterality judgment task, as a privileged tool for studying motor imagery in healthy individuals and brain-damaged patients. The similarity with mental rotation tasks and the implicit nature of the task made it the best candidate to test hypotheses regarding the mental simulation of body movements. In this task, response times are linearly affected by the angular departures at which the hand pictures are shown, as in mental rotation, and their distributions are asymmetric between left and right hands. Drawing on these task features, a widely held view posits that laterality judgment of rotated hand pictures requires participants to imagine hand-arm movements, although they receive no instruction to do so (e.g., Parsons, 1987a; Parsons, 1994). In Chapter 1, I provide a review of the relevant literature on visual and motor imagery. Particular aspects of the mental rotation literature are also explored. 
In Chapter 2, I examine the hand laterality task and the vast literature of studies that employed this task as a means of testing motor imagery processes. An alternative view to the motor imagery account is also discussed (i.e., the disembodied account). In Chapter 3, I exploit the hand laterality task and a visual laterality task (Tomasino et al., 2010) to test motor and visual imagery abilities in a group of healthy aged individuals. In Chapter 4, I describe an alternative view that has been proposed by others to explain the pattern of RTs in the hand laterality task: the multisensory integration account (Grafton & Viswanathan, 2014). In this view, hand laterality is recognized by pairing information between the seen hand's visual features and the observer's felt own hand. In Chapter 5, I test and find evidence for a new interpretation of the particular configuration of response times in the hand laterality task. I demonstrate a spatial compatibility effect for rotated pictures of hands given by the interaction between the direction of stimulus rotation (clockwise vs. counterclockwise) and the laterality of the motor response. These effects changed by following temporal dynamics that were attributed to shifts of spatial attention. In the same chapter, I report further psychophysics experiments that confirmed the role of spatial attention and ruled out multisensory integration as the key factor determining the asymmetries of the response time distributions. In Chapter 6, I report a study with patients suffering from Unilateral Neglect in which they performed the hand laterality task and a visual laterality task. The findings indicated that patients failed to integrate visual information with spatially incompatible responses irrespective of the type of task, and depending on egocentric stimulus-response spatial codes. A general discussion is presented in Chapter 7
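The linear relation between response time and angular disparity that the abstract invokes (Shepard & Cooper, 1982) can be summarised as RT = intercept + slope × angle, with the slope interpreted as the time cost per degree of mental rotation. A minimal sketch of estimating those two parameters by ordinary least squares, using illustrative numbers rather than data from the thesis:

```python
# Hypothetical sketch of the linear RT-vs-angular-disparity relation
# characteristic of mental rotation. The data points are illustrative,
# not taken from the thesis or from Shepard & Cooper (1982).

def fit_line(angles, rts):
    """Ordinary least-squares fit of RT = intercept + slope * angle.
    Returns (intercept in ms, slope in ms per degree)."""
    n = len(angles)
    mx = sum(angles) / n
    my = sum(rts) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(angles, rts))
    sxx = sum((x - mx) ** 2 for x in angles)
    slope = sxy / sxx
    return my - slope * mx, slope

angles = [0, 60, 120, 180]           # degrees of rotation
rts = [550.0, 670.0, 790.0, 910.0]   # perfectly linear illustrative RTs (ms)
intercept, slope = fit_line(angles, rts)
print(intercept, slope)  # prints 550.0 2.0
```

On this reading, a slope of about 2 ms per degree would mean each additional degree of misorientation adds roughly 2 ms of imagined rotation time; the linearity of the fit is what licenses the quantitative estimate of the imagery process.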

    Human Factor Aspects of Traffic Safety


    Human engineering design criteria study Final report

    Human engineering design criteria for use in designing earth launch vehicle systems and equipment