    Cerebral activations related to ballistic, stepwise interrupted and gradually modulated movements in Parkinson patients

    Patients with Parkinson's disease (PD) experience impaired initiation and inhibition of movements, such as difficulty starting or stopping walking. At the single-joint level this is accompanied by reduced inhibition of antagonist muscle activity. While normal basal ganglia (BG) contributions to motor control include selecting appropriate muscles by inhibiting others, it is unclear how PD-related changes in BG function cause impaired movement initiation and inhibition at the single-joint level. To further elucidate these changes we studied four right-hand movement tasks with fMRI, dissociating activations related to abrupt movement initiation, inhibition and gradual movement modulation. Initiation and inhibition were inferred from ballistic and stepwise interrupted movement, respectively, while smooth wrist circumduction enabled the assessment of gradually modulated movement. Task-related activations were compared between PD patients (N = 12) and healthy subjects (N = 18). In healthy subjects, movement initiation was characterized by antero-ventral striatum, substantia nigra (SN) and premotor activations, while inhibition was dominated by subthalamic nucleus (STN) and pallidal activations, in line with the known role of these areas in simple movement. Gradual movement mainly involved the antero-dorsal putamen and pallidum. Compared to healthy subjects, patients showed reduced striatal/SN and increased pallidal activation for initiation, whereas for inhibition STN activation was reduced and striatal-thalamo-cortical activation increased. For gradual movement, patients showed reduced pallidal and increased thalamo-cortical activation. We conclude that PD-related changes during movement initiation fit the (rather static) model of alterations in direct and indirect BG pathways. Reduced STN activation and regionally increased cortical activation in PD during inhibition and gradual movement modulation are better explained by a dynamic model that also takes into account enhanced responsiveness to external stimuli in this disease and the effects of hyper-fluctuating cortical inputs to the striatum and the STN in particular.
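    As an illustrative aside rather than the authors' actual analysis pipeline, the between-group comparison of task-related activations can be sketched as an ROI-wise two-sample t-test on per-subject contrast estimates; the ROI names and the random data below are placeholders, not study data.

        # Illustrative sketch only: ROI-wise two-sample t-tests comparing
        # task-related contrast estimates between patients and controls.
        # The ROI list and the random data are placeholders, not study data.
        import numpy as np
        from scipy import stats

        def compare_groups(patient_betas, control_betas, roi_names, alpha=0.05):
            """patient_betas, control_betas: arrays of shape (n_subjects, n_rois)."""
            results = {}
            for i, roi in enumerate(roi_names):
                t, p = stats.ttest_ind(patient_betas[:, i], control_betas[:, i])
                results[roi] = {"t": float(t), "p": float(p), "significant": p < alpha}
            return results

        rng = np.random.default_rng(0)
        rois = ["striatum", "SN", "STN", "pallidum"]
        print(compare_groups(rng.normal(size=(12, len(rois))),
                             rng.normal(size=(18, len(rois))), rois))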

    EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays

    While gaze holds a lot of promise for hands-free interaction with public displays, remote eye trackers with their confined tracking box restrict users to a single stationary position in front of the display. We present EyeScout, an active eye tracking system that combines an eye tracker mounted on a rail system with a computational method to automatically detect and align the tracker with the user's lateral movement. EyeScout addresses key limitations of current gaze-enabled large public displays by offering two novel gaze-interaction modes for a single user: in "Walk then Interact" the user can walk up to an arbitrary position in front of the display and interact, while in "Walk and Interact" the user can interact even while on the move. We report on a user study showing that EyeScout is well perceived by users, extends a public display's sweet spot into a sweet line, and reduces gaze-interaction kick-off time to 3.5 seconds, a 62% improvement over state-of-the-art solutions. We discuss sample applications that demonstrate how EyeScout can enable position- and movement-independent gaze interaction with large public displays.
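    To make the idea of actively aligning the tracker with a walking user concrete, here is a minimal sketch of a proportional control loop that drives a rail-mounted carriage toward the user's lateral position; the sensor and actuator interfaces (get_user_x, get_carriage_x, set_carriage_velocity) are hypothetical placeholders, not EyeScout's actual implementation.

        # Minimal sketch of an active-alignment loop: keep a rail-mounted eye
        # tracker under the user's lateral (x) position. All device interfaces
        # here are hypothetical placeholders, not the EyeScout API.
        import time

        def follow_user(get_user_x, get_carriage_x, set_carriage_velocity,
                        gain=2.0, max_speed=0.5, deadband=0.02, dt=0.02):
            """Proportional controller: carriage velocity is proportional to the
            lateral offset between user and carriage, clamped to max_speed (m/s)."""
            while True:
                error = get_user_x() - get_carriage_x()
                if abs(error) < deadband:
                    set_carriage_velocity(0.0)   # close enough: stop to avoid jitter
                else:
                    v = max(-max_speed, min(max_speed, gain * error))
                    set_carriage_velocity(v)
                time.sleep(dt)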

    Assessment of Audio Interfaces for use in Smartphone Based Spatial Learning Systems for the Blind

    Recent advancements in the field of indoor positioning and mobile computing promise the development of smartphone-based indoor navigation systems. Currently, the preliminary implementations of such systems only use visual interfaces, meaning that they are inaccessible to blind and low-vision users. According to the World Health Organization, about 39 million people in the world are blind. This necessitates the development and evaluation of non-visual interfaces for indoor navigation systems that support safe and efficient spatial learning and navigation behavior. This thesis research empirically evaluated several different approaches through which spatial information about the environment can be conveyed through audio. In the first experiment, blindfolded participants standing at an origin in a lab learned the distance and azimuth of target objects that were specified by four audio modes. The first three modes were perceptual interfaces and did not require cognitive mediation on the part of the user. The fourth mode was a non-perceptual mode where object descriptions were given via spatial language using clockface angles. After learning the targets through the four modes, the participants spatially updated the positions of the targets and localized them by walking to each of them from two indirect waypoints. The results indicate that the hand-motion-triggered mode was better than the head-motion-triggered mode and comparable to the auditory snapshot mode. In the second experiment, blindfolded participants learned target object arrays with two spatial audio modes and a visual mode. In the first mode, head tracking was enabled, whereas in the second mode hand tracking was enabled. In the third mode, serving as a control, the participants were allowed to learn the targets visually. We again compared spatial updating performance across these modes and found no significant performance differences between them. These results indicate that we can develop 3D audio interfaces on sensor-rich, off-the-shelf smartphone devices, without the need for expensive head-tracking hardware. Finally, a third study evaluated room-layout learning performance by blindfolded participants with an Android smartphone. Three perceptual modes and one non-perceptual mode were tested for cognitive map development. As expected, the perceptual interfaces performed significantly better than the non-perceptual, language-based mode in an allocentric pointing judgment and in overall subjective rating. In sum, the perceptual interfaces led to better spatial learning performance and higher user ratings. Also, there was no significant difference between cognitive maps developed through spatial audio based on tracking the user's head or hand. These results have important implications as they support the development of accessible, perceptually driven interfaces for smartphones.
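    For concreteness, the non-perceptual spatial-language mode described above can be illustrated with a small sketch that converts a target's azimuth and distance into a clockface phrase; the exact wording and rounding used in the thesis are assumptions here.

        # Illustrative sketch of a clockface spatial-language description:
        # 0 degrees = straight ahead = 12 o'clock, 90 degrees = 3 o'clock, etc.
        # The phrasing and rounding are assumptions, not taken from the thesis.

        def clockface_description(azimuth_deg, distance_m):
            hour = round((azimuth_deg % 360) / 30) % 12
            if hour == 0:
                hour = 12
            return f"target at {hour} o'clock, {distance_m:.1f} metres"

        print(clockface_description(90, 2.4))   # -> target at 3 o'clock, 2.4 metres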

    Placing large group relations into pedestrian dynamics: psychological crowds in counterflow

    Understanding influences on pedestrian movement is important to accurately simulate crowd behaviour, yet little research has explored the psychological factors that influence interactions between large groups in counterflow scenarios. Research from social psychology has demonstrated that social identities can influence the micro-level pedestrian movement of a psychological crowd, yet this has not been extended to explore behaviour when two large psychological groups are co-present. This study investigates how the presence of large groups with different social identities can affect pedestrian behaviour when walking in counterflow. Participants (N = 54) were divided into two groups and primed to have identities as either ‘team A’ or ‘team B’. The trajectories of all participants were tracked to compare the movement of team A when walking alone to when walking in counterflow with team B, based on their i) speed of movement and distance walked, and ii) proximity to other participants. In comparison to walking alone, the presence of another group led team A to collectively self-organise, reducing their speed and distance walked in order to walk more closely together with ingroup members. We discuss the importance of incorporating social identities into pedestrian group dynamics for empirically validated simulations of counterflow scenarios.
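    As a hedged illustration of the trajectory measures mentioned above (speed, distance walked and proximity between participants), the following sketch computes them from tracked position samples; the data layout and sampling interval are assumptions, not the authors' analysis code.

        # Illustrative sketch: walking speed, distance walked, and proximity
        # (mean nearest-neighbour distance) from tracked positions sampled at
        # a fixed frame interval. Data layout is assumed, not the study's own.
        import numpy as np

        def distance_and_speed(traj, dt):
            """traj: (n_frames, 2) array of x,y positions in metres; dt: seconds per frame."""
            steps = np.linalg.norm(np.diff(traj, axis=0), axis=1)
            total_distance = steps.sum()
            mean_speed = total_distance / (dt * len(steps))
            return total_distance, mean_speed

        def mean_nearest_neighbour_distance(positions):
            """positions: (n_people, 2) array of x,y positions at a single frame."""
            diffs = positions[:, None, :] - positions[None, :, :]
            dists = np.linalg.norm(diffs, axis=-1)
            np.fill_diagonal(dists, np.inf)      # ignore self-distances
            return dists.min(axis=1).mean()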