
    Implementation of Real Time Image Processing for a Human Eye Computer Interaction System

    People with physical disabilities cannot fully enjoy the benefits provided by computer systems, because the conventional mouse and keyboard were designed for able-bodied users. A number of barriers have stood in the way of integrating eye tracking into everyday applications, including the intrusiveness, robustness, availability, and price of eye-tracking systems. Human eye-computer interaction is important because it reduces the communication barriers between human and machine. The goal of this thesis is to lower these barriers so that eye tracking can be used to enhance current human-computer interfaces. The main aim of the proposed system is to design and implement a human-computer interaction system that tracks the direction of the human gaze; pupil detection and tracking is an important step in developing such a system, and is used to identify the gaze direction of the user’s eye (right, left, up, and down). This work develops a human-computer interaction system based on iris tracking, together with a novel method to control the computer mouse cursor with the eyes: cursor movement follows the position where the gaze is focused, and a mouse click is simulated by a blinking action.
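The abstract does not give implementation details, but a minimal pupil-detection and gaze-direction sketch along the lines it describes might look like the following (Python with OpenCV). The threshold value, the `detect_pupil`/`gaze_direction` helpers, and the fixed direction cut-offs are illustrative assumptions, not the thesis implementation.

```python
# Minimal pupil-detection sketch (illustrative, not the thesis implementation):
# threshold the eye region, take the darkest blob, and use its centroid as the
# pupil centre; map the centroid to a coarse gaze direction.
import cv2
import numpy as np

def detect_pupil(eye_gray: np.ndarray):
    """Return the (x, y) centroid of the darkest blob in a grayscale eye image, or None."""
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    # The pupil is the darkest region; a low fixed threshold (assumed here) isolates it.
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def gaze_direction(pupil_xy, eye_shape):
    """Map the pupil centroid to a coarse direction (left/right/up/down/centre)."""
    h, w = eye_shape[:2]
    x, y = pupil_xy
    if x < 0.35 * w:
        return "left"
    if x > 0.65 * w:
        return "right"
    if y < 0.35 * h:
        return "up"
    if y > 0.65 * h:
        return "down"
    return "centre"
```

In a full system, the returned direction (or the raw centroid) would drive cursor movement, and a blink detector would trigger the click, as the abstract describes.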

    Brain Control of Movement Execution Onset Using Local Field Potentials in Posterior Parietal Cortex

    The precise control of movement execution onset is essential for safe and autonomous cortical motor prosthetics. A recent study from the parietal reach region (PRR) suggested that the local field potentials (LFPs) in this area might be useful for decoding execution time information because of the striking difference in the LFP spectrum between the plan and execution states (Scherberger et al., 2005). More specifically, the LFP power in the 0–10 Hz band sharply rises while the power in the 20–40 Hz band falls as the state transitions from plan to execution. However, a change of visual stimulus immediately preceded reach onset, raising the possibility that the observed spectral change reflected the visual event instead of the reach onset. Here, we tested this possibility and found that the LFP spectrum change was still time locked to the movement onset in the absence of a visual event in self-paced reaches. Furthermore, we successfully trained the macaque subjects to use the LFP spectrum change as a "go" signal in a closed-loop brain-control task in which the animals only modulated the LFP and did not execute a reach. The execution onset was signaled by the change in the LFP spectrum while the target position of the cursor was controlled by the spike firing rates recorded from the same site. The results corroborate that the LFP spectrum change in PRR is a robust indicator for the movement onset and can be used for control of execution onset in a cortical prosthesis.
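As a rough illustration of the spectral cue described above (not the authors' decoder), one could estimate low-band and high-band LFP power over sliding windows and flag a "go" event when the 0–10 Hz to 20–40 Hz power ratio crosses a threshold; the window length, `ratio_thresh`, and function names below are assumptions for the sketch only.

```python
# Hedged sketch: detect a plan-to-execution transition from LFP band power,
# based on the reported rise in 0-10 Hz power and fall in 20-40 Hz power.
# The window length and threshold are illustrative, not values from the paper.
import numpy as np
from scipy.signal import welch

def band_power(segment: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Integrated Welch PSD of one LFP segment within [lo, hi] Hz."""
    freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), 256))
    band = (freqs >= lo) & (freqs <= hi)
    return np.trapz(psd[band], freqs[band])

def detect_go(lfp: np.ndarray, fs: float, win_s: float = 0.25, ratio_thresh: float = 2.0):
    """Return the first sample index whose window shows a high low/high band-power ratio."""
    win = int(win_s * fs)
    for i in range(0, len(lfp) - win, win):
        seg = lfp[i:i + win]
        low = band_power(seg, fs, 0.0, 10.0)    # rises at execution onset
        high = band_power(seg, fs, 20.0, 40.0)  # falls at execution onset
        if high > 0 and low / high > ratio_thresh:
            return i
    return None
```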

    Spatially valid proprioceptive cues improve the detection of a visual stimulus

    Vision and proprioception are the main sensory modalities that convey hand location and direction of movement. Fusion of these sensory signals into a single robust percept is now well documented. However, it is not known whether these modalities also interact in the spatial allocation of attention, which has been demonstrated for other modality pairings. The aim of this study was to test whether proprioceptive signals can spatially cue a visual target to improve its detection. Participants were instructed to use a planar manipulandum in a forward reaching action and determine during this movement whether a near-threshold visual target appeared at either of two lateral positions. The target presentation was followed by a masking stimulus, which made its possible location unambiguous, but not its presence. Proprioceptive cues were given by applying a brief lateral force to the participant’s arm, either in the same direction (validly cued) or in the opposite direction (invalidly cued) to the on-screen location of the mask. The d′ detection rate of the target increased when the direction of the proprioceptive stimulus was compatible with the location of the visual target compared to when it was incompatible. These results suggest that proprioception influences the allocation of attention in visual space.
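For readers unfamiliar with the d′ measure reported above, it is the standard signal-detection sensitivity index: the difference between the z-transformed hit and false-alarm rates. The sketch below uses invented counts purely for illustration, not the study's data.

```python
# Standard signal-detection d-prime computation: d' = z(hit rate) - z(false-alarm rate).
# The counts in the example are made up for illustration only.
from scipy.stats import norm

def d_prime(hits: int, misses: int, false_alarms: int, correct_rejections: int) -> float:
    # Log-linear correction keeps the z-scores finite when a rate would be 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# A condition with more hits at the same false-alarm rate yields a higher d'.
print(d_prime(hits=42, misses=8, false_alarms=10, correct_rejections=40))   # higher sensitivity
print(d_prime(hits=30, misses=20, false_alarms=10, correct_rejections=40))  # lower sensitivity
```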