4,174 research outputs found

    Unobtrusive and pervasive video-based eye-gaze tracking

    Eye-gaze tracking has long been considered a desktop technology that finds its use inside the traditional office setting, where the operating conditions may be controlled. Nonetheless, recent advancements in mobile technology and a growing interest in capturing natural human behaviour have motivated an emerging interest in tracking eye movements within unconstrained real-life conditions, referred to as pervasive eye-gaze tracking. This critical review focuses on emerging passive and unobtrusive video-based eye-gaze tracking methods in the recent literature, with the aim of identifying the different research avenues being followed in response to the challenges of pervasive eye-gaze tracking. Different eye-gaze tracking approaches are discussed in order to bring out their strengths and weaknesses, and to identify any limitations within the context of pervasive eye-gaze tracking that have yet to be considered by the computer vision community.
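
    As a concrete point of reference for the controlled-setting pipelines that the review contrasts with pervasive tracking, below is a minimal sketch of one classical video-based approach: locate the pupil centre in an eye image, then map image coordinates to screen coordinates with a polynomial fitted during a calibration phase. The function names, threshold values, and second-order mapping are illustrative assumptions, not methods taken from the review.

```python
# Minimal sketch of a classical video-based gaze pipeline:
# pupil-centre detection plus polynomial calibration mapping.
import cv2
import numpy as np

def pupil_centre(eye_gray):
    """Estimate the pupil centre as the centroid of the darkest blob."""
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)  # illustrative threshold
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def fit_calibration(pupil_xy, screen_xy):
    """Least-squares fit of a second-order polynomial mapping, one column per screen axis."""
    px, py = np.asarray(pupil_xy, dtype=float).T
    A = np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_xy, dtype=float), rcond=None)
    return coeffs  # shape (6, 2)

def gaze_point(pupil, coeffs):
    """Map a pupil-centre estimate to an on-screen gaze point."""
    px, py = pupil
    a = np.array([1.0, px, py, px * py, px**2, py**2])
    return a @ coeffs
```

    Such pipelines work well when illumination and head pose are constrained, which is exactly the assumption that breaks down in the pervasive settings the review is concerned with.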

    A novel approach to the control of quad-rotor helicopters using fuzzy-neural networks

    Quad-rotor helicopters are agile aircraft which are lifted and propelled by four rotors. Unlike traditional helicopters, they do not require a tail rotor to control yaw, but can instead use four smaller fixed-pitch rotors. However, without an intelligent control system it is very difficult for a human to successfully fly and manoeuvre such a vehicle. Thus, most recent research has focused on small unmanned aerial vehicles, so that advanced embedded control systems can be developed to control these aircraft. Vehicles of this nature are very useful in situations that require unmanned operations, for instance performing tasks in dangerous and/or inaccessible environments that could put human lives at risk. This research demonstrates a consistent way of developing a robust adaptive controller for quad-rotor helicopters using fuzzy-neural networks, creating an intelligent system that is able to monitor and control the non-linear, multi-variable flying states of the quad-rotor, enabling it to adapt to changing environmental conditions and to learn from past missions. Firstly, an analytical dynamic model of the quad-rotor helicopter was developed and simulated using Matlab/Simulink, and the behaviour of the quad-rotor helicopter in response to voltage excitation was assessed. Secondly, a 3-D model with the same parameter values as the analytical dynamic model was developed using Solidworks, and Computational Fluid Dynamics (CFD) was then used to simulate and analyse the effects of external disturbances on the control and performance of the quad-rotor helicopter. Verification and validation of the two models were carried out by comparing the simulation results with real flight experiment results. The need for more reliable and accurate simulation data led to the development of a neural network error compensation system, which was embedded in the simulation system to correct the minor discrepancies found between the simulation and experiment results. Data obtained from the simulations were then used to train a fuzzy-neural system, made up of a hierarchy of controllers, to control the attitude and position of the quad-rotor helicopter. The success of the project was measured against the quad-rotor's ability to adapt to wind of different speeds and directions by adjusting the speeds of the rotors to compensate for the disturbance. The simulation results show that the fuzzy-neural controller is sufficient to achieve attitude and position control of the quad-rotor helicopter in different weather conditions, paving the way for future real-time applications.
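
    To illustrate the kind of simulation-and-control loop the abstract describes, the sketch below integrates a simplified rigid-body attitude model under a plain PD controller, which stands in for the fuzzy-neural hierarchy whose internal structure is not given here. The inertia values, gains, time step, and wind torque are assumptions chosen only to make the example run.

```python
# Toy attitude-control loop for a quad-rotor: PD torque commands driving a
# simplified roll/pitch/yaw model under a constant wind disturbance torque.
import numpy as np

I = np.array([4.9e-3, 4.9e-3, 8.8e-3])      # assumed body inertias (kg*m^2)
KP = np.array([0.8, 0.8, 0.4])              # assumed proportional gains
KD = np.array([0.15, 0.15, 0.1])            # assumed derivative gains
DT, STEPS = 0.002, 5000                     # 2 ms step, 10 s of flight

def simulate(target=np.radians([10.0, -5.0, 0.0]), wind_torque=np.zeros(3)):
    """Integrate the simplified attitude dynamics under PD torque commands."""
    angles = np.zeros(3)                    # roll, pitch, yaw (rad)
    rates = np.zeros(3)                     # angular rates (rad/s)
    for _ in range(STEPS):
        torque = KP * (target - angles) - KD * rates + wind_torque
        rates += (torque / I) * DT          # explicit Euler step, gyroscopic cross-terms ignored
        angles += rates * DT
    return np.degrees(angles)

if __name__ == "__main__":
    print("final attitude (deg):", simulate(wind_torque=np.array([0.01, 0.0, 0.0])))
```

    With the constant wind torque, the fixed-gain PD controller settles with a small steady-state roll offset; removing that kind of residual error under changing disturbances is what an adaptive fuzzy-neural controller of the sort described above is intended to achieve.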

    Compensatory eye movements in mice


    Microgravity vestibular investigations (10-IML-1)

    Our perception of how we are oriented in space depends on the interaction of virtually every sensory system. For example, to move about in our environment we integrate inputs in our brain from the visual, haptic (kinesthetic, proprioceptive, and cutaneous), and auditory systems, and from the labyrinths. In addition to this multimodal system for orientation, our expectations about the direction and speed of our chosen movement are also important. Changes in our environment and in the way we interact with new stimuli will result in a different interpretation by the nervous system of the incoming sensory information, and we will adapt to the change in appropriate ways. Because our orientation system is adaptable and complex, it is often difficult to trace a response or change in behavior to any one source of information in this synergistic orientation system. However, with a carefully designed investigation, it is possible to measure signals at the appropriate level of response (both electrophysiological and perceptual) and determine the effect that stimulus rearrangement has on our sense of orientation. The environment of orbital flight represents the stimulus rearrangement that is our immediate concern. The Microgravity Vestibular Investigations (MVI) are a group of experiments designed to investigate the effects of orbital flight, and of the return to Earth, on our orientation system.

    Image Simulation in Remote Sensing

    Remote sensing is being actively researched in the environmental, military, and urban-planning fields through technologies such as monitoring of natural climate phenomena on the Earth, land cover classification, and object detection. Recently, satellites equipped with observation cameras of various resolutions have been launched, and remote sensing images are acquired by various observation methods, including cluster satellites. However, the atmospheric and environmental conditions present in the observed scene degrade image quality or interrupt the capture of information about the Earth's surface. One way to overcome this is to generate synthetic images through image simulation. Synthetic images can be generated using statistical or knowledge-based models, or using spectral and optics-based models, to create a simulated image in place of an image that could not be obtained at the required time. The proposed methodologies offer an economical means of generating image learning materials and time-series data through image simulation. The six published articles cover various topics and applications central to remote sensing image simulation. Although submission to this Special Issue is now closed, the need for further in-depth research and development related to image simulation at high spatial and spectral resolution, sensor fusion, and colorization remains. I would like to take this opportunity to express my most profound appreciation to the MDPI Book staff, the editorial team of the Applied Sciences journal, especially Ms. Nimo Lang, the assistant editor of this Special Issue, the talented authors, and the professional reviewers.
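
    As a rough illustration of the kind of forward model used in image simulation, the sketch below degrades a synthetic reflectance scene with an optical point-spread function, resamples it to a coarser ground sample distance, and adds haze and sensor noise. The parameter values and function names are illustrative assumptions, not taken from any article in the Special Issue.

```python
# Toy sensor forward model: PSF blur, downsampling to the sensor GSD,
# additive atmospheric path radiance (haze) and Gaussian sensor noise.
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_sensor_image(scene, psf_sigma=1.5, gsd_factor=4,
                          haze=0.08, noise_std=0.01, seed=0):
    """Return a simulated low-resolution observation of `scene` (values in [0, 1])."""
    rng = np.random.default_rng(seed)
    blurred = gaussian_filter(scene, sigma=psf_sigma)      # optics / point-spread function
    lowres = blurred[::gsd_factor, ::gsd_factor]           # coarser ground sample distance
    observed = (1.0 - haze) * lowres + haze                # simple haze / path radiance term
    observed += rng.normal(0.0, noise_std, lowres.shape)   # sensor noise
    return np.clip(observed, 0.0, 1.0)

if __name__ == "__main__":
    scene = np.zeros((256, 256))
    scene[64:192, 64:192] = 0.9                            # bright synthetic target
    print(simulate_sensor_image(scene).shape)              # (64, 64)
```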

    Deep into the Eyes: Applying Machine Learning to improve Eye-Tracking

    Eye-tracking has been an active research area with applications in personal and behavioral studies, medical diagnosis, virtual reality, and mixed reality. Improving the robustness, generalizability, accuracy, and precision of eye-trackers while maintaining privacy is crucial. Unfortunately, many existing low-cost portable commercial eye trackers suffer from signal artifacts and a low signal-to-noise ratio. These trackers are highly dependent on low-level features such as pupil edges or diffused bright spots in order to precisely localize the pupil and corneal reflection. As a result, they are not reliable for studying eye movements that require high precision, such as microsaccades, smooth pursuit, and vergence. Additionally, these methods suffer from reflective artifacts and occlusion of the pupil boundary by the eyelid, and often require a manual update of person-dependent parameters to identify the pupil region. In this dissertation, I demonstrate (I) a new method to improve precision while maintaining the accuracy of head-fixed eye trackers by combining velocity information from iris textures across frames with position information, (II) a generalized semantic segmentation framework for identifying eye regions, with a further extension to identify ellipse fits on the pupil and iris, (III) a data-driven rendering pipeline to generate a temporally contiguous synthetic dataset for use in many eye-tracking applications, and (IV) a novel strategy to preserve privacy in eye videos captured as part of the eye-tracking process. My work also provides a foundation for future research by addressing critical questions such as the suitability of synthetic datasets for improving eye-tracking performance in real-world applications, and ways to improve the precision of future commercial eye trackers with improved camera specifications.
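
    Contribution (I) combines a precise frame-to-frame velocity estimate from iris texture with an absolute but noisier position estimate. The sketch below illustrates the general idea with a simple complementary filter; the filter form, sampling rate, noise levels, and gain are assumptions for illustration, not the dissertation's actual method.

```python
# Blend a noisy absolute gaze-position signal with a precise velocity
# estimate: integrate velocity for precision, correct drift with position.
import numpy as np

def fuse(positions, velocities, dt=1 / 250.0, gain=0.05):
    """Complementary fusion of position (deg) and velocity (deg/s) samples."""
    fused = np.empty_like(positions)
    fused[0] = positions[0]
    for k in range(1, len(positions)):
        predicted = fused[k - 1] + velocities[k] * dt             # precise, but drifts
        fused[k] = predicted + gain * (positions[k] - predicted)  # slow drift correction
    return fused

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.arange(0, 1, 1 / 250.0)
    true_pos = 5.0 * np.sin(2 * np.pi * 0.5 * t)                  # smooth-pursuit-like signal (deg)
    noisy_pos = true_pos + rng.normal(0, 0.3, t.size)             # noisy absolute position estimate
    vel = np.gradient(true_pos, t) + rng.normal(0, 0.5, t.size)   # precise velocity estimate
    print("RMS error, raw position:", np.sqrt(np.mean((noisy_pos - true_pos) ** 2)))
    print("RMS error, fused       :", np.sqrt(np.mean((fuse(noisy_pos, vel) - true_pos) ** 2)))
```

    In this toy example the fused trace shows a lower RMS error than the raw position estimate, because the velocity term carries most of the high-frequency information while the position term only corrects slow drift.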

    Feed-forward and visual feedback control of head roll orientation in wasps (Polistes humilis, Vespidae, Hymenoptera)

    Flying insects keep their visual system horizontally aligned, suggesting that gaze stabilization is a crucial first step in flight control. Unlike flies, hymenopteran insects such as bees and wasps do not have halteres that provide fast, feed-forward angular…