
    Using remote vision: The effects of video image frame rate on visual object recognition performance

    The process of using remote vision was simulated in order to determine the effects of video image frame rate on performance in the visual recognition of stationary environmental hazards in dynamic video footage of the pedestrian travel environment. Recognition performance was assessed at two video image frame rates, 25 fps and 2 fps, against a range of objective and subjective criteria. The results show that the effect of the frame rate variation on recognition performance is statistically insignificant. This work forms part of the development of a novel navigation system for visually impaired pedestrians. The navigation system includes a remote vision facility, and the visual recognition of environmental hazards by a sighted human guide is a basic activity in aiding the visually impaired user of the system in mobility.
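
    The comparison reported above contrasts recognition performance under two frame-rate conditions (25 fps and 2 fps). As a minimal illustrative sketch only, and not the authors' analysis, hypothetical per-participant recognition scores for the two conditions could be compared with an independent-samples t-test; the scores, group sizes, and 0.05 threshold below are all assumptions.

    ```python
    # Illustrative only: hypothetical recognition scores (proportion of hazards
    # correctly recognised) under the two frame-rate conditions.
    from scipy import stats

    scores_25fps = [0.82, 0.79, 0.88, 0.91, 0.77, 0.85, 0.80, 0.86]
    scores_2fps = [0.80, 0.83, 0.76, 0.89, 0.81, 0.78, 0.84, 0.79]

    # Independent-samples t-test: does frame rate shift mean recognition performance?
    t_stat, p_value = stats.ttest_ind(scores_25fps, scores_2fps)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

    # A p-value above the conventional 0.05 threshold would be judged statistically
    # insignificant, consistent with the finding reported above.
    ```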

    Interface design for a remote guidance system for the blind: using dual-screen displays

    Mobility for visually impaired people remains one of the main challenges facing researchers around the world. Although a number of projects have sought to improve the mobility of visually impaired people, further research is still needed. One of these projects is the Brunel Remote Guidance System (BRGS). BRGS aims to assist visually impaired users (VIUs) in avoiding obstacles and reaching their destinations safely by providing online instructions from a remote sighted guide. This study continues the development of BRGS; its main achievement is the optimisation of the interface design for the system's guide terminal. The optimised interface helps the sighted guide to assist VIUs in avoiding obstacles safely and comfortably during micro-navigation, and to keep them on the right track to their destination during macro-navigation. A content analysis identified the performance factors and their assessment methods for each element of BRGS, and concluded that there is a lack of research on the guide terminal setup and on methods for assessing the sighted guide's performance. Furthermore, neither a model for assessing sighted guide performance nor the use of dual-screen displays was found in the literature or in similar projects. A model was therefore designed as a platform for evaluating sighted guide performance. Based on this model, a computer-based simulation was built and tested, making it ready for the next task: the evaluation of sighted guide performance. The resulting study determined the effects of the dual-screen displays on the recognition performance of 80 participants at the guide terminal. Performance was measured under four different resolution conditions. The study was based on a simulation technique comprising two key performance elements for examining sighted guide performance: macro-navigation and micro-navigation. The results show that the dual-screen displays have an effect on the performance of the sighted guide. The optimum dual-screen setup for the guide terminal consisted of a large digital map display (4CIF, 704 x 576 px) and a small video image display (CIF, 352 x 288 px), one of the four resolution conditions tested. This interface design has been recommended as the final setup for the guide terminal.

    Understanding and stimulating the development of perceptual-motor skills in child bicyclists


    A navigation and object location device for the blind

    Gemstone Team Vision. Team Vision's goal is to create a navigation system for the blind. To achieve this, we took a multi-pronged approach. First, through surveys, we assessed the needs of the blind community and developed a system around those needs. Then, using recent technology, we combined a global positioning system (GPS), an inertial navigation unit (INU), computer vision algorithms, and audio and haptic interfaces into one system. The GPS and INU work together to provide walking directions from building to building when outdoors, while the computer vision algorithms identify and locate objects such as signs and landmarks, both indoors and outdoors. The speech-based interface ties the GPS, INU, and computer vision algorithms together into an interactive audio-based navigation device. Finally, the haptic interface provides an alternative, intuitive directional guidance system. The resulting system guides users to specified buildings and to important objects such as cellular telephones, wallets, or even restroom or exit signs.
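
    As a rough illustration of how GPS fixes and INU dead reckoning might be combined for outdoor guidance (a minimal sketch under assumed data structures and a made-up blending weight, not Team Vision's actual implementation), a simple complementary filter can nudge the INU position estimate toward each incoming GPS fix:

    ```python
    # Hypothetical GPS + INU blending sketch; all values and structures are assumptions.
    import math
    from dataclasses import dataclass

    @dataclass
    class State:
        x: float  # metres east of the starting point
        y: float  # metres north of the starting point

    def dead_reckon(state: State, step_length: float, heading_rad: float) -> State:
        """Advance the estimate from an INU step length and heading (dead reckoning)."""
        return State(state.x + step_length * math.sin(heading_rad),
                     state.y + step_length * math.cos(heading_rad))

    def fuse_gps(state: State, gps_x: float, gps_y: float, alpha: float = 0.2) -> State:
        """Complementary filter: blend the INU estimate with the latest GPS fix.
        alpha is an assumed weight (larger values trust the GPS fix more)."""
        return State((1 - alpha) * state.x + alpha * gps_x,
                     (1 - alpha) * state.y + alpha * gps_y)

    # Hypothetical walk: two INU steps heading roughly north-east, then a GPS fix arrives.
    est = State(0.0, 0.0)
    est = dead_reckon(est, step_length=0.7, heading_rad=math.radians(45))
    est = dead_reckon(est, step_length=0.7, heading_rad=math.radians(45))
    est = fuse_gps(est, gps_x=1.1, gps_y=0.9)
    print(f"fused position estimate: ({est.x:.2f} m east, {est.y:.2f} m north)")
    ```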

    Accessible Autonomy: Exploring Inclusive Autonomous Vehicle Design and Interaction for People who are Blind and Visually Impaired

    Autonomous vehicles are poised to revolutionize independent travel for millions of people experiencing transportation-limiting visual impairments worldwide. However, the current trajectory of automotive technology is rife with roadblocks to accessible interaction and inclusion for this demographic. Inaccessible (visually dependent) interfaces and lack of information access throughout the trip are surmountable, yet nevertheless critical, barriers to this potentially life-changing technology. To address these challenges, the programmatic dissertation research presented here includes ten studies, three published papers, and three submitted papers in high-impact outlets that together address accessibility across the complete trip of transportation. The first paper began with a thorough review of the fully autonomous vehicle (FAV) and blind and visually impaired (BVI) literature, as well as the underlying policy landscape. Results guided pre-journey ridesharing needs among BVI users, which were addressed in paper two via a survey with (n=90) transit service drivers, interviews with (n=12) BVI users, and prototype design evaluations with (n=6) users, all contributing to the Autonomous Vehicle Assistant: an award-winning and accessible ridesharing app. A subsequent study with (n=12) users, presented in paper three, focused on pre-journey mapping to provide critical information access in future FAVs. Accessible in-vehicle interactions were explored in the fourth paper through a survey with (n=187) BVI users. Results prioritized nonvisual information about the trip and indicated the importance of situational awareness. This effort informed the design and evaluation of an ultrasonic haptic HMI intended to promote situational awareness with (n=14) participants (paper five), leading to a novel gestural-audio interface with (n=23) users (paper six). Strong support from users across these studies suggested positive outcomes in pursuit of actionable situational awareness and control. Cumulative results from this dissertation research program represent, to our knowledge, the single most comprehensive approach to FAV accessibility for BVI users to date. By considering both pre-journey and in-vehicle accessibility, results pave the way for autonomous driving experiences that enable meaningful interaction for BVI users across the complete trip of transportation. This new mode of accessible travel is predicted to transform independent travel for millions of people with visual impairment, leading to increased independence, mobility, and quality of life.
