3,505 research outputs found

    GIFT: Gesture-Based Interaction by Fingers Tracking, an Interaction Technique for Virtual Environment

    Three-dimensional (3D) interaction is the plausible form of human interaction inside a Virtual Environment (VE). The rise of Virtual Reality (VR) applications in various domains demands a feasible 3D interface. To ensure immersivity in a virtual space, this paper presents an interaction technique where manipulation is performed by perceptive gestures of the two dominant fingers: thumb and index. Two fingertip thimbles made of paper are used to trace the states and positions of the fingers with an ordinary camera. Based on the positions of the fingers, the basic interaction tasks (selection, scaling, rotation, translation and navigation) are performed by intuitive gestures of the fingers. Because no gestural database is kept, the feature-free detection of the fingers guarantees speedier interactions. Moreover, the system is user-independent and depends neither on the size nor on the color of the user's hand. The technique is implemented for evaluation in a case-study project, Interactions by the Gestures of Fingers (IGF). The IGF application traces finger gestures using OpenCV libraries at the back-end; at the front-end, the objects of the VE are rendered accordingly using the Open Graphics Library (OpenGL). The system was assessed under moderate lighting conditions by a group of 15 users, and the usability of the technique was further investigated in games. Outcomes of the evaluations revealed that the approach is suitable for VR applications both in terms of cost and accuracy.
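    The feature-free detection described above amounts to locating each colored thimble and taking the centroid of its pixels. A minimal sketch of that step, using NumPy on a synthetic frame (the actual IGF pipeline uses OpenCV; the marker colors, patch positions and tolerance below are invented for illustration):

```python
import numpy as np

def track_thimbles(frame, colors, tol=30):
    """Locate colored fingertip thimbles in an RGB frame.

    Returns the (row, col) centroid of the pixels matching each thimble
    color, or None if no pixel matches. No gesture database or hand
    model is needed; only the marker colors are assumed known.
    """
    positions = []
    for color in colors:
        # Pixels within `tol` of the target color on every channel
        mask = np.all(np.abs(frame.astype(int) - color) <= tol, axis=-1)
        ys, xs = np.nonzero(mask)
        positions.append((ys.mean(), xs.mean()) if len(ys) else None)
    return positions

# Synthetic 100x100 frame: a red patch (hypothetical thumb thimble)
# and a blue patch (hypothetical index thimble)
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[20:30, 20:30] = (255, 0, 0)
frame[60:70, 70:80] = (0, 0, 255)
thumb, index = track_thimbles(frame, [(255, 0, 0), (0, 0, 255)])
```

    In a real deployment the same logic would run on camera frames, typically after an RGB-to-HSV conversion for lighting robustness, e.g. via OpenCV's `inRange` and `moments`.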

    SARSCEST (human factors)

    People interact with the processes and products of contemporary technology. Individuals are affected by these in various ways, and individuals shape them. Such interactions come under the label 'human factors'. For those to whom the term is relatively unfamiliar, its domain includes both an applied science and applications of knowledge. It means both research and development, with implications of the research both for basic science and for development. It encompasses not only design and testing but also training and personnel requirements, even though some unwisely try to split these apart, both by name and institutionally. The territory includes more than performance at work, though concentration on that aspect, epitomized in the derivation of the term ergonomics, has overshadowed human factors interest in interactions between technology and the home, health, safety, consumers, children and later life, the handicapped, sports and recreation, education, and travel. Two aspects of technology considered most significant for work performance, systems and automation, and several approaches to these are discussed.

    AN INTEGRATED AUGMENTED REALITY METHOD TO ASSEMBLY SIMULATION AND GUIDANCE

    Ph.D. (Doctor of Philosophy) thesis.

    The Touch Thimble: Providing Fingertip Contact Feedback During Point-Force Haptic Interaction

    Touching a real object with your fingertip provides simultaneous tactile and force feedback, yet most haptic interfaces for virtual environments can convey only one of these two essential modalities. To address this opportunity, we designed, prototyped, and evaluated the Touch Thimble, a new fingertip device that provides the user with the cutaneous sensation of making and breaking contact with virtual surfaces. Designed to attach to the endpoint of an impedance-type haptic interface such as a SensAble Phantom, the Touch Thimble includes a slightly oversize cup that is suspended around the fingertip by passive springs. When the haptic interface applies contact forces from the virtual environment, the springs deflect to allow contact between the user's fingertip and the inner surface of the cup. We evaluated a prototype Touch Thimble against a standard thimble in a formal user study and found that it neither improved nor degraded subjects' ability to recognize smoothly curving surfaces. Although four of the eight subjects preferred it to the standard interface, overall the Touch Thimble made subjects slightly slower at recognizing the presented shapes. Detailed subject comments point out strengths and weaknesses of the current design and suggest avenues for future development of the device.

    The Perception/Action loop: A Study on the Bandwidth of Human Perception and on Natural Human Computer Interaction for Immersive Virtual Reality Applications

    Virtual Reality (VR) is an innovative technology which, in the last decade, has enjoyed widespread success, mainly thanks to the release of low-cost devices that have diversified its domains of application. The current work focuses on the general mechanisms underlying the perception/action loop in VR, in order to improve the design and implementation of applications for training and simulation in immersive VR, especially in the context of Industry 4.0 and the medical field. On the one hand, we want to understand how humans gather and process the information presented in a virtual environment, through an evaluation of the visual system's bandwidth. On the other hand, since the interface has to be a sort of transparent layer allowing trainees to accomplish a task without directing any cognitive effort at the interaction itself, we compare two state-of-the-art solutions for selection and manipulation tasks: a touch-based one, the HTC Vive controllers, and a touchless vision-based one, the Leap Motion. To this aim we have developed ad hoc frameworks and methodologies. The software frameworks consist of VR scenarios in which the experimenter can choose the interaction modality and the headset and set experimental parameters, guaranteeing repeatable experiments under controlled conditions. The methodology includes the evaluation of performance, user experience and preferences, considering both quantitative and qualitative metrics derived from the collection and analysis of heterogeneous data, such as physiological and inertial sensor measurements, timings and self-assessment questionnaires. In general, VR has been found to be a powerful tool able to simulate specific situations in a realistic and involving way, eliciting the user's sense of presence without causing severe cybersickness, at least when interaction is limited to the peripersonal and near-action space.
    Moreover, when designing a VR application, it is possible to manipulate its features to trigger, or avoid triggering, specific emotions and voluntarily create potentially stressful or relaxing situations. Considering the ability of trainees to perceive and process information presented in an immersive virtual environment, results show that when people are given enough time to build a gist of the scene, they are able to recognize a change with 0.75 accuracy when up to 8 elements are in the scene. For interaction, instead, when selection and manipulation tasks do not require fine movements, the controllers and the Leap Motion ensure comparable performance; when tasks are complex, the former turns out to be more stable and efficient, also because the visual and audio feedback provided as a substitute for haptic feedback does not substantially improve performance in the touchless case.

    Aerospace medicine and biology: A continuing bibliography with indexes (supplement 359)

    This bibliography lists 164 reports, articles and other documents introduced into the NASA Scientific and Technical Information System during Jan. 1992. Subject coverage includes: aerospace medicine and physiology, life support systems and man/system technology, protective clothing, exobiology and extraterrestrial life, planetary biology, and flight crew behavior and performance.

    Remote Access and Computerized User Control of Robotic Micromanipulators

    Nano- and micromanipulators are critical research tools in numerous fields, including micro-manufacturing and disease study. Despite their importance, nano- and micromanipulation systems remain inaccessible to many groups due to price and lack of portability. An intuitive and remotely accessible manipulation system helps mitigate this access problem. Previously, the optimal control hardware for single-probe manipulation and the effect of latency on user performance were not well understood. Remote access demands full computerization; graphical user interfaces with networking capabilities were developed to fulfill this requirement and allow the use of numerous hardware controllers. Virtual environments were created to simulate the use of a manipulator with full parametric control and measurement capabilities. Users completed simulated tasks with each device and were surveyed about their perceptions. User performance with a commercial manipulator controller was exceeded by performance with both a computer mouse and a pen tablet. Latency was imposed within the virtual environment to study its effects and to establish guidelines as to which latency ranges are acceptable for long-range remote manipulation. User performance began to degrade noticeably at 100 ms and severely at 400 ms, and performance with the mouse degraded the least as latency increased. A computer vision system for analyzing carbon nanotube arrays was developed so that its computation time could be compared to the acceptable system latency. The system characterizes the arrays to a high degree of accuracy, and most of the measurement types are obtainable fast enough for real-time analysis.
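    Latency of the kind imposed in such a study can be reproduced with a simple delay buffer that returns each input sample a fixed number of frames late. A hedged sketch (the frame counts and neutral fill value are assumptions for illustration, not the study's actual setup):

```python
from collections import deque

class LatencyBuffer:
    """Delay input samples by a fixed number of frames to impose
    artificial input-to-display latency in a virtual environment."""

    def __init__(self, delay_frames):
        # Pre-fill with neutral samples so output is defined from frame 0
        self.buf = deque([None] * delay_frames)

    def push(self, sample):
        """Store the newest input sample and return the delayed one."""
        self.buf.append(sample)
        return self.buf.popleft()

# At a 60 Hz update rate, 6 frames is roughly 100 ms and 24 frames roughly
# 400 ms, the thresholds where performance reportedly degrades; 3 frames
# is used here purely as a toy value.
lag = LatencyBuffer(delay_frames=3)
out = [lag.push(x) for x in range(6)]
# The first 3 outputs are the neutral fill; inputs then emerge 3 frames late
```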

    Practical color-based motion capture

    Thesis (Ph.D.) -- Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2011. Cataloged from PDF version of thesis. Includes bibliographical references (p. 93-101). By Robert Yuanbo Wang.
    Motion capture systems track the 3-D pose of the human body and are widely used for high-quality content creation, gestural user input and virtual reality. However, these systems are rarely deployed in consumer applications due to their price and complexity. In this thesis, we propose a motion capture system built from commodity components that can be deployed in a matter of minutes. Our approach uses one or more webcams and a color garment to track either the user's upper body or hands for motion capture and user input. We demonstrate that custom-designed color garments can simplify difficult computer vision problems and lead to efficient and robust algorithms for hand and upper body tracking. Specifically, our highly descriptive color patterns alleviate ambiguities that are commonly encountered when tracking only silhouettes or edges, allowing us to employ a nearest-neighbor approach to track either the hands or the upper body at interactive rates. We also describe a robust color calibration system that enables our color-based tracking to work against cluttered backgrounds and under multiple illuminants. We demonstrate our system in several real-world indoor and outdoor settings and describe proof-of-concept applications enabled by our system that we hope will provide a foundation for new interactions in computer-aided design, animation control and augmented reality.
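    The nearest-neighbor idea mentioned above can be sketched in a few lines: summarize the observed color pattern as a feature vector and return the pose of the closest database entry. The descriptors and pose labels below are toy values, not the thesis's actual database or feature design:

```python
import numpy as np

def nearest_pose(query_desc, db_descs, db_poses):
    """Nearest-neighbor pose lookup: return the database pose whose
    color-pattern descriptor is closest (in L2 distance) to the query."""
    i = np.argmin(np.linalg.norm(db_descs - query_desc, axis=1))
    return db_poses[i]

# Toy database: 4-D descriptors mapped to made-up hand poses
db_descs = np.array([[0.0, 0, 0, 0],   # "open hand"
                     [1.0, 1, 0, 0],   # "fist"
                     [0.0, 0, 1, 1]])  # "point"
db_poses = ["open hand", "fist", "point"]

# A noisy observation close to the "fist" descriptor
pose = nearest_pose(np.array([0.9, 1.1, 0.1, 0.0]), db_descs, db_poses)
```

    The descriptive color patterns make the descriptors distinctive enough that this brute-force lookup, or an accelerated variant such as a k-d tree, runs at interactive rates.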

    Towards markerless orthopaedic navigation with intuitive Optical See-through Head-mounted displays

    The potential of image-guided orthopaedic navigation to improve surgical outcomes has been well recognised during the last two decades. According to the tracked pose of the target bone, the anatomical information and preoperative plans are updated and displayed to surgeons, so that they can follow the guidance to reach the goal with higher accuracy, efficiency and reproducibility. Despite their success, current orthopaedic navigation systems have two main limitations: for target tracking, artificial markers have to be drilled into the bone and manually calibrated to it, which introduces the risk of additional harm to patients and increases operating complexity; for guidance visualisation, surgeons have to shift their attention from the patient to an external 2D monitor, which is disruptive and can be mentally stressful. Motivated by these limitations, this thesis explores the development of an intuitive, compact and reliable navigation system for orthopaedic surgery. To this end, conventional marker-based tracking is replaced by a novel markerless tracking algorithm, and the 2D display is replaced by a 3D holographic Optical see-through (OST) Head-mounted display (HMD) precisely calibrated to the user's perspective. Our markerless tracking, facilitated by a commercial RGBD camera, is achieved through deep learning-based bone segmentation followed by real-time pose registration. For robust segmentation, a new network is designed and efficiently augmented with a synthetic dataset. Our segmentation network outperforms the state of the art in occlusion robustness, device-agnostic behaviour, and target generalisability. For reliable pose registration, a novel Bounded Iterative Closest Point (BICP) workflow is proposed. The improved markerless tracking achieves a clinically acceptable error of 0.95 deg and 2.17 mm in a phantom test.
    OST displays allow ubiquitous enrichment of the perceived real world with contextually blended virtual aids through semi-transparent glasses. They have been recognised as a suitable visual tool for surgical assistance, since they do not hinder the surgeon's natural eyesight and require no attention shift or perspective conversion. OST calibration is crucial to ensure locationally coherent surgical guidance. Current calibration methods are either prone to human error or hardly applicable to commercial devices. To this end, we propose an offline camera-based calibration method that is highly accurate yet easy to implement in commercial products, and an online alignment-based refinement that is user-centric and robust against user error. The proposed methods prove superior to similar state-of-the-art (SOTA) approaches in calibration convenience and display accuracy. Motivated by the ambition to develop the world's first markerless OST navigation system, we integrated the developed markerless tracking and calibration scheme into a complete navigation workflow designed for femur drilling tasks during knee replacement surgery. We verified the usability of our OST system with an experienced orthopaedic surgeon in a cadaver study. Our test validates the potential of the proposed markerless navigation system for surgical assistance, although further improvement is required for clinical acceptance.
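    Pose registration in an ICP-style workflow such as BICP repeatedly solves a rigid least-squares alignment between corresponding point sets. A minimal sketch of that inner step using the standard Kabsch/SVD solution (this is the generic algorithm, not the thesis's bounded variant, and the toy points are invented):

```python
import numpy as np

def rigid_align(src, dst):
    """One least-squares alignment step of the kind iterated inside
    ICP-style registration: find rotation R and translation t
    minimising ||R @ src_i + t - dst_i|| over corresponding points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)           # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

# Toy check: recover a known 90-degree rotation about z plus a translation
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
src = np.random.default_rng(0).random((10, 3))
dst = src @ R_true.T + np.array([1.0, 2.0, 3.0])
R, t = rigid_align(src, dst)
```

    A full ICP loop alternates this solve with re-estimating correspondences (nearest surface points); a bounded variant additionally constrains where those correspondences may fall.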