282 research outputs found

    Machine-Insect Interface: Spatial Navigation of a Mobile Robot by a Drosophila

    Machine-insect interfaces have been studied in detail in the past few decades, and animal-machine interfaces more broadly have been developed in many forms. In our study, we develop a machine-insect interface in which an untethered fruit fly (Drosophila melanogaster) is tracked to remotely control a mobile robot. We develop the Active Omni-directional Treadmill (AOT) and integrate it into the mobile robot to create the interface between the robot and the fruit fly. In this system, a fruit fly walks on top of a transparent ball. As the fly walks, we track its position using dark-field imaging. The fly's displacement is balanced out by a counter-displacement of the transparent ball, actuated by the omni-directional wheels, so that the fly stays at the same position on the ball. The mobile robot then navigates spatially based on the fly's movements. The Robot Operating System (ROS) is used to wirelessly interface the ball tracker with the mobile robot. This study will help in investigating the fly's behavior in different situations, such as its response to a physical or virtual stimulus. Future work on this project will include imaging brain activity in the Drosophila as it spatially navigates towards a stimulus.
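The closed loop described in this abstract (track the fly, counter-rotate the ball to cancel its displacement, and forward the same displacement to the robot) can be sketched roughly as below. All function names, the proportional gain, and the velocity scale are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch of the AOT closed loop: the fly's tracked displacement
# on the ball is cancelled by an equal-and-opposite ball command, while the
# same displacement, scaled up, drives the mobile robot (e.g. via a ROS topic).

def counter_displacement(fly_dx: float, fly_dy: float, gain: float = 1.0):
    """Ball velocity command that cancels the fly's displacement (assumed P-control)."""
    return (-gain * fly_dx, -gain * fly_dy)

def robot_command(fly_dx: float, fly_dy: float, scale: float = 10.0):
    """Scale the fly's walking vector up to a robot velocity command."""
    return (scale * fly_dx, scale * fly_dy)

def control_step(fly_dx: float, fly_dy: float):
    """One iteration of the loop: ball counter-motion plus robot motion."""
    return counter_displacement(fly_dx, fly_dy), robot_command(fly_dx, fly_dy)

# Example: the fly walks 0.2 mm along +x; the ball rolls back by the same
# amount, and the robot receives a forward velocity command.
ball_cmd, robot_cmd = control_step(0.2, 0.0)
```

In a real system the ball command would go to the omni-wheel motor drivers and the robot command would be published as a ROS velocity message; this sketch only shows the proportional mapping between the two.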

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system in which multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and receive tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time, and lets users feel objects through passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles such as walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience in which interactions with virtual objects are mediated by hand-held input devices or hand gestures, and users see only a representation of their hands floating in front of the camera from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.
    Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
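Once the virtual world is built on a 3D scan of the room, the real-to-virtual correspondence described above reduces to one rigid transform obtained from registration; a tracked joint in room coordinates maps directly to virtual coordinates. The following is a minimal sketch under that assumption; the yaw/translation calibration values are invented for illustration.

```python
# Hypothetical real-to-virtual mapping: a rigid transform (yaw about the
# vertical axis plus a translation) carries tracked skeleton joints from
# room coordinates into the scanned virtual scene.

import math

def make_transform(yaw_rad: float, tx: float, ty: float, tz: float):
    """Return a function applying a yaw rotation and translation to a 3D point."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    def apply(p):
        x, y, z = p
        return (c * x - s * z + tx, y + ty, s * x + c * z + tz)
    return apply

# Example calibration: zero yaw and a 1 m offset along x. A tracked wrist
# at the room origin (1.2 m high) lands 1 m into the virtual scene.
real_to_virtual = make_transform(0.0, 1.0, 0.0, 0.0)
virtual_wrist = real_to_virtual((0.0, 1.2, 0.0))
```

In practice the transform would come from registering the live tracking frame against the 3D scan, and every joint of every user's skeleton would be pushed through it each frame.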

    Locomotion in virtual environments and analysis of a new virtual walking device

    This thesis investigates user interfaces for locomotion in virtual environments (VEs). It first surveys virtual environments and user interfaces, then concentrates on locomotion interfaces, specifically the Omni-Directional Treadmill (ODT) (Darken and Cockayne, 1997) and a new virtual walking device, LocoX, developed at the MOVES Institute, Naval Postgraduate School. It analyzes and compares the ODT and LocoX in terms of human ability requirements (HARs), and then compares the results of that analysis to real-world locomotion. The analysis indicates that LocoX, a new way of exploring VEs, provides a closer match than the ODT to real locomotion on some subtasks in VEs, and produces a relatively closer representation of some subtasks of real-world locomotion. The thesis concludes that LocoX has great potential and that the locomotion it provides is realistic enough to simulate certain kinds of movements inherent in real-world locomotion. LocoX still requires maturation and development, but is nonetheless a viable locomotion technique for VEs and future game-based simulations.
    http://archive.org/details/locomotioninvirt109452226
    Lieutenant Junior Grade, Turkish Navy
    Approved for public release; distribution is unlimited

    A multi-modal dance corpus for research into real-time interaction between humans in online virtual environments

    We present a new, freely available, multimodal corpus for research into, amongst other areas, real-time realistic interaction between humans in online virtual environments. The corpus targets an online dance-class scenario in which students, with avatars driven by whatever 3D capture technology is locally available to them, can learn choreographies with teacher guidance in an online virtual ballet studio. Accordingly, the corpus consists of student/teacher dance choreographies captured concurrently at two different sites using a variety of media modalities, including synchronised audio rigs, multiple cameras, wearable inertial measurement devices and depth sensors. Each of the several dancers in the corpus performs a number of fixed choreographies, which are graded according to specific evaluation criteria, and ground-truth dance choreography annotations are provided. Furthermore, for unsynchronised sensor modalities, the corpus includes distinctive events for data-stream synchronisation. Although the corpus is tailored specifically to the online dance-class scenario, the data is free to download and use for any research and development purposes.
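The event-based synchronisation mentioned above can be sketched simply: an unsynchronised modality is shifted onto a reference clock using the offset between a distinctive event's timestamp in each stream. The timestamps and the clap example here are illustrative assumptions, not values from the corpus.

```python
# Hypothetical event-based stream alignment: shift one sensor's timestamps
# onto the reference clock using a shared distinctive event.

def align_stream(timestamps, event_time_local, event_time_reference):
    """Return timestamps shifted so the local event lines up with the reference."""
    offset = event_time_reference - event_time_local
    return [t + offset for t in timestamps]

# Example: a hand-clap recorded at t=3.0 s by a depth sensor and at t=5.5 s
# on the reference audio rig implies a +2.5 s shift for the depth stream.
aligned = align_stream([3.0, 3.25, 3.5], 3.0, 5.5)
```

Real alignment would also need to handle clock drift between devices, which a single-offset shift like this does not model.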