    Feedback Control of an Exoskeleton for Paraplegics: Toward Robustly Stable Hands-free Dynamic Walking

    This manuscript presents control of a high-DOF fully actuated lower-limb exoskeleton for paraplegic individuals. The key novelty is the ability for the user to walk without the use of crutches or other external means of stabilization. We harness the power of modern optimization techniques and supervised machine learning to develop a smooth feedback control policy that provides robust velocity regulation and perturbation rejection. The stability and robustness of the proposed approach are evaluated in preliminary tests in the Gazebo simulation environment. In addition, preliminary experimental results with (complete) paraplegic individuals are included for the previous version of the controller.
    Comment: Submitted to IEEE Control Systems Magazine. This version addresses reviewers' concerns about the robustness of the algorithm and the motivation for using such an exoskeleton.
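    The abstract gives only a high-level description of the pipeline (offline optimization producing reference gaits, then supervised learning of a smooth feedback policy). The sketch below illustrates that general idea under stated assumptions: the state and torque definitions, the placeholder training data, and the choice of an MLP regressor are illustrative and are not the authors' implementation.

        # Hypothetical sketch: fit a smooth feedback policy to optimized gait data.
        # The "optimized gaits" here are random stand-ins; a real system would obtain
        # them from trajectory optimization on the exoskeleton model.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        # Assumed state: joint angles/velocities plus a commanded walking speed.
        n_samples, n_state = 5000, 12
        X = rng.normal(size=(n_samples, n_state))            # states sampled along optimized gaits
        v_cmd = rng.uniform(0.2, 0.8, size=(n_samples, 1))   # commanded forward velocity [m/s]
        inputs = np.hstack([X, v_cmd])

        # Assumed targets: joint torques that the offline optimizer would produce
        # (replaced by a synthetic mapping for this self-contained example).
        n_joints = 6
        W_true = rng.normal(size=(n_state + 1, n_joints))
        torques = np.tanh(inputs @ W_true) + 0.01 * rng.normal(size=(n_samples, n_joints))

        # Supervised learning of a smooth state-feedback policy u = pi(x, v_cmd).
        policy = MLPRegressor(hidden_layer_sizes=(64, 64), activation="tanh", max_iter=2000)
        policy.fit(inputs, torques)

        # At run time the policy is evaluated at the measured state each control step.
        u = policy.predict(inputs[:1])
        print(u.shape)  # (1, 6): one torque command per actuated joint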

    Teaching introductory undergraduate Physics using commercial video games

    Commercial video games increasingly use sophisticated physics simulations to create a more immersive experience for players. This also makes them a powerful tool for engaging students in learning physics. We provide examples showing how commercial off-the-shelf games can be used to teach specific topics in introductory undergraduate physics. The examples are selected from a course taught predominantly through the medium of commercial video games.
    Comment: Accepted to Physics Education; Fig. 1 does not render properly in this version.

    For efficient navigational search, humans require full physical movement but not a rich visual scene

    During navigation, humans combine visual information from their surroundings with body-based information from the translational and rotational components of movement. Theories of navigation focus on the role of visual and rotational body-based information, even though experimental evidence shows they are not sufficient for complex spatial tasks. To investigate the contribution of all three sources of information, we asked participants to search a computer-generated “virtual” room for targets. Participants were provided either with visual information alone, or with visual information supplemented by body-based information for all movement (walk group) or for rotational movement only (rotate group). The walk group performed the task with near-perfect efficiency, irrespective of whether a rich or impoverished visual scene was provided. The visual-only and rotate groups were significantly less efficient, and frequently searched parts of the room at least twice. This suggests that full physical movement plays a critical role in navigational search, but that only moderate visual detail is required.

    Automated Game Design Learning

    While general game playing is an active field of research, the learning of game design has tended to be either a secondary goal of such research or the sole domain of humans. We propose a field of research, Automated Game Design Learning (AGDL), whose purpose is to learn game designs directly through interaction with games in the mode that most people experience games: via play. We detail existing work that touches the edges of this field, describe current successful projects in AGDL and the theoretical foundations that enable them, point to promising applications enabled by AGDL, and discuss next steps for this exciting area of study. The key moves of AGDL are to use game programs as the ultimate source of truth about their own design, and to make these design properties available to other systems and avenues of inquiry.
    Comment: 8 pages, 2 figures. Accepted for CIG 201

    The benefits of using a walking interface to navigate virtual environments

    Navigation is the most common interactive task performed in three-dimensional virtual environments (VEs), but it is also a task that users often find difficult. We investigated how body-based information about the translational and rotational components of movement helped participants to perform a navigational search task (finding targets hidden inside boxes in a room-sized space). When participants physically walked around the VE while viewing it on a head-mounted display (HMD), they performed 90% of trials perfectly, comparable to participants who had performed an equivalent task in the real world during a previous study. By contrast, participants performed fewer than 50% of trials perfectly if they used a tethered HMD (moving by physically turning but pressing a button to translate) or a desktop display (no body-based information). This is the most complex navigational task in which a real-world level of performance has been achieved in a VE. Behavioral data indicate that both translational and rotational body-based information are required to accurately update one's position during navigation, and participants who walked tended to avoid obstacles even though collision detection was not implemented and no feedback was provided. A walking interface would bring immediate benefits to a number of VE applications.
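    To illustrate why both components matter, position updating while walking can be modelled as simple path integration: each step's translation is applied along the current heading, which is itself maintained from rotational information. The function below is a generic dead-reckoning sketch of that idea, not code from the study.

        import math

        def integrate_path(steps):
            """Dead-reckoning sketch: accumulate a 2D position from per-step
            rotation (radians) and forward translation (metres)."""
            x, y, heading = 0.0, 0.0, 0.0
            for d_heading, d_forward in steps:
                heading += d_heading                 # rotational body-based information
                x += d_forward * math.cos(heading)   # translational component along current heading
                y += d_forward * math.sin(heading)
            return x, y, heading

        # Example: turn 90 degrees, walk 2 m, turn back -90 degrees, walk 1 m.
        print(integrate_path([(math.pi / 2, 2.0), (-math.pi / 2, 1.0)]))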

    Design of teacher assistance tools in an exploratory learning environment for algebraic generalisation

    The MiGen project is designing and developing an intelligent exploratory environment to support 11- to 14-year-old students in their learning of algebraic generalisation. Deployed within the classroom, the system also provides tools to assist teachers in monitoring students' activities and progress. This paper describes the architectural design of these Teacher Assistance tools and gives a detailed description of one such tool, focussing in particular on the research challenges faced and on the technologies and approaches chosen to implement the necessary functionality given the context of the project.

    Play Area Utilization Optimization for Room-scale Exploration of Virtual Worlds

    Virtual Reality (VR) opens up new possibilities for developers to create immersive worlds and experiences. While it’s possible to craft unique and engaging interactive environments with unprecedented realism, the virtual world is constrained by the real one. Current approaches to player navigation in VR applications include joystick controls, teleportation, and motion-based movement. While these methods are effective at overcoming real-world limitations in certain scenarios, my research introduces a novel approach that leverages room-scale movement, combined with portals, to traverse a given VR world. This work presents algorithms that accurately predict the percentage of the play area utilized, and rules for implementing typical game elements that allow large-scale virtual immersion under real-world constraints.
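    The abstract mentions algorithms that predict the percentage of the play area a player will utilize. One generic way to estimate such a figure, shown below, is to rasterize the planned walking path onto a grid covering the tracked space and count the cells it touches; the grid resolution, footprint radius, and room dimensions are illustrative assumptions, not the algorithm from this work.

        import numpy as np

        def play_area_utilization(path_xy, area_w=3.0, area_h=3.0, cell=0.25, radius=0.3):
            """Estimate the percentage of a rectangular play area (metres) touched by a
            walking path. Each path point marks all grid cells within `radius` of it."""
            nx, ny = int(area_w / cell), int(area_h / cell)
            occupied = np.zeros((nx, ny), dtype=bool)
            xs = (np.arange(nx) + 0.5) * cell          # cell-centre x coordinates
            ys = (np.arange(ny) + 0.5) * cell          # cell-centre y coordinates
            gx, gy = np.meshgrid(xs, ys, indexing="ij")
            for px, py in path_xy:
                occupied |= (gx - px) ** 2 + (gy - py) ** 2 <= radius ** 2
            return 100.0 * occupied.mean()

        # Example: a short L-shaped walk inside a 3 m x 3 m play space.
        path = [(0.5, 0.5), (1.0, 0.5), (1.5, 0.5), (1.5, 1.0), (1.5, 1.5)]
        print(f"{play_area_utilization(path):.1f}% of the play area utilized")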