
    A Biomechanical Model for the Development of Myoelectric Hand Prosthesis Control Systems

    Advanced myoelectric hand prostheses aim to reproduce as much of the human hand's functionality as possible. Development of the control system of such a prosthesis is strongly connected to its mechanical design; the control system requires accurate information on the prosthesis' structure and the surrounding environment, which can make development difficult without a finalized mechanical prototype. This paper presents a new framework for the development of electromyographic hand control systems, consisting of a prosthesis model based on the biomechanical structure of the human hand. The model's dynamic structure uses an ellipsoidal representation of the phalanges. Other features include underactuation in the fingers and thumb, modeled with bond graphs, and a viscoelastic contact model. The model's functions are demonstrated by the execution of lateral and tripod grasps, and evaluated with regard to joint dynamics and applied forces. Finally, extensions are suggested through which the model could also support mechanical design and patient training.
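
    The abstract does not reproduce the paper's contact equations; as an illustration only, the sketch below implements a Kelvin-Voigt style viscoelastic contact law of the general kind described, with hypothetical stiffness and damping values.

    ```python
    def contact_force(penetration, penetration_rate, k=1e4, c=50.0):
        """Kelvin-Voigt viscoelastic contact: a spring and damper in parallel.

        penetration      -- interpenetration depth between phalanx and object (m)
        penetration_rate -- time derivative of the penetration (m/s)
        k, c             -- illustrative stiffness (N/m) and damping (N*s/m) values
        """
        if penetration <= 0.0:           # bodies are not in contact
            return 0.0
        force = k * penetration + c * penetration_rate
        return max(force, 0.0)           # contact can push but never pull
    ```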

    Exploring the Front Touch Interface for Virtual Reality Headsets

    In this paper, we propose a new interface for virtual reality headsets: a touchpad on the front of the headset. To demonstrate the feasibility of the front touch interface, we built a prototype device, explored the expansion of the VR UI design space, and performed various user studies. We started with preliminary tests to see how intuitively and accurately people can interact with the front touchpad. Then, we experimented with various user interfaces such as binary selection, a typical menu layout, and a keyboard. The Two-Finger and Drag-n-Tap techniques were also explored to find the appropriate selection technique. As a low-cost, lightweight, and low-power technology, a touch sensor can make an ideal interface for mobile headsets. The front touch area is also large enough to allow a wide range of interaction types, such as multi-finger interactions. With this novel front touch interface, we pave the way for new virtual reality interaction methods.
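
    The abstract does not spell out how Drag-n-Tap works; assuming it separates a highlight-moving drag from a confirming tap, the sketch below shows one way such events could be distinguished, with threshold values that are purely illustrative.

    ```python
    import math

    TAP_MAX_DURATION = 0.25   # seconds; assumed threshold, not from the paper
    TAP_MAX_TRAVEL = 10.0     # pixels; assumed threshold, not from the paper

    def classify_touch(t_down, t_up, x_down, y_down, x_up, y_up):
        """Label a completed touch on the front touchpad as a tap or a drag."""
        duration = t_up - t_down
        travel = math.hypot(x_up - x_down, y_up - y_down)
        if duration <= TAP_MAX_DURATION and travel <= TAP_MAX_TRAVEL:
            return "tap"    # e.g. confirm the currently highlighted item
        return "drag"       # e.g. move the highlight across items
    ```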

    Design and Evaluation of Menu Systems for Immersive Virtual Environments

    Interfaces for system control tasks in virtual environments (VEs) have not been extensively studied. This paper focuses on various types of menu systems to be used in such environments. We describe the design of the TULIP menu, a menu system using Pinch Gloves™, and compare it to two common alternatives: floating menus and pen-and-tablet menus. These three menus were compared in an empirical evaluation. The pen-and-tablet menu was found to be significantly faster, while users had a preference for TULIP. Subjective discomfort levels were also higher with the floating and pen-and-tablet menus.
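
    The abstract does not describe TULIP's finger mapping; in the published design, three menu items are mapped to the index, middle, and ring fingers, with a "more" slot on the pinky that pages to the next three. The sketch below is a minimal, hypothetical rendering of that scheme.

    ```python
    FINGERS = ("index", "middle", "ring")   # three items shown at a time

    def tulip_select(items, page, pinched_finger):
        """Resolve a thumb-to-finger pinch into a menu selection.

        Returns (selected_item_or_None, new_page). A pinky pinch is the
        'more' slot and advances to the next page of three items.
        """
        pages = -(-len(items) // 3)          # ceiling division: total pages
        if pinched_finger == "pinky":
            return None, (page + 1) % pages
        index = page * 3 + FINGERS.index(pinched_finger)
        return (items[index] if index < len(items) else None), page
    ```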

    A virtual environment for the design and simulated construction of prefabricated buildings

    The construction industry has acknowledged that its current working practices are in need of substantial improvements in quality and efficiency, and has identified that computer modelling techniques and the use of prefabricated components can help reduce times and costs and minimise the defects and problems of on-site construction. This paper describes a virtual environment to support the design and construction of buildings from prefabricated components and the simulation of their construction sequence according to a project schedule. The design environment can import a library of 3-D models of prefabricated modules that can be used to interactively design a building. Using Microsoft Project, the construction schedule of the designed building can be altered, with this information feeding back to the construction simulation environment. Within this environment the order of construction can be visualised using virtual machines. Novel aspects of the system are that it provides a single 3-D environment in which the user can construct a design with minimal interaction, through automatic constraint recognition, and view a real-time simulation of the construction process. This takes the research area a step forward from systems that only allow the planner to view the construction at certain stages and do not provide an animated view of the construction process.
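
    The schedule data and field names below are purely illustrative; the sketch shows how a construction sequence could be replayed in scheduled-start order from tasks exported out of Microsoft Project.

    ```python
    from datetime import date

    # Hypothetical tasks exported from the project schedule.
    tasks = [
        {"component": "wall_panel_A", "start": date(2024, 3, 4)},
        {"component": "roof_module", "start": date(2024, 3, 11)},
        {"component": "wall_panel_B", "start": date(2024, 3, 5)},
    ]

    def construction_sequence(tasks):
        """Yield component IDs in the order their scheduled starts dictate."""
        for task in sorted(tasks, key=lambda t: t["start"]):
            yield task["component"]

    for component in construction_sequence(tasks):
        print(component)   # cue the simulation to place this module next
    ```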

    Embodied Interactions for Spatial Design Ideation: Symbolic, Geometric, and Tangible Approaches

    Computer interfaces are evolving from mere aids for number crunching into active partners in creative processes such as art and design. This is, to a great extent, the result of the mass availability of new interaction technology such as depth sensing, sensor integration in mobile devices, and increasing computational power. We are now witnessing the emergence of a maker culture that can elevate art and design beyond the purview of enterprises and professionals such as trained engineers and artists. Materializing this transformation is not trivial; everyone has ideas, but only a select few can bring them to reality. The challenge is the recognition, and the subsequent interpretation, of human actions as design intent.

    Natural user interfaces for interdisciplinary design review using the Microsoft Kinect

    As markets demand engineered products faster, waiting on the cyclical design processes of the past is not an option. Instead, industry is turning to concurrent design and interdisciplinary teams. When these teams collaborate, engineering CAD tools play a vital role in conceptualizing and validating designs. These tools require significant user investment to master, owing to challenging interfaces and an overabundance of features, which often prevents team members from using them to explore designs. This work presents a method that allows users to interact with a design using intuitive gestures and head tracking, all while keeping the model in a CAD format. Specifically, Siemens' Teamcenter® Lifecycle Visualization Mockup (Mockup) was used to display design geometry while modifications were made through a set of gestures captured by a Microsoft Kinect™ in real time. This proof-of-concept program allowed a user to rotate the scene, activate Mockup's immersive menu, move the immersive wand, and manipulate the view based on head position. This work also evaluates gesture usability and task completion time for the proof-of-concept system. A cognitive-model evaluation method was used to test the premise that gesture-based user interfaces are easier to use and faster to learn than a traditional mouse-and-keyboard interface. Using a cognitive model analysis tool allowed the rapid testing of interaction concepts without the significant overhead of user studies and full development cycles. The analysis demonstrated that the Kinect™ is a feasible interaction mode for CAD/CAE programs, but it also pointed out limitations in the gesture interface's ability to compete, in terms of time, with easily accessible, customizable menu options.
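
    The abstract does not name the specific cognitive model used; the Keystroke-Level Model (KLM) is one common choice, and the sketch below shows how such a time comparison could be set up. The operator times are the classic KLM estimates; modelling a whole gesture as a single pointing operator is a simplifying assumption.

    ```python
    # Classic KLM operator time estimates (seconds).
    KLM_TIMES = {
        "K": 0.28,  # key or button press
        "P": 1.10,  # point with a pointing device
        "H": 0.40,  # home hands onto a device
        "M": 1.35,  # mental preparation
    }

    def predicted_time(operators):
        """Sum operator times for one interaction sequence."""
        return sum(KLM_TIMES[op] for op in operators)

    # Mouse rotate: home to mouse, prepare, point, press-drag.
    mouse_rotate = predicted_time("HMPK")
    # Gesture rotate: prepare, then one continuous arm motion (treated as P).
    gesture_rotate = predicted_time("MP")
    print(f"mouse: {mouse_rotate:.2f}s  gesture: {gesture_rotate:.2f}s")
    ```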

    Electromyogram (EMG) Driven System Based Virtual Reality for Prosthetic and Rehabilitation Devices

    Users of current prosthetic and rehabilitation devices face problems adapting to their new limbs, or receive no bio-feedback despite rehabilitation and retraining, particularly when working with Electromyogram (EMG) signals. By characterizing virtual human limbs as potential prosthetic devices in 3D virtual reality, patients are able to familiarize themselves with their new appendage and its capabilities in a virtual training environment, and to see their movement intentions. This paper presents a Virtual Reality (VR) based design and implementation of a below-shoulder 3D human arm driven by a 10-class EMG motion classification system. The method uses the signal classification output as the control stimulus that drives the virtual prosthetic prototype. A hierarchical design methodology is adopted based on anatomical structure, congruent with the Virtual Reality Modeling Language (VRML) architecture. The resulting simulation is a portable, self-contained VR model paired with an instrumented virtual control-select board capable of actuating any combination of the 10 EMG motion classes, singly or in pairs. The built model allows for multiple degree-of-freedom profiles, as the classes can be activated independently or in conjunction with others, enabling enhanced arm movement.
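
    The class labels and motion names below are illustrative, not taken from the paper; the sketch shows how a 10-class classifier output could be routed to the corresponding animation of the virtual arm.

    ```python
    # Hypothetical mapping from classifier output to virtual-arm motions.
    MOTIONS = {
        0: "rest",
        1: "hand_open",
        2: "hand_close",
        3: "wrist_flexion",
        4: "wrist_extension",
        5: "wrist_pronation",
        6: "wrist_supination",
        7: "elbow_flexion",
        8: "elbow_extension",
        9: "key_grip",
    }

    class VirtualArm:
        """Minimal stand-in for the VRML arm model (assumed interface)."""
        def play_animation(self, motion):
            print(f"animating: {motion}")

    def drive_virtual_arm(class_label, arm):
        """Trigger the animation bound to the classified motion."""
        arm.play_animation(MOTIONS.get(class_label, "rest"))

    drive_virtual_arm(2, VirtualArm())   # -> animating: hand_close
    ```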