
    Framework for Dynamic Evaluation of Muscle Fatigue in Manual Handling Work

    Muscle fatigue is defined as the point at which a muscle can no longer sustain the required force or work output. Overexertion of muscle force and muscle fatigue can induce acute and chronic pain in the human body, and when fatigue accumulates, the resulting functional disability can manifest as musculoskeletal disorders (MSDs). Several posture exposure analysis methods are useful for rating MSD risks, but they are mainly based on static postures; even in existing fatigue evaluation methods, muscle fatigue evaluation is available only for static postures and is not suitable for dynamic working processes. Meanwhile, some existing muscle fatigue models based on physiological models cannot easily be used in industrial ergonomic evaluations. Since the external dynamic load is the most important factor causing muscle fatigue, we propose a new fatigue model within a framework for evaluating fatigue in dynamic working processes. Under this framework, a virtual reality system generates a virtual working environment with which the worker can interact through haptic interfaces and an optical motion capture system. The motion and load information are collected and processed to evaluate the worker's overall workload, based on dynamic muscle fatigue models and other work evaluation criteria, and to provide new information characterizing the arduousness of the task during the design process. Comment: International Conference on Industrial Technology, Chengdu, China (2008)
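The kind of dynamic fatigue evaluation described above can be illustrated with a minimal sketch: the worker's remaining force capacity decays over a motion sequence in proportion to the current relative load. The decay law and the fatigability constant `k` below are illustrative assumptions, not the paper's actual model.

```python
def remaining_capacity(max_force, load_profile, dt=0.1, k=1.0):
    """Integrate a simple dynamic fatigue model over a sampled load
    profile: remaining capacity decays in proportion to the current
    relative capacity times the external load.
    (Illustrative only; k and the decay law are assumptions.)"""
    capacity = max_force
    for load in load_profile:
        # dC/dt = -k * (C / max_force) * load  (forward Euler step)
        capacity += dt * (-k * (capacity / max_force) * load)
    return capacity

# A heavier handling task depletes capacity faster than a lighter one:
light = remaining_capacity(100.0, [20.0] * 50)
heavy = remaining_capacity(100.0, [80.0] * 50)
```

Feeding motion-capture-derived joint loads into such a model, sample by sample, is what makes the evaluation dynamic rather than posture-based.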

    Haptic interfaces: Hardware, software and human performance

    Virtual environments are computer-generated synthetic environments with which a human user can interact to perform a wide variety of perceptual and motor tasks. At present, most virtual environment systems engage only the visual and auditory senses, not the haptic sensorimotor system that conveys the sense of touch and feel of objects in the environment. Computer keyboards, mice, and trackballs constitute relatively simple haptic interfaces. Gloves and exoskeletons that track hand postures have more interaction capabilities and are available on the market. Although desktop and wearable force-reflecting devices have been built and implemented in research laboratories, the current capabilities of such devices are quite limited. To realize the full promise of virtual environments and teleoperation of remote systems, further development of haptic interfaces is critical. In this paper, the status of and research needs in human haptics, technology development, and the interactions between the two are described. In particular, the excellent performance characteristics of Phantom, a haptic interface recently developed at MIT, are highlighted. Realistic sensations of single-point-of-contact interactions with objects of variable geometry (e.g., smooth, textured, polyhedral) and material properties (e.g., friction, impedance) in the context of a variety of tasks (e.g., needle biopsy, switch panels) achieved through this device are described, and the associated issues in haptic rendering are discussed.
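Single-point-of-contact rendering of the kind described can be sketched with a penalty method: when the probe tip penetrates a virtual surface, a spring force proportional to penetration depth pushes it back out. The flat surface and stiffness value below are illustrative assumptions, not the Phantom's actual rendering algorithm.

```python
def contact_force(probe_pos, surface_height=0.0, stiffness=800.0):
    """Penalty-based single-point-of-contact rendering: a virtual
    spring expels the haptic probe from a horizontal surface,
    with force proportional to penetration depth.
    Stiffness (N/m) is an illustrative value."""
    penetration = surface_height - probe_pos[2]  # below surface => positive
    if penetration <= 0.0:
        return (0.0, 0.0, 0.0)                   # no contact, no force
    return (0.0, 0.0, stiffness * penetration)   # push back along +z

# 1 cm of penetration yields an 8 N restoring force at this stiffness:
f = contact_force((0.0, 0.0, -0.01))
```

In a real rendering loop this computation runs at around 1 kHz, and friction or texture effects add tangential terms to the same contact model.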

    Press'Em: Simulating Varying Button Tactility via FDVV Models

    Push-buttons provide rich haptic feedback during a press via mechanical structures. While different buttons have varying haptic qualities, few works have attempted to dynamically render such tactility, which limits designers from freely exploring buttons' haptic design. We extend the typical force-displacement (FD) model with vibration (V) and velocity-dependence (V) characteristics to form a novel FDVV model. We then introduce Press'Em, a 3D-printed prototype capable of simulating button tactility based on FDVV models. To drive Press'Em, an end-to-end simulation pipeline is presented that covers (1) capturing any physical button, (2) controlling the actuation signals, and (3) simulating the tactility. Our system goes beyond replicating existing buttons, enabling designers to emulate and test non-existent ones with desired haptic properties. Press'Em aims to be a tool for future research to better understand and iterate over button designs. Comment: 4 pages, CHI'20 EA. arXiv admin note: text overlap with arXiv:2001.0435
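The FDVV idea can be sketched as follows: the rendered force is the FD curve sampled at the current displacement, plus a velocity-dependent term and a vibration term. The toy FD curve, damping coefficient, and vibration parameters below are stand-ins, not data captured by Press'Em.

```python
import bisect
import math

def button_force(fd_curve, disp, vel, damping=0.2,
                 vib_amp=0.0, vib_freq=0.0, t=0.0):
    """Evaluate an FDVV-style force: linearly interpolate a measured
    force-displacement (FD) curve at the current displacement, then add
    a velocity-dependent term (V) and a vibration term (V).
    All parameter values here are illustrative assumptions."""
    xs = [p[0] for p in fd_curve]
    i = min(bisect.bisect_left(xs, disp), len(xs) - 1)
    if i == 0:
        f = fd_curve[0][1]
    else:
        x0, f0 = fd_curve[i - 1]
        x1, f1 = fd_curve[i]
        f = f0 + (f1 - f0) * (disp - x0) / (x1 - x0)  # FD interpolation
    f += damping * vel                                 # velocity dependence
    f += vib_amp * math.sin(2 * math.pi * vib_freq * t)  # vibration
    return f

# A toy "buckling" button: force rises to 2 N at 1 mm, then drops.
fd = [(0.0, 0.0), (1.0, 2.0), (2.0, 1.0)]
```

Swapping in a different captured FD curve or vibration profile is what lets one actuator imitate many physical buttons.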

    NOViSE: a virtual natural orifice transluminal endoscopic surgery simulator

    Purpose: Natural Orifice Transluminal Endoscopic Surgery (NOTES) is a novel technique in minimally invasive surgery whereby a flexible endoscope is inserted via a natural orifice to gain access to the abdominal cavity, leaving no external scars. This innovative use of flexible endoscopy creates many new challenges and is associated with a steep learning curve for clinicians. Methods: We developed NOViSE, the first force-feedback-enabled virtual reality simulator for NOTES training supporting a flexible endoscope. The haptic device is custom built, and the behaviour of the virtual flexible endoscope is based on an established theoretical framework, the Cosserat Theory of Elastic Rods. Results: We present the application of NOViSE to the simulation of a hybrid trans-gastric cholecystectomy procedure. Preliminary results of face, content and construct validation have previously shown that NOViSE delivers the required level of realism for training endoscopic manipulation skills specific to NOTES. Conclusions: VR simulation of NOTES procedures can contribute to surgical training and improve the educational experience without putting patients at risk, raising ethical issues or requiring expensive animal or cadaver facilities. In the context of an experimental technique, NOViSE could potentially facilitate NOTES development and contribute to its wider use by keeping practitioners up to date with this novel surgical technique. NOViSE is a first prototype, and the initial results indicate that it provides promising foundations for further development.

    Development of a Physical Shoulder Simulator for the Training of Basic Arthroscopic Skills

    Increasingly, shoulder surgeries are performed using arthroscopic techniques, leading to reduced tissue damage and shorter patient recovery times. Orthopaedic training programs are responding to the increased demand for arthroscopic surgeries by incorporating arthroscopic skills into their residency curricula. A need for accessible and effective training tools exists. This thesis describes the design and development of a physical shoulder simulator for training basic arthroscopy skills such as triangulation, orientation, and navigation of the anatomy. The simulator can be used in either the lateral decubitus or beach chair orientation and accommodates wet or dry practice. Sensors embedded in the simulator provide a means to assess performance. A study was conducted to determine the effectiveness of the simulator; novice subjects improved their performance after practicing with it. A survey completed by experts recognized the simulator as a valuable tool for training novice surgeons in basic arthroscopic skills.

    BodySpace: inferring body pose for natural control of a music player

    We describe the BodySpace system, which uses inertial sensing and pattern recognition to allow the gestural control of a music player by placing the device at different parts of the body. We demonstrate a new approach to the segmentation and recognition of gestures for this kind of application and show how simulated physical model-based techniques can shape gestural interaction.
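The segmentation step in an inertial-sensing pipeline like this can be sketched very simply: runs of samples whose acceleration magnitude exceeds a threshold become candidate gesture segments, which are then passed to a recognizer. The threshold and minimum-length values below are illustrative, not BodySpace's actual parameters.

```python
import math

def segment_gestures(samples, thresh=1.5, min_len=3):
    """Naive accelerometer gesture segmentation: contiguous runs where
    the acceleration magnitude exceeds a threshold are returned as
    (start, end) index pairs. Parameter values are illustrative."""
    segments, start = [], None
    for i, (ax, ay, az) in enumerate(samples):
        active = math.sqrt(ax * ax + ay * ay + az * az) > thresh
        if active and start is None:
            start = i                      # run begins
        elif not active and start is not None:
            if i - start >= min_len:       # discard spurious blips
                segments.append((start, i))
            start = None
    if start is not None and len(samples) - start >= min_len:
        segments.append((start, len(samples)))
    return segments

# Rest (gravity only), a movement burst, then rest again:
samples = [(0, 0, 1)] * 3 + [(2, 0, 1)] * 4 + [(0, 0, 1)] * 3
```

Each extracted segment would then be classified against per-body-location gesture templates.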

    Interactive form creation: exploring the creation and manipulation of free form through the use of interactive multiple input interface

    Most current CAD systems support only the two most common input devices: a mouse and a keyboard, which impose a limit on the degree of interaction a user can have with the system. However, it is not uncommon for users to work together on the same computer during a collaborative task. Besides that, people tend to use both hands to manipulate 3D objects: one hand orients the object while the other performs some operation on it. The same could be applied to computer modelling in the conceptual phase of the design process: a designer can rotate and position an object with one hand, and manipulate the shape [deform it] with the other. Accordingly, the 3D object can be changed easily and intuitively through interactive manipulation with both hands. The research investigates the manipulation and creation of free-form geometries through the use of interactive interfaces with multiple input devices. First the creation of the 3D model is discussed, and several different types of models are illustrated. Furthermore, different tools that allow the user to control the 3D model interactively are presented. Three experiments were conducted using different interactive interfaces; two bi-manual techniques were compared with the conventional one-handed approach. Finally it is demonstrated that the use of new and multiple input devices can offer many opportunities for form creation. The problem is that few, if any, systems make it easy for the user or the programmer to use new input devices.

    Augmenting User Interfaces with Haptic Feedback

    Computer assistive technologies have developed considerably over the past decades. Advances in computer software and hardware have provided motion-impaired operators with much greater access to computer interfaces. For people with motion impairments, the main difficulty in the communication process is the input of data into the system. For example, the use of a mouse or a keyboard demands a high level of dexterity and accuracy. Traditional input devices are designed for able-bodied users and often do not meet the needs of someone with disabilities. As the key feature of most graphical user interfaces (GUIs) is to point-and-click with a cursor, this can make a computer inaccessible for many people. Human-computer interaction (HCI) is an important area of research that aims to improve communication between humans and machines. Previous studies have identified haptics as a useful method for improving computer access. However, traditional haptic techniques suffer from a number of shortcomings that have hindered their inclusion with real-world software. The focus of this thesis is to develop haptic rendering algorithms that will permit motion-impaired operators to use haptic assistance with existing graphical user interfaces. The main goal is to improve interaction by reducing error rates and improving targeting times. A number of novel haptic assistive techniques are presented that utilise the three degrees-of-freedom (3DOF) capabilities of modern haptic devices to produce assistance designed specifically for motion-impaired computer users. To evaluate the effectiveness of the new techniques, a series of point-and-click experiments were undertaken in parallel with cursor analysis to compare the levels of performance. The task required the operator to produce a predefined sentence on the densely populated Windows on-screen keyboard (OSK). The results of the study show that higher performance levels can be achieved using techniques that are less constricting than traditional assistance.
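A classic form of the haptic targeting assistance discussed above is the "gravity well": inside an attraction radius around a GUI target, a virtual spring pulls the haptic cursor toward the target centre. The radius and gain values below are illustrative assumptions, not the thesis's actual techniques or parameters.

```python
import math

def gravity_well_force(cursor, target, radius=30.0, gain=0.05):
    """Gravity-well assistance for point-and-click targeting: within
    an attraction radius of a target, return a spring force pulling
    the cursor toward the target centre; outside it, no force.
    Radius (pixels) and gain are illustrative values."""
    dx, dy = target[0] - cursor[0], target[1] - cursor[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0 or dist > radius:
        return (0.0, 0.0)            # outside the well (or already there)
    return (gain * dx, gain * dy)    # spring force toward the target

# 10 px right of an on-screen key, the device pulls the cursor left:
f = gravity_well_force((10.0, 0.0), (0.0, 0.0))
```

On a densely populated on-screen keyboard, overlapping wells from adjacent keys are exactly the kind of constriction the thesis's less restrictive techniques aim to avoid.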