45 research outputs found

    Using Kinect for hand tracking and rendering in wearable haptics

    Wearable haptic devices typically have poor position sensing, so we combine them with Microsoft's Kinect depth sensor. A heuristic hand tracker, designed and optimized for Kinect, has been developed. It allows the hand avatar to be animated in virtual reality and drives the force rendering algorithm: the hand tracker measures the position of the fingertips, and the rendering algorithm computes the contact forces for the wearable haptic display. Preliminary experiments with qualitative results show the effectiveness of combining Kinect and wearable haptics.
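The force rendering step described above can be sketched with a simple penalty-based contact model (an assumption for illustration; the paper does not specify its actual algorithm): when the tracked fingertip penetrates a virtual surface, the force is proportional to the penetration depth.

```python
import numpy as np

def contact_force(fingertip, surface_z=0.0, stiffness=300.0):
    """Penalty-based contact force for a flat virtual surface at z = surface_z.

    fingertip : (x, y, z) position in metres from the hand tracker.
    Returns a 3-D force vector (newtons) pushing the fingertip out of the
    surface; zero when there is no contact. Stiffness value is illustrative.
    """
    p = np.asarray(fingertip, dtype=float)
    penetration = surface_z - p[2]          # > 0 when the finger is below the surface
    if penetration <= 0.0:
        return np.zeros(3)                  # no contact, no force
    return np.array([0.0, 0.0, stiffness * penetration])

# Fingertip 2 mm inside the surface -> 0.6 N force along +z
print(contact_force((0.1, 0.2, -0.002)))
```

The returned vector would then be mapped to the actuators of the wearable display at each frame of the tracker.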

    Fabrication of the Kinect Remote-controlled Cars and Planning of the Motion Interaction Courses

    This paper describes the fabrication of Kinect remote-controlled cars, using a PC, a Kinect sensor, an interface control circuit, an embedded controller and a brake device, as well as the planning of motion interaction courses. The Kinect sensor first detects the body movement of the user and converts it into control commands. The PC then sends the commands to an Arduino control board via XBee wireless communication modules. The interface circuit controls the speed and direction of the motors: forward, backward, left and right. To develop the content of the Kinect motion interaction courses, this study conducted a literature review to understand the curriculum contents and invited experts for interviews to collect data on learning background, teaching contents and unit contents. Based on these data, teaching units and outlines are developed as a reference for curriculum design.
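The "body movement to control command" step could look like the following sketch. The command names and thresholds are assumptions, not the paper's actual protocol; the returned bytes would then be written to the XBee serial port (e.g. with pySerial) and interpreted by the Arduino.

```python
def pose_to_command(left_hand_y, right_hand_y, shoulder_y, margin=0.15):
    """Map hand heights (metres, Kinect skeleton frame) to a drive command.

    Both hands raised above the shoulders -> forward, both lowered -> backward,
    one hand raised -> turn toward that side, otherwise stop.
    Command names and the 0.15 m margin are illustrative assumptions.
    """
    left_up = left_hand_y > shoulder_y + margin
    right_up = right_hand_y > shoulder_y + margin
    if left_up and right_up:
        return b"FWD"
    if not left_up and not right_up and \
            max(left_hand_y, right_hand_y) < shoulder_y - margin:
        return b"BWD"
    if left_up:
        return b"LEFT"
    if right_up:
        return b"RIGHT"
    return b"STOP"

# e.g. serial.Serial("/dev/ttyUSB0", 9600).write(pose_to_command(...))
print(pose_to_command(1.6, 1.6, 1.4))  # both hands above shoulders
```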

    Master of Science

    Stroke is a leading cause of death and adult disability in the United States. Survivors lose abilities that were controlled by the affected area of the brain, and rehabilitation therapy is administered to help them regain lost functional abilities. The number of sessions stroke survivors can attend is limited by the availability of a clinic close to their residence and by the amount of time friends and family can devote to helping them commute, as most are incapable of driving. Home-based therapy using virtual reality and computer games has the potential to solve these issues, increasing the amount of independent therapy performed by patients. This thesis presents the design, development and testing of a low-cost system, potentially suitable for use in the home environment, designed for rehabilitation of the impaired upper limb of stroke survivors. A Microsoft Kinect was used to track the position of the patient's hand, and the game requires the user to move the arm over increasingly large areas by sliding the arm on a support. Studies were performed with six stroke survivors and five control subjects to determine the feasibility of the system. Patients played the game for 6 to 10 days, and their game scores, range of motion and Fugl-Meyer scores were recorded for analysis. Statistically significant (p<0.05) differences were found between the game scores of the first and last day of the study. Furthermore, acceptability surveys revealed that patients enjoyed playing the game, found this kind of therapy more enjoyable than conventional therapy, and were willing to use the system in the home environment. Future work will focus on larger studies, improving the comfort of patients while playing the game, and developing new games that address cognitive issues and integrate art and therapy.
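One way the "increasingly large areas" mechanic could work is sketched below (an illustrative assumption, not the thesis's actual implementation): targets are placed at a given reach distance, and once the patient hits enough of them, the distance grows.

```python
import math

class ReachGame:
    """Toy version of an adaptive reaching game driven by Kinect hand positions.

    All parameter values (radii, tolerance, hits per level) are assumptions
    made for illustration.
    """

    def __init__(self, start_radius=0.10, growth=0.05, hits_to_grow=5):
        self.radius = start_radius        # current target distance (m)
        self.growth = growth              # radius increment per level
        self.hits_to_grow = hits_to_grow  # hits required before the area grows
        self.score = 0
        self.hits_at_level = 0

    def update(self, hand_xy, target_xy, tolerance=0.03):
        """Call once per tracker frame; returns True when the target is hit."""
        if math.dist(hand_xy, target_xy) > tolerance:
            return False
        self.score += 1
        self.hits_at_level += 1
        if self.hits_at_level >= self.hits_to_grow:  # level up: larger area
            self.radius += self.growth
            self.hits_at_level = 0
        return True

game = ReachGame()
for _ in range(5):
    game.update((0.10, 0.0), (0.10, 0.0))
print(game.score, game.radius)  # five hits -> the reach radius has grown once
```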

    GUI system for Elders/Patients in Intensive Care

    Elderly people suffering from certain diseases may need special care, since a stroke can occur during a normal daily routine. Patients of any age who are unable to walk also need personal care, which currently means either staying in a hospital or having someone such as a nurse attend them. This is costly in terms of money and manpower, since a person is needed for 24x7 care. To help with this, we propose a vision-based system that takes input from the patient and relays information to a designated person who may not currently be in the patient's room. This reduces the need for manpower and removes the need for continuous monitoring. The system uses the MS Kinect for gesture detection for better accuracy and can be installed easily at home or in a hospital. It provides a GUI for simple usage and gives visual and audio feedback to the user. The system works on natural hand interaction, requires no training before use, and needs no glove or colored strip to be worn. Comment: In proceedings of the 4th IEEE International Conference on International Technology Management Conference, Chicago, IL USA, 12-15 June, 201
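A common Kinect interaction pattern for a GUI like this is dwell-based selection: the tracked hand hovers over a button region for a fixed time to select it. The sketch below is an assumption for illustration (the paper's exact gesture scheme, button names and timings are not specified here).

```python
class DwellSelector:
    """Select a GUI button when the tracked hand dwells over its screen region.

    buttons maps a (hypothetical) button name to a screen rectangle
    (x0, y0, x1, y1); dwell_frames ~ 30 is about one second at 30 fps.
    """

    def __init__(self, buttons, dwell_frames=30):
        self.buttons = buttons
        self.dwell_frames = dwell_frames
        self.current = None   # button currently under the hand
        self.count = 0        # consecutive frames spent over it

    def update(self, hand_xy):
        """Feed one hand position per frame; returns a button name on selection."""
        x, y = hand_xy
        hit = None
        for name, (x0, y0, x1, y1) in self.buttons.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                hit = name
                break
        if hit != self.current:           # moved to a new region: restart dwell
            self.current, self.count = hit, 0
        if hit is not None:
            self.count += 1
            if self.count >= self.dwell_frames:
                self.count = 0
                return hit                # fire selection (and audio feedback)
        return None

sel = DwellSelector({"CALL_NURSE": (0, 0, 100, 100)}, dwell_frames=3)
results = [sel.update((50, 50)) for _ in range(3)]
print(results)  # the third consecutive frame over the button triggers it
```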

    Application of EMG and Force Signals of Elbow Joint on Robot-assisted Arm Training

    Flexion-extension driven by the system's robotic arm has the potential to increase the patient's elbow joint movement. A force sensor and electromyography (EMG) signals allow the biomechanical system to detect the electrical signals generated by the muscles. The purpose of this study is to implement a force sensor design and apply EMG signals to elbow flexion of the upper arm. In the experiments, flexion movements at angles of 45°, 90° and 135° are applied to identify the relationship between the amplitude of the EMG and force signals at each angle. The contribution of this research is to support the development of robot-assisted arm training. The correlation between the force signal and the EMG signal was studied in elbow joint motion tests, and the sensors were tested experimentally on healthy subjects simulating arm movement. The experimental results show the relationship between the amplitude of the EMG and force signals at each flexion angle of the joint mechanism, allowing the angular displacement of the robotic arm to be monitored. Further development of the force sensor and EMG signal design could open the way for future research based on the physiological condition of each patient.
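The EMG amplitude compared across the three flexion angles is commonly quantified as a root-mean-square (RMS) value; the sketch below uses that standard measure on synthetic data (the paper's actual processing chain and recordings are not given here).

```python
import numpy as np

def rms(signal):
    """Root-mean-square amplitude, a standard EMG activity measure."""
    x = np.asarray(signal, dtype=float)
    return float(np.sqrt(np.mean(x ** 2)))

# Illustrative only: synthetic "recordings" whose scale grows with angle,
# standing in for biceps EMG captured at 45, 90 and 135 degrees of flexion.
rng = np.random.default_rng(0)
recordings = {angle: scale * rng.standard_normal(2000)
              for angle, scale in [(45, 0.2), (90, 0.5), (135, 0.9)]}
amplitudes = {angle: rms(sig) for angle, sig in recordings.items()}
print(amplitudes)  # RMS amplitude rises with angle in this synthetic example
```

A real pipeline would band-pass filter and rectify the raw EMG before computing the envelope; that stage is omitted here for brevity.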

    Visual Hand Tracking on Depth Image Using 2-D Matched Filter

    Hand detection has been a central topic of human-machine interaction in recent research. To track a hand accurately, traditional methods mostly rely on machine learning and other available libraries, which require substantial computational resources for data collection and processing. This paper presents a method of hand detection and tracking using depth images that can be conveniently and manageably applied in practice without large-scale data analysis. The method is based on the two-dimensional matched filter from image processing, which precisely locates the hand position with a modest amount of code, in cooperation with a Delta robot. Compared with other approaches, this method is comprehensible and time-saving, especially for detecting and tracking a single specific gesture. Additionally, it is easy to program and can be used on various platforms such as MATLAB and Python. The experiments show that the method achieves fast hand tracking, improves accuracy when a proper hand template is selected, and can be used directly in human-machine interaction applications. To evaluate tracking performance, a recorded depth-image video is used to test the theoretical design, and a Delta parallel robot follows the moving hand using the proposed algorithm, demonstrating its feasibility in practice.
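The core idea can be sketched directly: correlate a zero-mean hand template with the depth image and take the peak response as the hand position. The brute-force implementation below is for illustration only (a real system would use FFT-based correlation, e.g. scipy.signal, for speed); the scene values are synthetic assumptions.

```python
import numpy as np

def matched_filter_track(depth, template):
    """Locate a hand template in a depth image by 2-D matched filtering.

    Correlates the zero-mean template with the depth image and returns the
    (row, col) of the template's top-left corner at the peak response.
    """
    t = template - template.mean()          # zero-mean template
    th, tw = t.shape
    H, W = depth.shape
    best, best_pos = -np.inf, (0, 0)
    for r in range(H - th + 1):             # exhaustive scan, clear but slow
        for c in range(W - tw + 1):
            score = float(np.sum(depth[r:r + th, c:c + tw] * t))
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Synthetic depth frame: background at 2.0 m, a "hand" blob at 0.8 m
depth = np.full((40, 40), 2.0)
depth[10:15, 20:25] = 0.8
# Hand template: near blob surrounded by a ring of background
template = np.full((7, 7), 2.0)
template[1:6, 1:6] = 0.8
print(matched_filter_track(depth, template))
```

The peak lands where the template's near (hand) cells overlap the blob and its background ring stays on the background, which is exactly the blob's position.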

    An intuitive tangible game controller

    This paper outlines the development of a sensory feedback device that provides a low-cost, versatile and intuitive interface for controlling digital environments, in this example a flight simulator. Gesture-based input allows for a more immersive experience: rather than making the user feel like they are controlling an aircraft, the intuitive interface allows the user to become the aircraft, controlled by the movements of the user's hand. The movements are designed to feel intuitive and allow for a sense of immersion that would be difficult to achieve with an alternative interface. In this example the user's hand can become the aircraft in much the same way that a child would imagine it.

    A computational platform for 3D motion capture and representation to support gait rehabilitation

    Human movement is an important element in the study of human biology, being a basic property of life. It is even more important in collective domains such as sport, health, physical education, dance and rehabilitation. Rehabilitation deserves special attention: through gait analysis, the physiatrist observes and studies patients' performance across the sessions they attend in order to follow their evolution. However, capturing and tracking human movement computationally is a complex task with costly alternatives. Moreover, rehabilitation centres in Lima lack the appropriate technology, and people's resistance to change complicates its adoption. For this reason, this project presents a low-cost tool that records and represents human movement in 3D to support gait rehabilitation therapies. Recordings can be stored permanently in order to maintain a history and enable new analyses over time. The device used (Kinect) proved adequate for capturing the relevant information with an acceptable margin of error which, depending on the capture environment, can be controlled. The work detected the gait pattern of the knee flexion angles, which made it possible to compare different walking situations and compute measures such as step length, the person's movement speed, and the percentage of data within the reference range of a normal walk (flexion angle measurements), thereby determining how normal a gait cycle is. Finally, a 3D visualization prototype was developed that allows the environment to be manipulated and offers different views of the patient's movement.
    The prototype can be reused to build a more complete analysis environment covering other types of movement (besides the knee), supporting a larger number of sources from different angles (multiple Kinects), or a moving Kinect that allows gait to be tracked over a larger space, among other extensions. All of the above would serve as the basis for establishing a rehabilitation laboratory based on virtual reality.
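The knee flexion angle mentioned above is conventionally computed from three skeleton joints as the angle between the thigh and shank segments; the sketch below shows that standard computation on Kinect-style 3D joint positions (the coordinates are illustrative).

```python
import numpy as np

def knee_flexion_angle(hip, knee, ankle):
    """Knee flexion angle (degrees) from three Kinect skeleton joints.

    Angle between the thigh (knee->hip) and shank (knee->ankle) vectors:
    180 degrees is a fully extended leg, smaller values mean more flexion.
    """
    thigh = np.asarray(hip, float) - np.asarray(knee, float)
    shank = np.asarray(ankle, float) - np.asarray(knee, float)
    cos_a = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

# Straight leg: hip, knee and ankle collinear -> 180 degrees
print(knee_flexion_angle((0, 1.0, 0), (0, 0.5, 0), (0, 0.0, 0)))
# Right-angle flexion -> 90 degrees
print(knee_flexion_angle((0, 1.0, 0), (0, 0.5, 0), (0.5, 0.5, 0)))
```

Evaluating this per frame over a recorded walk yields the flexion-angle curve against which the normal-walk reference range can be checked.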