
    Feasibility study of a permanently implanted prosthetic hand

    The feasibility of a permanently implanted prosthetic hand was evaluated from both an internal biocompatibility and an exterior mechanics point of view. A literature review of the issues involved in permanent implantation of a percutaneous device was performed in the areas of bone interaction and fixation and neural interface control. A theoretical implant was designed for a 90th percentile male, using an HA-G-Ti composite material to provide a permanent base to which the hand could attach. Using a radial implant length of 1.87 inches and an ulnar implant length of 1.32 inches, the simulated implant could withstand a push-out force of 10,260 pounds. Using nerve guidance channels and micro-electrode arrays, a Regenerative Neural Interface was postulated to control the implant. The use of Laminin-5 was suggested as a method of preventing the lack of wound closure observed in percutaneous devices. The exterior portion of a permanent artificial hand was analyzed through the construction of a robotic hand optimized for weight, size, grip force, wrist torque, power consumption, and range of motion. Using a novel dual-drive system, each finger was equipped with both joint position servos and a tendon. Fine grip shape was formed using the servos, while the tendon was pulled taut when grasping an object. Control of the prosthetic was performed by a distributed network of micro-controllers. Each finger's behavior was governed by a master/slave system in which input from a control glove was processed by a master controller, with joint servo and tendon instructions passed to lower-level controllers for management of the hand actuators. The final prototype weighed 3.85 pounds and was approximately 25% larger than the 90th percentile male hand on which it was based. Grip force was between 1.25 and 2 pounds per finger, depending on the amount of finger flexion, with a wrist lifting capacity of 1.2 pounds at the center of the palm. The device had an average current draw of 3 amps in both normal operation and tight grasping. Range of motion was similar to that of the human model. Overall feasibility is examined, and factors involved in industrial implementation are also discussed.
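
    The master/slave control scheme described above lends itself to a short illustration. The Python sketch below shows one plausible shape for the master controller's update step: control-glove joint readings are mapped to per-finger servo targets, and the tendon is tensioned once a finger closes far enough on an object. All names and thresholds here are invented for illustration and are not taken from the thesis.

        # Hypothetical sketch of the master/slave finger-control scheme: a master
        # controller reads joint angles from a control glove, computes per-finger
        # servo targets, and forwards them (plus a tendon command for tight grasps)
        # to lower-level finger controllers. Names and thresholds are assumptions.

        from dataclasses import dataclass

        TENDON_GRASP_THRESHOLD = 0.8  # assumed normalized flexion at which the tendon is pulled taut

        @dataclass
        class FingerCommand:
            joint_angles: list[float]  # target angles for the finger's position servos (degrees)
            tendon_taut: bool          # True when grasping: the tendon provides the holding force

        def master_update(glove_flexion: dict[str, list[float]]) -> dict[str, FingerCommand]:
            """Map raw glove joint readings to per-finger commands.

            Fine grip shape comes from the joint servos; once a finger closes far
            enough on an object, the tendon is tensioned for grip force.
            """
            commands = {}
            for finger, angles in glove_flexion.items():
                mean_flexion = sum(angles) / len(angles) / 90.0  # crude normalization to [0, 1]
                commands[finger] = FingerCommand(
                    joint_angles=angles,
                    tendon_taut=mean_flexion > TENDON_GRASP_THRESHOLD,
                )
            return commands

        # Example: the index finger is nearly closed, so its tendon engages.
        cmds = master_update({"index": [85.0, 80.0, 70.0], "thumb": [20.0, 15.0]})
        print(cmds["index"].tendon_taut)  # True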

    Interactive exploration of historic information via gesture recognition

    Developers of interactive exhibits often struggle to find appropriate input devices that enable intuitive control and permit visitors to engage effectively with the content. Recently, motion-sensing input devices such as the Microsoft Kinect or Panasonic D-Imager have become available, enabling gesture-based control of computer systems. These devices are an attractive input option for exhibits, since users interact with their hands and are not required to physically touch any part of the system. In this thesis we investigate techniques that enable the raw data coming from these types of devices to be used to control an interactive exhibit. Object recognition and tracking techniques are used to analyse the user's hand, from which movement and clicks are processed. To show the effectiveness of the techniques, the gesture system is used to control an interactive system designed to inform the public about iconic buildings in the centre of Norwich, UK. We evaluate two methods of making selections in the test environment. At the time of experimentation these technologies were relatively new to the image processing field. The techniques and methods developed in this thesis have been detailed and published [3] at the VSMM (Virtual Systems and Multimedia) 2012 conference with the intention of further advancing the area.
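
    As a rough illustration of the kind of processing involved (not the thesis implementation), the Python sketch below segments the nearest region of a depth frame as the hand, tracks its centroid as a pointer, and treats a quick push toward the sensor as a click. The depth band and push thresholds are assumptions chosen for the example.

        # Illustrative hand tracking and "push" click detection from raw depth
        # frames such as those produced by a Kinect or D-Imager. All thresholds
        # are assumed values, not figures from the thesis.

        import numpy as np

        HAND_BAND_MM = 150   # depth slab behind the nearest point taken to be the hand
        PUSH_CLICK_MM = 80   # frame-to-frame forward jump that counts as a click

        def track_hand(depth_mm: np.ndarray) -> tuple[float, float, float]:
            """Return (x, y, mean depth) of the hand region in one depth frame (mm)."""
            valid = depth_mm[depth_mm > 0]            # zero pixels are missing data
            nearest = valid.min()                     # the hand is assumed closest to the sensor
            mask = (depth_mm > 0) & (depth_mm < nearest + HAND_BAND_MM)
            ys, xs = np.nonzero(mask)
            return float(xs.mean()), float(ys.mean()), float(depth_mm[mask].mean())

        def detect_click(prev_depth_mm: float, curr_depth_mm: float) -> bool:
            """A quick push toward the sensor registers as a selection."""
            return (prev_depth_mm - curr_depth_mm) > PUSH_CLICK_MM

        # Synthetic frame: a 2 m background with a hand-sized blob at 0.9 m.
        frame = np.full((480, 640), 2000.0)
        frame[200:280, 300:360] = 900.0
        x, y, z = track_hand(frame)
        print(detect_click(prev_depth_mm=z + 120, curr_depth_mm=z))  # True: counts as a click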

    Direct interaction with large displays through monocular computer vision

    Large displays are everywhere, and have been shown to provide higher productivity gains and user satisfaction than traditional desktop monitors. The computer mouse remains the most common input tool for users to interact with these larger displays. Much effort has been made to render this interaction more natural and intuitive for the user. The use of computer vision for this purpose has been well researched, as it gives the user freedom and mobility and allows them to interact at a distance. Interaction that relies on monocular computer vision, however, has not been well researched, particularly when used for depth information recovery. This thesis investigates the feasibility of using monocular computer vision to allow bare-hand interaction with large display systems from a distance. By taking into account the location of the user and the interaction area available, a dynamic virtual touchscreen can be estimated between the display and the user. In the process, theories and techniques that make interaction with a computer display as easy as pointing to real-world objects are explored. Studies were conducted to investigate the way humans naturally point at objects with their hands and to examine the inadequacies of existing pointing systems. Models that underpin the pointing strategies used in many previous interactive systems were formalized. A proof-of-concept prototype was built and evaluated through various user studies. Results from this thesis suggest that it is possible to allow natural user interaction with large displays using low-cost monocular computer vision. Furthermore, the models developed and lessons learnt in this research can assist designers in developing more accurate and natural interactive systems that make use of humans' natural pointing behaviours.
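
    Geometrically, the virtual touchscreen idea reduces to a ray-plane intersection. The Python sketch below, written under the common eye-to-fingertip pointing model, computes where the pointing ray crosses a virtual plane placed between the user and the display; the coordinates and plane placement are illustrative assumptions, not values from the thesis.

        # Minimal geometry for a "dynamic virtual touchscreen": the cursor is the
        # intersection of the eye-to-fingertip ray with a plane between the user
        # and the display. Coordinates below are assumed for the example.

        import numpy as np

        def virtual_touch_point(eye: np.ndarray, fingertip: np.ndarray,
                                plane_point: np.ndarray, plane_normal: np.ndarray) -> np.ndarray:
            """Intersect the eye->fingertip ray with the virtual screen plane."""
            direction = fingertip - eye
            denom = direction @ plane_normal
            if abs(denom) < 1e-9:
                raise ValueError("pointing ray is parallel to the virtual screen")
            t = ((plane_point - eye) @ plane_normal) / denom
            return eye + t * direction

        # Example: user stands 2 m from a display at z = 0; the virtual screen
        # is estimated 0.5 m in front of them, at z = 1.5.
        eye = np.array([0.0, 1.6, 2.0])
        tip = np.array([0.1, 1.4, 1.6])
        hit = virtual_touch_point(eye, tip,
                                  plane_point=np.array([0.0, 0.0, 1.5]),
                                  plane_normal=np.array([0.0, 0.0, 1.0]))
        print(hit)  # [0.125 1.35 1.5] -- the touch point on the virtual plane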

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They are organized in topical sections on haptic science, haptic technology, and haptic applications.

    Enriching mobile interaction with garment-based wearable computing devices

    Wearable computing is on the brink of moving from research to the mainstream. The first simple products, such as fitness wristbands and smart watches, have hit the mass market and achieved considerable market penetration. However, the number and versatility of research prototypes in the field of wearable computing go far beyond the devices available on the market. In particular, smart garments, as a specific type of wearable computer, have high potential to change the way we interact with computing systems. Owing to their proximity to the user's body, smart garments can unobtrusively sense implicit and explicit user input: they are capable of sensing physiological information, detecting touch input, and recognizing the movement of the user. In this thesis, we explore how smart garments can enrich mobile interaction. Employing a user-centered design process, we demonstrate how different input and output modalities can enrich the interaction capabilities of mobile devices such as mobile phones or smart watches. To understand the context of use, we chart the design space for mobile interaction through wearable devices, focusing on device placement on the body as well as interaction modality. We use a probe-based research approach to systematically investigate the possible inputs and outputs for garment-based wearable computing devices. We develop six different research probes showing how mobile interaction benefits from wearable computing devices and what requirements these devices pose for mobile operating systems. On the input side, we look at explicit input using touch and mid-air gestures as well as implicit input using physiological signals. Although touch input is well known from mobile devices, the limited screen real estate and the occlusion of the display by the input finger are challenges that can be overcome with touch-enabled garments. Additionally, mid-air gestures provide a more sophisticated and abstract form of input. We present a gesture elicitation study addressing the special requirements of mobile interaction, together with the resulting gesture set. Because garments are worn, they allow different physiological signals to be sensed. We explore how these physiological signals can be leveraged for implicit input, conducting a study that assesses physiological information by focusing on the workload of drivers in an automotive setting. We show that the driver's workload can be inferred from these physiological signals. Besides the input capabilities of garments, we explore how garments can be used for output. We present research probes covering the most important output modalities, namely visual, auditory, and haptic. We explore how low-resolution displays can serve as context displays and how and where content should be placed on such a display. For auditory output, we investigate a novel authentication mechanism utilizing the closeness of wearable devices to the body. We show that by probing audio cues through the head of the user and re-recording them, user authentication is feasible. Lastly, we investigate electrical muscle stimulation (EMS) as a haptic feedback method, showing that by actuating the user's body an embodied form of haptic feedback can be achieved. From the aforementioned research probes, we distilled a set of design recommendations, grouped into interaction-based and technology-based recommendations, which serve as a basis for designing novel forms of mobile interaction. We implement a system based on these recommendations; it supports developers in integrating wearable sensors and actuators by providing an easy-to-use API for accessing these devices. In conclusion, this thesis broadens the understanding of how garment-based wearable computing devices can enrich mobile interaction. It outlines challenges and opportunities at both the interaction and the technological level. The unique characteristics of smart garments make them a promising technology for taking the next step in mobile interaction.
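
    To make the API idea concrete, here is a hedged Python sketch of what a minimal garment-device interface along these lines could look like: uniform device objects for garment sensors and actuators, with callback-based input delivery. The class and method names are invented for illustration; the thesis system's actual API may differ.

        # Hypothetical garment-device API sketch: an app subscribes to input
        # events from a garment sensor without caring about the underlying
        # sensing hardware. All identifiers here are illustrative assumptions.

        from typing import Callable

        class GarmentDevice:
            """One sensor or actuator woven into a garment (e.g. touch panel, EMS pad)."""

            def __init__(self, device_id: str, modality: str):
                self.device_id = device_id
                self.modality = modality  # "touch", "gesture", "physiological", "ems", ...
                self._listeners: list[Callable[[dict], None]] = []

            def on_input(self, callback: Callable[[dict], None]) -> None:
                """Register a handler for implicit or explicit input events."""
                self._listeners.append(callback)

            def emit(self, event: dict) -> None:
                """Deliver an event from the hardware to all registered handlers."""
                for listener in self._listeners:
                    listener(event)

        # A phone app subscribes to a sleeve touch panel in two lines.
        sleeve = GarmentDevice("sleeve-touch-0", modality="touch")
        sleeve.on_input(lambda e: print("swipe:", e["direction"]))
        sleeve.emit({"direction": "up"})  # simulated touch event -> "swipe: up"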

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They are organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.