    WristFlex: low-power gesture input with wrist-worn pressure sensors

    In this paper we present WristFlex, an always-available on-body gestural interface. Using an array of force-sensitive resistors (FSRs) worn around the wrist, the interface can distinguish subtle finger pinch gestures with high accuracy (>80%) and speed. The system is trained to classify gestures from subtle tendon movements at the wrist. We demonstrate that WristFlex is a complete system that works wirelessly in real time. The system is simple and lightweight in terms of power consumption and computational overhead. WristFlex's sensor power consumption is 60.7 µW, allowing the prototype to potentially last more than a week on a small lithium polymer battery. WristFlex is also small and unobtrusive, and can be integrated into a wristwatch or a bracelet. We perform user studies to evaluate the accuracy, speed, and repeatability. We demonstrate that the number of gestures can be extended with orientation data from an accelerometer. We conclude by showing example applications. National Science Foundation (U.S.) (NSF award 1256082)
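A minimal sketch of the kind of classification an FSR array like this affords. The sensor count, gesture names, and nearest-template approach here are illustrative assumptions, not WristFlex's actual pipeline:

```python
import numpy as np

# Hypothetical: 15 FSRs around the wrist, with per-gesture template vectors
# learned during a short calibration phase.
TEMPLATES = {
    "rest":         np.array([0.1] * 15),
    "index_pinch":  np.array([0.1] * 5 + [0.8] * 5 + [0.1] * 5),
    "middle_pinch": np.array([0.8] * 5 + [0.1] * 5 + [0.1] * 5),
}

def classify(fsr_sample: np.ndarray) -> str:
    """Return the gesture whose template is closest (Euclidean) to the sample."""
    return min(TEMPLATES, key=lambda g: np.linalg.norm(fsr_sample - TEMPLATES[g]))

sample = np.array([0.12] * 5 + [0.75] * 5 + [0.15] * 5)
print(classify(sample))  # index_pinch
```

A nearest-template scheme like this is cheap enough to run on a microcontroller, which matters given the abstract's emphasis on low computational overhead.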

    SensorTape: Modular and Programmable 3D-Aware Dense Sensor Network on a Tape

    SensorTape is a modular and dense sensor network in the form factor of a tape. SensorTape is composed of interconnected, programmable sensor nodes on a flexible electronics substrate. Each node can sense its orientation with an inertial measurement unit, allowing deformation self-sensing of the whole tape; nodes also sense proximity using time-of-flight infrared. We developed a network architecture that automatically determines the location of each sensor node as SensorTape is cut and rejoined, and an intuitive graphical interface to program the tape. Our user study suggested that SensorTape enables users with different skill sets to intuitively create and program large sensor network arrays. We developed diverse applications ranging from wearables to home sensing, to show the low deployment effort required of the user. We showed how SensorTape could be produced at scale using current technologies, and we made a 2.3-meter-long prototype. National Science Foundation (U.S.) (NSF award 1256082)
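The self-addressing behavior can be sketched as a "count and forward" pass down the chain. The abstract does not detail SensorTape's actual protocol, so this only illustrates the idea of re-enumeration after a cut:

```python
class Node:
    """One sensor node; `next` is its downstream neighbor (None at a cut end)."""
    def __init__(self):
        self.address = None
        self.next = None

def enumerate_tape(head):
    """Assign sequential addresses from the node nearest the master outward.
    Re-running this after a cut or rejoin rediscovers the new topology."""
    addr, node = 0, head
    while node is not None:
        node.address = addr
        addr, node = addr + 1, node.next
    return addr  # number of nodes discovered

# Build a 4-node tape, enumerate it, then "cut" after the second node.
nodes = [Node() for _ in range(4)]
for a, b in zip(nodes, nodes[1:]):
    a.next = b
print(enumerate_tape(nodes[0]))  # 4
nodes[1].next = None  # cut the tape
print(enumerate_tape(nodes[0]))  # 2
```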

    EMI Spy: Harnessing electromagnetic interference for low-cost, rapid prototyping of proxemic interaction

    We present a wearable system that uses ambient electromagnetic interference (EMI) as a signature to identify electronic devices and support proxemic interaction. We designed a low-cost tool, called EMI Spy, and a software environment for rapid deployment and evaluation of ambient EMI-based interactive infrastructure. EMI Spy captures electromagnetic interference and delivers the signal to a user's mobile device or PC through either the device's wired audio input or wirelessly using Bluetooth. The wireless version can be worn on the wrist, communicating with the user's mobile device in their pocket. Users are able to train the system in less than 1 second to uniquely identify displays in a 2-m radius around them, as well as to detect pointing at a distance and touching gestures on the displays in real time. The combination of a low-cost EMI logger and an open-source machine learning toolkit allows developers to quickly prototype proxemic, touch-to-connect, and gestural interaction. We demonstrate the feasibility of mobile, EMI-based device and gesture recognition with preliminary user studies in 3 scenarios, achieving 96% classification accuracy at close range for 6 digital signage displays distributed throughout a building, and 90% accuracy in classifying pointing gestures at neighboring desktop LCD displays. We were able to distinguish 1- and 2-finger touching with perfect accuracy and show indications of a way to determine the power consumption of a device via touch. Our system is particularly well suited to temporary use in a public space, where the sensors could be distributed to support a pop-up interactive environment anywhere with electronic devices. By designing for low-cost, mobile, flexible, and infrastructure-free deployment, we aim to enable a host of new proxemic interfaces to existing appliances and displays.
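As a hedged illustration of the EMI-signature idea: reduce a window of captured samples to a coarse spectrum, then match against stored signatures. The sampling rate, window size, bin count, and cosine-similarity matcher are assumptions; the paper's actual features and classifier may differ:

```python
import numpy as np

FS = 44_100  # audio-rate capture via the device's wired audio input (assumption)

def emi_signature(samples: np.ndarray, n_bins: int = 64) -> np.ndarray:
    """Reduce a window of EMI samples to a coarse, normalized magnitude spectrum."""
    spec = np.abs(np.fft.rfft(samples))[:n_bins]
    return spec / (np.linalg.norm(spec) + 1e-9)

def identify(sig: np.ndarray, library: dict) -> str:
    """The stored signature with the highest cosine similarity wins."""
    return max(library, key=lambda name: float(sig @ library[name]))

# Two synthetic "displays" with distinct low-frequency EMI peaks (hypothetical).
rng = np.random.default_rng(0)
t = np.arange(2048) / FS
library = {
    "lobby_sign": emi_signature(np.sin(2 * np.pi * 500 * t)),
    "desk_lcd": emi_signature(np.sin(2 * np.pi * 1200 * t)),
}
noisy = np.sin(2 * np.pi * 500 * t) + 0.1 * rng.standard_normal(t.size)
print(identify(emi_signature(noisy), library))  # lobby_sign
```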

    NailO: Fingernails as an Input Surface

    We present NailO, a nail-mounted gestural input surface. Using capacitive sensing on printed electrodes, the interface can distinguish on-nail finger swipe gestures with high accuracy (>92%). NailO works in real time: we miniaturized the system to fit on the fingernail, while wirelessly transmitting the sensor data to a mobile phone or PC. NailO allows one-handed and always-available input, while being unobtrusive and discreet. Inspired by commercial nail stickers, the device blends into the user's body, and is customizable, fashionable, and even removable. We show example applications of using the device as a remote controller when hands are busy and using the system to increase the input space of mobile phones.
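A toy sketch of how sequential electrode activations can yield a swipe direction. The 1-D electrode strip and peak-tracking heuristic are hypothetical; NailO's actual printed matrix and classifier are more involved:

```python
def swipe_direction(frames):
    """Given per-frame capacitive readings from a 1-D electrode strip, track
    the strongest electrode over time; the sign of its net movement gives the
    swipe direction."""
    peaks = [max(range(len(f)), key=f.__getitem__) for f in frames]
    delta = peaks[-1] - peaks[0]
    if delta > 0:
        return "right"
    if delta < 0:
        return "left"
    return "tap"

frames = [
    [9, 2, 1, 1],  # finger over electrode 0
    [2, 9, 2, 1],
    [1, 2, 9, 2],
    [1, 1, 2, 9],  # finger over electrode 3
]
print(swipe_direction(frames))  # right
```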

    Opisthenar: hand poses and finger tapping recognition by observing back of hand using embedded wrist camera

    We introduce a vision-based technique to recognize static hand poses and dynamic finger tapping gestures. Our approach employs a camera on the wrist, with a view of the opisthenar (back of the hand) area. We envisage such cameras being included in a wrist-worn device such as a smartwatch, fitness tracker or wristband. Indeed, selected off-the-shelf smartwatches now incorporate a built-in camera on the side for photography purposes. However, in this configuration, the fingers are occluded from the view of the camera. The oblique angle and placement of the camera make typical vision-based techniques difficult to adopt. Our alternative approach observes small movements and changes in the shape, tendons, skin and bones on the opisthenar area. We train deep neural networks to recognize both hand poses and dynamic finger tapping gestures. While this is a challenging configuration for sensing, we tested the recognition with a real-time user test and achieved high recognition rates of 89.4% (static poses) and 67.5% (dynamic gestures). Our results further demonstrate that our approach can generalize across sessions and to new users: users can remove and replace the wrist-worn device, and new users can employ a previously trained system, to a certain degree. We conclude by demonstrating three applications and suggest future avenues of work based on sensing the back of the hand.

    WRIST: Watch-Ring Interaction and Sensing Technique for wrist gestures and macro-micro pointing

    To better explore the incorporation of pointing and gesturing into ubiquitous computing, we introduce WRIST, an interaction and sensing technique that leverages the dexterity of human wrist motion. WRIST employs a sensor-fusion approach that combines inertial measurement unit (IMU) data from a smartwatch and a smart ring. The relative orientation difference of the two devices is measured as the wrist rotation, which is independent of arm rotation and is position- and orientation-invariant. Employing our test hardware, we demonstrate that WRIST affords and enables a number of novel yet simple interaction techniques, such as (i) macro-micro pointing without explicit mode switching and (ii) wrist gesture recognition when the hand is held in different orientations (e.g., raised or lowered). We report on two studies to evaluate the proposed techniques, and we present a set of applications that demonstrate the benefits of WRIST. We conclude with a discussion of the limitations and highlight possible future pathways for research in pointing and gesturing with wearable devices. Funding: Next-Generation Information Computing Development Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT (NRF-2017M3C4A7066316) and Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2019-0-01270, WISE AR UI/UX Platform Development for Smartglasses).
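The core sensing idea, a relative orientation that cancels common arm motion, can be sketched with unit quaternions. The quaternion algebra below is standard; the device frames and example angles are illustrative:

```python
import math

def q_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def q_conj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def wrist_rotation(q_watch, q_ring):
    """Orientation of the ring relative to the watch: any rotation common to
    both devices (i.e., whole-arm motion) cancels out."""
    return q_mul(q_conj(q_watch), q_ring)

# A 45-degree wrist twist about X reads the same before and after a
# 90-degree arm rotation about Z applied to both devices:
ring = (math.cos(math.pi/8), math.sin(math.pi/8), 0.0, 0.0)
watch = (1.0, 0.0, 0.0, 0.0)
arm = (math.cos(math.pi/4), 0.0, 0.0, math.sin(math.pi/4))
before = wrist_rotation(watch, ring)
after = wrist_rotation(q_mul(arm, watch), q_mul(arm, ring))
```

Because conj(q_arm) * q_arm is the identity for a unit quaternion, `before` and `after` are equal, which is what makes the measurement position- and orientation-invariant.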

    Dynamic wearable technology: designing and deploying small climbing robots for sensing and actuation on the human body

    Thesis: Ph.D., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2019. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 147-162). This thesis introduces the idea of Dynamic Wearable Technology: a concept of wearable devices as small autonomous robots that can move on and around the human body. Ecosystems in the natural world have static and dynamic organisms, such as plants versus animals. In our wearable ecosystem, all current devices are static, which limits their functionality. Adding robots could significantly increase the usability of wearable devices and open up entirely new avenues of application. This thesis develops and evaluates two approaches to wearable robots: first, Rovables, an on-clothing climbing robot that pinches fabric with magnetic rollers; and second, Epidermal Robots, which use controlled suction to attach to the skin. The robots contain on-board navigation that uses inertial measurement units, motor encoders, and occasional ground truth from on-skin features or beacons to estimate position. In this thesis, we analyze important aspects of such robots: size, localization, weight, power consumption, and locomotion. Dynamic wearable technology has potential applications in many areas, such as medicine, human-computer interaction, fashion, and art, and we explore several applications in each. We focus on how the robots can help to systematically collect health information, such as the mechanical, optical, and electrodermal properties of tissues. Robots like these will provide new avenues of autonomous or guided medical assessment and treatment, as well as new venues for the artistic and interfacial exploration of relationships between our bodies and our devices. By Artem Dementyev.
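The on-board navigation described above combines encoder odometry with IMU heading. A hedged dead-reckoning step, under assumed wheel parameters, might look like:

```python
import math

# Hypothetical wheel parameters; the actual Rovables hardware values differ.
WHEEL_CIRCUMFERENCE_MM = 40.0
TICKS_PER_REV = 120

def dead_reckon(x, y, heading_rad, encoder_ticks, gyro_heading_rad):
    """One update step: travel distance from the motor encoder, travel
    direction from the IMU heading estimate."""
    dist = encoder_ticks / TICKS_PER_REV * WHEEL_CIRCUMFERENCE_MM
    x += dist * math.cos(gyro_heading_rad)
    y += dist * math.sin(gyro_heading_rad)
    return x, y, gyro_heading_rad

x, y, h = 0.0, 0.0, 0.0
x, y, h = dead_reckon(x, y, h, 120, 0.0)          # 40 mm across the garment
x, y, h = dead_reckon(x, y, h, 120, math.pi / 2)  # 40 mm up the garment
print(round(x, 1), round(y, 1))  # 40.0 40.0
```

Pure dead reckoning drifts, which is why the thesis mentions occasional ground truth from on-skin features or beacons to correct the estimate.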

    Applications of RF-powered computing systems: wearable EEG monitor and bistable display tag

    Thesis (Master's)--University of Washington, 2013. The advances in electronics have reduced the energy requirements for computation and sensing to the extent needed to enable RF-powered systems. We demonstrate that it is possible to build key components of a ubiquitous computing system in an RF-powered way: sensing (input) and output. These systems were built using software-defined passive radio frequency identification (RFID) tags. As an input application, we demonstrate the EEGWISP: a battery-free electroencephalogram (EEG) monitor that uses RFID for power and communications. Wearable EEG monitoring systems are the cornerstone of noninvasive brain-computer interfaces (BCI) and many medical applications, but state-of-the-art wearable systems are limited by weight, battery life, and size. Since EEGWISP does not need batteries, it can be lightweight, miniature, and maintenance-free for users. For the output application, we developed a bistable display tag that, from an energy standpoint, is capable of perpetual operation. A commercial off-the-shelf NFC-enabled phone generates RF signals carrying both the information and the energy necessary to update the display. After the update is complete, the display continues to present the information with no further power input. We present one example implementation: a companion display for a mobile phone that can be used to capture and preserve a screenshot.

    A Wearable UHF RFID-Based EEG System

    Wearable electroencephalogram (EEG) monitoring systems are the cornerstone of noninvasive brain-computer interfaces (BCI) and many medical applications, but state-of-the-art wearable systems are limited by weight, battery life, and size. In this paper we present EEGWISP: an EEG monitoring system that is battery-free, is powered by a standard UHF RFID reader, and uses backscatter to transmit data with the EPC Class 1 Gen 2 protocol. Since EEGWISP does not need batteries, it can be lightweight, miniature, and maintenance-free for users. We designed a low-power EEG acquisition circuit with 62.6 µA current consumption. For validation, the recorded EEG signals showed the distinct appearance of 8–12 Hz oscillations (alpha waves) when the wearer's eyes were closed. EEGWISP can record EEG signals at a 63 Hz sampling rate, at distances up to 0.80 m, and with 0.1% data loss. With slight modifications, our system can be used for other biopotential signals such as ECG.
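The alpha-wave validation above can be illustrated with a simple spectral-power check. The 8–12 Hz band and the 63 Hz rate come from the abstract; the window length, noise level, and threshold are made-up illustration values:

```python
import numpy as np

FS = 63  # EEGWISP sampling rate from the abstract (Hz)

def alpha_power_ratio(eeg: np.ndarray) -> float:
    """Fraction of spectral power in the 8-12 Hz alpha band, a common
    eyes-closed indicator."""
    freqs = np.fft.rfftfreq(len(eeg), d=1 / FS)
    power = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2
    band = (freqs >= 8) & (freqs <= 12)
    return float(power[band].sum() / power.sum())

rng = np.random.default_rng(0)
t = np.arange(4 * FS) / FS  # 4-second analysis window (assumption)
eyes_closed = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
eyes_open = 0.3 * rng.standard_normal(t.size)
print(alpha_power_ratio(eyes_closed) > alpha_power_ratio(eyes_open))  # True
```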