3 research outputs found

    Hand gesture recognition through capacitive sensing : a thesis presented in partial fulfilment of the requirements for the degree of Master of Engineering in Electronics & Computer Engineering at Massey University, School of Food and Advanced Technology (SF&AT), Auckland, New Zealand

    Figures 1.1, 1.2, 1.3, 2.1, 2.3 & 2.4 are re-used with permission. Figure 2.2 (= Smith, 1996, Fig. 1), ©1996 by International Business Machines Corporation, was removed.
    This thesis investigated capacitive sensing-based hand gesture recognition, developed and validated through custom-built hardware. We set out to determine whether massed arrays of capacitance sensors can produce a robust system capable of detecting and recognising simple hand gestures. The first stage of this research was to build the hardware that performed the capacitance sensing. This hardware needed to be sensitive enough to capture minor variations in capacitance while keeping stray capacitance to a minimum. The hardware designed in this stage formed the basis of all the data captured and used for subsequent training and testing of machine-learning classifiers. The second stage of the system used massed arrays of capacitance sensor pads to capture frames of hand gestures in the form of low-resolution 2D images. The raw data was then processed to account for the random variations and noise naturally present in the surrounding environment. Five different gestures were captured from several test participants and used to train, validate and test the classifiers. Different methods were explored in the recognition and classification stage: initially, simple probabilistic classifiers were used; afterwards, neural networks were used. Two types of neural network were explored, namely the Multilayer Perceptron (MLP) and the Convolutional Neural Network (CNN), which achieved upwards of 92.34 % classification accuracy.
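
    As a rough illustration of the pipeline this abstract describes, and not code from the thesis itself, the sketch below baseline-subtracts raw pad readings and classifies the resulting low-resolution frames with a small CNN. The 8 x 8 frame size, the layer sizes and the use of TensorFlow/Keras are assumptions chosen for the example only; the thesis does not specify them here.

    import numpy as np
    from tensorflow.keras import layers, models

    NUM_CLASSES = 5          # five gestures, as stated in the abstract
    FRAME_SHAPE = (8, 8, 1)  # assumed pad-array resolution (hypothetical)

    def build_cnn(input_shape=FRAME_SHAPE, num_classes=NUM_CLASSES):
        # Small CNN for low-resolution capacitance "images"
        model = models.Sequential([
            layers.Input(shape=input_shape),
            layers.Conv2D(16, (3, 3), padding="same", activation="relu"),
            layers.MaxPooling2D((2, 2)),
            layers.Conv2D(32, (3, 3), padding="same", activation="relu"),
            layers.Flatten(),
            layers.Dense(64, activation="relu"),
            layers.Dense(num_classes, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    def preprocess(frames, baseline):
        # Remove each pad's ambient (stray) capacitance, then normalise to [0, 1]
        frames = frames.astype(np.float32) - baseline
        frames -= frames.min()
        frames /= frames.max() + 1e-9
        return frames[..., np.newaxis]  # add a channel axis for the CNN

    Training would then amount to calling build_cnn().fit() on the preprocessed frames and their gesture labels.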

    Capacitive Sensing and Communication for Ubiquitous Interaction and Environmental Perception

    During the last decade, the functionality of electronic devices within a living environment has constantly increased. Besides the personal computer, tablet PCs, smart household appliances, and smartwatches now enrich the technology landscape. The trend towards an ever-growing number of computing systems has resulted in many highly heterogeneous human-machine interfaces. Users are forced to adapt to technology instead of having the technology adapt to them. Gathering context information about the user is a key factor for improving the interaction experience. Emerging wearable devices show the benefits of sophisticated sensors that make interaction more efficient, natural, and enjoyable. However, many technologies still lack these desirable properties, motivating me to work towards new ways of sensing a user's actions and thus enriching the context. In my dissertation I follow a human-centric approach which ranges from sensing hand movements to recognizing whole-body interactions with objects. This goal can be approached with a vast variety of novel and existing sensing approaches. I focused on perceiving the environment with quasi-electrostatic fields by making use of capacitive coupling between devices and objects. Following this approach, it is possible to implement interfaces that are able to recognize gestures, body movements and manipulations of the environment at typical distances of up to 50 cm. These sensors usually have a limited resolution and can be sensitive to other conductive objects or electrical devices that affect electric fields. The technique allows for designing very energy-efficient and high-speed sensors that can be deployed unobtrusively underneath any kind of non-conductive surface. Compared to other sensing techniques, exploiting capacitive coupling also has a low impact on a user's perceived privacy. In this work, I also aim at enhancing the interaction experience with new perceptual capabilities based on capacitive coupling. I follow a bottom-up methodology and begin by presenting two low-level approaches for environmental perception. In order to perceive a user in detail, I present a rapid prototyping toolkit for capacitive proximity sensing. The prototyping toolkit shows significant advancements in terms of temporal and spatial resolution. Because proximity sensing alone cannot determine the identity of objects or their fine-grained manipulation, I also contribute a generic method for communication based on capacitive coupling. The method allows for designing highly interactive systems that can exchange information through the air and the human body. I furthermore show how human body parts can be recognized from capacitive proximity sensors. The method is able to extract multiple object parameters and track body parts in real time. I conclude my thesis with contributions in the domain of context-aware devices and explicit gesture-recognition systems.
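
    Purely as an illustration of the kind of capacitive proximity sensing described above, and not code from the dissertation, the sketch below assumes a small hypothetical electrode grid and a read_raw_counts() measurement front-end standing in for the real hardware; a hand is detected as a deviation from a calibrated idle baseline and coarsely localised with a weighted centroid.

    import numpy as np

    GRID = (4, 4)  # hypothetical electrode grid; real layouts and sizes vary

    def calibrate_baseline(read_raw_counts, samples=64):
        # Average several idle frames to estimate each electrode's ambient capacitance
        return np.mean([read_raw_counts() for _ in range(samples)], axis=0)

    def detect_hand(read_raw_counts, baseline, threshold=8.0):
        # Proximity shows up as a deviation from the idle baseline
        delta = read_raw_counts().astype(np.float32) - baseline
        if delta.max() < threshold:
            return None  # nothing within sensing range
        # A weighted centroid gives a coarse 2-D hand position over the surface
        ys, xs = np.mgrid[0:GRID[0], 0:GRID[1]]
        w = np.clip(delta, 0.0, None)
        return (float((xs * w).sum() / w.sum()),
                float((ys * w).sum() / w.sum()))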

    Ambient Gesture-Recognizing Surfaces with Visual Feedback

    In recent years, gesture-based interaction has gained increasing interest in Ambient Intelligence. The success of camera-based gesture recognition systems in particular shows that a great variety of applications can benefit significantly from natural and intuitive interaction paradigms. Besides camera-based systems, proximity-sensing surfaces are especially suitable as an input modality for intelligent environments. They can be installed ubiquitously under any kind of non-conductive surface, such as a table. However, interaction barriers and the types of supported gestures are often not apparent to the user. To address this problem, we investigate an approach that combines a semi-transparent capacitive proximity-sensing surface with an LED array. The LED array is used to indicate possible gestural movements and to provide visual feedback on the current interaction status. A user study shows that our approach can enhance the user experience, especially for inexperienced users.
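
    The paper is summarised here only at this level of detail, so the following is a hypothetical sketch of coupling the sensing surface to the LED array rather than the authors' implementation: when no hand is detected, the LEDs keep showing a hint pattern for the supported gestures; otherwise the column under the hand is highlighted as feedback. The column count and threshold are assumed values.

    import numpy as np

    NUM_COLUMNS = 8  # assumed width of both the sensor array and the LED strip

    def led_feedback(column_deltas, hint_pattern, threshold=5.0):
        # column_deltas: baseline-subtracted proximity values, one per sensor column
        # hint_pattern: LED intensities shown when idle (e.g. an animation hinting at gestures)
        deltas = np.asarray(column_deltas, dtype=np.float32)
        if deltas.max() < threshold:
            return hint_pattern                 # no hand nearby: keep the gesture hint
        pattern = np.zeros(NUM_COLUMNS)
        pattern[int(deltas.argmax())] = 1.0     # highlight the column under the hand
        return pattern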
