1,458 research outputs found

    Interaction With Tilting Gestures In Ubiquitous Environments

    Full text link
    In this paper, we introduce a tilting interface that controls direction-based applications in ubiquitous environments. A tilt interface is useful for situations that require quick, remote interactions or that take place in public spaces. We explored the proposed tilting interface with different application types and classified the tilting interaction techniques. Augmenting objects with sensors can potentially address the lack of intuitive and natural input devices in ubiquitous environments. We conducted an experiment to test the usability of the proposed tilting interface and to compare it with conventional input devices and hand gestures. The results showed that tilt gestures outperformed hand gestures in terms of speed, accuracy, and user satisfaction. Comment: 13 pages, 10 figures
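    To make the idea of tilt-to-direction mapping concrete, here is a minimal sketch of how accelerometer readings could be classified into directional commands. The threshold, axis convention, and sensor-reading values are assumptions for illustration, not the interface described in the paper.

```python
import math

TILT_THRESHOLD_DEG = 15.0  # hypothetical dead-zone before a tilt counts as a command

def classify_tilt(ax, ay, az):
    """Map raw accelerometer readings (in g) to a directional command."""
    # Roll and pitch estimated from the gravity vector.
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    if pitch > TILT_THRESHOLD_DEG:
        return "forward"
    if pitch < -TILT_THRESHOLD_DEG:
        return "backward"
    if roll > TILT_THRESHOLD_DEG:
        return "right"
    if roll < -TILT_THRESHOLD_DEG:
        return "left"
    return "neutral"

# Example: device tilted forward by roughly 30 degrees.
print(classify_tilt(ax=-0.5, ay=0.0, az=0.87))  # -> "forward"
```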

    Proceedings of the 3rd IUI Workshop on Interacting with Smart Objects

    Get PDF
    These are the Proceedings of the 3rd IUI Workshop on Interacting with Smart Objects. Objects that we use in our everyday life are expanding their formerly restricted interaction capabilities and now provide functionalities that go far beyond their original purpose. They feature computing capabilities and are thus able to capture information, process and store it, and interact with their environments, turning them into smart objects

    Human Movement Recognization using Radars

    Get PDF
    Human movement recognition, or gesture recognition, via mathematical algorithms has attracted great attention in recent years, mainly because of its applications. The key element of such a system is a smart user interface that can recognize a gesture or movement and convey the corresponding message, which is challenging because of the many parameters involved. In this paper, I address the limitations of traditional sensors and improve the interaction between the machine and the sensing system by using radars to detect gestures
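    As a rough illustration of how radar returns can reveal hand motion, the sketch below estimates the dominant Doppler shift of a synthetic continuous-wave radar signal and converts it to a radial velocity. The carrier frequency, sampling rate, and synthetic signal are assumptions for illustration; this is not the paper's processing chain.

```python
import numpy as np

# Hypothetical radar parameters (not taken from the paper).
CARRIER_HZ = 24e9        # 24 GHz CW radar
SAMPLE_RATE = 2000.0     # baseband sampling rate in Hz
C = 3e8                  # speed of light, m/s

def dominant_doppler_velocity(iq_samples, sample_rate=SAMPLE_RATE):
    """Estimate the radial velocity of the strongest mover from I/Q samples."""
    spectrum = np.fft.fftshift(np.fft.fft(iq_samples * np.hanning(len(iq_samples))))
    freqs = np.fft.fftshift(np.fft.fftfreq(len(iq_samples), d=1.0 / sample_rate))
    f_doppler = freqs[np.argmax(np.abs(spectrum))]
    # Doppler shift for a CW radar: f_d = 2 * v / wavelength.
    wavelength = C / CARRIER_HZ
    return f_doppler * wavelength / 2.0

# Synthetic return from a hand moving toward the radar at ~0.5 m/s.
t = np.arange(0, 0.5, 1.0 / SAMPLE_RATE)
f_d = 2 * 0.5 / (C / CARRIER_HZ)   # expected Doppler shift (80 Hz)
iq = np.exp(2j * np.pi * f_d * t) + 0.1 * (np.random.randn(len(t)) + 1j * np.random.randn(len(t)))
print(f"estimated velocity: {dominant_doppler_velocity(iq):.2f} m/s")
```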

    Textile-based wearable sensors for assisting sports performance

    Get PDF
    There is a need for wearable sensors to assess physiological signals and body kinematics during exercise. Such sensors need to be straightforward to use and, ideally, the complete system should be integrated fully within a garment. This would allow wearers to monitor their progress as they undergo an exercise training programme without the need to attach external devices, taking physiological monitoring into a more natural setting. By developing textile sensors, the intelligence is integrated into a sports garment in an unobtrusive manner. A number of textile-based sensors are presented here that have been integrated into garments for various sports applications
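    As one hedged example of the kind of physiological signal such garments could provide, the sketch below estimates breathing rate from a hypothetical chest-band stretch signal by smoothing and peak counting. The signal, sampling rate, and method are illustrative assumptions, not the sensors or processing described in the paper.

```python
import numpy as np

def breaths_per_minute(stretch, sample_rate_hz):
    """Estimate breathing rate from a chest-band stretch signal by peak counting."""
    # Smooth with a short moving average to suppress motion noise.
    window = max(1, int(0.5 * sample_rate_hz))
    smooth = np.convolve(stretch, np.ones(window) / window, mode="same")
    centered = smooth - smooth.mean()
    # A peak is a sample larger than both neighbours and above the mean.
    peaks = np.flatnonzero(
        (centered[1:-1] > centered[:-2]) & (centered[1:-1] > centered[2:]) & (centered[1:-1] > 0)
    )
    duration_min = len(stretch) / sample_rate_hz / 60.0
    return len(peaks) / duration_min

# Synthetic 60 s recording at 50 Hz with a 15 breaths/min rhythm plus noise.
fs = 50.0
t = np.arange(0, 60, 1 / fs)
signal = np.sin(2 * np.pi * (15 / 60.0) * t) + 0.1 * np.random.randn(len(t))
print(f"{breaths_per_minute(signal, fs):.1f} breaths/min")
```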

    Wearable performance

    Get PDF
    This is the post-print version of the article. The official published version can be accessed from the link below. Copyright @ 2009 Taylor & Francis. Wearable computing devices worn on the body provide the potential for digital interaction in the world. A new stage of computing technology at the beginning of the 21st Century links the personal and the pervasive through mobile wearables. The convergence between the miniaturisation of microchips (nanotechnology), intelligent textile or interfacial materials production, advances in biotechnology and the growth of wireless, ubiquitous computing emphasises not only mobility but integration into clothing or the human body. In artistic contexts one expects such integrated wearable devices to have the two-way function of interface instruments (e.g. sensor data acquisition and exchange) worn for particular purposes, either for communication with the environment or various aesthetic and compositional expressions. 'Wearable performance' briefly surveys the context for wearables in the performance arts and distinguishes display and performative/interfacial garments. It then focuses on the authors' experiments with 'design in motion' and digital performance, examining prototyping at the DAP-Lab which involves transdisciplinary convergences between fashion and dance, interactive system architecture, electronic textiles, wearable technologies and digital animation. The concept of an 'evolving' garment design that is materialised (mobilised) in live performance between partners originates from DAP Lab's work with telepresence and distributed media addressing the 'connective tissues' and 'wearabilities' of projected bodies through a study of shared embodiment and perception/proprioception in the wearer (tactile sensory processing). Such notions of wearability are applied both to the immediate sensory processing on the performer's body and to the processing of the responsive, animate environment

    Middleware-Driven Intelligent Glove for Industrial Applications

    Get PDF
    It is estimated that by the year 2020, 700 million wearable technology devices will be sold worldwide. One of the reasons is the industries’ need to increase their productivity. Some of the tools welcomed by industries are handheld devices such as tablets, PDAs and mobile phones. However, handheld devices are not ideal for industrial applications because they often subject users to fatigue during their long working hours. A viable solution to this problem is wearable devices. The advantage of wearable devices is that they become part of the user; hence, they subject the user to less fatigue, thereby increasing productivity. This chapter presents the development of an intelligent glove, which is designed to control actuators in an industrial environment. This system utilizes RTI Connext Data Distribution Service (DDS) middleware to facilitate communication over Wi-Fi. Our experiments show very promising results with maximum power consumption of 310 mW and latency as low as 23 ms. These results make the proposed system a good fit for most industrial applications
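    The glove distributes sensor data to actuators through publish/subscribe middleware. The sketch below shows that pattern in miniature using plain UDP multicast as a stand-in for RTI Connext DDS; the topic name, port, payload format, and update rate are assumptions for illustration, not the authors' implementation.

```python
import json
import socket
import time

# Assumed values for illustration only; the chapter's actual topic names,
# transport settings, and payload format are not reproduced here.
TOPIC = "glove/finger_flex"
GROUP, PORT = "239.255.0.1", 7400   # multicast group standing in for a DDS domain

def publish_reading(sock, flex_values):
    """Publish one glove sample as a JSON datagram (stand-in for a DDS topic write)."""
    sample = {"topic": TOPIC, "t": time.time(), "flex": flex_values}
    sock.sendto(json.dumps(sample).encode("utf-8"), (GROUP, PORT))

def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    # Placeholder sensor values: five finger-flex readings in the range 0..1023.
    for _ in range(5):
        publish_reading(sock, [512, 300, 780, 120, 650])
        time.sleep(0.02)  # ~50 Hz update rate as an illustrative control loop

if __name__ == "__main__":
    main()
```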

    Implementation of Raspberry Pi Based Inteli Glove for Gesture to Voice Translation with Location Intimation for Deaf and Blind People

    Get PDF
    Communication plays an important role for human beings and is treated as a life skill. This paper helps improve communication with deaf and mute people using flex sensor technology. A device is developed that can translate different signs, including Indian Sign Language, into both text and voice. People communicating with deaf and mute users may not understand their signs and expressions; hence, an approach has been developed to make gesture-based communication audible, helping users convey their thoughts to others. In the proposed system, an RF module is used for transmitting and receiving the information, a Raspberry Pi serves as the processor, and a GPS module is included so that blind users can identify their location. The entire framework has been implemented, programmed, enclosed, and tested with good results
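    A minimal sketch of the gesture-to-voice step is shown below, assuming flex-sensor readings are already available as numbers. The reader function, gesture table, and phrases are hypothetical, and the paper's RF and GPS components are omitted; espeak is used here simply as a common command-line TTS tool on Raspberry Pi OS.

```python
import random
import subprocess

# Hypothetical phrase table: a tuple of per-finger flex levels (0 = straight, 1 = bent)
# mapped to the sentence it should trigger. The actual gesture set in the paper
# is not reproduced here.
GESTURES = {
    (1, 1, 1, 1, 1): "Hello",
    (0, 1, 1, 1, 1): "I need water",
    (1, 0, 0, 0, 0): "Thank you",
}

def read_flex_levels():
    """Stand-in for reading five flex sensors through an ADC; returns 0/1 per finger."""
    return tuple(random.choice((0, 1)) for _ in range(5))   # placeholder data

def speak(text):
    """Convert recognized text to voice using the espeak command-line tool."""
    try:
        subprocess.run(["espeak", text], check=False)
    except FileNotFoundError:
        print("(espeak not installed; would say:)", text)

def main():
    levels = read_flex_levels()
    phrase = GESTURES.get(levels)
    if phrase is not None:
        print("recognized:", phrase)
        speak(phrase)
    else:
        print("gesture not recognized:", levels)

if __name__ == "__main__":
    main()
```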

    Time Complexity of Color Camera Depth Map Hand Edge Closing Recognition Algorithm

    Get PDF
    The objective of this paper is to calculate the time complexity of the color camera depth map hand edge closing algorithm used in hand gesture recognition. The technique recognizes hand gestures for human-computer interaction using a color camera and a depth map, and its time complexity is analysed using 2D minima methods, brute force, and plane sweep. Human-computer interaction is an essential component of most people's daily life. The goal of gesture recognition research is to build a system that can classify specific human gestures and use them to convey information or control devices. Existing methods differ in their input types, classifiers, and techniques for identifying hand gestures. This paper presents the “Color camera depth map hand edge recognition” algorithm, its time complexity, and its simulation in MATLAB
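    To make the complexity comparison concrete, here is a hedged sketch contrasting a brute-force O(n^2) approach with a sort-and-sweep O(n log n) approach on the classic 2D minima (Pareto-minimal points) problem named in the abstract. The point set and timing harness are illustrative and stand apart from the paper's MATLAB simulation.

```python
import random
import time

def minima_brute_force(points):
    """2D minima: keep p if no other point is <= p in both coordinates. O(n^2)."""
    result = []
    for p in points:
        if not any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in points):
            result.append(p)
    return result

def minima_plane_sweep(points):
    """Sort by x, sweep left to right keeping the lowest y seen so far. O(n log n)."""
    result = []
    best_y = float("inf")
    for x, y in sorted(points):           # sweep in increasing x (ties broken by y)
        if y < best_y:                    # strictly lower y than everything to the left
            result.append((x, y))
            best_y = y
    return result

if __name__ == "__main__":
    pts = [(random.random(), random.random()) for _ in range(2000)]
    for fn in (minima_brute_force, minima_plane_sweep):
        t0 = time.perf_counter()
        out = fn(pts)
        print(f"{fn.__name__}: {len(out)} minima in {time.perf_counter() - t0:.4f} s")
```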