
    PhysioSkin: Rapid Fabrication of Skin-Conformal Physiological Interfaces

    Advances in rapid prototyping platforms have made physiological sensing accessible to a wide audience. However, off-the-shelf electrodes commonly used for capturing biosignals are typically thick, non-conformal and do not support customization. We present PhysioSkin, a rapid, do-it-yourself prototyping method for fabricating custom multi-modal physiological sensors, using commercial materials and a commodity desktop inkjet printer. It realizes ultrathin skin-conformal patches (~1 ”m) and interactive textiles that capture sEMG, EDA and ECG signals. It further supports fabricating devices with custom levels of thickness and stretchability. We present detailed fabrication explorations on multiple substrate materials, functional inks and skin adhesive materials. Informed by the literature, we also provide design recommendations for each of the modalities. Evaluation results show that the sensor patches achieve a high signal-to-noise ratio. Example applications demonstrate the functionality and versatility of our approach for prototyping a next generation of physiological devices that intimately couple with the human body.
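    The abstract reports signal quality as a signal-to-noise ratio but does not state how it is computed. A common convention for sEMG patches is the ratio of RMS amplitude during contraction to RMS amplitude at rest, expressed in decibels; the sketch below illustrates that convention only and is an assumption, not the paper's evaluation procedure.

```python
# Hedged sketch: one common way to report sEMG signal-to-noise ratio,
# i.e. RMS during contraction vs. RMS of the resting baseline, in dB.
# This is an assumed convention, not PhysioSkin's published procedure.
import numpy as np

def snr_db(active: np.ndarray, rest: np.ndarray) -> float:
    def rms(x: np.ndarray) -> float:
        return float(np.sqrt(np.mean(np.asarray(x, dtype=float) ** 2)))
    return 20.0 * np.log10(rms(active) / rms(rest))

# Stand-in data: a high-amplitude burst for contraction, low-amplitude baseline noise.
contraction = np.random.randn(2000) * 0.5
baseline = np.random.randn(2000) * 0.05
print(f"SNR ~ {snr_db(contraction, baseline):.1f} dB")
```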

    Force-Aware Interface via Electromyography for Natural VR/AR Interaction

    While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR. Comment: ACM Transactions on Graphics (SIGGRAPH Asia 2022).
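    The abstract describes a neural-network model that maps forearm sEMG to finger-wise force estimates but gives no architectural details. The sketch below is a minimal stand-in for that idea: a small 1-D convolutional regressor over a window of multi-channel sEMG. The channel count, window length and layer sizes are assumptions, not the authors' design.

```python
# Minimal sketch (not the authors' architecture): a small 1-D CNN that maps a
# window of multi-channel forearm sEMG to per-finger force estimates.
# Channel count (8), window length (200 samples) and layer sizes are assumptions.
import torch
import torch.nn as nn

class EMGForceDecoder(nn.Module):
    def __init__(self, n_channels: int = 8, n_fingers: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )
        self.regressor = nn.Linear(64, n_fingers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, samples) -> (batch, n_fingers) force estimates
        return self.regressor(self.features(x).squeeze(-1))

model = EMGForceDecoder()
window = torch.randn(1, 8, 200)  # one 200-sample window of 8-channel sEMG
forces = model(window)           # per-finger force predictions
```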

    Classification of Arm Movement Based on Upper Limb Muscle Signal for Rehabilitation Device

    A rehabilitation device is used as an exoskeleton for people who experience limb failure. An arm rehabilitation device may ease the rehabilitation programme for those who suffer from arm dysfunction. The device used to facilitate the tasks of the programme should improve the electrical activity in the motor unit while minimising the mental effort of the user. Electromyography (EMG) is a technique for analysing the electrical activity present in musculoskeletal systems. In disabled persons, the electrical activity in the muscles fails to contract the muscles for movement. To prevent paralysed muscles from becoming spastic or flaccid, the movements must require minimal mental effort. To minimise the cerebral effort required, EMG signals from able-bodied subjects are analysed before being implemented in the device. The signals are collected according to the procedure of surface electromyography for non-invasive assessment of muscles (SENIAM). The EMG signals are used to set the movement patterns of the arm rehabilitation device. The filtered signals are further processed by extracting the following features: Standard Deviation (STD), Mean Absolute Value (MAV), Root Mean Square (RMS), Zero Crossing (ZCS) and Variance (VAR). Feature extraction reduces the EMG data to a compact feature vector, minimising the signal error before it is passed to the classifier. The features are classified with the SOM Toolbox in MATLAB into several degrees of arm movement and visualised in U-matrix form.
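    The feature set named in the abstract is straightforward to compute; the following is a minimal illustrative sketch in Python (the paper itself works in MATLAB with the SOM Toolbox). The window length and the zero-crossing noise threshold are assumptions, not values from the paper.

```python
# Illustrative Python sketch of the time-domain features listed in the abstract
# (STD, MAV, RMS, ZCS, VAR), computed over one filtered sEMG window.
# The paper uses MATLAB and the SOM Toolbox; this is not the authors' code.
import numpy as np

def emg_features(window: np.ndarray, zc_threshold: float = 0.01) -> dict:
    """window: 1-D array of filtered EMG samples.
    zc_threshold: assumed noise threshold for counting zero crossings."""
    sign_changes = window[:-1] * window[1:] < 0
    large_enough = np.abs(window[:-1] - window[1:]) >= zc_threshold
    return {
        "STD": float(np.std(window)),                     # standard deviation
        "MAV": float(np.mean(np.abs(window))),            # mean absolute value
        "RMS": float(np.sqrt(np.mean(window ** 2))),      # root mean square
        "ZCS": int(np.sum(sign_changes & large_enough)),  # zero crossings
        "VAR": float(np.var(window)),                     # variance
    }

# Example: a stand-in for one filtered EMG window (1000 samples assumed).
window = np.random.randn(1000) * 0.1
features = emg_features(window)  # feature vector passed on to the classifier
```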

    Biosensing and Actuation—Platforms Coupling Body Input-Output Modalities for Affective Technologies

    Research in the use of ubiquitous technologies, tracking systems and wearables within mental health domains is on the rise. In recent years, affective technologies have gained traction and garnered the interest of interdisciplinary fields as the research on such technologies matured. However, while the role of movement and bodily experience in affective experience is well-established, how to best address movement and engagement beyond measuring cues and signals in technology-driven interactions has been unclear. In a joint industry-academia effort, we aim to remodel how affective technologies can help address body and emotional self-awareness. We present an overview of biosignals that have become standard in low-cost physiological monitoring and show how these can be matched with methods and engagements used by interaction designers skilled in designing for bodily engagement and aesthetic experiences. Taking both strands of work together offers unprecedented design opportunities that inspire further research. Through first-person soma design, an approach that draws upon the designer’s felt experience and puts the sentient body at the forefront, we outline a comprehensive approach to creating novel interactions in the form of couplings that combine biosensing and body feedback modalities of relevance to affective health. These couplings underpin design toolkits that have the potential to render rich embodied interactions to the designer/user. As a result, we introduce the concept of “orchestration”. By orchestration, we refer to the design of the overall interaction: coupling sensors to actuation of relevance to the affective experience; initiating and closing the interaction; habituating; helping improve the users’ body awareness and engagement with emotional experiences; soothing, calming, or energising, depending on the affective health condition and the intentions of the designer. Through the creation of a range of prototypes and couplings we elicited requirements for broader orchestration mechanisms. First-person soma design lets researchers look afresh at biosignals that, when experienced through the body, can reshape affective technologies with novel ways to interpret biodata, feel it, understand it and reflect upon our bodies.

    Computational design and optimization of electro-physiological sensors

    Electro-physiological sensing devices are becoming increasingly common in diverse applications. However, designing such sensors in compact form factors and for high-quality signal acquisition is a challenging task even for experts, is typically done using heuristics, and requires extensive training. Our work proposes a computational approach for designing multi-modal electro-physiological sensors. By employing an optimization-based approach alongside an integrated predictive model for multiple modalities, compact sensors can be created which offer an optimal trade-off between high signal quality and small device size. The task is assisted by a graphical tool that allows designers to easily specify design preferences and to visually analyze the generated designs in real time, enabling designer-in-the-loop optimization. Experimental results show high quantitative agreement between the prediction of the optimizer and experimentally collected physiological data. They demonstrate that generated designs can achieve an optimal balance between the size of the sensor and its signal acquisition capability, outperforming expert-generated solutions.
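    The abstract frames sensor design as a trade-off between signal quality and device size, solved with an optimizer coupled to a predictive model. The toy sketch below only illustrates the shape of such a weighted trade-off; the quality model, the size proxy and the weighting are placeholders, not the paper's method.

```python
# Toy sketch of a size-vs-signal-quality trade-off of the kind the abstract
# describes. The quality model below is a made-up placeholder, not the paper's
# predictive model; the weighting scheme is likewise an assumption.
import numpy as np

def predicted_signal_quality(electrode_spacing_mm: float) -> float:
    # Placeholder: quality saturates as electrode spacing grows.
    return 1.0 - np.exp(-electrode_spacing_mm / 20.0)

def device_size(electrode_spacing_mm: float) -> float:
    # Proxy: larger electrode spacing implies a larger patch.
    return electrode_spacing_mm

def objective(spacing: float, quality_weight: float = 0.7) -> float:
    # Weighted trade-off; a designer-in-the-loop tool would expose the weight.
    size_penalty = device_size(spacing) / 100.0
    return quality_weight * predicted_signal_quality(spacing) - (1 - quality_weight) * size_penalty

candidates = np.linspace(5.0, 100.0, 96)       # candidate spacings in mm
best = max(candidates, key=objective)          # simple grid search
print(f"best electrode spacing under this toy model: {best:.1f} mm")
```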

    The challenges in computer supported conceptual engineering design

    Computer Aided Engineering Design (CAED) supports the engineering design process during detail design, but it is not commonly used in the conceptual design stage. This article explores, through the literature, why this is and how the engineering design research community is responding through the development of new conceptual CAED systems and HCI (Human Computer Interface) prototypes. First, the requirements and challenges for future conceptual CAED and HCI solutions to better support conceptual design are explored and categorised. Then, the prototypes developed in both areas since 2000 are discussed. Characteristics already considered and those required for future development of CAED systems and HCIs are proposed and discussed, one of the key ones being experience. The prototypes reviewed offer innovative solutions, but only address selected requirements of conceptual design, and are thus unlikely to provide a solution which fits the wider needs of the engineering design industry. More importantly, while the majority of prototypes show promising results, they are of low maturity and require further development.

    A taxonomy and state of the art revision on affective games

    Affective Games are a sub-field of Affective Computing that studies how to design videogames that are able to react to the emotions expressed by the player, as well as to provoke desired emotions in them. To achieve those goals it is necessary to research how to measure and detect human emotions with a computer, and how to adapt videogames to the perceived emotions so as to finally provoke them in the players. This work presents a taxonomy for research on affective games centring on the aforementioned issues. We also provide a review of the most relevant published works known to the authors in this area. Finally, we analyse and discuss which important research problems are still open and might be tackled by future investigations in the area of Affective Games. This work has been co-funded by the following research projects: EphemeCH (TIN2014-56494-C4-{1,4}-P) and DeepBio (TIN2017-85727-C4-3-P) by the Spanish Ministry of Economy and Competitiveness, under the European Regional Development Fund FEDER, and the Justice Programme of the European Union (2014–2020) 723180 – RiskTrack – JUST-2015-JCOO-AG/JUST-2015-JCOO-AG-
    • 

    corecore