
    Present and future pervasive healthcare methodologies: intelligent body devices, processing and modeling to search for new cardiovascular and physiological biomarkers

    The motivation for this work comes from pervasive computing technologies for healthcare and wearable healthcare IT systems, an emerging field of research that introduces revolutionary paradigms for computing models in the 21st century. This thesis focuses on emerging personal health technologies and pattern recognition strategies for early diagnosis, personalized treatment, and rehabilitation of individuals with cardiovascular and neurophysiological diseases. Attention was paid to the development of an intelligent system for the automatic classification of cardiac valve disease for screening purposes; the promising results reported open the possibility of implementing a new screening strategy for diagnosing cardiac valve disease in developing countries. A novel assistive architecture for the elderly was presented, able to non-invasively assess muscle fatigue from surface electromyography acquired over a wireless platform during exercise on an ergonomic platform. Finally, a wearable chest belt for ECG monitoring, used to investigate the psycho-physiological effects of the autonomic nervous system, and a wearable technology for monitoring knee kinematics and recognizing ambulatory activities were characterized to evaluate the reliability of the collected data for clinical purposes. The potential impact of this research in the clinical arena would be considerable, since promising data show that such emerging personal technologies and methodologies are effective in several scenarios for early screening and for the discovery of novel diagnostic and prognostic biomarkers.
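    As context for the muscle-fatigue assessment mentioned above: a standard sEMG fatigue indicator (not necessarily the one used in the thesis) is the decline of the power spectrum's median frequency over time. Below is a minimal sketch of that computation, assuming a single 1 kHz sEMG channel; the window length and all parameters are illustrative, and a sustained downward trend in the returned values is the conventional marker of developing fatigue.

```python
# A minimal sketch, not the thesis pipeline: track sEMG median
# frequency over time; a sustained drop is the classic fatigue sign.
import numpy as np
from scipy.signal import welch

def median_frequency(segment, fs=1000):
    """Median frequency (Hz) of the sEMG power spectrum."""
    freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), 512))
    cumulative = np.cumsum(psd)
    # First frequency bin at which half the total power is reached.
    return freqs[np.searchsorted(cumulative, cumulative[-1] / 2)]

def fatigue_trend(emg, fs=1000, window_s=1.0):
    """Median frequency per non-overlapping window of the recording."""
    step = int(window_s * fs)
    return [median_frequency(emg[i:i + step], fs)
            for i in range(0, len(emg) - step + 1, step)]

# Surrogate data standing in for a real 10 s recording at 1 kHz.
rng = np.random.default_rng(0)
print(fatigue_trend(rng.standard_normal(10_000))[:3])
```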

    PhysioAR: smart sensing and augmented reality for physical rehabilitation

    The continuous evolution of technology allows for an increasingly detailed analysis of the human being. Certain medical areas, such as physiotherapy, require a correct analysis of the patient's evolution. The development of Information and Communication Technologies and recent innovations in the Internet of Things open new possibilities in the medical field, such as systems for the remote monitoring of patients with new sensors that allow correct analysis of patients' health data. In physiotherapy, one of the most common problems in applying treatments is patient demotivation, something that can now be reduced by introducing Augmented Reality, which provides a new experience to the patient. For this purpose, a system was developed that combines intelligent sensors with an Augmented Reality application to help monitor patient performance. The system monitors lower-limb movement acceleration, knee joint angle, patient balance, muscular activity, and cardiac activity using electromyography and electrocardiography, with a wearable set of tools designed for easy use.
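    Of the quantities PhysioAR monitors, knee joint angle is the most geometry-driven. The sketch below shows one common way to approximate knee flexion from thigh- and shank-mounted accelerometers using gravity tilt; the two-sensor tilt model and the sample values are assumptions for illustration, not the dissertation's implementation.

```python
# Illustrative sketch only: knee flexion from gravity tilt of two
# static accelerometer readings; valid only when segments are still.
import numpy as np

def tilt_deg(acc):
    """Segment tilt from vertical, from a static 3-axis accel sample."""
    ax, ay, az = acc
    return np.degrees(np.arctan2(np.hypot(ax, ay), az))

def knee_flexion_deg(thigh_acc, shank_acc):
    """Knee angle approximated as the difference of segment tilts."""
    return abs(tilt_deg(shank_acc) - tilt_deg(thigh_acc))

# Thigh near vertical, shank tilted ~45 degrees -> ~45 degree flexion.
print(knee_flexion_deg((0.0, 0.0, 9.81), (0.0, 6.94, 6.94)))
```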

    Fused mechanomyography and inertial measurement for human-robot interface

    Human-Machine Interfaces (HMI) are the technology through which we interact with the ever-increasing quantity of smart devices surrounding us. The fundamental goal of an HMI is to facilitate robot control by uniting a human operator, as the supervisor, with a machine, as the task executor. Sensors, actuators, and onboard intelligence have not reached the point where robotic manipulators may function with complete autonomy, and therefore some form of HMI is still necessary in unstructured environments. These may include environments where direct human action is undesirable or infeasible, and situations where a robot must assist and/or interface with people. Contemporary literature has introduced concepts such as body-worn mechanical devices, instrumented gloves, inertial or electromagnetic motion tracking sensors on the arms, head, or legs, electroencephalographic (EEG) brain activity sensors, electromyographic (EMG) muscular activity sensors, and camera-based (vision) interfaces to recognize hand gestures and/or track arm motions for assessment of operator intent and generation of robotic control signals. While these developments offer a wealth of future potential, their utility has been largely restricted to laboratory demonstrations in controlled environments due to issues such as a lack of portability and robustness, and an inability to extract operator intent for both arm and hand motion. Wearable physiological sensors hold particular promise for the capture of human intent and commands. EMG-based gesture recognition systems in particular have received significant attention in the recent literature. As wearable pervasive devices, they offer benefits over camera or physical input systems in that they neither inhibit the user physically nor constrain the user to a location where the sensors are deployed. Despite these benefits, EMG alone has yet to demonstrate the capacity to recognize both gross movement (e.g. arm motion) and finer grasping (e.g. hand movement). As such, many researchers have proposed fusing muscle activity (EMG) and motion tracking (e.g. inertial measurement) to combine arm motion and grasp intent as HMI input for manipulator control. However, such work has arguably reached a plateau, since EMG suffers from interference from environmental factors which causes signal degradation over time, demands an electrical connection with the skin, and has not demonstrated the capacity to function outside controlled environments for long periods of time. This thesis proposes a new form of gesture-based interface utilising a novel combination of inertial measurement units (IMUs) and mechanomyography (MMG) sensors. The modular system permits numerous configurations of IMUs to derive body kinematics in real time and uses this to convert arm movements into control signals. Additionally, bands containing six mechanomyography sensors were used to observe muscular contractions in the forearm generated by specific hand motions. This combination of continuous and discrete control signals allows a large variety of smart devices to be controlled. Several methods of pattern recognition were implemented to provide accurate decoding of the mechanomyographic information, including Linear Discriminant Analysis and Support Vector Machines. Based on these techniques, accuracies of 94.5% and 94.6% respectively were achieved for 12-gesture classification. In real-time tests, an accuracy of 95.6% was achieved in 5-gesture classification.
    It has previously been noted that MMG sensors are susceptible to motion-induced interference; this thesis establishes that arm pose also changes the measured signal. It introduces a new method of fusing IMU and MMG data to provide a classification that is robust to both of these sources of interference. Additionally, an improvement to orientation estimation and a new orientation estimation algorithm are proposed. These improvements to the robustness of the system provide the first solution able to reliably track both motion and muscle activity for extended periods of time for HMI outside a clinical environment. Applications in robot teleoperation in both real-world and virtual environments were explored. With multiple degrees of freedom, robot teleoperation provides an ideal test platform for HMI devices, since it requires a combination of continuous and discrete control signals. The field of prosthetics also represents a unique challenge for HMI applications. In an ideal situation, the sensor suite should be capable of detecting the muscular activity in the residual limb which is naturally indicative of intent to perform a specific hand pose, and of triggering this pose in the prosthetic device. Dynamic environmental conditions within a socket, such as skin impedance, have delayed the translation of gesture control systems into prosthetic devices; mechanomyography sensors, however, are unaffected by such issues. There is huge potential for a system like this to be utilised as a controller as ubiquitous computing systems become more prevalent and the desire for a simple, universal interface increases. Such systems have the potential to impact significantly on the quality of life of prosthetic users and others.
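    The classification stage described above maps windowed MMG activity to discrete gestures with LDA or SVM classifiers. The following sketch reproduces that pattern under stated assumptions: synthetic 6-channel windows, simple RMS and mean-absolute-value features, and scikit-learn models; it is not the thesis code, and real recordings would be needed to approach the reported accuracies.

```python
# Hedged sketch of MMG gesture classification with LDA and SVM;
# features and data here are illustrative stand-ins.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def mmg_features(window):
    """Per-channel RMS and mean absolute value for one (samples, 6) window."""
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    mav = np.mean(np.abs(window), axis=0)
    return np.concatenate([rms, mav])

# Surrogate dataset: 240 windows of 200 samples x 6 channels, 12 gestures.
rng = np.random.default_rng(1)
windows = rng.standard_normal((240, 200, 6))
labels = np.repeat(np.arange(12), 20)
X = np.array([mmg_features(w) for w in windows])

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM", SVC(kernel="rbf"))]:
    print(name, cross_val_score(clf, X, labels, cv=5).mean())
```

    LDA is attractive for on-body systems because training and inference are cheap; the SVM's kernel adds flexibility at a higher computational cost, which matches the near-identical accuracies the thesis reports for the two methods.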

    Emerging ExG-based NUI Inputs in Extended Realities : A Bottom-up Survey

    Incremental and quantitative improvements of two-way interactions with extended realities (XR) are contributing toward a qualitative leap into a state of XR ecosystems being efficient, user-friendly, and widely adopted. However, there are multiple barriers on the way toward the omnipresence of XR, among them the computational and power limitations of portable hardware, the social acceptance of novel interaction protocols, and the usability and efficiency of interfaces. In this article, we overview and analyse novel natural user interfaces based on sensing electrical bio-signals that can be leveraged to tackle the challenges of XR input interactions. Electroencephalography-based brain-machine interfaces that enable thought-only hands-free interaction, myoelectric input methods that track body gestures employing electromyography, and gaze-tracking electrooculography input interfaces are examples of electrical bio-signal sensing technologies united under the collective concept of ExG. ExG signal acquisition modalities provide a way to interact with computing systems using natural, intuitive actions, enriching interactions with XR. This survey provides a bottom-up overview starting from (i) underlying biological aspects and signal acquisition techniques, (ii) ExG hardware solutions, (iii) ExG-enabled applications, (iv) a discussion of the social acceptance of such applications and technologies, as well as (v) research challenges, application directions, and open problems, evidencing the benefits that ExG-based Natural User Interface inputs can introduce to the area of XR.
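    To make the EOG modality concrete: horizontal eye movements shift the corneo-retinal potential measured by electrodes near the eyes, so a rapid amplitude step in the horizontal channel indicates a gaze shift. A toy sketch of such an input detector follows; the sampling rate and step threshold are assumptions, not values from the survey.

```python
# Toy EOG gaze-shift detector: flag large sample-to-sample amplitude
# jumps in a horizontal EOG channel; all parameters are assumptions.
import numpy as np

def detect_gaze_shifts(eog_uv, fs=250, step_uv=100.0):
    """Return (time_s, direction) for rapid EOG amplitude jumps."""
    steps = np.diff(eog_uv)
    idx = np.flatnonzero(np.abs(steps) > step_uv)
    return [(i / fs, "right" if steps[i] > 0 else "left") for i in idx]

# Synthetic trace: baseline, a +300 uV jump (look right), then return.
trace = np.concatenate([np.zeros(250), np.full(250, 300.0), np.zeros(250)])
print(detect_gaze_shifts(trace))
```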

    SensX: About Sensing and Assessment of Complex Human Motion

    The great success of wearables and smartphone apps for the provision of extensive physical workout instructions boosts a whole industry dealing with consumer-oriented sensors and sports equipment. But these opportunities also bring new challenges. The unregulated distribution of instructions for ambitious exercises enables inexperienced users to undertake demanding workouts without professional supervision, which may lead to suboptimal training success or even serious injuries. We believe that automated supervision and real-time feedback during a workout may help to solve these issues. We therefore introduce four fundamental steps for complex human motion assessment and present SensX, a sensor-based architecture for monitoring, recording, and analyzing complex and multi-dimensional motion chains. We provide the results of our preliminary study encompassing 8 different body-weight exercises, 20 participants, and more than 9,220 recorded exercise repetitions. Furthermore, insights into SensX's classification capabilities and the impact of specific sensor configurations on the analysis process are given. Published in the Proceedings of the 14th IEEE International Conference on Networking, Sensing and Control (ICNSC), May 16-18, 2017, Calabria, Italy.
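    One building block of sensor-based workout assessment like SensX's is segmenting and counting repetitions from inertial data. The sketch below counts repetitions as prominent peaks in the accelerometer magnitude; all parameters are illustrative assumptions rather than the paper's configuration.

```python
# Hedged sketch: count exercise repetitions as prominent peaks in
# |acceleration|; thresholds are assumptions, not the SensX setup.
import numpy as np
from scipy.signal import find_peaks

def count_repetitions(acc_xyz, fs=50, min_period_s=1.0):
    """Count repetitions as prominent peaks in acceleration magnitude."""
    magnitude = np.linalg.norm(acc_xyz, axis=1)
    peaks, _ = find_peaks(magnitude,
                          distance=int(min_period_s * fs),
                          prominence=np.std(magnitude))
    return len(peaks)

# Surrogate signal: ~0.5 Hz squats for 20 s -> about 10 repetitions.
t = np.arange(0, 20, 1 / 50)
acc = np.column_stack([np.zeros_like(t), np.zeros_like(t),
                       9.81 + 2.0 * np.sin(2 * np.pi * 0.5 * t)])
print(count_repetitions(acc))
```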

    Wearable devices for ergonomics: A systematic literature review

    Wearable devices are pervasive solutions for increasing work efficiency, improving workers' well-being, and creating interactions between users and the environment anytime and anywhere. Although several studies on their use in various fields have been performed, there are no systematic reviews of their utilisation in ergonomics. We therefore conducted a systematic review to identify wearable devices proposed in the scientific literature for ergonomic purposes and to analyse how they can support the improvement of ergonomic conditions. Twenty-eight papers were retrieved and analysed along eleven comparison dimensions related to ergonomic factors, purposes, criteria, populations, application, and validation. The majority of the available devices are sensor systems composed of different types and numbers of sensors located on diverse body parts. These solutions also represent the technology most frequently employed for monitoring and reducing the risk of awkward postures. In addition, smartwatches, body-mounted smartphones, insole pressure systems, and vibrotactile feedback interfaces have been developed for evaluating and/or controlling physical loads or postures. The main results and the defined framework of analysis provide an overview of the state of the art of smart wearables in ergonomics, support the selection of the most suitable ones in industrial and non-industrial settings, and suggest future research directions.
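    As a concrete example of the posture-monitoring category discussed above, a single body-worn accelerometer can flag awkward trunk postures by measuring inclination relative to gravity. The sketch below applies a 60-degree flexion threshold in line with common ergonomic guidance; the threshold, sensor placement, and function names are assumptions, not findings of the review.

```python
# Hedged sketch: flag trunk flexion beyond a risk threshold from a
# chest-worn accelerometer's static gravity reading.
import numpy as np

def trunk_inclination_deg(acc):
    """Angle between the measured gravity vector and vertical."""
    acc = np.asarray(acc, dtype=float)
    cos_angle = acc[2] / np.linalg.norm(acc)
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

def awkward_posture(acc, limit_deg=60.0):
    """True when trunk flexion exceeds the assumed risk threshold."""
    return trunk_inclination_deg(acc) > limit_deg

print(awkward_posture((0.0, 8.5, 4.9)))   # ~60 degrees -> borderline
```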

    Sensing with Earables: A Systematic Literature Review and Taxonomy of Phenomena

    Earables have emerged as a unique platform for ubiquitous computing by augmenting ear-worn devices with state-of-the-art sensing. This new platform has spurred a wealth of new research exploring what can be detected with a small, wearable form factor. As a sensing location, the ears are less susceptible to motion artifacts and are in close proximity to a number of important anatomical structures, including the brain, blood vessels, and facial muscles, which reveal a wealth of information. They can be easily reached by the hands, and the ear canal itself is affected by mouth, face, and head movements. We conducted a systematic literature review of 271 earable publications from the ACM and IEEE libraries. These were synthesized into an open-ended taxonomy of 47 different phenomena that can be sensed in, on, or around the ear. Through analysis, we identify 13 fundamental phenomena from which all other phenomena can be derived, and discuss the different sensors and sensing principles used to detect them. We comprehensively review the phenomena in four main areas: (i) physiological monitoring and health, (ii) movement and activity, (iii) interaction, and (iv) authentication and identification. This breadth highlights the potential that earables offer as a ubiquitous, general-purpose platform.
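    As an example from the physiological-monitoring area, in-ear photoplethysmography (PPG) is a frequently studied earable sensing modality. The sketch below estimates heart rate from the spacing of PPG pulse peaks; the sampling rate and peak-distance constraint are assumptions for illustration, not parameters from the surveyed papers.

```python
# Illustrative sketch: mean heart rate from in-ear PPG pulse-peak
# intervals; parameters are assumptions, not from the survey.
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(ppg, fs=100):
    """Mean heart rate from intervals between PPG pulse peaks."""
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs))  # caps at 150 bpm
    if len(peaks) < 2:
        return None
    return 60.0 / (np.mean(np.diff(peaks)) / fs)

# Surrogate 1.2 Hz pulse wave (72 bpm) over 10 s at 100 Hz.
t = np.arange(0, 10, 1 / 100)
ppg = np.sin(2 * np.pi * 1.2 * t)
print(round(heart_rate_bpm(ppg)))   # ~72
```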