
    Detection and analysis of heartbeats in seismocardiogram signals

    This paper presents an unsupervised methodology for analyzing SeismoCardioGram (SCG) signals. Starting from raw accelerometric data, heartbeat complexes are extracted and annotated using a two-step procedure. An unsupervised calibration procedure is added to better adapt to different user patterns. Results show that the performance scores achieved by the proposed methodology improve over the related literature: on average, 98.5% sensitivity and 98.6% precision are achieved in beat detection, while the RMS (Root Mean Square) error in heartbeat interval estimation is as low as 4.6 ms. This allows SCG heartbeat complexes to be reliably extracted. The morphological information of these waveforms is then further processed by a modular Convolutional Variational AutoEncoder (VAE) network, aiming to extract compressed, meaningful representations. After unsupervised training, the VAE network is able to recognize different signal morphologies, associating each user with their specific patterns with high accuracy, as indicated by specific performance metrics (including the adjusted Rand and mutual information scores, completeness, and homogeneity). Finally, a linear model is used to interpret the clustering results in the learned latent space, highlighting the impact of different VAE architectural parameters (i.e., the number of stacked convolutional units and the dimension of the latent space).
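    The clustering metrics named in the abstract (adjusted Rand score, mutual information score, completeness, homogeneity) all have standard scikit-learn implementations. As a minimal sketch, assuming ground-truth user identities and latent-space cluster assignments per heartbeat complex (the labels below are illustrative, not the paper's data), the evaluation could look like:

    ```python
    # Hypothetical sketch: scoring how well latent-space clusters match user identities,
    # using the metrics listed in the abstract (scikit-learn implementations).
    from sklearn.metrics import (
        adjusted_rand_score,
        adjusted_mutual_info_score,
        completeness_score,
        homogeneity_score,
    )

    # true_users: user identity per heartbeat complex; pred: cluster assigned in latent space
    true_users = [0, 0, 0, 1, 1, 1, 2, 2, 2]
    pred = [0, 0, 0, 1, 1, 1, 2, 2, 2]

    # All four metrics reach 1.0 when clusters match user identities exactly.
    print(adjusted_rand_score(true_users, pred))
    print(adjusted_mutual_info_score(true_users, pred))
    print(completeness_score(true_users, pred))
    print(homogeneity_score(true_users, pred))
    ```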

    Plug&Play brain-computer interfaces for effective active and assisted living control

    Brain–Computer Interfaces (BCI) rely on the interpretation of brain activity to provide people with disabilities with an alternative/augmentative interaction path. In this light, BCI can be considered an enabling technology in many fields, including the control of Active and Assisted Living (AAL) systems. Interaction barriers could indeed be removed, enabling users with severe motor impairments to gain control over a wide range of AAL features. In this paper, a cost-effective BCI solution, targeted at (but not limited to) AAL system control, is presented. A custom hardware module is briefly reviewed, while signal processing techniques are covered in more depth. Steady-state visual evoked potentials (SSVEP) are exploited as the operating BCI protocol. In contrast with most common SSVEP-BCI approaches, we propose a prediction confidence indicator, which is shown to improve overall classification accuracy. The confidence indicator is derived without any subject-specific procedure and is stable across users: it can thus be defined once and then shared between different persons, allowing a kind of Plug&Play interaction. Furthermore, by modelling rest/idle periods with the confidence indicator, it is possible to detect active control periods and separate them from “background activity”: this is crucial for real-time, self-paced operation. Finally, the indicator also allows the most appropriate observation window length to be chosen dynamically, improving system responsiveness and user comfort. Good results are achieved under these operating conditions, including, for instance, a false positive rate of 0.16 min⁻¹, which outperforms current literature findings.
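    The paper does not specify the confidence indicator's exact form here, but the general idea of thresholding prediction confidence to separate active control from rest/idle periods can be sketched as follows. The scoring scheme, threshold value, and dominance-ratio definition below are illustrative assumptions, not the authors' method:

    ```python
    # Hypothetical sketch of a confidence-gated SSVEP classifier: score each
    # stimulation frequency, then accept the winner only if it clearly dominates.
    import numpy as np

    def classify_with_confidence(scores, threshold=1.5):
        """scores: per-frequency scores (higher = stronger SSVEP response).
        Returns (predicted index, confidence); low confidence -> idle (None)."""
        order = np.argsort(scores)[::-1]
        best, second = scores[order[0]], scores[order[1]]
        confidence = best / max(second, 1e-12)  # dominance of the best candidate
        if confidence < threshold:
            return None, confidence             # treat as rest/background activity
        return int(order[0]), confidence

    # Strong, unambiguous response at frequency index 2 -> accepted as a command
    print(classify_with_confidence(np.array([0.2, 0.3, 0.9])))
    # Ambiguous scores -> rejected as idle, reducing false activations
    print(classify_with_confidence(np.array([0.50, 0.55, 0.52])))
    ```

    A subject-independent threshold like this is what would make the interaction Plug&Play: it is fixed once and reused across users, rather than being recalibrated per person.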
