581 research outputs found
A Wearable Brain-Computer Interface Instrument for Augmented Reality-Based Inspection in Industry 4.0
This paper proposes a wearable monitoring system for inspection in the framework of Industry 4.0. The instrument integrates augmented reality (AR) glasses with a noninvasive single-channel brain-computer interface (BCI), which replaces the classical input interface of AR platforms. Steady-state visually evoked potentials (SSVEPs) are measured by single-channel electroencephalography (EEG) and detected through simple power spectral density analysis. The visual stimuli for SSVEP elicitation are provided by the AR glasses while they display the inspection information. The real-time metrological performance of the BCI is assessed via the receiver operating characteristic curve on experimental data from 20 subjects. The characterization was carried out for stimulation times from 10.0 down to 2.0 s. The classification thresholds were found to be subject-dependent, and the average accuracy ranges from 98.9% at 10.0 s to 81.1% at 2.0 s. An inspection case study of the integrated AR-BCI device shows an encouraging accuracy of about 80% of the laboratory values.
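The abstract does not detail the implementation; the core idea of single-channel SSVEP detection via power spectral density can be sketched as follows. All signal parameters, frequencies, and function names here are illustrative assumptions, not the authors' actual pipeline:

```python
import numpy as np

def band_power(signal, fs, f0, half_bw=0.5):
    """Power of `signal` in a narrow band around frequency f0 (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f0 - half_bw) & (freqs <= f0 + half_bw)
    return psd[mask].sum()

def classify_ssvep(signal, fs, stimulus_freqs):
    """Return the stimulus frequency with the highest narrow-band power."""
    powers = [band_power(signal, fs, f) for f in stimulus_freqs]
    return stimulus_freqs[int(np.argmax(powers))]

# Synthetic single-channel EEG: a 10 Hz SSVEP component buried in noise.
fs = 256                         # sampling rate (Hz)
t = np.arange(0, 4.0, 1.0 / fs)  # 4 s epoch
rng = np.random.default_rng(0)
eeg = 0.8 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1.0, t.size)

print(classify_ssvep(eeg, fs, [10.0, 12.0]))  # → 10.0
```

In a real system a per-subject threshold on the winning band power (as the abstract describes) would additionally reject epochs in which no stimulus is attended.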
High-wearable EEG-based distraction detection in motor rehabilitation
A method for EEG-based distraction detection during motor-rehabilitation tasks is proposed. A wireless cap with dry electrodes and a low number of channels guarantees very high wearability. Experimental validation is performed on a dataset from 17 volunteers. Different feature-extraction methods from the spatial, temporal, and frequency domains, together with several classification strategies, were evaluated. The performance of five supervised classifiers in discriminating between attention on pure movement and attention in the presence of distractors was compared. A k-Nearest Neighbors classifier achieved an accuracy of 92.8 ± 1.6%; in this configuration, feature extraction is based on a custom 12-pass-band Filter Bank (FB) and the Common Spatial Pattern (CSP) algorithm. In particular, the mean recall of classification (percentage of true positives in distraction detection) is higher than 92%, which allows the therapist or an automated system to know when to stimulate the patient's attention to enhance therapy effectiveness.
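The CSP spatial-filtering step mentioned above, followed by a plain k-Nearest-Neighbors vote on log-variance features, can be sketched as below. This is a minimal two-channel caricature on synthetic data; the filter bank is omitted and all dimensions and parameters are illustrative, not the paper's configuration:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=1):
    """Common Spatial Pattern filters from two sets of trials.

    trials_*: arrays of shape (n_trials, n_channels, n_samples).
    Returns a (2*n_pairs, n_channels) spatial filter matrix."""
    def mean_cov(trials):
        return np.mean([np.cov(tr) for tr in trials], axis=0)

    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigenproblem: filters jointly diagonalize ca and ca + cb.
    vals, vecs = eigh(ca, ca + cb)
    order = np.argsort(vals)  # ascending eigenvalues
    picks = np.r_[order[:n_pairs], order[-n_pairs:]]
    return vecs[:, picks].T

def log_var_features(trials, w):
    """Log-variance of CSP-filtered trials: shape (n_trials, n_filters)."""
    filtered = np.einsum('fc,tcs->tfs', w, trials)
    return np.log(filtered.var(axis=2))

def knn_predict(x, train_x, train_y, k=3):
    """Plain k-nearest-neighbor vote with Euclidean distance."""
    d = np.linalg.norm(train_x - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

# Synthetic 2-channel data: class 0 has high variance on channel 0,
# class 1 on channel 1 (a caricature of two attentional states).
rng = np.random.default_rng(1)
a = rng.normal(0, [[3.0], [1.0]], (40, 2, 128))
b = rng.normal(0, [[1.0], [3.0]], (40, 2, 128))
w = csp_filters(a[:30], b[:30])
train_x = np.vstack([log_var_features(a[:30], w), log_var_features(b[:30], w)])
train_y = np.array([0] * 30 + [1] * 30)
preds = [knn_predict(x, train_x, train_y) for x in log_var_features(b[30:], w)]
print(np.mean(np.array(preds) == 1))  # fraction of held-out class-1 trials recovered
```

In the filter-bank variant described in the abstract, this CSP step would be repeated per pass band and the features concatenated before classification.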
Wearable Brain-Computer Interface Instrumentation for Robot-Based Rehabilitation by Augmented Reality
An instrument for the remote control of a robot by a wearable brain-computer interface (BCI) is proposed for rehabilitating children with attention-deficit/hyperactivity disorder (ADHD). Augmented reality (AR) glasses generate flickering stimuli, and a single-channel electroencephalographic BCI detects the elicited steady-state visual evoked potentials (SSVEPs). This allows the system to benefit from SSVEP robustness while keeping the robot's movements in view. Together with the lack of training, the single channel maximizes the device's wearability, which is fundamental for acceptance by children with ADHD. Effectively controlling the movements of a robot through a new channel enhances rehabilitation engagement and effectiveness. A case study at an accredited rehabilitation center on ten healthy adult subjects highlighted an average accuracy higher than 83%, with an information transfer rate (ITR) of up to 39 bit/min. Preliminary further tests on four ADHD patients between six and eight years old provided highly positive feedback on device acceptance and attentional performance.
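The ITR figure quoted above presumably follows the standard Wolpaw definition for BCIs; a sketch of that computation (assuming N selectable commands, accuracy P, and selection time T seconds; the abstract does not state which definition was used):

```python
import math

def itr_bits_per_min(n_targets, accuracy, selection_time_s):
    """Wolpaw information transfer rate in bit/min.

    n_targets: number of selectable commands (N)
    accuracy: probability of a correct selection (P)
    selection_time_s: time per selection in seconds (T)"""
    n, p = n_targets, accuracy
    if p >= 1:
        bits = math.log2(n)           # perfect selection carries log2(N) bits
    elif p <= 0:
        bits = 0.0                    # degenerate case, clamped for simplicity
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * 60.0 / selection_time_s

# A perfect 4-command selection every 3 s carries log2(4) = 2 bits:
print(itr_bits_per_min(4, 1.0, 3.0))  # → 40.0
```

Note that at chance-level accuracy (e.g. P = 0.5 with N = 2) the formula correctly yields 0 bit/min.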
Latest Advancements in SSVEPs Classification for Single-Channel, Extended Reality-based Brain-Computer Interfaces
This work details the latest advancements on a single-channel, reactive Brain-Computer Interface developed at the Interdepartmental Research Center in Health Management and Innovation in Healthcare (CIRMIS) of the University of Naples Federico II. The proposed instrumentation is based on Extended Reality (XR) and exploits the acquisition and classification of Steady-State Visually Evoked Potentials (SSVEPs). In particular, an XR headset is employed to generate the flickering stimuli necessary for SSVEP elicitation. The user's brain signals are captured by means of a highly wearable and portable electroencephalographic acquisition unit, which is connected to a portable processing unit that processes the incoming data in real time. In this way, a deeper interaction between users and external devices is guaranteed with respect to traditional architectures. The classification capability of the proposed instrument has been significantly improved over the years: currently, a classification accuracy of up to 90% is obtained with acquisition times of at least 2 s.
Vascular Patterns in Cutaneous Ulcerated Basal Cell Carcinoma: A Retrospective Blinded Study Including Dermoscopy.
Abstract The aim of this retrospective study was to determine the type and prevalence of vascular patterns in the ulcerated and non-ulcerated portions of histologically proven basal cell carcinomas (BCCs) and to correlate them with other dermoscopic and clinical features, including the clinically supposed diagnosis. Three authors retrospectively collected 156 clinical and 156 dermoscopic digital images of histologically confirmed ulcerated BCCs; each image was blindly evaluated by 2 other authors, who did not know the histological diagnosis. Seventeen lesions were completely ulcerated, while 139 lesions presented both ulcerated and non-ulcerated portions. A correct clinical diagnosis was associated with the type of lesion: in particular, 90.6% of partially ulcerated lesions were correctly diagnosed with clinical-dermoscopic examination, compared with 11.8% of totally ulcerated lesions (χ2 = 64.00, p < 0.001). The presence of an arborizing pattern in the ulcerated portion was associated with a correct diagnosis (Fisher's exact test, p = 0.015). A correct diagnosis was also associated with the absence of a dotted pattern in the non-ulcerated area (χ2 = 16.18, p < 0.001); the absence of hairpin (χ2 = 6.08, p < 0.001) and glomerular (χ2 = 18.64, p < 0.001) patterns in the ulcerated areas was likewise associated with a correct diagnosis. In the case of a completely ulcerated BCC, the clinician lacks the means to identify the correct nature of the lesion and is driven towards an incorrect diagnostic conclusion.
Robotic Autism Rehabilitation by Wearable Brain-Computer Interface and Augmented Reality
An instrument based on the integration of a Brain-Computer Interface (BCI) and Augmented Reality (AR) is proposed for robotic autism rehabilitation. Flickering stimuli at fixed frequencies appear on the display of the AR glasses. When the user focuses on one of the stimuli, a Steady-State Visual Evoked Potential (SSVEP) arises over the occipital region. A single-channel electroencephalographic BCI detects the elicited SSVEP and sends the corresponding commands to a mobile robot. The device's high wearability (single channel and dry electrodes) and training-free usability are fundamental for acceptance by children with Autism Spectrum Disorder (ASD). Effectively controlling the movements of a robot through a new channel enhances rehabilitation engagement and effectiveness. A case study at an accredited rehabilitation center on 10 healthy adult subjects highlighted an average accuracy higher than 83%. Preliminary further tests at the Department of Translational Medical Sciences of the University of Naples Federico II on 3 ASD patients between 8 and 10 years old provided positive feedback on device acceptance and attentional performance.
Metrological performance of a single-channel brain-computer interface based on motor imagery
In this paper, the accuracy in classifying Motor Imagery (MI) tasks for a Brain-Computer Interface (BCI) is analyzed. Electroencephalographic (EEG) signals were taken into account, notably by employing one channel at a time. Four classes were to be distinguished: imagined movement of the left hand, right hand, feet, or tongue. The dataset '2a' of BCI Competition IV (2008) was considered. Brain signals were processed by applying a short-time Fourier transform, a common spatial pattern filter for feature extraction, and a support vector machine for classification. This work aims to contribute to the development of wearable MI-based BCIs relying on single-channel EEG.
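The short-time Fourier transform step of such a pipeline reduces, for a single channel, to band-power features over the motor-related rhythms; a minimal sketch on synthetic data follows (the CSP and SVM stages are omitted, and all band edges and parameters are illustrative assumptions, not the paper's settings):

```python
import numpy as np
from scipy.signal import stft

def stft_band_powers(signal, fs, bands=((8, 12), (18, 26)), nperseg=128):
    """Mean short-time spectral power per band, one feature per band.

    bands: (low, high) Hz ranges; the defaults loosely cover the mu and
    beta rhythms typically modulated by motor imagery."""
    f, _, z = stft(signal, fs=fs, nperseg=nperseg)
    power = np.abs(z) ** 2              # (freq_bins, time_frames)
    feats = []
    for lo, hi in bands:
        mask = (f >= lo) & (f <= hi)
        feats.append(power[mask].mean())
    return np.array(feats)

# Single-channel toy signal: a strong 10 Hz (mu-band) rhythm plus noise.
fs = 250
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(2)
x = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=t.size)
mu_power, beta_power = stft_band_powers(x, fs)
print(mu_power > beta_power)  # the mu band dominates in this toy signal
```

Feature vectors of this kind, computed per trial, would then feed the spatial filter and the support vector machine mentioned in the abstract.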
Enhancement of SSVEPs Classification in BCI-based Wearable Instrumentation Through Machine Learning Techniques
This work addresses the adoption of Machine Learning classifiers and Convolutional Neural Networks to improve the performance of highly wearable, single-channel instrumentation for Brain-Computer Interfaces. The proposed measurement system is based on the classification of Steady-State Visually Evoked Potentials (SSVEPs). In particular, Head-Mounted Displays for Augmented Reality are used to generate and display the flickering stimuli for SSVEP elicitation. Four experiments were conducted by employing, in turn, a different Head-Mounted Display. For each experiment, two different algorithms were applied and compared with state-of-the-art techniques. Furthermore, the impact of different Augmented Reality technologies on the elicitation and classification of SSVEPs was also explored. The experimental metrological characterization demonstrates (i) that the proposed Machine Learning-based processing strategies provide a significant enhancement of the SSVEP classification accuracy with respect to the state of the art, and (ii) that choosing an adequate Head-Mounted Display is crucial to obtaining acceptable performance. Finally, it is also shown that the adoption of inter-subjective validation strategies such as Leave-One-Subject-Out Cross Validation successfully leads to an increase in the inter-individual 1-σ reproducibility; this, in turn, anticipates an easier development of ready-to-use systems.
A ML-based Approach to Enhance Metrological Performance of Wearable Brain-Computer Interfaces
In this paper, the adoption of Machine Learning (ML) classifiers is addressed to improve the performance of highly wearable, single-channel instrumentation for Brain-Computer Interfaces (BCIs). The proposed BCI is based on the classification of Steady-State Visually Evoked Potentials (SSVEPs). In this setup, Augmented Reality smart glasses are used to generate and display the flickering stimuli for SSVEP elicitation. An experimental campaign was conducted on 20 adult volunteers, and a Leave-One-Subject-Out Cross Validation was then performed to validate the proposed algorithm. The obtained experimental results demonstrate that suitable ML-based processing strategies outperform the state-of-the-art techniques in terms of classification accuracy. Furthermore, it was also shown that the adoption of an inter-subjective model successfully led to a decrease in the 3-σ uncertainty; this can facilitate future developments of ready-to-use systems.
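The Leave-One-Subject-Out scheme recurring in these abstracts can be sketched generically: train on all subjects but one, test on the held-out subject, and average across folds. The stand-in nearest-centroid classifier and the synthetic data below are illustrative assumptions, not the authors' models:

```python
import numpy as np

def loso_accuracy(features, labels, subjects, fit, predict):
    """Leave-One-Subject-Out cross-validation accuracy.

    Each unique subject id forms one fold: the model is fit on every
    other subject and evaluated on the held-out one."""
    accs = []
    for s in np.unique(subjects):
        test = subjects == s
        model = fit(features[~test], labels[~test])
        accs.append(np.mean(predict(model, features[test]) == labels[test]))
    return float(np.mean(accs))

# Stand-in classifier: per-class mean (nearest centroid).
def fit_centroids(x, y):
    return {c: x[y == c].mean(axis=0) for c in np.unique(y)}

def predict_centroids(model, x):
    classes = np.array(sorted(model))
    d = np.stack([np.linalg.norm(x - model[c], axis=1) for c in classes])
    return classes[d.argmin(axis=0)]

# Synthetic dataset: 5 subjects, 20 trials each, 2 well-separated classes.
rng = np.random.default_rng(3)
subjects = np.repeat(np.arange(5), 20)
labels = np.tile(np.r_[np.zeros(10, int), np.ones(10, int)], 5)
features = rng.normal(0, 1, (100, 4)) + labels[:, None] * 3.0

print(loso_accuracy(features, labels, subjects, fit_centroids, predict_centroids))
```

Because the held-out subject never contributes to training, the resulting accuracy estimates inter-subject generalization, which is what makes the reduced 3-σ uncertainty claimed above meaningful for ready-to-use systems.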
Dynamical charge density fluctuations pervading the phase diagram of a Cu-based high-Tc superconductor
Charge density waves are a common occurrence in all families of high-critical-temperature superconducting cuprates. Although consistently observed in the underdoped region of the phase diagram and at relatively low temperatures, it is still unclear to what extent they influence the unusual properties of these systems. Using resonant x-ray scattering, we carefully determined the temperature dependence of charge density modulations in (Y,Nd)Ba2Cu3O7−δ for three doping levels. We discovered short-range dynamical charge density fluctuations besides the previously known quasi-critical charge density waves. They persist up to well above the pseudogap temperature T*, are characterized by energies of a few meV, and pervade a large area of the phase diagram, so that they can play a key role in shaping the peculiar normal-state properties of cuprates.
Comment: 34 pages, 4 figures, 11 supplementary figures