1,138 research outputs found
An intelligent robot wheelchair with assistive communication aids using thought evoked potential (TEP) responses and higher-order spectral band features
In recent years, electroencephalography (EEG)-based navigation and communication systems for differently abled communities have received growing attention. To provide a navigation system with a communication aid, this work proposes a customized protocol using thought evoked potentials. Higher-order-spectra-based features are presented to categorize seven basic tasks (Forward, Left, Right, Yes, No, Help and Relax) that can be used to navigate a robot wheelchair and to communicate via an oddball paradigm. The proposed system records eight-channel wireless EEG signals from ten subjects while each subject performs the seven tasks. The recorded brain-wave signals are pre-processed to remove interference waveforms and segmented into six frequency-band signals, i.e., Delta, Theta, Alpha, Beta, Gamma 1 and Gamma 2. The band signals are divided into frames of equal length, from which features are extracted using bispectrum estimation. Statistical features, namely the average bispectral magnitude and the entropy over the bispectral domain, are then combined into a feature set. The extracted feature sets are evaluated with tenfold cross-validation using a multilayer neural network classifier. The results show that the classifier model based on the bispectral-entropy feature achieves the maximum classification accuracy of 84.71%, while the model based on the average bispectral magnitude feature has the minimum classification accuracy of 68.52%.
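The two bispectrum-derived features named above (average bispectral magnitude and bispectral entropy) can be sketched as follows. This is a minimal direct-method estimate; the frame length, FFT size, and entropy normalization are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def bispectrum_features(frames, nfft=64):
    """Estimate the bispectrum by averaging over frames (direct method)
    and return two summary features: the mean bispectral magnitude and
    the normalized entropy of the bispectral magnitude distribution.

    frames : 2-D array of shape (n_frames, frame_len).
    """
    frames = np.asarray(frames, dtype=float)
    # Remove the per-frame mean before transforming.
    frames = frames - frames.mean(axis=1, keepdims=True)
    X = np.fft.fft(frames, n=nfft, axis=1)
    half = nfft // 2
    # Accumulate B(f1, f2) = E[X(f1) X(f2) X*(f1 + f2)] over frames.
    B = np.zeros((half, half), dtype=complex)
    for x in X:
        for f1 in range(half):
            for f2 in range(half):
                B[f1, f2] += x[f1] * x[f2] * np.conj(x[(f1 + f2) % nfft])
    B /= len(frames)
    mag = np.abs(B)
    mean_magnitude = mag.mean()
    # Treat the normalized magnitudes as a probability distribution and
    # compute its entropy, scaled to [0, 1] by the maximum log(p.size).
    p = mag / mag.sum()
    entropy = -np.sum(p * np.log(p + 1e-12)) / np.log(p.size)
    return mean_magnitude, entropy
```

In practice these two scalars would be computed per frequency band and per channel and concatenated into the feature set fed to the classifier.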
Intelligent sensing technologies for the diagnosis, monitoring and therapy of Alzheimer's disease: A systematic review
Alzheimer's disease is a lifelong progressive neurological disorder. It is associated with high disease-management and caregiver costs. Intelligent sensing systems have the capability to provide context-aware adaptive feedback. These can assist Alzheimer's patients, for whom continuous monitoring, functional support and timely therapeutic interventions are of paramount importance. This review aims to present a summary of such systems reported in the extant literature for the management of Alzheimer's disease. Four databases were searched, and 253 English-language articles published between 2015 and 2020 were identified. Through a series of filtering mechanisms, 20 articles were found suitable for inclusion in this review. This study gives an overview of the depth and breadth of the efficacy as well as the limitations of the intelligent systems proposed for Alzheimer's care. The results indicate two broad categories of intelligent technologies: distributed systems and self-contained devices. Distributed systems base their outcomes mostly on long-term monitoring of individuals' activity patterns, whereas handheld devices give quick assessments through touch, vision and voice. The review concludes by discussing the potential of these intelligent technologies for clinical practice while highlighting future considerations for improving the design of these solutions for Alzheimer's disease.
Cross-Platform Implementation of an SSVEP-Based BCI for the Control of a 6-DOF Robotic Arm
[EN] Robotics has been successfully applied in the design of collaborative robots for assistance to people with motor disabilities. However, man-machine interaction is difficult for those who suffer severe motor disabilities. The aim of this study was to test the feasibility of a low-cost robotic arm control system with an EEG-based brain-computer interface (BCI). The BCI system relies on the Steady State Visually Evoked Potentials (SSVEP) paradigm. A cross-platform application was written in C++ and used, together with the open-source software OpenViBE, to control a Staubli TX60 robot arm. Communication between OpenViBE and the robot was carried out through the Virtual Reality Peripheral Network (VRPN) protocol. EEG signals were acquired with the 8-channel Enobio amplifier from Neuroelectrics. For the processing of the EEG signals, Common Spatial Pattern (CSP) filters and a Linear Discriminant Analysis (LDA) classifier were used. Five healthy subjects tried the BCI. This work enabled the communication and integration of a well-known BCI development platform, OpenViBE, with the specific control software of a robot arm such as the Staubli TX60 using the VRPN protocol. It can be concluded from this study that it is possible to control the robotic arm with an SSVEP-based BCI with a reduced number of dry electrodes, which facilitates the use of the system. Funding for open access charge: Universitat Politecnica de Valencia. Quiles Cucarella, E.; Dadone, J.; Chio, N.; García Moreno, E. (2022). Cross-Platform Implementation of an SSVEP-Based BCI for the Control of a 6-DOF Robotic Arm. Sensors, 22(13):1-26. https://doi.org/10.3390/s22135000
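The CSP-plus-LDA pipeline mentioned in the abstract can be sketched with plain numpy. This is a generic two-class CSP via whitening and eigendecomposition followed by the standard log-variance features; the channel counts, trial lengths and filter count are illustrative, and the paper's actual OpenViBE processing chain is not reproduced here.

```python
import numpy as np

def csp_filters(trials_a, trials_b, n_components=4):
    """Common Spatial Pattern filters for a two-class EEG problem.

    trials_a, trials_b : arrays of shape (n_trials, n_channels, n_samples).
    Returns a (n_components, n_channels) matrix of spatial filters (rows),
    half taken from each end of the eigenvalue spectrum.
    """
    Ca = np.mean([np.cov(t) for t in trials_a], axis=0)
    Cb = np.mean([np.cov(t) for t in trials_b], axis=0)
    # Whiten with respect to the composite covariance Ca + Cb.
    d, U = np.linalg.eigh(Ca + Cb)
    P = U @ np.diag(1.0 / np.sqrt(d)) @ U.T
    # Eigenvectors of the whitened class-a covariance: large eigenvalues
    # favor class a, small ones favor class b.
    w, V = np.linalg.eigh(P @ Ca @ P.T)
    order = np.argsort(w)
    k = n_components // 2
    pick = np.concatenate([order[:k], order[-k:]])
    return V[:, pick].T @ P

def log_var_features(trials, W):
    """Log of normalized variance of spatially filtered trials,
    the standard CSP feature fed to an LDA classifier."""
    feats = []
    for t in trials:
        v = (W @ t).var(axis=1)
        feats.append(np.log(v / v.sum()))
    return np.array(feats)
```

The resulting feature vectors are low-dimensional and roughly Gaussian per class, which is why a simple LDA classifier is a common and adequate final stage.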
On Tackling Fundamental Constraints in Brain-Computer Interface Decoding via Deep Neural Networks
A Brain-Computer Interface (BCI) is a system that provides a communication and control medium between human cortical signals and external devices, with the primary aim to assist or to be used by patients who suffer from a neuromuscular disease. Despite significant recent progress in the area of BCI, there are numerous shortcomings associated with decoding Electroencephalography-based BCI signals in real-world environments. These include, but are not limited to, the cumbersome nature of the equipment, complications in collecting large quantities of real-world data, the rigid experimentation protocol and the challenges of accurate signal decoding, especially in making a system work in real-time. Hence, the core purpose of this work is to investigate improving the applicability and usability of BCI systems, whilst preserving signal decoding accuracy.
Recent advances in Deep Neural Networks (DNN) provide the possibility for signal processing to automatically learn the best representation of a signal, contributing to improved performance even with a noisy input signal. Subsequently, this thesis focuses on the use of novel DNN-based approaches for tackling some of the key underlying constraints within the area of BCI. For example, recent technological improvements in acquisition hardware have made it possible to eliminate the pre-existing rigid experimentation procedure, albeit resulting in noisier signal capture. However, through the use of a DNN-based model, it is possible to preserve the accuracy of the predictions from the decoded signals. Moreover, this research demonstrates that by leveraging DNN-based image and signal understanding, it is feasible to facilitate real-time BCI applications in a natural environment. Additionally, the capability of DNN to generate realistic synthetic data is shown to be a potential solution in reducing the requirement for costly data collection. Work is also performed in addressing the well-known issues regarding subject bias in BCI models by generating data with reduced subject-specific features.
The overall contribution of this thesis is to address the key fundamental limitations of BCI systems. These include the unyielding traditional experimentation procedure, the mandatory extended calibration stage, and the challenge of sustaining accurate signal decoding in real time. These limitations lead to a fragile BCI system that is demanding to use and only suited for deployment in a controlled laboratory. The overall contributions of this research aim to improve the robustness of BCI systems and enable new applications for use in the real world.
Improving EEG-Based Motor Imagery Classification for Real-Time Applications Using the QSA Method
We present an improvement to the quaternion-based signal analysis (QSA) technique to extract electroencephalography (EEG) signal features with a view to developing real-time applications, particularly for motor imagery (MI) cognitive processes. The proposed methodology (iQSA, improved QSA) extracts features such as the average, variance, homogeneity, and contrast of EEG signals related to motor imagery more efficiently than the original QSA technique (i.e., it reduces the number of samples needed to classify the signal and improves the classification percentage). Specifically, the signal can be sampled over variable time periods (from 0.5 s to 3 s, in half-second intervals) to determine the relationship between the number of samples and their effectiveness in classifying signals. In addition, to strengthen the classification process, a number of boosting-based decision trees were implemented. The results show an 82.30% accuracy rate for 0.5 s samples and 73.16% for 3 s samples. This is a significant improvement over the original QSA technique, which offered results from 33.31% to 40.82% without a sampling window and from 33.44% to 41.07% with a sampling window. We can thus conclude that iQSA is better suited to developing real-time applications.
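The windowed feature extraction described above can be sketched as follows. This is a minimal illustration of computing the four named statistics over fixed-length windows; the homogeneity and contrast formulas below are common texture-style definitions assumed for illustration, and the quaternion-encoding stage of QSA itself is not reproduced.

```python
import numpy as np

def window_features(signal, fs=128, win_s=0.5):
    """Extract per-window statistics (average, variance, homogeneity,
    contrast) from a 1-D signal, using non-overlapping windows of
    win_s seconds at sampling rate fs.

    Returns an array of shape (n_windows, 4).  The sampling rate and
    window lengths mirror the 0.5 s - 3 s range discussed in the text.
    """
    signal = np.asarray(signal, dtype=float)
    n = int(fs * win_s)
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        w = signal[start:start + n]
        d = np.abs(np.diff(w))                 # successive-sample differences
        feats.append([
            w.mean(),                          # average
            w.var(),                           # variance
            float(np.mean(1.0 / (1.0 + d))),   # homogeneity (assumed form)
            float(np.mean(d ** 2)),            # contrast (assumed form)
        ])
    return np.array(feats)
```

Shorter windows yield more feature vectors per trial, which is the trade-off the study probes when comparing 0.5 s against 3 s sampling windows.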
Recent Developments in Smart Healthcare
Medicine is undergoing a sector-wide transformation thanks to advances in computing and networking technologies. Healthcare is changing from reactive and hospital-centered to preventive and personalized, from disease-focused to well-being-centered. In essence, healthcare systems, as well as fundamental medicine research, are becoming smarter. We anticipate significant improvements in areas ranging from molecular genomics and proteomics to decision support for healthcare professionals through big data analytics, to supporting behavior change through technology-enabled self-management and social and motivational support. Furthermore, with smart technologies, healthcare delivery could also be made more efficient, higher quality, and lower cost. In this special issue, we received a total of 45 submissions and accepted 19 outstanding papers that roughly span several interesting topics in smart healthcare, including public health, health information technology (Health IT), and smart medicine.