Operant EEG-based BMI: Learning and consolidating device control with brain activity
"Whether you are reading this thesis on paper or screen, it is easy to take
for granted all the highly specialized movements you are doing at this very
moment just to go through each page. Just to turn a page, you have to
reach for and grasp it, turn it and let go at the precise moment not to rip it. (...)
Real-Time Electroencephalogram Sonification for Neurofeedback
Electroencephalography (EEG) is the measurement via the scalp of the electrical activity of the brain. The established therapeutic intervention of neurofeedback involves presenting people with their own EEG in real-time to enable them to modify their EEG for purposes of improving performance or health.
The aim of this research is to develop and validate real-time sonifications of EEG for use in neurofeedback and methods for assessing such sonifications. Neurofeedback generally uses a visual display. Where auditory feedback is used, it is mostly limited to pre-recorded sounds triggered by the EEG activity crossing a threshold. However, EEG generates time-series data with meaningful detail at fine temporal resolution and with complex temporal dynamics. Human hearing has a much higher temporal resolution than human vision, and auditory displays do not require people to focus on a screen with their eyes open for extended periods of time – e.g. if they are engaged in some other task. Sonification of EEG could allow more rapid, contingent, salient and temporally detailed feedback. This could improve the efficiency of neurofeedback training and reduce the number and duration of sessions for successful neurofeedback.
The same two deliberately simple sonification techniques were used in all three experiments of this research: Amplitude Modulation (AM) sonification, which maps fluctuations in the power of the EEG to the volume of a pure tone; and Frequency Modulation (FM) sonification, which uses the changes in EEG power to modulate the frequency of the tone. Measures included a listening task; the NASA Task Load Index, a measure of how much work it was to do the task; pre- and post-session measures of mood; and EEG.
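The two mappings can be sketched in a few lines. This is an illustrative sketch only: the carrier frequency, FM modulation depth, and sample rate below are assumptions, not values from the thesis.

```python
import math

SAMPLE_RATE = 8000    # assumed audio sample rate (Hz)
CARRIER_HZ = 440.0    # assumed base tone frequency
FM_DEPTH_HZ = 200.0   # assumed FM modulation depth (Hz)

def am_sonify(power, n_samples):
    """AM sonification: normalized EEG band power (0..1) scales the tone's volume."""
    return [power * math.sin(2 * math.pi * CARRIER_HZ * n / SAMPLE_RATE)
            for n in range(n_samples)]

def fm_sonify(power, n_samples):
    """FM sonification: normalized EEG band power (0..1) shifts the tone's pitch."""
    freq = CARRIER_HZ + power * FM_DEPTH_HZ
    return [math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
            for n in range(n_samples)]
```

In a real-time system these would be called once per EEG analysis window, with the power estimate smoothed between windows to avoid audible clicks.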
The first experiment used pre-recorded single-channel EEG. Participants were asked to listen to the sonified EEG and to track the activity they could hear by moving a slider on a computer screen using a computer mouse. This provided a quantitative assessment of how well people could perceive the sonified fluctuations in EEG level. Tracking accuracy scores were higher for the FM sonification, but self-assessments of task load rated the AM sonification as easier to track.
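One plausible way to score such a tracking task is the Pearson correlation between the slider trace and the sonified EEG level; the abstract does not specify the metric used, so this scoring choice is an assumption for illustration.

```python
import math

def tracking_accuracy(slider, target):
    """Pearson correlation between the participant's slider trace and the
    sonified EEG level (an assumed scoring choice, not the thesis's metric).
    Returns 1.0 for perfect tracking, -1.0 for perfectly inverted tracking."""
    n = len(slider)
    mean_s = sum(slider) / n
    mean_t = sum(target) / n
    cov = sum((s - mean_s) * (t - mean_t) for s, t in zip(slider, target))
    var_s = sum((s - mean_s) ** 2 for s in slider)
    var_t = sum((t - mean_t) ** 2 for t in target)
    return cov / math.sqrt(var_s * var_t)
```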
The second experiment used the same two sonifications in a real neurofeedback task using participants' own live EEG. Unbeknownst to the participants, the neurofeedback task was designed to improve mood. A pre-post questionnaire showed that participants changed their self-rated mood in the intended direction with the EEG training, but there was no statistically significant change in EEG. Again the FM sonification showed better performance, but AM was rated as less effortful. The performance of the sonifications in the tracking task in experiment 1 was found to predict their relative efficacy at blind self-rated mood modification in experiment 2.
The third experiment used both the tracking task from experiment 1 and the neurofeedback task from experiment 2, but with modified versions of the AM and FM sonifications to allow two-channel EEG sonification. This experiment introduced a physical slider in place of the mouse for the tracking task. Tracking accuracy increased, but this time no significant difference was found between the two sonification techniques on the tracking task. In the training task, blind self-rated mood once more improved in the intended direction with the EEG training, but as there was again no significant change in EEG, this cannot necessarily be attributed to the neurofeedback. There was only a slight difference between the two sonification techniques in the effort measure.
In this way, a prototype method has been devised and validated for the quantitative assessment of real-time EEG sonifications. Conventional evaluations of neurofeedback techniques are expensive and time consuming. By contrast, this method potentially provides a rapid, objective and efficient method for evaluating the suitability of candidate sonifications for EEG neurofeedback.
Finding a link in neural correlates of human perception and illusion
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. With the ability to feel through artificial limbs, users regain more function and increasingly perceive prosthetics as parts of their own bodies. The main focus of this project was therefore the recuperation of sensation by deception, in both sighted and unsighted patients. It began with illusion experiments on healthy volunteers, during which brain signals were captured with medical EEG headsets to better understand how the brain behaves during body ownership illusions. The EEG results suggest that a gender difference exists in the perception of the body transfer illusion, and that visual input can be used to trick the brain. Building on these results, a new device, the sound generator system (SGS), was designed with the principal goal of incorporating rich sensory feedback into prosthetic devices to aid their integration into the user's body representation or schema. The SGS was tested multiple times over the course of one month and was found to offer an adequate level of dexterity; after each attempt the results were better tolerated than before, supporting the idea that the brain can learn and adapt, and can be influenced temporarily or lastingly. Several methods were used to validate the results: EEG acquisition, mapping of the subject's brain function with EEG, and interviews with participants after each attempt. Although the illusion results show that when heat is applied to the rubber hand, subjects behave as if their real hand were affected, the main question remains: how can such conditioning be applied to the daily life of amputees so that the illusion becomes permanent?
This is a rapidly developing field, with advances in technology and greater interdisciplinary integration of medicine, mechatronics and control engineering. The future points towards permanent, low-power, highly functional devices with a more intuitive, almost natural feel, driven by a variety of body signals including EMG, ultrasound and electrocorticography.
Novel Bidirectional Body - Machine Interface to Control Upper Limb Prosthesis
Objective. The journey of a bionic prosthetic user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of its abandonment through the continuous use of the device. To achieve such a result, different aspects must be considered for making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving amputees’ quality of life using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has been recently introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and a prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness.
Approach. To achieve the objectives of this thesis work, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree-of-Freedom (DoF) system and to fit all users’ needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce a more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system to implement haptic feedback, restoring proprioception and creating a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore tactile sensation and connect the user with the external world. This closed-loop control between EMG and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface that strongly impacts amputees’ daily life. For each of these three activities: (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object’s stiffness, I performed a study in which data from healthy subjects and amputees was collected, in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects’ ability to use the prosthesis by means of the F1-score parameter (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks.
Main results. Among the several tested Pattern Recognition methods, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1-score (99%, robustness), while the minimum number of electrodes needed for its functioning was determined in the offline analyses to be four. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency). Finally, the online implementation allowed the subject to simultaneously control the Hannes prosthesis DoFs in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). These results demonstrated an improvement in the controllability of the system with an impact on user experience.
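The core idea behind non-linear logistic regression for EMG control is a plain logistic model fitted over a non-linear feature expansion. The following is a minimal sketch under stated assumptions: the feature choice (mean absolute value plus its square), the binary rest-vs-grip framing, and the training loop are illustrative, not the Hannes implementation.

```python
import math

def emg_features(window):
    """Mean absolute value of an EMG window plus its square; the squared
    term is the (assumed) non-linear expansion that makes the logistic
    model 'non-linear'."""
    mav = sum(abs(x) for x in window) / len(window)
    return [mav, mav * mav]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=1000):
    """Binary logistic regression fitted by stochastic gradient descent.
    Multi-class control would use one such model per movement (one-vs-rest)."""
    w = [0.0] * (len(samples[0]) + 1)  # bias + one weight per feature
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))
            err = p - y                 # gradient of cross-entropy loss
            w[0] -= lr * err
            for i, xi in enumerate(x):
                w[i + 1] -= lr * err * xi
    return w

def predict(w, x):
    """Probability that the window belongs to the positive class."""
    return sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))
```

On an embedded target, only `predict` runs online; a handful of multiplications per window is what keeps the computational burden low enough for a 300 Hz control loop.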
Significance. The obtained results confirmed the hypothesis that the robustness and efficiency of prosthetic control can be improved thanks to the implemented closed-loop approach. The bidirectional communication between the user and the prosthesis is capable of restoring the lost sensory functionality, with promising implications for direct translation into clinical practice.
Transcranial alternating current stimulation to areas associated with the human mirror neuron system reveals modulation to mu-suppression and corresponding behaviour
This study was carried out in order to validate the use of EEG mu (μ) suppression as an index of human mirror neuron system (hMNS) related activity. The hMNS is characterized by neuronal activity that responds to both action observation and execution of the same movement. This activity has been directly observed in both macaque monkeys and humans. There is an abundance of studies using indirect measures of neuronal activity to indicate hMNS-related activity, such as TMS, fMRI/PET and EEG/MEG. However, relating indirect indices of neuronal activity to a conceptual group of neurons is controversial because the activity observed could also reflect other neuronal processes. Therefore, the current thesis was designed to establish more direct and causal evidence for the use of EEG in indicating hMNS-related activity through the use of transcranial alternating current stimulation (tACS). This was achieved in six experiments: the first three established an efficient protocol to induce μ-suppression during action observation, and the last three demonstrated by means of tACS that activity in hMNS-related areas is directly related to μ-reactivity during observation of motor movements and in relation to imitation of the movement observed. To this end, μ-suppression was related to both action observation and the ability to perform the movement observed. This is interpreted as evidence that EEG μ-suppression is a valid indicator of hMNS-related activity.
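Mu-suppression is conventionally quantified as the log ratio of mu-band (roughly 8-13 Hz) power during action observation relative to a baseline period, with negative values indicating suppression. A rough sketch of that computation, using a naive DFT and assumed band limits (the thesis may define the band and baseline differently):

```python
import math

def band_power(signal, fs, lo=8.0, hi=13.0):
    """Power of `signal` in a frequency band via a naive DFT.
    Defaults to the mu band (8-13 Hz); band limits are an assumption."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f <= hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(-signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power

def mu_suppression(observation, baseline, fs):
    """Mu-suppression index: log ratio of mu power during action observation
    to mu power at baseline. Negative values indicate suppression."""
    return math.log(band_power(observation, fs) / band_power(baseline, fs))
```

In practice this would be computed per electrode (typically over sensorimotor sites C3/C4) with a windowed FFT rather than a raw DFT, but the ratio logic is the same.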
The development of social processing in young children: insights from somatosensory activations during observation and experience of touch in typically developing children and speech processing in children with autism spectrum disorders
This thesis explores the neural mechanisms underlying the observation of touch and tactile processing in adults and typically developing children and speech versus computerized speech processing in children with autism spectrum disorders (ASD). Chapter 1 reviews the literature on mirror functioning, embodied cognition and typical and atypical development of social and speech processing in infancy and childhood. Chapter 2 investigates the neural mechanisms underlying hand and object touch observation in adults. In Chapter 3, a similar procedure is employed to investigate tactile mirroring mechanisms in children. The findings demonstrate that these mechanisms are relatively developed in 4- to 5-year-old children. Chapter 4 further explores somatosensory activity during touch in adults and children. The findings reveal the modulation of somatosensory beta (15-24 Hz) activity during touch in adults, but not in children. Chapter 5 examines the neural mechanisms underlying speech versus computerized speech perception in children with ASD. These results suggest an impaired classification of speech sounds preceded by computerized speech, and atypical lateralization of speech processing in children with ASD. Together, these findings make a notable contribution to our understanding of typical development of tactile mirroring and touch processing mechanisms, and social processing dysfunctions in children with ASD.