25 research outputs found

    Development and applications of a smartphone-based mobile electroencephalography (EEG) system

    Electroencephalography (EEG) is a clinical and research technique used to non-invasively acquire brain activity. EEG is typically performed using static systems in specialist laboratories where participant mobility is constrained, so it is desirable to have EEG systems which enable acquisition of brain activity outside such settings. Mobile systems seek to reduce the constraints on EEG device and participant mobility and so enable recordings in a variety of environments, but have had limited success due to various factors, including low system specification. The main aim of this thesis was to design, build, test and validate a novel smartphone-based mobile EEG system.

    A literature review found that the term ‘mobile EEG’ has an ambiguous meaning, as researchers have used it to describe many differing degrees of participant and device mobility. A novel categorisation of mobile EEG (CoME) scheme was derived from thirty published EEG studies, defining scores for participant mobility, device mobility and system specification. The CoME scheme was subsequently applied to generate a specification for the proposed mobile EEG system, which had 24 channels sampled at 24-bit resolution at a rate of 250 Hz. Unique aspects of the EEG system were the introduction of a smartphone into the specification, along with the use of Wi-Fi for communications. The smartphone’s processing power was used to remotely control the EEG device, enabling EEG data capture and storage as well as electrode impedance checking via the app. The app was coded using the Unity game engine, whose multi-platform support provides flexibility for future development.

    The prototype smartphone-based, waist-mounted mobile EEG system (termed ‘io:bio’) was validated against a commercial, FDA clinically approved mobile system (Micromed). The power spectral frequency, amplitude and area of alpha-frequency waves were determined in participants with their eyes closed in various postures: lying, sitting, standing and standing with arms raised. Since a correlation analysis has interpretability problems when used to compare two measurement systems, Bland–Altman plots with a priori justified limits of agreement were used to statistically assess agreement between the two EEG systems. Overall, the results showed similar agreement between the io:bio and Micromed systems, indicating that the systems could be used interchangeably. Using the io:bio and Micromed systems in a walking configuration led to contamination of EEG channels with artifacts thought to arise from movement-related and muscle-related sources, and from electrode displacement.

    To enable an event-related potential (ERP) capability of the EEG system, additional coding of the smartphone app was undertaken to provide stimulus delivery and associated data marking. Using the waist-mounted io:bio system, an auditory oddball paradigm was also coded into the app, and delivery of auditory tones (standard and deviant) to seated participants was achieved via headphones connected to the smartphone. N100, N200 and P300 ERP components were recorded in seated participants, with larger amplitudes found for the deviant tones compared to the standard ones.
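    As a rough illustration of the core epoch-and-average step that this kind of oddball ERP analysis typically involves, the following minimal Python sketch (using simulated data and hypothetical event timings, not code from the thesis or the io:bio app) cuts stimulus-locked epochs from one channel, baseline-corrects them and averages them per tone type:

    import numpy as np

    # Illustrative parameters only (not taken from the thesis); the 250 Hz rate
    # matches the io:bio specification, the window is a common choice for auditory ERPs.
    FS = 250                  # sampling rate in Hz
    PRE_S, POST_S = 0.2, 0.8  # epoch window: 200 ms before to 800 ms after each tone

    def epoch_and_average(eeg, event_samples, fs=FS, pre_s=PRE_S, post_s=POST_S):
        """Cut stimulus-locked epochs from one continuous EEG channel,
        baseline-correct each epoch with its pre-stimulus interval,
        and average the epochs into an ERP waveform."""
        pre, post = int(pre_s * fs), int(post_s * fs)
        epochs = []
        for onset in event_samples:
            if onset - pre < 0 or onset + post > len(eeg):
                continue  # skip events too close to the recording edges
            epoch = eeg[onset - pre:onset + post].astype(float)
            epoch -= epoch[:pre].mean()   # baseline correction
            epochs.append(epoch)
        return np.mean(epochs, axis=0)

    # Usage with simulated noise standing in for a real recording; real event
    # markers would come from the app's stimulus-delivery data marking.
    rng = np.random.default_rng(0)
    eeg = rng.normal(0.0, 10.0, size=FS * 60)     # 60 s of synthetic "EEG" (microvolts)
    standard_onsets = np.arange(2, 55, 2) * FS    # hypothetical standard-tone samples
    deviant_onsets = standard_onsets + FS         # hypothetical deviant-tone samples
    erp_standard = epoch_and_average(eeg, standard_onsets)
    erp_deviant = epoch_and_average(eeg, deviant_onsets)
    print(erp_standard.shape, erp_deviant.shape)  # each is one averaged ERP waveform

    Averaging separately over standard and deviant events, as above, is what allows the amplitudes of components such as the N100, N200 and P300 to be compared between tone types.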
In addition, when the paradigm was tested in individual participants during walking, movement-related artifacts negatively impacted the quality of the ERP components, although the components were discernible in the grand mean ERP.

    The io:bio system was then redesigned into a head-mounted configuration in an attempt to reduce EEG artifacts during participant walking. The initial redesign approach, in which the electronic components were populated onto a flexible PCB, proved to be non-robust. Instead, the rigid PCB form of the circuitry was taken from the waist-mounted io:bio system and placed onto the rear head section of the electrode cap via a bespoke cradle. Using this head-mounted system in a preliminary auditory oddball study, ERP responses were obtained from participants whilst walking. Initial results indicate that artifacts are reduced in this head-mounted configuration, and that N100, N200 and P300 components are clearly identifiable in some channels.
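    For the system-agreement analysis reported above (io:bio versus Micromed), a Bland–Altman comparison reduces to computing the bias and limits of agreement of the paired measurements and checking them against limits justified a priori. The following Python sketch uses made-up paired alpha-band amplitudes purely to illustrate the calculation; it is not the thesis's actual analysis:

    import numpy as np

    def bland_altman(a, b, loa_sd=1.96):
        """Bland–Altman agreement statistics for paired measurements from two systems.
        Returns the bias (mean difference) and the lower/upper limits of agreement
        (bias +/- loa_sd standard deviations of the differences)."""
        a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
        diffs = a - b
        bias = diffs.mean()
        sd = diffs.std(ddof=1)
        return bias, bias - loa_sd * sd, bias + loa_sd * sd

    # Hypothetical paired alpha-band amplitudes (microvolts) from two EEG systems,
    # one pair per participant; purely made-up numbers for illustration.
    sys_a = [21.4, 18.9, 25.1, 30.2, 17.5, 22.8, 27.3, 19.6]
    sys_b = [20.8, 19.5, 24.4, 31.0, 16.9, 23.5, 26.7, 20.1]
    bias, lo, hi = bland_altman(sys_a, sys_b)
    print(f"bias = {bias:.2f} uV, limits of agreement = [{lo:.2f}, {hi:.2f}] uV")
    # The systems would be judged interchangeable if these limits fall inside
    # limits of agreement justified a priori (e.g. a clinically acceptable difference).

    Unlike a correlation coefficient, the bias and limits of agreement are expressed in the units of the measurement itself, which is what makes an a priori acceptability threshold meaningful.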

    Brain Computer Interfaces and Emotional Involvement: Theory, Research, and Applications

    This reprint is dedicated to the study of brain activity related to emotional and attentional involvement, as measured by brain–computer interface (BCI) systems designed for different purposes. A BCI system can translate brain signals (e.g., electric or hemodynamic indicators of brain activity) into a command that executes an action in the BCI application (e.g., a wheelchair, the cursor on a screen, a spelling device or a game). These tools have the advantage of real-time access to the individual’s ongoing brain activity, which can provide insight into the user’s emotional and attentional states once a classification algorithm has been trained to recognize mental states. The success of BCI systems in contemporary neuroscientific research relies on the fact that they allow one to “think outside the lab”. The integration of technological solutions, artificial intelligence and cognitive science has allowed, and will continue to allow, researchers to envision more and more applications for the future. The clinical and everyday uses are described with the aim of inviting readers to imagine potential further developments.
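    Purely as an illustration (not material from the reprint), the classification step mentioned above often amounts to mapping features extracted from short EEG windows, such as band power per channel, onto discrete mental states. The toy Python sketch below trains scikit-learn's linear discriminant analysis on simulated features; every name and number in it is hypothetical:

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # Hypothetical training data: one row per EEG window, one column per band-power
    # feature (e.g. alpha/beta power on a few channels); labels are mental states.
    rng = np.random.default_rng(1)
    n_windows, n_features = 200, 8
    features = rng.normal(size=(n_windows, n_features))
    labels = rng.integers(0, 2, size=n_windows)   # e.g. 0 = "relaxed", 1 = "engaged"

    # Shift one feature for one class so there is something to learn in this toy data.
    features[labels == 1, 0] += 1.0

    clf = LinearDiscriminantAnalysis()
    scores = cross_val_score(clf, features, labels, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f}")

    # In an online BCI, the trained classifier would map each new window of features
    # to a state, and that state (or a derived command) would drive the application.
    clf.fit(features, labels)
    new_window = rng.normal(size=(1, n_features))
    print("predicted state for a new window:", clf.predict(new_window)[0])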

    A Comparison of Mobile VR Display Running on an Ordinary Smartphone With Standard PC Display for P300-BCI Stimulus Presentation

    No full text

    Accessibility of Health Data Representations for Older Adults: Challenges and Opportunities for Design

    Health data from consumer off-the-shelf wearable devices are often conveyed to users through visual data representations and analyses. However, these are not always accessible to people with disabilities or older people, due to low vision, cognitive impairments or literacy issues. Because of trade-offs between aesthetic predominance and information overload, real-time user feedback may not be conveyed easily from sensor devices through visual cues such as graphs and text. These difficulties may hinder critical data understanding. Additional auditory and tactile feedback can provide immediate and accessible cues from these wearable devices, but the limitations of existing data representations must first be understood. To avoid greater cognitive and visual overload, auditory and haptic cues can be designed to complement, replace or reinforce visual cues. In this paper, we outline the challenges in existing data representations and the evidence needed to enhance the accessibility of health information from personal sensing devices used to monitor health parameters such as blood pressure, sleep, activity, heart rate and more. With innovative and inclusive user feedback, users are more likely to engage and interact with new devices and their own data.

    Serious Games and Mixed Reality Applications for Healthcare

    Virtual reality (VR) and augmented reality (AR) have long histories in the healthcare sector, offering the opportunity to develop a wide range of tools and applications aimed at improving the quality of care and the efficiency of services for professionals and patients alike. The best-known examples of VR–AR applications in the healthcare domain include surgical planning and medical training by means of simulation technologies. Techniques used in surgical simulation have also been applied to cognitive and motor rehabilitation, pain management, and patient and professional education. Serious games are games whose main goal is not entertainment but a serious purpose, ranging from the acquisition of knowledge to interactive training. These games are attracting growing attention in healthcare because of their many benefits: motivation, interactivity, adaptation to user competence level, flexibility in time, repeatability, and continuous feedback. Recently, healthcare has also become one of the biggest adopters of mixed reality (MR), which merges real and virtual content to generate novel environments where physical and digital objects not only coexist, but are also capable of interacting with each other in real time, encompassing both VR and AR applications. This Special Issue aims to gather and publish original scientific contributions exploring opportunities and addressing challenges in both the theoretical and applied aspects of VR–AR and MR applications in healthcare.

    Emotion and Stress Recognition Related Sensors and Machine Learning Technologies

    This book includes impactful chapters which present scientific concepts, frameworks, architectures and ideas on sensing technologies and machine learning techniques. These are relevant in tackling the following challenges: (i) the field readiness and use of intrusive sensor systems and devices for capturing biosignals, including EEG, ECG and electrodermal activity sensor systems; (ii) the quality assessment and management of sensor data; (iii) data preprocessing, noise filtering and calibration concepts for biosignals; (iv) the field readiness and use of non-intrusive sensor technologies, including visual, acoustic, vibration and piezoelectric sensors; (v) emotion recognition using mobile phones and smartwatches; (vi) body area sensor networks for emotion and stress studies; (vii) the use of experimental datasets in emotion recognition, including dataset generation principles and concepts, quality assurance, and emotion elicitation material and concepts; (viii) machine learning techniques for robust emotion recognition, including graphical models, neural network methods, deep learning methods, statistical learning and multivariate empirical mode decomposition; (ix) subject-independent emotion and stress recognition concepts and systems, including facial expression-based, speech-based, EEG-based, ECG-based, electrodermal activity-based and multimodal recognition systems and sensor fusion concepts; and (x) emotion and stress estimation and forecasting from a nonlinear dynamical systems perspective.

    Proceedings of the 9th international conference on disability, virtual reality and associated technologies (ICDVRAT 2012)

    The proceedings of the conference.