3 research outputs found

    Electroencephalogram signal acquisition in unshielded noisy environment

    Researchers have used electroencephalography (EEG) as a window into the activities of the brain. Its high temporal resolution coupled with relatively low cost compares favourably with other neuroimaging techniques such as magnetoencephalography (MEG). For many years silver metal electrodes have been used for non-invasive monitoring of the electrical activity of the brain. Although these electrodes provide a reliable method for recording EEG, they suffer from noise, such as offset potentials and drifts, and from usability issues, e.g. skin preparation and short-circuiting of adjacent electrodes due to gel running. Low-frequency noise performance is the key indicator in determining the signal-to-noise ratio of an EEG sensor. To tackle these issues, a prototype Electric Potential Sensor (EPS) device based on an auto-zero operational amplifier has been developed and evaluated. The absence of 1/f noise in these devices makes them ideal for use with signal frequencies of ~10 Hz or less. The EPS is a novel active-electrode electric potential sensor with ultra-high input impedance. The active electrodes are designed to be physically and electrically robust and chemically and biochemically inert. They are electrically insulated (anodised) and scalable, and are designed to be immersed in alcohol for sterilisation. A comprehensive study was undertaken to compare EEG signals recorded by the EPS with those from different commercial systems, comprising measurements of both free-running EEG and event-related potentials. Strictly comparable signals were observed, with cross-correlations higher than 0.9 between the EPS and the other systems.
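
    The cross-correlation comparison described above can be illustrated with a short, self-contained sketch. The code below is not taken from the paper: it standardises two channels and reports the peak of their lag-wise cross-correlation, and the synthetic 10 Hz signals, sampling rate, and noise level are assumptions standing in for an EPS recording and a reference system.

```python
# Illustrative sketch: peak normalised cross-correlation between two EEG-like
# channels, as one way to quantify agreement between two sensor systems.
# The sampling rate, signal, and noise level below are assumed, not from the paper.
import numpy as np
from scipy import signal


def peak_xcorr(x, y):
    """Peak absolute value of the cross-correlation of the standardised signals,
    scaled by the signal length (at zero lag this equals the Pearson correlation)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    corr = signal.correlate(x, y, mode="full") / len(x)
    return np.max(np.abs(corr))


fs = 250                                  # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
alpha = np.sin(2 * np.pi * 10 * t)        # 10 Hz component, within the low-1/f-noise band
eps_channel = alpha + 0.2 * np.random.randn(t.size)   # stand-in for an EPS recording
ref_channel = alpha + 0.2 * np.random.randn(t.size)   # stand-in for a commercial system

print(f"peak cross-correlation: {peak_xcorr(eps_channel, ref_channel):.2f}")
```

    In practice the two recordings would first be filtered and resampled to a common rate before the comparison is made.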

    Towards smart glasses for facial expression recognition using OMG and machine learning

    This study aimed to evaluate the use of novel optomyography (OMG) based smart glasses, OCOsense, for the monitoring and recognition of facial expressions. Experiments were conducted on data gathered from 27 young adult participants, who performed facial expressions varying in intensity, duration, and head movement. The facial expressions included smiling, frowning, raising the eyebrows, and squeezing the eyes. The statistical analysis demonstrated that: (i) OCO sensors based on the principles of OMG can capture distinct variations in cheek and brow movements with a high degree of accuracy and specificity; (ii) head movement does not have a significant impact on how well these facial expressions are detected. The collected data were also used to train a machine learning model to recognise the four facial expressions and a neutral face state. We evaluated this model in conditions intended to simulate real-world use, including variations in expression intensity, head movement, and glasses position relative to the face. The model demonstrated an overall accuracy of 93% (0.90 F1-score), evaluated using a leave-one-subject-out cross-validation technique.
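
    As a rough illustration of the evaluation protocol mentioned above, the sketch below runs leave-one-subject-out cross-validation with scikit-learn's LeaveOneGroupOut, so that no participant appears in both the training and test folds. The placeholder features, window counts, and RandomForestClassifier are assumptions made for the example, not the authors' pipeline.

```python
# Minimal sketch (assumed setup, not the authors' code): leave-one-subject-out
# evaluation of a classifier on per-window feature vectors, where `groups`
# holds the participant ID of each window.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
n_windows, n_features, n_subjects = 540, 16, 27
X = rng.normal(size=(n_windows, n_features))           # placeholder OMG feature windows
y = rng.integers(0, 5, size=n_windows)                 # 4 expressions + neutral (placeholder labels)
groups = np.repeat(np.arange(n_subjects), n_windows // n_subjects)

accs, f1s = [], []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    accs.append(accuracy_score(y[test_idx], pred))
    f1s.append(f1_score(y[test_idx], pred, average="macro"))

print(f"LOSO accuracy: {np.mean(accs):.2f}, macro F1: {np.mean(f1s):.2f}")
```

    With real OMG feature windows in X and expression labels in y, the same loop yields per-subject scores whose mean corresponds to the kind of leave-one-subject-out figures reported above.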

    emteqPRO—Fully Integrated Biometric Sensing Array for Non-Invasive Biomedical Research in Virtual Reality

    Virtual Reality (VR) enables the simulation of ecologically valid scenarios, which are ideal for studying behaviour in controllable conditions. Physiological measures captured in these studies provide a deeper insight into how an individual responds to a given scenario. However, combining the various biosensing devices presents several challenges, such as efficient time synchronisation between multiple devices, replication between participants and settings, as well as managing cumbersome setups. Additionally, important salient facial information is typically covered by the VR headset, requiring a different approach to facial muscle measurement. These challenges can restrict the use of these devices in laboratory settings. This paper describes a solution to this problem. More specifically, we introduce the emteqPRO system, which provides an all-in-one solution for the collection of physiological data through a multi-sensor array built into the VR headset. EmteqPRO is a ready-to-use, flexible sensor platform enabling convenient, heterogeneous, and multimodal emotion research in VR. It enables the capture of facial muscle activations, heart rate features, skin impedance, and movement data, which are important factors for the study of emotion and behaviour. The platform provides researchers with the ability to monitor data from users in real time, in co-located and remote set-ups, and to detect activations in physiology that are linked to arousal and valence changes. The SDK (Software Development Kit), developed specifically for the Unity game engine, enables easy integration of the emteqPRO features into VR environments. Code available at: https://github.com/emteqlabs/emteqvr-unity/releases.
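
    To make the time-synchronisation challenge described above concrete, the hypothetical sketch below (plain Python, not the emteqPRO SDK or its Unity API) aligns three streams sampled at different rates onto a single 50 Hz reference timeline by nearest-timestamp matching; all stream names and sampling rates are illustrative assumptions.

```python
# Hypothetical illustration of multimodal stream alignment (not the emteqPRO SDK):
# merge facial-EMG-like, heart-rate, and movement streams recorded at different
# sampling rates onto one common timeline using nearest-timestamp matching.
import numpy as np
import pandas as pd


def make_stream(name, rate_hz, seconds=5):
    """Build a toy stream: a timestamp column plus one random-valued channel."""
    t = np.arange(0, seconds, 1.0 / rate_hz)
    return pd.DataFrame({"t": t, name: np.random.randn(t.size)})


# Assumed stream names and rates, for illustration only.
emg = make_stream("emg_cheek", rate_hz=1000)
hr = make_stream("heart_rate", rate_hz=25)
imu = make_stream("accel_x", rate_hz=50)

# Common 50 Hz reference timeline; each stream is matched to its nearest
# sample within a 50 ms tolerance, leaving gaps (NaN) where nothing is close enough.
ref = pd.DataFrame({"t": np.arange(0, 5, 1.0 / 50)})
merged = ref
for stream in (emg, hr, imu):
    merged = pd.merge_asof(merged, stream, on="t",
                           direction="nearest", tolerance=0.05)

print(merged.head())
```

    In a real deployment the streams would share hardware timestamps from the headset rather than independent clocks, which is the kind of synchronisation the integrated platform is designed to provide.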