
    Transparent authentication: Utilising heart rate for user authentication

    There has been exponential growth in the use of wearable technologies in the last decade, with smart watches taking a large share of the market. Smart watches were primarily used for health and fitness purposes, but recent years have seen a rise in their deployment in other areas. Recent smart watches are fitted with sensors with enhanced functionality and capabilities. For example, some function as standalone devices with the ability to create activity logs and transmit data to a secondary device. This capability has contributed to their increased usage in recent years, with researchers focusing on their potential. This paper explores the ability to extract physiological data from smart watch technology to achieve user authentication. The approach is suitable not only because of the capacity for data capture but also because of easy connectivity with other devices, principally the smartphone. For the purposes of this study, heart rate data was captured continually from 30 subjects over an hour. While security is the ultimate goal, usability should also be a key consideration. Most bioelectrical signals, such as heart rate, are non-stationary time-dependent signals, therefore the Discrete Wavelet Transform (DWT) is employed. DWT decomposes the bioelectrical signal into n levels of detail-coefficient and approximation-coefficient sub-bands. A biorthogonal wavelet (bior4.4) is applied to extract features from the four levels of detail coefficients. Ten statistical features are extracted from each coefficient sub-band level. Classification of each sub-band level is performed using a Feedforward Neural Network (FF-NN). The 1st, 2nd, 3rd and 4th levels had an Equal Error Rate (EER) of 17.20%, 18.17%, 20.93% and 21.83% respectively. To improve the EER, fusion of the four sub-band levels is applied at the feature level. The proposed fusion improved on the initial results, with an EER of 11.25%. While an 11% EER is not ideal for a one-off authentication decision, its use on a continuous basis makes the approach more than feasible in practice.
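    As a rough illustration of the pipeline described above, the sketch below decomposes a heart-rate window with a bior4.4 wavelet, pools ten simple statistics from each detail-coefficient sub-band (feature-level fusion by concatenation), and trains a feedforward network. The window length, the particular ten statistics, and the network size are assumptions for illustration, not the paper's exact settings; PyWavelets and scikit-learn stand in for whatever tooling the authors used.

```python
import numpy as np
import pywt
from scipy.stats import skew, kurtosis
from sklearn.neural_network import MLPClassifier

def dwt_features(window, wavelet="bior4.4", levels=4):
    """Decompose one heart-rate window and compute ten simple statistics
    over each of the four detail-coefficient sub-bands."""
    coeffs = pywt.wavedec(window, wavelet, level=levels)
    feats = []
    for d in coeffs[1:]:                       # detail sub-bands (levels 4..1)
        feats.extend([
            np.mean(d), np.std(d), np.var(d), np.min(d), np.max(d),
            np.median(d), skew(d), kurtosis(d),
            np.sum(d ** 2),                    # sub-band energy
            np.mean(np.abs(np.diff(d))),       # mean absolute first difference
        ])
    return np.asarray(feats)                   # 10 stats x 4 levels = 40 features

# Hypothetical data: 256-sample heart-rate windows for two enrolled subjects.
rng = np.random.default_rng(0)
windows = rng.normal(70, 5, size=(100, 256))
labels = np.repeat([0, 1], 50)

X = np.array([dwt_features(w) for w in windows])   # feature-level fusion
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X, labels)
```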

    A LightGBM-Based EEG Analysis Method for Driver Mental States Classification

    Fatigue driving can easily lead to road traffic accidents and bring great harm to individuals and families. Recently, electroencephalography- (EEG-) based physiological and brain activities for fatigue detection have been increasingly investigated. However, how to find an effective method or model to detect drivers' mental states in a timely and efficient way still remains a challenge. In this paper, we combine common spatial pattern (CSP) features with a proposed lightweight classifier, LightFD, based on the gradient boosting framework, for EEG mental state identification. Comparisons with traditional classifiers, such as support vector machine (SVM), convolutional neural network (CNN), gated recurrent unit (GRU), and large margin nearest neighbor (LMNN), show that the proposed model achieves better classification performance as well as better decision efficiency. Furthermore, we also test and validate that LightFD has better transfer learning performance in EEG classification of driver mental states. In summary, our proposed LightFD classifier has better performance in real-time EEG mental state prediction, and it is expected to have broad application prospects in practical brain-computer interaction (BCI).
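    A minimal sketch of a comparable CSP-plus-gradient-boosting pipeline is shown below, with MNE's CSP implementation and a stock LightGBM classifier standing in for the authors' LightFD model. The epoch shape, number of CSP components, and hyperparameters are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from mne.decoding import CSP
from lightgbm import LGBMClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Hypothetical epoched EEG: (n_trials, n_channels, n_samples),
# with binary labels 0 = alert, 1 = fatigued.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 32, 512))
y = rng.integers(0, 2, 200)

pipeline = make_pipeline(
    CSP(n_components=6, log=True),                  # spatial filtering + log-variance features
    LGBMClassifier(n_estimators=200, learning_rate=0.05),
)
scores = cross_val_score(pipeline, X, y, cv=5)
print("mean cross-validated accuracy: %.3f" % scores.mean())
```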

    A Python-based Brain-Computer Interface Package for Neural Data Analysis

    Anowar, Md Hasan, A Python-based Brain-Computer Interface Package for Neural Data Analysis. Master of Science (MS), December 2020, 70 pp., 4 tables, 23 figures, 74 references. Although a growing amount of research has been dedicated to neural engineering, only a handful of software packages are available for brain signal processing. Popular brain-computer interface packages depend on commercial software products such as MATLAB. Moreover, almost every brain-computer interface software package is designed for a specific neuro-biological signal; there is no single Python-based package that supports motor imagery, sleep, and stimulated brain signal analysis. The need for a brain-computer interface package that can serve as a free alternative to commercial software motivated me to develop a toolbox on the Python platform. In this thesis, the structure of MEDUSA, a brain-computer interface toolbox, is presented. The features of the toolbox are demonstrated with publicly available data sources. The MEDUSA toolbox provides a valuable tool for biomedical engineers and computational neuroscience researchers.

    EEG Based Emotion Monitoring Using Wavelet and Learning Vector Quantization

    Emotion identification is needed, for example, in Brain Computer Interface (BCI) applications and in emotional therapy and medical rehabilitation. Some emotional states, such as excited, relaxed, and sad, can be characterized by the frequency content of the EEG signal, and the signal extracted in certain frequency bands is useful for distinguishing these three emotional states. Classifying the EEG signal in real time depends on extraction methods that increase class separability and on identification methods with fast computation. This paper proposes real-time human emotion monitoring using the Wavelet transform and Learning Vector Quantization (LVQ). Before machine learning, training data were collected from 10 subjects, 10 trials, 3 classes, and 16 segments (equal to 480 data sets). Each data set was processed over 10 seconds and decomposed into Alpha, Beta, and Theta waves using the Wavelet transform. These then become the input to the LVQ identification system for the three emotional states: excited, relaxed, and sad. The results showed that using the Wavelet transform improved the accuracy from 72% to 87%, and that increasing the amount of training data increased the accuracy. The system was integrated with a wireless EEG headset to monitor the emotional state in real time, updating every 10 seconds. Identification takes 0.44 seconds, which is not significant relative to the 10-second window.
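    The sketch below illustrates one way such a pipeline could look: wavelet decomposition of a 10-second window into theta, alpha, and beta sub-bands, followed by a bare-bones LVQ1 classifier. The sampling rate, wavelet choice, and LVQ update rule are assumptions for illustration and need not match the paper's settings.

```python
import numpy as np
import pywt

def band_features(eeg_window, wavelet="db4"):
    """Energies of the theta, alpha, and beta sub-bands of one window.
    Assumes a 128 Hz sampling rate, so the level-4/3/2 detail coefficients
    roughly cover 4-8, 8-16, and 16-32 Hz respectively."""
    cA4, cD4, cD3, cD2, cD1 = pywt.wavedec(eeg_window, wavelet, level=4)
    theta, alpha, beta = cD4, cD3, cD2
    return np.array([np.sum(b ** 2) for b in (theta, alpha, beta)])

class LVQ1:
    """Bare-bones LVQ1: one prototype per class, nudged toward samples of
    its own class and away from samples of other classes."""
    def __init__(self, lr=0.05, epochs=50):
        self.lr, self.epochs = lr, epochs

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.proto_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        for _ in range(self.epochs):
            for x, label in zip(X, y):
                i = np.argmin(np.linalg.norm(self.proto_ - x, axis=1))
                sign = 1.0 if self.classes_[i] == label else -1.0
                self.proto_[i] += sign * self.lr * (x - self.proto_[i])
        return self

    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.proto_[None, :, :], axis=2)
        return self.classes_[np.argmin(d, axis=1)]

# Tiny synthetic demo: 30 windows of 1280 samples (10 s at 128 Hz), 3 classes
# (0 = excited, 1 = relaxed, 2 = sad). Real data would replace this.
rng = np.random.default_rng(1)
windows = rng.standard_normal((30, 1280))
labels = np.repeat([0, 1, 2], 10)
X = np.array([band_features(w) for w in windows])
model = LVQ1().fit(X, labels)
print(model.predict(X[:5]))
```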