199 research outputs found

    Heart Rate Variability Feature Selection using Random Forest for Mental Stress Quantification

    Mental stress is considered an essential element that affects decision making. Apart from mental stress, cognitive workload, mental effort, attention, and cognitive engagement are also involved in the decision-making process. Ambiguities among these concepts lead to confusion in their application. One objective of this thesis is to explore the relationship between mental stress and stress-related concepts. By investigating the mechanisms of decision-making, the differences and correlations between mental stress and these other concepts are clarified. Heart rate variability (HRV) is a common means of measuring mental stress. By investigating the correlation between HRV and mental stress, it can be confirmed that HRV responds to changes in mental stress rather than to the other concepts. HRV features are used to assess whether there is a relationship between baseline HRV and mental stress. However, the extracted features usually contain a large amount of redundancy, which adds computational complexity to mental stress quantification without contributing to quantification accuracy. Recently, researchers have resorted to the random forest as a tool for HRV feature selection. Another objective of this thesis is to select significant HRV features to quantify mental stress using the random forest method. In this thesis, an open-source data set, called the SWELL-KW data set, is used for mental stress measurement, where three labels are assigned according to different mental stress conditions, i.e., neutral, time pressure, and interruption. A set of HRV features is proposed based on time-domain and frequency-domain analysis for mental stress measurement. Statistical analysis is performed to select the essential features that reflect mental stress. The random forest algorithm for feature selection is then studied, and the accuracy in measuring mental stress is validated by comparing the extracted features of the training set and the testing set.
In order to evaluate the random forest algorithm's performance, comparisons with related algorithms, including support vector machine (SVM), decision tree, gradient boosting decision tree (GBDT), k-nearest neighbor (KNN), and deep neural networks (DNN), are also conducted in terms of accuracy and time cost. An optimal HRV feature subset is proposed for mental stress quantification, comprising median RR, mean RR, median REL RR, HR, pNN25, SDRR RMSSD, SDRR RMSSD REL RR, TP, SD2, and SDRR. It is shown that this subset of features receives high feature importance scores and thus has a significant effect on mental stress quantification. Performing random forest analysis with a sufficient amount of labeled data shows that the optimal HRV feature subset yields high mental stress quantification accuracy. Moreover, random forest consistently achieves the best overall performance in feature selection compared with the other algorithms in terms of accuracy and time cost. This also suggests a potential relation between physiological responses and mental activities.
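As a rough illustration of the selection step described above, the sketch below ranks features by random-forest importance and keeps the top-scoring half. It uses synthetic data under a handful of the feature names listed above, not the SWELL-KW recordings, so the resulting ranking is meaningless; only the mechanics are the point.

```python
# Hedged sketch of random-forest feature selection; data is synthetic,
# the feature names merely echo those reported in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
features = ["median_RR", "mean_RR", "HR", "pNN25", "SDRR", "SD2"]
X = rng.normal(size=(300, len(features)))
# Three stress conditions, as in SWELL-KW: neutral, time pressure, interruption.
y = rng.integers(0, 3, size=300)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
# Rank features by impurity-based importance and keep the top half.
ranked = sorted(zip(features, clf.feature_importances_),
                key=lambda t: t[1], reverse=True)
selected = [name for name, _ in ranked[: len(features) // 2]]
print(selected)
```

On real data, the retained subset would then be validated by retraining the classifier on the selected columns only and comparing accuracy and time cost, which is the comparison the thesis performs against SVM, GBDT, KNN and DNN.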

    Social media mental health analysis framework through applied computational approaches

    Studies have shown that mental illness burdens not only public health and productivity but also established market economies throughout the world. However, mental disorders are difficult to diagnose and monitor through traditional methods, which heavily rely on interviews, questionnaires and surveys, resulting in high under-diagnosis and under-treatment rates. The increasing use of online social media, such as Facebook and Twitter, is now a common part of people’s everyday life. The continuous and real-time user-generated content often reflects feelings, opinions, social status and behaviours of individuals, creating an unprecedented wealth of person-specific information. With advances in data science, social media has already been increasingly employed in population health monitoring and more recently mental health applications to understand mental disorders as well as to develop online screening and intervention tools. However, existing research efforts are still in their infancy, primarily aimed at highlighting the potential of employing social media in mental health research. The majority of work is developed on ad hoc datasets and lacks a systematic research pipeline. [Continues.]

    Algorithms and Software for the Analysis of Large Complex Networks

    The work presented intersects three main areas, namely graph algorithmics, network science and applied software engineering. Each computational method discussed relates to one of the main tasks of data analysis: to extract structural features from network data, such as methods for community detection; or to transform network data, such as methods to sparsify a network and reduce its size while keeping essential properties; or to realistically model networks through generative models.

    Machine Learning and Statistical Analysis of Complex Mathematical Models: An Application to Epilepsy

    The electroencephalogram (EEG) is a commonly used tool for studying the emergent electrical rhythms of the brain. It has wide utility in psychology, as well as providing a useful diagnostic aid for neurological conditions such as epilepsy. It is of growing importance to better understand the emergence of these electrical rhythms and, in the case of diagnosis of neurological conditions, to find mechanistic differences between healthy individuals and those with a disease. Mathematical models are an important tool that offer the potential to reveal these otherwise hidden mechanisms. In particular, Neural Mass Models (NMMs), which describe the macroscopic activity of large populations of neurons, are increasingly used to uncover large-scale mechanisms of brain rhythms in both health and disease. The dynamics of these models depend upon the choice of parameters, and therefore it is crucial to understand how the dynamics change when parameters are varied. Although NMMs are considered low-dimensional in comparison to micro-scale neural network models, with regard to understanding the relationship between parameters and dynamics they are still prohibitively high-dimensional for classical approaches such as numerical continuation. We need alternative methods to characterise the dynamics of NMMs in high-dimensional parameter spaces. The primary aim of this thesis is to develop a method to explore and analyse the high-dimensional parameter space of these mathematical models. We develop an approach based on statistics and machine learning methods called decision tree mapping (DTM). This method is used to analyse the parameter space of a mathematical model by studying all the parameters simultaneously. With this approach, the parameter space can be efficiently mapped in high dimension. We have used measures linked with this method to determine which parameters play a key role in the output of the model.
This approach recursively splits the parameter space into smaller subspaces with an increasing homogeneity of dynamics. The concepts of decision tree learning, random forest, measures of importance, statistical tests and visual tools are introduced to explore and analyse the parameter space. We formally introduce the theoretical background and the methods with examples. The DTM approach is used in three distinct studies to:
• Identify the role of parameters in the model's dynamics. For example, which parameters have a role in the emergence of seizure dynamics?
• Constrain the parameter space, such that regions of the parameter space which give implausible dynamics are removed.
• Compare the parameter sets fitted to different groups. How does the thalamocortical connectivity of people with and without epilepsy differ?
We demonstrate that classical studies have not taken into account the complexity of the parameter space. DTM can easily be extended to other fields using mathematical models. We advocate the use of this method in the future to constrain high-dimensional parameter spaces in order to enable more efficient, person-specific model calibration.
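A minimal sketch of the DTM idea, on a toy stand-in rather than a neural mass model: sample the parameter space, label each sample by the dynamics it produces, fit a decision tree so that the space is recursively split into increasingly homogeneous regions, and read off importance scores to see which parameters matter. The labelling rule below is invented for illustration; in the thesis the labels would come from simulating the NMM.

```python
# Hedged sketch of decision-tree mapping over a sampled parameter space.
# The "model" is a toy threshold rule, not a neural mass model.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
params = rng.uniform(-1, 1, size=(1000, 4))  # four hypothetical parameters

# Toy dynamics label: "seizure-like" when a simple combination crosses a
# threshold; by construction only parameters 0 and 2 actually matter.
labels = (params[:, 0] + 0.5 * params[:, 2] > 0).astype(int)

# The fitted tree recursively partitions the space into regions of
# increasingly homogeneous dynamics.
tree = DecisionTreeClassifier(max_depth=4, random_state=1).fit(params, labels)

# Importance scores reveal which parameters shape the dynamics.
importance = tree.feature_importances_
key_params = np.argsort(importance)[::-1][:2]
print(sorted(key_params.tolist()))
```

Each leaf of such a tree corresponds to a box in parameter space with (mostly) one kind of dynamics, which is what makes the method usable for constraining the space or comparing fitted groups.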

    Geographic Information Systems and Science

    Geographic information science (GISc) has established itself as a collaborative information-processing scheme that is increasing in popularity. Yet, this interdisciplinary and/or transdisciplinary system is still somewhat misunderstood. This book discusses some of the GISc domains encompassing students, researchers, and common users. Chapters focus on important aspects of GISc, keeping in mind the processing capability of GIS along with the mathematics and formulae involved in arriving at each solution. The book has one introductory and eight main chapters divided into five sections. The first section is more general and focuses on what GISc is and its relation to GIS and Geography, the second is about location analytics and modeling, the third on remote sensing data analysis, the fourth on big data and augmented reality, and, finally, the fifth looks at volunteered geographic information.

    Automation of the anesthetic process: New computer-based solutions to deal with the current frontiers in the assessment, modeling and control of anesthesia

    The current trend in automating the anesthetic process focuses on developing a system for fully controlling the different variables involved in anesthesia. To this end, several challenges need to be addressed first. The main objective of this thesis is to propose new solutions that provide answers to the current problems in the field of assessing, modeling and controlling the anesthetic process. Undoubtedly, the main obstacle to the development of a comprehensive proposal lies in the absence of a reliable measure of analgesia. This thesis proposes a novel fuzzy-logic-based scheme to evaluate the impact of including a new variable in a decision-making process. This scheme is validated by way of a preliminary analysis of the effect of the Analgesia Nociception Index (ANI) monitor on analgesic drug titration. Furthermore, the capacity of the ANI monitor to provide information to replicate the decisions of the experts in different clinical situations is studied. To this end, different artificial intelligence-based algorithms are used: specifically, the suitability of this index is evaluated against other variables commonly used in clinical practice. Regarding the modeling of anesthesia, this thesis presents an adaptive model that characterizes the pharmacological interaction effects between the hypnotic and analgesic drugs on the depth of hypnosis. In addition, the proposed model takes into account both inter- and intra-patient variabilities observed in the responses of the subjects. Finally, this work presents the synthesis of a robust optimal PID controller for regulating the depth of hypnosis by considering the effect of the uncertainties derived from the patient's pharmacological response. Moreover, a study is conducted on the limitations introduced when using a PID controller versus the development of higher-order solutions under the same clinical and technical considerations.
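The regulation loop described above can be sketched, very loosely, as a discrete PID controller driving a depth-of-hypnosis index toward a setpoint. The first-order "patient" and all gains below are illustrative stand-ins, not the robust optimal design or the pharmacological models of the thesis.

```python
# Hedged sketch of a discrete PID loop regulating a depth-of-hypnosis index.
# Gains and the toy first-order patient response are invented for
# illustration and are not clinically tuned.
def pid(error, integral, prev_error, dt, kp, ki, kd):
    integral += error * dt
    derivative = (error - prev_error) / dt
    return kp * error + ki * integral + kd * derivative, integral

target, level = 50.0, 90.0        # BIS-like index: ~90 awake, ~50 target
integral, prev_err, dt = 0.0, 0.0, 1.0
for _ in range(300):
    err = target - level
    u, integral = pid(err, integral, prev_err, dt, kp=0.5, ki=0.05, kd=0.1)
    prev_err = err
    # Toy patient: the index drifts back toward 90 (awake) while the
    # control input u (negative here, i.e. more drug) pushes it down.
    level += dt * (-0.05 * (level - 90.0) + u)

print(f"final index: {level:.1f}")
```

The integral term is what removes the steady-state offset caused by the patient's drift back toward the awake state; a pure proportional controller would settle above the target, which hints at why the thesis studies richer, uncertainty-aware designs.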

    Machine Learning for Understanding Focal Epilepsy

    The study of neural dysfunctions requires strong prior knowledge of brain physiology combined with expertise in data analysis, signal processing, and machine learning. One of the unsolved issues regarding epilepsy is the localization of the pathological brain areas causing seizures. Nowadays, the analysis of neural activity conducted with this goal still relies on visual inspection by clinicians and is therefore subject to human error, possibly leading to negative surgical outcomes. In the absence of any evidence from standard clinical tests, medical experts resort to invasive electrophysiological recordings, such as stereoelectroencephalography, to assess the pathological areas. These data are high-dimensional, may suffer from spatial and temporal correlation, and can be affected by high variability across the population. These aspects make any attempt at automation extremely challenging. In this context, this thesis tackles the problem of characterizing drug-resistant focal epilepsy. This work proposes methods to analyze intracranial electrophysiological recordings during the interictal state, leveraging the presurgical assessment of the pathological areas. The first contribution of the thesis is the design of a support tool for the identification of epileptic zones. This method relies on the multi-decomposition of the signal and similarity metrics. We built personalized models which share common usage of features across patients. The second main contribution aims at understanding whether there are particular frequency bands related to the epileptic areas and whether it is worthwhile to focus on shorter periods of time. Here we leverage the post-surgical outcome derived from the Engel classification. The last contribution focuses on the characterization of short patterns of activity at specific frequencies.
We argue that this effort could be helpful in the clinical routine and at the same time provides useful insights for the understanding of focal epilepsy.

    Complexity in Economic and Social Systems

    There is no term that better describes the essential features of human society than complexity. On various levels, from the decision-making processes of individuals, through the interactions between individuals leading to the spontaneous formation of groups and social hierarchies, up to the collective, herding processes that reshape whole societies, all these features share the property of irreducibility, i.e., they require a holistic, multi-level approach formed by researchers from different disciplines. This Special Issue aims to collect research studies that, by exploiting the latest advances in physics, economics, complex networks, and data science, make a step towards understanding these economic and social systems. The majority of submissions are devoted to financial market analysis and modeling, including the stock and cryptocurrency markets in the COVID-19 pandemic, systemic risk quantification and control, wealth condensation, the innovation-related performance of companies, and more. Looking more at societies, there are papers that deal with regional development, land speculation, and fake-news-fighting strategies, issues which are of central interest in contemporary society. On top of this, one of the contributions proposes a new, improved complexity measure.

    Quantifying cognitive and mortality outcomes in older patients following acute illness using epidemiological and machine learning approaches

    Introduction: Cognitive and functional decompensation during acute illness in older people is poorly understood. It remains unclear how delirium, an acute confusional state reflective of cognitive decompensation, is contextualised by baseline premorbid cognition and relates to long-term adverse outcomes. High-dimensional machine learning offers a novel, feasible and enticing approach for stratifying acute illness in older people, improving treatment consistency while optimising future research design. Methods: Longitudinal associations were analysed from the Delirium and Population Health Informatics Cohort (DELPHIC) study, a prospective cohort of people aged ≥70 years resident in Camden, with cognitive and functional ascertainment at baseline and 2-year follow-up, and daily assessments during incident hospitalisation. Second, using routine clinical data from UCLH, I constructed an extreme gradient-boosted trees model predicting 600-day mortality for unselected acute admissions of oldest-old patients, with mechanistic inferences. Third, hierarchical agglomerative clustering was performed to demonstrate structure within DELPHIC participants, with predictive implications for survival and length of stay. Results: i. Delirium is associated with increased rates of cognitive decline and mortality risk, in a dose-dependent manner, with an interaction between baseline cognition and delirium exposure. Those with the highest delirium exposure but also the best premorbid cognition have the “most to lose”. ii. High-dimensional multimodal machine learning models can predict mortality in oldest-old populations with 0.874 accuracy. The anterior cingulate and angular gyri, and extracranial soft tissue, are the highest contributory intracranial and extracranial features respectively. iii. Clinically useful acute illness subtypes in older people can be described using longitudinal clinical, functional, and biochemical features.
Conclusions: Interactions between baseline cognition and delirium exposure during acute illness in older patients result in divergent long-term adverse outcomes. Supervised machine learning can robustly predict mortality in oldest-old patients, producing a valuable prognostication tool using routinely collected data, ready for clinical deployment. Preliminary findings suggest possible discernible subtypes within acute illness in older people.
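The mortality-prediction step can be sketched with a gradient-boosted tree ensemble on synthetic "clinical" features. Scikit-learn's GradientBoostingClassifier stands in for the extreme gradient-boosted trees (XGBoost-style) model of the thesis, and the two features, labels and resulting accuracy below are invented, not UCLH data.

```python
# Hedged sketch of supervised mortality prediction with gradient-boosted
# trees; everything here is synthetic and purely illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 500
age = rng.uniform(85, 100, n)        # "oldest-old" admissions
biomarker = rng.normal(0.0, 1.0, n)  # stand-in for a routine blood test
X = np.column_stack([age, biomarker])

# Synthetic 600-day mortality label, loosely driven by both features
# plus noise, so the task is learnable but not trivial.
risk = 0.05 * (age - 85.0) + 0.5 * biomarker
y = (risk + rng.normal(0.0, 0.5, n) > 0.6).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=2)
model = GradientBoostingClassifier(random_state=2).fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.3f}")
```

In the real pipeline the feature matrix would hold hundreds of routinely collected multimodal variables per admission, and the model's feature attributions would support the mechanistic inferences reported in the Results.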

    Computational socioeconomics

    Uncovering the structure of socioeconomic systems and making timely estimates of socioeconomic status are significant for economic development. The understanding of socioeconomic processes provides foundations to quantify global economic development, to map regional industrial structure, and to infer individual socioeconomic status. In this review, we present a brief manifesto for a new interdisciplinary research field named Computational Socioeconomics, followed by a detailed introduction to data resources, computational tools, data-driven methods, theoretical models and novel applications at multiple resolutions, including the quantification of global economic inequality and complexity, the mapping of regional industrial structure and urban perception, the estimation of individual socioeconomic status and demographics, and the real-time monitoring of emergent events. This review, together with the pioneering works we have highlighted, will draw increasing interdisciplinary attention and induce a methodological shift in future socioeconomic studies.