
    EEG characterization of the Alzheimer's disease continuum by means of multiscale entropies

    Alzheimer's disease (AD) is a neurodegenerative disorder with high prevalence, known for its highly disabling symptoms. The aim of this study was to characterize the alterations in the irregularity and the complexity of brain activity along the AD continuum. Both irregularity and complexity can be studied by applying entropy-based measures across multiple temporal scales. In this regard, multiscale sample entropy (MSE) and refined multiscale spectral entropy (rMSSE) were calculated from electroencephalographic (EEG) data. Five minutes of resting-state EEG activity were recorded from 51 healthy controls, 51 subjects with mild cognitive impairment (MCI), 51 mild AD patients (ADMIL), 50 moderate AD patients (ADMOD), and 50 severe AD patients (ADSEV). Our results show statistically significant differences (p-values < 0.05, FDR-corrected Kruskal-Wallis test) between the five groups at each temporal scale. Additionally, average slope values and areas under the MSE and rMSSE curves revealed significant changes in complexity, mainly for the controls vs. MCI, MCI vs. ADMIL, and ADMOD vs. ADSEV comparisons (p-values < 0.05, FDR-corrected Mann-Whitney U-test). These findings indicate that MSE and rMSSE reflect the neuronal disturbances associated with the development of dementia, and may contribute to new tools for tracking AD progression. This research was supported by the European Commission and the European Regional Development Fund (FEDER) under the project "Análisis y correlación entre el genoma completo y la actividad cerebral para la ayuda en el diagnóstico de la enfermedad de Alzheimer" (Cooperation Programme Interreg V-A Spain-Portugal, POCTEP 2014-2020); by the "Ministerio de Ciencia, Innovación y Universidades" and FEDER under projects PGC2018-098214-A-I00 and DPI2017-84280-R; and by the "Fundação para a Ciência e a Tecnologia/Ministério da Ciência, Tecnologia e Inovação" and FEDER under projects POCI-01-0145-FEDER-007274 and UID/MAT/00144/2013.
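    As a concrete point of reference for the measures described above, the sketch below computes a multiscale sample entropy curve in NumPy (coarse-graining followed by sample entropy). It is a minimal illustration, assuming the conventional parameters m = 2 and r = 0.2 and surrogate noise in place of a real EEG channel; the refined multiscale spectral entropy variant is omitted for brevity.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) with tolerance r * std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)

    def count_matches(mm):
        # Templates of length mm; both lengths use the same n - m start points.
        templates = np.array([x[i:i + mm] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to all later templates.
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist < tol)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_sample_entropy(x, max_scale=20, m=2, r=0.2):
    """MSE curve: sample entropy of the coarse-grained series at each scale."""
    x = np.asarray(x, dtype=float)
    curve = []
    for tau in range(1, max_scale + 1):
        # Coarse-grain: average non-overlapping windows of length tau.
        k = len(x) // tau
        coarse = x[:k * tau].reshape(k, tau).mean(axis=1)
        curve.append(sample_entropy(coarse, m=m, r=r))
    return np.array(curve)

# Example: MSE curve of surrogate noise standing in for one EEG channel.
rng = np.random.default_rng(0)
print(multiscale_sample_entropy(rng.standard_normal(2_000), max_scale=5))
```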

    Permutation entropy and its main biomedical and econophysics applications: a review

    Entropy is a powerful tool for the analysis of time series, as it describes the probability distribution of the possible states of a system, and therefore the information encoded in it. Nevertheless, important information may also be encoded in the temporal dynamics, an aspect that is not usually taken into account. The idea of calculating entropy based on permutation patterns (that is, permutations defined by the order relations among the values of a time series) has received considerable attention in recent years, especially for the understanding of complex and chaotic systems. Permutation entropy directly accounts for the temporal information contained in the time series; furthermore, it has the virtues of simplicity, robustness, and very low computational cost. To celebrate the tenth anniversary of the original work, here we analyze the theoretical foundations of permutation entropy, as well as its main recent applications to the analysis of economic markets and the understanding of biomedical systems.
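    To make the construction concrete, here is a minimal sketch of the Bandt-Pompe permutation entropy (ordinal patterns of order m at delay tau). Normalising by log(m!) is a common convention rather than a prescription of the review, and all parameter values below are illustrative.

```python
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, m=3, tau=1, normalise=True):
    """Shannon entropy of the ordinal patterns of length m in the series x."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    # Map each window to the permutation that sorts it (its ordinal pattern).
    patterns = Counter(
        tuple(np.argsort(x[i:i + m * tau:tau])) for i in range(n)
    )
    probs = np.array(list(patterns.values()), dtype=float) / n
    h = -np.sum(probs * np.log(probs))
    return h / math.log(math.factorial(m)) if normalise else h

rng = np.random.default_rng(0)
print(permutation_entropy(rng.standard_normal(10_000)))  # ~1 for white noise
print(permutation_entropy(np.arange(10_000)))            # 0 for a monotone ramp
```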

    Time-series Generation by Contrastive Imitation

    Consider learning a generative model for time-series data. The sequential setting poses a unique challenge: Not only should the generator capture the conditional dynamics of (stepwise) transitions, but its open-loop rollouts should also preserve the joint distribution of (multi-step) trajectories. On one hand, autoregressive models trained by MLE allow learning and computing explicit transition distributions, but suffer from compounding error during rollouts. On the other hand, adversarial models based on GAN training alleviate such exposure bias, but transitions are implicit and hard to assess. In this work, we study a generative framework that seeks to combine the strengths of both: Motivated by a moment-matching objective to mitigate compounding error, we optimize a local (but forward-looking) transition policy, where the reinforcement signal is provided by a global (but stepwise-decomposable) energy model trained by contrastive estimation. At training, the two components are learned cooperatively, avoiding the instabilities typical of adversarial objectives. At inference, the learned policy serves as the generator for iterative sampling, and the learned energy serves as a trajectory-level measure for evaluating sample quality. By expressly training a policy to imitate the sequential behavior of time-series features in a dataset, this approach embodies "generation by imitation". Theoretically, we illustrate the correctness of this formulation and the consistency of the algorithm. Empirically, we evaluate its ability to generate predictively useful samples from real-world datasets, verifying that it performs at the standard of existing benchmarks.
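    The cooperative scheme can be sketched in a few lines. The following PyTorch toy is a schematic illustration of the policy/energy interplay described above, not the paper's exact objective or architecture: the random-walk data, Gaussian GRU policy, and score-function update are all stand-in assumptions.

```python
import torch
import torch.nn as nn

dim, horizon, batch = 8, 16, 32

policy = nn.GRU(dim, 64, batch_first=True)            # recurrent transition policy
head = nn.Linear(64, 2 * dim)                         # Gaussian mean and log-std
energy = nn.Sequential(nn.Linear(2 * dim, 64), nn.ReLU(), nn.Linear(64, 1))

opt_pi = torch.optim.Adam(list(policy.parameters()) + list(head.parameters()), lr=1e-3)
opt_e = torch.optim.Adam(energy.parameters(), lr=1e-3)

def rollout(x0):
    """Open-loop rollout from the policy, also returning step log-probabilities."""
    xs, logps, x, h = [x0], [], x0, None
    for _ in range(horizon - 1):
        out, h = policy(x.unsqueeze(1), h)
        mean, log_std = head(out.squeeze(1)).chunk(2, dim=-1)
        dist = torch.distributions.Normal(mean, log_std.exp())
        x = dist.sample()
        xs.append(x)
        logps.append(dist.log_prob(x).sum(-1))
    return torch.stack(xs, 1), torch.stack(logps, 1)

def transitions(traj):
    """Flatten a trajectory into its (x_t, x_{t+1}) transition pairs."""
    return torch.cat([traj[:, :-1], traj[:, 1:]], dim=-1).reshape(-1, 2 * dim)

real = torch.randn(batch, horizon, dim).cumsum(1)     # stand-in dataset (random walks)
for step in range(200):
    fake, logp = rollout(real[:, 0])
    # Energy model via contrastive estimation: low on real, high on generated.
    e_loss = energy(transitions(real)).mean() - energy(transitions(fake)).mean()
    opt_e.zero_grad(); e_loss.backward(); opt_e.step()
    # Policy update: negative energy of its own rollouts as reinforcement signal.
    reward = -energy(transitions(fake)).reshape(batch, horizon - 1).sum(1).detach()
    pi_loss = -(reward * logp.sum(1)).mean()
    opt_pi.zero_grad(); pi_loss.backward(); opt_pi.step()
```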

    Organic electrochemical networks for biocompatible and implantable machine learning: Organic bioelectronic beyond sensing

    How can the brain be such a good computer? Part of the answer lies in the astonishing number of neurons and synapses that process electrical impulses in parallel. Part of it must be found in the ability of the nervous system to evolve in response to external stimuli and to grow, sharpen, and depress synaptic connections. However, we are far from understanding even the basic mechanisms that allow us to think, be aware, recognize patterns, and imagine. The brain can do all this while consuming only around 20 watts, out-competing any human-made processor in terms of energy efficiency. This question is of particular interest in a historical era and technological stage in which the phrases machine learning and artificial intelligence are increasingly widespread, thanks to recent advances in the field of computer science. However, brain-inspired computation today still relies on algorithms that run on traditional silicon-based digital processors. By contrast, making brain-like hardware, in which the substrate itself performs the computation and can dynamically update its electrical pathways, remains challenging. In this work, I employed organic semiconductors that operate in electrolytic solutions, called organic mixed ionic-electronic conductors (OMIECs), to build hardware capable of computation. Moreover, by exploiting an electropolymerization technique, I could form conducting connections in response to electrical spikes, in analogy to how synapses evolve when a neuron fires. After demonstrating artificial synapses as a potential building block for neuromorphic chips, I shifted my attention to the implementation of such synapses in fully operational networks. In doing so, I borrowed the mathematical framework of a machine learning approach known as reservoir computing, which allows computation with random (neural) networks. I focused on demonstrating that such networks can be used in vivo for the recognition and classification of dangerous and healthy heartbeats. This is the first demonstration of machine learning carried out in a biological environment with a biocompatible substrate. The implications of this technology are straightforward: constant monitoring of biological signals and fluids, accompanied by active recognition of malign patterns, may lead to a timely, targeted, and early diagnosis of potentially fatal conditions. Finally, in attempting to simulate these random neural networks, I faced difficulties in modeling the devices with the state-of-the-art approach. I therefore explored a new way to describe OMIECs and OMIEC-based devices, starting from thermodynamic axioms. The results of this model shed light on the mechanism behind the operation of organic electrochemical transistors, revealing the importance of the entropy of mixing and suggesting new pathways for device optimization for targeted applications.
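    As a software analogue of the hardware networks described above, the sketch below implements the reservoir-computing idea in its simplest echo-state form: a fixed random recurrent network projects an input signal into a high-dimensional state, and only a linear readout is trained. The synthetic "heartbeat" signals, network sizes, and ridge readout are illustrative assumptions, not the thesis's in-vivo setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius < 1

def reservoir_state(u):
    """Drive the fixed random reservoir with series u; return the final state."""
    x = np.zeros(n_res)
    for t in range(len(u)):
        x = np.tanh(W_in @ u[t:t + 1] + W @ x)
    return x

# Surrogate signals: "healthy" = clean sine beat, "dangerous" = distorted beat.
t = np.linspace(0, 4 * np.pi, 200)
healthy = [np.sin(t) + 0.1 * rng.standard_normal(len(t)) for _ in range(50)]
danger = [np.sin(t) ** 3 + 0.1 * rng.standard_normal(len(t)) for _ in range(50)]

X = np.array([reservoir_state(u) for u in healthy + danger])
y = np.array([0] * 50 + [1] * 50)

# Train only the linear readout (ridge regression); the reservoir stays fixed.
ridge = 1e-2
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = (X @ W_out > 0.5).astype(int)
print("training accuracy:", (pred == y).mean())
```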

    Fundamental frequency estimation of low-quality electroglottographic signals

    Fundamental frequency (fo) is often estimated from electroglottographic (EGG) signals. Due to the nature of the method, the quality of EGG signals may be impaired by certain features such as amplitude or baseline drifts, mains hum, or noise. The potential adverse effects of these factors on fo estimation have to date not been investigated. Here, the performance of thirteen algorithms for estimating fo was tested on 147 synthesized EGG signals with varying degrees of signal quality deterioration. Algorithm performance was assessed through the standard deviation σfo of the difference between known and estimated fo data, expressed in octaves. With very few exceptions, simulated mains hum and amplitude and baseline drifts did not influence fo results, even though some algorithms consistently outperformed others. When increasing either the cycle-to-cycle fo variation or the degree of subharmonics, the SIGMA algorithm had the best performance (max. σfo = 0.04). That algorithm was, however, more easily disturbed by typical EGG equipment noise, whereas the NDF and Praat's autocorrelation algorithms performed best in this category (σfo = 0.01). These results suggest that the algorithm for fo estimation of EGG signals needs to be selected specifically for each particular data set. Overall, estimated fo data should be interpreted with care.
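    A small sketch of the evaluation metric described above: the standard deviation of the per-cycle difference between known and estimated fo, expressed in octaves (i.e. on a log2 frequency scale). The synthetic fo tracks are illustrative assumptions.

```python
import numpy as np

def sigma_fo_octaves(fo_true, fo_est):
    """Std of the fo estimation error in octaves (log2 of the frequency ratio)."""
    fo_true = np.asarray(fo_true, dtype=float)
    fo_est = np.asarray(fo_est, dtype=float)
    return np.std(np.log2(fo_est / fo_true))

fo_true = np.full(100, 220.0)                      # known synthesis fo (Hz)
fo_est = fo_true * 2 ** (0.01 * np.random.default_rng(0).standard_normal(100))
print(f"sigma_fo = {sigma_fo_octaves(fo_true, fo_est):.3f} octaves")
```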

    Procedures and Methodologies for the Control and Improvement of Energy-Environmental Quality in Construction

    This Special Issue aims to provide the state of the art on procedures and methodologies developed to improve energy and environmental performance through building renovation. We are deeply grateful to our colleagues, the building physics experts, building technology researchers, and urban environment scholars who contributed to this Special Issue, for sharing their original work in the field.

    The Nanoformulation of Brain Derived Neurotrophic Factor and Reformulation with PEG-free Polymers

    Therapeutic proteins and other biologics are becoming increasingly common across wide swathes of the healthcare system. However, biological macromolecules are particularly susceptible to immune recognition, systemic clearance, and degradation. Various methods of evading the immune system and prolonging the bioavailability of a given therapeutic have been devised and implemented. Polyion complexation, whereby a block ionomer is complexed with a polyelectrolyte to form a core-shell structure, has become an increasingly popular method of encapsulation. Notably, the most common shell-forming block is poly(ethylene glycol) (PEG). Understanding the complexity of polyion complex systems requires background knowledge of how complexation is driven. Chapter 1 of this dissertation reviews polymer complexation, encapsulation of proteins, PEG characteristics and alternative options, and the biology of brain-derived neurotrophic factor (BDNF). Chapter 2 of this dissertation addresses the nanoformulation of BDNF and PEG-b-poly(glutamic acid) to yield a polyion complex nanoparticle, termed Nano-BDNF. Extensive characterization indicates a spherical, core-shell particle with size and dispersity appropriate for administration. Association is driven by electrostatic attraction between the polymer and the protein, then stabilized via a hydrogen-bonding network. The particle is stable in high-ionic-strength solutions, protects from common mucosal opsonins, and selectively releases to specific binding partners. Encapsulation preserves activity in the brain and mediates delivery via the intranasal-to-brain pathway. Treatment in a Parkinson's disease model is also efficacious. However, the particle was formulated with a polymer containing PEG, and there are several issues with PEG (reviewed in Chapter 1), primarily immunogenicity. Chapter 3 deals with the reformulation of Nano-BDNF with two novel polymers. Reformulation yielded relatively small and narrowly dispersed particles with morphology and behavior similar to the original Nano-BDNF formulation. Additionally, we confirm cooperative binding and investigate the effects of pH change on formation. Indeed, we observe behavior consistent with the polyelectrolyte complexes that inspired the development of polyion complexes. This reformulation can offer a way to diversify and supplement the therapeutic arsenal in order to avoid disruptions by immunogenic PEG.

    Theoretical-experimental study on protein-ligand interactions based on thermodynamics methods, molecular docking and perturbation models

    This doctoral thesis focuses on understanding the thermodynamic events of protein-ligand interactions, which have been of paramount importance from traditional Medicinal Chemistry to Nanobiotechnology. Particular attention has been paid to the application of state-of-the-art methodologies to thermodynamic studies of protein-ligand interactions by integrating structure-based molecular docking techniques, classical fractal approaches to protein-ligand complementarity problems, perturbation models to study allosteric signal propagation, and predictive nano-quantitative structure-toxicity relationship models coupled with powerful experimental validation techniques. The contributions of this work could open new horizons in the fields of Drug Discovery, Materials Sciences, Molecular Diagnosis, and Environmental Health Sciences.

    Generative adversarial networks for sequential learning

    Generative modelling aims to learn the data-generating mechanism from observations without supervision. It is a desirable and natural approach for learning from unlabelled data, which is easily accessible. Deep generative models refer to a class of generative models combined with deep learning techniques, taking advantage of the intuitive principles of generative models as well as the expressiveness and flexibility of neural networks. The applications of generative modelling include image, audio, and video synthesis, text summarisation and translation, and so on. The methods developed in this thesis focus on domains involving data of a sequential nature, such as video generation and prediction, weather forecasting, and dynamic 3D reconstruction. Firstly, we introduce a new adversarial algorithm for training generative models suitable for sequential data. This algorithm is built on the theory of Causal Optimal Transport (COT), which constrains the transport plans to respect the temporal dependencies exhibited in the data. Secondly, the algorithm is extended to learn conditional sequences, that is, how a sequence is likely to evolve given an observation of its past evolution. Meanwhile, we work with modified empirical measures to guarantee the convergence of the COT distance when the sequences do not overlap at any time step. Thirdly, we show that state-of-the-art results in complex spatio-temporal modelling with GANs can be further improved by leveraging prior knowledge of spatio-temporal correlations in the domain of weather forecasting. Finally, we demonstrate how deep generative models can be adopted to address the classical statistical problem of conditional independence testing. A class of classical approaches to this task requires computing a test statistic from samples drawn from two unknown conditional distributions. We therefore present a double-GAN framework that learns two generative models approximating both conditional distributions. The success of this approach sheds light on how certain challenging statistical problems can benefit from the accurate learning and efficient sampling of deep generative models.
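    To give the flavour of the conditional-independence testing step, here is a schematic sketch in which the trained GAN generators are replaced by known stand-in samplers for p(x|z) and p(y|z): generated samples supply the null distribution of a test statistic. The statistic, sample sizes, and functional forms are all illustrative assumptions, not the thesis's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the two learned conditional generators.
def sample_x_given_z(z): return z + 0.1 * rng.standard_normal(z.shape)
def sample_y_given_z(z): return z ** 2 + 0.1 * rng.standard_normal(z.shape)

def statistic(x, y, z):
    """Partial correlation of x and y after regressing out a quadratic in z."""
    rx = x - np.polyval(np.polyfit(z, x, 2), z)
    ry = y - np.polyval(np.polyfit(z, y, 2), z)
    return abs(np.corrcoef(rx, ry)[0, 1])

n, B = 500, 200
z = rng.standard_normal(n)
x, y = sample_x_given_z(z), sample_y_given_z(z)    # here H0 (x independent of y given z) holds
t_obs = statistic(x, y, z)

# Null distribution: redraw x and y independently given z from the generators.
null = [statistic(sample_x_given_z(z), sample_y_given_z(z), z) for _ in range(B)]
p_value = (np.sum(np.array(null) >= t_obs) + 1) / (B + 1)
print(f"T = {t_obs:.3f}, p = {p_value:.3f}")
```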