
    Analysis of cross-correlations in electroencephalogram signals as an approach to proactive diagnosis of schizophrenia

    We apply flicker-noise spectroscopy (FNS), a time series analysis method operating on structure functions and power spectrum estimates, to study the clinical electroencephalogram (EEG) signals recorded in children/adolescents (11 to 14 years of age) with diagnosed schizophrenia-spectrum symptoms at the National Center for Psychiatric Health (NCPH) of the Russian Academy of Medical Sciences. The EEG signals for these subjects were compared with the signals for a control sample of chronically depressed children/adolescents. The purpose of the study is to look for diagnostic signs of subjects' susceptibility to schizophrenia in the FNS parameters for specific electrodes and in cross-correlations between the signals simultaneously measured at different points on the scalp. Our analysis of EEG signals from scalp-mounted electrodes at locations F3 and F4, which are symmetrically positioned in the left and right frontal areas of the cerebral cortex, respectively, demonstrates an essential role of frequency-phase synchronization, a phenomenon representing specific correlations between the characteristic frequencies and phases of excitations in the brain. We introduce quantitative measures of frequency-phase synchronization and systematize the values of FNS parameters for the EEG data. The comparison of our results with the medical diagnoses for 84 subjects performed at NCPH makes it possible to group the EEG signals into 4 categories corresponding to different risk levels of subjects' susceptibility to schizophrenia. We suggest that the introduced quantitative characteristics and classification of cross-correlations may be used for the diagnosis of schizophrenia at the early stages of its development.
    Comment: 36 pages, 6 figures, 2 tables; to be published in Physica A.
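The two estimates FNS operates on, the structure function (difference moment) and the power spectrum, can be sketched in a few lines. This is a minimal illustration only: the random-walk surrogate signal, record length, and lag range are placeholders, not the study's EEG data or its actual FNS parameterization.

```python
import numpy as np

def structure_function(x, max_lag):
    """Second-order difference moment S(tau) = <[x(t + tau) - x(t)]^2>."""
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2)
                     for lag in range(1, max_lag + 1)])

def power_spectrum(x):
    """Periodogram estimate of the power spectral density."""
    X = np.fft.rfft(x - x.mean())
    return (np.abs(X) ** 2) / len(x)

rng = np.random.default_rng(0)
eeg_like = np.cumsum(rng.standard_normal(4096))  # random-walk stand-in for one EEG channel
S = structure_function(eeg_like, max_lag=100)
psd = power_spectrum(eeg_like)
```

Cross-correlation analysis as in the paper would apply the same machinery to pairs of channels (e.g. F3 and F4) rather than to a single signal.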

    Event-triggered Learning

    The efficient exchange of information is an essential aspect of intelligent collective behavior. Event-triggered control and estimation achieve some efficiency by replacing continuous data exchange between agents with intermittent, event-triggered communication. Typically, model-based predictions are used at times of no data transmission, and updates are sent only when the prediction error grows too large. The effectiveness in reducing communication thus depends strongly on the quality of the prediction model. In this article, we propose event-triggered learning as a novel concept to reduce communication even further and to adapt to changing dynamics. By monitoring the actual communication rate and comparing it to the one induced by the model, we detect a mismatch between model and reality and trigger model learning when needed. Specifically, for linear Gaussian dynamics, we derive different classes of learning triggers based solely on a statistical analysis of inter-communication times, and we formally prove their effectiveness with the aid of concentration inequalities.
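The send-on-prediction-error mechanism the abstract builds on can be sketched for a scalar linear Gaussian system. The dynamics, noise level, and threshold below are illustrative placeholders, not the paper's derived triggers; the point is only that the empirical communication rate, which event-triggered learning compares against the model-implied rate, falls out of the inter-communication times.

```python
import numpy as np

rng = np.random.default_rng(1)

# Scalar linear Gaussian dynamics x_{k+1} = a*x_k + w_k (illustrative values).
a, noise_std, threshold = 0.9, 0.1, 0.3

x = 0.0              # true state at the sender
x_hat = 0.0          # model-based prediction held by the receiver
events = 0
inter_comm_times = []
last_event = 0

for k in range(1, 1001):
    x = a * x + noise_std * rng.standard_normal()
    x_hat = a * x_hat                  # receiver predicts with the model
    if abs(x - x_hat) > threshold:     # event trigger: prediction error too large
        x_hat = x                      # transmit the true state, resetting the error
        events += 1
        inter_comm_times.append(k - last_event)
        last_event = k

empirical_rate = events / 1000         # compared against the model-implied rate
```

A learning trigger in the paper's sense would fire when `empirical_rate` (or the distribution of `inter_comm_times`) deviates significantly from what the current model predicts.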

    Dynamic state reconciliation and model-based fault detection for chemical processes

    In this paper, we present a method for fault detection based on residual generation. The main idea is to reconstruct the outputs of the system from the measurements using the extended Kalman filter. The estimates are compared to the values of the reference model, and deviations are interpreted as possible faults. The reference model is simulated by the dynamic hybrid simulator PrODHyS. The use of this method is illustrated through an application in the field of chemical processes.
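The residual-generation idea can be sketched with a scalar linear Kalman filter standing in for the paper's extended Kalman filter, and toy dynamics standing in for the PrODHyS reference model; all numbers, the fault magnitude, and the 3-sigma test below are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Scalar linear stand-in for the process model; q and r are the process and
# measurement noise variances (all values illustrative).
a, q, r = 0.95, 0.01, 0.04

x_true, x_hat, P = 0.0, 0.0, 1.0
residuals, alarm_times = [], []

for k in range(300):
    x_true = a * x_true + np.sqrt(q) * rng.standard_normal()
    fault = 2.0 if k >= 150 else 0.0   # additive sensor fault injected at k = 150
    y = x_true + fault + np.sqrt(r) * rng.standard_normal()

    # Kalman predict/update; the innovation (residual) drives fault detection.
    x_pred = a * x_hat
    P_pred = a * P * a + q
    residual = y - x_pred
    if abs(residual) > 3.0 * np.sqrt(P_pred + r):  # 3-sigma residual test
        alarm_times.append(k)
    K = P_pred / (P_pred + r)
    x_hat = x_pred + K * residual
    P = (1.0 - K) * P_pred
    residuals.append(residual)
```

After an alarm, a full scheme would go on to isolate the fault; here the filter simply absorbs the bias, so only the transient is flagged.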

    A multilevel integrative approach to hospital case mix and capacity planning.

    Hospital case mix and capacity planning involves decision making both on the patient volumes that a hospital can care for and on resource requirements and capacity management. In this research, to advance both hospital resource efficiency and the health care service level, a multilevel integrative approach to the planning problem is proposed on the basis of mathematical programming modeling and simulation analysis. It consists of three stages: the case mix planning phase, the master surgery scheduling phase, and the operational performance evaluation phase. At the case mix planning phase, a hospital is assumed to choose the optimal patient mix and volume that bring the maximum overall financial contribution under the given resource capacity. Then, to improve the patient service level, the total expected bed shortage due to the variable length of stay of patients is minimized by reallocating bed capacity and building balanced master surgery schedules at the master surgery scheduling phase. After that, performance evaluation is carried out at the operational stage through simulation analysis, and a few effective operational policies are suggested and analyzed to improve the trade-off between resource efficiency and service level. The three stages interact and are combined in an iterative way to make sound decisions both on the patient case mix and on the resource allocation.
    Keywords: Health care; Case mix and capacity planning; Master surgery schedule; Multilevel; Resource efficiency; Service level.
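The case mix planning phase described above is a mathematical program: choose patient volumes to maximize financial contribution under capacity limits. With a single aggregate resource (bed-days) and continuous volumes, that program reduces to a fractional knapsack solvable greedily, which the sketch below uses; the patient groups, contributions, and capacity are entirely hypothetical numbers, not data from the study.

```python
# Each patient group: (name, contribution per case, bed-days per case, max demand).
groups = [
    ("cardio",  5000, 6.0, 120),
    ("ortho",   3500, 4.0, 200),
    ("general", 2000, 2.5, 300),
]
capacity = 1500.0  # available bed-days in the planning period

# Single-resource LP == fractional knapsack: admit groups in decreasing order
# of contribution per bed-day until the capacity runs out.
plan = {}
remaining = capacity
for name, contrib, bed_days, demand in sorted(
        groups, key=lambda g: g[1] / g[2], reverse=True):
    volume = min(demand, max(remaining, 0.0) / bed_days)
    plan[name] = volume
    remaining -= volume * bed_days

total_contribution = sum(
    plan[name] * contrib for name, contrib, bed_days, demand in groups)
```

With several resource constraints (operating theatre time, nursing hours, beds), as in the paper, a general LP/MIP solver replaces the greedy rule, but the decision variables and objective stay the same.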

    Deepr: A Convolutional Net for Medical Records

    Feature engineering remains a major bottleneck when creating predictive systems from electronic medical records. At present, an important missing element is detecting predictive regular clinical motifs in irregular episodic records. We present Deepr (short for Deep record), a new end-to-end deep learning system that learns to extract features from medical records and predicts future risk automatically. Deepr transforms a record into a sequence of discrete elements separated by coded time gaps and hospital transfers. On top of the sequence is a convolutional neural net that detects and combines predictive local clinical motifs to stratify the risk. Deepr permits transparent inspection and visualization of its inner workings. We validate Deepr on hospital data to predict unplanned readmission after discharge. Deepr achieves superior accuracy compared to traditional techniques, detects meaningful clinical motifs, and uncovers the underlying structure of the disease and intervention space.
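The pipeline the abstract describes (discrete codes with gap tokens, an embedding, a convolutional motif detector, pooling, and a risk output) can be condensed into a few lines of NumPy. The vocabulary, record, and weights below are random, untrained placeholders; this shows the data flow, not Deepr's actual architecture sizes.

```python
import numpy as np

rng = np.random.default_rng(3)

# A record in Deepr's style: discrete medical codes with a special token for
# coded time gaps ("GAP"); this tiny vocabulary is hypothetical.
vocab = {"GAP": 0, "diabetes": 1, "insulin": 2, "mi": 3, "statin": 4}
record = ["diabetes", "insulin", "GAP", "mi", "statin"]
seq = np.array([vocab[c] for c in record])

embed_dim, kernel = 8, 3
E = rng.standard_normal((len(vocab), embed_dim))  # code embeddings
W = rng.standard_normal((kernel, embed_dim))      # one convolutional motif detector
w_out, b_out = rng.standard_normal(), 0.0

X = E[seq]  # (seq_len, embed_dim)
# Valid 1-D convolution over the code sequence, then global max pooling:
conv = np.array([np.sum(W * X[i:i + kernel])
                 for i in range(len(seq) - kernel + 1)])
pooled = conv.max()
risk = 1.0 / (1.0 + np.exp(-(w_out * pooled + b_out)))  # readmission risk score
```

In the real system many filters of several widths are learned jointly with the embeddings, and the max-pooled responses feed a classifier; the "motif" a filter detects is whatever local code pattern maximizes its response.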

    SimpactCyan 1.0: an open-source simulator for individual-based models in HIV epidemiology with R and Python interfaces

    SimpactCyan is an open-source simulator for individual-based models in HIV epidemiology. Its core algorithm is written in C++ for computational efficiency, while the R and Python interfaces aim to make the tool accessible to the fast-growing community of R and Python users. Transmission, treatment and prevention of HIV infections in dynamic sexual networks are simulated by discrete events. A generic "intervention" event allows model parameters to be changed over time, and can be used to model medical and behavioural HIV prevention programmes. First, we describe a more efficient variant of the modified Next Reaction Method that drives our continuous-time simulator. Next, we outline key built-in features and assumptions of individual-based models formulated in SimpactCyan, and provide code snippets showing how to formulate, execute and analyse models in SimpactCyan through its R and Python interfaces. Lastly, we give two examples of applications in HIV epidemiology. The first demonstrates how the software can be used to estimate the impact of progressive changes to the eligibility criteria for HIV treatment on HIV incidence. The second illustrates the use of SimpactCyan as a data-generating tool for assessing the performance of a phylodynamic inference framework.
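The discrete-event core mentioned above can be sketched as a next-reaction-style scheduler: every possible event gets a tentative exponentially distributed firing time, the earliest one executes, and newly enabled events are scheduled. The SIR-like process, rates, and population size below are illustrative only, and a full modified Next Reaction Method would also rescale pending clocks when hazards change, which this sketch omits.

```python
import heapq
import random

random.seed(4)

# Illustrative rates, not SimpactCyan's hazard formulation.
beta, gamma = 0.4, 0.2     # per-clock infection rate, recovery rate
n = 30
susceptible = set(range(1, n))
infected = {0}             # index case

queue = []                 # priority queue of (time, kind, person)

def schedule(t_now, kind, person, rate):
    """Draw an exponential waiting time and enqueue the tentative event."""
    heapq.heappush(queue, (t_now + random.expovariate(rate), kind, person))

schedule(0.0, "recover", 0, gamma)
for s in susceptible:
    schedule(0.0, "infect", s, beta * len(infected))

events_fired = 0
while queue and infected:
    t, kind, person = heapq.heappop(queue)
    if kind == "infect" and person in susceptible:
        susceptible.discard(person)
        infected.add(person)
        schedule(t, "recover", person, gamma)  # newly enabled event
    elif kind == "recover":
        infected.discard(person)
    events_fired += 1
```

SimpactCyan's generic "intervention" event fits naturally into this scheme: it is simply another scheduled event whose execution changes model parameters, after which affected pending event times are recomputed.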