180 research outputs found

    Clustering Approach to Quantify Long-Term Spatio-Temporal Interactions in Epileptic Intracranial Electroencephalography

    Abnormal dynamical coupling between brain structures is believed to be primarily responsible for the generation of epileptic seizures and their propagation. In this study, we attempt to identify the spatio-temporal interactions of an epileptic brain using a previously proposed nonlinear dependency measure. Using a clustering model, we determine the average spatial mappings in an epileptic brain at different stages of a complex partial seizure. Results involving 8 seizures from 2 epileptic patients suggest that there may be a fixed pattern associated with regional spatio-temporal dynamics during the transition from the interictal to the pre- and post-ictal states.

    Automated smoother for the numerical decoupling of dynamics models

    Background: Structure identification of dynamic models for complex biological systems is the cornerstone of their reverse engineering. Biochemical Systems Theory (BST) offers a particularly convenient solution because its parameters are kinetic-order coefficients that directly identify the topology of the underlying network of processes. We previously proposed a numerical decoupling procedure that allows the identification of multivariate dynamic models of complex biological processes. While described here within the context of BST, this procedure is generally applicable to signal extraction. Our original implementation relied on artificial neural networks (ANN), which introduced a slight, undesirable bias during the smoothing of the time courses. As an alternative, we propose an adaptation of the Whittaker smoother and demonstrate its role within a robust, fully automated structure identification procedure.
    Results: In this report we propose a robust, fully automated solution for signal extraction from time series, which is the prerequisite for the efficient reverse engineering of biological systems models. The Whittaker smoother is reformulated within the context of information theory and extended by the development of adaptive signal segmentation to account for heterogeneous noise structures. The resulting procedure can be used on arbitrary time series with a nonstationary noise process; it is illustrated here with metabolic profiles obtained from in-vivo NMR experiments. The smoothed solution, free of parametric bias, permits differentiation, which is crucial for the numerical decoupling of systems of differential equations.
    Conclusion: The method is applicable to signal extraction from time series with a nonstationary noise structure and to the numerical decoupling of systems of differential equations into algebraic equations, and thus constitutes a rather general tool for the reverse engineering of mechanistic model descriptions from multivariate experimental time series.
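The penalized-least-squares core of the classical Whittaker smoother is simple enough to sketch; the paper's information-theoretic reformulation and adaptive segmentation are its own contributions and are not reproduced here. A minimal dense-matrix sketch in Python (the function name and the penalty value `lam` are illustrative):

```python
import numpy as np

def whittaker_smooth(y, lam=100.0):
    """Whittaker smoother: minimize ||y - z||^2 + lam * ||D2 z||^2,
    where D2 is the second-difference operator. The minimizer solves
    the linear system (I + lam * D2'D2) z = y."""
    n = len(y)
    # Second-difference operator as an (n-2, n) matrix.
    D = np.diff(np.eye(n), n=2, axis=0)
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
```

Larger `lam` yields a smoother (stiffer) solution; production code would use sparse matrices rather than the dense system shown here.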

    Deep Eyedentification: Biometric Identification using Micro-Movements of the Eye

    We study involuntary micro-movements of the eye for biometric identification. While prior studies extract lower-frequency macro-movements from the output of video-based eye-tracking systems and engineer explicit features of these macro-movements, we develop a deep convolutional architecture that processes the raw eye-tracking signal. Compared to prior work, the network attains an error rate lower by one order of magnitude and is faster by two orders of magnitude: it identifies users accurately within seconds.
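The paper's actual architecture and training are not reproduced here; purely as an illustration of applying a convolutional layer directly to a raw, two-channel eye-tracking signal, a minimal numpy sketch (all names, shapes, and filter sizes are hypothetical):

```python
import numpy as np

def conv1d(signal, kernels, stride=1):
    """Valid 1-D convolution of a (channels, time) signal with a bank of
    (out_channels, in_channels, width) kernels, followed by ReLU."""
    c_in, t = signal.shape
    c_out, _, w = kernels.shape
    t_out = (t - w) // stride + 1
    out = np.zeros((c_out, t_out))
    for o in range(c_out):
        for i in range(t_out):
            window = signal[:, i * stride:i * stride + w]
            out[o, i] = np.sum(window * kernels[o])
    return np.maximum(out, 0.0)  # ReLU nonlinearity

# Hypothetical example: 2-channel gaze signal, 8 filters of width 9.
rng = np.random.default_rng(0)
gaze = rng.normal(size=(2, 1000))
kernels = rng.normal(size=(8, 2, 9)) * 0.1
features = conv1d(gaze, kernels)
```

A real system would stack several such layers (learned via backpropagation in a deep-learning framework) before a classification head.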

    Cloud computing in industrial SMEs: Identification of the barriers to its adoption and effects of its application

    ABSTRACT: Cloud computing is a new technological paradigm that may revolutionize how organizations use IT by facilitating delivery of all technology as a service. In the literature, the Cloud is treated mainly through a technological approach focused on the concept definition, service models, infrastructures for its development and security problems. However, there is an important lack of works that analyze the adoption of this paradigm in SMEs and its results, with a gap between the technological development and its adoption by organizations. This paper uses a qualitative methodology (group meetings with managers) and a quantitative one (a survey) to identify which factors act as barriers to Cloud adoption and which positive effects its application generates in 94 industrial SMEs. The conclusion is that the main barriers are of a cultural type and that the positive effects go well beyond reducing costs.

    An industry experiment on the effects of test-driven development on external quality and productivity

    Existing empirical studies on test-driven development (TDD) report different conclusions about its effects on quality and productivity. Very few of those studies are experiments conducted with software professionals in industry. We aim to analyse the effects of TDD on the external quality of the work done and the productivity of developers in an industrial setting. We conducted an experiment with 24 professionals from three different sites of a software organization. We chose a repeated-measures design, and asked subjects to implement TDD and incremental test-last development (ITLD) in two simple tasks and a realistic application close to real-life complexity. To analyse our findings, we applied a repeated-measures general linear model procedure and a linear mixed effects procedure. We did not observe a statistical difference between the quality of the work done by subjects in both treatments. We observed that the subjects are more productive when they implement TDD on a simple task compared to ITLD, but the productivity drops significantly when applying TDD to a complex brownfield task. So, the task complexity significantly obscured the effect of TDD. Further evidence is necessary to conclude whether TDD is better or worse than ITLD in terms of external quality and productivity in an industrial setting. We found that experimental factors such as selection of tasks could dominate the findings in TDD studies. This research has been partly funded by Spanish Ministry of Science and Innovation project TIN2011-23216, the Distinguished Professor Program of Tekes, and the Academy of Finland (Grant Decision No. 260871).

    A review of rapid serial visual presentation-based brain-computer interfaces

    Rapid serial visual presentation (RSVP) combined with the detection of event-related brain responses facilitates the selection of relevant information contained in a stream of images presented rapidly to a human. Event-related potentials (ERPs) measured non-invasively with electroencephalography (EEG) can be associated with infrequent targets amongst a stream of images. Human-machine symbiosis may be augmented by enabling human interaction with a computer without overt movement, and/or by enabling optimization of image/information sorting processes involving humans. Features of the human visual system impact the success of the RSVP paradigm, but pre-attentive processing supports the identification of target information after its presentation by assessing the co-occurrence of time-locked EEG potentials. This paper presents a comprehensive review and evaluation of the limited but significant literature on RSVP-based brain-computer interfaces (BCIs). Applications that use RSVP-based BCIs are categorized based on display mode and protocol design, whilst a range of factors influencing ERP evocation and detection are analyzed. Guidelines for using RSVP-based BCI paradigms are recommended, with a view to further standardizing methods and enhancing the inter-relatability of experimental designs to support future research and the use of RSVP-based BCIs in practice.
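Detection of time-locked potentials such as those exploited by RSVP-based BCIs conventionally starts from epoch averaging: extracting fixed windows around stimulus onsets and averaging them so that the time-locked ERP emerges while unrelated activity cancels. A minimal numpy sketch (the function name and window lengths are illustrative, not from the review):

```python
import numpy as np

def average_erp(eeg, onsets, pre=50, post=200):
    """Average time-locked epochs from a (channels, samples) EEG array.
    Each epoch spans [onset - pre, onset + post) samples; epochs that
    would run off either end of the recording are skipped."""
    epochs = [eeg[:, t - pre:t + post] for t in onsets
              if t - pre >= 0 and t + post <= eeg.shape[1]]
    return np.mean(epochs, axis=0)  # shape: (channels, pre + post)
```

In practice, single-trial ERP detection (as needed for online BCIs) replaces plain averaging with a trained classifier over such epochs, but the epoching step is the same.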

    Challenges for automatically extracting molecular interactions from full-text articles

    Background: The increasing availability of full-text biomedical articles will allow more biomedical knowledge to be extracted automatically with greater reliability. However, most Information Retrieval (IR) and Information Extraction (IE) tools currently process only abstracts. The lack of corpora has limited the development of tools that are capable of exploiting the knowledge in full-text articles. As a result, there has been little investigation into the advantages of full-text document structure, and the challenges developers will face in processing full-text articles.
    Results: We manually annotated passages from full-text articles that describe interactions summarised in a Molecular Interaction Map (MIM). Our corpus tracks the process of identifying facts to form the MIM summaries and captures any factual dependencies that must be resolved to extract the fact completely. For example, a fact in the results section may require a synonym defined in the introduction. The passages are also annotated with negated and coreference expressions that must be resolved. We describe the guidelines for identifying relevant passages and possible dependencies. The corpus includes 2162 sentences from 78 full-text articles. Our corpus analysis demonstrates the necessity of full-text processing; identifies the article sections where interactions are most commonly stated; and quantifies the proportion of interaction statements requiring coherent dependencies. Further, it allows us to report on the relative importance of identifying synonyms and resolving negated expressions. We also experiment with an oracle sentence retrieval system using the corpus as a gold-standard evaluation set.
    Conclusion: We introduce the MIM corpus, a unique resource that maps interaction facts in a MIM to annotated passages within full-text articles. It is an invaluable case study providing guidance to developers of biomedical IR and IE systems, and can be used as a gold-standard evaluation set for full-text IR tasks.