Testing For Nonlinearity Using Redundancies: Quantitative and Qualitative Aspects
A method for testing nonlinearity in time series is described, based on
information-theoretic functionals (redundancies), whose linear and nonlinear
forms allow either qualitative or, after incorporating the surrogate data
technique, quantitative evaluation of the dynamical properties of the data
under scrutiny.
An interplay of quantitative and qualitative testing on both the linear and
nonlinear levels is analyzed and robustness of this combined approach against
spurious nonlinearity detection is demonstrated. Evaluation of redundancies and
redundancy-based statistics as functions of time lag and embedding dimension
can further enhance insight into the dynamics of a system under study.
Comment: 32 pages + 1 table in separate postscript files, 12 figures in 12
encapsulated postscript files, all in a uuencoded, compressed tar file. Also
available by anonymous ftp to santafe.edu, in directory pub/Users/mp/qq. To be
published in Physica D. Contact: [email protected]
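The surrogate data technique mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' redundancy-based statistics; it is a generic phase-randomized surrogate test with a simple time-asymmetry statistic, and all names, the logistic-map test series, and the surrogate count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def phase_randomized_surrogate(x, rng):
    """Surrogate with the same power spectrum (hence the same linear
    autocorrelations) as x, but with randomized Fourier phases."""
    n = len(x)
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spectrum))
    phases[0] = 0.0                      # keep the mean (DC component)
    if n % 2 == 0:
        phases[-1] = 0.0                 # keep the Nyquist component real
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n)

def time_asymmetry(x, lag=1):
    """Third-order time-reversal asymmetry: zero in expectation for a
    stationary linear Gaussian process, hence a cheap nonlinearity probe."""
    return np.mean((x[lag:] - x[:-lag]) ** 3)

# Test series with known nonlinear dynamics: the chaotic logistic map.
x = np.empty(1024)
x[0] = 0.4
for t in range(1, len(x)):
    x[t] = 3.9 * x[t - 1] * (1.0 - x[t - 1])

stat = abs(time_asymmetry(x))
surrogate_stats = [abs(time_asymmetry(phase_randomized_surrogate(x, rng)))
                   for _ in range(99)]
# Rank-based p-value: fraction of surrogates at least as asymmetric.
p = (1 + sum(s >= stat for s in surrogate_stats)) / (1 + len(surrogate_stats))
print(f"statistic={stat:.4f}, p={p:.3f}")  # a small p rejects linearity
```

The surrogates preserve the linear (spectral) structure of the data, so a statistic that stands out against them points to nonlinearity rather than to linear correlations.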
Efficient parameter search for qualitative models of regulatory networks using symbolic model checking
Investigating the relation between the structure and behavior of complex
biological networks often involves posing the following two questions: Is a
hypothesized structure of a regulatory network consistent with the observed
behavior? And can a proposed structure generate a desired behavior? Answering
these questions presupposes that we are able to test the compatibility of
network structure and behavior. We cast these questions into a parameter search
problem for qualitative models of regulatory networks, in particular
piecewise-affine differential equation models. We develop a method based on
symbolic model checking that avoids enumerating all possible parametrizations,
and show that this method performs well on real biological problems, using the
IRMA synthetic network and benchmark experimental data sets. We test the
consistency between the IRMA network structure and the time-series data, and
search for parameter modifications that would improve the robustness of the
external control of the system behavior.
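The piecewise-affine model class used above can be illustrated with a toy sketch: a hypothetical two-gene mutual-repression network (not the IRMA network), with made-up parameter names kappa, gamma, and theta and a simple Euler integration:

```python
# Glass-type piecewise-affine (PWA) gene network, hypothetical example:
#   dx_a/dt = kappa_a * s^-(x_b, theta_b) - gamma_a * x_a
#   dx_b/dt = kappa_b * s^-(x_a, theta_a) - gamma_b * x_b
# where s^-(x, theta) is a step function modeling repression.

def s_minus(x, theta):
    """Step regulation: full production while the repressor is below
    its threshold, none above it."""
    return 1.0 if x < theta else 0.0

kappa = {"a": 2.0, "b": 2.0}   # production rates (illustrative values)
gamma = {"a": 1.0, "b": 1.0}   # degradation rates
theta = {"a": 1.0, "b": 1.0}   # repression thresholds

def simulate(x0, dt=0.01, steps=2000):
    """Forward-Euler integration of the PWA system."""
    xa, xb = x0
    for _ in range(steps):
        dxa = kappa["a"] * s_minus(xb, theta["b"]) - gamma["a"] * xa
        dxb = kappa["b"] * s_minus(xa, theta["a"]) - gamma["b"] * xb
        xa += dt * dxa
        xb += dt * dxb
    return xa, xb

# Bistability: the network settles into opposite states depending on
# which gene starts ahead -- a qualitative property determined by the
# ordering kappa/gamma > theta, not by exact parameter values.
print(simulate((1.5, 0.1)))  # gene a wins: roughly (2, 0)
print(simulate((0.1, 1.5)))  # gene b wins: roughly (0, 2)
```

Parameter search in such models amounts to asking which orderings of the threshold and focal-point parameters produce a desired qualitative behavior, which is what makes symbolic (rather than numerical) analysis feasible.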
XTSC-Bench: Quantitative Benchmarking for Explainers on Time Series Classification
Despite the growing body of work on explainable machine learning in time
series classification (TSC), it remains unclear how to evaluate different
explainability methods. Resorting to qualitative assessment and user studies
to evaluate explainers for TSC is difficult, since humans struggle to
interpret the underlying information contained in time series data.
Therefore, a systematic review and quantitative comparison of explanation
methods to confirm their correctness becomes crucial. While steps toward
standardized evaluation have been taken for tabular, image, and textual data,
benchmarking explainability methods on time series is challenging because a)
traditional metrics are not directly applicable, b) implementations and
adaptations of traditional metrics for time series vary across the literature,
and c) baseline implementations vary. This paper proposes XTSC-Bench, a
benchmarking tool providing standardized datasets, models, and metrics for
evaluating explanation methods on TSC. We analyze 3 perturbation-, 6 gradient-,
and 2 example-based explanation methods applied to TSC, showing that
improvements in the explainers' robustness and reliability are necessary,
especially for multivariate data.
Comment: Accepted at ICMLA 202
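The kind of quantitative metric such a benchmark standardizes can be sketched with a perturbation-based faithfulness check. The toy linear "model", the zeroing perturbation, and the top-k/bottom-k comparison below are illustrative assumptions, not XTSC-Bench's actual metrics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "classifier": a weighted sum of the series, where only time steps
# 40..59 carry weight. A faithful explainer should attribute there.
w = np.zeros(100)
w[40:60] = 1.0
model = lambda x: float(x @ w)

def perturbation_faithfulness(x, saliency, k=10):
    """Zero out the k most salient time steps and, separately, the k
    least salient ones; a faithful attribution degrades the model
    output far more in the first case."""
    top = np.argsort(saliency)[-k:]
    bottom = np.argsort(saliency)[:k]
    x_top, x_bottom = x.copy(), x.copy()
    x_top[top] = 0.0
    x_bottom[bottom] = 0.0
    base = model(x)
    return abs(base - model(x_top)), abs(base - model(x_bottom))

x = rng.normal(1.0, 0.1, 100)
good_saliency = w                       # attribution matching the model
bad_saliency = rng.normal(size=100)     # random attribution
print(perturbation_faithfulness(x, good_saliency))  # (big drop, ~0)
print(perturbation_faithfulness(x, bad_saliency))
```

Point c) of the abstract shows up directly here: the result depends on the perturbation baseline chosen (zeroing vs. noise vs. mean imputation), which is one reason a shared benchmark is needed.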
Qualitative robustness of statistical functionals under strong mixing
A new concept of (asymptotic) qualitative robustness for plug-in estimators
based on identically distributed possibly dependent observations is introduced,
and it is shown that Hampel's theorem for general metrics still holds.
Since Hampel's theorem assumes the UGC property w.r.t. d, that is,
convergence in probability of the empirical probability measure to the true
marginal distribution w.r.t. d uniformly in the class of all admissible laws
on the sample path space, this property is shown for a large class of strongly
mixing laws for three different metrics d. For real-valued observations, the
UGC property is established for both the Kolmogorov φ-metric and the
Lévy ψ-metric, and for observations in a general locally compact and
second countable Hausdorff space the UGC property is established for a certain
metric generating the ψ-weak topology. The key is a new uniform weak LLN
for strongly mixing random variables. The latter is of independent interest and
relies on Rio's maximal inequality.
Comment: Published at http://dx.doi.org/10.3150/14-BEJ608 in the Bernoulli
(http://isi.cbs.nl/bernoulli/) by the International Statistical
Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm)
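The empirical-measure convergence underlying the UGC property can be illustrated numerically, for a single law rather than uniformly over a class, using a strongly mixing Gaussian AR(1) sequence and the plain Kolmogorov metric; the AR coefficient and sample sizes are arbitrary choices:

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def ar1(n, a=0.5):
    """Gaussian AR(1): a standard example of a strongly mixing sequence,
    with stationary marginal N(0, 1/(1 - a^2))."""
    x = np.empty(n)
    x[0] = rng.normal(0.0, 1.0 / math.sqrt(1.0 - a * a))
    for t in range(1, n):
        x[t] = a * x[t - 1] + rng.normal()
    return x

def kolmogorov_dist(sample, a=0.5):
    """sup_t |F_n(t) - F(t)|: Kolmogorov distance between the empirical
    CDF and the true stationary marginal distribution."""
    s = np.sort(sample)
    sd = 1.0 / math.sqrt(1.0 - a * a)
    F = np.array([0.5 * (1.0 + math.erf(v / (sd * math.sqrt(2.0))))
                  for v in s])
    n = len(s)
    upper = np.arange(1, n + 1) / n   # F_n just right of each jump
    lower = np.arange(0, n) / n       # F_n just left of each jump
    return float(max(np.max(np.abs(upper - F)), np.max(np.abs(lower - F))))

for n in (100, 1000, 10000):
    print(n, round(kolmogorov_dist(ar1(n)), 4))  # typically shrinks with n
```

The dependence between observations slows but does not destroy the convergence; the paper's contribution is that this happens uniformly over a whole class of mixing laws, which a single-trajectory simulation like this cannot show.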
Structurally robust biological networks
Background:
The molecular circuitry of living organisms performs remarkably robust regulatory tasks, despite the often intrinsic variability of its components. A large body of research has in fact highlighted that robustness is often a structural property of biological systems. However, there are few systematic methods to mathematically model and describe structural robustness. With a few exceptions, numerical studies are often the preferred approach to this type of investigation.
Results:
In this paper, we propose a framework to analyze robust stability of equilibria in biological networks. We employ Lyapunov and invariant sets theory, focusing on the structure of ordinary differential equation models. Without resorting to extensive numerical simulations, often necessary to explore the behavior of a model in its parameter space, we provide rigorous proofs of robust stability of known bio-molecular networks. Our results are in line with existing literature.
Conclusions:
The impact of our results is twofold: on the one hand, we highlight that classical and simple control theory methods are extremely useful to characterize the behavior of biological networks analytically. On the other hand, we are able to demonstrate that some biological networks are robust thanks to their structure and some qualitative properties of the interactions, regardless of the specific values of their parameters.
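A minimal sketch of the kind of parameter-free stability argument referred to above, using a hypothetical two-species negative-feedback motif (not one of the paper's networks), and confirming it numerically on random parameter draws:

```python
import numpy as np

rng = np.random.default_rng(0)

# Linearization dx/dt = J x of a hypothetical two-species negative
# feedback motif, with sign pattern
#   J = [[-a, -c],
#        [ d, -b]],   a, b, c, d > 0.
# Structurally: trace(J) = -(a + b) < 0 and det(J) = a*b + c*d > 0,
# so J is Hurwitz for EVERY choice of positive parameters -- stability
# follows from the sign pattern alone, not from parameter magnitudes.
def random_jacobian(rng):
    a, b, c, d = rng.uniform(0.01, 100.0, 4)
    return np.array([[-a, -c], [d, -b]])

stable = all(np.linalg.eigvals(random_jacobian(rng)).real.max() < 0.0
             for _ in range(1000))
print("all 1000 random parametrizations stable:", stable)
```

The eigenvalue sweep only samples the parameter space; the trace/determinant argument in the comment is the structural proof that covers all of it, which is the style of reasoning the paper generalizes with Lyapunov and invariant-set theory.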
Integrating Quantitative Knowledge into a Qualitative Gene Regulatory Network
Despite recent improvements in molecular techniques, biological knowledge remains incomplete. Any theorizing about living systems is therefore necessarily based on the use of heterogeneous and partial information. Much current research has focused successfully on the qualitative behaviors of macromolecular networks. Nonetheless, such approaches are not capable of taking into account available quantitative information such as time-series protein concentration variations. The present work proposes a probabilistic modeling framework that integrates both kinds of information. Average case analysis methods are used in combination with Markov chains to link qualitative information about transcriptional regulations to quantitative information about protein concentrations. The approach is illustrated by modeling the carbon starvation response in Escherichia coli. It accurately predicts the quantitative time-series evolution of several protein concentrations using only knowledge of discrete gene interactions and a small number of quantitative observations on a single protein concentration. From this, the modeling technique also derives a ranking of interactions with respect to their importance during the experiment considered. Such a classification is confirmed by the literature. Therefore, our method is principally novel in that it allows (i) a hybrid model that integrates both a qualitative discrete model and quantities to be built, even using a small amount of quantitative information, (ii) new quantitative predictions to be derived, (iii) the robustness and relevance of interactions with respect to phenotypic criteria to be precisely quantified, and (iv) the key features of the model to be extracted that can be used as guidance for designing future experiments.
Interpretable time series neural representation for classification purposes
Deep learning has made significant advances in creating efficient
representations of time series data by automatically identifying complex
patterns. However, these approaches lack interpretability, as the time series
is transformed into a latent vector that is not easily interpretable. On the
other hand, Symbolic Aggregate approximation (SAX) methods allow the creation
of symbolic representations that can be interpreted but do not capture complex
patterns effectively. In this work, we propose a set of requirements for a
neural representation of univariate time series to be interpretable. We propose
a new unsupervised neural architecture that meets these requirements. The
proposed model produces consistent, discrete, interpretable, and visualizable
representations. The model is learned independently of any downstream tasks in
an unsupervised setting to ensure robustness. As a demonstration of the
effectiveness of the proposed model, we propose experiments on classification
tasks using UCR archive datasets. The obtained results are extensively compared
to other interpretable models and state-of-the-art neural representation
learning models. The experiments show that the proposed model yields, on
average, better results than other interpretable approaches on multiple
datasets. We also present qualitative experiments to assess the
interpretability of the approach.
Comment: International Conference on Data Science and Advanced Analytics
(DSAA) 202
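Since the abstract contrasts the proposed neural representation with SAX, a minimal SAX sketch may clarify the baseline; the segment count, alphabet size, and the sine-wave example are arbitrary illustrative choices:

```python
import numpy as np

def sax(x, n_segments=8, alphabet="abcd"):
    """Minimal Symbolic Aggregate approXimation: z-normalize, reduce via
    Piecewise Aggregate Approximation (PAA), then discretize each segment
    mean against N(0,1) breakpoints (hardcoded here for 4 symbols)."""
    x = (x - x.mean()) / x.std()
    segments = np.array_split(x, n_segments)
    paa = np.array([seg.mean() for seg in segments])
    # Quartiles of N(0,1): four equiprobable bins for a 4-letter alphabet.
    breakpoints = np.array([-0.6745, 0.0, 0.6745])
    return "".join(alphabet[i] for i in np.searchsorted(breakpoints, paa))

t = np.linspace(0, 2 * np.pi, 64)
print(sax(np.sin(t)))  # rising-then-falling symbol pattern over one period
```

Each symbol summarizes a segment mean, which is what makes the representation readable but also why, as the abstract notes, SAX cannot capture complex within-segment patterns.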