Checking Dynamic Consistency of Conditional Hyper Temporal Networks via Mean Payoff Games (Hardness and (pseudo) Singly-Exponential Time Algorithm)
In this work we introduce the Conditional Hyper Temporal Network (CHyTN)
model, a natural extension and generalization of both the CSTN and the HTN
model. Our contribution is as follows. We show that deciding whether a given
CSTN or CHyTN is dynamically consistent is coNP-hard. We then prove that
deciding whether a given CHyTN is dynamically consistent is PSPACE-hard,
provided that the input instances may include both multi-head and multi-tail
hyperarcs. In light of this, we continue our study by focusing on CHyTNs that
allow only multi-head or only multi-tail hyperarcs, and we offer the first
deterministic (pseudo) singly-exponential time algorithm for the problem of
checking the dynamic consistency of such CHyTNs, which also produces a dynamic
execution strategy whenever the input CHyTN is dynamically consistent. Since
CSTNs are a special case of CHyTNs, this yields as a byproduct the first
sound-and-complete (pseudo) singly-exponential time algorithm for checking
dynamic consistency in CSTNs. The proposed algorithm is based on a novel
connection between CSTNs/CHyTNs and Mean Payoff Games (MPGs); the presentation
of this connection is mediated by the HTN model. To analyze the algorithm, we
introduce a refined notion of dynamic consistency, named ε-dynamic consistency,
and present a sharp lower-bounding analysis on the critical value of the
reaction time ε at which a CSTN/CHyTN transits from being, to not being,
dynamically consistent. The proof technique introduced in this analysis of ε
applies more generally when dealing with linear difference constraints that
include strict inequalities.

Comment: arXiv admin note: text overlap with arXiv:1505.0082
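As a hedged toy illustration of the mean-payoff machinery the abstract invokes (not the paper's DC-checking algorithm): the one-player special case of a mean-payoff game reduces to the minimum mean-cycle problem, which Karp's dynamic program solves exactly. The graph below is a made-up example.

```python
# Illustrative sketch only: Karp's minimum mean-cycle algorithm, the
# one-player core of mean-payoff games. In a full MPG a second player
# would pick edges adversarially at its vertices.

def min_mean_cycle(n, edges):
    """edges: list of (u, v, w) tuples; nodes are 0..n-1.
    Returns the minimum cycle mean, or inf if the graph is acyclic."""
    INF = float("inf")
    # d[k][v] = minimum weight of a walk with exactly k edges ending at v
    # (a virtual weight-0 source to every vertex gives d[0][v] = 0).
    d = [[INF] * n for _ in range(n + 1)]
    for v in range(n):
        d[0][v] = 0.0
    for k in range(1, n + 1):
        for u, v, w in edges:
            if d[k - 1][u] + w < d[k][v]:
                d[k][v] = d[k - 1][u] + w
    best = INF
    for v in range(n):
        if d[n][v] == INF:
            continue
        best = min(best, max((d[n][v] - d[k][v]) / (n - k)
                             for k in range(n) if d[k][v] < INF))
    return best

# Two cycles: 0->1->0 has mean (1+1)/2 = 1, 0->1->2->0 has mean 6/3 = 2.
edges = [(0, 1, 1), (1, 2, 2), (2, 0, 3), (1, 0, 1)]
print(min_mean_cycle(3, edges))  # 1.0
```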
Efficient CTL Verification via Horn Constraints Solving
The use of temporal logics has long been recognised as a fundamental approach
to the formal specification and verification of reactive systems. In this
paper, we take on the problem of automatically verifying a temporal property,
given by a CTL formula, for a given (possibly infinite-state) program. We
propose a method based on encoding the problem as a set of Horn constraints.
The method takes a program, modeled as a transition system, and a property
given by a CTL formula as input. It first generates a set of forall-exists
quantified Horn constraints and well-foundedness constraints by exploiting the
syntactic structure of the CTL formula. Then, the generated set of constraints
is solved by applying an off-the-shelf Horn constraint solving engine. The
program is said to satisfy the property if and only if the generated set of
constraints has a solution. We demonstrate the practical promise of the method
by applying it to a set of challenging examples. Although our method is based
on a generic Horn constraint solving engine, it is able to outperform
state-of-the-art methods specialised for CTL verification.

Comment: In Proceedings HCVS2016, arXiv:1607.0403
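As a hedged toy illustration of the Horn-clause idea (restricted to the AG fragment over a finite-state system, not the forall-exists encoding for infinite-state programs described above): the clauses Inv(s) <- Init(s) and Inv(s') <- Inv(s) /\ T(s, s') have a least model equal to the reachable states, so "AG p" holds iff the goal clause false <- Inv(s) /\ not p(s) never fires.

```python
# Toy illustration only: computing the least model of the Horn clauses
#   Inv(s)  <- Init(s)
#   Inv(s') <- Inv(s) /\ T(s, s')
# by saturation; for a finite system this is forward reachability, and
# "AG p" holds iff every derived Inv-fact satisfies p.

def check_AG(init, trans, p):
    """init: iterable of states; trans: dict state -> successor states;
    p: predicate on states. Returns True iff AG p holds from init."""
    reach = set(init)            # Inv-facts derived so far
    frontier = list(init)
    while frontier:              # saturate the transition clause
        s = frontier.pop()
        for t in trans.get(s, ()):
            if t not in reach:
                reach.add(t)
                frontier.append(t)
    return all(p(s) for s in reach)   # goal: false <- Inv(s) /\ not p(s)

trans = {0: [1], 1: [2], 2: [1]}
print(check_AG({0}, trans, lambda s: s < 3))   # True
print(check_AG({0}, trans, lambda s: s != 2))  # False
```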
Certainty Closure: Reliable Constraint Reasoning with Incomplete or Erroneous Data
Constraint Programming (CP) has proved an effective paradigm to model and
solve difficult combinatorial satisfaction and optimisation problems from
disparate domains. Many such problems arising from the commercial world are
permeated by data uncertainty. Existing CP approaches that accommodate
uncertainty are less suited to uncertainty arising due to incomplete and
erroneous data, because they do not build reliable models and solutions
guaranteed to address the user's genuine problem as she perceives it. Other
fields such as reliable computation offer combinations of models and associated
methods to handle these types of uncertain data, but lack an expressive
framework characterising the resolution methodology independently of the model.
We present a unifying framework that extends the CP formalism in both model
and solutions, to tackle ill-defined combinatorial problems with incomplete or
erroneous data. The certainty closure framework brings together modelling and
solving methodologies from different fields into the CP paradigm to provide
reliable and efficient approaches for uncertain constraint problems. We
demonstrate the applicability of the framework on a case study in network
diagnosis. We define resolution forms that give generic templates, and their
associated operational semantics, to derive practical solution methods for
reliable solutions.

Comment: Revised version
Flow Logic
Flow networks have attracted a lot of research in computer science. Indeed,
many questions in numerous application areas can be reduced to questions about
flow networks. Many of these applications would benefit from a framework in
which one can formally reason about properties of flow networks that go beyond
their maximal flow. We introduce Flow Logics: modal logics that treat flow
functions as explicit first-order objects and enable the specification of rich
properties of flow networks. The syntax of our logic BFL* (Branching Flow
Logic) is similar to the syntax of the temporal logic CTL*, except that atomic
assertions may be flow propositions, such as threshold comparisons ≥ γ or
≤ γ for a bound γ, which refer to the value of the flow in a vertex, and
that first-order quantification can be applied both to paths and to flow
functions. We present an exhaustive study of the theoretical and practical
aspects of BFL*, as well as extensions and fragments of it. Our extensions
include flow quantifications that range over non-integral flow functions or
over maximal flow functions, path quantification that ranges over paths along
which non-zero flow travels, past operators, and first-order quantification of
flow values. We focus on the model-checking problem and show that it is
PSPACE-complete, as it is for CTL*. Handling of flow quantifiers, however,
increases the complexity in terms of the network, even
for the LFL and BFL fragments, which are the flow counterparts of LTL and CTL.
We are still able to point to a useful fragment of BFL* for which the
model-checking problem can be solved in polynomial time. Finally, we introduce
and study the query-checking problem for BFL*, where under-specified BFL*
formulas are used for network exploration.
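As a minimal, hedged stand-in for the atomic layer of such a logic (a made-up network, not an example from the work above): a flow proposition comparing a flow value against a threshold γ can be decided by a max-flow computation, here Edmonds-Karp; BFL* then layers CTL*-style temporal operators and flow quantifiers on top of such checks.

```python
from collections import deque

# Illustrative sketch only: decide the proposition "the maximum s-t flow
# value is >= gamma" on a tiny capacity matrix via Edmonds-Karp.

def max_flow(cap, s, t):
    """cap: square matrix of edge capacities; returns the max s-t flow value."""
    n = len(cap)
    res = [row[:] for row in cap]  # residual capacities
    total = 0
    while True:
        # BFS for an augmenting path in the residual graph
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and res[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return total
        # bottleneck along the path, then augment
        b, v = float("inf"), t
        while v != s:
            b = min(b, res[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:
            res[parent[v]][v] -= b
            res[v][parent[v]] += b
            v = parent[v]
        total += b

cap = [[0, 3, 2, 0],   # source 0
       [0, 0, 0, 2],
       [0, 0, 0, 3],
       [0, 0, 0, 0]]   # sink 3
print(max_flow(cap, 0, 3) >= 4)  # True: the proposition "flow >= 4" holds
```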
Flux imbalance analysis and the sensitivity of cellular growth to changes in metabolite pools
Stoichiometric models of metabolism, such as flux balance analysis (FBA), are classically applied to predicting steady state rates - or fluxes - of metabolic reactions in genome-scale metabolic networks. Here we revisit the central assumption of FBA, i.e. that intracellular metabolites are at steady state, and show that deviations from flux balance (i.e. flux imbalances) are informative of some features of in vivo metabolite concentrations. Mathematically, the sensitivity of FBA to these flux imbalances is captured by a native feature of linear optimization, the dual problem, and its corresponding variables, known as shadow prices. First, using recently published data on chemostat growth of Saccharomyces cerevisiae under different nutrient limitations, we show that shadow prices anticorrelate with experimentally measured degrees of growth limitation of intracellular metabolites. We next hypothesize that metabolites which are limiting for growth (and thus have very negative shadow price) cannot vary dramatically in an uncontrolled way, and must respond rapidly to perturbations. Using a collection of published datasets monitoring the time-dependent metabolomic response of Escherichia coli to carbon and nitrogen perturbations, we test this hypothesis and find that metabolites with negative shadow price indeed show lower temporal variation following a perturbation than metabolites with zero shadow price. Finally, we illustrate the broader applicability of flux imbalance analysis to other constraint-based methods. In particular, we explore the biological significance of shadow prices in a constraint-based method for integrating gene expression data with a stoichiometric model. In this case, shadow prices point to metabolites that should rise or drop in concentration in order to increase consistency between flux predictions and gene expression data.
In general, these results suggest that the sensitivity of metabolic optima to violations of the steady state constraints carries biologically significant information on the processes that control intracellular metabolites in the cell.

Published version
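As a hedged toy illustration (made-up numbers, not a genome-scale model): a shadow price is the dual variable of an FBA-style linear program, i.e. the marginal change of the optimum per unit change of a constraint bound. Below, a two-reaction "growth" LP is solved by brute-force vertex enumeration, and the shadow price of a nutrient uptake bound is estimated by finite differences.

```python
from itertools import combinations

# Illustrative sketch only: tiny 2-variable LP solved by enumerating
# vertices (intersections of constraint pairs), then a finite-difference
# estimate of a shadow price. Sign conventions differ from the paper's
# minimization setup, where limiting metabolites get negative prices.

def solve_lp2(constraints, c):
    """Maximize c . v over v in R^2 subject to a . v <= b for each
    (a, b) in constraints. Vertex enumeration: fine for toy problems."""
    best, eps = None, 1e-9
    for (a1, b1), (a2, b2) in combinations(constraints, 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < eps:
            continue  # parallel constraints: no vertex
        x = (b1 * a2[1] - b2 * a1[1]) / det
        y = (a1[0] * b2 - a2[0] * b1) / det
        if all(a[0] * x + a[1] * y <= b + eps for a, b in constraints):
            val = c[0] * x + c[1] * y
            if best is None or val > best:
                best = val
    return best

def growth(a_limit):
    cons = [((1, 0), a_limit),           # nutrient A uptake: v1 <= a_limit
            ((0, 1), 3),                 # nutrient B uptake: v2 <= 3
            ((1, 2), 6),                 # shared resource:   v1 + 2 v2 <= 6
            ((-1, 0), 0), ((0, -1), 0)]  # irreversibility:   v1, v2 >= 0
    return solve_lp2(cons, (1, 1))       # "growth" objective v1 + v2

# Marginal growth gain per extra unit of nutrient A: nonzero, so A is
# growth-limiting; the same estimate for nutrient B would be 0.
d = 1e-6
print((growth(2 + d) - growth(2)) / d)  # ~0.5
```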
Quantifying Differential Privacy under Temporal Correlations
Differential Privacy (DP) has received increased attention as a rigorous
privacy framework. Existing studies employ traditional DP mechanisms (e.g., the
Laplace mechanism) as primitives, which assume that the data are independent,
or that adversaries do not have knowledge of the data correlations. However,
continuously generated data in the real world tend to be temporally correlated,
and such correlations can be acquired by adversaries. In this paper, we
investigate the potential privacy loss of a traditional DP mechanism under
temporal correlations in the context of continuous data release. First, we
model the temporal correlations using a Markov model and analyze the privacy
leakage of a DP mechanism when adversaries have knowledge of such temporal
correlations. Our analysis reveals that the privacy leakage of a DP mechanism
may accumulate and increase over time. We call it temporal privacy leakage.
Second, to measure such privacy leakage, we design an efficient algorithm for
calculating it in polynomial time. Although the temporal privacy leakage may
increase over time, we also show that its supremum may exist in some cases.
Third, to bound the privacy loss, we propose mechanisms that convert any
existing DP mechanism into one against temporal privacy leakage. Experiments
with synthetic data confirm that our approach is efficient and effective.

Comment: appears at ICDE 201
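A minimal sketch of the baseline setting, not the paper's leakage-calculation algorithm: a Laplace mechanism releases each count in a stream, and naive sequential composition sums the per-release epsilons. The paper's point is that against an adversary who knows the temporal (Markov) correlations, the effective leakage can exceed what this independent-data accounting suggests.

```python
import math
import random

# Illustrative sketch only: Laplace mechanism for a stream of counts
# with sequential-composition budget accounting (independence assumed).

def laplace_mechanism(true_value, sensitivity, eps, rng=random):
    """Release true_value + Lap(sensitivity / eps), sampled by inverse CDF."""
    u = max(rng.random() - 0.5, -0.5 + 1e-12)  # uniform in (-0.5, 0.5)
    scale = sensitivity / eps
    return true_value - scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def release_stream(counts, eps_per_release, sensitivity=1.0):
    """Perturb each count and track the composed privacy budget."""
    total_eps, noisy = 0.0, []
    for c in counts:
        noisy.append(laplace_mechanism(c, sensitivity, eps_per_release))
        total_eps += eps_per_release  # sequential composition
    return noisy, total_eps

noisy, budget = release_stream([10, 12, 11, 13], eps_per_release=0.5)
print(budget)  # 2.0; with temporally correlated data this can understate leakage
```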