295,525 research outputs found

    Towards a Maude tool for model checking temporal graph properties

    We present our prototypical tool for the verification of graph transformation systems. The major novelty of our tool is that it provides a model checker for temporal graph properties based on counterpart semantics for quantified μ-calculi. Our tool can be considered an instantiation of our approach to counterpart semantics, which allows for a neat handling of creation, deletion and merging in systems with dynamic structure. Our implementation is based on the object-based machinery of Maude, which provides the basics for dealing with attributed graphs. Graph transformation systems are specified with term rewrite rules. The model checker evaluates logical formulae of the second-order modal μ-calculus in the automatically generated CounterpartModel (a sort of unfolded graph transition system) of the graph transformation system under study. The result of evaluating a formula is a set of assignments for each state, associating node variables with actual nodes.
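
    As a rough illustration of the counterpart-semantics idea, the following toy Python sketch (not the Maude implementation; the tiny model and all names are made up) shows how a state can carry a set of nodes, how a transition can carry a partial counterpart map, and how a "possibly" modality then yields, per state, assignments of a node variable to actual nodes:

```python
# Toy sketch of a counterpart-model check: states carry node sets, transitions
# carry counterpart maps, and a "possibly" modality returns the nodes of a
# state whose counterpart in some successor satisfies a given predicate.

# hypothetical model: in the step s0 -> s1, node 'b' is deleted and 'c' is created
states = {"s0": {"a", "b"}, "s1": {"a1", "c"}}
transitions = [("s0", "s1", {"a": "a1"})]   # counterpart map: a |-> a1

def diamond(pred, state):
    """Nodes x of `state` such that some successor contains a counterpart
    of x satisfying `pred` there (deleted nodes simply drop out)."""
    result = set()
    for src, tgt, cmap in transitions:
        if src != state:
            continue
        for node in states[src]:
            counterpart = cmap.get(node)      # None if the node was deleted
            if counterpart is not None and pred(counterpart, tgt):
                result.add(node)
    return result

# assignments for "the node still exists in some next state": only 'a' survives
print(diamond(lambda n, s: n in states[s], "s0"))   # {'a'}
```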

    Symbolic Attributed Graphs for Attributed Graph Transformation

    In this paper we present a new approach to dealing with attributed graphs and attributed graph transformation. The approach is based on what we call symbolic graphs: graphs labelled with variables, together with a formula that constrains the possible values that may be assigned to these variables. In particular, we compare this new approach in detail with the standard approach to attributed graph transformation.
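
    The following minimal Python sketch illustrates the general idea of a graph whose labels are variables constrained by a formula; the encoding and names are ours for illustration, not the paper's formalism:

```python
# Illustrative sketch of a symbolic graph: node attributes are variables, and
# a constraint formula restricts the values that may be assigned to them.

from dataclasses import dataclass
from typing import Callable, Dict, Set, Tuple

@dataclass
class SymbolicGraph:
    nodes: Dict[str, str]                          # node -> attribute variable
    edges: Set[Tuple[str, str]]
    constraint: Callable[[Dict[str, int]], bool]   # formula over the variables

    def instantiate(self, assignment):
        """Return a concrete attributed graph if the assignment satisfies
        the constraint, otherwise None."""
        if not self.constraint(assignment):
            return None
        return {n: assignment[v] for n, v in self.nodes.items()}, self.edges

# a graph whose two node attributes x and y must satisfy x < y
g = SymbolicGraph(nodes={"n1": "x", "n2": "y"},
                  edges={("n1", "n2")},
                  constraint=lambda a: a["x"] < a["y"])

print(g.instantiate({"x": 1, "y": 5}))   # a concrete attributed graph
print(g.instantiate({"x": 7, "y": 2}))   # None: the constraint is violated
```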

    Graph-Based Search Procedure for Vector Autoregressive Models

    Vector Autoregressions (VARs) are a class of time series models commonly used in econometrics to study the dynamic effect of exogenous shocks on the economy. While the estimation of a VAR is straightforward, there is a problem of finding the transformation of the estimated model consistent with the causal relations among the contemporaneous variables. This problem, a version of what econometricians call "the problem of identification," is addressed in this paper using a semi-automated search procedure. The unobserved causal relations of the structural form, to be identified, are represented by a directed graph. Discovery algorithms are developed to infer features of the causal graph from tests on vanishing partial correlations among the VAR residuals. Such tests cannot be based on the usual tests of conditional independence, because of sampling problems due to the time series nature of the data. This paper proposes consistent tests on vanishing partial correlations based on the asymptotic distribution of the estimated VAR residuals. Two types of search algorithm are considered: the first restricts the analysis to direct causation among the contemporaneous variables; the second allows for cycles (feedback loops) and common shocks among contemporaneous variables. Recovering the causal structure allows a reliable transformation of the estimated vector autoregressive model, which is very useful for macroeconomic empirical investigations such as comparing the effects of different shocks (real vs. nominal) on the economy and finding a measure of the monetary policy shock.
    Keywords: VARs, Problem of Identification, Causal Graphs, Structural Shocks
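
    A minimal Python sketch of the quantity such tests are built on (this is not the paper's search algorithm, and the data here are simulated): fit a VAR, then compute partial correlations among the contemporaneous residuals from the inverse of their covariance matrix; near-zero entries are the candidates for the "vanishing partial correlation" constraints that drive the causal search.

```python
# Fit a toy VAR(1) and compute residual partial correlations from the
# precision (inverse covariance) matrix of the residuals.

import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
data = rng.standard_normal((500, 3))          # toy 3-variable series

resid = VAR(data).fit(maxlags=1).resid        # residuals of the estimated VAR(1)
precision = np.linalg.inv(np.cov(resid, rowvar=False))

d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)    # off-diagonal: partial correlations
np.fill_diagonal(partial_corr, 1.0)
print(np.round(partial_corr, 3))
```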

    Exponential Time Complexity of Weighted Counting of Independent Sets

    We consider weighted counting of independent sets using a rational weight x: given a graph with n vertices, count its independent sets such that each set of size k contributes x^k. This is equivalent to computing the partition function of the lattice gas with hard-core self-repulsion and hard-core pair interaction. We show the following conditional lower bounds: if counting the satisfying assignments of a 3-CNF formula in n variables (#3SAT) needs time 2^{\Omega(n)} (i.e., there is a c > 0 such that no algorithm can solve #3SAT in time 2^{cn}), then counting the independent sets of size n/3 of an n-vertex graph needs time 2^{\Omega(n)}, and weighted counting of independent sets needs time 2^{\Omega(n/log^3 n)} for all rational weights x \neq 0. We have two technical ingredients: the first is a reduction from 3SAT to independent sets that preserves the number of solutions and increases the instance size only by a constant factor; the second is a combination of vertex cloning and path addition. This graph transformation allows us to adapt a recent technique by Dell, Husfeldt, and Wahlen, which enables interpolation by a family of reductions, each of which increases the instance size only polylogarithmically.
    Comment: Introduction revised, differences between versions of counting independent sets stated more precisely, minor improvements. 14 pages.
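
    For concreteness, here is a brute-force Python sketch of the quantity whose hardness is studied above, the hard-core partition function Z(x) = sum over independent sets S of x^|S|; the paper proves lower bounds and does not propose this exponential-time enumeration.

```python
# Enumerate all vertex subsets of a small graph and sum x^|S| over the
# independent ones.

from fractions import Fraction
from itertools import combinations

def partition_function(n, edges, x):
    z = Fraction(0)
    for k in range(n + 1):
        for subset in combinations(range(n), k):
            s = set(subset)
            if all(u not in s or v not in s for u, v in edges):   # independent set?
                z += Fraction(x) ** k
    return z

# 4-cycle: Z(x) = 1 + 4x + 2x^2, so Z(1/2) = 7/2
print(partition_function(4, [(0, 1), (1, 2), (2, 3), (3, 0)], Fraction(1, 2)))
```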

    Belief Propagation and Loop Series on Planar Graphs

    We discuss a generic model of Bayesian inference with binary variables defined on the edges of a planar graph. The Loop Calculus approach of [1, 2] is used to evaluate the resulting series expansion for the partition function. We show that, for planar graphs, truncating the series at single-connected loops reduces, via a map reminiscent of the Fisher transformation [3], to evaluating the partition function of the dimer matching model on an auxiliary planar graph. Thus, the truncated series can easily be re-summed using the Pfaffian formula of Kasteleyn [4]. This allows us to identify a large class of computationally tractable planar models reducible to a dimer model via the Belief Propagation (gauge) transformation. The Pfaffian representation can also be extended to the full Loop Series, in which case the expansion becomes a sum of Pfaffian contributions, each associated with dimer matchings on an extension to a subgraph of the original graph. Algorithmic consequences of the Pfaffian representation, as well as relations to quantum and non-planar models, are discussed.
    Comment: Accepted for publication in Journal of Statistical Mechanics: Theory and Experiment.
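
    A hedged Python sketch of the Kasteleyn step referenced above: for a planar graph with a Kasteleyn orientation, the number of dimer coverings (perfect matchings) equals |Pf(A)| = sqrt(det A), where A is the signed skew-symmetric adjacency matrix. The toy example below is a 4-cycle, which has exactly two perfect matchings.

```python
# Count the perfect matchings of a Kasteleyn-oriented 4-cycle via the Pfaffian.

import numpy as np

# Kasteleyn orientation of the 4-cycle: 0->1, 1->2, 2->3, 0->3
# (the single inner face has an odd number of clockwise-oriented edges)
oriented_edges = [(0, 1), (1, 2), (2, 3), (0, 3)]

A = np.zeros((4, 4))
for u, v in oriented_edges:
    A[u, v], A[v, u] = 1.0, -1.0            # signed skew-symmetric adjacency

num_matchings = np.sqrt(np.linalg.det(A))   # |Pf(A)|, since det(A) = Pf(A)^2
print(round(num_matchings))                 # 2
```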

    Dimension Reduction via Colour Refinement

    Colour refinement is a basic algorithmic routine for graph isomorphism testing, appearing as a subroutine in almost all practical isomorphism solvers. It partitions the vertices of a graph into "colour classes" in such a way that all vertices in the same colour class have the same number of neighbours in every colour class. Tinhofer (Disc. App. Math., 1991), Ramana, Scheinerman, and Ullman (Disc. Math., 1994), and Godsil (Lin. Alg. and its App., 1997) established a tight correspondence between colour refinement and fractional isomorphisms of graphs, which are solutions to the LP relaxation of a natural ILP formulation of graph isomorphism. We introduce a version of colour refinement for matrices and extend existing quasilinear algorithms for computing the colour classes. Then we generalise the correspondence between colour refinement and fractional automorphisms and develop a theory of fractional automorphisms and isomorphisms of matrices. We apply our results to reduce the dimensions of systems of linear equations and linear programs. Specifically, we show that any given LP L can efficiently be transformed into a (potentially) smaller LP L' whose number of variables and constraints is the number of colour classes of the colour refinement algorithm, applied to a matrix associated with the LP. The transformation is such that we can easily (by a linear mapping) map both feasible and optimal solutions back and forth between the two LPs. We demonstrate empirically that colour refinement can indeed greatly reduce the cost of solving linear programs.
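
    A minimal Python sketch of the colour refinement routine described above: colour classes are split iteratively until every vertex's colour determines the multiset of its neighbours' colours. The matrix and LP generalisation developed in the paper is not shown here.

```python
# Classic colour refinement (1-dimensional Weisfeiler-Leman) on an adjacency-list graph.

def colour_refinement(adj):
    """adj: dict vertex -> iterable of neighbours; returns the stable colouring."""
    colour = {v: 0 for v in adj}                      # start with one colour class
    while True:
        # new signature: old colour plus sorted multiset of neighbour colours
        signature = {v: (colour[v], tuple(sorted(colour[w] for w in adj[v])))
                     for v in adj}
        new_ids = {sig: i for i, sig in enumerate(sorted(set(signature.values())))}
        new_colour = {v: new_ids[signature[v]] for v in adj}
        if new_colour == colour:                      # partition is stable
            return colour
        colour = new_colour

# path on 4 vertices: endpoints share one colour, middle vertices another
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(colour_refinement(path))   # {0: 0, 1: 1, 2: 1, 3: 0}
```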

    More on Graph Rewriting With Contextual Refinement

    In GrGen, a graph rewrite generator tool, rules have the outstanding feature that variables in their pattern and replacement graphs may be refined with meta-rules based on contextual hyperedge replacement grammars. A refined rule may delete, copy, and transform subgraphs of unbounded size and of variable shape. In this paper, we show that rules with contextual refinement can be transformed to standard graph rewrite rules that perform the refinement incrementally and are applied according to a strategy called residual rewriting. With this transformation, it is possible to state precisely whether refinements can be determined in finitely many steps or not, and whether refinements are unique for every form of refined pattern or not.

    Graph Rewriting with Contextual Refinement

    In the standard theory of graph transformation, a rule modifies only subgraphs of constant size and fixed shape. The rules supported by the graph-rewriting tool GrGen are far more expressive: they may modify subgraphs of unbounded size and variable shape. Therefore properties like termination and confluence cannot be analyzed as in the standard case. In order to lift such results, we formalize the outstanding feature of GrGen rules by using plain rules on two levels: schemata are rules with variables; they are refined with meta-rules, which are based on contextual hyperedge replacement, before they are used for rewriting. We show that every rule based on single pushouts, on neighborhood-controlled embedding, or on variable substitution can be modeled by a schema with appropriate meta-rules. It turns out that the question whether schemata may have overlapping refinements is not decidable.
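
    As a toy illustration of the two-level idea only (not GrGen's actual machinery; the rule format and names below are made up), a schema can be seen as a rule whose pattern contains a variable, and a meta-rule expands that variable into one of several concrete subpatterns before the rule is applied:

```python
# Expand a schema with a variable "X" into plain rules, one per expansion.

# hypothetical schema: match "root" together with whatever X expands to, keep only "root"
schema = {"pattern": ["root", "X"], "replace": ["root"]}

# hypothetical meta-rule: possible expansions of X (subpatterns of variable size)
expansions_of_X = [["leaf1"], ["leaf1", "leaf2"]]

def refine(schema, expansions):
    """Turn the schema into plain rules, one per expansion of the variable X."""
    rules = []
    for expansion in expansions:
        pattern = [item for item in schema["pattern"] if item != "X"] + expansion
        rules.append({"pattern": pattern, "replace": schema["replace"]})
    return rules

for rule in refine(schema, expansions_of_X):
    print(rule)
```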