
    Constrained set-up of the tGAP structure for progressive vector data transfer

    A promising approach to transmitting a vector map from a server to a mobile client is to send a coarse representation first and then refine it incrementally. We consider the problem of defining a sequence of such increments for areas of different land-cover classes in a planar partition. In order to transmit well-generalised datasets, we propose a two-stage method: first, we create a generalised representation from a detailed dataset, using an optimisation approach that satisfies certain cartographic constraints; second, we define a sequence of basic merge and simplification operations that gradually transforms the most detailed dataset into the generalised one. The resulting sequence of gradual transformations is stored without geometric redundancy in a structure that builds on the previously developed tGAP (topological Generalised Area Partitioning) structure. This structure and the algorithm for intermediate levels of detail (LoD) have been implemented in an object-relational database and tested on land-cover data from the official German topographic dataset ATKIS at scale 1:50 000, with a target scale of 1:250 000. The results allow us to conclude that the data at the lowest and intermediate LoDs are well generalised. With specialised heuristics, the optimisation method copes with large datasets, and the tGAP structure allows users to efficiently query and retrieve a dataset at a specified LoD. Data are sent progressively from the server to the client: a coarse representation is sent first and refined until the requested LoD is reached.
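    The retrieval-by-LoD idea can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `Face` record, level numbering (0 = most detailed), and function names are all hypothetical stand-ins for the tGAP face hierarchy, in which each face carries a validity range of detail levels.

```python
from dataclasses import dataclass

@dataclass
class Face:
    # Hypothetical flat record of a tGAP-style face hierarchy: each face
    # is valid for a half-open range of detail levels (0 = most detailed).
    face_id: int
    land_cover: str
    lod_min: int   # first level at which this face exists
    lod_max: int   # level at which it has been merged into its parent

def faces_at_lod(faces, lod):
    """Retrieve the planar partition valid at one level of detail."""
    return [f for f in faces if f.lod_min <= lod < f.lod_max]

def progressive_stream(faces, coarse_lod, target_lod):
    """Send the coarse map first, then per-level increments: the child
    faces that replace a merged parent as detail increases."""
    yield "initial", faces_at_lod(faces, coarse_lod)
    for lod in range(coarse_lod - 1, target_lod - 1, -1):
        # faces that become visible when moving one level towards detail
        yield "refine", [f for f in faces if f.lod_max == lod + 1]
```

    With three detailed faces, one intermediate merge, and one top-level face, the stream yields the single coarse face first and then two refinement increments, mirroring the coarse-then-refine transfer described above.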

    R-local Delaunay inhibition model

    Consider the local specification system of a Gibbs point process with an inhibition pairwise interaction acting on a Delaunay subgraph, specifically one that excludes the edges of Delaunay triangles whose circumscribed circle has radius greater than some fixed positive real value R. Even though we believe that at least one stationary Gibbs state is associated to such a system, we do not yet know how to prove it, mainly because of an uncontrolled "negative" contribution in the expression for the local energy needed to insert any number of points into a large enough empty region of space. We solve this by introducing a subgraph, called the R-local Delaunay graph, which is a slight but tailored modification of the previous one. This kind of model does not inherit the local stability property, but it satisfies a new extension of it called R-local stability. This weakened property, combined with the locality property, provides the existence of a Gibbs state. Comment: submitted to Journal of Statistical Physics, 27 pages
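    For context, local stability in its usual form bounds the energy cost of inserting a single point. The following is the standard definition in common notation, a sketch for orientation rather than the paper's exact statement of its R-local variant:

```latex
% Local stability: inserting a point x into any configuration \omega
% costs at most a bounded amount of negative energy,
\[
  h(x \mid \omega) \;=\; H(\omega \cup \{x\}) - H(\omega) \;\ge\; -c
  \qquad \text{for some } c \ge 0,
\]
% uniformly over x and \omega. The abstract's R-local stability weakens
% this uniform bound while remaining strong enough, together with
% locality, to yield existence of a Gibbs state.
```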

    Partially ordered models

    We provide a formal definition and study the basic properties of partially ordered chains (POC). These systems were proposed to model textures in image processing and to represent independence relations between random variables in statistics (in the latter case they are known as Bayesian networks). Our chains are a generalization of probabilistic cellular automata (PCA), and their theory has features intermediate between that of discrete-time processes and the theory of statistical mechanical lattice fields. Their proper definition is based on the notion of a partially ordered specification (POS), in close analogy to the theory of Gibbs measures. This paper contains two types of results. First, we present the basic elements of the general theory of POCs: basic geometrical issues, definition in terms of conditional probability kernels, extremal decomposition, extremality and triviality, reconstruction starting from single-site kernels, and relations between POCs and Gibbs fields. Second, we prove three uniqueness criteria that correspond to the criteria known as bounded uniformity, Dobrushin and disagreement percolation in the theory of Gibbs measures. Comment: 54 pages, 11 figures, 6 simulations. Submitted to Journal of Stat. Phys.
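    For reference, the Dobrushin criterion mentioned here takes, in the classical Gibbs-measure setting, the familiar single-site form below (standard notation; the paper adapts such criteria to the POC setting):

```latex
% Dobrushin uniqueness: the total influence of the other sites on each
% single-site conditional distribution is strictly below 1,
\[
  \sup_{i} \sum_{j \neq i} C_{ij} \;<\; 1,
  \qquad
  C_{ij} \;=\; \sup_{\substack{\omega,\,\omega' \\ \omega = \omega' \text{ off } j}}
  \big\| \gamma_i(\,\cdot \mid \omega) - \gamma_i(\,\cdot \mid \omega') \big\|_{\mathrm{TV}},
\]
% where \gamma_i is the single-site specification kernel at site i.
```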

    A Metric for Linear Temporal Logic

    We propose a measure and a metric on the sets of infinite traces generated by a set of atomic propositions. To compute these quantities, we first map properties to subsets of the real numbers and then take the Lebesgue measure of the resulting sets. We analyze how this measure is computed for Linear Temporal Logic (LTL) formulas. An implementation for computing the measure of bounded LTL properties is provided and explained. This implementation leverages SAT model counting and performs independence checks on subexpressions to compute the measure and metric compositionally.
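    The bounded case can be illustrated with a brute-force counter: the measure of a length-n property is the fraction of length-n traces that satisfy it, and one natural metric between two properties is the measure of their symmetric difference. This sketch enumerates traces directly in place of the SAT model counting used by the paper's implementation; the function names and the predicate-based encoding of formulas are illustrative assumptions.

```python
from itertools import combinations, product

def all_states(props):
    # Every subset of the atomic propositions is one possible state.
    return [frozenset(c) for r in range(len(props) + 1)
            for c in combinations(props, r)]

def measure(satisfies, props, horizon):
    """Fraction of length-`horizon` traces satisfying the property.
    `satisfies` is a predicate on a trace (a tuple of states)."""
    states = all_states(props)
    total = len(states) ** horizon
    count = sum(1 for tr in product(states, repeat=horizon) if satisfies(tr))
    return count / total

def metric(sat_a, sat_b, props, horizon):
    """Distance between two properties: the measure of the set of
    traces on which they disagree (their symmetric difference)."""
    return measure(lambda tr: sat_a(tr) != sat_b(tr), props, horizon)
```

    For a single proposition p and horizon 2, "eventually p" has measure 3/4, "always p" has measure 1/4, and their distance is 1/2: exactly the traces where p holds somewhere but not everywhere.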

    Relative entropy and variational properties of generalized Gibbsian measures

    We study the relative entropy density for generalized Gibbs measures. We first show its existence and obtain a familiar expression in terms of entropy and relative energy for a class of ``almost Gibbsian measures'' (almost sure continuity of conditional probabilities). For quasilocal measures, we obtain a full variational principle. For the joint measures of the random field Ising model, we show that the weak Gibbs property holds, with an almost surely rapidly decaying translation-invariant potential. For these measures we show that the variational principle fails as soon as the measures lose the almost Gibbs property. These examples suggest that the class of weakly Gibbsian measures is too broad from the perspective of a reasonable thermodynamic formalism. Comment: Published by the Institute of Mathematical Statistics (http://www.imstat.org) in the Annals of Probability (http://www.imstat.org/aop/) at http://dx.doi.org/10.1214/00911790400000034
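    For orientation, the relative entropy density at stake is, in its standard lattice formulation (notation ours, a sketch rather than the paper's exact setting):

```latex
% Relative entropy of \mu with respect to \nu in a finite volume \Lambda,
% and its density in the thermodynamic limit:
\[
  H_\Lambda(\mu \mid \nu) \;=\; \int \log \frac{d\mu_\Lambda}{d\nu_\Lambda}\, d\mu_\Lambda,
  \qquad
  h(\mu \mid \nu) \;=\; \lim_{\Lambda \uparrow \mathbb{Z}^d} \frac{1}{|\Lambda|}\, H_\Lambda(\mu \mid \nu).
\]
% The variational principle asserts that h(\mu \mid \nu) = 0 exactly when
% \mu is a Gibbs measure for the same specification as \nu; the abstract's
% point is that this equivalence can fail once \nu is only weakly Gibbsian.
```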

    Transport Inequalities. A Survey

    This is a survey of recent developments in the area of transport inequalities. We investigate their consequences in terms of concentration and deviation inequalities and sketch their links with other functional inequalities and also large deviation theory. Comment: Proceedings of the conference Inhomogeneous Random Systems 2009; 82 pages
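    As a concrete instance of the objects surveyed, the classical T1 transport inequality and the Gaussian concentration it is equivalent to (by the Bobkov--Götze dual formulation) can be written as:

```latex
% \mu satisfies T_1(C) when the L^1-Wasserstein distance to \mu is
% controlled by relative entropy:
\[
  W_1(\nu, \mu) \;\le\; \sqrt{2C\, H(\nu \mid \mu)}
  \qquad \text{for all probability measures } \nu .
\]
% Equivalently, every 1-Lipschitz function f concentrates around its mean:
\[
  \mu\!\left( f \ge \textstyle\int f \, d\mu + r \right) \;\le\; e^{-r^2/(2C)},
  \qquad r \ge 0 .
\]
```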

    Measuring time-varying economic fears with consumption-based stochastic discount factors

    This paper empirically analyzes the volatility of consumption-based stochastic discount factors as a measure of implicit economic fears by studying its relationship with future economic and stock market cycles. Time-varying economic fears seem to be well captured by the volatility of stochastic discount factors. In particular, the volatility of a recursive utility-based stochastic discount factor with contemporaneous growth explains between 9 and 34 percent of future changes in industrial production at short and long horizons, respectively. It also explains ex-ante uncertainty and risk aversion. However, future stock market cycles are better explained by a similar stochastic discount factor with long-run consumption growth. This specification of the stochastic discount factor presents higher volatility and lower pricing errors than the specification with contemporaneous consumption growth. Keywords: stochastic discount factor, economic fears, distance between probability measures, volatility of stochastic discount factor, consumption
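    The "volatility of the SDF as a fear gauge" idea can be sketched with the simplest textbook case. This uses power (CRRA) utility, m = beta * (C_{t+1}/C_t)^(-gamma), as a stand-in for the paper's recursive-utility specification; the parameter values and function names are illustrative assumptions, not the paper's estimates.

```python
import statistics

def crra_sdf(consumption, beta=0.99, gamma=5.0):
    """Stochastic discount factor under power (CRRA) utility:
    m_{t+1} = beta * (C_{t+1} / C_t) ** (-gamma).
    A textbook stand-in for the paper's recursive-utility SDF."""
    growth = [c1 / c0 for c0, c1 in zip(consumption, consumption[1:])]
    return [beta * g ** (-gamma) for g in growth]

def sdf_volatility(consumption, beta=0.99, gamma=5.0):
    """Sample standard deviation of the SDF: the time-varying
    'economic fear' proxy studied in the paper."""
    return statistics.stdev(crra_sdf(consumption, beta, gamma))
```

    Because gamma amplifies consumption-growth dispersion, a more risk-averse agent (larger gamma) produces a more volatile SDF from the same consumption series, which is the mechanism linking fear to SDF volatility.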