
    Undirected Graphs of Entanglement Two

    Entanglement is a complexity measure of directed graphs that originates in fixed-point theory. This measure has proved useful in designing efficient algorithms to verify logical properties of transition systems. We are interested in the problem of deciding whether a graph has entanglement at most k. As this measure is defined by means of games, game-theoretic ideas naturally lead to polynomial-time algorithms that decide the problem for fixed k. Known characterizations of directed graphs of entanglement at most 1 lead, for k = 1, to even faster algorithms. In this paper we present an explicit characterization of undirected graphs of entanglement at most 2. With such a characterization at hand, we devise a linear-time algorithm to decide whether an undirected graph has this property.

    Are there any good digraph width measures?

    Several different measures of digraph width have appeared in the last few years. However, none of them shares all the "nice" properties of treewidth: first, being algorithmically useful, i.e. admitting polynomial-time algorithms for all MSO1-definable problems on digraphs of bounded width; and second, having nice structural properties, i.e. being monotone under taking subdigraphs and some form of arc contractions. As for the former, (undirected) MSO1 seems to be the least common denominator of all reasonably expressive logical languages on digraphs that can speak about the edge/arc relation on the vertex set. The latter property is a necessary condition for a width measure to be characterizable by some version of the cops-and-robber game characterizing ordinary treewidth. Our main result is that any reasonable, algorithmically useful and structurally nice digraph measure cannot be substantially different from the treewidth of the underlying undirected graph. Moreover, we introduce directed topological minors and argue that they are the weakest useful notion of minors for digraphs.
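The benchmark the abstract compares against, the treewidth of the underlying undirected graph, can be made concrete with a small sketch. The code below (our own illustration, not from the paper) forgets arc directions and computes exact treewidth of a tiny graph by brute force over elimination orderings, a standard characterization of treewidth; function names are our own.

```python
from itertools import permutations

def underlying_undirected(arcs):
    """Forget arc directions: the underlying undirected edge set."""
    return {frozenset(a) for a in arcs if a[0] != a[1]}

def treewidth(vertices, edges):
    """Exact treewidth of a small undirected graph, brute-forcing all
    elimination orderings (exponential time -- toy sizes only)."""
    vertices = list(vertices)
    adj = {v: set() for v in vertices}
    for e in edges:
        u, v = tuple(e)
        adj[u].add(v)
        adj[v].add(u)
    best = len(vertices) - 1 if vertices else 0  # trivial upper bound
    for order in permutations(vertices):
        g = {v: set(ns) for v, ns in adj.items()}
        width = 0
        for v in order:
            nbrs = g.pop(v)
            width = max(width, len(nbrs))
            for a in nbrs:           # eliminating v turns its
                g[a] |= nbrs - {a}   # neighbourhood into a clique
                g[a].discard(v)
        best = min(best, width)
    return best

# A directed 4-cycle: its underlying undirected graph is C4, treewidth 2.
arcs = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(treewidth(range(4), underlying_undirected(arcs)))  # -> 2
```

The elimination-ordering view (minimize, over all orderings, the largest neighbourhood encountered while eliminating vertices) is equivalent to the tree-decomposition definition and is convenient for exhaustive checks on small instances.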

    Challenges for Efficient Query Evaluation on Structured Probabilistic Data

    Query answering over probabilistic data is an important task but is generally intractable. However, a new approach to this problem has recently been proposed, based on structural decompositions of input databases, following, e.g., tree decompositions. This paper presents a vision for a database management system for probabilistic data built following this structural approach. We review our existing and ongoing work on this topic and highlight many theoretical and practical challenges that remain to be addressed. Comment: 9 pages, 1 figure, 23 references. Accepted for publication at SUM 201
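To make the intractable baseline concrete: over a tuple-independent probabilistic database, the probability of a Boolean query can always be computed by enumerating all 2^n possible worlds. This exponential enumeration is exactly what the decomposition-based approach sketched above aims to avoid; the snippet below is our own minimal illustration of the baseline, not the system described in the paper.

```python
from itertools import product

def query_probability(tuples, query):
    """Brute-force query probability over a tuple-independent
    probabilistic database: sum the weight of every possible world
    (subset of tuples) in which the Boolean query holds."""
    items = list(tuples.items())
    total = 0.0
    for bits in product((0, 1), repeat=len(items)):
        world = frozenset(t for (t, _), b in zip(items, bits) if b)
        weight = 1.0
        for (_, p), b in zip(items, bits):
            weight *= p if b else 1.0 - p
        if query(world):
            total += weight
    return total

# P(R(a) or R(b)), each tuple present independently with probability 0.5
db = {("R", "a"): 0.5, ("R", "b"): 0.5}
print(query_probability(db, lambda w: ("R", "a") in w or ("R", "b") in w))  # -> 0.75
```

Each tuple contributes an independent Bernoulli factor to a world's weight, so the loop runs in O(2^n) time; tractable evaluation on suitably decomposed databases is what makes the structural approach attractive.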

    Response of the topological surface state to surface disorder in TlBiSe2

    Through a combination of experimental techniques we show that the topmost layer of the topological insulator TlBiSe2 as prepared by cleavage is formed by irregularly shaped Tl islands at cryogenic temperatures and by mobile Tl atoms at room temperature. No trivial surface states are observed in photoemission at low temperatures, which suggests that these islands cannot be regarded as a clear surface termination. The topological surface state is, however, clearly resolved in photoemission experiments. We interpret this as direct evidence of its topological self-protection, demonstrating the robust nature of the Dirac-cone-like surface state. Our results can also help explain the apparent mass acquisition in S-doped TlBiSe2. Comment: 16 pages, 5 figures

    Distributed Synthesis in Continuous Time

    We introduce a formalism modelling communication of distributed agents strictly in continuous time. Within this framework, we study the problem of synthesising local strategies for individual agents such that a specified set of goal states is reached, or reached with at least a given probability. The flow of time is modelled explicitly based on continuous-time randomness, with two natural implications: first, the non-determinism stemming from interleaving disappears; second, when we restrict to a subclass of non-urgent models, the quantitative value problem for two players can be solved in EXPTIME. Indeed, the explicit continuous time enables players to communicate their states by delaying synchronisation (which is unrestricted for non-urgent models). In general, the problems are undecidable already for two players in the quantitative case and three players in the qualitative case. The qualitative undecidability is shown by a reduction from decentralized POMDPs, for which we provide the strongest (and rather surprising) undecidability result so far.

    Non-Zero Sum Games for Reactive Synthesis

    In this invited contribution, we summarize new solution concepts useful for the synthesis of reactive systems that we have introduced in several recent publications. These solution concepts are developed in the context of non-zero-sum games played on graphs. They are part of the contributions obtained in the inVEST project funded by the European Research Council. Comment: LATA'16 invited paper

    An international randomised placebo-controlled trial of a four-component combination pill ("polypill") in people with raised cardiovascular risk.

    BACKGROUND: There has been widespread interest in the potential of combination cardiovascular medications containing aspirin and agents to lower blood pressure and cholesterol ('polypills') to reduce cardiovascular disease. However, no reliable placebo-controlled data are available on both efficacy and tolerability. METHODS: We conducted a randomised, double-blind, placebo-controlled trial of a polypill (containing aspirin 75 mg, lisinopril 10 mg, hydrochlorothiazide 12.5 mg and simvastatin 20 mg) in 378 individuals without an indication for any component of the polypill, but who had an estimated 5-year cardiovascular disease risk over 7.5%. The primary outcomes were systolic blood pressure (SBP), LDL-cholesterol and tolerability (proportion who discontinued randomised therapy) at 12 weeks of follow-up. FINDINGS: At baseline, mean BP was 134/81 mmHg and mean LDL-cholesterol was 3.7 mmol/L. Over 12 weeks, polypill treatment reduced SBP by 9.9 (95% CI: 7.7 to 12.1) mmHg and LDL-cholesterol by 0.8 (95% CI: 0.6 to 0.9) mmol/L. The discontinuation rates in the polypill group compared to placebo were 23% vs 18% (RR 1.33, 95% CI 0.89 to 2.00, p = 0.2). There was an excess of side effects known for the component medicines (58% vs 42%, p = 0.001), which was mostly apparent within a few weeks and usually did not warrant cessation of trial treatment. CONCLUSIONS: This polypill achieved sizeable reductions in SBP and LDL-cholesterol but caused side effects in about 1 in 6 people. The predicted halving of cardiovascular risk is moderately lower than previous estimates, and the side-effect rate is moderately higher. Nonetheless, substantial net benefits would be expected among patients at high risk. TRIAL REGISTRATION: Australian New Zealand Clinical Trials Registry ACTRN12607000099426