Near-Optimal Scheduling for LTL with Future Discounting
We study the search problem for optimal schedulers for linear temporal
logic (LTL) with future discounting. The logic, introduced by Almagor, Boker
and Kupferman, is a quantitative variant of LTL in which an event in the far
future makes only a discounted contribution to the truth value (which is a real
number in the unit interval [0, 1]). The precise problem we study---it arises
naturally, e.g., in the search for a scheduler that recovers from an internal error
state as soon as possible---is the following: given a Kripke frame, a formula
and a number in [0, 1] called a margin, find a path of the Kripke frame that is
optimal with respect to the formula up to the prescribed margin (a truly
optimal path may not exist). We present an algorithm for the problem; it works
even in the extended setting with propositional quality operators, a setting
where (threshold) model-checking is known to be undecidable
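The role of the margin can be made concrete: with an exponential discounting function d(i) = λ^i and truth values in [0, 1], any event after step N contributes at most λ^N, so exploring paths only up to a horizon N with λ^N ≤ margin already yields a value optimal up to the margin. The following is a minimal sketch of this truncation idea on a toy Kripke frame; all names and numbers are illustrative, and this is not the paper's actual algorithm.

```python
# Toy Kripke frame: states, transitions, and where proposition p holds.
# All names and numbers here are illustrative, not from the paper.
succ = {0: [1, 2], 1: [0], 2: [2]}
holds_p = {0: False, 1: False, 2: True}

lam = 0.5       # exponential discounting d(i) = lam**i
margin = 0.01   # prescribed margin

# Any event after step N contributes at most lam**N to the discounted
# "eventually p", so a horizon N with lam**N <= margin suffices to be
# optimal up to the margin.
N = 0
while lam ** N > margin:
    N += 1

def best_value(state, depth):
    """Max over all paths from `state` of max_i lam**i * [p at step i],
    truncated at the horizon N."""
    here = 1.0 if holds_p[state] else 0.0
    if depth == N:
        return here
    return max(here, max(lam * best_value(t, depth + 1) for t in succ[state]))

v = best_value(0, 0)
print(f"value, optimal up to margin {margin}: {v}")  # 0.5 on this frame
```

Here the best path visits state 2 (where p holds) at step 1, so the truncated search returns 0.5, which on this frame happens to be the true optimum.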
Discounting in LTL
In recent years, there has been a growing need and interest in formalizing and
reasoning about the quality of software and hardware systems. As opposed to
traditional verification, where one handles the question of whether or not a
system satisfies a given specification, reasoning about quality addresses the
question of \emph{how well} the system satisfies the specification. One
direction in this effort is to refine the "eventually" operators of temporal
logic to {\em discounting operators}: the satisfaction value of a specification
is a value in [0, 1], where the longer it takes to fulfill eventuality
requirements, the smaller the satisfaction value is.
In this paper we introduce an augmentation by discounting of Linear Temporal
Logic (LTL), and study it, as well as its combination with propositional
quality operators. We show that one can augment LTL with an arbitrary set of
discounting functions, while preserving the decidability of the model-checking
problem. Further augmenting the logic with unary propositional quality
operators preserves decidability, whereas adding an average-operator makes some
problems undecidable. We also discuss the complexity of the model-checking
problem, as well as various extensions
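The effect of a discounting operator can be illustrated on a finite trace: the satisfaction value of a discounted "eventually" is the best discounted value over the positions where the eventuality holds, so later fulfilment yields a smaller value. A minimal sketch with two illustrative discounting functions (exponential and harmonic); the names and data are ours, not the paper's.

```python
# Discounted "eventually p" over a finite trace: the satisfaction value is
# the best d(i) over the positions i where p holds, so the longer the wait,
# the smaller the value. Illustrative names, not from the paper.

def discounted_eventually(trace, d):
    """trace: booleans (does p hold at step i?); d: a discounting function."""
    return max((d(i) for i, p in enumerate(trace) if p), default=0.0)

trace = [False, False, False, True, False, True]  # p first holds at step 3

exp_val = discounted_eventually(trace, lambda i: 0.9 ** i)      # 0.9**3
harm_val = discounted_eventually(trace, lambda i: 1 / (i + 1))  # 1/4
print(exp_val, harm_val)
```

Both functions reward the earlier fulfilment at step 3 over the later one at step 5, but assign it different values, which is why the choice of discounting function is part of the logic.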
Sedimentology, Provenance and Radiometric Dating of the Silante Formation: Implications for the Cenozoic Evolution of the Western Andes of Ecuador
The Silante Formation is a thick series of continental deposits, exposed along a trench-parallel distance of approximately 300 km within the Western Cordillera of Ecuador. Its origin, tectonic setting, age and stratigraphic relationships are poorly known, although these are key to understanding the Cenozoic evolution of the Ecuadorian Andes. We present new sedimentological, stratigraphic, petrographic, radiometric and provenance data from the Silante Formation and underlying rocks. Detailed stratigraphic analysis shows that the Silante Formation unconformably overlies Paleocene submarine fan deposits of the Pilalo Formation, which was coeval with submarine tholeiitic volcanism. The lithofacies of the Silante Formation suggest that the sediments were deposited in a debris-flow-dominated alluvial fan. Provenance analysis, including heavy mineral assemblages and detrital zircon U-Pb ages, indicates that sediments of the Silante Formation were derived from the erosion of a continental, calc-alkaline volcanic arc, pointing to the Oligocene to Miocene San Juan de Lachas volcanic arc. Thermochronological data and regional correlations suggest that deposition of the Silante Formation was coeval with regional rock and surface uplift of the Andean margin, which shed alluvial fans into intermontane and back-arc domains
Containment and equivalence of weighted automata: Probabilistic and max-plus cases
This paper surveys some results regarding decision problems for probabilistic and max-plus automata, such as containment and equivalence. Probabilistic and max-plus automata are part of the general family of weighted automata, whose semantics are maps from words to real values. Given two weighted automata, the equivalence problem asks whether their semantics are the same, and the containment problem asks whether one is pointwise smaller than the other. These problems have been studied intensively, and this paper reviews some techniques used to show (un)decidability and states a list of questions that remain open
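The semantics can be sketched concretely: a probabilistic automaton maps a word to an initial vector times a product of transition matrices times a final vector, while a max-plus automaton uses the (max, +) product in place of (+, ×), taking the best run instead of summing over runs. The toy automata below are illustrative, not taken from the paper.

```python
import numpy as np

# Toy probabilistic automaton over {a, b}: the weight of a word is
# init @ M[c1] @ ... @ M[cn] @ final. Matrices are illustrative.
init = np.array([1.0, 0.0])
final = np.array([0.0, 1.0])
M = {
    "a": np.array([[0.5, 0.5], [0.0, 1.0]]),
    "b": np.array([[1.0, 0.0], [0.3, 0.7]]),
}

def prob_weight(word):
    v = init
    for c in word:
        v = v @ M[c]
    return float(v @ final)

# A max-plus automaton replaces (+, *) by (max, +): weights add along a
# run and the automaton takes the best run. Again a toy example.
m_init = np.array([0.0, -np.inf])
m_final = np.array([0.0, 0.0])
m_M = {"a": np.array([[1.0, 0.0], [-np.inf, 2.0]])}

def maxplus_weight(word):
    v = m_init
    for c in word:
        v = np.max(v[:, None] + m_M[c], axis=0)  # (max, +) vector-matrix product
    return float(np.max(v + m_final))

print(prob_weight("ab"), maxplus_weight("aa"))
```

Equivalence of two such automata asks whether these word-to-value maps agree on every word; the change of semiring is what makes the probabilistic case decidable and the max-plus case undecidable.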
A methodology pruning the search space of six compiler transformations by addressing them together as one problem and by exploiting the hardware architecture details
Today’s compilers offer a plethora of optimizing transformations to choose from, and the choice, ordering and parameters of these transformations have a large impact on performance. Choosing the correct order and parameters of optimizations has been a long-standing problem in compilation research that remains unsolved: optimizing each sub-problem separately gives a different schedule/binary per sub-problem, and these schedules cannot coexist, since refining one degrades the others. Researchers have tried to solve this problem with iterative compilation techniques, but the search space is so big that it cannot be searched even by using modern supercomputers. Moreover, compiler transformations do not take the hardware architecture details and data reuse into account in an efficient way. In this paper, a new iterative compilation methodology is presented which reduces the search space of six compiler transformations by addressing the above problems; the search space is reduced by many orders of magnitude, so that an efficient solution can now be found. The transformations are the following: loop tiling (including the number of levels of tiling), loop unrolling, register allocation, scalar replacement, loop interchange and data array layouts. The search space is reduced (a) by addressing the aforementioned transformations together as one problem rather than separately, and (b) by taking into account the hardware architecture details (e.g., cache size and associativity) and algorithm characteristics (e.g., data reuse). The proposed methodology has been evaluated against iterative compilation and the gcc/icc compilers, on both embedded and general-purpose processors; it achieves significant performance gains at orders of magnitude lower compilation time
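As one concrete instance of the transformations involved, loop tiling restructures the iteration space so that a block of data is reused while it still fits in cache; in the methodology's setting the tile size would follow from the cache size and associativity it takes as input. A pure-Python sketch on matrix multiplication, for illustration only (the paper targets compiled code, and the sizes here are made up):

```python
# Tiled (blocked) matrix multiply: the loops are restructured so that a
# T x T block of B is reused across a whole tile of iterations. Sizes
# are illustrative; in practice T is derived from cache parameters.
N, T = 8, 4
A = [[float(i + j) for j in range(N)] for i in range(N)]
B = [[float(i * j) for j in range(N)] for i in range(N)]
C = [[0.0] * N for _ in range(N)]

for ii in range(0, N, T):            # iterate over tiles
    for kk in range(0, N, T):
        for jj in range(0, N, T):
            for i in range(ii, ii + T):   # iterate inside one tile
                for k in range(kk, kk + T):
                    a = A[i][k]  # scalar replacement: A[i][k] hoisted out
                    for j in range(jj, jj + T):
                        C[i][j] += a * B[k][j]
```

The hoisted scalar `a` also shows why the paper treats the transformations jointly: the profitable tile size depends on register allocation and scalar replacement, so tuning them in isolation gives conflicting answers.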
Prompt interval temporal logic
Interval temporal logics are expressive formalisms for temporal representation and reasoning, which use time intervals as primitive temporal entities. They have been extensively studied for the past two decades and successfully applied in AI and computer science. Unfortunately, like the commonly used point-based temporal logics, e.g., LTL, they lack the ability to express promptness conditions: whenever we deal with a liveness request, such as “something good eventually happens”, there is no way to impose a bound on the delay with which it is fulfilled. In recent years, this issue has been addressed in automata theory, game theory, and temporal logic. In this paper, we approach it in the interval temporal logic setting. First, we introduce PROMPT-PNL, a prompt extension of the well-studied interval temporal logic PNL, and we prove the undecidability of its satisfiability problem; then, we show how to recover decidability (NEXPTIME-completeness) by imposing a natural syntactic restriction on it
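The promptness condition can be illustrated on a finite trace: instead of merely requiring that every request is eventually granted, one asks for a single uniform bound k such that every request is granted within k steps. A small illustrative sketch (the names and data are ours, not from the paper):

```python
# Promptness: ask for one uniform bound k such that every request is
# granted within k steps, not merely eventually. Illustrative sketch.

def prompt_bound(requests, grants):
    """Smallest k such that every request is granted within k steps; None
    if some request is never granted (so no bound exists)."""
    k = 0
    for i, req in enumerate(requests):
        if not req:
            continue
        delay = next((j - i for j in range(i, len(grants)) if grants[j]), None)
        if delay is None:
            return None  # liveness already fails: no prompt bound
        k = max(k, delay)
    return k

print(prompt_bound([True, False, True, False], [False, True, False, True]))  # 1
```

An ordinary "eventually" is satisfied as long as every delay is finite; promptness additionally demands that the delays do not grow without bound, which is exactly what the prompt operators capture.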
Racism and hate speech – A critique of Scanlon’s Contractual Theory
The First Amendment is an important value in the American liberal polity. Under this value, racism, hate speech and offensive speech are protected speech. This article scrutinizes one of the clear representatives of the American liberal polity, Thomas Scanlon, and tracks the developments in his theory over the years. It is argued that Scanlon’s arguments downplay the tangible harm that speech might inflict on its target victim audience. Scanlon’s distinction between participant interests, audience interests, and the interests of bystanders is put under close scrutiny. The article criticizes viewpoint neutrality and suggests a balancing approach, further arguing that democracy is required to develop protective mechanisms against harm-facilitating speech as well as profound offences. Both should be taken most seriously
First international new intravascular rigid-flex endovascular stent study (FINESS): Clinical and angiographic results after elective and urgent stent implantation
Objectives. The purpose of this study was to determine the feasibility, safety and efficacy of elective and urgent deployment of the new intravascular rigid-flex (NIR) stent in patients with coronary artery disease. Background. Stent implantation has been shown to be effective in the treatment of focal, new coronary stenoses and in restoring coronary flow after coronary dissection and abrupt vessel closure. However, currently available stents either lack flexibility, hindering navigation through tortuous arteries, or lack axial strength, resulting in suboptimal scaffolding of the vessel. The unique transforming multicellular design of the NIR stent appears to provide both longitudinal flexibility and radial strength. Methods. NIR stent implantation was attempted in 255 patients (341 lesions) enrolled prospectively in a multicenter international registry from December 1995 through March 1996. Nine-, 16- and 32-mm long NIR stents were manually crimped onto coronary balloons and deployed in native coronary (94%) and saphenous vein graft (6%) lesions. Seventy-four percent of patients underwent elective stenting for primary or restenotic lesions, 21% for a suboptimal angioplasty result and 5% for threatened or abrupt vessel closure. Fifty-two percent of patients presented with unstable angina, 48% had a previous myocardial infarction, and 45% had multivessel disease. Coronary lesions were frequently complex, occurring in relatively small arteries (mean [±SD] reference diameter 2.8 ± 0.6 mm). Patients were followed up for 6 months for the occurrence of major adverse cardiovascular events. Results. Stent deployment was accomplished in 98% of lesions. Mean minimal lumen diameter increased by 1.51 ± 0.51 mm (from 1.09 ± 0.43 mm before to 2.60 ± 0.50 mm after the procedure). Mean percent diameter stenosis decreased from 61 ± 13% before to 17 ± 7% after intervention. 
A successful interventional procedure with <50% diameter stenosis of all treatment site lesions and no major adverse cardiac events within 30 days occurred in 95% of patients. Event-free survival at 6 months was 82%. Ninety-four percent of surviving patients were either asymptomatic or had mild stable angina at 6 month follow-up. Conclusions. Despite unfavorable clinical and angiographic characteristics of the majority of patients enrolled, the acute angiographic results and early clinical outcome after NIR stent deployment were very promising. A prospective, randomized trial comparing the NIR stent with other currently available stents appears warranted