
    Algorithmic Verification of Asynchronous Programs

    Asynchronous programming is a ubiquitous systems programming idiom for managing concurrent interactions with the environment. In this style, instead of waiting for a time-consuming operation to complete, the programmer makes a non-blocking call to the operation and posts a callback task to a task buffer; the callback is executed later, when the time-consuming operation completes. A co-operative scheduler mediates the interaction by picking callback tasks from the task buffer and executing them to completion (and these callbacks can post further callbacks to be executed later). Writing correct asynchronous programs is hard because the use of callbacks, while efficient, obscures program control flow. We provide a formal model underlying asynchronous programs and study verification problems for this model. We show that the safety verification problem for finite-data asynchronous programs is EXPSPACE-complete. We show that liveness verification for finite-data asynchronous programs is decidable and polynomial-time equivalent to Petri Net reachability. Decidability is not obvious: even if the data is finite-state, asynchronous programs constitute infinite-state transition systems, since both the program stack and the task buffer of pending asynchronous calls can be unbounded. Our main technical construction is a polynomial-time semantics-preserving reduction from asynchronous programs to Petri Nets, and conversely. The reduction allows algorithmic techniques for Petri Nets to be applied to the verification of asynchronous programs. We also study several extensions of the basic model of asynchronous programs that are inspired by additional capabilities provided by implementations of asynchronous libraries, and classify the decidability and undecidability of verification questions on these extensions. Comment: 46 pages, 9 figures
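    As a concrete illustration of the task-buffer idiom described above, the following sketch implements a toy co-operative scheduler in Python: posting a callback is non-blocking, and the scheduler repeatedly picks some pending task and runs it to completion. The names and structure are invented for this sketch; they are not taken from the paper's formal model.

```python
import random

class Scheduler:
    """Co-operative scheduler over an unordered buffer of pending tasks."""

    def __init__(self):
        self.task_buffer = []  # pending callbacks; order is irrelevant

    def post(self, callback, *args):
        # Non-blocking call: enqueue the callback instead of running it now.
        self.task_buffer.append((callback, args))

    def run(self):
        # Repeatedly pick *some* pending task and run it to completion.
        # The formal model picks nondeterministically; we pick at random.
        while self.task_buffer:
            i = random.randrange(len(self.task_buffer))
            callback, args = self.task_buffer.pop(i)
            callback(*args)  # the callback may post further callbacks

sched = Scheduler()

def countdown(n):
    # A task that re-posts itself, so the buffer content varies over time.
    print("tick", n)
    if n > 0:
        sched.post(countdown, n - 1)

sched.post(countdown, 3)
sched.run()
```

    Note that the buffer behaves as a multiset: picking pending tasks in a different order yields different interleavings, which is one source of the unboundedness the paper's verification results have to handle.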

    Satellite Information on Orlando, Florida

    The author has identified the following significant results. Computer classification, accompanied by human interpretation and manual simplification, can produce land use maps which are useful on a regional, county, and, for special purposes, city basis. Change monitoring is potentially an effective application of such data at all planning levels.

    Planning applications in east central Florida

    The author has identified the following significant results. Lake Apopka and three lakes downstream of it (Dora, Eustis, and Griffin) are in an advanced state of eutrophication with high algal concentrations. This feature has shown up consistently on ERTS-1 images in the form of a characteristic water color for those lakes. As expected, EREP photographs also show a characteristic color for those lakes. What was not expected is that Lake Griffin shows a clear pattern of this coloration. Personnel familiar with the lake believe that the photograph does, indeed, show an algal bloom. It is reported that the algal concentration is often significantly higher in the southern portion of the lake. What the photograph shows that was not otherwise known is the pattern of the algal bloom. A similar, but less pronounced, effect is seen in Lake Tohopekaliga. Personnel stationed at Kissimmee reported that there was an algal bloom on that lake at the time of the EREP pass and that its extent corresponded approximately to that shown on the photograph. Again, the EREP photograph gives information about the extent of the bloom that could not be obtained practically by sampling. ERTS-1 images give some indication of this algal distribution on Lake Griffin in some cases, but are inconclusive.

    Analysis of Probabilistic Basic Parallel Processes

    Basic Parallel Processes (BPPs) are a well-known subclass of Petri Nets. They are the simplest common model of concurrent programs that allows unbounded spawning of processes. In the probabilistic version of BPPs, every process generates other processes according to a probability distribution. We study the decidability and complexity of fundamental qualitative problems over probabilistic BPPs -- in particular, reachability with probability 1 of different classes of target sets (e.g., upward-closed sets). Our results concern both the Markov-chain model, where processes are scheduled randomly, and the MDP model, where processes are picked by a scheduler. Comment: This is the technical report for a FoSSaCS'14 paper
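    As an illustration of the Markov-chain semantics, the following Python sketch simulates a toy probabilistic BPP in which a single process type either terminates or spawns two copies of itself, each with probability 1/2. The rule set and its encoding are invented for this example; they are not from the paper.

```python
import random
from collections import Counter

# Illustrative probabilistic BPP (not from the paper): process type "X"
# either terminates or spawns two copies of itself, with probability 1/2.
RULES = {
    "X": [(0.5, Counter()),           # X -> (empty): the process dies
          (0.5, Counter({"X": 2}))],  # X -> X X: it spawns two processes
}

def step(config):
    """One Markov-chain step: schedule a uniformly random live process,
    then rewrite it according to the distribution in its rule."""
    p = random.choice(list(config.elements()))
    config[p] -= 1
    r, acc = random.random(), 0.0
    for prob, successors in RULES[p]:
        acc += prob
        if r <= acc:
            config.update(successors)
            break
    return +config  # drop zero entries

config = Counter({"X": 1})  # initial configuration: one process of type X
steps = 0
while config and steps < 1000:
    config = step(config)
    steps += 1
print("terminated" if not config else f"still running: {dict(config)}")
```

    A configuration is a multiset of live processes, so questions like "is the empty configuration reached with probability 1?" are exactly the kind of qualitative reachability problem the abstract refers to.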

    On Verifying Causal Consistency

    Causal consistency is one of the most widely adopted consistency criteria for distributed implementations of data structures. It ensures that operations are executed at all sites according to their causal precedence. We address the issue of automatically verifying whether the executions of an implementation of a data structure are causally consistent. We consider two problems: (1) checking whether one single execution is causally consistent, which is relevant for developing testing and bug-finding algorithms, and (2) verifying whether all the executions of an implementation are causally consistent. We show that the first problem is NP-complete. This holds even for the read-write memory abstraction, which is a building block of many modern distributed systems. Indeed, such systems often store data in key-value stores, which are instances of the read-write memory abstraction. Moreover, we prove that, surprisingly, the second problem is undecidable, and again this holds even for the read-write memory abstraction. However, we show that for the read-write memory abstraction, these negative results can be circumvented if the implementations are data independent, i.e., their behaviors do not depend on the data values that are written or read at each moment, which is a realistic assumption. Comment: extended version of POPL 2017 paper
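    Problem (1) can be made concrete for the read-write memory abstraction: build a candidate causal order as the transitive closure of program order and the reads-from relation, then check necessary conditions on it. The Python sketch below does only that, so it can detect violations but cannot certify consistency (the full check is NP-complete, since a compatible ordering must also be guessed). The execution encoding and helper names are invented for this sketch, and it assumes data independence so that each value identifies its write uniquely.

```python
from itertools import product

# One execution: a sequence of operations per site (illustrative format).
# ("w", var, val) writes val to var; ("r", var, val) reads val from var.
execution = [
    [("w", "x", 1), ("r", "y", 2)],  # site 0
    [("w", "y", 2), ("r", "x", 1)],  # site 1
]

ops = [(s, i) for s, site in enumerate(execution) for i in range(len(site))]
def op(o): return execution[o[0]][o[1]]

def writer_of(read):
    # Data independence lets us assume each value identifies a unique write.
    _, var, val = op(read)
    matches = [o for o in ops if op(o) == ("w", var, val)]
    return matches[0] if matches else None

# Candidate causal order: transitive closure of program order + reads-from.
edges = set()
for s, site in enumerate(execution):
    edges |= {((s, i), (s, i + 1)) for i in range(len(site) - 1)}
for o in ops:
    if op(o)[0] == "r" and writer_of(o) is not None:
        edges.add((writer_of(o), o))

closure = set(edges)
while True:
    new = {(a, d) for (a, b), (c, d) in product(closure, closure) if b == c}
    if new <= closure:
        break
    closure |= new

ok = all((o, o) not in closure for o in ops)  # the causal order is acyclic

# A read must not return a write that is causally overwritten before it.
for r in (o for o in ops if op(o)[0] == "r"):
    w1 = writer_of(r)
    for w2 in ops:
        if (w1 is not None and op(w2)[0] == "w" and op(w2)[1] == op(r)[1]
                and (w1, w2) in closure and (w2, r) in closure):
            ok = False
print("necessary conditions hold" if ok else "causal-consistency violation")
```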

    LANDSAT planning applications in east central Florida

    There are no author-identified significant results in this report.

    Planning applications in east central Florida

    There are no author-identified significant results in this report.

    2-D Niblett-Bostick magnetotelluric inversion

    A simple and robust imaging technique for two-dimensional magnetotelluric interpretations has been developed following the well-known Niblett-Bostick transformation for one-dimensional profiles. The algorithm processes series and parallel magnetotelluric impedances and their analytical influence functions using a regularized Hopfield artificial neural network. The adaptive, weighted-average approximation preserves part of the nonlinearity of the original problem, yet no initial model in the usual sense is required for the recovery of the model; rather, the built-in relationship between model and data automatically and concurrently considers many half-spaces whose electrical conductivities vary according to the data. The use of series and parallel impedances, a self-contained pair of invariants of the impedance tensor, avoids the need to decide on the best angles of rotation for identifying the TE and TM modes. Field data from a given profile can thus be fed directly into the algorithm without much processing. The solutions offered by the regularized Hopfield neural network correspond to spatial averages computed through rectangular windows that can be chosen at will. Applications of the algorithm to simple synthetic models and to the standard COPROD2 data set illustrate the performance of the approximation.
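    For context, the sketch below implements the classical one-dimensional Niblett-Bostick transformation on which the 2-D scheme builds: each period T is mapped to a penetration depth D = sqrt(rho_a T / (2 pi mu0)), and the resistivity at that depth is estimated as rho_NB = rho_a (1 + m)/(1 - m), where m is the log-log slope of the apparent-resistivity curve. The Python/NumPy code and its synthetic sounding are illustrative only; nothing here reproduces the paper's 2-D neural-network algorithm.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # magnetic permeability of free space (H/m)

def niblett_bostick(periods, rho_a):
    """Classical 1-D Niblett-Bostick transform: map apparent resistivity
    rho_a(T) to resistivity estimates at the corresponding depths."""
    T = np.asarray(periods, dtype=float)
    rho = np.asarray(rho_a, dtype=float)
    depth = np.sqrt(rho * T / (2.0 * np.pi * MU0))  # penetration depth (m)
    m = np.gradient(np.log(rho), np.log(T))  # log-log slope of the curve
    rho_nb = rho * (1.0 + m) / (1.0 - m)     # Bostick resistivity (ohm-m)
    return depth, rho_nb

# Invented synthetic sounding for illustration: apparent resistivity rising
# smoothly with period, as over a resistive layer beneath a conductive one.
T = np.logspace(-2, 2, 20)  # periods in seconds
rho_a = 100.0 + 50.0 * np.tanh(np.log10(T))
depth, rho_nb = niblett_bostick(T, rho_a)
for d, r in zip(depth[::5], rho_nb[::5]):
    print(f"depth {d:10.0f} m : resistivity {r:6.1f} ohm-m")
```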