Graph models for reachability analysis of concurrent programs
Reachability analysis is an attractive technique for the analysis of concurrent programs because it is simple, relatively straightforward to automate, and can be used in conjunction with model-checking procedures to check application-specific as well as general properties. Several techniques have been proposed, differing mainly in the model used; some propose flowgraph-based models, others Petri nets. This paper addresses the question: what essential difference, if any, does the choice of finite-state model extracted from program texts make for reachability analysis? How do the models differ in expressive power, decision power, or accuracy? Since each is intended to model synchronization structure while abstracting away other features, one would expect them to be roughly equivalent. We confirm that there is no essential semantic difference between the best-known models proposed in the literature by providing algorithms for translation among them. This implies that the choice of model rests on other factors, including convenience and efficiency. Since combinatorial explosion is the primary impediment to applying reachability analysis, a particular concern in choosing a model is facilitating divide-and-conquer analysis of large programs. Recently, much interest in finite-state verification systems has centered on algebraic theories of concurrency. Yeh and Young have exploited algebraic structure to decompose reachability analysis based on a flowgraph model. The semantic equivalence of graph- and Petri-net-based models suggests that a similar strategy should apply to decomposing Petri nets. We show that this is indeed possible through an application of category theory.
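For readers unfamiliar with the technique, the following is a minimal sketch (not taken from the paper) of reachability analysis over a flowgraph-style model: each process is a finite transition graph, and reachable composite states are enumerated by breadth-first search over the product. The two process definitions and the rendezvous rule are hypothetical.

    from collections import deque

    # Each process: {state: [(label, next_state), ...]}.  Transitions sharing a
    # label in the sync set are rendezvous moves; all others interleave freely.
    P1 = {"p0": [("send", "p1")], "p1": [("work", "p0")]}
    P2 = {"q0": [("send", "q1")], "q1": [("done", "q0")]}
    SYNC = {"send"}

    def reachable(p_init, q_init):
        """Enumerate reachable composite states by BFS over the product graph."""
        seen = {(p_init, q_init)}
        queue = deque(seen)
        while queue:
            p, q = queue.popleft()
            succ = []
            # Rendezvous: both processes take a same-labeled synchronized step.
            for lab1, p2 in P1.get(p, []):
                for lab2, q2 in P2.get(q, []):
                    if lab1 == lab2 and lab1 in SYNC:
                        succ.append((p2, q2))
            # Independent steps interleave one process at a time.
            succ += [(p2, q) for lab, p2 in P1.get(p, []) if lab not in SYNC]
            succ += [(p, q2) for lab, q2 in P2.get(q, []) if lab not in SYNC]
            for s in succ:
                if s not in seen:
                    seen.add(s)
                    queue.append(s)
        return seen

    print(reachable("p0", "q0"))  # 4 composite states here; the count explodes as processes are added

The combinatorial explosion mentioned in the abstract is visible in this sketch: the product state space grows multiplicatively with each added process, which is what motivates the divide-and-conquer decompositions the paper discusses.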
Hybrid analysis techniques for software fault detection
Since the question "Does program P obey specification S?" is undecidable in general, every practical software validation technique must compromise accuracy in some way. Testing techniques admit the possibility that a fault will go undetected, as the price for quitting after a finite number of test cases. Formal verification admits the possibility that a proof will not be found for a valid assertion, as the price for quitting after a finite amount of proof effort. No technique so dominates the others that a wise validation strategy consists of applying that technique alone; rather, effective validation requires applying several techniques.
Arcadia, a software development environment research project
The research objectives of the Arcadia project are twofold: discovery and development of environment architecture principles, and creation of novel software development tools, particularly powerful analysis tools, that will function within an environment built upon these architectural principles. Work in the architecture area is concerned with providing a framework to support integration while also supporting the often conflicting goal of extensibility. Thus, this area of research is directed toward achieving external integration by providing a consistent, uniform user interface, while still admitting customization and the addition of new tools and interface functions. In an effort to also attain internal integration, research is aimed at developing mechanisms for structuring and managing the tools and data objects that populate a software development environment, while facilitating the insertion of new kinds of tools and new classes of objects. The unifying theme of work in the tools area is support for effective analysis at every stage of a software development project. Research is directed toward tools for analyzing both pre-implementation descriptions of software and software itself, and toward the production of testing and debugging tools. In many cases, these tools are specifically tailored for applicability to concurrent, distributed, or real-time software systems. The initial focus of Arcadia research is on creating a prototype environment, embodying the architectural principles, which supports Ada software development. This prototype environment is itself being developed in Ada. Arcadia is being developed by a consortium of researchers from the University of California at Irvine, the University of Colorado at Boulder, the University of Massachusetts at Amherst, TRW, Incremental Systems Corporation, and The Aerospace Corporation. This paper delineates the research objectives and describes the approaches being taken, the organization of the research endeavor, and the current status of the work.
Contingency and latency in associative learning: computational, algorithmic and implementation analyses
Contingency (the learned relative salience of environmental features) and latency (the learned timing of response to stimuli) are central phenomena of learning and memory. This paper provides a computational analysis of, and algorithms for, a set of empirical data on contingency and latency in classical and instrumental conditioning. These analyses are presented within the framework of an information-processing architecture that describes a set of modules operating in parallel and asynchronously to store, retrieve and modify experiential information. The architecture (called 'CEL', for 'Components of Experiential Learning') provides a way of making explicit the interactions among a number of otherwise separate algorithms for related phenomena. The modules comprising the architecture each emerge from the operation of an indexed network memory. The algorithms presented are also implemented in working computer programs that interact with a simulated environment to produce contingent associative learning and differential response latencies corresponding to the relevant behavioral data. The model makes a number of specific empirical predictions that can be experimentally tested.
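As a concrete point of reference, the ΔP measure (Rescorla, 1968) is a standard formalization of contingency in this literature; the sketch below computes it from a trial record. It is offered as an illustration only and is not necessarily the algorithm implemented in CEL.

    # Delta-P contingency: P(outcome | cue present) - P(outcome | cue absent).
    # Trials are (cue_present, outcome_present) boolean pairs; values hypothetical.
    trials = [(True, True), (True, True), (True, False),
              (False, False), (False, False), (False, True)]

    def delta_p(trials):
        with_cue = [o for c, o in trials if c]
        without_cue = [o for c, o in trials if not c]
        return sum(with_cue) / len(with_cue) - sum(without_cue) / len(without_cue)

    print(delta_p(trials))  # 2/3 - 1/3 = 0.33: the cue carries positive contingency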
Computing and Diagnosing Changes in Unit Test Energy Consumption
Many developers have reason to be concerned with power consumption. For example, mobile app developers want to minimize how much power their applications draw while still providing useful functionality. However, developers have few tools for getting feedback about changes to their application's power consumption behavior as they implement the application and modify it over time. We present a tool that, using a team's existing test cases, performs repeated measurements of energy consumption based on instructions executed, objects generated, and blocking latency, generating a distribution of energy-use estimates for each test run and recording these distributions as a time series. Then, when the distributions change substantially, we inform the developer of the change and offer diagnostic information about the elements of their code and the inputs potentially responsible for it. Through this information, we believe developers will be better able to relate recent changes in their code to changes in energy consumption, enabling them to incorporate software energy consumption into their software evolution decisions.
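To make the flagging step concrete, here is a minimal sketch of one way the comparison could work. The abstract does not name the tool's statistical criterion; the two-sample Kolmogorov-Smirnov test and the sample values below are stand-in assumptions.

    from scipy.stats import ks_2samp

    # Hypothetical time series: per-run energy estimates (joules) for one test
    # case, one distribution per commit.
    history = [
        [1.02, 0.98, 1.01, 0.99, 1.03],  # commit 1
        [1.00, 1.01, 0.97, 1.02, 0.99],  # commit 2
        [1.41, 1.38, 1.44, 1.40, 1.39],  # commit 3: consumption jumped
    ]

    for i in range(1, len(history)):
        stat, p = ks_2samp(history[i - 1], history[i])
        if p < 0.01:  # the two distributions differ substantially
            print(f"commit {i + 1}: energy distribution changed (KS p={p:.4f})")

A distribution-level comparison along these lines is what would let such a tool separate genuine consumption changes from run-to-run measurement noise.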
Next generation software environments: principles, problems, and research directions
The past decade has seen a burgeoning of research and development in software environments. Conferences have been devoted to the topic of practical environments, journal papers produced, and commercial systems sold. Given all this activity, one might expect a great deal of consensus on issues, approaches, and techniques. This is not the case, however. Indeed, the term "environment" is still used in a variety of conflicting ways. Nevertheless, substantial progress has been made, and we are at least nearing consensus on many critical issues. The purpose of this paper is to characterize environments, describe several important principles that have emerged in the last decade or so, note current open problems, and describe some approaches to these problems, with particular emphasis on the activities of one large-scale research program, the Arcadia project. Consideration is also given to two related topics: empirical evaluation and technology transition. That is, how can environments and their constituents be evaluated, and how can new developments be moved effectively into the production sector?
Compact Remnant Mass Function: Dependence on the Explosion Mechanism and Metallicity
The mass distribution of neutron stars and stellar-mass black holes provides vital clues into the nature of stellar core collapse and the physical engine responsible for supernova explosions. Using recent advances in our understanding of supernova engines, we derive mass distributions of stellar compact remnants. We provide analytical prescriptions for compact object masses for major population synthesis codes. In an accompanying paper, Belczynski et al., we demonstrate that these qualitatively new results for compact objects can explain the observed gap in the remnant mass distribution between ~2-5 solar masses and that they place strong constraints on the nature of the supernova engine. Here, we show that advanced gravitational radiation detectors (like LIGO/VIRGO or the Einstein Telescope) will be able to further test the supernova explosion engine models once double black hole inspirals are detected.
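To illustrate what an analytical prescription for a population synthesis code looks like in use, the sketch below maps an initial stellar mass and metallicity to a remnant mass. The functional form and every numerical boundary are placeholder assumptions for illustration; they are not the prescriptions derived in this paper.

    # Purely illustrative remnant-mass prescription; all numbers are placeholders,
    # NOT the paper's results.  Masses in solar units.
    def remnant_mass(zams_mass, metallicity):
        if zams_mass < 8.0:
            return None           # leaves a white dwarf, not a compact remnant
        if zams_mass < 20.0:
            return 1.4            # neutron star near the canonical mass
        # Heavier progenitors leave black holes; low metallicity weakens winds,
        # so more mass survives to collapse.
        wind_fraction = 0.2 if metallicity < 0.002 else 0.5
        return 3.0 + 0.4 * (zams_mass - 20.0) * (1.0 - wind_fraction)

    for m in (10, 25, 60):
        print(m, remnant_mass(m, metallicity=0.0002))

A population synthesis code would evaluate such a function once per star drawn from an initial mass function, which is why closed-form prescriptions rather than full engine simulations are needed.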
A Comparison of X-Ray Microdiffraction and Coherent Gradient Sensing in Measuring Discontinuous Curvatures in Thin Film: Substrate Systems
The coherent gradient sensor (CGS) is a shearing interferometer that has been proposed for the rapid, full-field measurement of deformation states (slopes and curvatures) in thin film-wafer substrate systems, and for the subsequent inference of stresses in the thin films. This approach needs to be verified against a better-established but time-consuming grain orientation and stress measurement tool, X-ray microdiffraction (XRD). Both CGS and XRD are used to measure the deformation state of the same W film/Si wafer at room temperature. CGS provides a global, wafer-level measurement of slopes, while XRD provides a local micromeasurement of lattice rotations. An extreme case of a circular Si wafer with a circular W film island at its center is used because of the presence of discontinuous system curvatures across the wafer. The results are also compared with a theoretical model based on an elastic plate analysis of the axisymmetric bimaterial film-substrate system. Slope and curvature measurements by XRD and by CGS compare very well with each other and with theory. The favorable comparison demonstrates that wafer-level CGS metrology provides a quick and accurate alternative to other measurements. It also demonstrates the accuracy of plate theory in modeling thin film-substrate systems, even in the presence of curvature discontinuities.
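For context, the classical route from a measured substrate curvature to a film stress, which this kind of wafer-level metrology ultimately supports, is Stoney's relation; the abstract does not state which stress-curvature relation the authors use, so the standard textbook form is quoted here only as background:

    \sigma_f = \frac{E_s\, h_s^2}{6\,(1 - \nu_s)\, h_f}\,\kappa

where \sigma_f is the equibiaxial film stress, E_s and \nu_s are the substrate's Young's modulus and Poisson's ratio, h_s and h_f are the substrate and film thicknesses, and \kappa is the measured curvature. The relation assumes a thin film (h_f << h_s) and uniform curvature, an assumption the discontinuous-curvature configuration studied here deliberately violates.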
An exploratory trial implementing a community-based child oral health promotion intervention for Australian families from refugee and migrant backgrounds: a protocol paper for Teeth Tales
Introduction: Inequalities are evident in early childhood caries rates, with the socially disadvantaged experiencing a greater burden of disease. This study builds on formative qualitative research conducted in the Moreland/Hume local government areas of Melbourne, Victoria, 2006-2009, in response to community concerns about the oral health of children from refugee and migrant backgrounds. Development of the community-based intervention described here extends the partnership approach to cogeneration of contemporary evidence, with continued and meaningful involvement of investigators, community, cultural and government partners. This trial aims to establish a model for child oral health promotion for culturally diverse communities in Australia.
Methods and analysis: This is an exploratory trial implementing a community-based child oral health promotion intervention for Australian families from refugee and migrant backgrounds. Families from an Iraqi, Lebanese or Pakistani background with children aged 1-4 years, residing in metropolitan Melbourne, were invited to participate in the trial by peer educators from their respective communities, using snowball and purposive sampling techniques. The target sample size was 600. Moreland, a culturally diverse, inner-urban metropolitan area of Melbourne, was chosen as the intervention site. The intervention comprised peer-educator-led community oral health education sessions and reorienting of dental health and family services through Cultural Competency Organisational Review (CORe).
Ethics and dissemination: Ethics approval for this trial was granted by the University of Melbourne Human Research Ethics Committee and the Department of Education and Early Childhood Development Research Committee. Study progress and output will be disseminated via periodic newsletters, peer-reviewed research papers, reports, community seminars and presentations at national and international conferences.