
    Model Checking Markov Chains with Actions and State Labels

    In the past, logics of several kinds have been proposed for reasoning about discrete- or continuous-time Markov chains. Most of these logics rely either on state labels (atomic propositions) or on transition labels (actions). However, in several applications it is useful to reason about both state properties and action sequences. For this purpose, we introduce the logic asCSL, which provides powerful means to characterize execution paths of Markov chains with actions and state labels. asCSL can be regarded as an extension of the purely state-based logic CSL (continuous stochastic logic). In asCSL, path properties are characterized by regular expressions over actions and state formulas. Thus, the truth value of path formulas depends not only on the available actions in a given time interval, but also on the validity of certain state formulas in intermediate states. We compare the expressive power of CSL and asCSL and show that even the state-based fragment of asCSL is strictly more expressive than CSL if time intervals starting at zero are employed. Using an automaton-based technique, an asCSL formula and a Markov chain with actions and state labels are combined into a product Markov chain. For time intervals starting at zero we establish a reduction of the model checking problem for asCSL to CSL model checking on this product Markov chain. The usefulness of our approach is illustrated by an elaborate model of a scalable cellular communication system for which several properties are formalized by means of asCSL formulas and checked using the new procedure.
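    To make the style of specification concrete, here is one illustrative property of the kind described above, written as a probability bound applied to a time-bounded regular expression over (state formula, action) pairs. The exact operator syntax is a plausible rendering of the notation the abstract describes, and the proposition/action names (up, send, retry, deliver) and the numbers are invented for the example, not taken from the paper:

        P_{\geq 0.95}\Big( \big( (\mathit{up},\,\mathit{send})\,;\,(\mathit{up},\,\mathit{retry})^{*}\,;\,(\mathit{true},\,\mathit{deliver}) \big)^{[0,\,10]} \Big)

    Read informally: with probability at least 0.95, a send from an up state is followed by some number of retries from up states and ends with a deliver, all within 10 time units.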

    The South Dakota cooperative land use effort: A state level remote sensing demonstration project

    Remote sensing technology can satisfy, or make significant contributions toward satisfying, many of the information needs of governmental natural resource planners and policy makers. Recognizing this potential, the South Dakota State Planning Bureau and the EROS Data Center together formulated the framework for an ongoing Land Use and Natural Resource Inventory and Information System Program. Statewide land use/land cover information is generated from LANDSAT digital data and high altitude photography. Many applications of the system are anticipated as it evolves and data are added from more conventional sources. The conceptualization, design, and implementation of the program are discussed.

    Space biology initiative program definition review. Trade study 3: Hardware miniaturization versus cost

    The optimum hardware miniaturization level with the lowest cost impact for space biology hardware was determined. Space biology hardware and/or components, subassemblies, and assemblies that are the most likely candidates for miniaturization are to be defined, and the relative cost impacts of such miniaturization are to be analyzed. A mathematical or statistical analysis method capable of supporting the development of parametric cost analysis impacts for levels of production design miniaturization is provided.

    Space biology initiative program definition review. Trade study 4: Design modularity and commonality

    The relative cost impacts (up or down) of developing Space Biology hardware using design modularity and commonality are studied. Recommendations for how the hardware development should be accomplished to meet optimum design modularity requirements for Life Science investigation hardware will be provided. In addition, the relative cost impacts of implementing commonality of hardware for all Space Biology hardware are defined. Cost analysis and supporting recommendations for levels of modularity and commonality are presented. A mathematical or statistical cost analysis method with the capability to support development of production design modularity and commonality impacts to parametric cost analysis is provided.

    SWATI: Synthesizing Wordlengths Automatically Using Testing and Induction

    In this paper, we present an automated technique, SWATI (Synthesizing Wordlengths Automatically Using Testing and Induction), which combines testing based on Nelder-Mead optimization with induction from examples to automatically synthesize optimal fixed-point implementations of numerical routines. The design of numerical software is commonly done using floating-point arithmetic in design environments such as Matlab. However, these designs are often implemented using fixed-point arithmetic for speed and efficiency reasons, especially in embedded systems. The fixed-point implementation reduces implementation cost, provides better performance, and reduces power consumption. The conversion from floating-point designs to fixed-point code is subject to two opposing constraints: (i) the word-width of fixed-point types must be minimized, and (ii) the outputs of the fixed-point program must be accurate. In this paper, we propose a new solution to this problem. Our technique takes the floating-point program, a specified accuracy, and an implementation cost model, and provides a fixed-point program with the specified accuracy and optimal implementation cost. We demonstrate the effectiveness of our approach on a set of examples from the domains of automated control, robotics, and digital signal processing.
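    The underlying search problem can be sketched in a few lines of Python. The sketch below is only an illustration of the idea, not the SWATI tool: it replaces the paper's Nelder-Mead-based testing and inductive generalization with a plain sweep over candidate fractional word-widths on random test inputs, and the toy filter, accuracy bound, and cost notion (smaller width = cheaper) are assumptions made for the example.

        import random

        def quantize(x, frac_bits):
            # Round x to a fixed-point grid with resolution 2**(-frac_bits).
            scale = 1 << frac_bits
            return round(x * scale) / scale

        def float_filter(x, prev):
            # Reference floating-point design: a toy first-order low-pass filter.
            return 0.125 * x + 0.875 * prev

        def fixed_filter(x, prev, frac_bits):
            # Fixed-point counterpart: operands and results are quantized.
            a = quantize(0.125, frac_bits)
            b = quantize(0.875, frac_bits)
            return quantize(a * quantize(x, frac_bits) + b * prev, frac_bits)

        def max_error(frac_bits, tests):
            # Worst observed deviation of the fixed-point output from the reference.
            worst = 0.0
            for xs in tests:
                ref = fix = 0.0
                for x in xs:
                    ref = float_filter(x, ref)
                    fix = fixed_filter(x, fix, frac_bits)
                    worst = max(worst, abs(ref - fix))
            return worst

        random.seed(0)
        tests = [[random.uniform(-1.0, 1.0) for _ in range(100)] for _ in range(20)]
        accuracy = 1e-3
        for frac_bits in range(2, 32):  # smaller width = cheaper, so search upward
            if max_error(frac_bits, tests) <= accuracy:
                print("smallest fractional width meeting the accuracy bound:", frac_bits)
                break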

    Lower bounds on the non-Clifford resources for quantum computations

    We establish lower bounds on the number of resource states, also known as magic states, needed to perform various quantum computing tasks, treating stabilizer operations as free. Our bounds apply to adaptive computations using measurements and an arbitrary number of stabilizer ancillas. We consider (1) resource state conversion, (2) single-qubit unitary synthesis, and (3) computational tasks. To prove our resource conversion bounds we introduce two new monotones, the stabilizer nullity and the dyadic monotone, and make use of the already-known stabilizer extent. We consider conversions that borrow resource states, known as catalyst states, and return them at the end of the algorithm. We show that catalysis is necessary for many conversions and introduce new catalytic conversions, some of which are close to optimal. By finding a canonical form for post-selected stabilizer computations, we show that approximating a single-qubit unitary to within diamond-norm precision $\varepsilon$ requires at least $1/7\cdot\log_2(1/\varepsilon) - 4/3$ $T$-states on average. This is the first lower bound that applies to synthesis protocols using fall-back, mixing techniques, and where the number of ancillas used can depend on $\varepsilon$. Up to multiplicative factors, we optimally lower bound the number of $T$ or $CCZ$ states needed to implement the ubiquitous modular adder and multiply-controlled-$Z$ operations. When the probability of Pauli measurement outcomes is 1/2, some of our bounds become tight to within a small additive constant. Comment: 62 pages.
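    As a quick numerical illustration of the synthesis bound (the value of $\varepsilon$ is chosen here only for the example): at diamond-norm precision $\varepsilon = 10^{-10}$,

        \frac{1}{7}\log_2\!\left(\frac{1}{\varepsilon}\right) - \frac{4}{3}
          = \frac{1}{7}\log_2\!\left(10^{10}\right) - \frac{4}{3}
          \approx \frac{33.2}{7} - 1.33 \approx 3.4,

    so any stabilizer protocol reaching that accuracy must consume, on average, at least about 3.4 $T$-states.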

    The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

    Application of NASTRAN for stress analysis of left ventricle of the heart

    Knowing the stress and strain distributions in the left ventricular wall of the heart is a prerequisite for the determination of muscle elasticity and contractility in the process of assessing the functional status of the heart. NASTRAN was applied to calculate these stresses and strains and to help verify the results obtained by the computer program FEAMPS, which was specifically designed for the plane-strain finite-element analysis of left ventricular cross sections. The analysis adopts the true shape and dimensions of the cross sections, reconstructed from multiplanar X-ray views of a left ventricle which was surgically isolated from a dog's heart but metabolically supported to sustain its beating. A preprocessor was prepared to accommodate both FEAMPS and NASTRAN, and it also facilitated the application of both the triangular-element and isoparametric quadrilateral-element versions of NASTRAN. The stresses in several crucial regions of the left ventricular wall calculated by these two independently developed computer programs are found to be in good agreement. Such confirmation of the results is essential in the development of a method for assessing heart performance.
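    For readers unfamiliar with the element types mentioned above, the sketch below shows the textbook stiffness computation for a linear (constant-strain) triangular element under plane-strain assumptions. It is a generic Python illustration of the kind of element underlying such an analysis, not the FEAMPS or NASTRAN implementation, and the material constants and node coordinates are placeholders.

        import numpy as np

        def cst_plane_strain_stiffness(xy, E, nu, thickness=1.0):
            # 6x6 stiffness matrix of a 3-node triangle, nodal dofs (u1, v1, ..., u3, v3).
            (x1, y1), (x2, y2), (x3, y3) = xy
            area = 0.5 * abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))

            # Shape-function gradients (constant over the element).
            b = np.array([y2 - y3, y3 - y1, y1 - y2])
            c = np.array([x3 - x2, x1 - x3, x2 - x1])
            B = np.zeros((3, 6))
            B[0, 0::2] = b
            B[1, 1::2] = c
            B[2, 0::2] = c
            B[2, 1::2] = b
            B /= 2.0 * area

            # Plane-strain constitutive matrix for an isotropic material.
            f = E / ((1.0 + nu) * (1.0 - 2.0 * nu))
            D = f * np.array([[1.0 - nu, nu,       0.0],
                              [nu,       1.0 - nu, 0.0],
                              [0.0,      0.0,      (1.0 - 2.0 * nu) / 2.0]])

            # Constant-strain triangle: K = t * A * B^T D B.
            return thickness * area * B.T @ D @ B

        # Placeholder patch with made-up soft-tissue-like properties.
        K = cst_plane_strain_stiffness([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)], E=60.0, nu=0.45)
        print(K.shape)  # (6, 6)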

    Main propulsion system test requirements for the two-engine Shuttle-C

    The Shuttle-C is an unmanned cargo-carrying derivative of the space shuttle with an option of two or three space shuttle main engines (SSMEs), whereas the shuttle has three SSMEs. Design and operational differences between the Shuttle-C and the shuttle were assessed to determine requirements for additional main propulsion system (MPS) verification testing. Also, reviews were made of the shuttle main propulsion test program objectives and test results and of shuttle flight experience. It was concluded that, if significant MPS modifications are not made beyond those currently planned, then main propulsion system verification can be concluded with an on-pad flight readiness firing.

    Elliptic theory in domains with boundaries of mixed dimension

    Take an open domain $\Omega \subset \mathbb{R}^n$ whose boundary may be composed of pieces of different dimensions. For instance, $\Omega$ can be a ball in $\mathbb{R}^3$, minus one of its diameters $D$, or $\Omega \subset \mathbb{R}^3$ could be a so-called saw-tooth domain, with a boundary consisting of pieces of 1-dimensional curves intercepted by 2-dimensional spheres. Under appropriate geometric assumptions, such as the existence of doubling measures on $\Omega$ and $\partial\Omega$ with appropriate size conditions, we construct a class of degenerate elliptic operators $L$ adapted to the geometry, and establish key estimates of the elliptic theory associated to those operators. This includes boundary Poincaré and Harnack inequalities, the maximum principle, and Hölder continuity of solutions at the boundary. We introduce Hilbert spaces naturally associated to the geometry, construct appropriate trace and extension operators, and use them to define weak solutions to $Lu = 0$. Then we prove De Giorgi-Nash-Moser estimates inside $\Omega$ and on the boundary, solve the Dirichlet problem, and thus construct an elliptic measure $\omega_L$ associated to $L$. At last, we introduce Green functions, and use them to prove a comparison principle. Since our theory emphasizes measures, rather than the geometry per se, the results are new even in the classical setting of a half-plane $\mathbb{R}^2_+$ when the boundary $\partial\mathbb{R}^2_+ = \mathbb{R}$ is equipped with a doubling measure $\mu$ singular with respect to the Lebesgue measure on $\mathbb{R}$. Finally, the present paper provides a generalization of the celebrated Caffarelli-Silvestre extension operator from its classical setting of $\mathbb{R}^{n+1}_+$ to general open sets, and hence, an extension of the concept of fractional Laplacian to Ahlfors regular boundaries and beyond. Comment: 116 pages. In version 2, we completed our theory with Green functions and a comparison principle.
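    For orientation, the classical Caffarelli-Silvestre result that the last sentence generalizes can be stated as follows (standard formulation, with the normalizing constant left unspecified): for $s \in (0,1)$ and suitable $u : \mathbb{R}^n \to \mathbb{R}$, let $U$ solve the degenerate extension problem

        \operatorname{div}\big( t^{\,1-2s}\, \nabla U \big) = 0 \ \text{ in } \mathbb{R}^{n+1}_+ , \qquad U(x,0) = u(x);

    then the fractional Laplacian is recovered as a weighted normal derivative on the boundary,

        (-\Delta)^{s} u(x) = -\, c_{n,s} \lim_{t \to 0^+} t^{\,1-2s}\, \partial_t U(x,t),

    which is exactly the kind of boundary operator the paper extends from the half-space $\mathbb{R}^{n+1}_+$ to general open sets with mixed-dimensional boundaries.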