
    Special section on advances in reachability analysis and decision procedures: contributions to abstraction-based system verification

Reachability analysis asks whether a system can evolve from legitimate initial states to unsafe states. It is thus a fundamental tool in the validation of computational systems - be they software, hardware, or a combination thereof. We recall a standard approach for reachability analysis, which captures the system in a transition system, forms another transition system as an over-approximation, and performs an incremental fixed-point computation on that over-approximation to determine whether unsafe states can be reached. We show this method to be sound for proving the absence of errors, discuss its limitations for proving the presence of errors, and describe some means of addressing those limitations. We then sketch how program annotations for data integrity constraints and interface specifications - as in Bertrand Meyer's paradigm of Design by Contract - can facilitate the validation of modular programs, e.g., by obtaining more precise verification conditions for software verification supported by automated theorem proving. Then we recap how the decision problem of satisfiability for formulae of logics with theories - e.g., bit-vector arithmetic - can be used to construct an over-approximating transition system for a program. Programs with data types comprising bit-vectors of finite width require bespoke decision procedures for satisfiability. Finite-width data types challenge the reduction of that decision problem to one that off-the-shelf tools can solve effectively, e.g., SAT solvers for propositional logic. In that context, we recall the Tseitin encoding, which converts formulae from that logic into conjunctive normal form - the standard format for most SAT solvers - with only a linear blow-up in formula size, at the cost of a linear increase in the number of variables. Finally, we discuss the contributions that the three papers in this special section make in the areas sketched above. © Springer-Verlag 2009
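The Tseitin step above is concrete enough to sketch. The following Python fragment is a minimal illustration (not code from the special section): it encodes a small NOT/AND/OR formula AST into CNF, introducing one fresh variable per gate, so both the clause count and the variable count grow linearly with the formula.

```python
# Minimal Tseitin encoding sketch for formulas built from NOT/AND/OR,
# assuming a tuple-based AST such as ("and", f, g). Literals are signed
# integers in DIMACS style; a negative literal is a negated variable.
import itertools

counter = itertools.count(1)

def tseitin(formula, clauses, var_of):
    """Return a literal equisatisfiable with `formula`, appending defining clauses."""
    if isinstance(formula, str):              # atomic proposition
        if formula not in var_of:
            var_of[formula] = next(counter)
        return var_of[formula]
    op = formula[0]
    if op == "not":                           # negation needs no fresh variable
        return -tseitin(formula[1], clauses, var_of)
    a = tseitin(formula[1], clauses, var_of)
    b = tseitin(formula[2], clauses, var_of)
    v = next(counter)                         # fresh variable for this gate
    if op == "and":                           # v <-> (a & b)
        clauses += [[-v, a], [-v, b], [v, -a, -b]]
    elif op == "or":                          # v <-> (a | b)
        clauses += [[-v, a, b], [v, -a], [v, -b]]
    return v

clauses, var_of = [], {}
root = tseitin(("or", ("and", "p", "q"), ("not", "p")), clauses, var_of)
clauses.append([root])                        # assert the whole formula
print(clauses)                                # CNF, linear in the formula size
```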

    A microcosting study of the surgical correction of upper extremity deformity in children with spastic cerebral palsy

_Objective:_ To determine the healthcare costs of surgical correction of the upper extremity in children with spastic cerebral palsy (CP). _Method:_ This cohort study included 39 children with spastic CP who had surgery on their upper extremity at a Dutch hospital. A retrospective cost analysis was performed, including both hospital and rehabilitation costs. Hospital costs were determined using microcosting methodology; rehabilitation costs were estimated using reference prices. _Results:_ Hospital costs averaged €6813 per child. Labor (50%), overheads (29%), and medical aids (15%) were the main cost drivers. Rehabilitation costs were estimated at €3599 per child. _Conclusions:_ Surgery of the upper extremity is an important contributor to the healthcare costs of children with CP. Our study shows that labor is the most important driver of hospital costs, owing to the multidisciplinary approach and patient-specific treatment plan. A remarkable finding was the substantial size of the rehabilitation costs.

    An Algorithm for Probabilistic Alternating Simulation

In probabilistic game structures, probabilistic alternating simulation (PA-simulation) relations preserve formulas defined in probabilistic alternating-time temporal logic with respect to the behaviour of a subset of players. We propose a partition-based algorithm for computing the largest PA-simulation, which is, to our knowledge, the first such algorithm that works in polynomial time; it extends the generalised coarsest partition problem (GCPP) to a game-based setting with mixed strategies. The algorithm has higher complexity than those in the literature for non-probabilistic simulation and for probabilistic simulation without mixed actions, but it slightly improves the existing result for computing probabilistic simulation with respect to mixed actions.

Comment: We've fixed a problem in the SOFSEM'12 conference version
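To convey the flavour of such fixed-point computations, here is a heavily simplified Python sketch that computes the largest simulation preorder on a plain, non-probabilistic labelled transition system. The paper's PA-simulation over probabilistic game structures with mixed strategies additionally has to match whole distributions at each refinement step (e.g., via linear programming), which this toy deliberately omits.

```python
# Largest simulation preorder on a finite LTS by fixed-point refinement:
# start from all state pairs and delete (s, t) whenever some move of s
# cannot be matched by t; repeat until nothing changes.

def simulates_step(sim, trans, actions, s, t):
    """One-step condition: every a-move of s is matched by some a-move of t."""
    for a in actions:
        for s2 in trans.get((s, a), set()):
            if not any((s2, t2) in sim for t2 in trans.get((t, a), set())):
                return False
    return True

def largest_simulation(states, trans):
    """trans: dict mapping (state, action) -> set of successor states."""
    actions = {a for (_, a) in trans}
    sim = {(s, t) for s in states for t in states}   # optimistic start
    changed = True
    while changed:
        changed = False
        for pair in list(sim):
            if not simulates_step(sim, trans, actions, *pair):
                sim.discard(pair)
                changed = True
    return sim

states = {"s0", "s1", "t0", "t1"}
trans = {("s0", "a"): {"s1"}, ("t0", "a"): {"t1"}, ("t1", "b"): {"t0"}}
print(("s0", "t0") in largest_simulation(states, trans))  # True: t0 simulates s0
print(("t0", "s0") in largest_simulation(states, trans))  # False: s0 lacks the b-move
```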

    Atom gratings produced by large angle atom beam splitters

An asymptotic theory of atom scattering by large-amplitude periodic potentials is developed in the Raman-Nath approximation. The atom grating profile arising after scattering is evaluated in the Fresnel zone for triangular, sinusoidal, magneto-optical, and bichromatic field potentials. It is shown that, owing to the scattering in these potentials, two _groups_ of momentum states are produced rather than two distinct momentum components. The corresponding spatial density profile is calculated and found to differ significantly from a pure sinusoid.

Comment: 16 pages, 7 figures
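For the sinusoidal case, the Raman-Nath regime admits a textbook closed form that is easy to check numerically: a phase imprint exp(iθ cos kx), with pulse area θ = U₀τ/ħ, populates the momentum order n·ħk with probability J_n(θ)². The short Python sketch below uses illustrative parameters and reproduces only this standard result, not the paper's calculation for the other potentials; it shows the populations forming two groups near |n| ≈ θ, consistent with the abstract's observation.

```python
# Raman-Nath diffraction from a sinusoidal potential U(x) = U0*cos(k x):
# after a thin pulse of duration tau, the n-th momentum order carries
# population J_n(theta)^2 with theta = U0*tau/hbar (illustrative value here).
import numpy as np
from scipy.special import jv

theta = 10.0                       # pulse area, chosen in the large-amplitude regime
orders = np.arange(-20, 21)
populations = jv(orders, theta) ** 2

print(populations.sum())           # ~1: unitarity check, sum_n J_n(theta)^2 = 1
# The largest populations cluster near |n| ~ theta, i.e. two *groups* of
# momentum states rather than two sharp components:
print(orders[np.argsort(populations)[-4:]])
```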

    Cosmic histories of star formation and reionization: An analysis with a power-law approximation

With a simple power-law approximation of the high-redshift ($z\gtrsim3.5$) star formation history, i.e., $\dot{\rho}_*(z)\propto[(1+z)/4.5]^{-\alpha}$, we investigate the reionization of the intergalactic medium (IGM) and the consequent Thomson scattering optical depth for cosmic microwave background (CMB) photons. A constraint on the evolution index $\alpha$ is derived from the CMB optical depth measured by the {\it Wilkinson Microwave Anisotropy Probe} (WMAP) experiment, which reads $\alpha\approx2.18\lg\mathscr{N}_{\gamma}-3.89$, where the free parameter $\mathscr{N}_{\gamma}$ is the number of escaped ionizing ultraviolet photons per baryon. Moreover, the redshift of full reionization, $z_f$, can also be expressed as a function of $\alpha$ as well as $\mathscr{N}_{\gamma}$. By further taking into account the implication of the Gunn-Peterson trough observations of quasars for the full-reionization redshift, i.e., $6\lesssim z_f\lesssim7$, we obtain $0.3\lesssim\alpha\lesssim1.3$ and $80\lesssim\mathscr{N}_{\gamma}\lesssim230$. For a typical number of $\sim4000$ ionizing photons released per baryon of normal stars, the fraction of these photons escaping from the stars, $f_{\rm esc}$, can be constrained to the range of (2.0-5.8)%.

Comment: 10 pages, 4 figures, accepted for publication in JCAP
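The quoted ranges are internally consistent and can be verified from the relations in the abstract alone, as this small Python check shows:

```python
# Consistency check using only the abstract's own relations:
# alpha ~ 2.18 * lg(N_gamma) - 3.89, and f_esc = N_gamma / 4000
# for ~4000 ionizing photons released per baryon.
from math import log10

for N_gamma in (80, 230):
    alpha = 2.18 * log10(N_gamma) - 3.89
    f_esc = N_gamma / 4000
    print(f"N_gamma={N_gamma:3d}: alpha={alpha:.2f}, f_esc={100*f_esc:.1f}%")
# N_gamma= 80: alpha=0.26, f_esc=2.0%  -> matches the lower ends 0.3 and 2.0%
# N_gamma=230: alpha=1.26, f_esc=5.8%  -> matches the upper ends 1.3 and 5.8%
```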

    Deterministic and stochastic descriptions of gene expression dynamics

A key goal of systems biology is the predictive mathematical description of gene regulatory circuits. Different approaches are used, such as deterministic and stochastic models, and models that describe cell growth and division explicitly or implicitly. Here we consider simple systems of unregulated (constitutive) gene expression and compare different mathematical descriptions systematically to obtain insight into the errors introduced by common approximations, such as describing cell growth and division by an effective protein degradation term. In particular, we show that the population average of the protein content of a cell exhibits a subtle dependence on the dynamics of growth and division, the specific model for volume growth, and the age structure of the population. Nevertheless, the error made by models with implicit cell growth and division is quite small. Furthermore, we compare various models that are partially stochastic to investigate the impact of different sources of (intrinsic) noise. This comparison indicates that different sources of noise (protein synthesis, partitioning at cell division) contribute comparable amounts of noise if protein synthesis is not or only weakly bursty. If protein synthesis is very bursty, the burstiness is the dominant noise source, independent of other details of the model. Finally, we discuss two sources of extrinsic noise: cell-to-cell variations in protein content due to cells being at different stages in the division cycle, which we show to be small (for the protein concentration and, surprisingly, also for the protein copy number per cell), and fluctuations in the growth rate, which can have a significant impact.

Comment: 23 pages, 5 figures; Journal of Statistical Physics (2012)
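As a concrete illustration of the "effective degradation" approximation discussed above, the sketch below (with illustrative rates, not the paper's parameters) compares the deterministic steady state of constitutive expression with a Gillespie simulation of the corresponding birth-death process. Both settle near k/(γ+λ); the stochastic trace fluctuates around that mean.

```python
# Constitutive expression with growth/division folded into an effective loss term:
#   deterministic model:  dp/dt = k - (gamma + lam) * p
#   stochastic model:     birth at rate k, death at rate (gamma + lam) * p (Gillespie SSA)
# Rates are illustrative only.
import random

k, gamma, lam = 20.0, 0.1, 0.02            # synthesis, degradation, dilution (1/min)
loss = gamma + lam

print("deterministic steady state:", k / loss)   # p* = k/(gamma+lam) ~ 166.7

random.seed(1)
t, p = 0.0, 0
acc = t_acc = 0.0                           # time-weighted average accumulators
while t < 5000.0:
    total = k + loss * p                    # total event rate (birth + death)
    dt = random.expovariate(total)          # waiting time to the next event
    if t > 500.0:                           # skip the initial transient
        acc += p * dt
        t_acc += dt
    t += dt
    if random.random() < k / total:         # choose which event fired
        p += 1
    else:
        p -= 1
print("stochastic time average:", acc / t_acc)
```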

    Sharp Trace Hardy-Sobolev-Maz'ya Inequalities and the Fractional Laplacian

In this work we establish trace Hardy and trace Hardy-Sobolev-Maz'ya inequalities with best Hardy constants, for domains satisfying suitable geometric assumptions such as mean convexity or convexity. We then use them to produce fractional Hardy-Sobolev-Maz'ya inequalities with best Hardy constants for various fractional Laplacians. In the case where the domain is the half space, our results cover the full range $s \in (0,1)$ of the exponent of the fractional Laplacians. In particular, we answer an open problem raised by Frank and Seiringer \cite{FS}.

Comment: 42 pages
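As background (a standard identification, not a result of this paper): the Caffarelli-Silvestre extension is the usual bridge by which trace inequalities on the half space yield inequalities for the fractional Laplacian, sketched here in LaTeX.

```latex
% Caffarelli--Silvestre extension: the fractional Laplacian on $\mathbb{R}^n$
% is recovered from a degenerate local problem on the half space
% $\mathbb{R}^{n+1}_+ = \mathbb{R}^n \times (0,\infty)$, so weighted trace
% inequalities for $U$ transfer to fractional inequalities for $u$.
\begin{align*}
  \operatorname{div}\!\left(y^{1-2s}\,\nabla U\right) &= 0
      && \text{in } \mathbb{R}^{n+1}_+,\\
  U(x,0) &= u(x) && \text{on } \mathbb{R}^n,\\
  (-\Delta)^s u(x) &= -\,c_{n,s}\lim_{y\to 0^+} y^{1-2s}\,\partial_y U(x,y),
      && s \in (0,1).
\end{align*}
```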

    Automated mechanism design for B2B e-commerce models

Business-to-business electronic marketplaces (B2B e-Marketplaces) have been in the limelight since 1999, with the commercialisation of the Internet and the subsequent "dot.com" boom [1]. The literature documents the growth of B2B sectors across all industries, and the B2B e-Marketplace is one of the sectors that has witnessed a rapid increase. Consequently, developing a B2B e-Commerce model that improves the value chain in B2B exchanges is extremely important for exposing SMEs to the world marketplace. This study has three research objectives (ROs): first (RO1), to critically review the concepts of the B2B e-Marketplace, including its technologies, operations, business relationships, and functionalities; second (RO2), to design an automated mechanism for a B2B e-Marketplace for small to medium-sized enterprises (SMEs); and third (RO3), to propose a conceptual B2B e-Commerce model for SMEs. The proposed model is constructed from the analytical findings obtained from the contemporary B2B e-Marketplace literature.

    An Integrated TCGA Pan-Cancer Clinical Data Resource to Drive High-Quality Survival Outcome Analytics

For a decade, The Cancer Genome Atlas (TCGA) program collected clinicopathologic annotation data along with multi-platform molecular profiles of more than 11,000 human tumors across 33 different cancer types. TCGA clinical data contain key features representing the democratized nature of the data collection process. To ensure proper use of this large clinical dataset associated with genomic features, we developed a standardized dataset named the TCGA Pan-Cancer Clinical Data Resource (TCGA-CDR), which includes four major clinical outcome endpoints. In addition to detailing major challenges and statistical limitations encountered during the effort of integrating the acquired clinical data, we present a summary that includes endpoint usage recommendations for each cancer type. These TCGA-CDR findings appear to be consistent with cancer genomics studies independent of the TCGA effort and provide opportunities for investigating cancer biology using clinical correlates at an unprecedented scale. In short, analysis of clinicopathologic annotations for over 11,000 cancer patients in the TCGA program led to the generation of the TCGA Clinical Data Resource, which provides recommendations for clinical outcome endpoint usage across 33 cancer types.
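A hedged sketch of how such a resource might be consumed with pandas is shown below. The filename is hypothetical, and the column names reflect the published table's convention as best understood (the four major endpoints OS, DSS, DFI, and PFI, i.e., overall survival, disease-specific survival, disease-free interval, and progression-free interval, each paired with a follow-up-time column).

```python
# Illustrative use of the TCGA-CDR table; "TCGA-CDR.xlsx" is a hypothetical
# local copy of the resource, and column names are assumptions noted above.
import pandas as pd

cdr = pd.read_excel("TCGA-CDR.xlsx")

endpoints = ["OS", "DSS", "DFI", "PFI"]        # event indicators (1 = event occurred)
times = [e + ".time" for e in endpoints]       # matching follow-up times

# Example: per-cancer-type event counts and median follow-up for the
# progression-free interval endpoint.
summary = (cdr.groupby("type")
              .agg(events=("PFI", "sum"), median_days=("PFI.time", "median")))
print(summary.head())
```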