Special section on advances in reachability analysis and decision procedures: contributions to abstraction-based system verification
Reachability analysis asks whether a system can evolve from legitimate initial states to unsafe states. It is thus a fundamental tool in the validation of computational systems - be they software, hardware, or a combination thereof. We recall a standard approach for reachability analysis, which captures the system in a transition system, forms another transition system as an over-approximation, and performs an incremental fixed-point computation on that over-approximation to determine whether unsafe states can be reached. We show this method to be sound for proving the absence of errors, discuss its limitations for proving the presence of errors, and present some means of addressing those limitations. We then sketch how program annotations for data-integrity constraints and interface specifications - as in Bertrand Meyer's paradigm of Design by Contract - can facilitate the validation of modular programs, e.g., by yielding more precise verification conditions for software verification supported by automated theorem proving. We then recap how the decision problem of satisfiability for formulae of logics with theories - e.g., bit-vector arithmetic - can be used to construct an over-approximating transition system for a program. Programs whose data types are bit-vectors of finite width require bespoke decision procedures for satisfiability, and such finite-width data types complicate the reduction of that decision problem to one that off-the-shelf tools, e.g., SAT solvers for propositional logic, can solve effectively. In that context, we recall the Tseitin encoding, which converts propositional formulae into conjunctive normal form - the standard input format of most SAT solvers - with only linear blow-up in the size of the formula, at the cost of a linear increase in the number of variables. Finally, we discuss the contributions that the three papers in this special section make in the areas sketched above. © Springer-Verlag 2009
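To make the Tseitin step concrete, here is a minimal sketch in Python. The tuple-based formula representation and all function names are our own illustrative choices, not taken from any particular tool; real encoders handle more connectives and exploit subformula sharing.

```python
import itertools

def to_cnf(formula):
    """Tseitin-encode `formula` into a list of CNF clauses (DIMACS-style
    integer literals). Formulas are nested tuples ('and', f, g),
    ('or', f, g), ('not', f), or strings naming input variables."""
    fresh = itertools.count(1)
    var_of, clauses = {}, []

    def encode(f):
        if isinstance(f, str):                    # input variable
            if f not in var_of:
                var_of[f] = next(fresh)
            return var_of[f]
        if f[0] == 'not':
            return -encode(f[1])                  # negate the literal
        a, b = encode(f[1]), encode(f[2])
        x = next(fresh)                           # fresh variable for this node
        if f[0] == 'and':                         # clauses for x <-> (a AND b)
            clauses.extend([[-x, a], [-x, b], [x, -a, -b]])
        else:                                     # clauses for x <-> (a OR b)
            clauses.extend([[-x, a, b], [x, -a], [x, -b]])
        return x

    clauses.append([encode(formula)])             # assert the root formula
    return clauses

# One fresh variable and at most three clauses per connective, so the
# CNF is linear in the size of the input formula.
print(to_cnf(('or', 'p', ('not', ('and', 'p', 'q')))))
```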
A microcosting study of the surgical correction of upper extremity deformity in children with spastic cerebral palsy
_Objective:_ Determine healthcare costs of upper-extremity surgical correction in children with spastic cerebral palsy (CP).
_Method:_ This cohort study included 39 children with spastic CP who had surgery for their upper extremity at a Dutch hospital. A retrospective cost analysis was performed including both hospital and rehabilitation costs. Hospital costs were determined using microcosting methodology. Rehabilitation costs were estimated using reference prices.
_Results:_ Hospital costs averaged €6813 per child. Labor (50%), overheads (29%), and medical aids (15%) were important cost drivers. Rehabilitation costs were estimated at €3599 per child.
_Conclusions:_ Surgery of the upper extremity is an important contributor to the healthcare costs of children with CP. Our study shows that labor is the most important driver of hospital costs, owing to the multidisciplinary approach and patient-specific treatment plan. A remarkable finding was the substantial size of the rehabilitation costs.
An Algorithm for Probabilistic Alternating Simulation
In probabilistic game structures, probabilistic alternating simulation (PA-simulation) relations preserve formulas defined in probabilistic alternating-time temporal logic with respect to the behaviour of a subset of players. We propose a partition-based algorithm for computing the largest PA-simulation which, to our knowledge, is the first such algorithm that works in polynomial time, obtained by extending the generalised coarsest partition problem (GCPP) to a game-based setting with mixed strategies. The algorithm has higher complexity than those in the literature for non-probabilistic simulation and for probabilistic simulation without mixed actions, but slightly improves the existing result for computing probabilistic simulation with respect to mixed actions. Comment: We've fixed a problem in the SOFSEM'12 conference version.
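The probabilistic, game-based construction in the paper is considerably more involved, but the underlying greatest-fixed-point idea can be sketched on a plain transition system: start from the full relation and discard pairs whose moves cannot be matched. The following Python sketch shows only this classical refinement loop, not the authors' GCPP-based algorithm.

```python
def largest_simulation(states, succ):
    """Naive greatest fixed point for (non-probabilistic) simulation:
    start with all pairs and remove (s, t) whenever some successor of s
    cannot be matched by any successor of t that is still related."""
    rel = {(s, t) for s in states for t in states}
    changed = True
    while changed:
        changed = False
        for s, t in list(rel):
            if not all(any((s2, t2) in rel for t2 in succ[t])
                       for s2 in succ[s]):
                rel.discard((s, t))
                changed = True
    return rel

# Tiny example: state 'a' loops, state 'b' has no moves, so 'a' is not
# simulated by 'b', while 'b' is simulated by everything.
print(largest_simulation({'a', 'b'}, {'a': ['a'], 'b': []}))
```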
Atom gratings produced by large angle atom beam splitters
An asymptotic theory of atom scattering by large-amplitude periodic potentials is developed in the Raman-Nath approximation. The atom grating profile arising after scattering is evaluated in the Fresnel zone for triangular, sinusoidal, magneto-optical, and bichromatic field potentials. It is shown that, owing to the scattering in these potentials, two _groups_ of momentum states are produced rather than two distinct momentum components. The corresponding spatial density profile is calculated and found to differ significantly from a pure sinusoid. Comment: 16 pages, 7 figures.
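For orientation, the textbook Raman-Nath result for a sinusoidal phase grating (a standard fact, not the large-amplitude asymptotics derived in the paper; the symbols are ours) shows where the discrete momentum components come from, via the Jacobi-Anger expansion:

\[ \psi(x) \propto e^{i\theta\cos(kx)} = \sum_{n=-\infty}^{\infty} i^{n} J_{n}(\theta)\, e^{inkx}. \]

The amplitude is distributed over momentum components \(n\hbar k\) with weights \(J_{n}(\theta)\); for a large pulse area \(\theta\) these weights pile up near \(n \approx \pm\theta\), which is why one expects two groups of momentum states rather than two sharp components.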
The impact of overseas training on curriculum innovation and change in English language education in Western China
This article assesses the impact of a UK-based professional development programme on curriculum innovation and change in English Language Education (ELE) in Western China. Based on interviews, focus group discussions and observation of a total of 48 English teachers who had participated in an overseas professional development programme influenced by modern approaches to education and ELE, and 9 of their colleagues who had not taken part, it assesses the uptake of new approaches on teachers’ return to China. Interviews with 10 senior managers provided supplementary data. Using Diffusion of Innovations Theory as the conceptual framework, we examine those aspects of the Chinese situation that are supportive of change and those that constrain innovation. We offer evidence of innovation in classroom practice on the part of returnees and ‘reinvention’ of the innovation to ensure a better fit with local needs. The key role of course participants as opinion leaders in the diffusion of new ideas is also explored. We conclude that the selective uptake of this innovation is under way and likely to be sustained against a background of continued curriculum reform in China.
Cosmic histories of star formation and reionization: An analysis with a power-law approximation
With a simple power-law approximation of the high-redshift star formation history, we investigate the reionization of the intergalactic medium (IGM) and the consequent Thomson scattering optical depth for cosmic microwave background (CMB) photons. A constraint on the evolution index of the power law is derived from the CMB optical depth measured by the _Wilkinson Microwave Anisotropy Probe_ (WMAP) experiment, with the number of escaped ionizing ultraviolet photons per baryon as a free parameter. Moreover, the redshift for full reionization can also be expressed as a function of these two quantities. By further taking into account the implication of the Gunn-Peterson troughs observed towards quasars for the full reionization redshift, we obtain joint constraints on the evolution index and the photon budget. For a typical number of ionizing photons released per baryon of normal stars, the fraction of these photons escaping from the stars can be constrained. Comment: 10 pages, 4 figures, accepted for publication in JCAP.
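For context, the link between a reionization history and the CMB constraint used above is the standard Thomson optical depth integral (a textbook relation; the symbols are ours, not the paper's notation):

\[ \tau = c\,\sigma_{\rm T} \int_{0}^{z_{\rm max}} \frac{n_e(z)}{(1+z)\,H(z)}\, dz, \]

where \(n_e(z)\) is the mean free-electron density set by the ionized fraction, \(\sigma_{\rm T}\) the Thomson cross-section, and \(H(z)\) the Hubble rate; a faster-evolving star formation history ionizes the IGM earlier and raises \(\tau\).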
Deterministic and stochastic descriptions of gene expression dynamics
A key goal of systems biology is the predictive mathematical description of gene regulatory circuits. Different approaches are used, such as deterministic and stochastic models, or models that describe cell growth and division explicitly or implicitly. Here we consider simple systems of unregulated (constitutive) gene expression and compare different mathematical descriptions systematically to obtain insight into the errors that are introduced by various common approximations, such as describing cell growth and division by an effective protein degradation term. In particular, we show that the population average of the protein content of a cell exhibits a subtle dependence on the dynamics of growth and division, the specific model for volume growth, and the age structure of the population. Nevertheless, the error made by models with implicit cell growth and division is quite small. Furthermore, we compare various models that are partially stochastic to investigate the impact of different sources of (intrinsic) noise. This comparison indicates that different sources of noise (protein synthesis, partitioning in cell division) contribute comparable amounts of noise if protein synthesis is not or only weakly bursty. If protein synthesis is very bursty, the burstiness is the dominant noise source, independent of other details of the model. Finally, we discuss two sources of extrinsic noise: cell-to-cell variations in protein content due to cells being at different stages in the division cycle, which we show to be small (for the protein concentration and, surprisingly, also for the protein copy number per cell), and fluctuations in the growth rate, which can have a significant impact. Comment: 23 pages, 5 figures; Journal of Statistical Physics (2012).
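The implicit treatment of growth and division mentioned above usually means folding dilution into an effective degradation rate (a standard modelling step; the symbols below are our own, not the paper's):

\[ \frac{dp}{dt} = k - (\gamma + \mu)\,p, \qquad \mu = \frac{\ln 2}{T_{\rm div}}, \qquad p^{*} = \frac{k}{\gamma + \mu}, \]

where \(k\) is the synthesis rate, \(\gamma\) the true degradation rate, and \(\mu\) the dilution rate due to exponential volume growth with doubling time \(T_{\rm div}\). The abstract's point is that the error of this approximation, relative to models with explicit growth and division, is small but depends subtly on the age structure of the population.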
Sharp Trace Hardy-Sobolev-Maz'ya Inequalities and the Fractional Laplacian
In this work we establish trace Hardy and trace Hardy-Sobolev-Maz'ya inequalities with best Hardy constants for domains satisfying suitable geometric assumptions such as mean convexity or convexity. We then use them to produce fractional Hardy-Sobolev-Maz'ya inequalities with best Hardy constants for various fractional Laplacians. In the case where the domain is the half space, our results cover the full range of the exponent of the fractional Laplacians. In particular, we answer an open problem raised by Frank and Seiringer [FS]. Comment: 42 pages.
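As a point of reference, the classical (non-trace) prototype of such results is the following well-known Hardy inequality, stated here only for context and not as one of the paper's new theorems: for a convex domain \(\Omega \subset \mathbb{R}^{n}\),

\[ \int_{\Omega} |\nabla u|^{2}\, dx \;\geq\; \frac{1}{4} \int_{\Omega} \frac{u^{2}}{d(x)^{2}}\, dx, \qquad d(x) = \operatorname{dist}(x, \partial\Omega), \]

for all \(u \in C_{c}^{\infty}(\Omega)\), with best constant \(1/4\). The trace and fractional versions treated in the paper replace the gradient energy by boundary traces and fractional Laplacians while keeping the best-constant question in view.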
Automated mechanism design for B2B e-commerce models
Business-to-business electronic marketplaces (B2B e-Marketplaces) have been in the limelight since 1999, with the commercialisation of the Internet and the subsequent “dot.com” boom [1]. The literature indicates growth of the B2B sector across all industries, and the B2B e-Marketplace is one of the segments that have witnessed a rapid increase. Consequently, developing a B2B e-Commerce model that improves the value chain in B2B exchanges is extremely important for SMEs seeking exposure to the world marketplace. There are three research objectives (ROs) in this study: first (RO1), to critically review the concepts of the B2B e-Marketplace, including its technologies, operations, business relationships and functionalities; second (RO2), to design an automated mechanism for a B2B e-Marketplace for Small to Medium-sized Enterprises (SMEs); and third (RO3), to propose a conceptual B2B e-Commerce model for SMEs. The proposed model is constructed from the analytical findings obtained from the contemporary B2B e-Marketplace literature.
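The abstract does not specify the automated mechanism of RO2. As a purely hypothetical illustration of one building block such a mechanism might use, here is a minimal sealed-bid double-auction clearing rule in Python; all names and pricing rules are our assumptions, not the paper's design.

```python
def clear_double_auction(bids, asks):
    """Match buyers with sellers; each matched pair trades at the
    midpoint of bid and ask. `bids` and `asks` are (trader_id, price)."""
    bids = sorted(bids, key=lambda b: -b[1])   # highest bid first
    asks = sorted(asks, key=lambda a: a[1])    # lowest ask first
    trades = []
    for (buyer, bid), (seller, ask) in zip(bids, asks):
        if bid < ask:                          # no further gains from trade
            break
        trades.append((buyer, seller, (bid + ask) / 2))
    return trades

# Example: two trades clear; the third bid (4) is below the third ask (9).
print(clear_double_auction(
    bids=[("b1", 10), ("b2", 8), ("b3", 4)],
    asks=[("s1", 5), ("s2", 7), ("s3", 9)]))
```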
An Integrated TCGA Pan-Cancer Clinical Data Resource to Drive High-Quality Survival Outcome Analytics
For a decade, The Cancer Genome Atlas (TCGA) program collected clinicopathologic annotation data along with multi-platform molecular profiles of more than 11,000 human tumors across 33 different cancer types. TCGA clinical data contain key features representing the democratized nature of the data-collection process. To ensure proper use of this large clinical dataset associated with genomic features, we developed a standardized dataset named the TCGA Pan-Cancer Clinical Data Resource (TCGA-CDR), which includes four major clinical outcome endpoints. In addition to detailing major challenges and statistical limitations encountered during the effort of integrating the acquired clinical data, we present a summary that includes endpoint usage recommendations for each cancer type. These TCGA-CDR findings appear to be consistent with cancer genomics studies independent of the TCGA effort and provide opportunities for investigating cancer biology using clinical correlates at an unprecedented scale. Analysis of clinicopathologic annotations for over 11,000 cancer patients in the TCGA program led to the generation of the TCGA Clinical Data Resource, which provides recommendations on clinical outcome endpoint usage for 33 cancer types.
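To illustrate how such an endpoint resource is typically consumed, here is a minimal, hypothetical sketch using the open-source lifelines library. The file name, the cancer-type filter, and the endpoint columns ("PFI"/"PFI.time", following the published TCGA-CDR convention) are assumptions for illustration, not something this summary prescribes.

```python
# Sketch: Kaplan-Meier estimate for one endpoint of one cancer type,
# assuming a local copy of a TCGA-CDR-style table.
import pandas as pd
from lifelines import KaplanMeierFitter

cdr = pd.read_excel("TCGA-CDR.xlsx")           # hypothetical local file
brca = cdr[cdr["type"] == "BRCA"].dropna(subset=["PFI", "PFI.time"])

kmf = KaplanMeierFitter()
kmf.fit(durations=brca["PFI.time"],            # days to event or censoring
        event_observed=brca["PFI"])            # 1 = progression event
print(kmf.median_survival_time_)
```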
