
    The severity of pandemic H1N1 influenza in the United States, from April to July 2009: A Bayesian analysis

    Background: Accurate measures of the severity of pandemic (H1N1) 2009 influenza (pH1N1) are needed to assess the likely impact of an anticipated resurgence in the autumn in the Northern Hemisphere. Severity has been difficult to measure because jurisdictions with large numbers of deaths and other severe outcomes have had too many cases to assess the total number with confidence. Also, detection of severe cases may be more likely, resulting in overestimation of the severity of an average case. We sought to estimate the probabilities that symptomatic infection would lead to hospitalization, ICU admission, and death by combining data from multiple sources.

    Methods and Findings: We used complementary data from two US cities: Milwaukee attempted to identify cases of medically attended infection whether or not they required hospitalization, while New York City focused on the identification of hospitalizations, intensive care admission or mechanical ventilation (hereafter, ICU), and deaths. New York data were used to estimate numerators for ICU and death, and two sources of data - medically attended cases in Milwaukee or self-reported influenza-like illness (ILI) in New York - were used to estimate ratios of symptomatic cases to hospitalizations. Combining these data with estimates of the fraction detected for each level of severity, we estimated the proportion of symptomatic patients who died (symptomatic case-fatality ratio, sCFR), required ICU admission (sCIR), and required hospitalization (sCHR), overall and by age category. Evidence, prior information, and associated uncertainty were analyzed in a Bayesian evidence synthesis framework. Using medically attended cases and estimates of the proportion of symptomatic cases medically attended, we estimated an sCFR of 0.048% (95% credible interval [CI] 0.026%-0.096%), sCIR of 0.239% (0.134%-0.458%), and sCHR of 1.44% (0.83%-2.64%). Using self-reported ILI, we obtained estimates approximately 7-9 times lower. sCFR and sCIR appear to be highest in persons aged 18 y and older, and lowest in children aged 5-17 y. sCHR appears to be lowest in persons aged 5-17 y; our data were too sparse to allow us to determine the group in which it was the highest.

    Conclusions: These estimates suggest that an autumn-winter pandemic wave of pH1N1 with comparable severity per case could lead to a number of deaths ranging from considerably below that associated with seasonal influenza to slightly higher, but with the greatest impact in children aged 0-4 y and adults aged 18-64 y. These estimates of impact depend on assumptions about the total incidence of infection and would be larger if the incidence of symptomatic infection were higher or shifted toward adults, if viral virulence increased, or if suboptimal treatment resulted from stress on the health care system; numbers would decrease if the total proportion of the population symptomatically infected were lower than assumed.
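    The core quantity here is a ratio of two scaled-up counts: observed severe outcomes divided by their detection probability, over observed milder cases divided by the probability that a symptomatic case is medically attended and then captured. The sketch below illustrates that idea with a simple Monte Carlo calculation; the counts and the Beta priors on the detection fractions are invented for illustration and are not the paper's data or its full evidence-synthesis model.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000  # Monte Carlo draws

        # Hypothetical inputs (NOT the paper's data): confirmed deaths and
        # medically attended cases observed in one surveillance window.
        deaths_observed = 47
        medically_attended_observed = 3_000

        # Uncertain detection fractions, expressed as assumed Beta priors.
        p_detect_death = rng.beta(40, 10, N)        # most deaths are detected
        p_medically_attended = rng.beta(20, 20, N)  # ~half of symptomatic cases seek care
        p_detect_attended = rng.beta(15, 35, N)     # fraction of attended cases captured

        # Scale observed counts up to true totals, then form the ratio.
        true_deaths = deaths_observed / p_detect_death
        true_symptomatic = medically_attended_observed / (p_detect_attended * p_medically_attended)
        sCFR = true_deaths / true_symptomatic

        lo, med, hi = np.percentile(sCFR, [2.5, 50, 97.5])
        print(f"sCFR median {med:.4%}, 95% interval ({lo:.4%}, {hi:.4%})")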

    Comparing propene polymerization with 1-butene polymerization catalyzed by MAO-activated C2- and C1-symmetric zirconocenes: An experimental and computational study on the influence of olefin size on stereoselectivity

    Polypropene and poly(1-butene) have been synthesized under very similar experimental conditions with a series of MAO-activated C2-symmetric and C1-symmetric ansa-zirconocenes. The C1-symmetric zirconocenes bearing the bilaterally symmetric fluorenyl or bis(2-methylthieno)cyclopentadienyl ligand connected through a dimethylsilyl bridge to substituted indenyl ligands produce isotactic polybutene of similar or higher molecular mass and with noticeably higher isotacticity, compared to isotactic polypropene prepared with the same catalysts under comparable conditions. Structural and mechanistic reasons for such behavior are discussed on the basis of QM/MM calculations.

    Verification and Synthesis of Counters based on Symbolic Techniques

    Binary Decision Diagrams and Symbolic Techniques have undergone major improvements in the last few years, but extending the applicability of reachability analysis to new fields is still a key issue. A major limitation of standard Symbolic Traversal is posed by Finite State Machines with a very high sequential depth; counters are a typical example of this behaviour. On the other hand, systems containing counters, e.g. embedded systems, are of great practical importance in several fields. Among the techniques introduced to better deal with "pure" counters, iterative squaring plays an important role, because it can produce solutions in execution time logarithmic in the sequential depth. Some drawbacks usually limit the application of this technique to more general circuits, but we successfully tailored iterative squaring to allow its application to the symbolic verification and synthesis of circuits containing counters. Experiments on large and complex home-made and industrial circuits containing counters show the feasibility of the approach.
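    Iterative squaring reaches a fixed point in a number of iterations logarithmic in the sequential depth because each step composes the accumulated relation with itself, doubling the maximum path length it covers. The sketch below shows only that iteration structure on a modulo-256 counter; plain Python sets of state pairs stand in for the BDD-represented transition relation a real symbolic tool would use.

        def compose(r1, r2):
            """Relational composition: {(a, c) | (a, b) in r1 and (b, c) in r2}."""
            by_src = {}
            for b, c in r2:
                by_src.setdefault(b, set()).add(c)
            return {(a, c) for a, b in r1 for c in by_src.get(b, ())}

        def transitive_closure_by_squaring(relation):
            """Cover paths of any length by repeatedly squaring the relation."""
            closure = set(relation)
            iterations = 0
            while True:
                new = closure | compose(closure, closure)
                iterations += 1
                if new == closure:
                    return closure, iterations
                closure = new

        # A 256-state modulo counter: sequential depth 255, but only ~8 squarings.
        counter = {(s, (s + 1) % 256) for s in range(256)}
        closure, squarings = transitive_closure_by_squaring(counter)
        print(f"states reachable from 0: {len({t for s, t in closure if s == 0})}, "
              f"squaring iterations: {squarings}")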

    Formal verification of microprogrammed architectures

    The paper presents the application of formal verification techniques to a real microprocessor. The device is described and verified by resorting to a functional model. A methodology that can be applied in a wide range of cases is also presented.

    Efficient Computation of Timed Transition Relations

    Finite State Machines (FSMs) are a convenient model for the specification, analysis and synthesis of the control part of electronic systems. State traversal techniques have been developed to verify properties such as equivalence, reachability and so on for an FSM model. However, those techniques can be very expensive when applied in a synthesis environment, especially when the behavior involves long counting sequences. In this paper we address the problem of efficiently computing silent paths in an FSM. These paths are characterized by no observable activity under constant inputs. They can be used for a variety of applications, from verification to synthesis to simulation. In particular, we describe a new approach to computing the Timed Transition Relation of an FSM and discuss a set of promising experimental results in which Timed Transition Relations are built.
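    Informally, a silent path starts in a state, holds the inputs constant, and keeps stepping the machine as long as the outputs do not change; a timed transition relation records how many steps elapse before an observable change, together with the state reached. The toy Mealy machine below is hypothetical and uses explicit dictionaries rather than the symbolic encoding discussed in the paper; it only illustrates what one entry of such a relation contains.

        # Hypothetical 4-state Mealy FSM:
        # delta[(state, input)] -> next state, lam[(state, input)] -> output
        delta = {('s0', 0): 's1', ('s1', 0): 's2', ('s2', 0): 's3', ('s3', 0): 's3',
                 ('s0', 1): 's0', ('s1', 1): 's0', ('s2', 1): 's0', ('s3', 1): 's0'}
        lam = {('s0', 0): 0, ('s1', 0): 0, ('s2', 0): 0, ('s3', 0): 1,
               ('s0', 1): 1, ('s1', 1): 1, ('s2', 1): 1, ('s3', 1): 1}

        def timed_transition(state, const_input):
            """Return (steps, landing state): silent steps under a constant input
            until the output changes or a state repeats (cycle detected)."""
            out0 = lam[(state, const_input)]
            steps, seen = 0, set()
            while lam[(state, const_input)] == out0 and state not in seen:
                seen.add(state)
                state = delta[(state, const_input)]
                steps += 1
            return steps, state

        for s in ('s0', 's1', 's2', 's3'):
            print(s, 'input=0 ->', timed_transition(s, 0))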

    Functional approaches applied to microprogrammed architectures

    Zero-defect designs are the goal of current research activities in CAD. An increasingly popular avenue of attack is formal verification of hardware correctness. Formal verification techniques have been applied widely in the past, but most of the examples were either simple or verification-oriented. This paper presents the application of such techniques to a real microprogrammed microprocessor, the MTI. Some abstract views of the device are identified, and the upper levels, namely, the machine instruction and microprogram levels, are described and verified within a functional framework. A methodology that is general and applicable to state-of-the-art microprogrammed architectures is a by-product of this work.
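    The essence of the functional approach is that each abstraction level is modelled as a function on machine state, and the lower level is shown to realize the upper one once internal registers are projected away. The sketch below illustrates this on a single ADD instruction of a toy 8-bit machine; the register names and the micro-operation sequence are invented for illustration and are not taken from the MTI.

        MASK = 0xFF  # 8-bit architectural registers for the toy model

        def spec_add(state):
            """Machine-instruction-level specification: R0 <- R0 + R1 (mod 256)."""
            s = dict(state)
            s['R0'] = (s['R0'] + s['R1']) & MASK
            return s

        def microprogram_add(state):
            """Microprogram-level model: the same instruction as micro-operations."""
            s = dict(state)
            s['A'] = s['R0']                     # micro-op 1: load operand into latch A
            s['B'] = s['R1']                     # micro-op 2: load operand into latch B
            s['ALU'] = (s['A'] + s['B']) & MASK  # micro-op 3: ALU add
            s['R0'] = s['ALU']                   # micro-op 4: write back
            return s

        def architectural(state):
            """Project away internal latches; compare architectural registers only."""
            return {k: v for k, v in state.items() if k in ('R0', 'R1')}

        # Exhaustive check over all architectural states of the toy machine.
        for r0 in range(256):
            for r1 in range(256):
                init = {'R0': r0, 'R1': r1, 'A': 0, 'B': 0, 'ALU': 0}
                assert architectural(spec_add(init)) == architectural(microprogram_add(init))
        print("microprogram ADD matches the instruction-level specification on all states")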

    Crystallization behavior and mechanical properties of regiodefective, highly stereoregular isotactic polypropylene: effect of regiodefects versus stereodefects and influence of the molecular mass

    Highly isotactic polypropylene samples, containing a very low amount of rr stereodefects (0.1-0.2%) and a slightly higher concentration of regiodefects (0.8-0.9% of 2,1 erythro units), with different molecular masses, have been prepared with an isospecific but not fully regioselective metallocene catalyst. The effects of rr stereodefects, of 2,1 regiodefects, and of the molecular mass on the mechanical properties and crystallization behavior of polypropylene have been analyzed. Samples containing 2,1 regiodefects are very stiff and much more fragile than the samples containing only rr stereodefects. The presence of rr defects, even at low concentrations, instead produces an increase in ductility and an improvement in drawability at room temperature. The different effect of stereo- and regiodefects is probably related to their different levels of inclusion inside the crystalline phase. The uniform inclusion of rr defects in the crystals makes the stereodefective and regioregular samples more homogeneous, with crystalline and amorphous phases of the same composition. The study of the crystallization behavior has shown that the molecular mass strongly influences the amount of alpha and gamma forms that crystallize from the melt. Samples containing regiodefects and with high molecular masses (higher than 200000) show the same amount of gamma form and the same crystallization rate. A lower amount of gamma form and higher crystallization rates are instead observed for the sample having a similar microstructure but lower molecular mass. This indicates that the crystallization of the gamma form is favored over the alpha form when the crystallization is slow, that is, for samples with high molecular mass. When the molecular mass is lower than 100000, the crystallization is faster and the formation of the alpha form is kinetically favored. The comparison with regioregular samples containing only rr stereodefects has shown that the effects of rr stereodefects and 2,1 regiodefects on the crystallization properties of polypropylene are very similar, at least when the concentration of defects is small (1% of rr stereodefects or 2,1 erythro units). Both defects produce a shortening of the regular crystallizable isotactic and regioregular sequences, inducing crystallization of the gamma form.

    Speeding up model checking by exploiting explicit and hidden verification constraints

    Constraints represent a key component of state-of-the-art verification tools based on compositional approaches and assume-guarantee reasoning. In recent years, most of the research efforts on verification constraints have focused on defining formats and techniques to encode, or to synthesize, constraints starting from the specification of the design. In this paper, we analyze the impact of constraints on the performance of model checking tools, and we discuss how to effectively exploit them. We also introduce an approach to explicitly derive verification constraints hidden in the design and/or in the property under verification. Such constraints may simply come from true design constraints, embedded within the properties, or may be generated in the general effort to reduce or partition the state space. Experimental results show that, in both cases, we can reap benefits for the overall verification process in several hard-to-solve designs, where we obtain speed-ups of more than one order of magnitude.
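    A verification constraint restricts the states a model checker has to consider; a constraint "hidden" in the design is one that every legal behaviour already satisfies, so making it explicit loses nothing and prunes the search. The toy example below is hypothetical (a 4-bit one-hot token ring explored with an explicit-state reachability loop rather than a symbolic or SAT-based engine) and only illustrates the pruning effect of applying such a constraint.

        from itertools import product

        def step(state):
            """Rotate the 4-bit token ring by one position."""
            return (state[-1],) + state[:-1]

        def one_hot(state):
            """Hidden design constraint: exactly one bit is set."""
            return sum(state) == 1

        def reachable(inits, constraint=lambda s: True):
            """Explicit reachability restricted to states satisfying the constraint."""
            frontier = {s for s in inits if constraint(s)}
            seen = set(frontier)
            while frontier:
                frontier = {step(s) for s in frontier if constraint(s)} - seen
                seen |= frontier
            return seen

        all_states = set(product((0, 1), repeat=4))
        print("states explored without constraint:", len(reachable(all_states)))
        print("states explored with constraint:   ", len(reachable(all_states, one_hot)))

        # Property check: the token never disappears in any reachable constrained state.
        assert all(one_hot(s) for s in reachable(all_states, one_hot))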