
    A Comparison of Risk Evaluation in Emergency Medical Services Helicopter Operation Regulations

    This study compares Helicopter Emergency Medical Services (HEMS) operations under US Federal Aviation Regulations (FARs) and under European Joint Aviation Regulations Operations Specifications. Presently, US regulations allow HEMS operators to conduct work under FAR Part 135, Commercial Aviation Operations, or under FAR Part 91, General Aviation Operations. This allows HEMS operators to accept a greater level of risk by operating under the lower minimum procedural standards of FAR Part 91 rather than FAR Part 135, and may partly account for the higher rate of fatal crashes in HEMS operations conducted under FAR Part 91. In stark contrast, the European regulations state explicit criteria and minimum operating considerations. The Federal Aviation Administration (FAA) has been slow to take a regulatory stance as clear and firm as that of its European counterpart regarding the human factors involved in the risk assessment of HEMS operations. Providing clearly defined steps to analyze and mitigate unnecessary threats, and developing optimum performance guidelines as well as minimum acceptable operational standards, would benefit not only the US HEMS industry but also the patients and public it serves by reducing exposure to preventable dangers.

    Results from a 13-Year Prospective Cohort Study Show Increased Mortality Associated with Bloodstream Infections Caused by Pseudomonas aeruginosa Compared to Other Bacteria

    ABSTRACT: The impact of bacterial species on outcome in bloodstream infections (BSI) is incompletely understood. We evaluated the impact of bacterial species on BSI mortality, with adjustment for patient, bacterial, and treatment factors. From 2002 to 2015, all adult inpatients with monomicrobial BSI caused by Staphylococcus aureus or Gram-negative bacteria at Duke University Medical Center were prospectively enrolled. Kaplan-Meier curves and multivariable Cox regression with propensity score models were used to examine species-specific bacterial BSI mortality. Of the 2,659 enrolled patients, 999 (38%) were infected with S. aureus, and 1,660 (62%) were infected with Gram-negative bacteria. Among patients with Gram-negative BSI, Enterobacteriaceae (81% [1,343/1,660]) were most commonly isolated, followed by non-lactose-fermenting Gram-negative bacteria (16% [262/1,660]). Of the 999 S. aureus BSI isolates, 507 (51%) were methicillin resistant. Of the 1,660 Gram-negative BSI isolates, 500 (30%) were multidrug resistant. The unadjusted time-to-mortality among patients with Gram-negative BSI was shorter than that of patients with S. aureus BSI (P = 0.003), due to increased mortality in patients with non-lactose-fermenting Gram-negative BSI generally (P < 0.0001) and Pseudomonas aeruginosa BSI (n = 158) in particular (P < 0.0001). After adjustment for patient demographics, medical comorbidities, bacterial antibiotic resistance, timing of appropriate antibiotic therapy, and source control in patients with line-associated BSI, P. aeruginosa BSI remained significantly associated with increased mortality (hazard ratio = 1.435; 95% confidence interval = 1.043 to 1.933; P = 0.02). P. aeruginosa BSI was associated with increased mortality relative to S. aureus or other Gram-negative BSI. This effect persisted after adjustment for patient, bacterial, and treatment factors.
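The product-limit (Kaplan-Meier) estimate behind the unadjusted time-to-mortality comparison can be sketched in a few lines. The subjects below are hypothetical and illustrative only, not data from the study:

```python
# Minimal Kaplan-Meier estimator. Each subject is a (time, event)
# pair: event=1 for death, event=0 for censoring. Hypothetical data.

def kaplan_meier(subjects):
    """Return [(t, S(t))] survival estimates at each event time."""
    survival = []
    s = 1.0
    at_risk = len(subjects)
    # Walk subjects in time order; each death multiplies the survival
    # estimate by (1 - 1/at_risk); censored subjects only leave the
    # risk set. Sequential handling of tied deaths gives the same
    # product as the grouped d_i/n_i formula.
    for t, event in sorted(subjects):
        if event:
            s *= 1 - 1 / at_risk
            survival.append((t, s))
        at_risk -= 1
    return survival

# Five hypothetical subjects: deaths at t=2, 5, 7; censoring at 3, 8.
curve = kaplan_meier([(2, 1), (3, 0), (5, 1), (7, 1), (8, 0)])
```

Note that censored subjects shrink the risk set without stepping the curve down, which is why censoring-aware estimates differ from a naive fraction-alive calculation.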

    Deconstructing Disability: A Philosophy for Inclusion

    This article offers Derrida's deconstruction as a philosophy and practical strategy that challenges the assumed, factual nature of "disability" as a construct explaining human differences. The appeal of deconstruction lies in the contradictory philosophy currently articulated by the inclusion movement, a philosophy that simultaneously supports the disability construct as objective reality while calling for students "with disabilities" to be placed in educational settings designed for students considered nondisabled. This article proposes deconstruction as one coherent philosophical orientation for inclusion, an approach that critiques the political and moral hierarchy of ability and disability. A deconstructionist critique of disability is explained and demonstrated. Practical suggestions for the utilization of deconstruction by special educators are outlined.
    Peer Reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/68721/2/10.1177_074193259701800605.pd

    An iterative block-shifting approach to retention time alignment that preserves the shape and area of gas chromatography-mass spectrometry peaks

    Abstract
    Background: Metabolomics, petroleum and biodiesel chemistry, biomarker discovery, and other fields which rely on high-resolution profiling of complex chemical mixtures generate datasets which contain millions of detector intensity readings, each uniquely addressed along dimensions of time (e.g., retention time of chemicals on a chromatographic column), a spectral value (e.g., mass-to-charge ratio of ions derived from chemicals), and the analytical run number. They also must rely on data preprocessing techniques. In particular, inter-run variance in the retention time of chemical species poses a significant hurdle that must be cleared before feature extraction, data reduction, and knowledge discovery can ensue. Alignment methods for calibrating retention times reportedly (and in our experience) can misalign matching chemicals, falsely align distinct ones, be unduly sensitive to chosen values of input parameters, and distort peak shape and area.
    Results: We present an iterative block-shifting approach for retention-time calibration that detects chromatographic features and qualifies them by retention time, spectrum, and the effect of their inclusion on the quality of alignment itself. Mass chromatograms are aligned pairwise to one selected as a reference. In tests using a 45-run GC-MS experiment, block-shifting reduced the absolute deviation of retention by greater than 30-fold. It compared favourably to COW and XCMS with respect to alignment, and was markedly superior in preservation of peak area.
    Conclusion: Iterative block-shifting is an attractive method to align GC-MS mass chromatograms that is also generalizable to other two-dimensional techniques such as HPLC-MS.
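The core block-shifting idea, choosing for each block of a chromatogram the integer scan shift that best matches a reference trace, can be sketched as below. This is a toy single pass that omits the paper's feature detection, spectral qualification, and iteration; the traces and the squared-error criterion are invented for illustration:

```python
# Toy sketch of block-shifting retention-time alignment: for one block
# of a sample chromatogram, search integer shifts and keep the one
# that minimizes squared distance to the reference trace.

def best_shift(block, reference, start, max_shift=5):
    """Shift (in scans) that best aligns `block` against `reference`,
    where `start` is the block's original index in the sample trace."""
    best, best_err = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        err = 0.0
        for i, v in enumerate(block):
            j = start + shift + i
            ref = reference[j] if 0 <= j < len(reference) else 0.0
            err += (v - ref) ** 2
        if err < best_err:
            best, best_err = shift, err
    return best

reference = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0]
# The same peak, eluting two scans later in this run:
sample = [0, 0, 0, 0, 1, 4, 9, 4, 1, 0]
shift = best_shift(sample[4:9], reference, start=4)  # expect -2
```

A full aligner would repeat this per detected feature block, check that each candidate shift improves a global alignment score, and iterate to convergence; shifting whole blocks (rather than warping) is what preserves peak shape and area.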

    Variation in Target Attainment of Beta-Lactam Antibiotic Dosing Between International Pediatric Formularies.

    As antimicrobial susceptibility of common bacterial pathogens decreases, ensuring optimal dosing may preserve the use of older antibiotics and thereby limit the spread of resistance to newer agents. Beta-lactams represent the most widely prescribed antibiotic class, yet most were licensed prior to legislation changes mandating their study in children. As a result, significant heterogeneity persists in the pediatric doses used globally, along with the quality of evidence used to inform dosing. This review summarizes dosing recommendations from the major pediatric reference sources and tries to answer the questions: Does beta-lactam dose heterogeneity matter? Does it impact pharmacodynamic target attainment? For three important severe clinical infections (pneumonia, sepsis, and meningitis), pharmacokinetic models were identified for commonly used beta-lactam antibiotics. Real-world demographics were derived from three multicenter point prevalence surveys. Simulation results were compared with minimum inhibitory concentration distributions to assess the appropriateness of recommended doses in targeted and empiric treatment. While cephalosporin dosing regimens are largely adequate for target attainment, they also pose the most risk of neurotoxicity. Our review highlights aminopenicillin, piperacillin, and meropenem doses as potentially requiring review and optimization in order to preserve the use of these agents in the future.
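The pharmacodynamic target for beta-lactams is usually expressed as %fT>MIC, the fraction of the dosing interval during which the free drug concentration exceeds the pathogen's MIC. A minimal sketch, assuming a hypothetical one-compartment IV bolus model; none of the parameters below come from the formularies or surveys reviewed:

```python
import math

# Illustrative %fT>MIC calculation for a one-compartment IV bolus
# model: C(t) = (fu * dose / Vd) * exp(-ke * t). All parameter values
# are hypothetical.

def percent_ft_above_mic(dose_mg, vd_l, ke_per_h, mic_mg_l,
                         interval_h, fu=1.0, steps=1000):
    """Fraction of the dosing interval with free concentration > MIC."""
    c0 = fu * dose_mg / vd_l  # free peak concentration after the bolus
    above = 0
    for i in range(steps):
        t = interval_h * i / steps
        if c0 * math.exp(-ke_per_h * t) > mic_mg_l:
            above += 1
    return above / steps

# Example: 1 g every 8 h, Vd 20 L, half-life ~1 h (ke = ln 2 per h),
# MIC 2 mg/L: roughly 58% of the interval is spent above the MIC.
ft = percent_ft_above_mic(1000, 20.0, math.log(2), mic_mg_l=2.0,
                          interval_h=8.0)
```

Running simulations like this over an MIC distribution, instead of a single MIC, is how a dosing regimen's probability of target attainment is assessed for empiric treatment.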

    Breaking the Double Impasse: Securing and Supporting Diverse Housing Tenures in the United States

    What might be described as a double impasse characterizes debate on U.S. housing tenure, with advocates fighting for rental or ownership housing on one side and Third Way or mixed-tenure solutions on the other. Breaking this impasse requires disengaging from conceptions of an idealized form of tenure and instead advocating making virtually all tenures as secure and supported as possible, so that diverse households are able to live in homes that best fit their changing needs over their life cycles. This essay (a) presents data on the variety of tenures in the United States; (b) conveys a new two-dimensional map of tenures according to their degrees of control and potential for wealth-building; and (c) shows how U.S. institutions shape their risks and subsidies. Most U.S. tenures are at least somewhat risky, including those that receive the greatest federal subsidies. A new housing system is needed to secure and support as many tenures as possible.

    A review of spatial causal inference methods for environmental and epidemiological applications

    The scientific rigor and computational methods of causal inference have had great impacts on many disciplines, but have only recently begun to take hold in spatial applications. Spatial causal inference poses analytic challenges due to complex correlation structures and interference between the treatment at one location and the outcomes at others. In this paper, we review the current literature on spatial causal inference and identify areas of future work. We first discuss methods that exploit spatial structure to account for unmeasured confounding variables. We then discuss causal analysis in the presence of spatial interference, including several common assumptions used to reduce the complexity of the interference patterns under consideration. These methods are extended to the spatiotemporal case, where we compare and contrast the potential outcomes framework with Granger causality, and to geostatistical analyses involving spatial random fields of treatments and responses. The methods are introduced in the context of observational environmental and epidemiological studies, and are compared using both a simulation study and an analysis of the effect of ambient air pollution on COVID-19 mortality rate. Code to implement many of the methods using the popular Bayesian software OpenBUGS is provided.
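The spatial-confounding problem the review opens with can be illustrated with a toy, non-Bayesian simulation (unrelated to the paper's OpenBUGS code): when treatment and outcome both depend on a smooth spatial surface, regressing the outcome on treatment alone is biased, while conditioning on the surface recovers the true effect. All names and numbers below are synthetic:

```python
import math
import random

# Toy spatial-confounding demo: a smooth surface s(x) drives both
# treatment t and outcome y, so the naive slope of y on t is inflated;
# adjusting for s(x) recovers the true treatment effect exactly
# (the outcome here is noiseless).

random.seed(0)
TRUE_EFFECT = 2.0
xs = [i / 100 for i in range(100)]                    # 1-D "locations"
s = [math.sin(2 * math.pi * x) for x in xs]           # spatial confounder
u = [random.uniform(-1, 1) for _ in xs]               # non-spatial part
t = [si + ui for si, ui in zip(s, u)]                 # treatment
y = [TRUE_EFFECT * ti + 3.0 * si for ti, si in zip(t, s)]  # outcome

def ols2(y, a, b):
    """Coefficients of y ~ a + b (no intercept), via normal equations."""
    saa = sum(v * v for v in a)
    sbb = sum(v * v for v in b)
    sab = sum(p * q for p, q in zip(a, b))
    say = sum(p * q for p, q in zip(a, y))
    sby = sum(p * q for p, q in zip(b, y))
    det = saa * sbb - sab * sab
    return (say * sbb - sby * sab) / det, (sby * saa - say * sab) / det

naive = sum(ti * yi for ti, yi in zip(t, y)) / sum(ti * ti for ti in t)
adjusted, _ = ols2(y, t, s)   # adjusted is ~2.0; naive is well above it
```

The methods reviewed in the paper address the harder, realistic case where the confounding surface is unmeasured and must be absorbed through its spatial structure rather than included directly as a regressor.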

    One thousand plant transcriptomes and the phylogenomics of green plants

    Abstract: Green plants (Viridiplantae) include around 450,000–500,000 species [1,2] of great diversity and have important roles in terrestrial and aquatic ecosystems. Here, as part of the One Thousand Plant Transcriptomes Initiative, we sequenced the vegetative transcriptomes of 1,124 species that span the diversity of plants in a broad sense (Archaeplastida), including green plants (Viridiplantae), glaucophytes (Glaucophyta) and red algae (Rhodophyta). Our analysis provides a robust phylogenomic framework for examining the evolution of green plants. Most inferred species relationships are well supported across multiple species tree and supermatrix analyses, but discordance among plastid and nuclear gene trees at a few important nodes highlights the complexity of plant genome evolution, including polyploidy, periods of rapid speciation, and extinction. Incomplete sorting of ancestral variation, polyploidization and massive expansions of gene families punctuate the evolutionary history of green plants. Notably, we find that large expansions of gene families preceded the origins of green plants, land plants and vascular plants, whereas whole-genome duplications are inferred to have occurred repeatedly throughout the evolution of flowering plants and ferns. The increasing availability of high-quality plant genome sequences and advances in functional genomics are enabling research on genome evolution across the green tree of life.