
    The epidemiology of hematogenous vertebral osteomyelitis: a cohort study in a tertiary care hospital

    Background: Vertebral osteomyelitis is a common manifestation of osteomyelitis in adults and is associated with considerable morbidity. Limited data exist regarding hematogenous vertebral osteomyelitis. Our objective was to describe the epidemiology and management of hematogenous vertebral osteomyelitis. Methods: We performed a 2-year retrospective cohort study of adult patients with hematogenous vertebral osteomyelitis at a tertiary care hospital. Results: Seventy patients with hematogenous vertebral osteomyelitis were identified. The mean age was 59.7 years (±15.0) and 38 (54%) were male. Common comorbidities included diabetes (43%) and renal insufficiency (24%). Predisposing factors in the 30 days prior to admission included bacteremia (19%), skin/soft tissue infection (17%), and an indwelling catheter (30%). Back pain was the most common symptom (87%). Seven (10%) patients presented with paraplegia. Among the 46 (66%) patients with a microbiological diagnosis, the most common organisms were methicillin-susceptible S. aureus [15 (33%) cases] and methicillin-resistant S. aureus [10 (22%)]. Among the 44 (63%) patients who had a diagnostic biopsy, open biopsy was more likely to result in pathogen recovery [14 (93%) of 15 with open biopsy vs. 14 (48%) of 29 with needle biopsy; p = 0.003]. Sixteen (23%) patients required surgical intervention for therapeutic purposes during admission. Conclusions: This is one of the largest series of hematogenous vertebral osteomyelitis. A microbiological diagnosis was made in only approximately two-thirds of cases. S. aureus was the most common causative organism, and almost half of the isolates were methicillin-resistant.
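    The biopsy comparison above (14 of 15 positive open biopsies vs. 14 of 29 positive needle biopsies, p = 0.003) can be checked with a simple 2x2 test. The abstract does not state which test the authors used, so the sketch below uses Fisher's exact test purely as an illustrative consistency check.

```python
# Sketch: check the biopsy-yield comparison reported in the abstract
# (14/15 positive open biopsies vs. 14/29 positive needle biopsies).
# Fisher's exact test is an assumption here; the study's actual test is not stated.
from scipy.stats import fisher_exact

open_pos, open_neg = 14, 1       # open biopsy: pathogen recovered / not recovered
needle_pos, needle_neg = 14, 15  # needle biopsy: pathogen recovered / not recovered

table = [[open_pos, open_neg],
         [needle_pos, needle_neg]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.1f}, two-sided p = {p_value:.4f}")
# The two-sided p-value comes out in the vicinity of the reported 0.003.
```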

    Beyond the public and private divide: Remapping transnational climate governance in the 21st century

    This article provides a first step towards a better theoretical and empirical knowledge of the emerging arena of transnational climate governance. The need for such a re-conceptualization emerges from the increasing relevance of non-state and transnational approaches towards climate change mitigation at a time when the intergovernmental negotiation process has to overcome substantial stalemate and the international arena becomes increasingly fragmented. Based on a brief discussion of the increasing trend towards transnationalization and functional segmentation of the global climate governance arena, we argue that a remapping of climate governance is necessary and needs to take into account different spheres of authority beyond the public and international. Hence, we provide a brief analysis of how the public/private divide has been conceptualized in Political Science and International Relations. Subsequently, we analyse the emerging transnational climate governance arena. Analytically, we distinguish between different manifestations of transnational climate governance on a continuum ranging from delegated and shared public-private authority to fully non-state and private responses to the climate problem. We suggest that our remapping exercise presented in this article can be a useful starting point for future research on the role and relevance of transnational approaches to the global climate crisis.

    The Pioneer Anomaly

    Radio-metric Doppler tracking data received from the Pioneer 10 and 11 spacecraft from heliocentric distances of 20-70 AU has consistently indicated the presence of a small, anomalous, blue-shifted frequency drift changing uniformly at a rate of ~6 x 10^{-9} Hz/s. Ultimately, the drift was interpreted as a constant sunward deceleration of each spacecraft at the level of a_P = (8.74 +/- 1.33) x 10^{-10} m/s^2. This apparent violation of Newton's gravitational inverse-square law has become known as the Pioneer anomaly; the nature of this anomaly remains unexplained. In this review, we summarize the current knowledge of the physical properties of the anomaly and the conditions that led to its detection and characterization. We review various mechanisms proposed to explain the anomaly and discuss the current state of efforts to determine its nature. A comprehensive new investigation of the anomalous behavior of the two Pioneers has recently begun. The new efforts rely on a much-extended set of radio-metric Doppler data for both spacecraft, in conjunction with the newly available complete record of their telemetry files and a large archive of original project documentation. As the new study has yet to report its findings, this review provides the necessary background for the new results expected in the near future. In particular, we provide a significant amount of information on the design, operations, and behavior of the two Pioneers during their entire missions, including descriptions of various data formats and techniques used for their navigation and radio-science data analysis. As most of this information was recovered relatively recently, it was not used in previous studies of the Pioneer anomaly, but it is critical for the new investigation. Comment: 165 pages, 40 figures, 16 tables; accepted for publication in Living Reviews in Relativity.
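    The quoted drift rate and acceleration are consistent at the order-of-magnitude level through the basic Doppler relation a ~ c (df/dt) / f0. The sketch below assumes an S-band reference frequency near 2.29 GHz and ignores the one-way/two-way convention factor, so it is only a rough check, not the conversion used in the actual analysis.

```python
# Back-of-envelope check of the drift-to-acceleration conversion quoted above,
# using the basic Doppler relation a ~ c * (df/dt) / f0. The reference frequency
# (~2.29 GHz S-band) and the neglect of the one-way/two-way convention factor
# are assumptions of this sketch, so only order-of-magnitude agreement with
# a_P = 8.74e-10 m/s^2 should be expected.
c = 2.998e8      # speed of light, m/s
f0 = 2.29e9      # assumed S-band reference frequency, Hz
df_dt = 6e-9     # quoted frequency drift, Hz/s

a_estimate = c * df_dt / f0
print(f"estimated anomalous acceleration ~ {a_estimate:.2e} m/s^2")
# ~ 7.9e-10 m/s^2, the same order as the reported a_P
```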

    Debating the Desirability of New Biomedical Technologies: Lessons from the Introduction of Breast Cancer Screening in the Netherlands

    Health technology assessment (HTA) was developed in the 1970s and 1980s to facilitate decision making on the desirability of new biomedical technologies. Since then, many of the standard tools and methods of HTA have been criticized for their implicit normativity. At the same time research into the character of technology in practice has motivated philosophers, sociologists and anthropologists to criticize the traditional view of technology as a neutral instrument designed to perform a specific function. Such research suggests that the tools and methods of more traditional forms of HTA are often inspired by an ‘instrumentalist’ conception of technology that does not fit the way technology actually works. This paper explores this hypothesis for a specific case: the assessments and deliberations leading to the introduction of breast cancer screening in the Netherlands. After reconstructing this history of HTA ‘in the making’, the stepwise model of HTA that emerged during the process is discussed. This model was indeed rooted in an instrumentalist conception of technology. However, a more detailed reconstruction of several episodes from this history reveals how the actors already experienced the inadequacy of some of the instrumentalist presuppositions. The historical case thus shows how an instrumentalist conception of technology may result in implicit normative effects. The paper concludes that an instrumentalist view of technology is not a good starting point for HTA and briefly suggests how the fit between HTA methods and the actual character of technology in practice might be improved.

    Modeling acid-gas generation from boiling chloride brines

    Background: This study investigates the generation of HCl and other acid gases from boiling calcium chloride dominated waters at atmospheric pressure, primarily using numerical modeling. The main focus of this investigation relates to the long-term geologic disposal of nuclear waste at Yucca Mountain, Nevada, where pore waters around waste-emplacement tunnels are expected to undergo boiling and evaporative concentration as a result of the heat released by spent nuclear fuel. Processes that are modeled include boiling of highly concentrated solutions, gas transport, and gas condensation accompanied by the dissociation of acid gases, causing low-pH condensate. Results: Simple calculations are first carried out to evaluate condensate pH as a function of HCl gas fugacity and condensed water fraction for a vapor equilibrated with saturated calcium chloride brine at 50-150°C and 1 bar. The distillation of a calcium-chloride-dominated brine is then simulated with a reactive transport model using a brine composition representative of partially evaporated calcium-rich pore waters at Yucca Mountain. Results show a significant increase in boiling temperature from evaporative concentration, as well as low pH in condensates, particularly for dynamic systems where partial condensation takes place, which results in enrichment of HCl in the condensate. These results are in qualitative agreement with experimental data from other studies. Conclusion: The combination of reactive transport with multicomponent brine chemistry to study evaporation, boiling, and the potential for acid gas generation at the proposed Yucca Mountain repository is seen as an improvement relative to previously applied, simpler batch evaporation models. This approach allows the evaluation of thermal, hydrological, and chemical (THC) processes in a coupled manner, and the modeling of settings much more relevant to actual field conditions than the distillation experiment considered. The actual and modeled distillation experiments do not represent expected conditions in an emplacement drift, but nevertheless illustrate the potential for acid-gas generation at moderate temperatures (<150°C).
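    The enrichment effect of partial condensation can be illustrated with a mass-balance sketch: if a fixed amount of HCl carried in the vapor ends up in only a fraction of the condensable water, the condensate becomes more concentrated and more acidic as that fraction shrinks. The complete dissociation, ideal activity, and total partitioning of HCl into the liquid assumed below are deliberate simplifications, not the study's reactive transport model, and the example HCl loading is hypothetical.

```python
# Minimal mass-balance sketch of why partial condensation lowers condensate pH.
# Assumptions (not the study's model): all HCl in the vapor dissolves into the
# condensed fraction of the water, dissociates completely, and behaves ideally.
import math

def condensate_pH(n_HCl_mol: float, water_vapor_kg: float, condensed_fraction: float) -> float:
    """pH of the condensate when only a fraction of the water vapor condenses."""
    condensed_water_kg = water_vapor_kg * condensed_fraction
    molality_H = n_HCl_mol / condensed_water_kg   # mol H+ per kg water (full dissociation)
    return -math.log10(molality_H)

# Hypothetical example: 1e-4 mol HCl carried per kg of water vapor.
for f in (1.0, 0.1, 0.01):
    print(f"condensed fraction {f:>4}: pH ~ {condensate_pH(1e-4, 1.0, f):.1f}")
# Full condensation gives pH ~ 4; condensing only 1% of the vapor gives pH ~ 2.
```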

    Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV

    The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 inverse picobarns of data collected in pp collisions at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV is above 95% over the whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV is higher than 90% over the full eta range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation. Comment: Replaced with published version. Added journal reference and DOI.
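    The momentum-scale measurement mentioned above rests on comparing the reconstructed dimuon mass peak from Z decays with the known Z mass. The sketch below illustrates that idea only; the crude "mean mass in a window" peak estimate and the input format are assumptions, not the CMS procedure, which relies on detailed fits.

```python
# Sketch of the idea behind a momentum-scale check with Z -> mu mu:
# reconstruct the dimuon invariant mass and compare the peak position with
# the known Z mass. The windowed-mean peak estimate is an illustrative
# simplification, not the CMS analysis.
import math

M_Z_REF = 91.1876  # GeV, world-average Z mass used as reference

def dimuon_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass of a muon pair, neglecting the muon mass (GeV)."""
    m2 = 2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
    return math.sqrt(m2)

def momentum_scale(masses, window=(80.0, 100.0)):
    """Crude scale estimate: mean mass in a window around the Z peak / M_Z."""
    in_window = [m for m in masses if window[0] < m < window[1]]
    return sum(in_window) / len(in_window) / M_Z_REF

# Usage: momentum_scale([dimuon_mass(...), ...]) close to 1.000 indicates a
# well-calibrated momentum scale; the abstract quotes 0.2% precision.
```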

    Azimuthal anisotropy of charged particles at high transverse momenta in PbPb collisions at sqrt(s[NN]) = 2.76 TeV

    The azimuthal anisotropy of charged particles in PbPb collisions at a nucleon-nucleon center-of-mass energy of 2.76 TeV is measured with the CMS detector at the LHC over an extended transverse momentum (pt) range up to approximately 60 GeV. The data cover both the low-pt region associated with hydrodynamic flow phenomena and the high-pt region where the anisotropies may reflect the path-length dependence of parton energy loss in the created medium. The anisotropy parameter (v2) of the particles is extracted by correlating charged tracks with respect to the event plane reconstructed using the energy deposited in forward-angle calorimeters. For the six bins of collision centrality studied, spanning the range of 0-60% most-central events, the observed v2 values are found to first increase with pt, reaching a maximum around pt = 3 GeV, and then to gradually decrease to almost zero, with the decline persisting up to at least pt = 40 GeV over the full centrality range measured. Comment: Replaced with published version. Added journal reference and DOI.
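    The extraction method described above is the standard event-plane technique: v2 is the average of cos(2(phi - Psi2)) over tracks, divided by the event-plane resolution. The sketch below shows that core calculation only; the event-plane angle and resolution factor are taken as given inputs rather than reconstructed from calorimeter data as in the analysis.

```python
# Minimal sketch of the event-plane method: v2_obs = <cos(2*(phi - Psi_2))>
# averaged over tracks, corrected by the event-plane resolution R2.
# Psi_2 and the resolution are placeholder inputs here; in the analysis the
# event plane is reconstructed from forward calorimeter energy deposits.
import math

def v2_event_plane(track_phis, psi_2, resolution=1.0):
    """Second-order anisotropy of tracks relative to the event plane Psi_2."""
    v2_obs = sum(math.cos(2.0 * (phi - psi_2)) for phi in track_phis) / len(track_phis)
    return v2_obs / resolution  # divide by the event-plane resolution R2

# Usage: compute v2 in each pt and centrality bin from that bin's track
# azimuthal angles, with R2 estimated e.g. from sub-event correlations.
```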

    Feedforward Inhibition and Synaptic Scaling – Two Sides of the Same Coin?

    Feedforward inhibition and synaptic scaling are important adaptive processes that control the total input a neuron can receive from its afferents. While often studied in isolation, the two have been reported to co-occur in various brain regions. The functional implications of their interactions remain unclear, however. Based on a probabilistic modeling approach, we show here that fast feedforward inhibition and synaptic scaling interact synergistically during unsupervised learning. In technical terms, we model the input to a neural circuit using a normalized mixture model with Poisson noise. We demonstrate analytically and numerically that, in the presence of lateral inhibition introducing competition between different neurons, Hebbian plasticity and synaptic scaling approximate the optimal maximum likelihood solutions for this model. Our results suggest that, beyond its conventional use as a mechanism to remove undesired pattern variations, input normalization can make typical neural interaction and learning rules optimal on the stimulus subspace defined through feedforward inhibition. Furthermore, learning within this subspace is more efficient in practice, as it helps avoid locally optimal solutions. Taken together, these findings point to a close connection between feedforward inhibition and synaptic scaling that may have important functional implications for general cortical processing.
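    The ingredients named in the abstract (input normalization, lateral competition, Hebbian plasticity, and synaptic scaling) can be combined in a generic online learning loop for a Poisson mixture. The sketch below is an EM-like illustration of that combination under uniform priors and fixed total weight per unit; it is not the authors' exact derivation, and all sizes, rates, and inputs are illustrative assumptions.

```python
# Illustrative sketch: normalized inputs (stand-in for feedforward inhibition),
# softmax competition over Poisson-mixture log-likelihoods (lateral inhibition),
# a Hebbian update gated by each unit's posterior, and row-wise weight
# renormalization (synaptic scaling). Generic EM-like rule, not the paper's
# exact equations; all parameters are placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_units, lr = 20, 5, 0.05
W = rng.random((n_units, n_inputs)) + 0.5
W /= W.sum(axis=1, keepdims=True)            # synaptic scaling: fixed total weight per unit

def step(x):
    global W
    x = x / x.sum()                          # input normalization ("feedforward inhibition")
    log_like = x @ np.log(W).T               # Poisson-mixture log-likelihood (up to constants)
    post = np.exp(log_like - log_like.max())
    post /= post.sum()                       # lateral inhibition: soft competition across units
    W += lr * post[:, None] * x[None, :]     # Hebbian update, gated by each unit's posterior
    W /= W.sum(axis=1, keepdims=True)        # synaptic scaling keeps total input weight fixed

# Usage: feed spike-count-like inputs repeatedly; the rows of W drift toward
# the normalized components that generated the data.
for _ in range(1000):
    step(rng.poisson(5.0, size=n_inputs) + 1e-3)
```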