3,336 research outputs found

    Neuromuscular Blockade with Rocuronium Bromide Increases the Tolerance of Acute Normovolemic Anemia in Anesthetized Pigs

    Background: The patient's individual anemia tolerance is pivotal when blood transfusions become necessary but are not feasible for some reason. To date, the effects of neuromuscular blockade (NMB) on anemia tolerance have not been investigated. Methods: 14 anesthetized and mechanically ventilated pigs were randomly assigned to the Roc group (3.78 mg/kg rocuronium bromide followed by continuous infusion of 1 mg/kg/min, n = 7) or to the Sal group (administration of the corresponding volume of normal saline, n = 7). Subsequently, acute normovolemic anemia was induced by simultaneous exchange of whole blood for a 6% hydroxyethyl starch solution (130/0.4) until a sudden decrease of total body O2 consumption (VO2) indicated a critical limitation of O2 transport capacity. The Hb concentration quantified at this time point (Hb(crit)) was the primary endpoint of the protocol. Secondary endpoints were parameters of hemodynamics, O2 transport and tissue oxygenation. Results: Hb(crit) was significantly lower in the Roc group (2.4 ± 0.5 vs. 3.2 ± 0.7 g/dl), reflecting increased anemia tolerance. NMB with rocuronium bromide reduced skeletal muscular VO2 and the total body O2 extraction rate. As the cardiac index increased simultaneously, total body VO2 only decreased marginally in the Roc group (change of VO2 relative to baseline -1.7 ± 0.8 vs. 3.2 ± 1.9% in the Sal group, p < 0.05). Conclusion: Deep NMB with rocuronium bromide increases the tolerance of acute normovolemic anemia. The underlying mechanism most likely involves a reduction of skeletal muscular VO2. During acellular treatment of an acute blood loss, NMB might play an adjuvant role in situations where profound stages of normovolemic anemia have to be tolerated (e.g. bridging an unexpected blood loss until blood products become available for transfusion). Copyright (C) 2011 S. Karger AG, Basel
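
    A back-of-the-envelope way to see why total body VO2 barely changed in the Roc group is the Fick principle together with the standard arterial O2 content formula (textbook physiology, not formulas given in the abstract):

    \[
      \dot{V}\mathrm{O}_2 \;=\; \mathrm{CI}\times\mathrm{CaO}_2\times\mathrm{O_2ER},
      \qquad
      \mathrm{CaO}_2 \approx 1.34\cdot\mathrm{Hb}\cdot S_a\mathrm{O}_2 + 0.003\cdot P_a\mathrm{O}_2 .
    \]

    If the blockade lowers the O2 extraction rate (less skeletal-muscle O2 demand) while the cardiac index rises, the product, and hence VO2, can remain nearly constant even as Hb and therefore CaO2 fall, consistent with the roughly -1.7% change reported for the Roc group.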

    An extragalactic supernebula confined by gravity

    Little is known about the origins of the giant star clusters known as globular clusters. How can hundreds of thousands of stars form simultaneously in a volume only a few light years across, the distance from the Sun to its nearest stellar neighbor? Radiation pressure and winds from luminous young stars should disperse the star-forming gas and disrupt the formation of the cluster. Globular clusters in our Galaxy cannot provide answers; they are billions of years old. Here we report the measurement of infrared hydrogen recombination lines from a young, forming super star cluster in the dwarf galaxy NGC 5253. The lines arise in gas heated by a cluster of an estimated million stars, so young that it is still enshrouded in gas and dust, hidden from optical view. We verify that the cluster contains 4000-6000 massive, hot "O" stars. Our discovery that the gases within the cluster are bound by gravity may explain why these windy and luminous O stars have not yet blown away the gases to allow the cluster to emerge from its birth cocoon. Young clusters in "starbursting" galaxies in the local and distant universe may be similarly gravitationally confined and cloaked from view. Comment: Letter to Nature
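
    As a rough illustration of the "confined by gravity" argument (an order-of-magnitude sketch with assumed representative numbers, not the authors' calculation), one can compare the cluster escape velocity with the thermal sound speed of ionized gas:

    import math

    # Representative values for a young super star cluster (assumed, not from the paper)
    G = 6.674e-11              # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30           # solar mass, kg
    PC = 3.086e16              # parsec, m

    M_cluster = 1.0e6 * M_SUN  # ~a million stars of roughly solar mass
    R_cluster = 1.0 * PC       # "a few light years across" ~ 1 pc radius

    v_esc = math.sqrt(2 * G * M_cluster / R_cluster)  # escape velocity at R_cluster
    c_s_ionized = 10e3                                # ~10 km/s sound speed of ~10^4 K ionized gas

    print(f"escape velocity ~ {v_esc/1e3:.0f} km/s, ionized-gas sound speed ~ {c_s_ionized/1e3:.0f} km/s")
    # If v_esc greatly exceeds the gas sound speed (and the measured line widths),
    # thermal pressure alone cannot unbind the gas: it stays gravitationally confined.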

    The Star Formation in Radio Survey: Jansky Very Large Array 33 GHz Observations of Nearby Galaxy Nuclei and Extranuclear Star-Forming Regions

    We present 33 GHz imaging for 112 pointings towards galaxy nuclei and extranuclear star-forming regions at $\approx 2''$ resolution using the Karl G. Jansky Very Large Array (VLA) as part of the Star Formation in Radio Survey. A comparison with 33 GHz Robert C. Byrd Green Bank Telescope single-dish observations indicates that the interferometric VLA observations recover $78 \pm 4$% of the total flux density over $25''$ regions ($\approx$ kpc scales) among all fields. On these scales, the emission being resolved out is most likely diffuse non-thermal synchrotron emission. Consequently, on the $\approx 30$-$300$ pc scales sampled by our VLA observations, the bulk of the 33 GHz emission is recovered and primarily powered by free-free emission from discrete HII regions, making it an excellent tracer of massive star formation. Of the 225 discrete regions used for aperture photometry, 162 are extranuclear (i.e., having galactocentric radii $r_{\rm G} \geq 250$ pc) and detected at $>3\sigma$ significance at 33 GHz and in H$\alpha$. Assuming a typical 33 GHz thermal fraction of 90%, the ratio of optically-thin 33 GHz-to-uncorrected H$\alpha$ star formation rates indicates a median extinction value on $\approx 30$-$300$ pc scales of $A_{\rm H\alpha} \approx 1.26 \pm 0.09$ mag, with an associated median absolute deviation of 0.87 mag. We find that 10% of these sources are "highly embedded" (i.e., $A_{\rm H\alpha} \gtrsim 3.3$ mag), suggesting that on average HII regions remain embedded for $\lesssim 1$ Myr. Finally, we find the median 33 GHz continuum-to-H$\alpha$ line flux ratio to be statistically larger within $r_{\rm G} < 250$ pc relative to the outer-disk regions by a factor of $1.82 \pm 0.39$, while the ratio of 33 GHz-to-24 $\mu$m flux densities is lower by a factor of $0.45 \pm 0.08$, which may suggest increased extinction in the central regions. E.J.M. acknowledges the hospitality of the Aspen Center for Physics, which is supported by National Science Foundation grant No. PHY-1066293. The National Radio Astronomy Observatory is a facility of the National Science Foundation operated under cooperative agreement by Associated Universities, Inc. This research made use of APLpy, an open-source plotting package for Python hosted at http://aplpy.github.com
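
    The quoted extinction follows from comparing an extinction-free star formation rate tracer with the uncorrected one. A plausible form of the relation (assuming both tracers are calibrated to the same SFR scale; this is a sketch, not a formula quoted from the paper) is

    \[
      A_{\mathrm{H}\alpha} \;=\; 2.5\,\log_{10}\!\left(\frac{\mathrm{SFR}_{33\,\mathrm{GHz}}}{\mathrm{SFR}_{\mathrm{H}\alpha,\,\mathrm{uncorrected}}}\right)\ \mathrm{mag},
    \]

    so the median $A_{\rm H\alpha} \approx 1.26$ mag corresponds to the thermal 33 GHz emission implying roughly $10^{1.26/2.5} \approx 3.2$ times more star formation than uncorrected H$\alpha$ alone.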

    Human Computation and Convergence

    Humans are the most effective integrators and producers of information, directly and through the use of information-processing inventions. As these inventions become increasingly sophisticated, the substantive role of humans in processing information will tend toward capabilities that derive from our most complex cognitive processes, e.g., abstraction, creativity, and applied world knowledge. Through the advancement of human computation - methods that leverage the respective strengths of humans and machines in distributed information-processing systems - formerly discrete processes will combine synergistically into increasingly integrated and complex information-processing systems. These new, collective systems will exhibit an unprecedented degree of predictive accuracy in modeling physical and techno-social processes, and may ultimately coalesce into a single unified predictive organism, with the capacity to address society's most wicked problems and achieve planetary homeostasis. Comment: Pre-publication draft of chapter. 24 pages, 3 figures; added references to pages 1 and 3, and corrected typos

    Elevated hemostasis markers after pneumonia increases one-year risk of all-cause and cardiovascular deaths

    Background: Acceleration of chronic diseases, particularly cardiovascular disease, may increase long-term mortality after community-acquired pneumonia (CAP), but the underlying mechanisms are unknown. Persistence of the prothrombotic state that occurs during an acute infection may increase the risk of subsequent atherothrombosis in patients with pre-existing cardiovascular disease and increase the subsequent risk of death. We hypothesized that circulating hemostasis markers activated during CAP persist at hospital discharge, when patients appear to have recovered clinically, and are associated with higher mortality, particularly due to cardiovascular causes. Methods: In a cohort of survivors of CAP hospitalization from 28 US sites, we measured D-dimer, thrombin-antithrombin complexes (TAT), Factor IX, antithrombin, and plasminogen activator inhibitor-1 at hospital discharge, and determined 1-year all-cause and cardiovascular mortality. Results: Of 893 subjects, most did not have severe pneumonia (70.6% never developed severe sepsis) and only 13.4% required intensive care unit admission. At discharge, 88.4% of subjects had normal vital signs and appeared to have clinically recovered. D-dimer and TAT levels were elevated at discharge in 78.8% and 30.1% of all subjects, and in 51.3% and 25.3% of those without severe sepsis. Higher D-dimer and TAT levels were associated with a higher risk of all-cause mortality (range of hazard ratios 1.66-1.17, p = 0.0001 and 1.46-1.04, p = 0.001 after adjusting for demographics and comorbid illnesses) and cardiovascular mortality (p = 0.009 and 0.003 in competing risk analyses). Conclusions: Elevations of TAT and D-dimer levels are common at hospital discharge in patients who appeared to have recovered clinically from pneumonia and are associated with a higher risk of subsequent death, particularly due to cardiovascular disease. © 2011 Yende et al.
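
    A minimal sketch of the kind of adjusted survival model described (hazard ratios for discharge biomarkers after adjusting for demographics and comorbidities), using the lifelines package and hypothetical column names; the study's actual variables, adjustments, and competing-risk methods are not reproduced here:

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical dataset: one row per CAP survivor, biomarkers measured at discharge.
    # Column names are illustrative, not the study's variable names.
    df = pd.read_csv("cap_discharge_cohort.csv")  # hypothetical file
    covariates = ["d_dimer", "tat", "age", "sex", "charlson_index"]

    cph = CoxPHFitter()
    cph.fit(
        df[covariates + ["followup_days", "died_1yr"]],
        duration_col="followup_days",  # time to death or censoring over 1-year follow-up
        event_col="died_1yr",          # 1 = all-cause death within one year
    )
    cph.print_summary()  # exp(coef) column gives adjusted hazard ratios
    # Cause-specific (cardiovascular) mortality would additionally require a
    # competing-risk analysis, as the paper reports.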

    A Practical Model for Implementing Digital Media Assessments in Tertiary Science Education

    Learner-Generated Digital Media (LGDM) has been incorporated as a learning tool to assess students in Higher Education over the last decade. Models have been developed for video making in the classroom that consider technical know-how, pedagogy, or a combination of both. However, there is no student-centred, practical framework to inform academics and students on the implementation of digital presentations as an assessment tool in the curricula. This conceptual paper proposes a new framework to assist with the design, implementation and evaluation of LGDM as an assessment tool. The framework considers the following elements: (1) pedagogy; (2) student training; (3) hosting of videos; (4) marking schemes; (5) group contribution; (6) feedback; (7) reflection; and (8) evaluation. The purpose of this paper is to outline the basic elements of the framework and provide practical implementation strategies that academics from any discipline could apply to their classrooms.

    Upper atmospheres and ionospheres of planets and satellites

    The upper atmospheres of the planets and their satellites are more directly exposed to sunlight and solar wind particles than the surface or the deeper atmospheric layers. At the altitudes where the associated energy is deposited, the atmospheres may become ionized and are referred to as ionospheres. The details of the photon and particle interactions with the upper atmosphere depend strongly on whether the object has an intrinsic magnetic field that may channel the precipitating particles into the atmosphere or drive the atmospheric gas out to space. Important implications of these interactions include atmospheric loss over diverse timescales, photochemistry and the formation of aerosols, which affect the evolution, composition and remote sensing of the planets (satellites). The upper atmosphere connects the planet (satellite) bulk composition to the near-planet (-satellite) environment. Understanding the relevant physics and chemistry provides insight into the past and future conditions of these objects, which is critical for understanding their evolution. This chapter introduces the basic concepts of upper atmospheres and ionospheres in our solar system, and discusses aspects of their neutral and ion composition, wind dynamics and energy budget. This knowledge is key to putting in context the observations of upper atmospheres and haze on exoplanets, and to devising a theory that explains exoplanet demographics. Comment: Invited Review
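
    One basic relation such a chapter typically builds on (a standard textbook formula, not quoted from the abstract) is the pressure scale height of an isothermal atmosphere, which controls how vertically extended the upper atmosphere is and how easily light species are lost:

    \[
      H \;=\; \frac{k_{\mathrm{B}}\,T}{m\,g},
    \]

    where $T$ is the temperature, $m$ the mean molecular mass and $g$ the local gravitational acceleration; hotter, lighter upper atmospheres have larger $H$ and are more prone to escape.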

    Improving phylogeny reconstruction at the strain level using peptidome datasets

    Typical bacterial strain differentiation methods are often challenged by high genetic similarity between strains. To address this problem, we introduce a novel in silico peptide fingerprinting method based on conventional wet-lab protocols that enables the identification of potential strain-specific peptides. These can be further investigated using in vitro approaches, laying a foundation for the development of biomarker detection and application-specific methods. This novel method aims at reducing large amounts of comparative peptide data to binary matrices while maintaining a high phylogenetic resolution. The underlying case study concerns the Bacillus cereus group, namely the differentiation of Bacillus thuringiensis, Bacillus anthracis and Bacillus cereus strains. Results show that trees based on cytoplasmic and extracellular peptidomes are only marginally in conflict with those based on whole proteomes, as inferred by the established Genome-BLAST Distance Phylogeny (GBDP) method. Hence, these results indicate that the two approaches can most likely be used complementarily even in other organismal groups. The obtained results confirm previous reports about the misclassification of many strains within the B. cereus group. Moreover, our method was able to separate the B. anthracis strains with high resolution, similarly to the GBDP results as benchmarked via Bayesian inference and both Maximum Likelihood and Maximum Parsimony. In addition to the presented phylogenomic applications, whole-peptide fingerprinting might also become a valuable complementary technique to digital DNA-DNA hybridization, notably for bacterial classification at the species and subspecies level in the future. This research was funded by Grant AGL2013-44039-R from the Spanish "Plan Estatal de I+D+I", and by Grant EM2014/046 from the "Plan Galego de investigación, innovación e crecemento 2011-2015". BS was recipient of a Ramón y Cajal postdoctoral contract from the Spanish Ministry of Economy and Competitiveness. This work was also partially funded by the [14VI05] Contract-Programme from the University of Vigo and the Agrupamento INBIOMED from DXPCTSUG-FEDER "unha maneira de facer Europa" (2012/273). The research leading to these results has also received funding from the European Union's Seventh Framework Programme FP7/REGPOT-2012-2013.1 under grant agreement no. 316265, BIOCAPS. This document reflects only the authors' views and the European Union is not liable for any use that may be made of the information contained herein. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
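
    A minimal sketch of the data reduction the method describes, comparative peptide data collapsed to a binary presence/absence matrix and turned into a distance-based tree, using hypothetical strain names and SciPy; the paper's actual pipeline and the GBDP benchmark are not reproduced here:

    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, dendrogram

    # Hypothetical binary peptide fingerprints: rows = strains, columns = predicted peptides,
    # True = peptide present in that strain's in silico peptidome.
    strains = ["B_anthracis_1", "B_anthracis_2", "B_cereus_1", "B_thuringiensis_1"]
    fingerprints = np.array([
        [1, 1, 0, 1, 0, 1, 1, 0],
        [1, 1, 0, 1, 1, 1, 1, 0],
        [0, 1, 1, 0, 1, 0, 0, 1],
        [0, 1, 1, 1, 1, 0, 0, 1],
    ], dtype=bool)

    # Pairwise Jaccard distances between strains, then an average-linkage (UPGMA-like) tree.
    distances = pdist(fingerprints, metric="jaccard")
    tree = linkage(distances, method="average")
    dendrogram(tree, labels=strains, no_plot=True)  # set no_plot=False with matplotlib to draw the tree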