    Overview of Carbon Dioxide Control Issues During International Space Station/Space Shuttle Joint Docked Operations

    Crewed space vehicles have a common requirement to remove the carbon dioxide (CO2) created by the metabolic processes of the crew. The space shuttle [Space Transportation System (STS)] and International Space Station (ISS) each have systems in place that allow control and removal of CO2 from the habitable cabin environment. During periods in which the space shuttle is docked to the ISS, known as "joint docked operations," the space shuttle and ISS share a common atmosphere. During this period, an elevated amount of CO2 is produced through the combined metabolic activity of the STS and ISS crews. This elevated CO2 production, together with the large effective atmosphere created by the combined volumes of the docked vehicles, creates a unique set of requirements for CO2 removal. This paper will describe the individual CO2 control plans implemented by the STS and ISS engineering teams, as well as the integrated plans used when both vehicles are docked. The paper will also discuss some of the issues and anomalies experienced by both engineering teams.
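
    As a rough illustration of the sizing problem described above, the sketch below estimates how quickly CO2 partial pressure would rise in the shared docked volume if no removal systems were running. All numbers (metabolic output per crew member, free volume, temperature, crew counts) are illustrative assumptions, not values from the paper.

        # Illustrative cabin CO2 mass balance; all parameter values are assumptions.
        R_GAS = 8.314        # J/(mol*K)
        M_CO2 = 44.01e-3     # kg/mol
        PA_PER_MMHG = 133.322

        def ppco2_rise_mmhg_per_day(crew, co2_kg_per_crew_day=1.0, volume_m3=500.0, temp_k=296.0):
            """CO2 partial-pressure rise per day (mmHg) with no removal running."""
            mol_per_day = crew * co2_kg_per_crew_day / M_CO2
            pa_per_day = mol_per_day * R_GAS * temp_k / volume_m3
            return pa_per_day / PA_PER_MMHG

        print(ppco2_rise_mmhg_per_day(crew=3,  volume_m3=425.0))   # ISS crew alone (example values)
        print(ppco2_rise_mmhg_per_day(crew=10, volume_m3=500.0))   # joint docked operations (example values)

    In this simple model, doubling or tripling the metabolic load raises the no-removal rise rate proportionally, while the added shuttle volume only partly offsets it, which is why the docked configuration needs its own integrated removal plan.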

    Overview of International Space Station Carbon Dioxide Removal Assembly On-Orbit Operations and Performance

    Controlling carbon dioxide (CO2) partial pressure in the habitable vehicle environment is a critical part of operations on the International Space Station (ISS). On the United States segment of the ISS, CO2 levels are primarily controlled by the Carbon Dioxide Removal Assembly (CDRA). There are two CDRAs on the ISS: one in the United States Laboratory module and one in the Node 3 module. The CDRA has experienced several significant operational and performance issues, leading to the redesign of various components, primarily the Desiccant Adsorbent Bed (DAB) assembly and the Air Selector Valves (ASVs). This paper will focus on significant operational and performance issues experienced by the CDRA team from 2008 to 2012.

    Crew Health and Performance Improvements with Reduced Carbon Dioxide Levels and the Resource Impact to Accomplish Those Reductions

    Carbon dioxide (CO2) removal is one of the primary functions of the International Space Station (ISS) atmosphere revitalization systems. Primary CO2 removal is via the ISS's two Carbon Dioxide Removal Assemblies (CDRAs) and the Russian carbon dioxide removal assembly (Vozdukh); both of these systems are regenerable, meaning that their CO2 removal capacity theoretically remains constant as long as the system is operating. Contingency CO2 removal capability is provided by lithium hydroxide (LiOH) canisters, which are consumable, meaning that their CO2 removal capability disappears once the resource is used. With the advent of six-crew ISS operations, experience showing that CDRA failures are not uncommon, and anecdotal association of crew symptoms with CO2 values just above 4 mmHg, the question arises: how much lower must CO2 levels be kept to minimize the risk to crew health and performance, and what will the operational cost to the CDRAs be to do it? The primary crew health concerns center on the interaction of increased intracranial pressure from fluid shifts and the increased intracranial blood flow induced by CO2. Typical acute symptoms include headache, minor visual disturbances, and subtle behavioral changes. The historical database of CO2 exposures since the beginning of ISS operations has been compared to the incidence of crew symptoms reported in private medical conferences. We have used this database in an attempt to establish an association between CO2 levels and the risk of crew symptoms. This comparison will answer the question of the level needed to protect the crew from acute effects. As for the second part of the question, operation of the ISS's regenerable CO2 removal capability reduces the limited life of constituent parts. It also consumes limited electrical power and thermal control resources. Operation of consumable CO2 removal capability (LiOH) uses finite consumable materials, which must be replenished in the long term. Therefore, increased CO2 removal means increased resource use and an increased logistical demand to maintain the necessary resources on board the ISS. We must strike a balance between CO2 levels sufficiently low to maintain crew health and CO2 levels which are operationally feasible for the ISS program.
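
    The exposure-versus-symptom comparison described above can be pictured with a toy version of the binning step. The records below are invented weekly ppCO2 averages and symptom flags, not flight data or medical-conference reports; they only show the shape of the analysis.

        # Hypothetical sketch: bin weekly average ppCO2 (mmHg) and compute the fraction
        # of weeks in each bin with a reported symptom. Illustrative values only.
        from collections import defaultdict

        records = [(2.8, 0), (3.1, 0), (3.4, 0), (3.9, 1), (4.2, 1), (4.4, 0), (4.7, 1), (5.1, 1)]

        bins = defaultdict(lambda: [0, 0])          # bin -> [symptom weeks, total weeks]
        for ppco2, symptom in records:
            key = round(ppco2)                      # roughly 1 mmHg-wide bins
            bins[key][0] += symptom
            bins[key][1] += 1

        for key in sorted(bins):
            sym, total = bins[key]
            print(f"~{key} mmHg: {sym}/{total} weeks with symptoms ({sym/total:.0%})")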

    Will all scientists working on snails and the diseases they transmit please stand up?

    Copyright © 2012 Adema et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. No abstract available.

    Effects of Cu/Zn Superoxide Dismutase (sod1) Genotype and Genetic Background on Growth, Reproduction and Defense in Biomphalaria glabrata

    Resistance of the snail Biomphalaria glabrata to the trematode Schistosoma mansoni is correlated with allelic variation at copper-zinc superoxide dismutase (sod1). We tested whether there is a fitness cost associated with carrying the most resistant allele in three outbred laboratory populations of snails. These three populations were derived from the same base population, but differed in average resistance. Under controlled laboratory conditions we found no cost of carrying the most resistant allele in terms of fecundity, and a possible advantage in terms of growth and mortality. These results suggest that it might be possible to drive resistant alleles of sod1 into natural populations of the snail vector for the purpose of controlling transmission of S. mansoni. However, we did observe a strong effect of genetic background on the association between sod1 genotype and resistance. sod1 genotype explained substantial variance in resistance among individuals in the most resistant genetic background, but had little effect in the least resistant genetic background. Thus, epistatic interactions with other loci may be as important a consideration as costs of resistance in the use of sod1 for vector manipulation.
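
    The background-dependence described above ("explains variance in one background, little in another") can be illustrated with a toy calculation. The data below are invented allele counts and resistance scores, not the study's measurements; the sketch only shows how variance explained by genotype can differ sharply between two genetic backgrounds.

        # Toy illustration (invented data): genotype explains resistance variance in one
        # genetic background but not another, as described for sod1 above.
        from statistics import mean

        def variance_explained(genotypes, resistance):
            """Squared correlation between a 0/1/2 allele count and a resistance score."""
            gx, rx = mean(genotypes), mean(resistance)
            cov = sum((g - gx) * (r - rx) for g, r in zip(genotypes, resistance))
            var_g = sum((g - gx) ** 2 for g in genotypes)
            var_r = sum((r - rx) ** 2 for r in resistance)
            return cov ** 2 / (var_g * var_r)

        # copies of the "resistant" allele per snail, and a resistance score (both invented)
        resistant_background = ([0, 0, 1, 1, 2, 2], [0.2, 0.3, 0.5, 0.6, 0.8, 0.9])
        susceptible_background = ([0, 0, 1, 1, 2, 2], [0.2, 0.4, 0.3, 0.2, 0.4, 0.3])

        print(variance_explained(*resistant_background))    # large fraction of variance explained
        print(variance_explained(*susceptible_background))  # little variance explained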

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
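
    The reported posterior probabilities of harm follow, approximately, from the published credible intervals. The sketch below assumes the posterior for the log odds ratio is roughly normal, which is an assumption of this back-of-the-envelope check rather than the trial's actual bayesian cumulative logistic model.

        # Approximate the posterior probability that a treatment worsened outcomes, i.e.
        # P(OR < 1), from a median OR and 95% credible interval, assuming a normal
        # posterior on log(OR).
        import math

        def prob_or_below_1(or_median, ci_low, ci_high):
            mu = math.log(or_median)
            sigma = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)
            z = (0.0 - mu) / sigma
            return 0.5 * (1 + math.erf(z / math.sqrt(2)))

        print(prob_or_below_1(0.77, 0.58, 1.06))  # ACE inhibitor vs control -> ~0.95 (reported 94.9%)
        print(prob_or_below_1(0.76, 0.56, 1.05))  # ARB vs control           -> ~0.96 (reported 95.4%)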

    Prevalence, associated factors and outcomes of pressure injuries in adult intensive care unit patients: the DecubICUs study

    Funder: European Society of Intensive Care Medicine (doi: http://dx.doi.org/10.13039/501100013347). Funder: Flemish Society for Critical Care Nurses. Purpose: Intensive care unit (ICU) patients are particularly susceptible to developing pressure injuries. Epidemiologic data are, however, unavailable. We aimed to provide an international picture of the extent of pressure injuries and factors associated with ICU-acquired pressure injuries in adult ICU patients. Methods: International 1-day point-prevalence study; follow-up for outcome assessment until hospital discharge (maximum 12 weeks). Factors associated with ICU-acquired pressure injury and hospital mortality were assessed by generalised linear mixed-effects regression analysis. Results: Data from 13,254 patients in 1117 ICUs (90 countries) revealed 6747 pressure injuries; 3997 (59.2%) were ICU-acquired. Overall prevalence was 26.6% (95% confidence interval [CI] 25.9–27.3). ICU-acquired prevalence was 16.2% (95% CI 15.6–16.8). Sacrum (37%) and heels (19.5%) were most affected. Factors independently associated with ICU-acquired pressure injuries were older age, male sex, being underweight, emergency surgery, higher Simplified Acute Physiology Score II, lower Braden score, ICU stay of 3 days or more, comorbidities (chronic obstructive pulmonary disease, immunodeficiency), organ support (renal replacement, mechanical ventilation on ICU admission), and being in a low or lower-middle income economy. Gradually increasing associations with mortality were identified for increasing severity of pressure injury: stage I (odds ratio [OR] 1.5; 95% CI 1.2–1.8), stage II (OR 1.6; 95% CI 1.4–1.9), and stage III or worse (OR 2.8; 95% CI 2.3–3.3). Conclusion: Pressure injuries are common in adult ICU patients. ICU-acquired pressure injuries are associated with mainly intrinsic factors and mortality. Optimal care standards, increased awareness, appropriate resource allocation, and further research into optimal prevention are pivotal to tackle this important patient safety threat.
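
    The headline prevalence figures can be sanity-checked against the reported confidence intervals with a simple binomial (Wald) approximation. This is an assumption of the sketch; the study's own intervals come from generalised mixed-effects models, so agreement is only approximate.

        # Wald confidence interval for a proportion, used here as a rough check on the
        # reported prevalence intervals (approximation; not the study's actual method).
        import math

        def wald_ci(p, n, z=1.96):
            se = math.sqrt(p * (1 - p) / n)
            return p - z * se, p + z * se

        n = 13254
        print(wald_ci(0.266, n))  # overall prevalence 26.6% -> roughly (0.258, 0.274); reported 25.9-27.3
        print(wald_ci(0.162, n))  # ICU-acquired 16.2%       -> roughly (0.156, 0.168); reported 15.6-16.8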