
    Too Tall for Guatemala

    I was clearly out of place. I came to the highlands of Guatemala during my fourth year of medical school to study Spanish, work in a rural clinic, and experience a different way of life. For a month, I lived with a Guatemalan family, ate plantains with every meal, and generally tried to immerse myself in the rich Mayan culture surrounding me. Almost a year later, my Spanish is fading fast. The handful of days I spent in the volunteer clinic is a distant memory at this point. Why did I go there again? I was the epitome of an outsider. I came down with Montezuma’s revenge, just like all of my American classmates who traveled there with me. Despite its location in tropical Central America, Guatemala is an exceptionally mountainous country, and despite all the warnings from the program’s director about the cool climate, I severely underpacked. That left me living, eating, and sleeping in my one Patagonia fleece. As a tall American, I towered over the local people and fit very poorly in nearly all things Guatemalan. I rode buses with my knees under my chin and my head on the ceiling.

    Soil and soil breathing remote monitoring: A short review

    The efficiency of agricultural use of soils depends directly on their quality indicators, which span an extended set of characteristics, from data on the environmental situation to the component composition of the soil air. A more complete survey of agricultural land, aimed at determining its quality and subsequent use, therefore requires comprehensive monitoring that studies the characteristics of soils and their air composition simultaneously. This article reviews the literature on remote monitoring of soils and soil air. Particular attention was paid to the relationship between soil type and soil air composition, and it was found that soil air composition (in combination with pH and humidity) can indicate the type, quality and environmental condition of soils. Because soil moisture and soil structure significantly affect the processes occurring in soils, and ultimately the quantitative composition of soil air, the design of a remote soil air monitoring system must account for the dependence of soil air composition on the type and quality of the soil itself and for the influence of moisture, structure and other parameters. The use of sensors was shown to be a promising direction for developing remote monitoring of soils and soil air, and real-time remote monitoring was shown to provide reliable, timely information on the environmental status and quality of soils. Commercial sensors capable of measuring CO2, O2, NOx, CH4, CO, H2 and NH3 were considered, and a technique for sensor signal processing was chosen. A remote monitoring system based on existing commercial sensors was proposed; it can be carried by a quadcopter, allowing parallel scanning of the soils and the land terrain. Such a system would make it possible to correctly assess the readiness of soils for planting, determine their intended use, apply fertilizers correctly, and even predict the yield of certain crops. This approach would thereby give the agro-industrial sector a modern online system for full monitoring of soil and land and for rapid response to changes.
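The kind of multi-gas assessment this review proposes can be illustrated with a minimal sketch. The gas list comes from the abstract, but every threshold, field name, and the `assess_point` function below are hypothetical illustrations, not the reviewed system's actual signal-processing technique.

```python
# Hypothetical sketch: combine multi-gas sensor readings with pH and
# moisture to flag one sampling point, in the spirit of the reviewed
# system. All cutoffs and names are illustrative assumptions.

GASES = ["CO2", "O2", "NOx", "CH4", "CO", "H2", "NH3"]  # gases named in the review

def assess_point(reading):
    """Return a coarse soil-condition label for one sampling point.

    `reading` maps each gas to a concentration in ppm, plus "pH" and
    "moisture" (volumetric %). The rules are placeholder assumptions.
    """
    flags = []
    if reading["CO2"] > 5000:       # elevated soil respiration (assumed cutoff)
        flags.append("high_respiration")
    if reading["O2"] < 100000:      # depleted oxygen, roughly <10% (assumed cutoff)
        flags.append("low_oxygen")
    if not (5.5 <= reading["pH"] <= 7.5):
        flags.append("pH_out_of_range")
    if reading["moisture"] < 10:
        flags.append("dry")
    return flags or ["nominal"]

sample = {g: 0.0 for g in GASES}
sample.update(CO2=6200, O2=95000, pH=6.8, moisture=22)
print(assess_point(sample))  # -> ['high_respiration', 'low_oxygen']
```

In a quadcopter-borne system like the one proposed, a routine of this shape would run per scan point, with the per-gas readings supplied by the commercial sensors.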

    Use of the Spine AdVerse Events Severity (SAVES) System to Categorize and Report Adverse Events in Spine Surgery.

    Introduction: Analysis of adverse events (AEs) in spine surgery has historically been retrospective, utilizing hospital administrative data. Our objective was to determine the incidence, severity and effect on hospital length of stay (LOS) for AEs in spine surgery using the Spine AdVerse Events Severity (SAVES V2) system. Methods: AEs for all surgical spine patients at our institution were prospectively collected for 18 months and correlated with retrospective data from operative reports and H&Ps. Statistical analyses compared patient demographics, diagnoses, and surgical characteristics to hospital length of stay and likelihood of adverse events. Results: This system captured 75% (765/977) of surgical cases for all indications over the study period. 73% (541/743) of patients experienced at least one AE, with an average of 1.2 AEs per patient (range 0-5). The most common AEs were pain control (31%), urinary retention (9.7%), wound infection (6.3%), and incidental durotomy (5.8%). For patients experiencing at least one AE, 30% had no effect on LOS, 48% increased LOS by 1-2 days, 15% increased LOS by 3-7 days, and 7% had prolonged LOS greater than 8 days. Our system captured 25.4% more adverse events (60.0% vs. 34.6%) than hospital administrative data. Univariate analysis revealed patient age, emergent surgery, diagnostic and surgical categories, and spine region to be predictors of both AEs and LOS. Instrumentation was predictive of increased LOS but not AEs. The type of AE was strongly associated with LOS. Multivariable analysis of AE likelihood demonstrated emergent surgery to be the strongest independent predictor, with an adjusted odds ratio of 8.5 versus elective surgery. Discussion: Spine surgery is associated with a high incidence of adverse events, which often prolong hospital length of stay. Better characterization of adverse events and their predictors could lead to improved management strategies that reduce patient morbidity and mortality.
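As a reminder of how an odds ratio like the 8.5 reported above is read, here is a minimal unadjusted calculation on a hypothetical 2x2 table. The counts are invented for illustration and are not the study's data; the paper's 8.5 figure is additionally adjusted via multivariable analysis.

```python
def odds_ratio(exposed_event, exposed_no_event, unexposed_event, unexposed_no_event):
    """Unadjusted odds ratio from a 2x2 table of counts."""
    return (exposed_event / exposed_no_event) / (unexposed_event / unexposed_no_event)

# Hypothetical counts (NOT the study's data): emergent vs elective surgery
# cross-tabulated against "any adverse event".
or_emergent = odds_ratio(90, 10, 300, 285)
print(round(or_emergent, 2))  # -> 8.55
```

An odds ratio well above 1, as here, means the odds of at least one AE are several-fold higher in the emergent group than in the elective group.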

    The hemodynamic tolerability and feasibility of sustained low efficiency dialysis in the management of critically ill patients with acute kidney injury

    <p>Abstract</p> <p>Background</p> <p>Minimization of hemodynamic instability during renal replacement therapy (RRT) in patients with acute kidney injury (AKI) is often challenging. We examined the relative hemodynamic tolerability of sustained low efficiency dialysis (SLED) and continuous renal replacement therapy (CRRT) in critically ill patients with AKI. We also compared the feasibility of SLED administration with that of CRRT and intermittent hemodialysis (IHD).</p> <p>Methods</p> <p>This cohort study encompassed four critical care units within a single university-affiliated medical centre. 77 consecutive critically ill patients with AKI who were treated with CRRT (n = 30), SLED (n = 13) or IHD (n = 34) and completed at least two RRT sessions were included in the study. Overall, 223 RRT sessions were analyzed. Hemodynamic instability during a given session was defined as the composite of a > 20% reduction in mean arterial pressure or any escalation in pressor requirements. Treatment feasibility was evaluated based on the fraction of the prescribed therapy time that was delivered. An interrupted session was designated if < 90% of the prescribed time was administered. Generalized estimating equations were used to compare the hemodynamic tolerability of SLED vs CRRT while accounting for within-patient clustering of repeated sessions and key confounders.</p> <p>Results</p> <p>Hemodynamic instability occurred during 22 (56.4%) SLED and 43 (50.0%) CRRT sessions (p = 0.51). In a multivariable analysis that accounted for clustering of multiple sessions within the same patient, the odds ratio for hemodynamic instability with SLED was 1.20 (95% CI 0.58-2.47), as compared to CRRT. Session interruption occurred in 16 (16.3%), 30 (34.9%) and 11 (28.2%) of IHD, CRRT and SLED therapies, respectively.</p> <p>Conclusions</p> <p>In critically ill patients with AKI, the administration of SLED is feasible and provides comparable hemodynamic control to CRRT.</p>
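The study's two session-level definitions translate directly into code. The sketch below uses the thresholds stated in the abstract (>20% MAP reduction or pressor escalation for instability; <90% of prescribed time for interruption); the function name and the example inputs are illustrative, not study data.

```python
def classify_session(map_start, map_lowest, pressor_escalated,
                     prescribed_min, delivered_min):
    """Label one RRT session using the definitions given in the abstract.

    Hemodynamic instability: >20% fall in mean arterial pressure (MAP),
    or any escalation in pressor requirements.
    Interrupted: <90% of the prescribed therapy time delivered.
    """
    unstable = (map_start - map_lowest) / map_start > 0.20 or pressor_escalated
    interrupted = delivered_min / prescribed_min < 0.90
    return {"unstable": unstable, "interrupted": interrupted}

# Hypothetical session (values are illustrative): MAP falls from 80 to 60
# mmHg, no pressor escalation, 400 of 480 prescribed minutes delivered.
print(classify_session(80, 60, False, 480, 400))
# -> {'unstable': True, 'interrupted': True}
```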

    Risk Factors for SARS Transmission from Patients Requiring Intubation: A Multicentre Investigation in Toronto, Canada

    In the 2003 Toronto SARS outbreak, SARS-CoV was transmitted in hospitals despite adherence to infection control procedures. Considerable controversy resulted regarding which procedures and behaviours were associated with the greatest risk of SARS-CoV transmission. A retrospective cohort study was conducted to identify risk factors for transmission of SARS-CoV during intubation from laboratory confirmed SARS patients to HCWs involved in their care. All SARS patients requiring intubation during the Toronto outbreak were identified. All HCWs who provided care to intubated SARS patients during treatment or transportation and who entered a patient room or had direct patient contact from 24 hours before to 4 hours after intubation were eligible for this study. Data were collected on patients by chart review and on HCWs by interviewer-administered questionnaire. Generalized estimating equation (GEE) logistic regression models and classification and regression trees (CART) were used to identify risk factors for SARS transmission. […] ratio ≤59 (OR = 8.65, p = .001) were associated with increased risk of transmission of SARS-CoV. In CART analyses, the four covariates which explained the greatest amount of variation in SARS-CoV transmission were covariates representing individual patients. Close contact with the airway of severely ill patients and failure of infection control practices to prevent exposure to respiratory secretions were associated with transmission of SARS-CoV. Rates of transmission of SARS-CoV varied widely among patients.

    Epidemiology of influenza-associated hospitalization in adults, Toronto, 2007/8

    The purpose of this investigation was to identify when diagnostic testing and empirical antiviral therapy should be considered for adult patients requiring hospitalization during influenza seasons. During the 2007/8 influenza season, six acute care hospitals in the Greater Toronto Area participated in active surveillance for laboratory-confirmed influenza requiring hospitalization. Nasopharyngeal (NP) swabs were obtained from patients presenting with acute respiratory or cardiac illness, or with febrile illness without clear non-respiratory etiology. Predictors of influenza were analyzed by multivariable logistic regression analysis and likelihoods of influenza infection in various patient groups were calculated. Two hundred and eighty of 3,917 patients were found to have influenza. Thirty-five percent of patients with influenza presented with a triage temperature ≥38.0°C, 80% had respiratory symptoms in the emergency department, and 76% were ≥65 years old. Multivariable analysis revealed a triage temperature ≥38.0°C (odds ratio [OR] 3.1; 95% confidence interval [CI] 2.3–4.1), the presence of respiratory symptoms (OR 1.7; 95% CI 1.2–2.4), admission diagnosis of respiratory infection (OR 1.8; 95% CI 1.3–2.4), admission diagnosis of exacerbation of chronic obstructive pulmonary disease (COPD)/asthma or respiratory failure (OR 2.3; 95% CI 1.6–3.4), and admission in peak influenza weeks (OR 4.2; 95% CI 3.1–5.7) as independent predictors of influenza. The likelihood of influenza exceeded 15% in patients with respiratory infection or exacerbation of COPD/asthma if the triage temperature was ≥38.0°C or if they were admitted in the peak weeks during the influenza season. 
During influenza season, diagnostic testing and empirical antiviral therapy should be considered in patients requiring hospitalization if respiratory infection or exacerbation of COPD/asthma is suspected and if either the triage temperature is ≥38.0°C or admission is during the weeks of peak influenza activity.
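The concluding recommendation is effectively a boolean decision rule, and can be sketched as a predicate. The thresholds come from the abstract; the function and parameter names are illustrative assumptions.

```python
def consider_influenza_testing(suspected_resp_or_copd_asthma_exac,
                               triage_temp_c, admitted_in_peak_weeks):
    """Decision rule paraphrasing the abstract's conclusion.

    During influenza season, testing and empirical antivirals are worth
    considering when a respiratory infection or COPD/asthma exacerbation
    is suspected AND either the triage temperature is >= 38.0 C or the
    admission falls in the weeks of peak influenza activity.
    """
    return bool(suspected_resp_or_copd_asthma_exac
                and (triage_temp_c >= 38.0 or admitted_in_peak_weeks))

# Illustrative patients (not study data):
print(consider_influenza_testing(True, 38.4, False))  # -> True
print(consider_influenza_testing(True, 37.2, False))  # -> False
```

The rule corresponds to the patient groups in which the study found the likelihood of influenza to exceed 15%.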

    Hemodynamic effects of lung recruitment maneuvers in acute respiratory distress syndrome

    Background: Clinical trials have, so far, failed to establish clear beneficial outcomes of recruitment maneuvers (RMs) on patient mortality in acute respiratory distress syndrome (ARDS), and the effects of RMs on the cardiovascular system remain poorly understood. Methods: A computational model with highly integrated pulmonary and cardiovascular systems was configured to replicate static and dynamic cardio-pulmonary data from clinical trials. RMs were executed in 23 individual in silico patients with varying levels of ARDS severity and initial cardiac output. Multiple clinical variables were recorded and analyzed, including arterial oxygenation, cardiac output, peripheral oxygen delivery and alveolar strain. Results: The maximal recruitment strategy (MRS) maneuver, which implements gradual increments of positive end expiratory pressure (PEEP) followed by PEEP titration, produced improvements in PF (PaO2/FiO2) ratio, carbon dioxide elimination and dynamic strain in all 23 in silico patients considered. Reduced cardiac output in the moderate and mild in silico ARDS patients produced significant drops in oxygen delivery during the RM (average decreases of 423 ml/min and 526 ml/min, respectively). In the in silico patients with severe ARDS, however, significantly improved gas exchange led to an average increase of 89 ml/min in oxygen delivery during the RM, despite a simultaneous fall in cardiac output of more than 3 l/min on average. Post-RM increases in oxygen delivery were observed only for the in silico patients with severe ARDS. In patients with high baseline cardiac outputs (>6.5 l/min), oxygen delivery never fell below 700 ml/min. Conclusions: Our results support the hypothesis that patients with severe ARDS and significant numbers of alveolar units available for recruitment may benefit more from RMs. Our results also indicate that a higher than normal initial cardiac output may protect against the potentially negative effects on cardiac function of the high intrathoracic pressures associated with RMs. Results from in silico patients with mild or moderate ARDS suggest that the detrimental effects of RMs on cardiac output can outweigh the positive effects of alveolar recruitment on oxygenation, resulting in overall reductions in tissue oxygen delivery.
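The oxygen-delivery figures quoted above reflect the standard relationship between cardiac output and arterial oxygen content, which explains why a fall in cardiac output and a rise in oxygenation pull delivery in opposite directions. The sketch below uses that textbook formula (DO2 = CO x CaO2 x 10, with CaO2 = 1.34 x Hb x SaO2 + 0.003 x PaO2); the input values are illustrative, not parameters of the authors' model.

```python
def oxygen_delivery(cardiac_output_l_min, hb_g_dl, sao2_frac, pao2_mmhg):
    """Systemic oxygen delivery DO2 in ml O2/min (textbook formula).

    CaO2 (ml O2/dl) = 1.34 * Hb * SaO2 + 0.003 * PaO2, and
    DO2 = cardiac output (l/min) * CaO2 * 10 (dl per litre).
    """
    cao2 = 1.34 * hb_g_dl * sao2_frac + 0.003 * pao2_mmhg
    return cardiac_output_l_min * cao2 * 10

# Illustrative values only: CO 5.0 l/min, Hb 14 g/dl, SaO2 0.97, PaO2 90 mmHg.
print(round(oxygen_delivery(5.0, 14.0, 0.97, 90.0)))  # -> 923
```

Because DO2 scales linearly with cardiac output, a fall of more than 3 l/min during an RM can only be offset if the maneuver raises arterial oxygen content enough, which is the trade-off the in silico results describe.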

    Group B Streptococcus vaccine development: present status and future considerations, with emphasis on perspectives for low and middle income countries.

    Globally, group B Streptococcus (GBS) remains the leading cause of sepsis and meningitis in young infants, with its greatest burden in the first 90 days of life. Intrapartum antibiotic prophylaxis (IAP) for women at risk of transmitting GBS to their newborns has been effective in reducing, but not eliminating, the young infant GBS disease burden in many high income countries. However, identification of women at risk and administration of IAP is very difficult in many low and middle income country (LMIC) settings, and is not possible for home deliveries. Immunization of pregnant women with a GBS vaccine represents an alternate pathway to protecting newborns from GBS disease, through transplacental antibody transfer to the fetus in utero. This approach to preventing GBS disease in young infants is currently under development and is approaching late stage clinical evaluation. This manuscript includes a review of the natural history of the disease, global disease burden estimates, diagnosis and existing control options in different settings, the biological rationale for a vaccine including previous supportive studies, analysis of current candidates in development, possible correlates of protection and current status of immunogenicity assays. Future potential vaccine development pathways to licensure and use in LMICs, trial design and implementation options are discussed, with the objective of providing a basis for reflection, rather than recommendations.

    Nurses' perceptions of aids and obstacles to the provision of optimal end of life care in ICU

    Contains fulltext: 172380.pdf (publisher's version) (Open Access)