69 research outputs found

    Prevalence of cardiovascular and respiratory complications following trauma in patients with obesity

    BACKGROUND: It is generally accepted that obesity puts patients at an increased risk for cardiovascular and respiratory complications after surgical procedures. However, in the setting of trauma, findings have been mixed regarding whether obesity increases the risk of additional complications. OBJECTIVE: The aim of this study was to identify whether patients with obesity are at increased risk of cardiac and respiratory complications following traumatic injury. METHODS: A retrospective analysis of 275,393 patients was conducted using the 2012 National Trauma Data Bank. Hierarchical regression modeling was performed to determine the probability of experiencing a cardiac or respiratory complication. RESULTS: Patients with obesity were at a significantly higher risk of cardiac and respiratory complications compared with patients without obesity [OR: 1.81; CI: 1.72-1.91]. The prevalence of cardiovascular and respiratory complications was 12.6% for patients with obesity compared with 5.2% for non-obese patients. CONCLUSIONS: Obesity is predictive of an increased risk for cardiovascular and respiratory complications following trauma.
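    A quick arithmetic check helps put the adjusted estimate in context: from the reported prevalences alone (12.6% vs. 5.2%), the crude odds ratio works out to roughly 2.6, noticeably larger than the adjusted OR of 1.81, which reflects the covariate adjustment in the hierarchical model. The sketch below reproduces only that back-of-the-envelope calculation and uses nothing beyond the two prevalence figures in the abstract.

```python
# Crude odds ratio implied by the reported prevalences (illustrative only;
# the study's OR of 1.81 is adjusted via hierarchical regression modeling).
p_obese, p_nonobese = 0.126, 0.052

odds_obese = p_obese / (1 - p_obese)           # ~0.144
odds_nonobese = p_nonobese / (1 - p_nonobese)  # ~0.055
crude_or = odds_obese / odds_nonobese          # ~2.6

print(f"Crude OR: {crude_or:.2f} (reported adjusted OR: 1.81)")
```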

    Infectious Complications in Obese Patients Following Trauma

    Background Obesity is a public health concern in the United States due to its increasing prevalence, especially in younger age groups. Trauma is the most common cause of death for people under 40 years of age. The purpose of this study is to determine the association between obesity and specific infectious complications after traumatic injury. Materials and methods A retrospective analysis was conducted using data from the 2012 National Trauma Data Bank, which defines obesity as a body mass index of 30 or greater. Descriptive statistics were calculated and stratified by obesity status. A hierarchical regression model was used to determine the odds of experiencing an infectious complication in patients with obesity while controlling for age, gender, diabetes, number of comorbidities, injury severity, injury mechanism, head injury, and surgical procedure. Results Patients with a body mass index of 30 or greater had increased odds of having an infectious complication compared with nonobese patients (odds ratio: 1.59; CI: 1.49-1.69). In addition to obesity, injury severity score greater than 29, age 40 years or older, diabetes, comorbid conditions, and having a surgical procedure were also predictive of an infectious complication. Conclusions Our results indicate that trauma patients with obesity are nearly 60% more likely to develop an infectious complication in the hospital. Infection prevention and control measures should be implemented soon after hospital arrival for patients with obesity, particularly those with operative trauma.
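    To make the modeling step concrete, here is a minimal sketch of an adjusted logistic regression with the covariates listed above, converting coefficients to odds ratios. The data file and column names are hypothetical, and the study's actual hierarchical model (e.g., with patients clustered within trauma centers) is simplified here to a single-level fit.

```python
# Minimal sketch (hypothetical file and column names; single-level simplification
# of the study's hierarchical model): adjusted odds of an infectious complication.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ntdb_2012_subset.csv")  # hypothetical extract of the 2012 NTDB

model = smf.logit(
    "infectious_complication ~ obese + age_40_plus + female + diabetes"
    " + n_comorbidities + iss_over_29 + penetrating_mechanism + head_injury"
    " + surgical_procedure",
    data=df,
).fit()

# Exponentiate coefficients to report odds ratios with confidence intervals.
ci = model.conf_int()
ors = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_low": np.exp(ci[0]),
    "CI_high": np.exp(ci[1]),
})
print(ors.loc["obese"])  # the study reports OR 1.59 (1.49-1.69) for obesity
```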

    Teacher Perceptions of Skills, Knowledge, and Resources Needed to Promote Social and Emotional Learning in Rural Classrooms

    The incorporation of social and emotional learning (SEL) in schools has been shown to improve the academic and psychological health of students. Research has been limited regarding implementation of SEL programs in rural communities, where student needs are heightened. The current study examined factors that could impact teachers’ intentions to be early adopters of an SEL curriculum in a rural community. Seventy-six teachers provided self-report data regarding perceptions of professional strengths, school climate, school resources for student support, ability to educate diverse students, ability to teach specific SEL domains, and intentions to be an early adopter of an SEL program. Results indicated that positive perceptions of school climate, of one’s ability to teach diverse students, and of one’s ability to teach self-management skills positively predicted intentions to be an early adopter of an SEL curriculum. Implications for rural schools are explored and recommendations for adoption of SEL curricula in rural schools are provided.
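    For readers wanting to see the shape of the prediction analysis, a minimal regression sketch is shown below. The variable names, the CSV file, and the use of ordinary least squares are assumptions for illustration, not the authors' actual analysis code.

```python
# Minimal sketch (assumed variable names and OLS specification): predicting
# teachers' intentions to be early adopters of an SEL curriculum.
import pandas as pd
import statsmodels.formula.api as smf

teachers = pd.read_csv("teacher_survey.csv")  # hypothetical survey export, n = 76

fit = smf.ols(
    "early_adopter_intent ~ school_climate + diverse_student_efficacy"
    " + self_management_teaching_efficacy",
    data=teachers,
).fit()
print(fit.summary())  # positive coefficients would mirror the reported pattern
```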

    Clinical indicators of hemorrhagic shock in pregnancy

    Background Several hemodynamic parameters have been promoted to help establish a rapid diagnosis of hemorrhagic shock, but they have not been well validated in the pregnant population. In this study, we examined the association between three measures of shock and early blood transfusion requirements among pregnant trauma patients. Methods This study included 81 pregnant trauma patients admitted to a level 1 trauma center (2010–2015). In separate logistic regression models, we tested the relationship between the outcome of blood product transfusion within 24 hours of admission and each exposure variable: initial systolic blood pressure (SBP), shock index (SI), and rate over pressure evaluation (ROPE). To test the predictive ability of each measure, we used receiver operating characteristic (ROC) curves. Results A total of 10% of patients in the cohort received blood products. No patients had an initial SBP ≤ 90, so the SBP measure was excluded from analysis. We found that patients with SI > 1 were significantly more likely to receive blood transfusions than patients with SI ≤ 1, whereas ROPE > 3 was not associated with blood transfusion compared with ROPE ≤ 3 (OR 2.92; 95% CI 0.28 to 30.42). Furthermore, comparison of the area under the ROC curve for SI (0.68) and ROPE (0.54) suggested that SI was more predictive of blood transfusion than ROPE. Conclusion We found that an elevated SI was more closely associated with early blood product transfusion than SBP and ROPE in injured pregnant patients. Level of evidence: Prognostic, level II.
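    As a concrete illustration of how these measures are typically computed and compared, the sketch below derives SI (heart rate divided by systolic blood pressure) and ROPE (heart rate divided by pulse pressure) from vital signs and contrasts their discrimination for early transfusion using ROC AUC. The file and column names are hypothetical; this is not the study's code.

```python
# Illustrative sketch (hypothetical column names): shock index (SI) and rate over
# pressure evaluation (ROPE) as predictors of early blood product transfusion.
import pandas as pd
from sklearn.metrics import roc_auc_score

vitals = pd.read_csv("pregnant_trauma_vitals.csv")  # hypothetical registry extract

# SI = heart rate / systolic blood pressure; ROPE = heart rate / pulse pressure.
vitals["si"] = vitals["heart_rate"] / vitals["sbp"]
vitals["rope"] = vitals["heart_rate"] / (vitals["sbp"] - vitals["dbp"])

y = vitals["transfused_24h"]  # 1 if blood products were given within 24 hours
print("SI AUC:", roc_auc_score(y, vitals["si"]))      # study reports ~0.68
print("ROPE AUC:", roc_auc_score(y, vitals["rope"]))  # study reports ~0.54
```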

    Human Adipose Stromal Cells Increase Survival and Mesenteric Perfusion Following Intestinal Ischemia and Reperfusion Injury

    OBJECTIVE: Intestinal ischemia can quickly escalate to bowel necrosis and perforation. Transplantation of stem cells presents a novel treatment modality for this problem. We hypothesized that (1) human adipose-derived stromal cells (hASCs) would increase survival and mesenteric perfusion to a greater degree than differentiated cellular controls following ischemic intestinal injury, and (2) improved outcomes with hASC therapy would be associated with preservation of intestinal histological and tight junction architecture and lower levels of systemic inflammation following intestinal injury. METHODS: hASCs and keratinocytes (differentiated cellular control) were cultured on polystyrene flasks at 37°C in 5% CO2 in air. Adult male C57BL/6J mice were anesthetized and a midline laparotomy was performed. The intestines were eviscerated, the small bowel mesenteric root was identified, and intestinal ischemia was established by temporarily occluding the superior mesenteric artery for 60 min with a noncrushing vascular clamp. Following ischemia, the clamp was removed, and the intestines were returned to the abdominal cavity. Before abdominal closure, 2 million hASCs or keratinocytes in 250 μL of phosphate-buffered saline (carrier for cells and control solution) were infused into the peritoneum. Animals were allowed to recover for 12 or 24 h (perfusion, histology, cytokine, and immunofluorescence studies) or 7 days (survival studies). Intestinal perfusion was assessed by laser Doppler imaging. Intestinal tissue segments were stained with hematoxylin and eosin, as well as antibodies against the tight junction protein claudin-1. Separate aliquots of intestine, liver, and lung tissue were homogenized and assessed for inflammatory cytokines via a multiplex beaded assay. RESULTS: Animals administered hASCs following intestinal ischemia and reperfusion (I/R) injury had significantly greater 7-day survival and better postischemic recovery of mesenteric perfusion compared with vehicle or keratinocyte therapy. hASCs also abated intestinal mucosal destruction, facilitated preservation of intestinal tight junctions, and decreased the systemic inflammatory response to injury. CONCLUSIONS: Human adipose-derived stromal cells improved survival and mesenteric perfusion and attenuated the mucosal damage associated with intestinal I/R injury. hASCs should be considered a plausible cell source for novel cellular treatment plans following intestinal ischemia.

    Development of a Multidisciplinary Program to Expedite Care of Esophageal Emergencies

    Background Level 1 programs have improved outcomes by expediting the multidisciplinary care of critically ill patients. We established a novel level 1 program for the management of esophageal emergencies. Methods After institutional review board approval, we performed a retrospective analysis of patients referred to our level 1 esophageal emergency program from April 2013 through November 2015. A historical comparison group of patients treated for the same diagnoses in the previous 2 years was used. Results Eighty patients were referred and transported an average distance of 56 miles (range, 1–163 miles). Median time from referral to arrival was 2.4 hours (range, 0.4–12.9 hours). Referrals included 6 (7%) patients with esophageal obstruction and 71 (89%) patients with suspected esophageal perforation. Among the patients with suspected esophageal perforation, causes included iatrogenic injury (n = 26), Boerhaave’s syndrome (n = 32), and other (n = 13). Forty-six percent (n = 33) of these patients were referred because of pneumomediastinum, but perforation could not be subsequently demonstrated. Initial management of patients with documented esophageal perforation included operative treatment (n = 25), endoscopic intervention (n = 8), and supportive care (n = 5). Retrospective analysis demonstrated a statistically significant difference in mean Pittsburgh severity index score (PSS) between esophageal perforation treatment groups (p < 0.01). In patients with confirmed perforations, there were 3 (8%) deaths within 30 days. More patients in the esophageal level 1 program were transferred to our institution within 24 hours of diagnosis than in the historical comparison group (p < 0.01). Conclusions Development of an esophageal emergency referral program has facilitated multidisciplinary care at a high-volume institution, and early outcomes appear favorable.

    An Analysis of Collegiate Club-Sport Female Lacrosse Players: Sport-Specific Field Test Performance and the Influence of Lacrosse Stick Carrying

    International Journal of Exercise Science 11(4): 269-280, 2018. Lacrosse is a field-based, intermittent sport that requires players to use a stick with a shaft and mesh pocket to manipulate the ball. However, there has been limited analysis of the characteristics of collegiate club-sport players, and whether stick carry influences the sprinting speed of lacrosse players. As a result, this study investigated the field test characteristics of collegiate club-sport female lacrosse players and the effects of stick carry on linear and change-of-direction speed. Nine players (seven field players, two goalkeepers) volunteered for this study and completed: vertical jump and standing broad jump; 30-meter (m) sprint (0-5, 0-10, and 0-30 m intervals) and modified T-test without and with a stick; and the Yo-Yo Intermittent Recovery Test. Magnitude-based inference analyses via effect sizes (d) compared the field players and goalkeepers. Data were pooled for the 30-m sprint and modified T-test to examine the effect of stick carry via paired samples t-tests (p < 0.05) and effect sizes. Field players performed better in most field tests (d = 0.93-2.45), although goalkeepers generated greater vertical jump power (d = 2.01). With regard to the effects of stick carry, the 0-5 m sprint interval was significantly faster without a stick than with a stick (p = 0.02), but this difference had only a small effect (d = 0.25). There were no differences in the other sprint intervals or the modified T-test (p = 0.08-0.39; d = 0.06-0.19). When contextualized with comparisons to other female collegiate athletes, the results indicated limitations in training exposure for collegiate club-sport lacrosse players. Furthermore, stick carry generally did not affect speed.
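    The with-stick versus without-stick comparison described above boils down to a paired-samples t-test plus a Cohen's d effect size; a minimal sketch is given below. The sprint times are placeholders, not the study's data.

```python
# Sketch of the paired comparison (placeholder 0-5 m sprint times, not the
# study's measurements): without versus with a lacrosse stick, same players.
import numpy as np
from scipy import stats

no_stick = np.array([1.10, 1.15, 1.08, 1.20, 1.12, 1.18, 1.09, 1.14, 1.16])
with_stick = np.array([1.13, 1.17, 1.10, 1.22, 1.15, 1.19, 1.12, 1.16, 1.18])

t_stat, p_value = stats.ttest_rel(no_stick, with_stick)

# Cohen's d from the pooled standard deviation of the two conditions (one common
# convention; other formulations use the standard deviation of the differences).
pooled_sd = np.sqrt((no_stick.std(ddof=1) ** 2 + with_stick.std(ddof=1) ** 2) / 2)
d = (with_stick.mean() - no_stick.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {d:.2f}")
```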

    An investigation of the mechanics and sticking region of a one-repetition maximum close-grip bench press versus the traditional bench press

    The close-grip bench press (CGBP) is a variation of the traditional bench press (TBP) that uses a narrower grip (~95% of biacromial distance (BAD)) and has potential application for athletes performing explosive arm actions from positions where the hands are held close to the torso. Limited research has investigated CGBP mechanics compared to the TBP. Twenty-seven resistance-trained individuals completed a one-repetition maximum TBP and CGBP. The TBP was performed with the preferred grip; the CGBP with a grip width of 95% BAD. A linear position transducer measured lift distance and duration; peak and mean power, velocity, and force; distance and time when peak power occurred; and work. Pre-sticking region (PrSR), sticking region, and post-sticking region distance and duration for each lift were measured. A repeated measures ANOVA was used to assess differences between TBP and CGBP mechanics (p < 0.01); effect sizes (d) were also calculated. A greater load was lifted in the TBP, and thus mean force was greater (d = 0.16–0.17). Peak power and velocity were higher in the CGBP, which had a longer PrSR distance (d = 0.49–1.32). The CGBP could emphasize power for athletes who initiate explosive upper-body actions with the hands positioned close to the torso.
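    The kinetic variables listed above are conventionally derived from the transducer's displacement-time trace by numerical differentiation, roughly as sketched below. The sampling rate, load, and displacement signal are placeholders, and the study's actual processing may differ.

```python
# Rough sketch: deriving bar velocity, force, and power from linear position
# transducer displacement data (placeholder signal, not the study's pipeline).
import numpy as np

fs = 200.0       # sampling frequency in Hz (assumed)
load_kg = 80.0   # external load on the bar in kg (assumed)
g = 9.81         # gravitational acceleration in m/s^2

t = np.arange(0, 1.5, 1 / fs)                            # one concentric lift
displacement = 0.45 * (1 - np.cos(np.pi * t / 1.5)) / 2  # placeholder 0.45 m lift

velocity = np.gradient(displacement, t)   # first derivative of displacement
acceleration = np.gradient(velocity, t)   # second derivative
force = load_kg * (acceleration + g)      # force applied to the barbell
power = force * velocity                  # instantaneous power

print(f"Peak velocity: {velocity.max():.2f} m/s, peak power: {power.max():.0f} W")
```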

    Study Protocol for the COVID-19 Pandemic Adjustment Survey (CPAS): A Longitudinal Study of Australian Parents of a Child 0–18 Years

    Background: The COVID-19 pandemic presents significant risks to the mental health and wellbeing of Australian families. Employment and economic uncertainty, chronic stress, anxiety, and social isolation are likely to have negative impacts on parent mental health, couple and family relationships, and child health and development. Objective: This study aims to: (1) provide timely information on the mental health impacts of the emerging COVID-19 crisis in a close to representative sample of Australian parents and children (0–18 years), (2) identify adults and families most at risk of poor mental health outcomes, and (3) identify factors to target through clinical and public health intervention to reduce risk. Specifically, this study will investigate the extent to which the COVID-19 pandemic is associated with increased risk for parents’ mental health problems, lower well-being, loneliness, and alcohol use; for verbal and physical conflict in parent-parent and parent-child relationships; and for child and adolescent mental health problems. Methods: The study aims to recruit a close to representative sample of at least 2,000 adults aged 18 years and over living in Australia who are parents of a child 0–4 years (early childhood, N = 400), 5–12 years (primary school, N = 800), or 13–18 years (secondary school, N = 800). The design will be a longitudinal cohort study using an online recruitment methodology. Participants will be invited to complete an online baseline self-report survey (20 min) followed by a series of shorter online surveys (10 min) scheduled every 2 weeks for the duration of the COVID-19 pandemic (i.e., estimated to be 14 surveys over 6 months). Results: The study will employ post-stratification weights to address differences between the final sample and the national population in geographic communities across Australia. Associations will be analyzed using multilevel modeling with time-variant and time-invariant predictors of change in trajectory over the testing period. Conclusions: This study will provide timely information on the mental health impacts of the COVID-19 crisis on parents and children in Australia; identify communities, parents, families, and children most at risk of poor outcomes; and identify potential factors to address in clinical and public health interventions to reduce risk.
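    As an illustration of the planned post-stratification weighting, the sketch below computes cell weights as the ratio of a population benchmark share to the sample share. The strata and proportions are placeholders, not the study's actual weighting cells.

```python
# Sketch of post-stratification weighting (placeholder strata and benchmark
# shares; the study's actual weighting cells will differ).
import pandas as pd

sample = pd.DataFrame({
    "state": ["VIC", "VIC", "NSW", "QLD", "NSW", "QLD", "VIC", "NSW"],
})

# Hypothetical population shares of parents by state (should sum to 1).
population_share = {"NSW": 0.40, "VIC": 0.35, "QLD": 0.25}

sample_share = sample["state"].value_counts(normalize=True)
sample["weight"] = sample["state"].map(
    lambda s: population_share[s] / sample_share[s]
)

# Downstream multilevel models of the repeated surveys would apply these weights
# so that estimates better reflect the national population of parents.
print(sample)
```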

    Spatial priorities for conserving the most intact biodiverse forests within Central Africa

    The forests of Central Africa contain some of Earth's few remaining intact forests. These forests are increasingly threatened by infrastructure development, agriculture, and unsustainable extraction of natural resources (e.g. minerals, bushmeat, and timber), all of which are leading to deforestation and forest degradation, particularly defaunation, and hence causing declines in biodiversity and a significant increase in carbon emissions. Given the pervasive nature of these threats, the global importance of Central African forests for biodiversity conservation, and the limited resources for conservation and sustainable management, there is a need to identify where the most important areas are in order to orientate conservation efforts. We developed a novel approach for identifying spatial priorities where conservation efforts can maximize biodiversity benefits within Central Africa's most intact forest areas. We found that the Democratic Republic of Congo has the largest extent of priority areas in the region, containing more than half of them, followed by Gabon, the Republic of Congo, and Cameroon. We compared our approach to one that solely prioritizes forest intactness and one that aims to achieve only biodiversity representation objectives. We found that when priorities are based only on forest intactness (without considering biodiversity representation), there are significantly fewer biodiversity benefits, and vice versa. We therefore recommend multi-objective planning that includes both biodiversity representation and forest intactness to ensure that both objectives are maximized. These results can inform various types of conservation strategies needed within the region, including land-use planning, jurisdictional REDD+ initiatives, performance-related carbon payments, protected area expansion, community forest management, and forest concession plans.
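    The multi-objective idea (biodiversity representation combined with forest intactness) can be illustrated with a toy greedy selection over planning units, sketched below. All data are made up, and real analyses rely on dedicated spatial prioritization software and actual species range and intactness layers; this is only a conceptual sketch.

```python
# Toy sketch of multi-objective prioritization: greedily pick planning units that
# add the most unrepresented species, breaking ties by forest intactness.
# All values are made up; real analyses use dedicated prioritization tools.
planning_units = {
    "pu1": {"species": {"bonobo", "okapi"}, "intactness": 0.90},
    "pu2": {"species": {"forest_elephant"}, "intactness": 0.95},
    "pu3": {"species": {"bonobo", "forest_elephant", "gorilla"}, "intactness": 0.60},
    "pu4": {"species": {"gorilla"}, "intactness": 0.80},
}

budget = 2                      # number of planning units that can be protected
selected, represented = [], set()

for _ in range(budget):
    best = max(
        (pu for pu in planning_units if pu not in selected),
        key=lambda pu: (
            len(planning_units[pu]["species"] - represented),  # new species gained
            planning_units[pu]["intactness"],                  # tie-break: intactness
        ),
    )
    selected.append(best)
    represented |= planning_units[best]["species"]

print("Selected units:", selected)
print("Species represented:", represented)
```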