
    Nepal earthquake 2015: experience of junior clinical year medical students of Patan Academy of Health Sciences

    Dilemmas regarding whether medical students’ participation ‘as doctors’ in disaster response is ethical remain unanswered. Although they prove to be an important addition to the workforce in such settings, their limited competency and the likelihood of harming themselves and patients in the process raise questions. Here we present our views on medical students’ involvement in disaster response based on our experiences at Patan Hospital, Patan Academy of Health Sciences (PAHS), during the Nepal earthquake 2015. Medical students play a crucial role in disaster management; however, they are not proficient in care during mass disasters. Nonetheless, being involved offered students first-hand experience of disaster response and also aided the response by providing extra manpower. With more training, medical students can serve as a skillful workforce during disasters. Regular drills strategically placed in the medical school curriculum can be of immense help in building capacity for medical disaster response. Keywords: disaster response, mass disasters, medical students, Nepal earthquake 2015

    Etiology of Acute Diarrheal Disease and Antimicrobial Susceptibility Pattern in Children Younger Than 5 Years Old in Nepal

    Diarrhea is a common cause of morbidity and mortality among children younger than 5 years in developing countries. Children from 3 to 60 months of age were recruited from two hospitals in Nepal—Bharatpur Hospital, Bharatpur, and Kanti Children’s Hospital, Kathmandu—in 2006 to 2009. Stool specimens collected from 1,200 children with acute diarrhea (cases) and 1,200 children without diarrhea (control subjects) were examined for a broad range of enteropathogens by standard microbiology, including microscopy, enzyme immunoassay for viral pathogens (adenovirus, astrovirus, and rotavirus) and protozoa (Giardia, Cryptosporidium, and Entamoeba histolytica), as well as by reverse transcription real-time polymerase chain reaction for norovirus. Antimicrobial susceptibility testing was performed using the disk diffusion method. Overall, rotavirus (22% versus 2%), norovirus (13% versus 7%), adenovirus (3% versus 0%), Shigella (6% versus 1%), enterotoxigenic Escherichia coli (8% versus 4%), Vibrio (7% versus 0%), and Aeromonas (9% versus 3%) were identified significantly more frequently in cases than in control subjects. Campylobacter, Plesiomonas, Salmonella, and diarrheagenic E. coli (enteropathogenic, enteroinvasive, enteroaggregative) were identified in similar proportions in diarrheal and non-diarrheal stools. Campylobacter was resistant to second-generation quinolone drugs (ciprofloxacin and norfloxacin), whereas Vibrio and Shigella were resistant to nalidixic acid and trimethoprim/sulfamethoxazole. This study documents the important role of rotavirus and norovirus in acute diarrhea in children younger than 5 years, followed by the bacteria Shigella, enterotoxigenic E. coli, Vibrio cholerae, and Aeromonas. Data on the prevalence and epidemiology of enteropathogens identify potential pathogens for public health interventions, whereas pathogen antibiotic resistance pattern data may provide guidance on choice of therapy in clinical settings.

    Clinical effect of obesity on N-terminal pro-B-type natriuretic peptide cut-off concentrations for the diagnosis of acute heart failure

    AIMS Obese patients have lower natriuretic peptide concentrations. We hypothesized that adjusting the concentration of N-terminal pro-B-type natriuretic peptide (NT-proBNP) for obesity could further increase its clinical utility in the early diagnosis of acute heart failure (AHF). METHODS AND RESULTS This hypothesis was tested in a prospective diagnostic study enrolling unselected patients presenting to the emergency department with acute dyspnoea. Two independent cardiologists/internists centrally adjudicated the final diagnosis using all individual patient information including cardiac imaging. NT-proBNP plasma concentrations were applied: first, using currently recommended cut-offs; second, using cut-offs lowered by 33% with body mass index (BMI) of 30-34.9 kg/m² and by 50% with BMI ≥ 35 kg/m². Among 2038 patients, 509 (25%) were obese, of whom 271 (53%) had AHF. The diagnostic accuracy of NT-proBNP as quantified by the area under the receiver-operating characteristic curve was lower in obese versus non-obese patients (0.890 vs. 0.938). For rapid AHF rule-out in obese patients, the currently recommended cut-off of 300 pg/ml achieved a sensitivity of 96.7% (95% confidence interval [CI] 93.8-98.2%), ruling out 29% of patients and missing 9 AHF patients. For rapid AHF rule-in, the age-dependent cut-off concentrations (age 75 years: 1800 pg/ml) achieved a specificity of 84.9% (95% CI 79.8-88.9%). Proportionally lowering the currently recommended cut-offs by BMI increased sensitivity to 98.2% (95% CI 95.8-99.2%), missing 5 AHF patients; it reduced the proportion of AHF patients remaining in the 'gray zone' (48% vs. 26%; p = 0.002) and achieved a specificity of 76.5% (95% CI 70.7-81.4%). CONCLUSIONS Adjusting NT-proBNP concentrations for obesity seems to further increase its clinical utility in the early diagnosis of AHF.
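    The BMI-adjusted rule-out logic described in this abstract can be sketched as a small function. This is an illustrative sketch only, not clinical guidance: the 300 pg/ml base cut-off and the 33%/50% reductions come from the abstract, while the function name and structure are assumptions.

    ```python
    def nt_probnp_rule_out_cutoff(bmi: float) -> float:
        """Return an NT-proBNP rule-out cut-off (pg/ml) for suspected acute
        heart failure, lowering the recommended 300 pg/ml cut-off by 33% for
        BMI 30-34.9 kg/m2 and by 50% for BMI >= 35 kg/m2, as in the study.
        Illustrative sketch only, not clinical guidance."""
        base = 300.0  # currently recommended rule-out cut-off, pg/ml
        if bmi >= 35:
            return base * 0.50   # severe obesity: cut-off halved
        if bmi >= 30:
            return base * (1 - 0.33)  # obesity class I: cut-off lowered by 33%
        return base  # non-obese: cut-off unchanged
    ```

    For example, a patient with a BMI of 32 kg/m² would be assessed against a cut-off of about 201 pg/ml rather than 300 pg/ml.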

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain measured on a numerical analogue scale from 0 to 100% and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468) compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a Bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% Bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively.
Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century

    31st Annual Meeting and Associated Programs of the Society for Immunotherapy of Cancer (SITC 2016): part two

    Background The immunological escape of tumors represents one of the main obstacles to the treatment of malignancies. The blockade of PD-1 or CTLA-4 receptors represented a milestone in the history of immunotherapy. However, immune checkpoint inhibitors seem to be effective in specific cohorts of patients. It has been proposed that their efficacy relies on the presence of an immunological response. Thus, we hypothesized that disruption of the PD-L1/PD-1 axis would synergize with our oncolytic vaccine platform PeptiCRAd. Methods We used murine B16OVA in vivo tumor models and flow cytometry analysis to investigate the immunological background. Results First, we found that high-burden B16OVA tumors were refractory to combination immunotherapy. However, with a more aggressive schedule, tumors with a lower burden were more susceptible to the combination of PeptiCRAd and PD-L1 blockade. The therapy significantly increased the median survival of mice (Fig. 7). Interestingly, the reduced growth of contralaterally injected B16F10 cells suggested the presence of a long-lasting immunological memory also against non-targeted antigens. Concerning the functional state of tumor infiltrating lymphocytes (TILs), we found that all the immune therapies would enhance the percentage of activated (PD-1pos TIM-3neg) T lymphocytes and reduce the amount of exhausted (PD-1pos TIM-3pos) cells compared to placebo. As expected, we found that PeptiCRAd monotherapy could increase the number of antigen-specific CD8+ T cells compared to other treatments. However, only the combination with PD-L1 blockade could significantly increase the ratio between activated and exhausted pentamer positive cells (p = 0.0058), suggesting that by disrupting the PD-1/PD-L1 axis we could decrease the amount of dysfunctional antigen-specific T cells. We observed that the anatomical location deeply influenced the state of CD4+ and CD8+ T lymphocytes. 
    In fact, TIM-3 expression was increased by 2 fold on TILs compared to splenic and lymphoid T cells. In the CD8+ compartment, the expression of PD-1 on the surface seemed to be restricted to the tumor microenvironment, while CD4+ T cells had a high expression of PD-1 also in lymphoid organs. Interestingly, we found that the levels of PD-1 were significantly higher on CD8+ T cells than on CD4+ T cells in the tumor microenvironment (p < 0.0001). Conclusions In conclusion, we demonstrated that the efficacy of immune checkpoint inhibitors might be strongly enhanced by their combination with cancer vaccines. PeptiCRAd was able to increase the number of antigen-specific T cells and PD-L1 blockade prevented their exhaustion, resulting in long-lasting immunological memory and increased median survival.

    stairs and fire


    Discussing environmental education in everyday school life: developing school projects in initial and continuing teacher education

    This study discusses how Environmental Education (EE) has been addressed in primary education at a state school in the municipality of Tangará da Serra/MT, Brazil, and how the school's teachers understand and incorporate EE into everyday school life. To this end, interviews were conducted with the teachers taking part in an interdisciplinary EE project at the school studied. It was found that the school's project has not been achieving its stated objectives because of: teachers' unfamiliarity with the project; deficient teacher training; failure to understand EE as a teaching-learning process; lack of teaching resources; and inadequate planning of activities. Based on these findings, we discuss the impossibility of addressing the topic outside interdisciplinary work, as well as, above all, the importance of deeper study of EE linking theory and practice, both in teacher education and in school projects, in order to move beyond the traditional association of "EE with ecology, waste, and school gardens".