
    Entomological aspects and the role of human behaviour in malaria transmission in a highland region of the Republic of Yemen

    © 2016 Al-Eryani et al. Distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/). The attached file is the published version of the article.

    Genome-Wide Analysis of the Emerging Infection with Mycobacterium avium Subspecies paratuberculosis in the Arabian Camels (Camelus dromedarius)

    Mycobacterium avium subspecies paratuberculosis (M. ap) is the causative agent of paratuberculosis or Johne's disease (JD) in herbivores with potential involvement in cases of Crohn's disease in humans. JD is spread worldwide and is economically important for both beef and dairy industries. Generally, pathogenic ovine strains (M. ap-S) are mainly found in sheep while bovine strains (M. ap-C) infect other ruminants (e.g. cattle, goat, deer), as well as sheep. In an effort to characterize this emerging infection in dromedary/Arabian camels, we successfully cultured M. ap from several samples collected from infected camels suffering from chronic, intermittent diarrhea suggestive of JD. Gene-based typing of isolates indicated that all isolates belong to sheep lineage of strains of M. ap (M. ap-S), suggesting a putative transmission from infected sheep herds. Screening sheep and goat herds associated with camels identified the circulation of this type in sheep but not goats. The current genome-wide analysis recognizes these camel isolates as a sub-lineage of the sheep strain with a significant number of single nucleotide polymorphisms (SNPs) between sheep and camel isolates (∼1000 SNPs). Such polymorphism could represent geographical differences among isolates or host adaptation of M. ap during camel infection. To our knowledge, this is the first attempt to examine the genomic basis of this emerging infection in camels with implications on the evolution of this important pathogen. The sequenced genomes of M. ap isolates from camels will further assist our efforts to understand JD pathogenesis and the dynamic of disease transmission across animal species
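The genome-wide comparison described above ultimately rests on counting single-nucleotide differences between aligned sequences. As a minimal sketch of that arithmetic only (the fragments below are invented for illustration, not real M. ap sequence; real analyses run on whole-genome alignments):

```python
# Minimal sketch: count SNPs between two aligned sequence fragments.
# The toy fragments are hypothetical, not real M. ap genome data.

def count_snps(seq_a: str, seq_b: str) -> int:
    """Count single-nucleotide differences between two aligned,
    equal-length sequences, skipping alignment gaps ('-')."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    return sum(
        1
        for a, b in zip(seq_a, seq_b)
        if a != b and a != "-" and b != "-"
    )

sheep_fragment = "ATGGCGTTAC-GATTACA"
camel_fragment = "ATGGCATTACCGATGACA"
print(count_snps(sheep_fragment, camel_fragment))  # 2 differences
```

Scaled up to whole genomes, the same count over an alignment of sheep and camel isolates is what yields the ~1000 SNPs reported above.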

    Mixed Climatology, Non-synoptic Phenomena and Downburst Wind Loading of Structures

    Modern wind engineering was born in 1961, when Davenport published a paper in which meteorology, micrometeorology, climatology, bluff-body aerodynamics and structural dynamics were embedded within a homogeneous framework for the wind loading of structures, known today as the "Davenport chain". Idealizing the wind as a synoptic extra-tropical cyclone, this model was so simple and elegant as to become a sort of axiom. Between 1976 and 1977, Gomes and Vickery separated thunderstorm from non-thunderstorm winds, determined their disjoint extreme distributions and derived a mixed model later extended to other Aeolian phenomena; this study, a milestone in mixed climatology, showed that a heterogeneous range of events cannot be labelled with the generic term "wind". This paper provides an overview of this matter, with particular regard to the studies conducted at the University of Genova on thunderstorm downbursts.
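The mixed model this line of work introduced is commonly written as a product of the component extreme-value distributions: if thunderstorm and non-thunderstorm annual maxima are statistically independent, the mixed non-exceedance probability is F(v) = F_T(v) * F_NT(v). A hedged sketch of that idea, with Gumbel parameters invented for illustration and not fitted to any real site:

```python
import math

# Sketch of a mixed-climate extreme wind model: thunderstorm and
# non-thunderstorm annual maxima are fitted separately and combined
# as independent mechanisms. Parameters below are invented.

def gumbel_cdf(v, mu, sigma):
    """Annual-maximum non-exceedance probability, Gumbel distribution."""
    return math.exp(-math.exp(-(v - mu) / sigma))

def mixed_cdf(v, params):
    """Mixed distribution: product of component CDFs, assuming the
    wind mechanisms are statistically independent."""
    p = 1.0
    for mu, sigma in params:
        p *= gumbel_cdf(v, mu, sigma)
    return p

# Hypothetical parameters (m/s): synoptic winds, thunderstorm downbursts.
components = [(22.0, 2.5), (18.0, 4.0)]

# 50-year return value: solve F(v) = 1 - 1/50 by bisection.
target = 1.0 - 1.0 / 50.0
lo, hi = 0.0, 100.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if mixed_cdf(mid, components) < target:
        lo = mid
    else:
        hi = mid
print(round(lo, 1))
```

Note that the mixed 50-year value exceeds either component's own 50-year value, which is the practical point of separating the phenomena instead of fitting one distribution to a heterogeneous sample.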

    The effect of type of femoral component fixation on mortality and morbidity after hip hemiarthroplasty: A systematic review and meta-analysis

    Background: Hip hemiarthroplasty is a well-established treatment of displaced femoral neck fracture, although debate exists over whether cemented or uncemented fixation is superior. Uncemented prostheses have typically been used in younger, healthier patients and cemented prostheses in older patients with less-stable bone. Also, earlier research has suggested that bone cement has cytotoxic effects and may trigger cardiovascular and respiratory adverse events. Questions/Purposes: The aim of this systematic review and meta-analysis was to compare morbidity and mortality rates after cemented and uncemented hemiarthroplasty for the treatment of displaced femoral neck fractures in elderly patients. Methods: Using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, we searched seven medical databases for randomized clinical trials and observational studies. We compared cemented and uncemented hemiarthroplasty using the Harris Hip Score (HHS), as well as measures of postoperative pain, mortality, and complications. Data were extracted and pooled as risk ratios or standardized mean difference with their corresponding 95% confidence intervals in a meta-analysis model. Results: The meta-analysis included 34 studies (12 randomized trials and 22 observational studies), with a total of 42,411 patients. In the pooled estimate, cemented hemiarthroplasty was associated with less risk of postoperative pain than uncemented hemiarthroplasty. There were no significant differences between groups regarding HHS or rates of postoperative mortality, pulmonary embolism, cardiac arrest, myocardial infarction, acute cardiac arrhythmia, or deep venous thrombosis. Conclusions: While we found that cemented hemiarthroplasty results in less postoperative pain than uncemented hemiarthroplasty in older patients with femoral neck fracture, the lack of significant differences in functional hip scores, mortality, and complications was surprising. 
Further high-level research is needed.
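For readers unfamiliar with the pooling step mentioned in the Methods, a fixed-effect inverse-variance pooling of risk ratios works roughly as sketched below. The three studies are invented (events/total in each arm) purely to show the arithmetic; they are not data from this review, and the review's actual model choices are not reproduced here.

```python
import math

# Hedged sketch of fixed-effect inverse-variance pooling of risk
# ratios. The 2x2 counts below are invented for illustration.

def risk_ratio(e1, n1, e2, n2):
    """Log risk ratio and its variance for one study (2x2 counts)."""
    log_rr = math.log((e1 / n1) / (e2 / n2))
    var = 1 / e1 - 1 / n1 + 1 / e2 - 1 / n2
    return log_rr, var

# (events arm 1, total arm 1, events arm 2, total arm 2)
studies = [(12, 100, 20, 100), (8, 80, 15, 85), (30, 250, 45, 240)]

weights, weighted = 0.0, 0.0
for e1, n1, e2, n2 in studies:
    log_rr, var = risk_ratio(e1, n1, e2, n2)
    w = 1 / var  # inverse-variance weight
    weights += w
    weighted += w * log_rr

pooled = math.exp(weighted / weights)
half = 1.96 / math.sqrt(weights)          # 95% CI half-width, log scale
ci = (pooled * math.exp(-half), pooled * math.exp(half))
print(round(pooled, 2), tuple(round(c, 2) for c in ci))
```

The pooled estimate and its confidence interval are then reported as in the Results above; standardized mean differences are pooled analogously on their own scale.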

    Myocardial tagging by Cardiovascular Magnetic Resonance: evolution of techniques--pulse sequences, analysis algorithms, and applications

    Cardiovascular magnetic resonance (CMR) tagging has been established as an essential technique for measuring regional myocardial function. It allows quantification of local intramyocardial motion measures, e.g. strain and strain rate. The invention of CMR tagging came in the late eighties, where the technique allowed for the first time for visualizing transmural myocardial movement without having to implant physical markers. This new idea opened the door for a series of developments and improvements that continue up to the present time. Different tagging techniques are currently available that are more extensive, improved, and sophisticated than they were twenty years ago. Each of these techniques has different versions for improved resolution, signal-to-noise ratio (SNR), scan time, anatomical coverage, three-dimensional capability, and image quality. The tagging techniques covered in this article can be broadly divided into two main categories: 1) Basic techniques, which include magnetization saturation, spatial modulation of magnetization (SPAMM), delay alternating with nutations for tailored excitation (DANTE), and complementary SPAMM (CSPAMM); and 2) Advanced techniques, which include harmonic phase (HARP), displacement encoding with stimulated echoes (DENSE), and strain encoding (SENC). Although most of these techniques were developed by separate groups and evolved from different backgrounds, they are in fact closely related to each other, and they can be interpreted from more than one perspective. Some of these techniques even followed parallel paths of developments, as illustrated in the article. As each technique has its own advantages, some efforts have been made to combine different techniques together for improved image quality or composite information acquisition. 
In this review, different developments in pulse sequences and related image processing techniques are described along with the necessities that led to their invention, which makes this article easy to read and the covered techniques easy to follow. Major studies that applied CMR tagging for studying myocardial mechanics are also summarized. Finally, the current article includes a plethora of ideas and techniques with over 300 references that motivate the reader to think about the future of CMR tagging.
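To make the SPAMM/HARP relationship concrete, here is a toy one-dimensional simulation: a sinusoidal tag pattern is imprinted on the magnetization, the "tissue" translates, and the displacement is recovered from the phase of one spectral harmonic. All numbers are invented for illustration; real HARP operates on 2-D k-space data from tagged CMR images.

```python
import numpy as np

# Toy 1-D SPAMM/HARP illustration. A cosine tag pattern (the SPAMM
# modulation) is shifted by a known displacement, and HARP's core
# step - band-pass filtering around one tag harmonic and reading its
# phase - recovers that displacement.

N = 256
x = np.arange(N)
periods = 8                      # tag lines across the field of view
k = 2 * np.pi * periods / N      # tagging spatial frequency

true_shift = 3.2                 # ground-truth displacement (pixels)
reference = np.cos(k * x)                  # tagged "image" at tag time
deformed = np.cos(k * (x - true_shift))    # same pattern after motion

def harmonic(signal):
    """Band-pass filter around the positive tag harmonic and return
    the complex harmonic image."""
    spectrum = np.fft.fft(signal)
    mask = np.zeros(N)
    mask[periods - 2: periods + 3] = 1.0   # keep bins near +k only
    return np.fft.ifft(spectrum * mask)

# The phase difference between reference and deformed harmonic images
# equals k * displacement (modulo 2*pi).
phase_diff = np.angle(harmonic(reference) * np.conj(harmonic(deformed)))
estimated_shift = phase_diff.mean() / k
print(round(estimated_shift, 2))  # ~3.2
```

The same phase-based reasoning, applied per pixel in 2-D, is what lets HARP (and, with displacement-encoded phase, DENSE) turn tagged images into motion and strain maps.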

    Mapping geographical inequalities in oral rehydration therapy coverage in low-income and middle-income countries, 2000-17

    Background Oral rehydration solution (ORS) is a form of oral rehydration therapy (ORT) for diarrhoea that has the potential to drastically reduce child mortality; yet, according to UNICEF estimates, less than half of children younger than 5 years with diarrhoea in low-income and middle-income countries (LMICs) received ORS in 2016. A variety of recommended home fluids (RHF) exist as alternative forms of ORT; however, it is unclear whether RHF prevent child mortality. Previous studies have shown considerable variation between countries in ORS and RHF use, but subnational variation is unknown. This study aims to produce high-resolution geospatial estimates of relative and absolute coverage of ORS, RHF, and ORT (use of either ORS or RHF) in LMICs. Methods We used a Bayesian geostatistical model including 15 spatial covariates and data from 385 household surveys across 94 LMICs to estimate annual proportions of children younger than 5 years of age with diarrhoea who received ORS or RHF (or both) on continuous continent-wide surfaces in 2000-17, and aggregated results to policy-relevant administrative units. Additionally, we analysed geographical inequality in coverage across administrative units and estimated the number of diarrhoeal deaths averted by increased coverage over the study period. Uncertainty in the mean coverage estimates was calculated by taking 250 draws from the posterior joint distribution of the model and creating uncertainty intervals (UIs) with the 2·5th and 97·5th percentiles of those 250 draws. Findings While ORS use among children with diarrhoea increased in some countries from 2000 to 2017, coverage remained below 50% in the majority (62·6%; 12 417 of 19 823) of second administrative-level units and an estimated 6 519 000 children (95% UI 5 254 000-7 733 000) with diarrhoea were not treated with any form of ORT in 2017. 
Increases in ORS use corresponded with declines in RHF in many locations, resulting in relatively constant overall ORT coverage from 2000 to 2017. Although ORS was uniformly distributed subnationally in some countries, within-country geographical inequalities persisted in others; 11 countries had at least a 50% difference in one of their units compared with the country mean. Increases in ORS use over time were correlated with declines in RHF use and in diarrhoeal mortality in many locations, and an estimated 52 230 diarrhoeal deaths (36 910-68 860) were averted by scaling up of ORS coverage between 2000 and 2017. Finally, we identified key subnational areas in Colombia, Nigeria, and Sudan as examples of where diarrhoeal mortality remains higher than average, while ORS coverage remains lower than average. Interpretation To our knowledge, this study is the first to produce and map subnational estimates of ORS, RHF, and ORT coverage and attributable child diarrhoeal deaths across LMICs from 2000 to 2017, allowing for tracking progress over time. Our novel results, combined with detailed subnational estimates of diarrhoeal morbidity and mortality, can support subnational needs assessments aimed at furthering policy makers' understanding of within-country disparities. Over 50 years after the discovery that led to this simple, cheap, and life-saving therapy, large gains in reducing mortality could still be made by reducing geographical inequalities in ORS coverage. Copyright (c) 2020 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license.
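The uncertainty-interval construction described in the Methods (percentiles of posterior draws) can be sketched as follows; the "posterior" here is simulated noise standing in for 250 draws of ORS coverage in one hypothetical district, not output from the paper's actual model.

```python
import numpy as np

# Sketch of forming a 95% uncertainty interval (UI) from posterior
# draws, as in the Methods above. The draws are simulated stand-ins.

rng = np.random.default_rng(seed=1)

# Hypothetical: 250 posterior draws of coverage in one admin unit.
draws = rng.normal(loc=0.42, scale=0.05, size=250)

mean_coverage = draws.mean()
lower, upper = np.percentile(draws, [2.5, 97.5])  # 95% UI bounds

print(f"coverage {mean_coverage:.2f} (95% UI {lower:.2f}-{upper:.2f})")
```

In the study, this computation is repeated for every pixel and administrative unit, which is why coverage maps can carry a UI alongside each mean estimate.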

    The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study

    AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after treatment decision, in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only. The impact of longer delays was explored in a sensitivity analysis. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male, and more comorbid, and to have a higher body mass index, rectal cancer, and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P = 0.224), a finding that was consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P = 0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely to be due to micro-metastatic disease.
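As a quick arithmetic check, the unadjusted odds ratio implied by the reported resection rates (93.7% vs 91.9%) can be computed directly from the proportions; it differs from the adjusted OR of 1.18 in the abstract, which controls for patient and disease factors.

```python
# Unadjusted odds ratio for complete resection, delayed vs
# non-delayed, from the proportions reported in the abstract.

def odds(p):
    """Convert a proportion to odds."""
    return p / (1.0 - p)

p_delayed, p_not_delayed = 0.937, 0.919
unadjusted_or = odds(p_delayed) / odds(p_not_delayed)
print(round(unadjusted_or, 2))  # 1.31
```

The gap between this crude value and the adjusted OR reflects the case-mix differences noted above (delayed patients were older, more comorbid, and had more early-stage rectal disease).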

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
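The middle- vs high-HDI comparison can be sanity-checked from the counts reported above (2455 of 2741 in high-HDI countries, 753 of 1242 in middle-HDI countries); the crude odds ratio lands close to the adjusted OR of 0.17 in the abstract, which additionally controls for patient and disease factors.

```python
# Crude odds ratio of checklist use, middle- vs high-HDI countries,
# from the counts reported in the abstract.

def odds_from_counts(events, total):
    """Odds of an event from event count and group size."""
    return events / (total - events)

or_middle_vs_high = (
    odds_from_counts(753, 1242) / odds_from_counts(2455, 2741)
)
print(round(or_middle_vs_high, 2))  # 0.18
```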