
    The impact of age on predictive performance of national early warning score at arrival to emergency departments: development and external validation

    Study objective: To investigate how age affects the predictive performance of the National Early Warning Score (NEWS) at arrival to the emergency department (ED) with respect to in-hospital mortality and intensive care admission.

    Methods: International multicenter retrospective cohorts from 2 Danish and 3 Dutch EDs. Development cohort: 14,809 Danish patients aged ≥ 18 years with at least systolic blood pressure or pulse measured, from the Danish Multicenter Cohort. External validation cohort: 50,448 Dutch patients aged >18 years with all vital signs measured, from the Netherlands Emergency Department Evaluation Database (NEED). Multivariable logistic regression was used for model building. Performance was evaluated overall and within age categories: 18 to 64 years, 65 to 80 years, and more than 80 years.

    Results: In the Danish Multicenter Cohort, 2.5% of patients died in hospital and 2.8% were admitted to the ICU, compared with 2.8% and 1.6%, respectively, in the NEED. Age did not add information for the prediction of intensive care admission but was the strongest predictor of in-hospital mortality. For NEWS alone, severe underestimation of risk was observed for persons above 80, while the overall area under the receiver operating characteristic curve (AUROC) was 0.82 (confidence interval [CI] 0.80 to 0.84) in the Danish Multicenter Cohort versus 0.75 (CI 0.75 to 0.77) in the NEED. When NEWS was combined with age, the underestimation of risk for persons above 80 was eliminated, and overall AUROC increased significantly to 0.86 (CI 0.85 to 0.88) in the Danish Multicenter Cohort versus 0.82 (CI 0.81 to 0.83) in the NEED.

    Conclusion: Combining NEWS with age improved predictive performance for in-hospital mortality, mostly for persons aged above 80, and can potentially improve decision policies at arrival to EDs.
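The modelling step the abstract describes can be sketched on synthetic data. This is an illustrative reconstruction, not the authors' code: the cohort size, coefficients and outcome rates are assumptions, and the combined score uses the (here known) simulation coefficients where the study would use fitted logistic regression coefficients. AUROC is computed with a simple pairwise (Mann-Whitney) estimator.

```python
import numpy as np

def auroc(score, y):
    """Pairwise (Mann-Whitney) estimate of the area under the ROC curve."""
    pos, neg = score[y == 1], score[y == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

rng = np.random.default_rng(0)
n = 5000
news = rng.integers(0, 15, n)      # NEWS at ED arrival (synthetic)
age = rng.uniform(18, 95, n)       # age in years (synthetic)

# Simulate in-hospital mortality where both NEWS and age carry signal
logit = -7.0 + 0.25 * news + 0.04 * age
died = rng.binomial(1, 1 / (1 + np.exp(-logit)))

auc_news = auroc(news.astype(float), died)        # NEWS alone
auc_both = auroc(0.25 * news + 0.04 * age, died)  # NEWS combined with age
print(f"AUROC, NEWS alone: {auc_news:.2f}; NEWS + age: {auc_both:.2f}")
```

Because age carries independent signal in the simulated outcome, the combined score discriminates better than NEWS alone, mirroring the AUROC gain the abstract reports.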

    The burden of neglected tropical diseases in Ethiopia, and opportunities for integrated control and elimination

    Background: Neglected tropical diseases (NTDs) are a group of chronic parasitic diseases and related conditions that are the most common diseases among the 2.7 billion people globally living on less than US$2 per day. In response to the growing challenge of NTDs, Ethiopia is preparing to launch an NTD Master Plan. The purpose of this review is to underscore the burden of NTDs in Ethiopia, highlight the state of current interventions, and suggest ways forward.

    Results: This review indicates that NTDs are significant public health problems in Ethiopia. From the analysis reported here, Ethiopia has the largest number of NTD cases after Nigeria and the Democratic Republic of Congo. Ethiopia is estimated to have the highest burden of trachoma, podoconiosis and cutaneous leishmaniasis in sub-Saharan Africa (SSA), the second highest burden of ascariasis, leprosy and visceral leishmaniasis, and the third highest burden of hookworm. Infections such as schistosomiasis, trichuriasis, lymphatic filariasis and rabies are also common. A third of Ethiopians are infected with ascariasis, one quarter are infected with trichuriasis, and one in eight Ethiopians lives with hookworm or is infected with trachoma. However, despite these high burdens of infection, the control of most NTDs in Ethiopia is in its infancy. In terms of NTD control achievements, Ethiopia reached the leprosy elimination target of 1 case per 10,000 population in 1999. No cases of human African trypanosomiasis have been reported since 1984. Guinea worm eradication is in its final phase. The Onchocerciasis Control Program has been making steady progress since 2001. A national blindness survey was conducted in 2006 and the trachoma program has kicked off in some regions. Lymphatic filariasis, podoconiosis and rabies mapping are underway.

    Conclusion: Ethiopia bears a significant burden of NTDs compared to other SSA countries. To achieve success in integrated control of NTDs, integrated mapping, rapid scale-up of interventions and operational research into co-implementation of intervention packages will be crucial.

    High Prevalence of Malaria in Zambezia, Mozambique: The Protective Effect of IRS versus Increased Risks Due to Pig-Keeping and House Construction

    BACKGROUND: African countries are scaling up malaria interventions, especially insecticide-treated nets (ITNs) and indoor residual spraying (IRS), for which ambitious coverage targets have been set. In spite of these efforts, infection prevalence remains high in many parts of the continent. This study investigated risk factors for malaria infection in children using three malaria indicator surveys from Zambezia province, Mozambique. The impact of IRS and ITNs, the effects of keeping farm animals and of the construction material of house roofs, and other potential risk factors associated with malaria infection in children were assessed.

    METHODS: Cross-sectional community-based surveys were conducted in October of 2006, 2007 and 2008. A total of 8338 children (ages 1-15 years) from 2748 households were included in the study. All children were screened for malaria by rapid diagnostic tests. Caregiver interviews were used to assess household demographic and wealth characteristics and ITN and IRS coverage. Associations between malaria infection, vector control interventions and potential risk factors were assessed.

    RESULTS: Overall, the prevalence of malaria infection was 47.8% (95% CI: 38.7%-57.1%) in children 1-15 years of age; less than a quarter of children (23.1%, 95% CI: 19.1%-27.6%) were sleeping under an ITN, and almost two thirds were living in IRS-treated houses (coverage 65.4%, 95% CI: 51.5%-77.0%). Factors independently associated with protection from malaria infection were: sleeping in an IRS-treated house without sleeping under an ITN (odds ratio (OR) = 0.6; 95% CI: 0.4-0.9) and additional protection from sleeping under an ITN in an IRS-treated house (OR = 0.5; 95% CI: 0.3-0.7), both versus sleeping in an unsprayed house without an ITN; and parental education (primary/secondary: OR = 0.6; 95% CI: 0.5-0.7) versus parents with no education. Increased risk of infection was associated with: current fever (OR = 1.2; 95% CI: 1.0-1.5) versus no fever; pig keeping (OR = 3.2; 95% CI: 2.1-4.9) versus not keeping pigs; living in houses with a grass roof (OR = 1.7; 95% CI: 1.3-2.4) versus other roofing materials; and larger household size (8-15 people: OR = 1.6; 95% CI: 1.3-2.1) versus small households (1-4 persons).

    CONCLUSION: Malaria infection among children under 15 years of age in Zambezia remained high, but conventional malaria vector control methods, in particular IRS, provided effective protection. Household ownership of farm animals, particularly pigs, and living in houses with a grass roof were independently associated with increased risk of infection, even after allowing for household wealth. To reduce the burden of malaria, national control programs need to ensure high coverage of effective IRS and promote the use of ITNs, particularly in households with elevated risks of infection, such as those keeping farm animals and those with grass roofs.

    Comparing the effects of sun exposure and vitamin D supplementation on vitamin D insufficiency, and immune and cardio-metabolic function: The Sun Exposure and Vitamin D Supplementation (SEDS) Study

    Background: Adults living in the sunny Australian climate are at high risk of skin cancer, but vitamin D deficiency (defined here as a serum 25-hydroxyvitamin D (25(OH)D) concentration of less than 50 nmol/L) is also common. Vitamin D deficiency may be a risk factor for a range of diseases. However, the optimal strategies to achieve and maintain vitamin D adequacy (sun exposure, vitamin D supplementation or both), and whether sun exposure itself has benefits over and above initiating synthesis of vitamin D, remain unclear. The Sun Exposure and Vitamin D Supplementation (SEDS) Study aims to compare the effectiveness of sun exposure and vitamin D supplementation for the management of vitamin D insufficiency, and to test whether these management strategies differentially affect markers of immune and cardio-metabolic function.

    Methods/Design: The SEDS Study is a multi-centre, randomised controlled trial of two different daily doses of vitamin D supplementation, and placebo, in conjunction with guidance on two different patterns of sun exposure. Participants recruited from across Australia are aged 18-64 years and have a recent vitamin D test result showing a serum 25(OH)D level of 40-60 nmol/L.

    Discussion: This paper discusses the rationale behind the study design and considers the challenges, and the necessity, of data collection within a non-institutionalised adult population in order to address the study aims. We also discuss the challenges of participant recruitment and retention, ongoing engagement of referring medical practitioners, and issues of compliance.

    Trial registration: Australia New Zealand Clinical Trials Registry: ACTRN12613000290796. Registered 14 March 2013.

    Real-life use of vitamin D3-fortified bread and milk during a winter season: the effects of CYP2R1 and GC genes on 25-hydroxyvitamin D concentrations in Danish families, the VitmaD study.

    Common genetic variants rs10741657 and rs10766197 in CYP2R1 and rs4588 and rs842999 in GC, and a combined genetic risk score (GRS) of these four variants, influence late summer 25-hydroxyvitamin D (25(OH)D) concentrations. The objectives were to identify those who are most at risk of developing low vitamin D status during winter and to assess whether vitamin D3-fortified bread and milk will increase 25(OH)D concentrations in those with genetically determined low 25(OH)D concentrations at late summer. We used data from the VitmaD study. Participants were allocated to either vitamin D3-fortified bread and milk or non-fortified bread and milk during winter. In the fortification group, CYP2R1 (rs10741657) and GC (rs4588 and rs842999) were statistically significantly associated with winter 25(OH)D concentrations, and CYP2R1 (rs10766197) was borderline significant. There was a negative linear trend between 25(OH)D concentrations and carriage of 0-8 risk alleles (p < 0.0001). No association was found for the control group (p = 0.1428). There was a significant positive linear relationship between different quintiles of total vitamin D intake and the increase in 25(OH)D concentrations among carriers of 0-2 (p = 0.0012), 3 (p = 0.0001), 4 (p = 0.0118) or 5 (p = 0.0029) risk alleles, but not among carriers of 6-8 risk alleles (p = 0.1051). Carriers of a high GRS were more prone to be vitamin D deficient than carriers of a low GRS. Furthermore, rs4588-AA carriers had a low but very stable 25(OH)D concentration and, interestingly, also a low PTH level. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s12263-014-0413-7) contains supplementary material, which is available to authorized users.
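The GRS used above is simply a count of risk alleles across the four variants: a person carries 0, 1 or 2 risk alleles per variant, giving a score of 0-8. A minimal sketch (the genotype below is hypothetical):

```python
# Risk-allele counts (0, 1 or 2) at each of the four variants
# for one hypothetical person
genotype = {"rs10741657": 1, "rs10766197": 2, "rs4588": 0, "rs842999": 1}

# GRS = total number of risk alleles carried, ranging 0-8
grs = sum(genotype.values())
print(grs)  # 4
```

Under the abstract's finding, a higher count places a person further down the negative linear trend in winter 25(OH)D.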

    The argument for integrating vector control with multiple drug administration campaigns to ensure elimination of lymphatic filariasis

    BACKGROUND: There is a danger that mass drug administration campaigns may fail to maintain adequate treatment coverage to achieve lymphatic filariasis elimination. Hence, additional measures to suppress transmission might be needed to ensure the success of the Global Program for the Elimination of Lymphatic Filariasis.

    DISCUSSION: Vector control successfully eliminated lymphatic filariasis when implemented alone or with mass drug administration. Challenges to lymphatic filariasis elimination include uncertainty about the exact level and duration of microfilarial suppression required for elimination, the mobility of infected individuals, consistent non-participation of some infected individuals in mass drug administration, the possible development of anti-filarial drug resistance, and treatment strategies in areas co-endemic with loiasis. Integration of vector control with mass drug administration can address some of these challenges. The potential benefits of vector control would include: (1) the ability to suppress filariasis transmission without the need to identify all individual 'foci of infection'; (2) minimizing the risk of reestablishment of transmission from imported microfilaria-positive individuals; and (3) decreasing the risk of dengue or malaria transmission where, respectively, Aedes or Anopheles are lymphatic filariasis vectors.

    SUMMARY: With adequate sustained treatment coverage, mass drug administration should meet the criteria for elimination of lymphatic filariasis. However, it may be difficult to sustain sufficiently high mass drug administration coverage to achieve lymphatic filariasis elimination in some areas, particularly where Aedes species are the vectors. Since vector control has been effective in controlling and even eliminating lymphatic filariasis transmission, integration of vector control with mass drug administration will ensure the sustainability of transmission suppression and thereby better ensure the success of national filariasis elimination programs. Although trials of some vector control interventions are needed, proven vector control strategies are ready for immediate integration with mass drug administration for many important vectors. Vector control is the only presently available additional lymphatic filariasis control measure with the potential for immediate implementation.

    Spatial Analysis of Land Cover Determinants of Malaria Incidence in the Ashanti Region, Ghana

    Malaria is among the infectious diseases with the highest morbidity and mortality worldwide. As a vector-borne disease, malaria distribution is strongly influenced by environmental factors. The aim of this study was to investigate the association between malaria risk and different land cover classes by using high-resolution multispectral Ikonos images and Poisson regression analyses. The association of malaria incidence with land cover around 12 villages in the Ashanti Region, Ghana, was assessed in 1,988 children <15 years of age. The median malaria incidence was 85.7 per 1,000 inhabitants per year (range 28.4-272.7). Swampy areas and banana/plantain production in the proximity of villages were strong predictors of a high malaria incidence. An increase of 10% in swampy area coverage within a 2 km radius around a village led to a 43% higher incidence (relative risk [RR] = 1.43, p < 0.001). Each 10% increase of area with banana/plantain production around a village tripled the risk for malaria (RR = 3.25, p < 0.001). An increase in forested area of 10% was associated with a 47% decrease in malaria incidence (RR = 0.53, p = 0.029).
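The reported relative risks relate to the Poisson regression coefficients in a simple way: if β is the log-rate change per one percentage point of a land cover class, the RR for a 10-point increase is exp(10·β). A small sketch using the RRs above (the per-point coefficients are back-calculated for illustration, not taken from the paper):

```python
import math

def rr_per_10pct(beta_per_pct):
    # RR for a 10 percentage-point increase in a land cover class
    return math.exp(10 * beta_per_pct)

# Back out per-point log-rate coefficients from the reported RRs
beta_swamp  = math.log(1.43) / 10   # swampy area
beta_banana = math.log(3.25) / 10   # banana/plantain production
beta_forest = math.log(0.53) / 10   # forested area (protective)

print(round(rr_per_10pct(beta_swamp), 2))   # 1.43
print(round(rr_per_10pct(beta_forest), 2))  # 0.53
```

The multiplicative scale explains why banana/plantain cover "triples" the risk: RR = 3.25 means the incidence rate is multiplied by 3.25 for each 10-point increase in that cover class.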

    Malaria in Africa: Vector Species' Niche Models and Relative Risk Maps

    A central theoretical goal of epidemiology is the construction of spatial models of disease prevalence and risk, including maps for the potential spread of infectious disease. We provide three continent-wide maps representing the relative risk of malaria in Africa based on ecological niche models of vector species and risk analysis at a spatial resolution of 1 arc-minute (9 185 275 cells of approximately 4 sq km). Using a maximum entropy method, we construct niche models for 10 malaria vector species based on species occurrence records since 1980, 19 climatic variables, altitude, and land cover data (in 14 classes). For seven vectors (Anopheles coustani, A. funestus, A. melas, A. merus, A. moucheti, A. nili, and A. paludis) these are the first published niche models. We predict that Central Africa has poor habitat for both A. arabiensis and A. gambiae, and that A. quadriannulatus and A. arabiensis have restricted habitats in Southern Africa, as claimed by field experts in criticism of previous models. The results of the niche models are incorporated into three relative risk models which assume different ecological interactions between vector species. The "additive" model assumes no interaction; the "minimax" model assumes maximum relative risk due to any vector in a cell; and the "competitive exclusion" model assumes the relative risk that arises from the most suitable vector for a cell. All models include variable anthropophilicity of vectors and spatial variation in human population density. Relative risk maps are produced from these models. All models predict that human population density is the critical factor determining malaria risk. Our method of constructing relative risk maps is general. We discuss the limits of the relative risk maps reported here, and the additional data that are required for their improvement. The protocol developed here can be used for any other vector-borne disease.
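The three combination rules can be made concrete with a small sketch. This is our reading of the abstract, not the authors' code; the suitability values, anthropophilicity weights and population densities below are made up.

```python
import numpy as np

# suit[v, c]: niche-model habitat suitability of vector v in cell c (made up)
suit = np.array([[0.5, 0.8, 0.5],
                 [0.6, 0.1, 0.5]])
w = np.array([0.9, 0.4])             # anthropophilicity weight per vector (made up)
pop = np.array([10.0, 200.0, 50.0])  # human population density per cell (made up)

weighted = w[:, None] * suit         # suitability scaled by anthropophilicity

# "Additive": no interaction between vectors -- their risks sum
additive = weighted.sum(axis=0) * pop

# "Minimax": risk in a cell is driven by its single riskiest vector
minimax = weighted.max(axis=0) * pop

# "Competitive exclusion": only the most *suitable* vector occupies each cell
best = suit.argmax(axis=0)
exclusion = weighted[best, np.arange(suit.shape[1])] * pop

print(additive, minimax, exclusion)
```

Note how in the first cell the minimax and competitive-exclusion rules diverge: the riskiest weighted vector (high anthropophilicity, slightly lower suitability) is not the vector that wins the cell under competitive exclusion.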