
    Safety and immunogenicity of rVSVΔG-ZEBOV-GP Ebola vaccine in adults and children in Lambaréné, Gabon: A phase I randomised trial.

    BACKGROUND: The rVSVΔG-ZEBOV-GP vaccine prevented Ebola virus disease when used at 2 × 10⁷ plaque-forming units (PFU) in a trial in Guinea. This study provides further safety and immunogenicity data. METHODS AND FINDINGS: A randomised, open-label phase I trial in Lambaréné, Gabon, studied 5 single intramuscular vaccine doses of 3 × 10³, 3 × 10⁴, 3 × 10⁵, 3 × 10⁶, or 2 × 10⁷ PFU in 115 adults and a dose of 2 × 10⁷ PFU in 20 adolescents and 20 children. The primary objective was safety and tolerability 28 days post-injection. Immunogenicity, viraemia, and shedding post-vaccination were evaluated as secondary objectives. In adults, mild-to-moderate adverse events were frequent, but there were no serious or severe adverse events related to vaccination. Before vaccination, Zaire Ebola virus (ZEBOV)-glycoprotein (GP)-specific and ZEBOV antibodies were detected in 11% and 27% of adults, respectively. In adults, 74%-100% of individuals who received a dose of 3 × 10⁴, 3 × 10⁵, 3 × 10⁶, or 2 × 10⁷ PFU had a ≥4.0-fold increase in geometric mean titres (GMTs) of ZEBOV-GP-specific antibodies at day 28, reaching GMTs of 489 (95% CI: 264-908), 556 (95% CI: 280-1,101), 1,245 (95% CI: 899-1,724), and 1,503 (95% CI: 931-2,426), respectively. Twenty-two percent of adults had a ≥4-fold increase of ZEBOV antibodies, with GMTs at day 28 of 1,015 (647-1,591), 1,887 (1,154-3,085), 1,445 (1,013-2,062), and 3,958 (2,249-6,967) for the same doses, respectively. These antibodies persisted up to day 180 for doses ≥3 × 10⁵ PFU. Adults with antibodies before vaccination had higher GMTs throughout. Neutralising antibodies were detected in more than 50% of participants at doses ≥3 × 10⁵ PFU. As in adults, no serious or severe adverse events related to the vaccine occurred in adolescents or children. At day 2, vaccine RNA titres were higher for adolescents and children than for adults. At day 7, 78% of adolescents and 35% of children had recombinant vesicular stomatitis virus RNA detectable in saliva. The vaccine induced high GMTs of ZEBOV-GP-specific antibodies at day 28 in adolescents, 1,428 (95% CI: 1,025-1,989), and children, 1,620 (95% CI: 806-3,259), and in both groups antibody titres increased up to day 180. The absence of a control group, lack of stratification for baseline antibody status, and imbalances in the male/female ratio are the main limitations of this study. CONCLUSIONS: Our data confirm the acceptable safety and immunogenicity profile of the 2 × 10⁷ PFU dose in adults and support consideration of lower doses for paediatric populations and for those who request boosting. TRIAL REGISTRATION: Pan African Clinical Trials Registry PACTR201411000919191.
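
    The immunogenicity results above are reported as geometric mean titres (GMTs) with 95% confidence intervals and as the proportion of participants with a ≥4-fold rise from baseline. A minimal sketch of how such summaries are conventionally computed on the log scale; the titre values and function names are illustrative, not trial data or the trial's actual analysis code:

```python
import numpy as np
from scipy import stats

def gmt_with_ci(titres, confidence=0.95):
    """Geometric mean titre with a t-based confidence interval on the log scale."""
    logs = np.log(np.asarray(titres, dtype=float))
    mean, sem = logs.mean(), stats.sem(logs)
    half_width = sem * stats.t.ppf((1 + confidence) / 2, len(logs) - 1)
    return np.exp(mean), (np.exp(mean - half_width), np.exp(mean + half_width))

def seroresponse_rate(baseline, day28, fold=4.0):
    """Proportion of participants with at least a `fold` increase over baseline."""
    rises = np.asarray(day28, dtype=float) / np.asarray(baseline, dtype=float)
    return float(np.mean(rises >= fold))

# Illustrative titres for six hypothetical participants (not trial data).
baseline = [10, 10, 20, 10, 40, 10]
day28 = [320, 640, 1280, 160, 2560, 80]
print(gmt_with_ci(day28))                  # GMT and 95% CI at day 28
print(seroresponse_rate(baseline, day28))  # fraction with a >=4-fold rise
```

    Working on the log scale makes the interval multiplicative around the GMT, which is why the intervals quoted in the abstract are asymmetric (for example, 1,503 with 95% CI 931-2,426).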

    Neurobiology of rodent self-grooming and its value for translational neuroscience

    Self-grooming is a complex innate behaviour with an evolutionarily conserved sequencing pattern and is one of the most frequently performed behavioural activities in rodents. In this Review, we discuss the neurobiology of rodent self-grooming, and we highlight studies of rodent models of neuropsychiatric disorders-including models of autism spectrum disorder and obsessive compulsive disorder-that have assessed self-grooming phenotypes. We suggest that rodent self-grooming may be a useful measure of repetitive behaviour in such models, and therefore of value to translational psychiatry. Assessment of rodent self-grooming may also be useful for understanding the neural circuits that are involved in complex sequential patterns of action. National Institutes of Health (U.S.) (Grant NS025529); National Institutes of Health (U.S.) (Grant HD028341); National Institutes of Health (U.S.) (Grant MH060379).

    The global distribution of lymphatic filariasis, 2000–18: a geospatial analysis

    Background Lymphatic filariasis is a neglected tropical disease that can cause permanent disability through disruption of the lymphatic system. This disease is caused by parasitic filarial worms that are transmitted by mosquitoes. Mass drug administration (MDA) of anthelmintics is recommended by WHO to eliminate lymphatic filariasis as a public health problem. This study aims to produce the first geospatial estimates of the global prevalence of lymphatic filariasis infection over time, to quantify progress towards elimination, and to identify geographical variation in the distribution of infection. Methods A global dataset of georeferenced surveyed locations was used to model annual 2000–18 lymphatic filariasis prevalence for 73 current or previously endemic countries. We applied Bayesian model-based geostatistics and time series methods to generate spatially continuous estimates of global all-age 2000–18 prevalence of lymphatic filariasis infection mapped at a resolution of 5 km² and aggregated to estimate the total number of individuals infected. Findings We used 14 927 datapoints to fit the geospatial models. An estimated 199 million total individuals (95% uncertainty interval 174–234 million) worldwide were infected with lymphatic filariasis in 2000, with totals for WHO regions ranging from 3·1 million (1·6–5·7 million) in the region of the Americas to 107 million (91–134 million) in the South-East Asia region. By 2018, an estimated 51 million individuals (43–63 million) were infected. Broad declines in prevalence are observed globally, but focal areas in Africa and southeast Asia remain less likely to have attained the infection prevalence thresholds proposed to achieve local elimination. Interpretation Although the prevalence of lymphatic filariasis infection has declined since 2000, MDA is still necessary across large populations in Africa and Asia. Our mapped estimates can be used to identify areas where the probability of meeting infection thresholds is low and, when coupled with large uncertainty in the predictions, indicate that additional data collection or intervention might be warranted before MDA programmes cease.
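
    The headline numbers above (for example, 199 million infected in 2000 with a 95% uncertainty interval) come from aggregating gridded prevalence estimates against population. A minimal sketch of that aggregation step, assuming posterior prevalence draws per grid cell and a matching population raster; the array shapes, values, and function name are illustrative, not the study's actual pipeline:

```python
import numpy as np

def aggregate_infected(prevalence_draws, population, quantiles=(0.025, 0.5, 0.975)):
    """Aggregate gridded prevalence draws to a total count of infected people.

    prevalence_draws: (n_draws, n_cells) posterior draws of all-age prevalence
                      for each grid cell (hypothetical example data).
    population:       (n_cells,) number of people living in each grid cell.
    Returns the posterior median and 95% uncertainty interval of the total.
    """
    totals = prevalence_draws @ population      # one global total per draw
    lo, med, hi = np.quantile(totals, quantiles)
    return med, (lo, hi)

# Toy illustration: 1,000 posterior draws over 5 grid cells.
rng = np.random.default_rng(0)
prev = rng.beta(2, 50, size=(1000, 5))          # prevalence draws in [0, 1]
pop = np.array([12_000.0, 8_500.0, 30_000.0, 4_200.0, 15_000.0])
print(aggregate_infected(prev, pop))
```

    Summarising over posterior draws, rather than aggregating a single mean surface, is what carries the mapped uncertainty through to the reported uncertainty intervals.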

    Unsupervised record matching with noisy and incomplete data

    We consider the problem of duplicate detection in noisy and incomplete data: given a large data set in which each record has multiple entries (attributes), detect which distinct records refer to the same real-world entity. This task is complicated by noise (such as misspellings) and missing data, which can lead to records being different despite referring to the same entity. Our method consists of three main steps: creating a similarity score between records, grouping records together into "unique entities", and refining the groups. We compare various methods for creating similarity scores between noisy records, considering different combinations of string matching, term frequency-inverse document frequency methods, and n-gram techniques. In particular, we introduce a vectorized soft term frequency-inverse document frequency method, with an optional refinement step. We also discuss two methods to deal with missing data in computing similarity scores. We test our method on the Los Angeles Police Department Field Interview Card data set, the Cora Citation Matching data set, and two sets of restaurant review data. The results show that the methods that use words as the basic units are preferable to those that use 3-grams. Moreover, in some (but certainly not all) parameter ranges, soft term frequency-inverse document frequency methods can outperform the standard term frequency-inverse document frequency method. The results also confirm that our method for automatically determining the number of groups typically works well and allows for accurate results in the absence of a priori knowledge of the number of unique entities in the data set.
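
    The first step described above, building a pairwise similarity score from TF-IDF weights, can be illustrated with the standard (non-soft) word-level TF-IDF baseline that the paper compares against. A minimal sketch using scikit-learn; the concatenated records, threshold, and grouping rule are illustrative, and the paper's vectorized soft TF-IDF variant is not reproduced here:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Each record's attributes are concatenated into one string (illustrative records,
# not the LAPD, Cora, or restaurant data sets used in the paper).
records = [
    "john smith 123 main st los angeles",
    "jon smyth 123 main street los angeles",   # likely duplicate with misspellings
    "maria garcia 45 ocean ave santa monica",
]

# Word-level TF-IDF; analyzer="char" with ngram_range=(3, 3) would give the
# 3-gram variant that the paper finds less effective than word-level units.
vectorizer = TfidfVectorizer(analyzer="word")
tfidf = vectorizer.fit_transform(records)
similarity = cosine_similarity(tfidf)          # dense pairwise similarity matrix
print(similarity.round(2))

# A naive grouping rule: link record pairs whose similarity exceeds a threshold
# (the paper refines groups further; the threshold here is illustrative).
THRESHOLD = 0.3
pairs = [(i, j)
         for i in range(len(records))
         for j in range(i + 1, len(records))
         if similarity[i, j] >= THRESHOLD]
print(pairs)   # the first two records link; the third stays separate
```

    The soft variant extends this by also crediting near-matching tokens (such as "smith" and "smyth") with a string-similarity weight, which is what makes it more robust to misspellings.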

    Insect pathogens as biological control agents: back to the future

    The development and use of entomopathogens as classical, conservation and augmentative biological control agents have included a number of successes and some setbacks in the past 15 years. In this forum paper we present current information on development, use and future directions of insect-specific viruses, bacteria, fungi and nematodes as components of integrated pest management strategies for control of arthropod pests of crops, forests, urban habitats, and insects of medical and veterinary importance. Insect pathogenic viruses are a fruitful source of microbial control agents (MCAs), particularly for the control of lepidopteran pests. Most research is focused on the baculoviruses, key pathogens of some globally important pests for which control has become difficult due to either pesticide resistance or pressure to reduce pesticide residues. Baculoviruses are accepted as safe, readily mass produced, highly pathogenic and easily formulated and applied control agents. New baculovirus products are appearing in many countries and gaining an increased market share. However, the absence of a practical in vitro mass production system, generally higher production costs, limited post-application persistence, slow rate of kill and high host specificity currently contribute to restricted use in pest control. Overcoming these limitations is a key research area in which progress could open up the use of insect viruses to much larger markets. A small number of entomopathogenic bacteria have been commercially developed for control of insect pests. These include several Bacillus thuringiensis sub-species, Lysinibacillus (Bacillus) sphaericus, Paenibacillus spp. and Serratia entomophila. B. thuringiensis sub-species kurstaki is the most widely used for control of pest insects of crops and forests, and B. thuringiensis sub-species israelensis and L. sphaericus are the primary pathogens used for medically important pests including dipteran vectors. These pathogens combine the advantages of chemical pesticides and MCAs: they are fast acting, easy to produce at a relatively low cost, easy to formulate, have a long shelf life and allow delivery using conventional application equipment and systemics (i.e. in transgenic plants). Unlike broad spectrum chemical pesticides, B. thuringiensis toxins are selective and their negative environmental impact is very limited. Of the several commercially produced MCAs, B. thuringiensis (Bt) has more than 50% of the market share. Extensive research, particularly on the molecular mode of action of Bt toxins, has been conducted over the past two decades. The Bt genes used in insect-resistant transgenic crops belong to the Cry and vegetative insecticidal protein families of toxins. Bt has been highly efficacious in pest management of corn and cotton, drastically reducing the amount of broad spectrum chemical insecticides used while being safe for consumers and non-target organisms. Despite successes, the adoption of Bt crops has not been without controversy. Although there is a lack of scientific evidence regarding their detrimental effects, this controversy has created the widespread perception in some quarters that Bt crops are dangerous for the environment. In addition to the discovery of more efficacious isolates and toxins, an increase in the use of Bt products and transgenes will rely on innovations in formulation, better delivery systems and, ultimately, wider public acceptance of transgenic plants expressing insect-specific Bt toxins.
Fungi are ubiquitous natural entomopathogens that often cause epizootics in host insects and possess many desirable traits that favor their development as MCAs. Presently, commercialized microbial pesticides based on entomopathogenic fungi largely occupy niche markets. A variety of molecular tools and technologies have recently allowed reclassification of numerous species based on phylogeny, as well as matching anamorphs (asexual forms) and teleomorphs (sexual forms) of several entomopathogenic taxa in the Phylum Ascomycota. Although these fungi have been traditionally regarded exclusively as pathogens of arthropods, recent studies have demonstrated that they occupy a great diversity of ecological niches. Entomopathogenic fungi are now known to be plant endophytes, plant disease antagonists, rhizosphere colonizers, and plant growth promoters. These newly understood attributes provide possibilities to use fungi in multiple roles. In addition to arthropod pest control, some fungal species could simultaneously suppress plant pathogens and plant parasitic nematodes as well as promote plant growth. A greater understanding of fungal ecology is needed to define their roles in nature and evaluate their limitations in biological control. More efficient mass production, formulation and delivery systems must be devised to supply an ever-increasing market. More testing under field conditions is required to identify effects of biotic and abiotic factors on efficacy and persistence. Lastly, greater attention must be paid to their use within integrated pest management programs; in particular, strategies that incorporate fungi in combination with arthropod predators and parasitoids need to be defined to ensure compatibility and maximize efficacy. Entomopathogenic nematodes (EPNs) in the genera Steinernema and Heterorhabditis are potent MCAs. Substantial progress in research and application of EPNs has been made in the past decade. The number of target pests shown to be susceptible to EPNs has continued to increase. Advancements in this regard have primarily been made in soil habitats where EPNs are shielded from environmental extremes, but progress has also been made in the use of nematodes in above-ground habitats owing to the development of improved protective formulations. Progress has also resulted from advancements in nematode production technology using both in vivo and in vitro systems; novel application methods such as distribution of infected host cadavers; and nematode strain improvement via enhancement and stabilization of beneficial traits. Innovative research has also yielded insights into the fundamentals of EPN biology including major advances in genomics, nematode-bacterial symbiont interactions, ecological relationships, and foraging behavior. Additional research is needed to leverage these basic findings toward direct improvements in microbial control.

    Improved Medical Treatment and Surgical Surveillance of Children and Adolescents with Ulcerative Colitis in the United Kingdom

    This is a pre-copyedited, author-produced version of an article accepted for publication in Inflammatory Bowel Diseases following peer review. The version of record, Auth, M. K.-K., et al. (2018). "Improved Medical Treatment and Surgical Surveillance of Children and Adolescents with Ulcerative Colitis in the United Kingdom." Inflammatory Bowel Diseases: izy042-izy042, is available online at: https://doi.org/10.1093/ibd/izy042. Background: Pediatric ulcerative colitis (UC) presents at an earlier age and with increasing prevalence. Our aim was to examine morbidity, steroid-sparing strategies, and surgical outcome in children with active UC. Methods: A national prospective audit was conducted for the inpatient period of all children admitted with UC for medical or surgical treatment in the United Kingdom (UK) over 1 year. Thirty-two participating centers recruited 224 children in 298 admissions; comparisons over 6 years were made with previous audits. Results: Over 6 years, recording of the Paediatric Ulcerative Colitis Activity Index (PUCAI) score (median 65) (23% to 55%, P < 0.001), use of guidelines for acute severe colitis (43% to 77%, P < 0.04), and ileal pouch surgery registration (4% to 56%, P < 0.001) have increased. Corticosteroids were given in 183/298 episodes (61%), with 61/183 (33%) not responding and requiring second-line therapy or surgery. Of those treated with anti-TNFalpha (16/61, 26%), 3/16 (18.8%) failed to respond and required colectomy. Prescription of rescue therapy (26% to 49%, P = 0.04) and the proportion treated with anti-TNFalpha (20% to 53%, P = 0.03) had increased; the colectomy rate (23.7% to 15%) was not significantly reduced (P = 0.5). Subtotal colectomy was the most common surgery performed (n = 40), and surgical complications from all procedures occurred in 33%. Iron deficiency anemia was detected in 215/224 (96%) and treated in 51%, orally (50.2%) or intravenously (49.8%). Conclusions: A third of children were not responsive to steroids, and a quarter of these were treated with anti-TNFalpha. Colectomy was required in 41/298 (13.7%) of all admissions. Our national audit program indicates the effectiveness of actions taken to reduce steroid dependency, surgery, and iron deficiency. Dr Richard K Russell is supported by an NHS Scotland Research Senior fellowship. Linda J Williams has been supported by the Royal College of Physicians.
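
    The audit's year-on-year comparisons above (for example, PUCAI recording rising from 23% to 55%, P < 0.001) are comparisons of proportions between audit rounds. A minimal sketch of how such a comparison can be made with a chi-square test; the counts below are invented to roughly match the quoted percentages, not the audit's raw data, and the audit's actual statistical methods may differ:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows are audit rounds, columns are
# "PUCAI score recorded" yes/no. Counts are illustrative only.
table = [
    [46, 154],   # earlier round: 46/200 recorded (23%)
    [123, 101],  # later round: 123/224 recorded (~55%)
]

chi2, p_value, dof, _expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.2g}")
```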

    Mapping geographical inequalities in access to drinking water and sanitation facilities in low-income and middle-income countries, 2000-17

    Background Universal access to safe drinking water and sanitation facilities is an essential human right, recognised in the Sustainable Development Goals as crucial for preventing disease and improving human wellbeing. Comprehensive, high-resolution estimates are important to inform progress towards achieving this goal. We aimed to produce high-resolution geospatial estimates of access to drinking water and sanitation facilities. Methods We used a Bayesian geostatistical model and data from 600 sources across more than 88 low-income and middle-income countries (LMICs) to estimate access to drinking water and sanitation facilities on continuous continent-wide surfaces from 2000 to 2017, and aggregated results to policy-relevant administrative units. We estimated mutually exclusive and collectively exhaustive subcategories of facilities for drinking water (piped water on or off premises, other improved facilities, unimproved, and surface water) and sanitation facilities (septic or sewer sanitation, other improved, unimproved, and open defecation) with use of ordinal regression. We also estimated the number of diarrhoeal deaths in children younger than 5 years attributed to unsafe facilities and estimated deaths that were averted by increased access to safe facilities in 2017, and analysed geographical inequality in access within LMICs. Findings Across LMICs, access to both piped water and improved water overall increased between 2000 and 2017, with progress varying spatially. For piped water, the safest water facility type, access increased from 40.0% (95% uncertainty interval [UI] 39.4-40.7) to 50.3% (50.0-50.5), but was lowest in sub-Saharan Africa, where access to piped water was mostly concentrated in urban centres. Access to both sewer or septic sanitation and improved sanitation overall also increased across all LMICs during the study period. For sewer or septic sanitation, access was 46.3% (95% UI 46.1-46.5) in 2017, compared with 28.7% (28.5-29.0) in 2000. Although some units improved access to the safest drinking water or sanitation facilities since 2000, a large absolute number of people continued to not have access in several units with high access to such facilities (>80%) in 2017. More than 253 000 people did not have access to sewer or septic sanitation facilities in the city of Harare, Zimbabwe, despite 88.6% (95% UI 87.2-89.7) access overall. Many units were able to transition from the least safe facilities in 2000 to safe facilities by 2017; for units in which populations primarily practised open defecation in 2000, 686 (95% UI 664-711) of the 1830 (1797-1863) units transitioned to the use of improved sanitation. Geographical disparities in access to improved water across units decreased in 76.1% (95% UI 71.6-80.7) of countries from 2000 to 2017, and in 53.9% (50.6-59.6) of countries for access to improved sanitation, but remained evident subnationally in most countries in 2017. Interpretation Our estimates, combined with geospatial trends in diarrhoeal burden, identify where efforts to increase access to safe drinking water and sanitation facilities are most needed. By highlighting areas with successful approaches or in need of targeted interventions, our estimates can enable precision public health to effectively progress towards universal access to safe water and sanitation. Copyright (C) 2020 The Author(s). Published by Elsevier Ltd.
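
    The mutually exclusive facility subcategories were estimated with ordinal regression, which treats the categories as ordered from least to most safe and yields per-category proportions that sum to one. A minimal sketch of that idea using a cumulative-logit (proportional-odds) link; the cutpoints, linear predictors, and category labels are illustrative, not the study's fitted geostatistical model:

```python
import numpy as np

def ordinal_category_probs(linear_predictor, cutpoints):
    """Cumulative-logit (proportional-odds) link: P(Y <= k) = sigmoid(c_k - eta).

    Returns per-category probabilities that are non-negative and sum to 1,
    ordered from the least safe to the safest facility type.
    """
    eta = np.asarray(linear_predictor, dtype=float)[..., None]    # (..., 1)
    cum = 1.0 / (1.0 + np.exp(-(np.asarray(cutpoints) - eta)))    # P(Y <= k), k < K
    cum = np.concatenate([cum, np.ones_like(eta)], axis=-1)       # P(Y <= K) = 1
    return np.diff(cum, prepend=0.0, axis=-1)

# Illustrative example with 4 sanitation categories:
# open defecation < unimproved < other improved < septic or sewer.
cutpoints = np.array([-1.0, 0.0, 1.5])   # increasing cutpoints (fitted in practice)
eta = np.array([-0.5, 0.8])              # illustrative linear predictors, two locations
probs = ordinal_category_probs(eta, cutpoints)
print(probs.round(3))
print(probs.sum(axis=-1))                # each row sums to 1
```

    Because the category shares come from differences of a single cumulative curve, they are automatically non-negative and sum to one at every location, which is what makes the subcategories mutually exclusive and collectively exhaustive.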

    Global, regional, and national incidence, prevalence, and years lived with disability for 310 diseases and injuries, 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015

    Background Non-fatal outcomes of disease and injury increasingly detract from the ability of the world's population to live in full health, a trend largely attributable to an epidemiological transition in many countries from causes affecting children to non-communicable diseases (NCDs) more common in adults. For the Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015), we estimated the incidence, prevalence, and years lived with disability (YLDs) for diseases and injuries at the global, regional, and national scale over the period of 1990 to 2015. Methods We estimated incidence and prevalence by age, sex, cause, year, and geography with a wide range of updated and standardised analytical procedures. Improvements from GBD 2013 included the addition of new data sources, updates to literature reviews for 85 causes, and the identification and inclusion of additional studies published up to November 2015, to expand the database used for estimation of non-fatal outcomes to 60 900 unique data sources. Prevalence and incidence by cause and sequelae were determined with DisMod-MR 2.1, an improved version of the DisMod-MR Bayesian meta-regression tool first developed for GBD 2010 and GBD 2013. For some causes, we used alternative modelling strategies where the complexity of the disease was not suited to DisMod-MR 2.1 or where incidence and prevalence needed to be determined from other data. For GBD 2015 we created a summary indicator that combines measures of income per capita, educational attainment, and fertility (the Socio-demographic Index [SDI]) and used it to compare observed patterns of health loss to the expected pattern for countries or locations with similar SDI scores. Findings We generated 9.3 billion estimates from the various combinations of prevalence, incidence, and YLDs for causes, sequelae, and impairments by age, sex, geography, and year. In 2015, two causes had acute incidences in excess of 1 billion: upper respiratory infections (17.2 billion, 95% uncertainty interval [UI] 15.4-19.2 billion) and diarrhoeal diseases (2.39 billion, 2.30-2.50 billion). Eight causes of chronic disease and injury each affected more than 10% of the world's population in 2015: permanent caries, tension-type headache, iron-deficiency anaemia, age-related and other hearing loss, migraine, genital herpes, refraction and accommodation disorders, and ascariasis. The impairment that affected the greatest number of people in 2015 was anaemia, with 2.36 billion (2.35-2.37 billion) individuals affected. The second and third leading impairments by number of individuals affected were hearing loss and vision loss, respectively. Between 2005 and 2015, there was little change in the leading causes of YLDs on a global basis. NCDs accounted for 18 of the leading 20 causes of age-standardised YLDs on a global scale. Where rates were decreasing, the rate of decrease for YLDs was slower than that of years of life lost (YLLs) for nearly every cause included in our analysis. For low SDI geographies, Group 1 causes typically accounted for 20-30% of total disability, largely attributable to nutritional deficiencies, malaria, neglected tropical diseases, HIV/AIDS, and tuberculosis. Lower back and neck pain was the leading global cause of disability in 2015 in most countries.
Elsewhere, the leading cause was sense organ disorders in 22 countries in Asia and Africa and one country in central Latin America; diabetes in four countries in Oceania; HIV/AIDS in three southern sub-Saharan African countries; collective violence and legal intervention in two north African and Middle Eastern countries; iron-deficiency anaemia in Somalia and Venezuela; depression in Uganda; onchocerciasis in Liberia; and other neglected tropical diseases in the Democratic Republic of the Congo. Interpretation Ageing of the world's population is increasing the number of people living with sequelae of diseases and injuries. Shifts in the epidemiological profile driven by socioeconomic change also contribute to the continued increase in YLDs as well as the rate of increase in YLDs. Despite limitations imposed by gaps in data availability and the variable quality of the data available, the standardised and comprehensive approach of the GBD study provides opportunities to examine broad trends, compare those trends between countries or subnational geographies, benchmark against locations at similar stages of development, and gauge the strength or weakness of the estimates available. Copyright (C) The Author(s). Published by Elsevier Ltd.
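
    The Socio-demographic Index mentioned in the Methods combines income per capita, educational attainment, and fertility into a single 0-1 summary; in GBD documentation it is described as the geometric mean of the three components after each is rescaled to a 0-1 index, with fertility inverted so that lower fertility scores higher. A minimal sketch of that construction, assuming the geometric-mean formulation and using illustrative scaling bounds rather than the published GBD limits:

```python
import numpy as np

def rescale(value, worst, best):
    """Rescale a component to a 0-1 index, clipping outside the chosen bounds."""
    return np.clip((value - worst) / (best - worst), 0.0, 1.0)

def sdi(income_per_capita, mean_years_education, total_fertility_rate):
    """Socio-demographic Index as the geometric mean of three 0-1 component indices.

    Fertility is inverted so that lower fertility scores higher. The scaling
    bounds below are illustrative, not the GBD-published limits.
    """
    income_idx = rescale(np.log(income_per_capita), np.log(250), np.log(60_000))
    edu_idx = rescale(mean_years_education, 0.0, 17.0)
    fert_idx = 1.0 - rescale(total_fertility_rate, 1.0, 8.0)
    return (income_idx * edu_idx * fert_idx) ** (1.0 / 3.0)

# Two illustrative locations (hypothetical values).
print(round(sdi(2_000, 6.5, 4.2), 3))    # lower-SDI profile
print(round(sdi(35_000, 13.0, 1.8), 3))  # higher-SDI profile
```

    Using a geometric rather than arithmetic mean means that a very low score on any one component pulls the index strongly downwards, so no single component can fully compensate for another.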

    Global burden of 369 diseases and injuries in 204 countries and territories, 1990-2019: a systematic analysis for the Global Burden of Disease Study 2019
