    Determination of the Processes Driving the Acquisition of Immunity to Malaria Using a Mathematical Transmission Model

    Acquisition of partially protective immunity is a dominant feature of the epidemiology of malaria among exposed individuals. The processes that determine the acquisition of immunity to clinical disease and to asymptomatic carriage of malaria parasites are poorly understood, in part because of a lack of validated immunological markers of protection. Using mathematical models, we seek to better understand the processes that determine observed epidemiological patterns. We have developed an age-structured mathematical model of malaria transmission in which acquired immunity can act in three ways (“immunity functions”): reducing the probability of clinical disease, speeding the clearance of parasites, and increasing tolerance to subpatent infections. Each immunity function was allowed to vary in efficacy depending on both age and malaria transmission intensity. The results were compared to age patterns of parasite prevalence and clinical disease in endemic settings in northeastern Tanzania and The Gambia. Two types of immune function were required to reproduce the epidemiological age-prevalence curves seen in the empirical data: a form of clinical immunity that reduces susceptibility to clinical disease and develops with age and exposure (with a half-life on the order of five years or more), and a form of anti-parasite immunity that results in more rapid clearance of parasitaemia, is acquired later in life, and is longer lasting (half-life of >20 y). The development of anti-parasite immunity better reproduced observed epidemiological patterns if it was dominated by age-dependent physiological processes rather than by the magnitude of exposure (provided some exposure occurs). Tolerance to subpatent infections was not required to explain the empirical data. The model comprising immunity to clinical disease, which develops early in life and is exposure-dependent, and anti-parasite immunity, which develops later in life and is not dependent on the magnitude of exposure, best reproduces the pattern of parasite prevalence and clinical disease by age across malaria transmission settings. Understanding the effector mechanisms underlying these two immune functions will assist in the design of transmission-reducing interventions against malaria.
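
    To make the model structure concrete, the following is a minimal sketch of an age-structured calculation with the two immunity functions the abstract retains: an exposure-driven clinical-immunity function and an age-dominated anti-parasite (clearance) function. The Hill-type functional forms, parameter values, and function names are illustrative assumptions, not the authors' fitted model.

```python
# Minimal sketch of two acquired-immunity functions acting by age.
# All forms and parameters are illustrative assumptions.
import numpy as np

ages = np.arange(0.0, 60.0, 1.0)   # age classes in years
eir = 20.0                         # assumed annual entomological inoculation rate

def clinical_immunity(age, eir, i50=100.0):
    # Exposure-driven immunity against clinical disease: a saturating (Hill)
    # function of assumed cumulative infections, standing in for the
    # exposure-dependent clinical-immunity function in the abstract.
    cumulative = eir * age
    return cumulative / (cumulative + i50)

def antiparasite_immunity(age, age50=15.0, slope=2.0):
    # Age-dominated immunity that speeds parasite clearance: acquired later in
    # life and long-lasting (abstract: half-life > 20 y), so modeled here as a
    # sigmoidal function of age alone, assuming some exposure occurs.
    return age**slope / (age**slope + age50**slope)

# Illustrative outputs: probability of clinical disease given infection, and a
# relative parasite clearance rate, by age.
p_clinical = 0.8 * (1.0 - clinical_immunity(ages, eir))
clearance = 1.0 + 9.0 * antiparasite_immunity(ages)   # up to ~10x faster clearance

for a in (1, 5, 15, 40):
    print(f"age {a:2d}: P(clinical | infection) = {p_clinical[a]:.2f}, "
          f"relative clearance rate = {clearance[a]:.1f}")
```

    Under these assumptions, clinical protection saturates within the first years of exposure while clearance capacity rises only in adolescence and adulthood, reproducing the qualitative separation of the two functions described in the abstract.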

    Loss of Population Levels of Immunity to Malaria as a Result of Exposure-Reducing Interventions: Consequences for Interpretation of Disease Trends

    BACKGROUND: The persistence of malaria as an endemic infection and one of the major causes of childhood death in most parts of Africa has led to a radical new call for a global effort towards eradication. With the deployment of a highly effective vaccine still some years away, there has been an increased focus on interventions which reduce exposure to infection in the individual and, by reducing onward transmission, at the population level. The development of appropriate monitoring of these interventions requires an understanding of the timescales of their effect. METHODS & FINDINGS: Using a mathematical model for malaria transmission which incorporates the acquisition and loss of both clinical and parasite immunity, we explore the impact of the trade-off between reduction in exposure and decreased development of immunity on the dynamics of disease following a transmission-reducing intervention such as insecticide-treated nets. Our model predicts that initially rapid reductions in clinical disease incidence will be observed as transmission is reduced in a highly immune population. However, these benefits in the first 5-10 years after the intervention may be offset by a greater burden of disease decades later as immunity at the population level is gradually lost. The negative impact of having fewer immune individuals in the population can be counterbalanced either by the implementation of highly effective transmission-reducing interventions (such as the combined use of insecticide-treated nets and indoor residual spraying) for an indefinite period, or by the concurrent use of a pre-erythrocytic stage vaccine or prophylactic therapy in children to protect those at risk from disease as immunity is lost in the population. CONCLUSIONS: Effective interventions will result in rapid decreases in clinical disease across all transmission settings while population-level immunity is maintained, but may subsequently result in increases in clinical disease many years later as population-level immunity is lost. A dynamic, evolving intervention programme will therefore be necessary to secure substantial, stable reductions in malaria transmission.
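
    As a rough illustration of the trade-off described above, here is a minimal simulation in which population immunity relaxes toward a lower, exposure-dependent equilibrium after a 50% reduction in transmission, so that incidence falls at first and then partially rebounds. The equilibrium mapping, immunity half-life, and all parameter values are assumptions for illustration only, not the paper's fitted model.

```python
# Sketch of the exposure/immunity trade-off: incidence drops, then rebounds
# as population-level immunity is lost. All parameters are assumptions.
import numpy as np

pre_exposure = 1.0     # assumed pre-intervention exposure (arbitrary units)
post_exposure = 0.5    # assumed 50% exposure reduction from the intervention

def equilibrium(e):
    # Assumed saturating mapping from exposure to equilibrium immunity level.
    return e / (e + 0.25)

loss_halflife = 7.0    # assumed half-life of population-level immunity (years)
decay = 1.0 - np.exp(-np.log(2) / loss_halflife)

immunity = equilibrium(pre_exposure)   # population starts at the old equilibrium
print(f"pre-intervention: immunity={immunity:.2f}, "
      f"relative incidence={pre_exposure * (1 - immunity):.2f}")

for year in range(1, 41):
    # Immunity relaxes toward the new, lower equilibrium as exposure falls.
    immunity += (equilibrium(post_exposure) - immunity) * decay
    if year in (1, 5, 10, 20, 40):
        incidence = post_exposure * (1 - immunity)
        print(f"year {year:2d}: immunity={immunity:.2f}, "
              f"relative incidence={incidence:.2f}")
```

    Even this toy version shows the qualitative pattern the paper predicts: an immediate drop in incidence followed by a slow creep back upward over decades as immunity equilibrates to the lower exposure.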

    Utilising Assured Multi-Agent Reinforcement Learning within safety-critical scenarios

    Multi-agent reinforcement learning allows a team of agents to learn how to work together to solve complex decision-making problems in a shared environment. However, this learning process relies on stochastic mechanisms, which makes its use in safety-critical domains problematic. To overcome this issue, we propose an Assured Multi-Agent Reinforcement Learning (AMARL) approach that uses a model-checking technique called quantitative verification to provide formal guarantees of agent compliance with safety, performance, and other non-functional requirements during and after the reinforcement learning process. We demonstrate the applicability of our AMARL approach in three patrolling navigation domains in which multi-agent systems must learn to visit key areas using different types of reinforcement learning algorithms (temporal difference learning, game theory, and direct policy search). Furthermore, we compare the effectiveness of these algorithms when used with and without our approach. Our extensive experiments with both homogeneous and heterogeneous multi-agent systems of different sizes show that the use of AMARL leads to safety requirements being consistently satisfied and to better overall results than standard reinforcement learning.
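
    A minimal sketch of the AMARL idea follows: a standard Q-learning agent proposes a policy for a toy corridor world, and the policy is deployed only if a verification step confirms a safety requirement. In the paper, quantitative verification is performed formally with a model checker and constrains the learning process itself; here the verifier is stubbed with a Monte Carlo estimate, and the environment, requirement, and threshold are all assumptions.

```python
# Sketch of learn-then-verify: Q-learning plus a stubbed "quantitative
# verification" gate. Environment and requirement are illustrative assumptions.
import random

STATES = range(5)      # a 5-cell corridor; cell 4 is the goal, cell 0 is hazardous
ACTIONS = (-1, +1)

def step(s, a):
    s2 = max(0, min(4, s + a))
    reward = 1.0 if s2 == 4 else (-1.0 if s2 == 0 else 0.0)
    return s2, reward

def q_learning(episodes=500, alpha=0.1, gamma=0.9, eps=0.2):
    # Standard tabular Q-learning with epsilon-greedy exploration.
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        s = 2
        for _ in range(20):
            a = (random.choice(ACTIONS) if random.random() < eps
                 else max(ACTIONS, key=lambda x: q[(s, x)]))
            s2, r = step(s, a)
            q[(s, a)] += alpha * (r + gamma * max(q[(s2, x)] for x in ACTIONS)
                                  - q[(s, a)])
            s = s2
    return q

def verify(policy, p_max=0.05, runs=2000):
    # Stand-in for quantitative verification: estimate P(reach hazard cell 0)
    # under the policy and check it against the requirement P <= p_max.
    hits = 0
    for _ in range(runs):
        s = 2
        for _ in range(20):
            s, _ = step(s, policy[s])
            if s == 0:
                hits += 1
                break
    return hits / runs <= p_max

q = q_learning()
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
print("policy:", policy, "| safety requirement met:", verify(policy))
```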

    Viral factors in influenza pandemic risk assessment

    The threat of an influenza A virus pandemic stems from continual virus spillovers from reservoir species, a tiny fraction of which spark sustained transmission in humans. To date, no pandemic emergence of a new influenza strain has been preceded by detection of a closely related precursor in an animal or human. Nonetheless, influenza surveillance efforts are expanding, prompting a need for tools to assess the pandemic risk posed by a detected virus. The goal would be to use genetic sequence data and/or biological assays of viral traits to identify those non-human influenza viruses with the greatest risk of evolving into pandemic threats, and/or to understand the drivers of such evolution, in order to prioritize pandemic prevention or response measures. We describe such efforts, identify progress and ongoing challenges, and discuss three specific traits of influenza viruses (hemagglutinin receptor binding specificity, hemagglutinin pH of activation, and polymerase complex efficiency) that contribute to pandemic risk.
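
    To illustrate how measurements of those three traits might be combined into a relative risk score, here is a hedged sketch that maps each trait onto a 0-1 sub-score and averages them. The pH range, weights, and scoring scheme are purely illustrative assumptions; the article does not propose this (or any single) scoring formula.

```python
# Illustrative trait-based risk scoring; the scheme itself is an assumption.
from dataclasses import dataclass

@dataclass
class VirusTraits:
    human_receptor_binding: float   # 0-1, affinity for human-type (alpha-2,6) receptors
    ha_activation_ph: float         # pH of hemagglutinin activation (lower = more acid-stable)
    polymerase_efficiency: float    # 0-1, replication efficiency in human cells

def risk_score(v: VirusTraits) -> float:
    # Equal-weight average of three trait sub-scores (an assumption).
    receptor = v.human_receptor_binding
    # Human-adapted strains tend toward lower (more acid-stable) activation pH;
    # map an assumed pH range of ~5.0-6.0 onto a 0-1 sub-score.
    stability = max(0.0, min(1.0, 6.0 - v.ha_activation_ph))
    polymerase = v.polymerase_efficiency
    return (receptor + stability + polymerase) / 3.0

avian = VirusTraits(human_receptor_binding=0.1, ha_activation_ph=5.9,
                    polymerase_efficiency=0.2)
adapted = VirusTraits(human_receptor_binding=0.8, ha_activation_ph=5.3,
                      polymerase_efficiency=0.7)
print(f"avian-like: {risk_score(avian):.2f}, human-adapted-like: {risk_score(adapted):.2f}")
```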

    Intrinsic calf factors associated with the behavior of healthy pre-weaned group-housed dairy-bred calves

    Technology-derived behaviors are being researched for disease detection in artificially reared calves. Whilst existing studies demonstrate differences in behaviors between healthy and diseased calves, intrinsic calf factors (e.g., sex and birthweight) that may affect these behaviors have received little systematic study. This study aimed to understand the impact of a range of calf factors on milk feeding and activity variables of dairy-bred calves. Calves were group-housed from ~7 days to 39 days of age. Seven liters of milk replacer was available daily from an automatic milk feeder, which recorded feeding behaviors and live-weight. Calves were health-scored daily, and a tri-axial accelerometer was used to record activity variables. Healthy calves were selected by excluding data collected within 3 days either side of a poor health score or a treatment event. Thirty-one calves, each contributing 10 days of data, were analyzed. Mixed models were used to identify which of live-weight, age, sex, season of birth, age at inclusion into the group, dam parity, birthweight, and sire breed type (beef or dairy) had a significant influence on milk feeding and activity variables. Heavier calves visited the milk machine more frequently for shorter visits, drank faster, and were more likely to drink their daily milk allowance than lighter calves. Older calves had a shorter mean standing bout length and were less active than younger calves. Calves born in summer had a longer daily lying time, performed more lying and standing bouts per day, and had shorter mean standing bouts than those born in autumn or winter. Male calves had a longer mean lying bout length, drank more slowly, and were less likely to consume their daily milk allowance than their female counterparts. Calves that were born heavier had fewer lying and standing bouts each day, a longer mean standing bout length, and drank less milk per visit. Beef-sired calves had a longer mean lying bout length and drank more slowly than their dairy-sired counterparts. Intrinsic calf factors influence different healthy-calf behaviors in different ways. These factors must be considered in the design of research studies and in the field application of behavior-based disease detection tools in artificially reared calves.
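
    The following is a minimal sketch of the kind of mixed-model analysis described above, with calf identity as a random intercept and intrinsic factors as fixed effects, fitted with statsmodels. The synthetic data, variable names, and effect sizes are illustrative assumptions, not the study's data or estimates.

```python
# Sketch: linear mixed model with a random intercept per calf.
# Synthetic data and effect sizes are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_calves, n_days = 31, 10
calf_id = np.repeat(np.arange(n_calves), n_days)
liveweight = np.repeat(rng.normal(55, 8, n_calves), n_days)   # kg
sex = np.repeat(rng.integers(0, 2, n_calves), n_days)         # 0=female, 1=male
age = np.tile(np.arange(n_days), n_calves) + 10               # days

calf_effect = np.repeat(rng.normal(0, 0.05, n_calves), n_days)  # random intercepts
# Assumed directions from the abstract: heavier calves drink faster, males slower.
drinking_speed = (0.4 + 0.004 * liveweight - 0.03 * sex + calf_effect
                  + rng.normal(0, 0.04, n_calves * n_days))

data = pd.DataFrame(dict(calf_id=calf_id, liveweight=liveweight, sex=sex,
                         age=age, drinking_speed=drinking_speed))

# Fixed effects for intrinsic factors, random intercept grouped by calf.
model = smf.mixedlm("drinking_speed ~ liveweight + sex + age", data,
                    groups=data["calf_id"])
print(model.fit().summary())
```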

    Watershed Management on Range and Forest Lands Proceedings of the Fifth Workshop of the United States/Australia Rangelands Panel

    Preface: The U.S.-Australia Cooperative Rangeland Science Program In October 1968 the governments of the United States and Australia entered into an agreement for the purpose of facilitating close cooperative activities between the scientific communities of the two countries. The joint communique issued at that time designated the U.S. National Science Foundation and the Australian Commonwealth Department of Education and Science as the coordinating agencies. Both countries were to encourage binational teamwork in research, interchanges of scientists, joint seminars, and exchanges of information. A United States-Australia Rangeland Panel was established in December 1969 to further cooperation between the two countries in the rangeland sciences. The present panel includes the following

    Conservation successes and challenges for wide-ranging sharks and rays

    Overfishing is the most significant threat facing sharks and rays. Given the growth in seafood consumption, combined with the compounding effects of habitat loss, climate change, and pollution, there is a need to identify recovery paths, particularly in poorly managed and poorly monitored fisheries. Here, we document conservation success through fisheries management for 11 coastal sharks in US waters by comparing population trends, estimated with a Bayesian state-space model, before and after the implementation of the 1993 Fisheries Management Plan for Sharks. We took advantage of the spatial and temporal gradients in fishing exposure and fisheries management in the Western Atlantic to analyze the effect on the Red List status of all 26 wide-ranging coastal sharks and rays. We show that extinction risk was greater where fishing pressure was higher, but this was offset by the strength of management engagement (indicated by the strength of National and Regional Plans of Action for sharks and rays). The regional Red List Index (which tracks changes in extinction risk through time) declined in all regions until the 1980s but then improved in the North and Central Atlantic, such that the average extinction risk there is currently half that in the Southwest. Many sharks and rays are wide-ranging, and successful fisheries management in one country can be undone by poorly regulated or unregulated fishing elsewhere. Our study underscores that well-enforced, science-based management of carefully monitored fisheries can achieve conservation success, even for slow-growing species.
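
    As a sketch of the state-space idea used above, the following simulates a noisy relative-abundance index that declines before the 1993 management plan and recovers afterwards, then estimates the underlying trend with a basic Kalman filter (a simpler, non-Bayesian stand-in for the paper's Bayesian state-space model). The index, variances, and trend values are simulated assumptions, not the study's data.

```python
# Sketch: local-level state-space model for a log-abundance index,
# filtered with a Kalman filter. All values are simulated assumptions.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1975, 2010)
# Simulate decline before the 1993 Fisheries Management Plan, recovery after.
drift = np.where(years < 1993, -0.05, 0.02)
true_log_n = 5.0 + np.cumsum(drift + rng.normal(0, 0.05, len(years)))
obs = true_log_n + rng.normal(0, 0.2, len(years))   # noisy relative-abundance index

# Kalman filter for a local-level model (state = log abundance).
q_var, r_var = 0.05**2, 0.2**2   # assumed process / observation variances
x, p = obs[0], 1.0
filtered = []
for y in obs:
    p += q_var                    # predict: state uncertainty grows
    k = p / (p + r_var)           # Kalman gain
    x += k * (y - x)              # update toward the observation
    p *= (1 - k)
    filtered.append(x)

filtered = np.array(filtered)
pre = np.polyfit(years[years < 1993], filtered[years < 1993], 1)[0]
post = np.polyfit(years[years >= 1993], filtered[years >= 1993], 1)[0]
print(f"estimated log-trend before 1993: {pre:+.3f}/yr, after: {post:+.3f}/yr")
```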