    Plasmodium vivax and Plasmodium falciparum infection dynamics: re-infections, recrudescences and relapses

    Background: In malaria-endemic populations, complex patterns of Plasmodium vivax and Plasmodium falciparum blood-stage infection dynamics may be observed. Genotyping samples from longitudinal cohort studies for merozoite surface protein (msp) variants increases the information available in the data, allowing multiple infecting parasite clones in a single individual to be identified. msp-genotyped samples from two longitudinal cohorts in Papua New Guinea (PNG) and Thailand were analysed using a statistical model in which the times of acquisition and clearance of each clone in every individual were estimated by data augmentation. Results: For the populations analysed, the duration of blood-stage P. falciparum infection was estimated as 36 (95% Credible Interval (CrI): 29, 44) days in PNG and 135 (95% CrI: 94, 191) days in Thailand. Experiments on simulated data indicated that it was not possible to accurately estimate the duration of blood-stage P. vivax infections due to the lack of identifiability between a single blood-stage infection and multiple, sequential blood-stage infections caused by relapses. Despite this limitation, the method and data point towards a short duration of blood-stage P. vivax infection, with a lower bound of 24 days in PNG and 29 days in Thailand. On an individual level, P. vivax recurrences cannot be definitively classified into re-infections, recrudescences or relapses, but a probabilistic relapse phenotype can be assigned to each P. vivax sample, allowing investigation of the association between epidemiological covariates and the incidence of relapses. Conclusion: The statistical model developed here provides a useful new tool for in-depth analysis of malaria data from longitudinal cohort studies, and future application to data sets with multi-locus genotyping will allow more detailed investigation of infection dynamics.
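
    A minimal sketch of the data-augmentation idea described above, not the authors' actual model: each genotyped clone is only known to have been acquired between its last negative and first positive visit, and cleared between its last positive and first subsequent negative visit. Imputing the unobserved acquisition and clearance times within those censoring intervals and summarising the implied durations gives a crude interval estimate of infection duration. The visit times and clone records below are hypothetical.

        import random
        import statistics

        # Hypothetical interval-censored records for genotyped clones, in days:
        # (last negative visit, first positive visit, last positive visit, first negative visit)
        clones = [
            (0, 28, 56, 84),
            (28, 56, 84, 112),
            (0, 28, 28, 56),
        ]

        def augmented_durations(clones, n_draws=10_000, seed=1):
            """Monte Carlo draws of the mean infection duration, with acquisition and
            clearance times imputed uniformly within their censoring intervals."""
            rng = random.Random(seed)
            draws = []
            for _ in range(n_draws):
                durations = []
                for last_neg, first_pos, last_pos, first_clear in clones:
                    acquired = rng.uniform(last_neg, first_pos)
                    cleared = rng.uniform(last_pos, first_clear)
                    durations.append(cleared - acquired)
                draws.append(statistics.mean(durations))
            return sorted(draws)

        draws = augmented_durations(clones)
        lo, mid, hi = draws[int(0.025 * len(draws))], statistics.median(draws), draws[int(0.975 * len(draws))]
        print(f"mean blood-stage duration ~ {mid:.0f} days (interval {lo:.0f} to {hi:.0f})")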

    The rest-frame ultraviolet spectra of GRBs from massive rapidly-rotating stellar progenitors

    The properties of a massive star prior to its final explosion are imprinted in the circumstellar medium (CSM) created by its wind and termination shock. We perform a detailed, comprehensive calculation of the time-variable and angle-dependent transmission spectra of an average-luminosity Gamma-Ray Burst (GRB) that explodes in the CSM structure produced by the collapse of a 20 Msun, rapidly rotating, Z=0.001 progenitor star. We study both the case in which metals are initially in the gaseous phase and the case in which they are heavily depleted into dust. We find that high-velocity lines from low-ionization states of silicon, carbon, and iron are initially present in the spectrum only if the metals are heavily depleted into dust prior to the GRB explosion. However, such lines disappear on timescales of a fraction of a second for a burst observed on-axis, and of a few seconds for a burst seen at high latitude, making their observation virtually impossible. Rest-frame lines produced in the termination shock are instead clearly visible in all conditions. We conclude that time-resolved, early-time spectroscopy is not a promising way to routinely study the properties of the GRB progenitor wind. Previous detections of high-velocity features in GRB UV spectra must have been due either to superposition of a physically unrelated absorber or to a progenitor star with very unusual properties.

    Stakeholder narratives on trypanosomiasis, their effect on policy and the scope for One Health

    Background: This paper explores the framings of trypanosomiasis, a widespread and potentially fatal zoonotic disease transmitted by tsetse flies (Glossina species) that affects both humans and livestock. It is a country case study focusing on the political economy of knowledge in Zambia. It is a pertinent time to examine this issue, as human population growth and other factors have led to migration into tsetse-inhabited areas with little historical influence from livestock. Disease transmission at new human-wildlife interfaces such as these poses a greater risk, and opinions on the best way to manage it are deeply divided. Methods: A qualitative case study method was used to examine the narratives on trypanosomiasis in the Zambian policy context through a series of key informant interviews. Interviewees included key actors from international organisations, research organisations and local activists from a variety of perspectives, acknowledging the need to explore the relationships between the human, animal and environmental sectors. Principal Findings: Diverse framings are held by key actors looking variously from the perspectives of wildlife and environmental protection, agricultural development, poverty alleviation, and veterinary and public health. From these viewpoints, four narratives about trypanosomiasis policy were identified, centred on four different beliefs: that trypanosomiasis is protecting the environment, is causing poverty, is not a major problem, and finally, that it is a Zambian rather than an international issue to contend with. Within these narratives there are also conflicting views on the best control methods to use and different reasoning behind the pathways of response. These are based on apparently incompatible priorities of people, land, animals, the economy and the environment. The extent to which a One Health approach has been embraced, and its potential usefulness as a way of reconciling the aims of these framings and narratives, is considered throughout the paper. Conclusions/Significance: While a One Health approach has historically been lacking in this context, the complex, interacting factors that affect the disease show the need for cross-sector, interdisciplinary decision making to stop rival narratives from leading to competing actions. Additional recommendations include implementing surveillance to assess under-reporting of disease and the consequent under-estimation of disease risk; evidence-based decision making; increased and structurally managed funding across countries; and a focus on interactions between disease drivers, disease incidence at the community level, and poverty and equity impacts.

    New Insights into Human Nondisjunction of Chromosome 21 in Oocytes

    Nondisjunction of chromosome 21 is the leading cause of Down syndrome. Two risk factors for maternal nondisjunction of chromosome 21 are increased maternal age and altered recombination. In order to provide further insight into the mechanisms underlying nondisjunction, we examined the association between these two well-established risk factors for chromosome 21 nondisjunction. In our approach, short tandem repeat markers along chromosome 21 were genotyped in DNA collected from individuals with free trisomy 21 and their parents. This information was used to determine the origin of the nondisjunction error and the maternal recombination profile. We analyzed 615 maternal meiosis I and 253 maternal meiosis II cases stratified by maternal age. The examination of meiosis II errors, the first of its kind, suggests that the presence of a single exchange within the pericentromeric region of 21q interacts with maternal age-related risk factors. This observation could be explained in two general ways: 1) a pericentromeric exchange initiates or exacerbates the susceptibility to maternal age risk factors, or 2) a pericentromeric exchange protects the bivalent against age-related risk factors, allowing proper segregation of homologues at meiosis I but not of sister chromatids at meiosis II. In contrast, analysis of maternal meiosis I errors indicates that a single telomeric exchange imposes the same risk for nondisjunction irrespective of the age of the oocyte. Our results emphasize that human nondisjunction is a multifactorial trait that must be dissected into its component parts to identify specific associated risk factors.

    Muc5b Is the Major Polymeric Mucin in Mucus from Thoroughbred Horses With and Without Airway Mucus Accumulation

    Mucus accumulation is a feature of inflammatory airway disease in the horse and has been associated with reduced performance in racehorses. In this study, we have analysed the two major airway gel-forming mucins, Muc5b and Muc5ac, with respect to their sites of synthesis, their biochemical properties, and their amounts in mucus from healthy horses and from horses with signs of airway mucus accumulation. Polyclonal antisera directed against equine Muc5b and Muc5ac were raised and characterised. Immunohistochemical staining of normal equine trachea showed that Muc5ac and Muc5b are produced by cells in the submucosal glands, as well as by surface epithelial goblet cells. Western blotting after agarose gel electrophoresis of airway mucus from healthy horses and horses with mucus accumulation was used to determine the amounts of these two mucins in tracheal wash samples. The results showed that in healthy horses Muc5b was the predominant mucin, with small amounts of Muc5ac. The amounts of Muc5b and Muc5ac were both dramatically increased in samples collected from horses with high mucus scores, as determined visually at the time of endoscopy, and this increase also correlated with an increased number of bacteria present in the samples. The change in the amounts of Muc5b and Muc5ac indicates that Muc5b remains the most abundant mucin in mucus. In summary, we have developed mucin-specific polyclonal antibodies, which have allowed us to show that there is a significant increase in Muc5b and Muc5ac in mucus accumulated in equine airways and that these increases correlate with the numbers of bacteria present.

    Modelling diverse root density dynamics and deep nitrogen uptake — a simple approach

    We present a 2-D model for simulation of root density and plant nitrogen (N) uptake for crops grown in agricultural systems, based on a modification of the root density equation originally proposed by Gerwitz and Page (J Appl Ecol 11:773–781, 1974). A root system form parameter was introduced to describe the distribution of root length vertically and horizontally in the soil profile. The form parameter can vary from 0, where root density is evenly distributed through the soil profile, to 8, where practically all roots are found near the surface. The root model has other components describing root features, such as specific root length and plant N uptake kinetics. The same approach is used to distribute root length horizontally, allowing simulation of root growth and plant N uptake in row crops. The rooting depth penetration rate and the depth distribution of root density were found to be the most important parameters controlling crop N uptake from deeper soil layers. The validity of the root distribution model was tested with field data for white cabbage, red beet, and leek. The model was able to simulate very different root distributions, but it was not able to simulate increasing root density with depth, as seen in the experimental results for white cabbage. The model was able to simulate N depletion in different soil layers in two field studies: one included vegetable crops with very different rooting depths, and the other compared the effects of spring wheat and winter wheat. In both experiments, spring soil N availability and its depth distribution were varied by the use of cover crops. This demonstrates the model's sensitivity to the form parameter value and its ability to reproduce N depletion in soil layers. This work shows that the relatively simple root model developed here, driven by degree days and simulated crop growth, can be used to simulate crop N uptake and soil N depletion appropriately in low-N-input crop production systems, while requiring only a few measured parameters.
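
    The Gerwitz and Page (1974) equation referenced above describes root length density declining exponentially with depth, L(z) = L0 * exp(-a * z). The paper's 2-D, form-parameter extension is not reproduced here; the sketch below only illustrates the underlying exponential depth profile, with made-up parameter values.

        import math

        def root_density(z_cm, L0=5.0, a=0.03):
            """Root length density at depth z_cm (cm of root per cm^3 of soil), following
            the exponential decline of Gerwitz and Page (1974): L(z) = L0 * exp(-a * z).
            L0 and a are illustrative values, not parameters fitted in the paper."""
            return L0 * math.exp(-a * z_cm)

        def cumulative_fraction(z_cm, a=0.03):
            """Fraction of total root length found above depth z_cm, i.e. the normalised
            integral of the exponential profile: 1 - exp(-a * z)."""
            return 1.0 - math.exp(-a * z_cm)

        for depth in (10, 50, 100, 200):
            print(f"{depth:>4} cm: density {root_density(depth):.2f}, "
                  f"fraction of roots above this depth {cumulative_fraction(depth):.2f}")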

    Stratified care versus usual care for management of patients presenting with sciatica in primary care (SCOPiC): a randomised controlled trial

    Background: Sciatica has a substantial impact on individuals and society. Stratified care has been shown to lead to better outcomes among patients with non-specific low back pain, but it has not been tested for sciatica. We aimed to investigate the clinical effectiveness and cost-effectiveness of stratified care versus non-stratified usual care for patients presenting with sciatica in primary care. Methods: We did a two-parallel-arm, pragmatic, randomised controlled trial across three centres in the UK (North Staffordshire, North Shropshire/Wales, and Cheshire). Eligible patients were aged 18 years or older, had a clinical diagnosis of sciatica, had access to a mobile phone or landline number, were not pregnant, were not currently receiving treatment for the same problem, and had no previous spinal surgery. Patients were recruited from general practices and randomly assigned (1:1) by a remote web-based service to stratified care or usual care, stratified by centre and stratification group allocation. In the stratified care arm, a combination of prognostic and clinical criteria associated with referral to spinal specialist services was used to allocate patients to one of three groups for matched care pathways. Group 1 was offered brief advice and support in up to two physiotherapy sessions; group 2 was offered up to six physiotherapy sessions; and group 3 was fast-tracked to MRI and spinal specialist assessment within 4 weeks of randomisation. The primary outcome was self-reported time to first resolution of sciatica symptoms, defined as “completely recovered” or “much better” on a 6-point ordinal scale, collected via text messages or telephone calls. Analyses were by intention to treat. Health-care costs and cost-effectiveness were also assessed. This trial is registered on the ISRCTN registry, ISRCTN75449581. Findings: Between May 28, 2015, and July 18, 2017, 476 patients from 42 general practices around three UK centres were randomly assigned to stratified care or usual care (238 in each arm). For the primary outcome, the overall response rate was 89% (9467 of 10 601 text messages sent; 4688 [88%] of 5310 in the stratified care arm and 4779 [90%] of 5291 in the usual care arm). Median time to symptom resolution was 10 weeks (95% CI 6·4–13·6) in the stratified care arm and 12 weeks (9·4–14·6) in the usual care arm, with the survival analysis showing no significant difference between the arms (hazard ratio 1·14 [95% CI 0·89–1·46]). Stratified care was not cost-effective compared with usual care. Interpretation: The stratified care model for patients with sciatica consulting in primary care was not better than usual care for either clinical or health economic outcomes. These results do not support a transition to this stratified care model for patients with sciatica.
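
    A minimal sketch of the kind of time-to-event comparison summarised above (Kaplan-Meier medians per arm and a Cox proportional-hazards ratio), assuming the pandas and lifelines packages are available; the follow-up data are hypothetical and this is not the trial's analysis code.

        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        # Hypothetical follow-up data: weeks to symptom resolution, whether resolution was
        # observed within follow-up, and trial arm (1 = stratified care, 0 = usual care).
        df = pd.DataFrame({
            "weeks":    [4, 8, 10, 12, 16, 26, 6, 9, 12, 14, 20, 26],
            "resolved": [1, 1,  1,  1,  1,  0, 1, 1,  1,  1,  1,  0],
            "arm":      [1, 1,  1,  1,  1,  1, 0, 0,  0,  0,  0,  0],
        })

        # Median time to resolution in each arm (Kaplan-Meier).
        for arm, label in ((1, "stratified care"), (0, "usual care")):
            kmf = KaplanMeierFitter()
            sub = df[df["arm"] == arm]
            kmf.fit(sub["weeks"], event_observed=sub["resolved"])
            print(label, "median weeks to resolution:", kmf.median_survival_time_)

        # Hazard ratio for stratified versus usual care (Cox proportional hazards).
        cph = CoxPHFitter()
        cph.fit(df, duration_col="weeks", event_col="resolved")
        print(cph.hazard_ratios_)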