
    Developmental allometry and paediatric malaria

    WHO estimates that 80% of mortality due to malaria occurs among infants and young children. Though it has long been established that malaria disproportionately affects children under age five, our understanding of the underlying biological mechanisms for this distribution remains incomplete. Many studies use age as an indicator of exposure, but age may affect malaria burden independently of previous exposure. Not only does the severity of malaria infection change with age, but the clinical manifestation of disease does as well: younger children are more likely to suffer severe anaemia, while older children are more likely to develop cerebral malaria. Intensity of transmission and acquired immunity are important determinants of this age variation, but age differences remain consistent across varying transmission levels. Thus, age differences in clinical presentation may involve inherent age-related factors as well as still-undiscovered facets of acquired immunity, perhaps including the rates at which relevant aspects of immunity are acquired. The concept of "allometry" - the relative growth of a part in relation to that of an entire organism or to a standard - has not previously been applied in the context of malaria infection. However, because malaria affects a number of organs and cells, including the liver, red blood cells, white blood cells, and spleen, which may intrinsically develop at rates partly independent of each other and of a child's overall size, developmental allometry may influence the course and consequences of malaria infection. Here, scattered items of evidence have been collected from a variety of disciplines, aiming to suggest possible research paths for investigating exposure-independent age differences affecting clinical outcomes of malaria infection.

    Inactivation of Poxviruses by Upper-Room UVC Light in a Simulated Hospital Room Environment

    In the event of a smallpox outbreak due to bioterrorism, delays in vaccination programs may lead to significant secondary transmission. In the early phases of such an outbreak, transmission of smallpox will take place especially in locations where infected persons may congregate, such as hospital emergency rooms. Air disinfection using upper-room 254 nm (UVC) light can lower the airborne concentrations of infective viruses in the lower part of the room, and thereby control the spread of airborne infections among room occupants without exposing occupants to a significant amount of UVC. Using vaccinia virus aerosols as a surrogate for smallpox, we report on the effectiveness of air disinfection, via upper-room UVC light, under simulated real-world conditions including the effects of convection, mechanical mixing, temperature, and relative humidity. In decay experiments, upper-room UVC fixtures used with mixing by a conventional ceiling fan produced decreases in airborne virus concentrations that would otherwise require additional ventilation of more than 87 air changes per hour. Under steady-state conditions the effective air changes per hour associated with upper-room UVC ranged from 18 to 1000. The surprisingly high end of the observed range resulted from the extreme susceptibility of vaccinia virus to UVC at low relative humidity and the use of 4 UVC fixtures in a small room with efficient air mixing. Increasing the number of UVC fixtures or the mechanical ventilation rate resulted in a greater fractional reduction in virus aerosol, and UVC effectiveness was higher in winter than in summer for each scenario tested. These data demonstrate that upper-room UVC has the potential to greatly reduce exposure to susceptible viral aerosols. The greater survival at baseline and greater UVC susceptibility of vaccinia under winter conditions suggest that while risk from an aerosol attack with smallpox would be greatest in winter, protective measures using UVC may also be most efficient at this time. These data may also be relevant to influenza, which also has improved aerosol survival at low relative humidity and somewhat similar sensitivity to UVC.
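
    The "effective air changes per hour" figures above can be read as the extra ventilation that would produce the same exponential decay in airborne virus concentration as the UVC fixtures. Below is a minimal sketch of that conversion; the concentrations, time interval, and baseline ventilation rate are hypothetical illustration values, not measurements from the study.

```python
import math

def equivalent_ach(c0, ct, minutes_elapsed, baseline_ach=0.0):
    """Equivalent air changes per hour implied by an exponential decay from
    concentration c0 to ct over the given time, minus the decay already
    provided without UVC (baseline_ach)."""
    decay_rate_per_hour = -math.log(ct / c0) / (minutes_elapsed / 60.0)
    return decay_rate_per_hour - baseline_ach

# Hypothetical numbers for illustration only (not taken from the study):
# a 95% reduction in 10 minutes on top of 6 ACH of mechanical ventilation.
print(round(equivalent_ach(c0=1.0, ct=0.05, minutes_elapsed=10, baseline_ach=6.0), 1))  # -> 12.0
```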

    Prognostic markers in cancer: the evolution of evidence from single studies to meta-analysis, and beyond

    In oncology, prognostic markers are clinical measures used to help estimate an individual patient's risk of a future outcome, such as recurrence of disease after primary treatment. They thus facilitate individual treatment choice and aid in patient counselling. Evidence-based results regarding prognostic markers are therefore very important to both clinicians and their patients. However, there is increasing awareness that prognostic marker studies have been neglected in the drive to improve medical research. Large protocol-driven, prospective studies are the ideal, with appropriate statistical analysis and clear, unbiased reporting of the methods used and the results obtained. Unfortunately, published prognostic studies rarely meet such standards, and systematic reviews and meta-analyses are often only able to draw attention to the paucity of good-quality evidence. We discuss how better-quality prognostic marker evidence can evolve over time from initial exploratory studies, to large protocol-driven primary studies, and then to meta-analysis or even beyond, to large prospectively planned pooled analyses and the initiation of tumour banks. We highlight articles that facilitate each stage of this process, and that promote current guidelines aimed at improving the design, analysis, and reporting of prognostic marker research. We also outline why collaborative, multi-centre, and multi-disciplinary teams should be an essential part of future studies.

    Methodology of a novel risk stratification algorithm for patients with multiple myeloma in the relapsed setting

    Introduction Risk stratification tools provide valuable information to inform treatment decisions. Existing algorithms for patients with multiple myeloma (MM) were based on patients with newly diagnosed disease, and these have not been validated in the relapsed setting or in routine clinical practice. We developed a risk stratification algorithm (RSA) for patients with MM at initiation of second-line (2L) treatment, based on data from the Czech Registry of Monoclonal Gammopathies. Methods Predictors of overall survival (OS) at 2L treatment were identified using Cox proportional hazards models and backward selection. Risk scores were obtained by multiplying the hazard ratios for each predictor. The K-adaptive partitioning for survival (KAPS) algorithm defined four risk groups based on individual risk scores. Results Performance of the RSA was assessed using Nagelkerke’s R2 test and Harrell’s concordance index, together with Kaplan–Meier analysis of OS data. Prognostic groups were successfully defined based on real-world data. Use of a multiplicative score based on Cox modeling, with KAPS to define cut-off values, was effective. Conclusion Through innovative methods of risk assessment and collaboration between physicians and statisticians, the RSA was capable of stratifying patients at 2L treatment by expected survival. This approach can be used to develop clinical decision-making tools in other disease areas to improve patient management.
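
    As a rough illustration of the multiplicative scoring described above, the sketch below multiplies the hazard ratios of whichever predictors a patient presents with and maps the product onto one of four groups. The predictor names, hazard ratio values, and cut-off thresholds are hypothetical placeholders; in the published RSA the predictors came from Cox modeling with backward selection and the cut-offs from KAPS.

```python
# Hypothetical predictors and hazard ratios (not those derived from the
# Czech Registry of Monoclonal Gammopathies).
HAZARD_RATIOS = {
    "high_ldh": 1.8,
    "iss_stage_3": 1.6,
    "refractory_to_first_line": 1.5,
    "age_over_75": 1.3,
}

def risk_score(patient):
    """Multiply the hazard ratios of every predictor present in the patient."""
    score = 1.0
    for predictor, hr in HAZARD_RATIOS.items():
        if patient.get(predictor):
            score *= hr
    return score

def risk_group(score, cutoffs=(1.5, 2.5, 4.0)):
    """Assign one of four groups; the real RSA derived its cut-offs with KAPS."""
    return 1 + sum(score > c for c in cutoffs)  # group 1 (lowest risk) .. 4 (highest)

patient = {"high_ldh": True, "iss_stage_3": True}
score = risk_score(patient)                 # 1.8 * 1.6 = 2.88
print(round(score, 2), risk_group(score))   # -> 2.88 3
```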

    Pseudomonas aeruginosa Adaptation to Lungs of Cystic Fibrosis Patients Leads to Lowered Resistance to Phage and Protist Enemies

    Pathogenic lifestyles can lead to highly specialized interactions with host species, potentially resulting in fitness trade-offs in other ecological contexts. Here we studied how adaptation of the environmentally transmitted bacterial pathogen Pseudomonas aeruginosa to cystic fibrosis (CF) patients affects its survival in the presence of natural phage (14/1, ΦKZ, PNM and PT7) and protist (Tetrahymena thermophila and Acanthamoeba polyphaga) enemies. We found that most of the bacteria isolated from relatively recently and intermittently colonised patients (1-25 months) were innately phage-resistant and highly toxic to protists. In contrast, bacteria isolated from long-term chronically infected patients (2-23 years) were less efficient at both resisting phages and killing protists. Moreover, chronic isolates showed reduced killing of wax moth larvae (Galleria mellonella), probably due to weaker in vitro growth and protease expression. These results suggest that long-term adaptation of P. aeruginosa to CF lungs could trade off against its survival in aquatic environmental reservoirs in the presence of microbial enemies, while lowered virulence could reduce the pathogen's opportunities to infect insect vectors; both factors are likely to result in poorer environmental transmission. From an applied perspective, phage therapy could be useful against chronic P. aeruginosa lung infections, which are often characterized by multidrug resistance: chronic isolates were the least resistant to phages, and their poor growth will likely slow the emergence of beneficial resistance mutations.

    First GIS analysis of modern stone tools used by wild chimpanzees (Pan troglodytes verus) in Bossou, Guinea, West Africa

    Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, thus precluding a comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification enabled the creation of use-wear maps over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested against GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, allowing discrimination between active and passive pounders in lithic assemblages. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework of analysis in which wild chimpanzee battered tools can for the first time be directly compared to the early archaeological record. Funding: Leverhulme Trust [IN-052]; MEXT [20002001, 24000001]; JSPS-U04-PWS; FCT-Portugal [SFRH/BD/36169/2007]; Wenner-Gren Foundation for Anthropological Research.

    Modeling Routes of Chronic Wasting Disease Transmission: Environmental Prion Persistence Promotes Deer Population Decline and Extinction

    Chronic wasting disease (CWD) is a fatal disease of deer, elk, and moose transmitted through direct, animal-to-animal contact and indirectly, via environmental contamination. Considerable attention has been paid to modeling direct transmission, but despite the fact that CWD prions can remain infectious in the environment for years, relatively little information exists about the potential effects of indirect transmission on CWD dynamics. In the present study, we use simulation models to demonstrate how indirect transmission and the duration of environmental prion persistence may affect epidemics of CWD and populations of North American deer. Existing data from Colorado, Wyoming, and Wisconsin's CWD epidemics were used to define plausible short-term outcomes and associated parameter spaces. Resulting long-term outcomes range from relatively low disease prevalence and limited host-population decline to host-population collapse and extinction. Our models suggest that disease prevalence and the severity of population decline are driven by the duration that prions remain infectious in the environment. Despite relatively low epidemic growth rates, the basic reproductive number, R0, may be much larger than expected under the direct-transmission paradigm because the infectious period can vastly exceed the host's life span. High prion persistence is expected to lead to an increasing environmental pool of prions during the early phases (i.e., approximately the first 50 years) of the epidemic. As a consequence, over this period of time, disease dynamics will become more heavily influenced by indirect transmission, which may explain some of the observed regional differences in age- and sex-specific disease patterns. This suggests that management interventions, such as culling or vaccination, will become increasingly less effective as CWD epidemics progress.
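
    The sketch below is a minimal compartmental version of the kind of model described above: susceptible and infected deer plus an environmental prion pool, with both direct and indirect transmission and a decay rate controlling how long prions persist in the environment. All parameter values are illustrative assumptions, not estimates fitted to the Colorado, Wyoming, or Wisconsin data.

```python
# S = susceptible deer, I = infected deer, E = environmental prion pool.
def simulate_cwd(years=50, dt=0.01,
                 K=2000.0,        # carrying capacity of the deer population
                 b=0.5, m=0.3,    # per-capita birth and natural death rates
                 beta_d=0.8,      # direct deer-to-deer transmission rate
                 beta_e=5e-4,     # indirect environment-to-deer transmission rate
                 mu=1.0,          # disease-induced mortality of infected deer
                 eps=1.0,         # prion shedding rate by infected deer
                 tau=0.2):        # environmental prion decay rate (1 / persistence)
    S, I, E = 0.4 * K, 1.0, 0.0
    for _ in range(int(years / dt)):
        N = S + I
        new_infections = (beta_d * S * I / N if N > 0 else 0.0) + beta_e * S * E
        dS = b * N * (1 - N / K) - m * S - new_infections
        dI = new_infections - (m + mu) * I
        dE = eps * I - tau * E
        S, I, E = S + dS * dt, I + dI * dt, E + dE * dt
    return S, I, E

# Slower environmental decay (smaller tau, i.e. longer prion persistence)
# deepens the simulated host-population decline.
for tau in (1.0, 0.2, 0.05):
    S, I, E = simulate_cwd(tau=tau)
    print(tau, round(S), round(I), round(E))
```

    Lowering the decay rate lets the environmental pool accumulate, shifting the force of infection toward indirect transmission; this is the qualitative behaviour the abstract attributes to prolonged prion persistence.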

    Incidence and pattern of mycophenolate discontinuation associated with abnormal monitoring blood-test results: cohort study using data from the Clinical Practice Research Datalink Aurum

    Objective To examine the incidence and pattern of mycophenolate discontinuation associated with abnormal monitoring blood-tests. Methods Data from people prescribed mycophenolate for common inflammatory conditions in the Clinical Practice Research Datalink were used. Participants were followed from their first mycophenolate prescription. The primary outcome was drug discontinuation with an associated abnormal blood-test result within 60 days. Secondary outcomes were drug discontinuation for any reason, and discontinuations associated with severely abnormal blood-test results within 60 days. Multivariable Cox regression was used to examine factors associated with the primary outcome. Results The cohort included 992 participants (68.9% female, mean age 51.95 years, 47.1% with SLE) contributing 1,885 person-years of follow-up. The incidence of mycophenolate discontinuation associated with any (severely) abnormal blood-test results was 153.46 (21.07) per 1000 person-years in the first year of prescription and 32.39 (7.91) per 1000 person-years in later years. 11.5% (1.7%) of patients prescribed mycophenolate discontinued treatment with any (severely) abnormal blood-test results in the first year of prescription. After this period, a mean of 2.6% (0.7%) of patients per year discontinued treatment with any (severely) abnormal blood-test results. Increased serum creatinine and cytopenia were more commonly associated with mycophenolate discontinuation than elevated liver enzymes. CKD stage ≥3 was significantly associated with mycophenolate discontinuation with any blood-test abnormality (aHR (95% CI) 2.22 (1.47–3.37)). Conclusion Mycophenolate is uncommonly discontinued for blood-test abnormalities, and is discontinued even less often for severe blood-test abnormalities after the first year of prescription. Consideration may be given to less frequent monitoring after one year of treatment, especially in those without CKD stage ≥3.
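
    The rates quoted above are crude incidence rates expressed per 1,000 person-years of follow-up; the short sketch below shows how such a figure is computed. The event count and follow-up time used here are hypothetical illustration values, not the study's actual counts.

```python
def incidence_per_1000_person_years(events, person_years):
    """Crude incidence rate expressed per 1,000 person-years of follow-up."""
    return 1000.0 * events / person_years

# Hypothetical illustration: 46 discontinuations with an associated abnormal
# blood-test result over 300 person-years of first-year follow-up.
print(round(incidence_per_1000_person_years(46, 300), 2))  # -> 153.33
```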

    An appraisal of students' awareness of "self-reflection" in a first-year pathology course of undergraduate medical/dental education

    Background Self-reflection and reflective practice are increasingly considered essential attributes of competent professionals functioning in the complex and ever-changing healthcare systems of the 21st century. The aim of this study was to determine the extent of students' awareness and understanding of the reflective process and the meaning of 'self-reflection' within the contextual framework of their learning environment in the first year of their medical/dental education. We contend that introducing such explicit educational tasks at this early stage enhances and promotes students' awareness, understanding, and proficiency in this skill for their continuing, life-long health-professional learning. Methods Over two years, students registered in first-year pathology at the University of Saskatchewan were introduced to a self-reflection assignment, which comprised the submission of a one-page reflective document responding to a template of reflective questions set in the context of their learning environment. This was a mandatory but ungraded component of the midterm and final examinations. The documents were individually analyzed and thematically categorized on a five-level reflection-awareness scale using a specially designed rubric based on the accepted major theories of reflection, covering students' identification of: 1) personal abilities, 2) personal learning styles, 3) relationships between course material and student history, 4) emotional responses, and 5) future applications. Results 410 self-reflection documents were analyzed. Student self-awareness of personal learning style (72.7% at level 3 or above) and course content (55.2% at level 3 or above) was well reflected. Reflections at level 1 awareness included identification of a) specific teaching strategies utilized to enhance learning (58.4%), b) personal strengths/weaknesses (53%), and c) emotional responses, values, and beliefs (71.5%). Students' abilities to connect information to life experiences and to future events with understanding were more evenly distributed across all five levels of reflection awareness. Conclusions Exposure to self-reflection assignments in the early years of undergraduate medical education increases student awareness and promotes the creation of personal meaning of one's reactions, values, and premises in the context of the student learning environment. Early and repeated introduction of such cognitive processes as practice tools increases engagement in reflection, which may facilitate proficiency in mastering this competency and lead to the development of future reflective health professionals.

    Attainment of Low Disease Activity and Remission Targets reduces the risk of severe flare and new damage in Childhood Lupus

    Objectives: To assess the achievability and effect of attaining low disease activity (LDA) or remission in childhood-onset SLE (cSLE). Methods: Attainment of three adult-SLE-derived definitions of LDA (LLDAS, LA, Toronto-LDA) and four definitions of remission (clinical-SLEDAI-defined remission on/off treatment, pBILAG-defined remission on/off treatment) was assessed longitudinally in UK JSLE Cohort Study patients. Prentice-Williams-Peterson gap-time recurrent event models assessed the impact of LDA/remission attainment on severe flare and new damage. Results: LLDAS, LA and Toronto-LDA targets were reached in 67%, 73% and 32% of patients, after a median of 18, 15 or 17 months, respectively. Cumulatively, LLDAS, LA and Toronto-LDA were attained for a median of 23%, 31% and 19% of total follow-up time, respectively. Remission on treatment was more common (61% cSLEDAI-defined, 42% pBILAG-defined) than remission off treatment (31% cSLEDAI-defined, 21% pBILAG-defined). Attainment of all target states, and disease duration (>1 year), significantly reduced the hazard of severe flare (p < 0.05). Attainment of all targets also reduced the hazard of new damage (p < 0.05). Conclusions: This is the first study demonstrating that adult-SLE-derived definitions of LDA/remission are achievable in cSLE and significantly reduce the risk of severe flare and new damage. Of the LDA definitions, LLDAS performed best, leading to a reduction in the hazard of severe flare statistically comparable to that seen with attainment of clinical remission.
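
    The sketch below shows one way a Prentice-Williams-Peterson gap-time recurrent-event model of the kind named above could be set up with the lifelines Python library, stratifying the Cox model by flare order and clustering standard errors within patients. The data layout, column names, and toy values are assumptions made for illustration and do not reflect the UK JSLE Cohort Study dataset.

```python
import pandas as pd
from lifelines import CoxPHFitter

# One row per at-risk interval: patient id, order of the severe flare (1st, 2nd, ...),
# gap time (months) since the previous flare, whether a severe flare ended the
# interval, and whether the LDA/remission target was attained during the interval.
df = pd.DataFrame({
    "patient_id":   [1, 1, 2, 2, 3, 3, 4, 5, 5, 6],
    "event_number": [1, 2, 1, 2, 1, 2, 1, 1, 2, 1],
    "gap_time":     [8, 14, 6, 10, 12, 9, 20, 5, 7, 16],
    "severe_flare": [1, 1, 1, 0, 1, 1, 0, 1, 1, 0],
    "lda_attained": [0, 1, 0, 0, 1, 1, 1, 0, 0, 1],
})

cph = CoxPHFitter()
cph.fit(df,
        duration_col="gap_time",
        event_col="severe_flare",
        strata="event_number",       # PWP: separate baseline hazard per flare order
        cluster_col="patient_id")    # robust (sandwich) variance within patients
cph.print_summary()                  # hazard ratio for lda_attained
```

    In the actual study, target attainment would be handled as a time-varying exposure derived from longitudinal visits; here it is simplified to a fixed indicator per at-risk interval.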