27 research outputs found

    Patterns of psychotropic prescribing and polypharmacy in older hospitalized patients in Ireland: the influence of dementia on prescribing

    Neuropsychiatric symptoms (NPS) are ubiquitous in dementia and are often treated pharmacologically. The objectives of this study were to describe the use of psychotropic, anticholinergic, and deliriogenic medications and to identify the prevalence of polypharmacy and psychotropic polypharmacy among older hospitalized patients in Ireland, with and without dementia. All older patients (≥70 years) with elective or emergency admissions to six Irish study hospitals were eligible for inclusion in a longitudinal observational study. Of 676 eligible patients, 598 were recruited and diagnosed by medical experts as having dementia or not. These 598 patients were assessed for delirium, medication use, co-morbidity, functional ability, and nutritional status. We conducted a retrospective cross-sectional analysis of admission medication data for the 583/598 patients with complete medication records, controlling for age, sex, and co-morbidity. Of 149 patients diagnosed with dementia, only 53 had a previous diagnosis. At hospital admission, 458/583 patients experienced polypharmacy (≥5 medications). People with dementia (PwD) were significantly more likely than patients without dementia to be prescribed at least one psychotropic medication (99/147 vs. 182/436; p < 0.001). PwD were also more likely to experience psychotropic polypharmacy (≥2 psychotropics) than those without dementia (54/147 vs. 61/436; p < 0.001). There were no significant differences in the prescribing patterns of anticholinergics (23/147 vs. 42/436; p = 0.18) or deliriogenics (79/147 vs. 235/436; p = 0.62). Polypharmacy and psychotropic drug use are highly prevalent among older Irish hospitalized patients, especially PwD. Hospital admission presents an ideal time for medication review in PwD.
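    The headline comparisons above are two-proportion contrasts. A minimal sketch in standard-library Python (using a pooled two-proportion z-test with a normal approximation, which is an illustrative substitute and not necessarily the adjusted model the study itself fitted) reproduces the direction and significance of the psychotropic finding:

    ```python
    from math import sqrt, erfc

    def two_proportion_z(x1, n1, x2, n2):
        """Two-sided pooled two-proportion z-test (normal approximation)."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        # two-sided p-value from the standard normal tail: erfc(|z|/sqrt(2))
        p_value = erfc(abs(z) / sqrt(2))
        return z, p_value

    # Psychotropic prescribing: 99/147 PwD vs. 182/436 patients without dementia
    z, p = two_proportion_z(99, 147, 182, 436)
    print(f"z = {z:.2f}, p = {p:.1e}")  # p falls far below 0.001
    ```

    The same function applied to the anticholinergic counts (23/147 vs. 42/436) yields a non-significant result, consistent with the reported p = 0.18.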

    Gut microbiomes from Gambian infants reveal the development of a non-industrialized Prevotella-based trophic network.

    Funders: Bill & Melinda Gates Foundation Grand Challenges New Interventions in Global Health award; MRC Unit The Gambia/MRC International Nutrition Group, supported by the UK MRC and the UK Department for International Development.
    Distinct bacterial trophic networks exist in the gut microbiota of individuals in industrialized and non-industrialized countries. In particular, non-industrialized gut microbiomes tend to be enriched with Prevotella species. To study the development of these Prevotella-rich compositions, we investigated the gut microbiota of children aged between 7 and 37 months living in rural Gambia (616 children, 1,389 stool samples, stratified by 3-month age groups). These children, who typically eat a high-fibre, low-protein diet, were part of a double-blind, randomized iron intervention trial (NCT02941081), and here we report a secondary outcome. We found that child age was the largest discriminating factor between samples and that anthropometric indices, collection time points, season, geographic collection site, and iron supplementation did not significantly influence the gut microbiome. Prevotella copri, Faecalibacterium prausnitzii and Prevotella stercorea were, on average, the most abundant species in these 1,389 samples (35%, 11% and 7%, respectively). Distinct bacterial trophic network clusters were identified, centred around either P. stercorea or F. prausnitzii, and were found to develop steadily with age, whereas P. copri, independently of other species, rapidly became dominant after weaning. This dataset, covering a critical developmental window for the gut microbiome, provides insights into the development of Prevotella-rich gut microbiomes, which are typically understudied and are underrepresented in western populations.

    Intermittent preventive treatment with sulphadoxine-pyrimethamine but not dihydroartemisinin-piperaquine modulates the relationship between inflammatory markers and adverse pregnancy outcomes in Malawi

    Women in malaria-endemic areas receive sulphadoxine-pyrimethamine (SP) as intermittent preventive treatment in pregnancy (IPTp) to reduce malaria. While dihydroartemisinin-piperaquine (DP) has superior antimalarial properties as IPTp, SP is associated with superior fetal growth. As maternal inflammation influences fetal growth, we investigated whether SP alters the relationship between inflammation and birth outcomes. We measured C-reactive protein (CRP) and alpha-1-acid glycoprotein (AGP) at enrollment (16–28 gestation weeks (gw)), visit 3 (24–36 gw) and delivery in 1,319 Malawian women randomized to receive monthly SP, DP, or DP plus single-dose azithromycin (AZ) in the IMPROVE trial (NCT03208179). Logistic regression was used to assess the relationship between adverse outcomes, inflammation, and treatment arm. Elevated AGP at enrollment was associated with adverse birth outcomes (aRR 1.40, 95% CI: 1.15, 1.70), with similar associations observed across treatment arms; exceptions were that elevated AGP was associated with low maternal weight gain in SP recipients (aRR 1.94, 95% CI: 1.36, 2.76) and with small-for-gestational-age birth in DP+AZ recipients (aRR 1.49, 95% CI: 1.02, 2.17). At visit 3, there were few associations between inflammation and outcomes. At delivery, women with elevated AGP receiving either DP or DP+AZ had an increased risk of adverse birth outcomes (aRR 1.60, 95% CI: 1.28, 2.00), including low birth weight, pre-term birth and foetal loss; this was not seen in women receiving SP (aRR 0.82, 95% CI: 0.54, 1.26). The association between elevated AGP and adverse birth outcomes was stronger in those receiving DP or DP+AZ than in those receiving SP (aRR 1.95, 95% CI: 1.21, 3.13). No clear associations between CRP and adverse outcomes were observed. AGP identified women at risk of adverse pregnancy outcomes, and SP modified the relationship between inflammatory biomarkers and adverse outcomes. Our findings provide insights into potential mechanisms by which SP may improve pregnancy outcomes.

    Attention! A good bedside test for delirium?

    Background Routine delirium screening could improve delirium detection, but it remains unclear which screening tool is most suitable. We tested the diagnostic accuracy of the following screening methods (individually or in combination) in the detection of delirium: MOTYB (months of the year backwards); SSF (Spatial Span Forwards); and evidence of subjective or objective 'confusion'.
    Methods We performed a cross-sectional study of general hospital adult inpatients in a large tertiary referral hospital. Screening tests were performed by junior medical trainees. Subsequently, two independent formal delirium assessments were performed: first the Confusion Assessment Method (CAM), followed by the Delirium Rating Scale-Revised 98 (DRS-R98). DSM-IV (Diagnostic and Statistical Manual of Mental Disorders, fourth edition) criteria were used to assign delirium diagnosis. Sensitivity and specificity ratios with 95% CIs were calculated for each screening method.
    Results 265 patients were included. The most precise screening method overall was simultaneous performance of MOTYB and assessment for subjective/objective confusion (sensitivity 93.8%, 95% CI 82.8 to 98.6; specificity 84.7%, 95% CI 79.2 to 89.2). In older patients, MOTYB alone was most accurate, whereas in younger patients a simultaneous combination of SSF (cutoff 4) with either MOTYB or assessment of subjective/objective confusion was best. In every case, adding the CAM as a second-line screening step to improve specificity resulted in a considerable loss of sensitivity.
    Conclusions Our results suggest that simple attention tests may be useful in delirium screening. MOTYB used alone was the most accurate screening test in older people.
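    Sensitivity and specificity here come straight from a 2×2 table of screen result against reference diagnosis. A minimal sketch: the counts below (tp=45, fn=3, tn=184, fp=33) are hypothetical, chosen only to approximate the reported 93.8%/84.7% among 265 patients, and are not the study's actual table:

    ```python
    def screen_accuracy(tp, fn, tn, fp):
        """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        return sensitivity, specificity

    # Hypothetical counts roughly matching the reported estimates
    sens, spec = screen_accuracy(tp=45, fn=3, tn=184, fp=33)
    print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
    ```

    The trade-off reported in the Results follows directly from these definitions: adding the CAM as a second-line step can only remove screen-positives, so specificity rises while sensitivity falls.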

    Dynamic Blood-Brain Barrier Regulation in Mild Traumatic Brain Injury

    Whereas moderate and severe traumatic brain injury (TBI) is readily visible on current medical imaging (magnetic resonance imaging [MRI] and computed tomography [CT]), a far greater challenge is the diagnosis and subsequent management of mild TBI (mTBI), especially concussion, which by definition is characterized by a normal CT. To investigate whether the integrity of the blood-brain barrier (BBB) is altered in populations at high risk of concussion, we studied professional mixed martial arts (MMA) fighters and adolescent rugby players. Additionally, we performed linear regression between BBB disruption, defined by increased gadolinium contrast extravasation on dynamic contrast-enhanced MRI (DCE-MRI), and multiple biomechanical parameters indicating the severity of impacts recorded using instrumented mouthguards in professional MMA fighters. MMA fighters were examined pre-fight for a baseline and again within 120 h of a competitive fight, whereas rugby players were examined pre-season and again post-season, or post-match in a subset of cases. DCE-MRI, serological analysis of BBB biomarkers, and analysis of instrumented mouthguard data were performed. Here, we provide pilot data demonstrating disruption of the BBB in both professional MMA fighters and rugby players, dependent on the level of exposure. Our data suggest that biomechanical forces in professional MMA and adolescent rugby can lead to BBB disruption. These imaging changes may serve as a biomarker of exposure of the brain to repetitive subconcussive forces and mTBI.

    Months backward test: a review of its use in clinical studies

    AIM: To review the use of the Months Backwards Test (MBT) in clinical and research contexts. METHODS: We conducted a systematic review of reports relating to the MBT based upon a search of PsycINFO and MEDLINE between January 1980 and December 2014. Only reports that specifically described findings pertaining to the MBT were included. Findings were considered in terms of rating procedures, testing performance, psychometric properties, neuropsychological studies and use in clinical populations. RESULTS: We identified 22 data reports. The MBT is administered and rated in a variety of ways, with very little consistency across studies. It has been used to assess various cognitive functions, including focused and sustained attention as well as central processing speed. Performance can be assessed in terms of the ability to complete the test without errors ("MB accuracy") and the time taken to complete the test ("MB duration"). Completion time in cognitively intact subjects is usually < 20 s, with upper limits of 60-90 s typically applied in studies. The majority of cognitively intact adults can complete the test without error, such that any errors of omission are strongly suggestive of cognitive dysfunction. Coverage of clinical populations, including those with significant cognitive difficulties, is high, with the majority of subjects able to engage with MBT procedures. Performance correlates highly with other cognitive tests, especially tests of attention, including digit span backwards, Trail Making Test B, serial threes and sevens, tests of simple and complex choice reaction time, delayed story recall and standardized list-learning measures. Test-retest and inter-rater reliability are high (both > 0.90). Functional magnetic resonance imaging studies comparing the months forward test and the MBT indicate greater involvement of more complex networks (bilateral middle and inferior frontal gyri, the posterior parietal cortex and the left anterior cingulate gyrus) in backwards cognitive processing. The MBT has been usefully applied to the study of a variety of clinical presentations, for both cognitive and functional assessment. In addition to the assessment of major neuropsychiatric conditions such as delirium, dementia and mild cognitive impairment, the MBT has been used in the assessment of concussion, profiling of neurocognitive impairments in organic brain disorders and Parkinson's disease, prediction of delirium risk in surgical patients and assessment of medication compliance in diabetes. The reported sensitivity for acute neurocognitive disturbance/delirium in hospitalised patients is estimated at 83%-93%. Repeated testing can be used to identify deteriorating cognitive function over time. CONCLUSION: The MBT is a simple, versatile tool that is sensitive to significant cognitive impairment. Performance can be assessed according to accuracy and speed. However, greater consistency in administration and rating is needed. We suggest two approaches to assessing performance: a simple pass/fail method and a ten-point scale for rating test performance.
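    The two performance dimensions described in the review (MB accuracy and MB duration) can be captured in a short scoring sketch. The combined pass rule and the 60 s cutoff below are illustrative choices drawn from the ranges the review reports, not a standardized scoring scheme:

    ```python
    MONTHS = ["January", "February", "March", "April", "May", "June",
              "July", "August", "September", "October", "November", "December"]

    def score_mbt(recitation, duration_s, max_duration_s=60.0):
        """Score a Months Backwards Test attempt.

        MB accuracy: an error-free recitation from December back to January.
        MB duration: time taken; cognitively intact adults usually finish
        in under 20 s, with 60-90 s upper limits applied in studies.
        """
        target = list(reversed(MONTHS))
        errors = sum(1 for said, expected in zip(recitation, target)
                     if said != expected)
        omissions = max(0, len(target) - len(recitation))
        accurate = errors == 0 and omissions == 0
        return {"accurate": accurate, "errors": errors,
                "omissions": omissions,
                "passed": accurate and duration_s <= max_duration_s}

    result = score_mbt(list(reversed(MONTHS)), duration_s=18.0)
    print(result["passed"])  # True for an error-free, 18-second attempt
    ```

    A ten-point scale, as the authors suggest, could be layered on top of the same counts of errors and omissions.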

    Global economic costs due to vivax malaria and the potential impact of its radical cure: A modelling study.

    Background: In 2017, an estimated 14 million cases of Plasmodium vivax malaria were reported from Asia, Central and South America, and the Horn of Africa. The clinical burden of vivax malaria is largely driven by its ability to form dormant liver stages (hypnozoites) that can reactivate to cause recurrent episodes of malaria. Elimination of both the blood and liver stages of the parasite ("radical cure") is required to achieve a sustained clinical response and prevent ongoing transmission. Novel treatment options and point-of-care diagnostics are now available to ensure that radical cure can be administered safely and effectively. We quantified the global economic cost of vivax malaria and estimated the potential cost benefit of a policy of radical cure after testing patients for glucose-6-phosphate dehydrogenase (G6PD) deficiency.
    Methods and findings: Estimates of the healthcare provider and household costs due to vivax malaria were collated and combined with national case estimates for 44 endemic countries in 2017. These provider and household costs were compared with those that would be incurred under two scenarios for radical cure following G6PD screening: (1) complete adherence following daily supervised primaquine therapy and (2) unsupervised treatment with an assumed 40% effectiveness. A probabilistic sensitivity analysis generated credible intervals (CrIs) for the estimates. Globally, the annual cost of vivax malaria was US$359 million (95% CrI: US$222 to 563 million), attributable to 14.2 million cases of vivax malaria in 2017. From a societal perspective, adopting a policy of G6PD deficiency screening and supervision of primaquine for all eligible patients would prevent 6.1 million cases and reduce the global cost of vivax malaria to US$266 million (95% CrI: US$161 to 415 million), although healthcare provider costs would increase by US$39 million. If perfect adherence could be achieved with a single visit, then the global cost would fall further to US$225 million, equivalent to US$135 million in cost savings from the baseline global costs. A policy of unsupervised primaquine reduced the cost to US$342 million (95% CrI: US$209 to 532 million) while preventing 2.1 million cases. Limitations of the study include partial availability of country-level cost data and parameter uncertainty for the proportion of patients prescribed primaquine, patient adherence to a full course of primaquine, and the effectiveness of primaquine when unsupervised.
    Conclusions: Our modelling study highlights a substantial global economic burden of vivax malaria that could be reduced through investment in safe and effective radical cure, achieved by routine screening for G6PD deficiency and supervision of treatment. Novel, low-cost interventions for improving adherence to primaquine to ensure effective radical cure, and widespread access to screening for G6PD deficiency, will be critical to achieving the timely global elimination of P. vivax.
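    The scenario comparisons reduce to simple differences from the baseline cost. A minimal sketch with the central estimates (note that subtracting the rounded headline figures gives a single-visit saving of US$134 million, whereas the abstract reports US$135 million from the unrounded underlying estimates):

    ```python
    # Central cost estimates from the abstract, in US$ millions
    baseline = 359       # global annual cost of vivax malaria, 2017
    supervised = 266     # G6PD screening + supervised primaquine
    single_visit = 225   # perfect adherence achieved in a single visit
    unsupervised = 342   # unsupervised primaquine (assumed 40% effectiveness)

    # Saving under each policy relative to the baseline global cost
    savings = {name: baseline - cost for name, cost in
               [("supervised", supervised),
                ("single_visit", single_visit),
                ("unsupervised", unsupervised)]}
    print(savings)  # {'supervised': 93, 'single_visit': 134, 'unsupervised': 17}
    ```

    Laying the scenarios out this way makes the policy ordering explicit: supervision drives most of the saving, and removing the adherence burden (a single-visit regimen) adds a further US$41 million.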
