90 research outputs found

    Assessment of clinical outcomes of medicinal cannabis therapy for depression: analysis from the UK Medical Cannabis Registry.

    BACKGROUND: Although pre-clinical experiments associate cannabinoids with reduced depressive symptoms, there is a paucity of clinical evidence. This study aims to analyze the health-related quality of life changes and safety outcomes in patients prescribed cannabis-based medicinal products (CBMPs) for depression. METHODS: A series of uncontrolled cases from the UK Medical Cannabis Registry were analyzed. The primary outcomes were changes from baseline in the Patient Health Questionnaire-9 (PHQ-9), Generalized Anxiety Disorder-7 (GAD-7), Sleep Quality Scale (SQS), and EQ-5D-5L at 1, 3, and 6 months. Secondary outcomes included adverse event incidence. RESULTS: 129 patients were identified for inclusion. Median PHQ-9 at baseline was 16.0 (IQR: 9.0-21.0). There were reductions in PHQ-9 at 1 month (median: 8.0; IQR: 4.0-14.0; p < 0.001), 3 months (7.0; 2.3-12.8; p < 0.001), and 6 months (7.0; 2.0-9.5; p < 0.001). Improvements were also observed in GAD-7, SQS, and EQ-5D-5L Index Value at 1, 3, and 6 months (p < 0.050). 153 (118.6%) adverse events were recorded by 14.0% (n = 18) of participants, 87% (n = 133) of which were mild or moderate. CONCLUSION: CBMP treatment was associated with reductions in depression severity at 1, 3, and 6 months. Limitations of the study design mean that a causal relationship cannot be proven. This analysis provides insights for further study within clinical trial settings.

    Clinical outcome analysis of patients with autism spectrum disorder: analysis from the UK Medical Cannabis Registry

    Introduction: Cannabis-based medicinal products (CBMPs) have been identified as a promising novel therapeutic for symptoms and comorbidities related to autism spectrum disorder (ASD). However, there is a paucity of clinical evidence of their efficacy and safety. Objective: This case series aims to assess changes to health-related quality of life and the incidence of adverse events in patients treated with CBMPs for associated symptoms of ASD enrolled on the UK Medical Cannabis Registry (UKMCR). Methods: Patients treated with CBMPs for ASD-related symptoms for a minimum of 1 month were identified from the UKMCR. Primary outcomes were changes in validated patient-reported outcome measures [Generalised Anxiety Disorder-7 (GAD-7), Single-Item Sleep Quality Scale (SQS), 5-level version of the EQ-5D (EQ-5D-5L) index values] at 1, 3 and 6 months compared with baseline. Adverse events were recorded and analysed. Statistical significance was determined by p < 0.050. Results: Seventy-four patients with ASD were included in the analysis. The mean age of participants was 32.7 (±11.6) years. There were significant improvements in general health-related quality of life and sleep as assessed by the EQ-5D-5L, SQS and GAD-7 at 1 and 3 months, with sustained changes in EQ-5D-5L and SQS at 6 months (p < 0.010). There were 180 (243.2%) adverse events reported by 14 (18.9%) participants. If present, adverse events were commonly mild (n = 58; 78.4%) or moderate (n = 81; 109.5%), rather than severe (n = 41; 55.4%). Conclusion: This study demonstrated an associated improvement in general health-related quality of life, and anxiety- and sleep-specific symptoms following initiation of treatment with CBMPs in patients with ASD. These findings, while promising, are limited by the confines of the study, which lacks a control arm and is subject to attrition bias. Therefore, further evaluation is required with randomised controlled trials.

    Think globally, measure locally: The MIREN standardized protocol for monitoring plant species distributions along elevation gradients

    Climate change and other global change drivers threaten plant diversity in mountains worldwide. A widely documented response to such environmental modifications is for plant species to change their elevational ranges. Range shifts are often idiosyncratic and difficult to generalize, partly due to variation in sampling methods. There is thus a need for a standardized monitoring strategy that can be applied across mountain regions to assess distribution changes and community turnover of native and non-native plant species over space and time. Here, we present a conceptually intuitive and standardized protocol developed by the Mountain Invasion Research Network (MIREN) to systematically quantify global patterns of native and non-native species distributions along elevation gradients and shifts arising from interactive effects of climate change and human disturbance. Usually repeated every five years, surveys consist of 20 sample sites located at equal elevation increments along three replicate roads per sampling region. At each site, three plots extend from the side of a mountain road into surrounding natural vegetation. The protocol has been successfully used in 18 regions worldwide from 2007 to present. Analyses of one point in time already generated some salient results, and revealed region-specific elevational patterns of native plant species richness, but a globally consistent elevational decline in non-native species richness. Non-native plants were also more abundant directly adjacent to road edges, suggesting that disturbed roadsides serve as a vector for invasions into mountains. From the upcoming analyses of time series, even more exciting results can be expected, especially about range shifts. Implementing the protocol in more mountain regions globally would help to generate a more complete picture of how global change alters species distributions. 
This would inform conservation policy in mountain ecosystems, where such policies often remain poorly implemented.
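
The sampling arithmetic described above (20 sites at equal elevation increments along each road transect) can be sketched as follows; the function name and the example elevations are illustrative, not part of the MIREN protocol itself:

```python
def site_elevations(road_low_m, road_high_m, n_sites=20):
    """Elevations (metres) of survey sites spaced at equal elevation
    increments along one road transect, following the 20-site design."""
    step = (road_high_m - road_low_m) / (n_sites - 1)
    return [road_low_m + i * step for i in range(n_sites)]

# Hypothetical road climbing from 400 m to 2300 m:
# sites land every 100 m of elevation gain.
elevations = site_elevations(400, 2300)
```

With three replicate roads per region and three plots per site extending from the road edge into natural vegetation, one region contributes 60 sites per survey round.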

    Bacteria-inducing legume nodules involved in the improvement of plant growth, health and nutrition

    Bacteria that induce legume nodules are known as rhizobia and belong to the classes Alphaproteobacteria and Betaproteobacteria. They promote the growth and nutrition of their respective legume hosts through atmospheric nitrogen fixation, which takes place in the nodules induced in their roots or stems. In addition, rhizobia have other plant growth-promoting mechanisms, mainly solubilization of phosphate and production of indoleacetic acid, ACC deaminase, and siderophores. Some of these mechanisms have been reported for strains of rhizobia that are also able to promote the growth of several nonlegumes, such as cereals, oilseeds, and vegetables. The mechanisms by which rhizobia promote plant health are less well studied; however, these bacteria are able to exert biocontrol of some phytopathogens and to induce plant resistance. In this chapter, we review the available data on the ability of legume nodule-inducing bacteria to improve the growth, health, and nutrition of both legumes and nonlegumes. These data show that rhizobia meet all the requirements of sustainable agriculture for use as bio-inoculants, allowing total or partial replacement of the chemicals used for crop fertilization or protection.

    Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.

    BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in surgical management and outcomes of appendicitis within low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This is a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted. Follow-up lasted 30 days. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 high-, 1540 middle-, and 507 low-HDI groups). Surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries after adjustment. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case-mix, laparoscopy was still associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score matched groups within low-/middle-HDI countries, laparoscopy was still associated with fewer overall complications (OR 0.23, 95% CI 0.11-0.44) and SSIs (OR 0.21, 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes, and its availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112.

    Identification and reconstruction of low-energy electrons in the ProtoDUNE-SP detector

    Measurements of electrons from ν_e interactions are crucial for the Deep Underground Neutrino Experiment (DUNE) neutrino oscillation program, as well as for searches for physics beyond the standard model, supernova neutrino detection, and solar neutrino measurements. This article describes the selection and reconstruction of low-energy (Michel) electrons in the ProtoDUNE-SP detector. ProtoDUNE-SP is one of the prototypes for the DUNE far detector, built and operated at CERN as a charged-particle test beam experiment. A sample of low-energy electrons produced by the decay of cosmic muons is selected with a purity of 95%. This sample is used to calibrate the low-energy electron energy scale with two techniques. An electron energy calibration based on a cosmic-ray muon sample uses calibration constants derived from measured and simulated cosmic-ray muon events. Another calibration technique makes use of the theoretically well-understood Michel electron energy spectrum to convert reconstructed charge to electron energy. In addition, the effects of the detector response on the low-energy electron energy scale and its resolution, including readout-electronics threshold effects, are quantified. Finally, the relation between the theoretical and reconstructed low-energy electron energy spectra is derived and the energy resolution is characterized. The low-energy electron selection presented here accounts for about 75% of the total electron deposited energy. After the addition of lost energy using a Monte Carlo simulation, the energy resolution improves from about 40% to 25% at 50 MeV. These results are used to validate the expected capabilities of the DUNE far detector to reconstruct low-energy electrons.

    Impact of cross-section uncertainties on supernova neutrino spectral parameter fitting in the Deep Underground Neutrino Experiment

    A primary goal of the upcoming Deep Underground Neutrino Experiment (DUNE) is to measure the O(10) MeV neutrinos produced by a Galactic core-collapse supernova, should one occur during the lifetime of the experiment. The liquid-argon-based detectors planned for DUNE are expected to be uniquely sensitive to the ν_e component of the supernova flux, enabling a wide variety of physics and astrophysics measurements. A key requirement for a correct interpretation of these measurements is a good understanding of the energy-dependent total cross section σ(E_ν) for charged-current ν_e absorption on argon. In the context of a simulated extraction of supernova ν_e spectral parameters from a toy analysis, we investigate the impact of σ(E_ν) modeling uncertainties on DUNE's supernova neutrino physics sensitivity for the first time. We find that the currently large theoretical uncertainties on σ(E_ν) must be substantially reduced before the ν_e flux parameters can be extracted reliably: in the absence of external constraints, a measurement of the integrated neutrino luminosity with less than 10% bias with DUNE requires σ(E_ν) to be known to about 5%. The neutrino spectral shape parameters can be known to better than 10% for a 20% uncertainty on the cross-section scale, although they will be sensitive to uncertainties on the shape of σ(E_ν). A direct measurement of low-energy ν_e-argon scattering would be invaluable for improving the theoretical precision to the needed level.
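
The scaling behind the quoted numbers can be illustrated with a toy calculation (not the paper's analysis): the observed event rate is proportional to flux times cross section, so a fractional error eps in the assumed cross section biases a luminosity inferred from a fixed event count by roughly 1/(1 + eps) - 1.

```python
def luminosity_bias(xsec_fractional_error):
    """Toy estimate: event rate ∝ luminosity × cross section, so if the
    assumed cross section is high by a fraction eps, the luminosity
    inferred from a fixed event count is low by a factor 1/(1 + eps)."""
    return 1.0 / (1.0 + xsec_fractional_error) - 1.0

# A 5% cross-section error keeps this toy luminosity bias under 5%,
# within the <10% bias target quoted in the abstract.
bias = luminosity_bias(0.05)
```

This first-order argument ignores the energy dependence of the cross section, which is exactly why the abstract notes that shape uncertainties affect the spectral parameters separately from the overall scale.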

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.

    Worldwide trends in hypertension prevalence and progress in treatment and control from 1990 to 2019: a pooled analysis of 1201 population-representative studies with 104 million participants

    Background: Hypertension can be detected at the primary health-care level and low-cost treatments can effectively control hypertension. We aimed to measure the prevalence of hypertension and progress in its detection, treatment, and control from 1990 to 2019 for 200 countries and territories.Methods: We used data from 1990 to 2019 on people aged 30-79 years from population-representative studies with measurement of blood pressure and data on blood pressure treatment. We defined hypertension as having systolic blood pressure 140 mm Hg or greater, diastolic blood pressure 90 mm Hg or greater, or taking medication for hypertension. We applied a Bayesian hierarchical model to estimate the prevalence of hypertension and the proportion of people with hypertension who had a previous diagnosis (detection), who were taking medication for hypertension (treatment), and whose hypertension was controlled to below 140/90 mm Hg (control). The model allowed for trends over time to be non-linear and to vary by age.Findings: The number of people aged 30-79 years with hypertension doubled from 1990 to 2019, from 331 (95% credible interval 306-359) million women and 317 (292-344) million men in 1990 to 626 (584-668) million women and 652 (604-698) million men in 2019, despite stable global age-standardised prevalence. In 2019, age-standardised hypertension prevalence was lowest in Canada and Peru for both men and women; in Taiwan, South Korea, Japan, and some countries in western Europe including Switzerland, Spain, and the UK for women; and in several low-income and middle-income countries such as Eritrea, Bangladesh, Ethiopia, and Solomon Islands for men. Hypertension prevalence surpassed 50% for women in two countries and men in nine countries, in central and eastern Europe, central Asia, Oceania, and Latin America. 
Globally, 59% (55-62) of women and 49% (46-52) of men with hypertension reported a previous diagnosis of hypertension in 2019, and 47% (43-51) of women and 38% (35-41) of men were treated. Control rates among people with hypertension in 2019 were 23% (20-27) for women and 18% (16-21) for men. In 2019, treatment and control rates were highest in South Korea, Canada, and Iceland (treatment >70%; control >50%), followed by the USA, Costa Rica, Germany, Portugal, and Taiwan. Treatment rates were less than 25% for women and less than 20% for men in Nepal, Indonesia, and some countries in sub-Saharan Africa and Oceania. Control rates were below 10% for women and men in these countries and for men in some countries in north Africa, central and south Asia, and eastern Europe. Treatment and control rates have improved in most countries since 1990, but we found little change in most countries in sub-Saharan Africa and Oceania. Improvements were largest in high-income countries, central Europe, and some upper-middle-income and recently high-income countries including Costa Rica, Taiwan, Kazakhstan, South Africa, Brazil, Chile, Turkey, and Iran. Interpretation: Improvements in the detection, treatment, and control of hypertension have varied substantially across countries, with some middle-income countries now outperforming most high-income nations. The dual approach of reducing hypertension prevalence through primary prevention and enhancing its treatment and control is achievable not only in high-income countries but also in low-income and middle-income settings. Copyright © 2021 World Health Organization; licensee Elsevier.
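
The case definitions used in the pooled analysis (hypertension as systolic ≥140 mm Hg, diastolic ≥90 mm Hg, or current medication; control as treated pressure below 140/90 mm Hg) can be written out directly; the function names are illustrative:

```python
def is_hypertensive(sbp_mmhg, dbp_mmhg, on_medication=False):
    """Study definition: systolic BP >= 140 mm Hg, diastolic BP >= 90 mm Hg,
    or currently taking medication for hypertension."""
    return sbp_mmhg >= 140 or dbp_mmhg >= 90 or on_medication

def is_controlled(sbp_mmhg, dbp_mmhg):
    """Study definition of control: blood pressure below 140/90 mm Hg."""
    return sbp_mmhg < 140 and dbp_mmhg < 90
```

Note that a treated person with pressure of, say, 120/80 still counts as hypertensive under the first definition (medication criterion) while also counting as controlled, which is how the abstract's treatment and control rates are both computed among people with hypertension.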