Creation of the first national linked colorectal cancer dataset in Scotland: prospects for future research and a reflection on lessons learned
Introduction:
Current understanding of cancer patients, their treatment pathways and outcomes relies mainly on information from clinical trials and prospective research studies representing a selected sub-set of the patient population. Whole-population analysis is necessary if we are to assess the true impact of new interventions or policy in a real-world setting. Accurate measurement of geographic variation in healthcare use and outcomes also relies on population-level data. Routine access to such data offers efficiency in research resource allocation and a basis for policy that addresses inequalities in care provision.
Objective:
Acknowledging these benefits, the objective of this project was to create a population-level dataset of patients with a diagnosis of colorectal cancer (CRC) in Scotland.
Methods:
This paper describes the process of creating a novel, national dataset in Scotland.
Results:
In total, thirty-two separate healthcare administrative datasets have been linked to provide a comprehensive resource to investigate the management pathways and outcomes for patients with CRC in Scotland, as well as the costs of providing CRC treatment. This is the first time that chemotherapy prescribing and national audit datasets have been linked with the Scottish Cancer Registry on a national scale.
Conclusions:
We describe how the acquired dataset can be used as a research resource and reflect on the data access challenges relating to its creation. Lessons learned from this process and the policy implications for future studies using administrative cancer data are highlighted.
The Impact of Phenotypic and Genotypic G6PD Deficiency on Risk of Plasmodium vivax Infection: A Case-Control Study amongst Afghan Refugees in Pakistan
Analyses of a case-control study among Afghan refugees in Pakistan find that a G6PD (glucose-6-phosphate dehydrogenase) “Mediterranean” type deficiency confers substantial protection against Plasmodium vivax malaria.
A closer look at multiple-clone Plasmodium vivax infections: detection methods, prevalence and consequences
A Molecular Epidemiological Study of var Gene Diversity to Characterize the Reservoir of Plasmodium falciparum in Humans in Africa
BACKGROUND: The reservoir of Plasmodium infection in humans has traditionally been defined by blood slide positivity. This study was designed to characterize the local reservoir of infection in relation to the diverse var genes that encode the major surface antigen of Plasmodium falciparum blood stages and underlie the parasite's ability to establish chronic infection and transmit from human to mosquito. METHODOLOGY/PRINCIPAL FINDINGS: We investigated the molecular epidemiology of the var multigene family at local sites in Gabon, Senegal and Kenya, which differ in parasite prevalence and transmission intensity. In total, 1,839 distinct var gene types were defined by sequencing DBLα domains in the three sites. Only 76 (4.1%) var types were found in more than one population, indicating spatial heterogeneity in var types across the African continent. The majority of var types appeared only once in the population sample. Non-parametric statistical estimators predict a minimum of five to seven thousand distinct var types in each population. Similar diversity of var types was seen in sites with different parasite prevalences. CONCLUSIONS/SIGNIFICANCE: Var population genomics provides new insights into the epidemiology of P. falciparum in Africa, where malaria has never been conquered. In particular, we have described the extensive reservoir of infection in local African sites and discovered a unique var population structure that can facilitate superinfection through minimal overlap in var repertoires among parasite genomes. Our findings show that var typing as a molecular surveillance system defines the extent of genetic complexity in the reservoir of infection to complement measures of malaria prevalence. The observed small-scale spatial diversity of var genes suggests that var genetics could greatly inform current malaria mapping approaches and predict complex malaria population dynamics due to the import of var types to areas where no widespread pre-existing immunity exists in the population.
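The minimum-richness figures above come from non-parametric estimators applied to the observed frequencies of var types. The abstract does not name the estimator used, so the sketch below illustrates the idea with the well-known Chao1 lower bound and entirely hypothetical frequency counts.

```python
# Illustrative sketch only: the study does not specify which non-parametric
# estimator was used; Chao1 is shown as a common example, and the frequency
# counts below are hypothetical, not data from the paper.

def chao1(counts):
    """Chao1 lower-bound estimate of total richness.

    counts: list of observed counts, one entry per distinct var type
            (e.g. [5, 1, 1, 2] means the first type was sampled 5 times).
    """
    s_obs = len(counts)                      # distinct types observed
    f1 = sum(1 for c in counts if c == 1)    # singletons
    f2 = sum(1 for c in counts if c == 2)    # doubletons
    # bias-corrected form, robust when f2 == 0
    return s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))

# Hypothetical sample in which most types are seen only once,
# mirroring the pattern described in the abstract
counts = [1] * 600 + [2] * 150 + [3] * 50
print(f"Observed: {len(counts)} types, Chao1 estimate: {chao1(counts):.0f}")
```

A large excess of singletons pushes the estimate well above the number of types actually observed, which is the qualitative behaviour behind the five-to-seven-thousand figures quoted above.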
Mapping of variations in child stunting, wasting and underweight within the states of India: the Global Burden of Disease Study 2000–2017
Background
To inform actions at the district level under the National Nutrition Mission (NNM), we assessed the prevalence trends of child growth failure (CGF) indicators for all districts in India and inequality between districts within the states.
Methods
We assessed the trends of CGF indicators (stunting, wasting and underweight) from 2000 to 2017 across the districts of India, aggregated from 5 × 5 km grid estimates, using all accessible data from various surveys with subnational geographical information. The states were categorised into three groups using their Socio-demographic Index (SDI) levels, calculated as part of the Global Burden of Disease Study based on per capita income, mean education and fertility rate in women younger than 25 years. Inequality between districts within the states was assessed using the coefficient of variation (CV). We projected the prevalence of CGF indicators for the districts up to 2030 based on the trends from 2000 to 2017 to compare with the NNM 2022 targets for stunting and underweight, and the WHO/UNICEF 2030 targets for stunting and wasting. We assessed the Pearson correlation coefficient between two major national surveys for district-level estimates of CGF indicators in the states.
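A minimal sketch of the two summary statistics named above, the coefficient of variation for between-district inequality and the Pearson correlation between surveys; the district prevalence values are hypothetical, not estimates from this study.

```python
# Minimal illustration of the inequality and agreement measures named in the
# Methods; the prevalence values below are hypothetical, not study estimates.
import numpy as np

# Hypothetical stunting prevalence (%) across the districts of one state
prevalence = np.array([22.5, 31.0, 40.2, 35.8, 28.4, 45.1])

# Coefficient of variation: standard deviation relative to the mean,
# used as the between-district inequality measure within a state
cv = prevalence.std(ddof=1) / prevalence.mean()
print(f"CV = {cv:.3f}")

# Pearson correlation between district-level estimates from two surveys
# (the second survey's values are again hypothetical)
survey_a = prevalence
survey_b = np.array([25.0, 29.5, 43.0, 33.2, 30.1, 41.8])
r = np.corrcoef(survey_a, survey_b)[0, 1]
print(f"Pearson r = {r:.2f}")
```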
Findings
The prevalence of stunting ranged 3.8-fold from 16.4% (95% UI 15.2–17.8) to 62.8% (95% UI 61.5–64.0) among the 723 districts of India in 2017, wasting ranged 5.4-fold from 5.5% (95% UI 5.1–6.1) to 30.0% (95% UI 28.2–31.8), and underweight ranged 4.6-fold from 11.0% (95% UI 10.5–11.9) to 51.0% (95% UI 49.9–52.1). 36.1% of the districts in India had a stunting prevalence of 40% or more, with 67.0% of districts in the low SDI states group but only 1.1% of districts in the high SDI states group at this level of stunting. The prevalence of stunting declined significantly from 2010 to 2017 in 98.5% of the districts, with a maximum decline of 41.2% (95% UI 40.3–42.5), wasting in 61.3% with a maximum decline of 44.0% (95% UI 42.3–46.7), and underweight in 95.0% with a maximum decline of 53.9% (95% UI 52.8–55.4). The CV varied 7.4-fold for stunting, 12.2-fold for wasting, and 8.6-fold for underweight between the states in 2017; the CV increased for stunting in 28 out of 31 states, for wasting in 16 states, and for underweight in 20 states from 2000 to 2017. In order to reach the NNM 2022 targets for stunting and underweight individually, 82.6% and 98.5% of the districts in India, respectively, would need a rate of improvement higher than they had up to 2017. To achieve the WHO/UNICEF 2030 target for wasting, all districts in India would need a rate of improvement higher than they had up to 2017. The correlation between the two national surveys for district-level estimates was poor, with a Pearson correlation coefficient of 0.7 only in Odisha and four small north-eastern states out of the 27 states covered by these surveys.
Interpretation
CGF indicators have improved in India, but there are substantial variations between the districts in their magnitude and rate of decline, and the inequality between districts has increased in a large proportion of the states. The poor correlation between the national surveys for CGF estimates highlights the need to standardise collection of anthropometric data in India. The district-level trends in this report provide a useful reference for targeting the efforts under NNM to reduce CGF across India and meet the Indian and global targets.
Keywords
Child growth failure; District-level; Geospatial mapping; Inequality; National Nutrition Mission; Prevalence; Stunting; Time trends; Under-five; Undernutrition; Underweight; Wasting; WHO/UNICEF target
A Policy-into-Practice Intervention to Increase the Uptake of Evidence-Based Management of Low Back Pain in Primary Care: A Prospective Cohort Study
BACKGROUND: Persistent non-specific low back pain (nsLBP) is poorly understood by the general community, by educators, researchers and health professionals, making effective care problematic. This study evaluated the effectiveness of a policy-into-practice intervention developed for primary care physicians (PCPs). METHODS: To encourage PCPs to adopt practical evidence-based approaches and facilitate time-efficient, integrated management of patients with nsLBP, we developed an interdisciplinary, evidence-based, practical pain education program (gPEP) based on a contemporary biopsychosocial framework. One hundred and twenty-six PCPs from primary care settings in Western Australia were recruited. PCPs participated in a 6.5-hour gPEP. Self-report measures recorded at baseline and at 2 months post-intervention included PCPs' attitudes and beliefs (modified Health Care Providers' Pain and Impairment Relationship Scale (HC-PAIRS)), evidence-based clinical practices (knowledge and skills regarding nsLBP management: 5-point Likert scale with 1 = nil and 5 = excellent) and practice behaviours (recommendations based on a patient vignette; 5-point Likert scale). RESULTS: Ninety-one PCPs participated (attendance rate of 72%; post-intervention response rate 88%). PCP responders adopted more positive, guideline-consistent beliefs, evidenced by clinically significant HC-PAIRS score differences (mean change = -5.6±8.2, p<0.0001; 95% confidence interval: -7.6 to -3.6) and significant positive shifts on all measures of clinical knowledge and skills (p<0.0001 for all questions). Self-management strategies were recommended more frequently post-intervention. The majority of responders who gave guideline-inconsistent work and bed rest recommendations at pre-intervention (82% and 62% respectively) gave guideline-consistent responses at post-intervention. CONCLUSION: An interprofessional pain education program set within a framework that aligns health policy and practice encourages PCPs to adopt more self-reported evidence-based attitudes, beliefs and clinical behaviours in their management of patients with nsLBP. However, further research is required to determine the cost-effectiveness of this approach when compared with other modes of educational delivery and to examine PCP behaviours in actual clinical practice.
Successful malaria elimination strategies require interventions that target changing vector behaviours
Effect of Hydrocortisone on Mortality and Organ Support in Patients With Severe COVID-19: The REMAP-CAP COVID-19 Corticosteroid Domain Randomized Clinical Trial.
Importance: Evidence regarding corticosteroid use for severe coronavirus disease 2019 (COVID-19) is limited. Objective: To determine whether hydrocortisone improves outcome for patients with severe COVID-19. Design, Setting, and Participants: An ongoing adaptive platform trial testing multiple interventions within multiple therapeutic domains, for example, antiviral agents, corticosteroids, or immunoglobulin. Between March 9 and June 17, 2020, 614 adult patients with suspected or confirmed COVID-19 were enrolled and randomized within at least 1 domain following admission to an intensive care unit (ICU) for respiratory or cardiovascular organ support at 121 sites in 8 countries. Of these, 403 were randomized to open-label interventions within the corticosteroid domain. The domain was halted after results from another trial were released. Follow-up ended August 12, 2020. Interventions: The corticosteroid domain randomized participants to a fixed 7-day course of intravenous hydrocortisone (50 mg or 100 mg every 6 hours) (n = 143), a shock-dependent course (50 mg every 6 hours when shock was clinically evident) (n = 152), or no hydrocortisone (n = 108). Main Outcomes and Measures: The primary end point was organ support-free days (days alive and free of ICU-based respiratory or cardiovascular support) within 21 days, where patients who died were assigned -1 day. The primary analysis was a bayesian cumulative logistic model that included all patients enrolled with severe COVID-19, adjusting for age, sex, site, region, time, assignment to interventions within other domains, and domain and intervention eligibility. Superiority was defined as the posterior probability of an odds ratio greater than 1 (threshold for trial conclusion of superiority >99%). Results: After excluding 19 participants who withdrew consent, there were 384 patients (mean age, 60 years; 29% female) randomized to the fixed-dose (n = 137), shock-dependent (n = 146), and no (n = 101) hydrocortisone groups; 379 (99%) completed the study and were included in the analysis. The mean age for the 3 groups ranged between 59.5 and 60.4 years; most patients were male (range, 70.6%-71.5%); mean body mass index ranged between 29.7 and 30.9; and patients receiving mechanical ventilation ranged between 50.0% and 63.5%. For the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively, the median organ support-free days were 0 (IQR, -1 to 15), 0 (IQR, -1 to 13), and 0 (IQR, -1 to 11) days (composed of 30%, 26%, and 33% mortality rates and 11.5, 9.5, and 6 median organ support-free days among survivors). The median adjusted odds ratio and bayesian probability of superiority were 1.43 (95% credible interval, 0.91-2.27) and 93% for fixed-dose hydrocortisone, respectively, and were 1.22 (95% credible interval, 0.76-1.94) and 80% for shock-dependent hydrocortisone compared with no hydrocortisone. Serious adverse events were reported in 4 (3%), 5 (3%), and 1 (1%) patients in the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively. Conclusions and Relevance: Among patients with severe COVID-19, treatment with a 7-day fixed-dose course of hydrocortisone or shock-dependent dosing of hydrocortisone, compared with no hydrocortisone, resulted in 93% and 80% probabilities of superiority with regard to the odds of improvement in organ support-free days within 21 days. However, the trial was stopped early and no treatment strategy met prespecified criteria for statistical superiority, precluding definitive conclusions.
Trial Registration: ClinicalTrials.gov Identifier: NCT02735707
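The reported probabilities of superiority can be approximately reproduced from the published odds ratios and credible intervals using a normal approximation on the log-odds scale; this is only a back-of-the-envelope check, not the trial's actual bayesian cumulative logistic model.

```python
# Back-of-the-envelope check only: each posterior is approximated as normal on
# the log-odds scale from the reported median OR and 95% credible interval;
# the trial itself used a bayesian cumulative logistic model, not this shortcut.
import math
from scipy.stats import norm

def prob_or_above_1(or_median, cri_low, cri_high):
    mu = math.log(or_median)
    sigma = (math.log(cri_high) - math.log(cri_low)) / (2 * 1.96)
    return norm.cdf(mu / sigma)  # approximate posterior probability that OR > 1

print(f"Fixed-dose:      {prob_or_above_1(1.43, 0.91, 2.27):.0%}")  # close to the reported 93%
print(f"Shock-dependent: {prob_or_above_1(1.22, 0.76, 1.94):.0%}")  # close to the reported 80%
```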
Genetic mechanisms of critical illness in COVID-19.
Host-mediated lung inflammation is present [1], and drives mortality [2], in the critical illness caused by coronavirus disease 2019 (COVID-19). Host genetic variants associated with critical illness may identify mechanistic targets for therapeutic development [3]. Here we report the results of the GenOMICC (Genetics Of Mortality In Critical Care) genome-wide association study in 2,244 critically ill patients with COVID-19 from 208 UK intensive care units. We have identified and replicated the following new genome-wide significant associations: on chromosome 12q24.13 (rs10735079, P = 1.65 × 10⁻⁸) in a gene cluster that encodes antiviral restriction enzyme activators (OAS1, OAS2 and OAS3); on chromosome 19p13.2 (rs74956615, P = 2.3 × 10⁻⁸) near the gene that encodes tyrosine kinase 2 (TYK2); on chromosome 19p13.3 (rs2109069, P = 3.98 × 10⁻¹²) within the gene that encodes dipeptidyl peptidase 9 (DPP9); and on chromosome 21q22.1 (rs2236757, P = 4.99 × 10⁻⁸) in the interferon receptor gene IFNAR2. We identified potential targets for repurposing of licensed medications: using Mendelian randomization, we found evidence that low expression of IFNAR2, or high expression of TYK2, are associated with life-threatening disease; and transcriptome-wide association in lung tissue revealed that high expression of the monocyte-macrophage chemotactic receptor CCR2 is associated with severe COVID-19. Our results identify robust genetic signals relating to key host antiviral defence mechanisms and mediators of inflammatory organ damage in COVID-19. Both mechanisms may be amenable to targeted treatment with existing drugs. However, large-scale randomized clinical trials will be essential before any change to clinical practice.
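The abstract mentions Mendelian randomization linking gene expression to disease risk; as a rough illustration of the simplest form of this analysis, the sketch below computes a single-instrument Wald ratio with entirely hypothetical effect sizes (the study's own instruments and estimates are not given here).

```python
# Illustration of a single-instrument Wald-ratio Mendelian randomization
# estimate; all effect sizes below are hypothetical, not taken from the study.
import math

beta_exposure = 0.30   # hypothetical SNP effect on IFNAR2 expression (SD units)
se_exposure = 0.04
beta_outcome = -0.15   # hypothetical SNP effect on log-odds of critical COVID-19
se_outcome = 0.05

# Wald ratio: implied effect of the exposure on the outcome via the instrument
wald = beta_outcome / beta_exposure

# Delta-method standard error for a ratio (ignoring covariance between terms)
se_wald = math.sqrt(
    se_outcome**2 / beta_exposure**2
    + beta_outcome**2 * se_exposure**2 / beta_exposure**4
)

print(f"Wald ratio = {wald:.2f} (SE {se_wald:.2f})")
# A negative ratio is consistent with higher expression lowering risk,
# i.e. low expression being associated with more severe disease.
```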
- …