
    Use of combined oral contraceptives and risk of venous thromboembolism: nested case-control studies using the QResearch and CPRD databases

    Objective To investigate the association between use of combined oral contraceptives and risk of venous thromboembolism, taking the type of progestogen into account. Design Two nested case-control studies. Setting General practices in the United Kingdom contributing to the Clinical Practice Research Datalink (CPRD; 618 practices) and QResearch primary care database (722 practices). Participants Women aged 15-49 years with a first diagnosis of venous thromboembolism in 2001-13, each matched with up to five controls by age, practice, and calendar year. Main outcome measures Odds ratios for incident venous thromboembolism and use of combined oral contraceptives in the previous year, adjusted for smoking status, alcohol consumption, ethnic group, body mass index, comorbidities, and other contraceptive drugs. Results were combined across the two datasets. Results 5062 cases of venous thromboembolism from CPRD and 5500 from QResearch were analysed. Current exposure to any combined oral contraceptive was associated with an increased risk of venous thromboembolism (adjusted odds ratio 2.97, 95% confidence interval 2.78 to 3.17) compared with no exposure in the previous year. Corresponding risks associated with current exposure to desogestrel (4.28, 3.66 to 5.01), gestodene (3.64, 3.00 to 4.43), drospirenone (4.12, 3.43 to 4.96), and cyproterone (4.27, 3.57 to 5.11) were significantly higher than those for second generation contraceptives levonorgestrel (2.38, 2.18 to 2.59) and norethisterone (2.56, 2.15 to 3.06), and for norgestimate (2.53, 2.17 to 2.96). The number of extra cases of venous thromboembolism per year per 10 000 treated women was lowest for levonorgestrel (6, 95% confidence interval 5 to 7) and norgestimate (6, 5 to 8), and highest for desogestrel (14, 11 to 17) and cyproterone (14, 11 to 17). 
Conclusions In these population-based, case-control studies using two large primary care databases, risks of venous thromboembolism associated with combined oral contraceptives were, with the exception of norgestimate, higher for newer drug preparations than for second generation drugs.
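The excess-case figures quoted above can be roughly reproduced with a standard attributable-risk calculation: extra cases per 10,000 woman-years is approximately baseline incidence * (OR - 1). A minimal sketch, assuming a baseline VTE incidence of about 4.3 per 10,000 unexposed woman-years (an illustrative figure, not taken from the paper):

```python
# Illustrative only: extra VTE cases per 10,000 treated women per year,
# approximated as baseline_rate * (OR - 1). The baseline rate is an
# assumed figure for this sketch, not a number from the study.
BASELINE = 4.3  # assumed VTE incidence per 10,000 unexposed woman-years

adjusted_odds_ratios = {
    "levonorgestrel": 2.38,
    "norgestimate": 2.53,
    "desogestrel": 4.28,
    "cyproterone": 4.27,
}

for drug, odds_ratio in adjusted_odds_ratios.items():
    extra = BASELINE * (odds_ratio - 1)
    print(f"{drug}: ~{extra:.0f} extra cases per 10,000 woman-years")
```

With this assumed baseline, levonorgestrel works out to roughly 6 extra cases and desogestrel and cyproterone to roughly 14, in line with the intervals reported above.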

    Preliminary results of a feasibility study of the use of information technology for identification of suspected colorectal cancer in primary care: the CREDIBLE study

    This is the final version of the article, available from Cancer Research UK/Nature Publishing Group via the DOI in this record. BACKGROUND: We report the findings of a feasibility study using information technology to search electronic primary care records and identify patients with possible colorectal cancer. METHODS: An algorithm to flag patients meeting National Institute for Health and Care Excellence (NICE) urgent referral criteria for suspected colorectal cancer was developed and incorporated into clinical audit software, which periodically flagged such patients aged 60 to 79 years. General practitioners (GPs) reviewed flagged patients and decided on further clinical management. We report the numbers of patients identified, the numbers that GPs judged to need further review, investigation or referral to secondary care, and the final diagnoses. RESULTS: Between January 2012 and March 2014, 19,580 records of patients aged 60 to 79 years were searched in 20 UK general practices, flagging 809 patients who met urgent referral criteria. The majority of these patients had microcytic anaemia (236; 29%) or rectal bleeding (205; 25%). A total of 274 (34%) patients needed further clinical review of their records; 199 (73%) of these were invited for a GP consultation and 116 attended, of whom 42 were referred to secondary care. Colon cancer was diagnosed in 10 of the 809 (1.2%) flagged patients and polyps in a further 28 of the 809 (3.5%). CONCLUSIONS: It is technically possible to identify patients with colorectal cancer by searching electronic patient records.
We acknowledge the general practitioners, practice nurses, practice managers and administrative staff who supported this study, our trial coordinator Marie Crook, Anthony Ingold, who was one of our patient representatives, and MSDi for their support in developing the software algorithm. We also acknowledge the support of the National Institute for Health Research Clinical Research Network. This study was funded by the National Awareness and Early Diagnosis Initiative (NAEDI). TM is partly funded by the National Institute for Health Research (NIHR) through the Collaborations for Leadership in Applied Health Research and Care for the West Midlands (CLAHRC-WM) programme.

    Exposure to bisphosphonates and risk of common non-gastrointestinal cancers: series of nested case–control studies using two primary-care databases

    Background: Bisphosphonates are the most commonly prescribed osteoporosis drugs, but their long-term effects are unclear, although antitumour properties are known from preclinical studies. Methods: Nested case-control studies were conducted to investigate bisphosphonate use and risks of common non-gastrointestinal cancers (breast, prostate, lung, bladder, melanoma, ovarian, pancreas, uterus and cervical). Patients aged 50 years and older, diagnosed with primary cancers between 1997 and 2011, were matched to five controls using the UK practice-based QResearch and Clinical Practice Research Datalink (CPRD) databases. The databases were analysed separately and the results combined. Results: A total of 91,556 and 88,845 cases were identified from QResearch and CPRD, respectively. Bisphosphonate use was associated with reduced risks of breast (odds ratio (OR): 0.92, 95% confidence interval (CI): 0.87-0.97), prostate (OR: 0.87, 95% CI: 0.79-0.96) and pancreatic (OR: 0.79, 95% CI: 0.68-0.93) cancers in the combined analyses, but with no significant trends with duration. For alendronate, reduced risk associations were found for prostate cancer in the QResearch (OR: 0.81, 95% CI: 0.70-0.93) and combined (OR: 0.84, 95% CI: 0.75-0.93) analyses (trend with duration P-values 0.009 and 0.001). There were no significant associations from any of the other analyses. Conclusion: In this series of large population-based case-control studies, bisphosphonate use was not associated with increased risks for any common non-gastrointestinal cancers.
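The matching step common to these nested case-control studies can be sketched in a few lines: for each case, eligible controls are those sharing the matching factors (here age band, practice and calendar year), from which up to five are sampled. All field names and records below are illustrative, not taken from the databases.

```python
# Hypothetical sketch of incidence-density matching in a nested
# case-control design: up to five controls per case, matched on age
# band, practice and calendar year. Data and field names are invented.
import random

def match_controls(case, candidates, k=5, seed=0):
    """Return up to k eligible controls for one case."""
    pool = [c for c in candidates
            if not c["is_case"]
            and c["age_band"] == case["age_band"]
            and c["practice"] == case["practice"]
            and c["year"] == case["year"]]
    return random.Random(seed).sample(pool, min(k, len(pool)))

case = {"is_case": True, "age_band": "60-64", "practice": "P01", "year": 2005}
candidates = [
    {"is_case": False, "age_band": "60-64", "practice": "P01", "year": 2005},
    {"is_case": False, "age_band": "60-64", "practice": "P02", "year": 2005},
    {"is_case": False, "age_band": "65-69", "practice": "P01", "year": 2005},
]
controls = match_controls(case, candidates)
print(len(controls))  # only the first candidate matches, so 1
```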

    An external validation of the QCOVID3 risk prediction algorithm for risk of hospitalisation and death from COVID-19: An observational, prospective cohort study of 1.66m vaccinated adults in Wales, UK

    INTRODUCTION: At the start of the COVID-19 pandemic there was an urgent need to identify individuals at highest risk of severe outcomes, such as hospitalisation and death following infection. The QCOVID risk prediction algorithms emerged as key tools for this purpose and were further developed during the second wave of the pandemic to identify groups of people at highest risk of severe COVID-19 related outcomes following one or two doses of vaccine. OBJECTIVES: To externally validate the QCOVID3 algorithm based on primary and secondary care records for Wales, UK. METHODS: We conducted an observational, prospective cohort study based on electronic health care records for 1.66m vaccinated adults living in Wales on 8th December 2020, with follow-up until 15th June 2021. Follow-up started from day 14 post vaccination to allow the full effect of the vaccine. RESULTS: The scores produced by the QCOVID3 risk algorithm showed high levels of discrimination for both COVID-19 related deaths and hospital admissions (Harrell C statistic ≥ 0.828) and good calibration. CONCLUSION: This validation of the updated QCOVID3 risk algorithms in the adult vaccinated Welsh population has shown that the algorithms are valid for use in the Welsh population and applicable in a population independent of the original study, which has not previously been reported. This study provides further evidence that the QCOVID algorithms can help inform public health risk management in the ongoing surveillance and intervention to manage COVID-19 related risks.
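The discrimination statistic quoted above, Harrell's C, is the fraction of comparable subject pairs in which the person who experienced the event earlier was assigned the higher risk score. A bare-bones sketch on toy data (not the study's data):

```python
# Minimal Harrell's C for right-censored data. A pair (i, j) is
# comparable when subject i had an observed event before subject j's
# follow-up time; it is concordant when i also got the higher score.
def harrells_c(times, events, scores):
    concordant = ties = comparable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if scores[i] > scores[j]:
                    concordant += 1
                elif scores[i] == scores[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / comparable

times  = [2, 4, 5, 7, 9]
events = [1, 1, 0, 1, 0]            # 1 = event observed, 0 = censored
scores = [0.9, 0.5, 0.4, 0.6, 0.2]  # predicted risk
print(harrells_c(times, events, scores))  # 0.875: 7 of 8 pairs concordant
```

A C statistic of 0.5 corresponds to chance-level ranking and 1.0 to perfect ranking, so the ≥ 0.828 reported above indicates strong discrimination.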

    Prediction of cardiovascular risk using Framingham, ASSIGN and QRISK2: how well do they predict individual rather than population risk?

    BACKGROUND: The objective of this study was to evaluate the performance of risk scores (Framingham, ASSIGN and QRISK2) in predicting high cardiovascular disease (CVD) risk in individuals rather than populations. METHODS AND FINDINGS: This study included 1.8 million persons without CVD or prior statin prescribing, using the Clinical Practice Research Datalink, which contains electronic medical records of the general population registered with a UK general practice. Individual CVD risks were estimated using competing risk regression models. Individual differences between the 10-year CVD risks predicted by the risk scores and by the competing risk models were estimated; the population was divided into 20 subgroups based on predicted risk. CVD outcomes occurred in 69,870 persons. In the subgroup with the lowest risks, risk predictions by QRISK2 were similar to individual risks predicted using our competing risk model (99.9% of people had differences of less than 2%); in the subgroup with the highest risks, risk predictions varied greatly (only 13.3% of people had differences of less than 2%). Larger deviations between QRISK2 and our individual predicted risks occurred with calendar year, different ethnicities, diabetes mellitus and the number of records for medical events in the electronic health records in the year before the index date. A QRISK2 estimate of low 10-year CVD risk (<15%) was confirmed by Framingham, ASSIGN and our individual predicted risks in 89.8% of people, while an estimate of high 10-year CVD risk (≥ 20%) was confirmed in only 48.6% of people. The majority of cases occurred in people who had a predicted 10-year CVD risk of less than 20%. CONCLUSIONS: Application of existing CVD risk scores may result in considerable misclassification of high risk status. Current practice to use a constant threshold level for intervention for all patients, together with the use of different scoring methods, may inadvertently create an arbitrary classification of high CVD risk.
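The between-model agreement reported above (the share of people whose two predicted 10-year risks differ by less than 2 percentage points) can be sketched as a simple pairwise comparison. The risk values below are invented for illustration, not study data.

```python
# Hypothetical sketch of the agreement measure described above: the
# fraction of paired 10-year CVD risk predictions from two models that
# differ by less than a tolerance of 2 percentage points.
def share_within(risks_a, risks_b, tol=2.0):
    """Fraction of paired predictions differing by less than tol points."""
    assert len(risks_a) == len(risks_b)
    close = sum(1 for a, b in zip(risks_a, risks_b) if abs(a - b) < tol)
    return close / len(risks_a)

model_a = [3.1, 8.4, 14.9, 22.0, 31.5]   # % 10-year risk, score-based model
model_b = [2.8, 9.1, 16.2, 27.4, 24.0]   # % 10-year risk, competing risk model
print(share_within(model_a, model_b))  # 0.6: three of five pairs agree
```

Applied per risk subgroup, this is the quantity that fell from 99.9% in the lowest-risk group to 13.3% in the highest-risk group.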

    Incidence, prevalence and mortality of bullous pemphigoid in England 1998-2017: a population-based cohort study

    BACKGROUND: A rising incidence and high mortality were found for bullous pemphigoid (BP) over a decade ago in the UK. Updated estimates of its epidemiology are required to understand the healthcare needs of an ageing population. OBJECTIVES: To determine the incidence, prevalence and mortality rates of BP in England from 1998 to 2017. METHODS: We conducted a cohort study of longitudinal electronic health records using the Clinical Practice Research Datalink and linked Hospital Episode Statistics. Incidence was calculated per 100 000 person-years and annual point prevalence per 100 000 people. Multivariate analysis was used to determine incidence rate ratios by sociodemographic factors. Mortality was examined in an age-, sex- and practice-matched cohort, using linked Office for National Statistics death records. Hazard ratios (HRs) were stratified by matched set. RESULTS: The incidence was 7·63 [95% confidence interval (CI) 7·35-7·93] per 100 000 person-years and rose with increasing age, particularly for elderly men. The annual increase in incidence was 0·9% (95% CI 0·2-1·7). The prevalence almost doubled over the observation period, reaching 47·99 (95% CI 43·09-53·46) per 100 000 people and 141·24 (95% CI 125·55-158·87) per 100 000 people over the age of 60 years. The risk of all-cause mortality was highest in the 2 years after diagnosis (HR 2·96; 95% CI 2·68-3·26) and remained raised thereafter (HR 1·54; 95% CI 1·36-1·74). CONCLUSIONS: We report a modest increase in the incidence rate of BP, but show that the burden of disease in the elderly population is considerable. Mortality is high, particularly in the first 2 years after diagnosis.
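The two rate definitions used above differ in their denominators: incidence divides new diagnoses by person-years of follow-up, while point prevalence divides existing cases by the population on a given date. A minimal sketch, with made-up counts chosen only to illustrate the arithmetic:

```python
# Sketch of the incidence and point prevalence definitions. The counts
# below are illustrative, not the study's data.
def incidence_per_100k(new_cases, person_years):
    """New diagnoses per 100,000 person-years of follow-up."""
    return new_cases / person_years * 100_000

def point_prevalence_per_100k(living_cases, population):
    """Existing cases per 100,000 people on a given date."""
    return living_cases / population * 100_000

print(incidence_per_100k(763, 10_000_000))          # 7.63 per 100,000
print(point_prevalence_per_100k(4_799, 10_000_000))  # 47.99 per 100,000
```

Because prevalence counts everyone living with the disease, a rare but long-lasting condition in an ageing population can see prevalence grow much faster than incidence, as reported above.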

    Do changes in traditional coronary heart disease risk factors over time explain the association between socio-economic status and coronary heart disease?

    Background: Socioeconomic status (SES) predicts coronary heart disease independently of the traditional risk factors included in the Framingham risk score. However, it is unknown whether changes in Framingham risk score variables over time explain the association between SES and coronary heart disease. We examined this question given its relevance to risk assessment in clinical decision making. Methods: Data from the Atherosclerosis Risk in Communities study (initiated in 1987, with 10 years of follow-up of 15,495 adults aged 45-64 years in four Southern and Mid-Western communities) were used. SES was assessed at baseline and dichotomized as low SES (defined as low education and/or low income) or not. The time-dependent variables - smoking, total and high-density lipoprotein cholesterol, systolic blood pressure and use of blood pressure lowering medication - were assessed every three years. Ten-year incidence of coronary heart disease was based on EKG and cardiac enzyme criteria, or adjudicated death certificate data. Cox survival analyses examined the contribution of SES to heart disease risk independent of baseline Framingham risk score, without and with further adjustment for the time-dependent variables. Results: Adjusting for baseline Framingham risk score, low SES was associated with an increased coronary heart disease risk (hazard ratio [HR] = 1.53; 95% confidence interval [CI], 1.27 to 1.85). After further adjustment for the time-dependent variables, the SES effect remained significant (HR = 1.44; 95% CI, 1.19 to 1.74). Conclusion: Using the Framingham risk score alone underestimated coronary heart disease risk in low-SES persons. This bias was not eliminated by subsequent changes in Framingham risk score variables.