51 research outputs found
Thrombocytopenic, thromboembolic and haemorrhagic events following second dose with BNT162b2 and ChAdOx1: self-controlled case series analysis of the English national sentinel cohort
Thrombosis associated with thrombocytopenia was a matter of concern after the first and second doses of the BNT162b2 and ChAdOx1 COVID-19 vaccines. It is therefore important to investigate the risk of thrombocytopenic, thromboembolic and haemorrhagic events following a second dose of the BNT162b2 and ChAdOx1 COVID-19 vaccines. We conducted a large-scale self-controlled case series analysis, using routine primary care data linked to hospital data, among 12.3 million individuals (aged 16 years and above) in England. We used the nationally representative Oxford-Royal College of General Practitioners (RCGP) sentinel network database, with baseline and risk periods between 8th December 2020 and 11th June 2022. We included individuals who received two primary doses of the BNT162b2 mRNA (Pfizer-BioNTech) vaccine or two doses of the ChAdOx1 nCoV-19 (Oxford-AstraZeneca) vaccine in our analyses. We carried out a self-controlled case series (SCCS) analysis for each outcome using a conditional Poisson regression model with an offset for the length of the risk period. We report the incidence rate ratios (IRRs) and 95% confidence intervals (CIs) of thrombocytopenic, thromboembolic (including arterial and venous events) and haemorrhagic events in the period of 0-27 days after receiving a second dose of the BNT162b2 or ChAdOx1 vaccine, compared to the baseline period (14 or more days prior to the first dose, 28 or more days after the second dose, and the time between 28 or more days after the first dose and 14 or more days prior to the second dose). We adjusted for a range of potential confounders, including age, sex, comorbidities and deprivation. Between December 8, 2020 and February 11, 2022, 6,306,306 individuals were vaccinated with two doses of BNT162b2 and 6,046,785 individuals were vaccinated with two doses of ChAdOx1. Compared to the baseline, our analysis showed no increased risk of venous thromboembolic events (VTE) for either BNT162b2 (IRR 0.71, 95% CI: 0.65-0.77) or ChAdOx1 (IRR 0.91, 95% CI: 0.84-0.98), and similarly no increased risk of cerebral venous sinus thrombosis (CVST) for either BNT162b2 (IRR 0.87, 95% CI: 0.41-1.85) or ChAdOx1 (IRR 1.73, 95% CI: 0.82-3.68). We additionally report no difference in IRR for pulmonary embolism, deep vein thrombosis, thrombocytopenia (including idiopathic thrombocytopenic purpura, ITP) and haemorrhagic events post second dose for either vaccine. Reassuringly, we found no association between the second dose of either vaccine and an increased risk of thrombocytopenic, thromboembolic or haemorrhagic events. Data and Connectivity: COVID-19 Vaccines Pharmacovigilance study.
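The abstract above describes fitting a conditional Poisson regression with an offset for the length of each observation period, which is the standard way to estimate incidence rate ratios in a self-controlled case series. As a rough, illustrative sketch (not the study's code: the toy data, window lengths and variable names are all assumptions), the conditional likelihood can be approximated by an ordinary Poisson model with person-level fixed effects:

```python
# Sketch of a self-controlled case series (SCCS) fit via Poisson regression
# with person-level fixed effects and an offset for interval length.
# Data and variable names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for pid in range(200):
    # Each person contributes a 28-day post-dose risk window and a 300-day baseline window.
    for period, days, rate in [("risk", 28, 0.004), ("baseline", 300, 0.004)]:
        rows.append({
            "person": pid,
            "risk": int(period == "risk"),
            "days": days,
            "events": rng.poisson(rate * days),
        })
df = pd.DataFrame(rows)
df = df[df.groupby("person")["events"].transform("sum") > 0]  # SCCS uses cases only

model = smf.glm(
    "events ~ C(person) + risk",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(np.asarray(df["days"])),   # exposure time for each window
).fit()
irr = np.exp(model.params["risk"])
ci = np.exp(model.conf_int().loc["risk"])
print(f"IRR for the 0-27 day risk window: {irr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

Because each person acts as their own control, time-invariant characteristics such as sex or long-standing comorbidity cancel out of the within-person comparison, which is the main appeal of the SCCS design for vaccine safety questions.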
Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)
In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
Impact of prediagnostic smoking and smoking cessation on colorectal cancer prognosis: a meta-analysis of individual patient data from cohorts within the CHANCES consortium
Background: Smoking has been associated with colorectal cancer (CRC)
incidence and mortality in previous studies and might also be associated
with prognosis after CRC diagnosis. However, current evidence on smoking
in association with CRC prognosis is limited.
Patients and methods: For this individual patient data meta-analysis,
sociodemographic and smoking behavior information of 12 414 incident CRC
patients (median age at diagnosis: 64.3 years), recruited within 14
prospective cohort studies among previously cancer-free adults, was
collected at baseline and harmonized across studies. Vital status and
causes of death were collected for a mean follow-up time of 5.1 years
following cancer diagnosis. Associations of smoking behavior with
overall and CRC-specific survival were evaluated using Cox regression
and standard meta-analysis methodology.
Results: A total of 5229 participants died, 3194 from CRC. Cox
regression revealed significant associations between former [hazard
ratio (HR) = 1.12; 95% confidence interval (CI) = 1.04-1.20] and
current smoking (HR = 1.29; 95% CI = 1.04-1.60) and poorer overall
survival compared with never smoking. Compared with current smoking,
smoking cessation was associated with improved overall (HR<10 years =
0.78; 95% CI = 0.69-0.88; HR≥10 years = 0.78; 95% CI = 0.63-0.97)
and CRC-specific survival (HR≥10 years = 0.76; 95% CI = 0.67-0.85).
Conclusion: In this large meta-analysis including primary data of
incident CRC patients from 14 prospective cohort studies on the
association between smoking and CRC prognosis, former and current
smoking were associated with poorer CRC prognosis compared with never
smoking. Smoking cessation was associated with improved survival when
compared with current smokers. Future studies should further quantify
the benefits of nonsmoking, both for cancer prevention and for improving
survival among CRC patients, particularly in terms of treatment
response.
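For readers unfamiliar with the modelling described in the methods, the sketch below shows how a Cox proportional hazards model with smoking status as the exposure might be fitted in a single cohort before study-specific estimates are pooled. The data, hazard ratios and variable names are simulated for illustration and are not the consortium's analysis code.

```python
# Illustrative Cox proportional hazards fit for smoking status and overall survival.
# Simulated data only; not the CHANCES consortium analysis.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 3000
smoking = rng.choice(["never", "former", "current"], size=n, p=[0.5, 0.35, 0.15])
age = rng.normal(64, 8, size=n)
# Assumed hazard ratios for illustration: former 1.12, current 1.29 vs never.
hr = np.select([smoking == "former", smoking == "current"], [1.12, 1.29], default=1.0)
baseline_hazard = 0.08 * np.exp(0.02 * (age - 64))
time_to_death = rng.exponential(1.0 / (baseline_hazard * hr))
censor = rng.uniform(0, 10, size=n)              # administrative censoring

df = pd.DataFrame({
    "time": np.minimum(time_to_death, censor),
    "event": (time_to_death <= censor).astype(int),
    "former": (smoking == "former").astype(int),
    "current": (smoking == "current").astype(int),
    "age": age,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
# Hazard ratios (exp(coef)) with 95% confidence intervals for each covariate.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```

In an individual patient data meta-analysis such as this, a model of this form would be fitted per cohort and the resulting log hazard ratios combined with standard inverse-variance meta-analysis methodology.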
Quality of life in older adults with chronic kidney disease and transient changes in renal function: findings from the Oxford Renal Cohort
Background: Quality of life (QoL) is an important measure of disease burden and general health perception. The relationship between early chronic kidney disease (CKD) and QoL remains poorly understood. The Oxford Renal Study (OxRen) cohort comprises 1063 adults aged ≥60 years from UK primary care practices screened for early CKD, grouped according to existing or screen-detected CKD diagnoses, or biochemistry results indicative of reduced renal function (referred to as transient estimated glomerular filtration rate (eGFR) reduction).
Objectives: This study aimed to compare QoL in participants known to have CKD at recruitment to those identified as having CKD through a screening programme.
Methods: Health profile data and multi-attribute utility scores were reported for two generic questionnaires: 5-level EuroQol-5 Dimension (EQ-5D-5L) and ICEpop CAPability measure for Adults (ICECAP-A). QoL was compared between patients with existing and screen-detected CKD; those with transient eGFR reduction served as the reference group in univariable and multivariable linear regression.
Results: Mean (±SD) utility scores were not significantly different between the subgroups for EQ-5D-5L (screen-detected: 0.785±0.156, n=480; transient: 0.779±0.157, n=261; existing CKD: 0.763±0.171, n=322; p=0.216) or ICECAP-A (screen-detected: 0.909±0.094; transient: 0.904±0.110; existing CKD: 0.894±0.115; p=0.200). Age, smoking status, and number of comorbidities were identified as independent predictors of QoL in this cohort.
Conclusion: QoL of participants with existing CKD diagnoses was not significantly different from that of those with screen-detected CKD or transient eGFR reduction and was similar to UK mean scores for the same age, suggesting that the patient burden of early CKD is minor. Moreover, CKD-related comorbidities contribute more significantly to disease burden in earlier stages of CKD than renal function per se. Larger prospective studies are required to define the relationship between QoL and CKD progression more precisely. These data also confirm the essentially asymptomatic nature of CKD, implying that routine screening or case finding is required to diagnose it.
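A minimal sketch of the univariable and multivariable linear regression comparison described above, using simulated utility scores and the transient eGFR reduction group as the reference category (all variable names, group proportions and effect sizes are illustrative assumptions, not the OxRen data):

```python
# Sketch of univariable/multivariable linear regression of utility scores on
# CKD subgroup, adjusted for age and comorbidity count. Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 900
group = rng.choice(["transient", "screen_detected", "existing"], size=n, p=[0.25, 0.45, 0.30])
age = rng.normal(74, 7, size=n)
comorbidities = rng.poisson(2, size=n)
utility = np.minimum(
    0.90 - 0.003 * (age - 74) - 0.02 * comorbidities + rng.normal(0, 0.12, size=n),
    1.0,  # EQ-5D-5L utilities are capped at 1 (full health)
)
df = pd.DataFrame({"utility": utility, "group": group, "age": age,
                   "comorbidities": comorbidities})

# Univariable: subgroup only, with the transient eGFR reduction group as reference.
uni = smf.ols("utility ~ C(group, Treatment(reference='transient'))", data=df).fit()
# Multivariable: adjust for age and number of comorbidities.
multi = smf.ols(
    "utility ~ C(group, Treatment(reference='transient')) + age + comorbidities",
    data=df,
).fit()
print(uni.params)
print(multi.summary())
```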
Prevalence and factors associated with multimorbidity among primary care patients with decreased renal function
Objectives: To establish the prevalence of multimorbidity in people with chronic kidney disease (CKD) stages 1-5 and transiently impaired renal function and identify factors associated with multimorbidity.
Design and setting: Prospective cohort study in UK primary care.
Participants: 861 participants aged 60 and older with decreased renal function, of whom 584 (65.8%) had CKD and 277 (32.2%) did not.
Interventions: Participants underwent medical history and clinical assessment, and blood and urine sampling.
Primary and secondary outcome measures: Multimorbidity was defined as presence of ≥2 chronic conditions including CKD. Prevalence of each condition, co-existing conditions and multimorbidity were described and logistic regression was used to identify predictors of multimorbidity.
Results: The mean (±SD) age of participants was 74±7 years, 54% were women and 98% were white. After CKD, the next most prevalent condition was hypertension (n=511, 59.3%), followed by obesity (n=265, 30.8%), ischemic heart disease (n=145, 16.8%) and diabetes (n=133, 15.4%). Having two co-existing conditions was most common (27%), the most common combination of which was hypertension and obesity (29%). Having one or three conditions was the next most prevalent (20% and 21%, respectively). The prevalence of multimorbidity was 73.9% (95% CI 70.9-76.8) in all participants and 86.6% (95% CI 83.9-89.3) in those with any-stage CKD. Logistic regression identified significant associations of increasing age (OR 1.07, 95% CI 1.04-1.10), increasing BMI (OR 1.15, 95% CI 1.10-1.20) and decreasing eGFR (OR 0.99, 95% CI 0.98-1.00) with multimorbidity.
Conclusions: This analysis is the first to provide an accurate estimate of the prevalence of multimorbidity in a screened older primary care population living with or at risk of CKD across all stages. Hypertension and obesity were the most common combination of conditions other than CKD that people were living with, suggesting that there may be multiple reasons for closely monitoring health status in individuals with CKD.
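The logistic regression step reported above can be illustrated with a short sketch on simulated data, returning odds ratios and 95% confidence intervals for age, BMI and eGFR (variable names and effect sizes are assumptions, not the study's analysis):

```python
# Sketch of logistic regression for factors associated with multimorbidity
# (defined as >=2 chronic conditions). Simulated data; names illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 861
age = rng.normal(74, 7, size=n)
bmi = rng.normal(28, 5, size=n)
egfr = rng.normal(55, 12, size=n)
# Assumed log-odds, loosely echoing the reported direction of effect.
logit = 1.0 + 0.07 * (age - 74) + 0.14 * (bmi - 28) - 0.01 * (egfr - 55)
multimorbid = rng.binomial(1, 1 / (1 + np.exp(-logit)))
df = pd.DataFrame({"multimorbid": multimorbid, "age": age, "bmi": bmi, "egfr": egfr})

fit = smf.logit("multimorbid ~ age + bmi + egfr", data=df).fit()
odds_ratios = np.exp(fit.params).rename("OR")
or_ci = np.exp(fit.conf_int())          # 95% CI on the odds-ratio scale
print(pd.concat([odds_ratios, or_ci], axis=1))
```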
Trends in survival after a diagnosis of heart failure in the United Kingdom 2000-2017: population based cohort study
Objectives: To report reliable estimates of short term and long term survival rates for people with a diagnosis of heart failure and to assess trends over time by year of diagnosis, hospital admission, and socioeconomic group.
Design: Population based cohort study.
Setting: Primary care, United Kingdom.
Participants: Primary care data for 55 959 patients aged 45 and over with a new diagnosis of heart failure and 278 679 age and sex matched controls in the Clinical Practice Research Datalink from 1 January 2000 to 31 December 2017, linked to inpatient Hospital Episode Statistics and Office for National Statistics mortality data.
Main outcome measures: Survival rates at one, five, and 10 years and cause of death for people with and without heart failure; and temporal trends in survival by year of diagnosis, hospital admission, and socioeconomic group.
Results: Overall, one, five, and 10 year survival rates increased by 6.6% (from 74.2% in 2000 to 80.8% in 2016), 7.2% (from 41.0% in 2000 to 48.2% in 2012), and 6.4% (from 19.8% in 2000 to 26.2% in 2007), respectively. There were 30 906 deaths in the heart failure group over the study period. Heart failure was listed on the death certificate for 13 093 (42.4%) of these patients, and for 2237 (7.2%) it was the primary cause of death. Improvement in survival was greater for patients not requiring admission to hospital around the time of diagnosis (median difference 2.4 years; 5.3 v 2.9 years, P<0.001). There was a deprivation gap in median survival of 2.4 years between people who were least deprived and those who were most deprived (11.1 v 8.7 years, P<0.001).
Conclusions: Survival after a diagnosis of heart failure has shown only modest improvement in the 21st century and lags behind that for other serious conditions, such as cancer. New strategies to achieve timely diagnosis and treatment initiation in primary care for all socioeconomic groups should be a priority for future research and policy.
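The one, five and 10 year survival rates reported above are typically derived from a Kaplan-Meier estimator evaluated at those time points. A minimal sketch on simulated follow-up data (not the CPRD cohort) is shown below:

```python
# Sketch of estimating 1-, 5- and 10-year survival after diagnosis with a
# Kaplan-Meier estimator. Simulated follow-up data; numbers are illustrative.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(4)
n = 5000
time_to_death = rng.exponential(6.0, size=n)   # years from diagnosis to death
follow_up = rng.uniform(1, 18, size=n)         # administrative censoring time
durations = np.minimum(time_to_death, follow_up)
observed = (time_to_death <= follow_up).astype(int)

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed)
# Estimated survival probabilities at 1, 5 and 10 years after diagnosis.
print(kmf.survival_function_at_times([1, 5, 10]).round(3))
```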
Behavioural interventions for smoking cessation: an overview and network meta-analysis
Supplementary tables in Microsoft Word and Excel that provide additional information to accompany the Cochrane Overview: Behavioural interventions for smoking cessation: an overview and network meta-analysis.
Long term trends in natriuretic peptide testing for heart failure in UK primary care: a cohort study
Aims: Heart failure (HF) is a malignant condition with poor outcomes and is often diagnosed on emergency hospital admission. Natriuretic peptide (NP) testing in primary care is recommended in international guidelines to facilitate timely diagnosis. We aimed to report contemporary trends in NP testing and subsequent HF diagnosis rates over time.
Methods and Results: Cohort study using linked primary and secondary care data of adult (≥45 years) patients in England 2004-2018 (n=7,212,013, 48% male) to report trends in NP testing (over time, by age, sex, ethnicity, and socioeconomic status) and HF diagnosis rates. NP test rates increased from 0.25 per 1000 person-years [95% confidence interval (CI) 0.23-0.26] in 2004 to 16.88 per 1000 person-years (95% CI 16.73-17.03) in 2018, with a significant upward trend from 2010 following publication of national HF guidance. Test rates were similar across sexes and ethnic groups, and, as expected, testing was more common in older and more socially deprived groups. The HF detection rate was constant over the study period (around 10%) and the proportion of patients without NP testing prior to diagnosis remained high [99.6% (n=13,484) in 2004 vs 76.7% (n=12,978) in 2017].
Conclusion: NP testing in primary care has increased over time, with no evidence of significant inequalities, but most patients with HF still do not have an NP test recorded prior to diagnosis. More NP testing in primary care may be needed to prevent hospitalisation and facilitate HF diagnosis at an earlier, more treatable stage.
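The testing rates above are expressed per 1000 person-years with confidence intervals. As a small illustration of that calculation, the sketch below computes a rate and an exact Poisson 95% CI from a hypothetical count of tests and person-time (the numbers are made up, not the study's):

```python
# Sketch of a testing rate per 1000 person-years with an exact Poisson CI.
# Counts below are hypothetical, for illustration only.
from scipy.stats import chi2

def rate_per_1000_py(events: int, person_years: float, alpha: float = 0.05):
    """Return the rate per 1000 person-years and an exact (Garwood) Poisson CI."""
    lower = chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
    scale = 1000 / person_years
    return events * scale, lower * scale, upper * scale

# Hypothetical example: 85,000 NP tests over 5,035,000 person-years in one calendar year.
rate, lo, hi = rate_per_1000_py(85_000, 5_035_000)
print(f"{rate:.2f} per 1000 person-years (95% CI {lo:.2f}-{hi:.2f})")
```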