
    Vaccine effectiveness against laboratory-confirmed influenza in Europe – Results from the DRIVE network during season 2018/19

    The DRIVE project aims to establish a sustainable network to estimate brand-specific influenza vaccine effectiveness (IVE) annually. DRIVE is a public–private partnership launched in response to EMA guidance that requires manufacturers to evaluate the effectiveness of all individual influenza vaccine brands every season. IVE studies are conducted by the public partners in DRIVE. Private partners (vaccine manufacturers from the European Federation of Pharmaceutical Industries and Associations (EFPIA)) provide written feedback moderated by an independent scientific committee. Test-negative design (TND) case-control studies (four in primary care and five in hospital) were conducted in six European countries during the 2018/19 season. Site-specific confounder-adjusted vaccine effectiveness (VE) estimates for any vaccine exposure were calculated by age group (<18 years (y), 18-64 y and 65+ y) and pooled by setting (primary care, hospital) through random effects meta-analysis. In addition, one population-based cohort study was conducted in Finland. The TND studies included 3339 cases and 6012 controls; seven vaccine brands were reported. For ages 65+ y, pooled VE against any influenza strain was estimated at 27% (95% CI 6–44) in the hospital setting. Sample size was insufficient for meaningful IVE estimates in other age groups, in the primary care setting, or by vaccine brand. The population-based cohort study included 274,077 vaccinated and 494,337 unvaccinated person-years; two vaccine brands were reported. Brand-specific IVE was estimated for Fluenz Tetra (36% [95% CI 24–45]) for ages 2-6 y, Vaxigrip Tetra (54% [43–62]) for ages 6 months to 6 y, and Vaxigrip Tetra (30% [25–35]) for ages 65+ y. The results presented are from the second influenza season covered by the DRIVE network. While the sample size from the pooled TND studies was still too low for precise (brand-specific) IVE estimates, the network has approximately doubled in size compared to the pilot season. Taking measures to increase sample size is an important focus of DRIVE for the coming years.
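    A minimal sketch of the pooling step described above: site-specific VE estimates (VE = 1 - OR in a test-negative design) are combined with a DerSimonian-Laird random-effects model on the log odds ratio scale. The function name and the numbers are illustrative only, not DRIVE data.

import numpy as np

def pool_ve(ve, ci_low, ci_high):
    """Pool site-specific vaccine effectiveness estimates (in %) given 95% CIs."""
    ve, ci_low, ci_high = map(np.asarray, (ve, ci_low, ci_high))
    y = np.log(1 - ve / 100.0)                        # log odds ratio per site
    # Standard errors back-calculated from the reported confidence limits
    se = (np.log(1 - ci_low / 100.0) - np.log(1 - ci_high / 100.0)) / (2 * 1.96)
    w = 1.0 / se**2                                   # fixed-effect weights
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (se**2 + tau2)                     # random-effects weights
    y_pool = np.sum(w_star * y) / np.sum(w_star)
    se_pool = np.sqrt(1.0 / np.sum(w_star))
    ve_pool = 100 * (1 - np.exp(y_pool))
    ci = (100 * (1 - np.exp(y_pool + 1.96 * se_pool)),
          100 * (1 - np.exp(y_pool - 1.96 * se_pool)))
    return ve_pool, ci

# Illustrative hospital-setting inputs from three sites (not the study data)
print(pool_ve(ve=[20, 35, 28], ci_low=[-5, 10, 2], ci_high=[40, 52, 47]))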

    Urinary biomarkers for the detection of prostate cancer in patients with high-grade prostatic intraepithelial neoplasia

    Introduction: High-grade prostatic intraepithelial neoplasia (HGPIN) is a recognized precursor stage of prostate cancer (PCa). Men who present with HGPIN in a first prostate biopsy face years of active surveillance, including repeat biopsies. This study aimed to identify non-invasive prognostic biomarkers that differentiate early on between indolent HGPIN cases and those that will transform into actual PCa. Methods: We measured the expression of 21 candidate mRNA biomarkers using quantitative PCR in urine sediment samples from a cohort of 90 patients with an initial diagnosis of HGPIN and a subsequent follow-up of at least two years. Uni- and multivariate statistical analyses were applied to evaluate the candidate biomarkers and multiplex models combining them. Results: PSMA, PCA3, PSGR, GOLM, KLK3, CDH1, and SPINK1 behaved as predictors of PCa presence in repeat biopsies. Multiplex models outperformed (AUC = 0.81-0.86) the predictive power of single genes, including the FDA-approved PCA3 (AUC = 0.70). At a fixed sensitivity of 95%, the specificity of our multiplex models was 41-58%, compared with 30% for PCA3. The positive predictive value (PPV) of our models (30-38%) was also higher than that of PCA3 (27%), suggesting that benign cases could be identified more accurately. Applying statistical models, we estimated that 33% to 47% of repeat biopsies could be prevented with a multiplex PCR model, an easily applicable and significant advantage over the current gold standard in urine sediment. Discussion: Using multiplex RT-qPCR-based models in urine sediment, it is possible to improve on the current diagnostic method of choice (PCA3) to differentiate between benign HGPIN and PCa cases.
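    A hedged sketch of what a multiplex model of this kind looks like computationally: several urinary mRNA markers combined in a logistic regression and evaluated by ROC AUC and by specificity at a fixed 95% sensitivity. The marker values are simulated, not the study cohort.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
n = 90                                             # cohort size as in the abstract
y = rng.integers(0, 2, n)                          # 1 = PCa in repeat biopsy, 0 = benign HGPIN
X = rng.normal(size=(n, 4)) + 0.8 * y[:, None]     # four hypothetical normalized qPCR markers

model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]
print("AUC:", roc_auc_score(y, scores))

# Specificity at the threshold that first reaches 95% sensitivity
fpr, tpr, _ = roc_curve(y, scores)
idx = int(np.argmax(tpr >= 0.95))
print("Specificity at 95% sensitivity:", 1 - fpr[idx])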

    Cancer mortality inequalities in urban areas: a Bayesian small area analysis in Spanish cities

    Background: Intra-urban inequalities in mortality have been infrequently analysed in European contexts. The aim of the present study was to analyse patterns of cancer mortality and their relationship with socioeconomic deprivation in small areas of 11 Spanish cities. Methods: This is a cross-sectional ecological study using mortality data for the years 1996-2003. The units of analysis were census tracts, and a deprivation index was calculated for each tract. In order to control the variability in estimating the risk of dying, we used Bayesian models. We present the RR of the census tract with the highest deprivation vs. the census tract with the lowest deprivation. Results: In men, socioeconomic inequalities are observed in total cancer mortality in all cities except Castellon, Cordoba and Vigo, with Barcelona (RR = 1.53, 95% CI 1.42-1.67), Madrid (RR = 1.57, 95% CI 1.49-1.65) and Seville (RR = 1.53, 95% CI 1.36-1.74) presenting the greatest inequalities. In general, Barcelona and Madrid present inequalities for most types of cancer. Among women, inequalities in total cancer mortality were only found in Barcelona and Zaragoza. The excess number of cancer deaths attributable to socioeconomic deprivation was 16,413 in men and 1,142 in women. Conclusion: This study has analysed inequalities in cancer mortality in small areas of Spanish cities, not only relating this mortality to socioeconomic deprivation but also calculating the excess mortality that may be attributed to such deprivation. This knowledge is particularly useful for determining which geographical areas in each city need intersectoral policies to promote a healthy environment. This article was partially supported by Fondo de Investigaciones Sanitarias (FIS) projects PI042013, PI040041, PI040170, PI040069, PI042602, PI040388, PI040489, PI042098, PI041260, PI040399 and PI081488, by the CIBER en Epidemiología y Salud Pública (CIBERESP), Spain, and by the programme “Intensificación de la Actividad Investigadora” (Carme Borrell) funded by the Instituto de Salud Carlos III and the Departament de Salut, Generalitat de Catalunya.
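    A simplified, non-Bayesian sketch of the ecological comparison reported above: tract-level deaths are regressed on a deprivation index with expected deaths as an offset (Poisson GLM), and the RR contrasts the most and least deprived tracts. The study itself used Bayesian smoothing models; the data below are simulated.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_tracts = 200
deprivation = rng.uniform(0, 1, n_tracts)       # deprivation index per census tract
expected = rng.uniform(5, 50, n_tracts)         # expected deaths (e.g. age-standardised)
observed = rng.poisson(expected * np.exp(0.4 * deprivation))

X = sm.add_constant(deprivation)
fit = sm.GLM(observed, X, family=sm.families.Poisson(),
             offset=np.log(expected)).fit()

beta = fit.params[1]                            # log RR per unit of deprivation
rr = np.exp(beta * (deprivation.max() - deprivation.min()))
print("RR, highest vs lowest deprivation tract:", rr)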

    Use of Colorimetric Culture Methods for Detection of Mycobacterium tuberculosis Complex Isolates from Sputum Samples in Resource-Limited Settings

    Despite recent advances, tuberculosis (TB) diagnosis remains imperfect in resource-limited settings because of its complexity and cost, the poor sensitivity of available tests, and long times to reporting. We report on the use of colorimetric methods, based on the detection of mycobacterial growth with colorimetric indicators, for the detection of Mycobacterium tuberculosis in sputum specimens. We evaluated the nitrate reductase assay (NRA), a modified NRA using para-nitrobenzoic acid (PNB) (NRAp), and the resazurin tube assay using PNB (RETAp) to differentiate tuberculous and nontuberculous mycobacteria. Performance was assessed at days 18 and 28 using the mycobacteria growth indicator tube (MGIT) and Löwenstein-Jensen (LJ) medium culture methods as the reference standards. We enrolled 690 adults with suspected pulmonary tuberculosis from a regional referral hospital in Uganda between March 2010 and June 2011. At day 18, the sensitivities and specificities were 84.6% and 90.0% for the NRA, 84.1% and 92.6% for the NRAp, and 71.2% and 99.3% for the RETAp, respectively. At day 28, the sensitivity of the RETAp increased to 82.6%. Among smear-negative patients with suspected TB, sensitivities at day 28 were 64.7% for the NRA, 61.3% for the NRAp, and 50% for the RETAp. Contamination rates were 5.4% for the NRA and 6.7% for the RETAp, compared with 22.1% for LJ medium culture and 20.4% for MGIT culture. The median times to positivity were 10, 7, and 25 days for the colorimetric methods, MGIT culture, and LJ medium culture, respectively. Whereas the low specificity of the NRA/NRAp precludes its use for TB diagnosis, the RETAp might provide an alternative to LJ medium culture to decrease the time to culture results in resource-poor settings.
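    A brief sketch of how accuracy figures such as those above are computed: sensitivity and specificity of an index test against a culture reference standard, with exact (Clopper-Pearson) 95% confidence intervals. The counts are illustrative, not the study data.

from statsmodels.stats.proportion import proportion_confint

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity with exact 95% binomial CIs."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    sens_ci = proportion_confint(tp, tp + fn, alpha=0.05, method="beta")
    spec_ci = proportion_confint(tn, tn + fp, alpha=0.05, method="beta")
    return (sens, sens_ci), (spec, spec_ci)

# Illustrative 2x2 counts for an index test vs. culture
print(sens_spec(tp=110, fn=20, tn=450, fp=50))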

    Waterborne Outbreak of Gastroenteritis: Effects on Sick Leaves and Cost of Lost Workdays

    We examined the acute and cumulative effects of a waterborne outbreak, caused by contamination of the drinking water distribution system, on sick leaves among public sector employees residing in the clean and contaminated areas, and the additional costs of lost workdays due to the incident. Daily information on the sick leaves of 1789 Finnish Public Sector Study participants was obtained from employers' registers. Global Positioning System coordinates were used to link participants to the clean and contaminated areas. Prevalence ratios (PR) for weekly sickness absences were calculated using binomial regression analysis. Cost calculations were based on prior studies. Among those living in the contaminated areas, the prevalence of participants on sick leave was 3.54 (95% confidence interval (CI) 2.97–4.22) times higher in the week following the incident than during the reference period. Those living and working in the clean area were essentially unaffected; the corresponding PR for sick leaves was 1.12 (95% CI 0.73–1.73). No cumulative effects on sick leaves were observed among the exposed. The estimated additional costs of lost workdays due to the incident were 1.8–2.1 million euros. The prevalence of sickness absences among public sector employees residing in the affected areas increased shortly after the drinking water distribution system was contaminated, but no long-term effects were observed. The estimated costs of lost workdays were considerable; thus, the cost-benefit of better monitoring systems for water distribution systems should be evaluated.
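    A sketch of the prevalence-ratio estimation named above, assuming individual-level weekly absence data: a log-binomial GLM of sickness absence on area of residence, so the exponentiated coefficient is a prevalence ratio. The data are simulated; a modified Poisson model with robust errors is a common fallback when the log-binomial fit does not converge.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1789
exposed = rng.integers(0, 2, n)                  # 1 = contaminated area, 0 = clean area
p = 0.02 * np.where(exposed == 1, 3.5, 1.0)      # illustrative absence probabilities
absent = rng.binomial(1, p)

X = sm.add_constant(exposed)
fit = sm.GLM(absent, X,
             family=sm.families.Binomial(link=sm.families.links.Log())).fit()
print("PR:", np.exp(fit.params[1]))
print("95% CI:", np.exp(fit.conf_int()[1]))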

    Choice of the initial antiretroviral treatment for HIV-positive individuals in the era of integrase inhibitors

    BACKGROUND: We aimed to describe the most frequently prescribed initial antiretroviral therapy (ART) regimens in recent years in HIV-positive persons in the Cohort of the Spanish HIV/AIDS Research Network (CoRIS) and to investigate factors associated with the choice of each regimen. METHODS: We analyzed initial ART regimens prescribed in adults participating in CoRIS from 2014 to 2017. Only regimens prescribed in >5% of patients were considered. We used multivariable multinomial regression to estimate relative risk ratios (RRRs) for the association between sociodemographic and clinical characteristics and the choice of the initial regimen. RESULTS: Among 2874 participants, abacavir (ABC)/lamivudine (3TC)/dolutegravir (DTG) was the most frequently prescribed regimen (32.1%), followed by tenofovir disoproxil fumarate (TDF)/emtricitabine (FTC)/elvitegravir (EVG)/cobicistat (COBI) (14.9%), TDF/FTC/rilpivirine (RPV) (14.0%), tenofovir alafenamide (TAF)/FTC/EVG/COBI (13.7%), TDF/FTC+DTG (10.0%), TDF/FTC+darunavir/ritonavir or darunavir/cobicistat (bDRV) (9.8%) and TDF/FTC+raltegravir (RAL) (5.6%). Compared with ABC/3TC/DTG, starting TDF/FTC/RPV was less likely in patients with CD4 <200 cells/μL and with HIV-RNA >100,000 copies/mL. TDF/FTC+DTG was more frequent in those with CD4 <200 cells/μL and HIV-RNA >100,000 copies/mL. TDF/FTC+RAL and TDF/FTC+bDRV were also more frequent among patients with CD4 <200 cells/μL and with transmission categories other than men who have sex with men. Compared with ABC/3TC/DTG, the prescription of other initial ART regimens decreased from 2014-2015 to 2016-2017, with the exception of TDF/FTC+DTG. Differences in the choice of the initial ART regimen were observed by hospital location. CONCLUSIONS: The choice of initial ART regimens is consistent with the recommendations of Spanish guidelines, but is also clearly influenced by physicians' perceptions based on patients' clinical and sociodemographic characteristics and by the location of the prescribing hospital.
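    A hedged sketch of the modelling approach named above: a multinomial logistic model of the chosen initial regimen on patient characteristics, whose exponentiated coefficients are relative risk ratios (RRRs) versus the reference regimen. The variables and data are simulated, not CoRIS.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2874
df = pd.DataFrame({
    "cd4_lt200": rng.integers(0, 2, n),          # CD4 <200 cells/uL (yes/no)
    "vl_gt100k": rng.integers(0, 2, n),          # HIV-RNA >100,000 copies/mL (yes/no)
    "regimen": rng.choice(["ABC/3TC/DTG", "TDF/FTC/RPV", "TDF/FTC+DTG"], n),
})

# Code the outcome so that ABC/3TC/DTG (code 0) is the reference category
y = pd.Categorical(df["regimen"],
                   categories=["ABC/3TC/DTG", "TDF/FTC/RPV", "TDF/FTC+DTG"]).codes
X = sm.add_constant(df[["cd4_lt200", "vl_gt100k"]])
fit = sm.MNLogit(y, X).fit(disp=False)
print(np.exp(fit.params))                        # RRRs vs. the reference regimen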

    The role of epigenetics in renal ageing

    An ability to separate natural ageing processes from processes specific to morbidities is required to understand the heterogeneity of age-related organ dysfunction. Mechanistic insight into how epigenetic factors regulate ageing throughout the life course, linked to a decline in renal function with ageing, is already proving to be of value in the analyses of clinical and epidemiological cohorts. Noncoding RNAs provide epigenetic regulatory circuits within the kidney, which reciprocally interact with DNA methylation processes, histone modification and chromatin. These interactions have been demonstrated to reflect the biological age and function of renal allografts. Epigenetic factors control gene expression and activity in response to environmental perturbations. They also have roles in highly conserved signalling pathways that modulate ageing, including the mTOR and insulin/insulin-like growth factor signalling pathways, and regulation of sirtuin activity. Nutrition, the gut microbiota, inflammation and environmental factors, including psychosocial and lifestyle stresses, provide potential mechanistic links between the epigenetic landscape of ageing and renal dysfunction. Approaches to modify the renal epigenome via nutritional intervention, targeting the methylome or targeting chromatin seem eminently feasible, although caution is merited owing to the potential for intergenerational and transgenerational effects

    Why Are Outcomes Different for Registry Patients Enrolled Prospectively and Retrospectively? Insights from the Global Anticoagulant Registry in the FIELD-Atrial Fibrillation (GARFIELD-AF).

    Background: Retrospective and prospective observational studies are designed to reflect real-world evidence on clinical practice, but can yield conflicting results. The GARFIELD-AF Registry includes both methods of enrolment and allows analysis of the differences in patient characteristics and outcomes that may result. Methods and Results: Patients with atrial fibrillation (AF) and ≥1 risk factor for stroke at diagnosis of AF were recruited either retrospectively (n = 5069) or prospectively (n = 5501) from 19 countries and then followed prospectively. The retrospectively enrolled cohort comprised patients with established AF (for at least 6 and up to 24 months before enrolment) who were identified retrospectively (with baseline and partial follow-up data collected from medical records) and then followed prospectively for 0-18 months, such that the total time of follow-up was 24 months (data collection between Dec 2009 and Oct 2010). In the prospectively enrolled cohort, patients with newly diagnosed AF (≤6 weeks after diagnosis) were recruited between Mar 2010 and Oct 2011 and were followed for 24 months after enrolment. Differences between the cohorts were observed in clinical characteristics, including type of AF, stroke prevention strategies, and event rates. More patients in the retrospectively identified cohort received vitamin K antagonists (62.1% vs. 53.2%) and fewer received non-vitamin K oral anticoagulants (1.8% vs. 4.2%). All-cause mortality rates per 100 person-years during the prospective follow-up (from the first study visit up to 1 year) were significantly lower in the retrospectively than in the prospectively identified cohort (3.04 [95% CI 2.51 to 3.67] vs. 4.05 [95% CI 3.53 to 4.63]; p = 0.016). Conclusions: Interpretations of data from registries that aim to evaluate the characteristics and outcomes of patients with AF must take account of differences in registry design and of the recall and survivorship biases incurred with retrospective enrolment. Clinical Trial Registration: URL: http://www.clinicaltrials.gov. Unique identifier for GARFIELD-AF: NCT01090362.
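    A small sketch of the rate comparison reported above: all-cause mortality per 100 person-years in two cohorts and a simple Poisson-based rate ratio with a 95% CI. The counts are made up, not GARFIELD-AF data.

import numpy as np

def rate_per_100py(deaths, person_years):
    """Event rate per 100 person-years."""
    return 100 * deaths / person_years

def rate_ratio(d1, py1, d2, py2):
    """Rate ratio with a Wald 95% CI on the log scale (Poisson counts assumed)."""
    rr = (d1 / py1) / (d2 / py2)
    se_log = np.sqrt(1 / d1 + 1 / d2)
    ci = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log)
    return rr, ci

# Retrospective vs. prospective cohort, illustrative numbers only
print(rate_per_100py(150, 4900), rate_per_100py(210, 5200))
print(rate_ratio(150, 4900, 210, 5200))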

    The management of acute venous thromboembolism in clinical practice. Results from the European PREFER in VTE Registry

    Venous thromboembolism (VTE) is a significant cause of morbidity and mortality in Europe. Data from real-world registries are necessary, as clinical trials do not represent the full spectrum of VTE patients seen in clinical practice. We aimed to document the epidemiology, management and outcomes of VTE using data from a large, observational database. PREFER in VTE was an international, non-interventional disease registry conducted between January 2013 and July 2015 in primary and secondary care across seven European countries. Consecutive patients with acute VTE were documented and followed up over 12 months. PREFER in VTE included 3,455 patients with a mean age of 60.8 ± 17.0 years. Overall, 53.0% were male. The majority of patients were assessed in the hospital setting as inpatients or outpatients (78.5%). The diagnosis was deep-vein thrombosis (DVT) in 59.5% and pulmonary embolism (PE) in 40.5%. The most common comorbidities were the various types of cardiovascular disease (excluding hypertension; 45.5%), hypertension (42.3%) and dyslipidaemia (21.1%). Following the index VTE, a large proportion of patients received initial therapy with heparin (73.2%), almost half received a vitamin K antagonist (48.7%) and nearly a quarter received a direct oral anticoagulant (DOAC) (24.5%). Almost a quarter of all presentations were for recurrent VTE, with >80% of previous episodes having occurred more than 12 months before baseline. In conclusion, PREFER in VTE has provided contemporary insights into VTE patients and their real-world management, including their baseline characteristics, risk factors, disease history, symptoms and signs, initial therapy and outcomes.