
    Cooperation in a Repeated Public Goods Game with a Probabilistic Endpoint

    Our experiment implements a multiple-round public goods game with a probabilistic endpoint. This changes the Nash equilibrium, such that cooperation becomes an equilibrium strategy. The experiment consists of two treatments: one with a single round per session (the intertemporal treatment) and one with multiple rounds per session. Experimental results suggest that contributions were indeed positive and consistent provided a sufficiently high probability of the game's continuation, but declined when that probability fell.
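The continuation-probability logic behind this design can be sketched numerically. A probabilistic endpoint acts like a discount factor: under a grim-trigger strategy, cooperation is sustainable only if the one-shot gain from defecting is outweighed by the expected discounted future losses. The payoff values below are illustrative, not the experiment's actual parameters:

```python
def min_continuation_prob(defect_gain, per_round_loss):
    """Smallest continuation probability p sustaining cooperation
    under grim trigger (illustrative, not the paper's parameters).

    Deviating pays defect_gain once, then costs per_round_loss in every
    future round, each reached with probability p; the expected future
    loss is per_round_loss * p / (1 - p).  Cooperation holds iff
    defect_gain <= per_round_loss * p / (1 - p),
    i.e. p >= defect_gain / (defect_gain + per_round_loss).
    """
    return defect_gain / (defect_gain + per_round_loss)

# If defecting gains 2 now but loses 3 in each future round,
# cooperation survives whenever the continuation probability p >= 0.4.
print(min_continuation_prob(2, 3))  # → 0.4
```

This is why, consistent with the result above, contributions collapse once the continuation probability drops below the threshold implied by the stage-game payoffs.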

    Supporting Pharmacovigilance Signal Validation and Prioritization with Analyses of Routinely Collected Health Data: Lessons Learned from an EHDEN Network Study

    Introduction: Individual case reports are the main asset in pharmacovigilance signal management. Signal validation is the first stage after signal detection and aims to determine if there is sufficient evidence to justify further assessment. Throughout signal management, a prioritization of signals is continually made. Routinely collected health data can provide relevant contextual information but are primarily used at a later stage in pharmacoepidemiological studies to assess communicated signals. Objective: The aim of this study was to examine the feasibility and utility of analysing routine health data from a multinational distributed network to support signal validation and prioritization and to reflect on key user requirements for these analyses to become an integral part of this process. Methods: Statistical signal detection was performed in VigiBase, the WHO global database of individual case safety reports, targeting generic manufacturer drugs and 16 prespecified adverse events. During a 5-day study-a-thon, signal validation and prioritization were performed using information from VigiBase, regulatory documents and the scientific literature alongside descriptive analyses of routine health data from 10 partners of the European Health Data and Evidence Network (EHDEN). Databases included in the study were from the UK, Spain, Norway, the Netherlands and Serbia, capturing records from primary care and/or hospitals. Results: Ninety-five statistical signals were subjected to signal validation, of which eight were considered for descriptive analyses in the routine health data. Design, execution and interpretation of results from these analyses took up to a few hours for each signal (of which 15–60 minutes were for execution) and informed decisions for five out of eight signals. 
The impact of insights from the routine health data varied and included possible alternative explanations, potential public health and clinical impact, and the feasibility of follow-up pharmacoepidemiological studies. Three signals were selected for signal assessment; two of these decisions were supported by insights from the routine health data. Standardization of analytical code, availability of adverse event phenotypes including bridges between different source vocabularies, and governance around the access and use of routine health data were identified as important aspects for future development. Conclusions: Analyses of routine health data from a distributed network to support signal validation and prioritization are feasible within the given time limits and can inform decision making. The cost–benefit of integrating these analyses at this stage of signal management requires further research.
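For context, statistical signal detection in spontaneous-report databases such as VigiBase rests on disproportionality measures computed from a 2×2 report table. VigiBase's own screening uses the Bayesian information component; the simpler proportional reporting ratio (PRR) sketched below illustrates the same disproportionality idea (a generic illustration, not the study's code):

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 report table.

    a: reports with the drug of interest and the event of interest
    b: reports with the drug, other events
    c: reports with other drugs and the event
    d: reports with other drugs, other events
    """
    return (a / (a + b)) / (c / (c + d))

# The event appears in 10% of the drug's reports but only 1% of
# all other reports: a tenfold disproportionality.
print(prr(10, 90, 100, 9900))  # → 10.0
```

Statistical signals flagged this way are exactly what the validation stage described above then weighs against regulatory documents, the literature, and routine health data.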

    Global urban environmental change drives adaptation in white clover

    Urbanization transforms environments in ways that alter biological evolution. We examined whether urban environmental change drives parallel evolution by sampling 110,019 white clover plants from 6169 populations in 160 cities globally. Plants were assayed for a Mendelian antiherbivore defense that also affects tolerance to abiotic stressors. Urban-rural gradients were associated with the evolution of clines in defense in 47% of cities throughout the world. Variation in the strength of clines was explained by environmental changes in drought stress and vegetation cover that varied among cities. Sequencing 2074 genomes from 26 cities revealed that the evolution of urban-rural clines was best explained by adaptive evolution, but the degree of parallel adaptation varied among cities. Our results demonstrate that urbanization leads to adaptation at a global scale.

    Association between overweight and obesity and risk of clinically diagnosed knee, hip, and hand osteoarthritis: A population-based cohort study

    Objective: Previous cohorts have reported associations between overweight/obesity and knee and hand osteoarthritis (OA). However, no data on the effect of these on the OA burden are available. We aimed to analyse the effect of overweight and obesity on the incidence of routinely diagnosed knee, hip, and hand OA. Methods: Design: population-based cohort. Setting: primary care records from the SIDIAP database (>5.5 million subjects), covering >80% of the population of Catalonia, Spain. Participants: subjects ≥40 years old with no OA on 01/01/2006 and with body mass index (BMI) data available. Follow-up: from 01/01/2006 to 31/12/2010, loss to follow-up, or death. Measures: BMI World Health Organization categories (exposure), and incident clinical diagnoses of knee, hip, or hand OA (ICD-10 codes). Results: 1,764,061 subjects were observed for a median (inter-quartile range) of 4.45 (4.19 to 4.98) years. Incidence rates (per 1,000 person-years) of knee, hip, and hand OA ranged from 3.7 (3.6 to 3.8), 1.7 (1.7 to 1.8), and 2.6 (2.5 to 2.7) amongst the normal-weight, to 19.5 (19.1 to 19.9), 3.8 (3.7 to 4.0), and 4.0 (3.9 to 4.2) in the grade II obese, respectively. Compared to normal-weight subjects, being overweight or obese increased the risk of OA at all three sites, especially at the knee: overweight, grade I obesity, and grade II obesity increased knee OA risk 2.0-, 3.1-, and 4.7-fold, respectively. Conclusions: Both overweight and obesity increase the risk of hand, hip, and knee OA, especially the latter, with a dose-response gradient with increasing BMI.
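The incidence rates quoted above are events per 1,000 person-years of follow-up. As a quick sanity check of that unit (with made-up counts, not the study's data):

```python
def rate_per_1000py(events, person_years):
    """Incidence rate per 1,000 person-years of follow-up."""
    return 1000.0 * events / person_years

# e.g. 370 incident knee-OA diagnoses over 100,000 person-years
print(rate_per_1000py(370, 100_000))  # → 3.7
```

Person-years, rather than subject counts, are the right denominator here because each subject contributes a different length of follow-up (median 4.45 years in this cohort).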

    The role of COVID-19 vaccines in preventing post-COVID-19 thromboembolic and cardiovascular complications

    Objective To study the association between COVID-19 vaccination and the risk of post-COVID-19 cardiac and thromboembolic complications. Methods We conducted a staggered cohort study based on national vaccination campaigns using electronic health records from the UK, Spain and Estonia. Vaccine rollout was grouped into four stages with predefined enrolment periods. Each stage included all individuals eligible for vaccination, with no previous SARS-CoV-2 infection or COVID-19 vaccine at the start date. Vaccination status was used as a time-varying exposure. Outcomes included heart failure (HF), venous thromboembolism (VTE) and arterial thrombosis/thromboembolism (ATE) recorded in four time windows after SARS-CoV-2 infection: 0–30, 31–90, 91–180 and 181–365 days. Propensity score overlap weighting and empirical calibration were used to minimise observed and unobserved confounding, respectively. Fine-Gray models estimated subdistribution hazard ratios (sHR). Random effect meta-analyses were conducted across staggered cohorts and databases. Results The study included 10.17 million vaccinated and 10.39 million unvaccinated people. Vaccination was associated with reduced risks of acute (30-day) and post-acute COVID-19 VTE, ATE and HF: for example, meta-analytic sHR of 0.22 (95% CI 0.17 to 0.29), 0.53 (0.44 to 0.63) and 0.45 (0.38 to 0.53), respectively, for 0–30 days after SARS-CoV-2 infection, while in the 91–180 days sHR were 0.53 (0.40 to 0.70), 0.72 (0.58 to 0.88) and 0.61 (0.51 to 0.73), respectively. Conclusions COVID-19 vaccination reduced the risk of post-COVID-19 cardiac and thromboembolic outcomes. These effects were more pronounced for acute COVID-19 outcomes, consistent with known reductions in disease severity following breakthrough versus unvaccinated SARS-CoV-2 infection.
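Propensity score overlap weighting, used above to balance vaccinated and unvaccinated cohorts, weights each subject by the estimated probability of receiving the other group's exposure. A minimal sketch of the weighting rule (the general method, not the study's code):

```python
def overlap_weight(propensity, vaccinated):
    """Overlap weight for one subject.

    Treated (vaccinated) subjects get 1 - e(x); controls get e(x),
    where e(x) is the estimated propensity of being vaccinated.
    Subjects whose covariates make either exposure plausible
    (e(x) near 0.5) carry the most weight, and extreme propensities
    are smoothly down-weighted, so no trimming is needed.
    """
    return 1.0 - propensity if vaccinated else propensity

# A subject almost certain to be vaccinated contributes little to the
# vaccinated arm but, if found unvaccinated, is highly informative.
print(overlap_weight(0.9, True))
print(overlap_weight(0.9, False))
```

After weighting, the Fine-Gray models mentioned above estimate subdistribution hazard ratios in the balanced pseudo-population.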

    Real-Life and RCT Participants: Alendronate Users Versus FITs' Trial Eligibility Criterion

    We aimed to characterize incident users of alendronate from Denmark and Spain, and to investigate their eligibility for participation in the pivotal Fracture Intervention Trial (FIT). This is an international cross-sectional study, with data obtained from the SIDIAP database (Sistema d'Informació per al Desenvolupament de la Investigació en Atenció Primària) from Catalonia (Spain) and the Danish Health Registries (DHR). The study included patients who were incident users of alendronate, ≥40 years old, with no history of Paget's disease. Our measurement was the proportion of incident users of alendronate who were not eligible to participate in FIT. 14,316 and 21,221 subjects initiated alendronate in 2006-2007 (SIDIAP) and 2005-2006 (DHR), respectively. The SIDIAP and DHR alendronate user cohorts had 2347 (16.4%) and 5275 (24.9%) subjects aged >80 years, and reported 9 (0.1%) and 91 (0.4%) diagnoses of myocardial infarction, 423 (3%) and 368 (1.7%) of erosive gastro-intestinal disease, 200 (1.4%) and 1109 (5.2%) of dyspepsia, and 349 (2.4%) and 149 (0.7%) of metabolic bone disease, all of which were exclusion criteria in FIT. Men [3818 (26.7%) in SIDIAP and 3885 (18.3%) in DHR] and glucocorticoid users [1229 (8.6%) in SIDIAP and 4716 (22.2%) in DHR] were also excluded from the FIT trial. Overall, 3447 (35.4%) SIDIAP and 6228 (44.5%) DHR incident alendronate users would have been excluded from FIT (when not considering men and glucocorticoid users). One in two real-life users of alendronate exhibited one or more clinical characteristics that would have led to them being excluded from the FIT trial.
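The eligibility comparison above amounts to checking each real-world user against the trial's exclusion list. A schematic of that screen (the flag names are invented for illustration and mirror, but do not reproduce, the FIT criteria listed in the abstract):

```python
# Hypothetical exclusion flags loosely mirroring the FIT criteria above.
FIT_EXCLUSIONS = {
    "age_over_80", "myocardial_infarction", "erosive_gi_disease",
    "dyspepsia", "metabolic_bone_disease", "male", "glucocorticoid_use",
}

def excluded_from_fit(patient_flags):
    """True if the patient has any characteristic on the exclusion list."""
    return bool(set(patient_flags) & FIT_EXCLUSIONS)

print(excluded_from_fit({"male", "hypertension"}))  # → True
print(excluded_from_fit({"hypertension"}))          # → False
```

Applied cohort-wide, the fraction of patients for whom this returns True is the study's headline measure: the proportion of real-life users a pivotal trial would never have enrolled.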

    Pre-school obesity is associated with an increased risk of childhood fracture: A longitudinal cohort study of 466,997 children and up to 11 years of follow-up in Catalonia, Spain

    This study aimed to determine if having an overweight or obese range body mass index (BMI) at time of beginning school is associated with increased fracture incidence in childhood. A dynamic cohort was created from children presenting for routine preschool primary care screening, collected in the Information System for Research in Primary Care (SIDIAP) platform in Catalonia, Spain. Data were collected from 296 primary care centers representing 74% of the regional pediatric population. A total of 466,997 children (48.6% female) with a validated weight and height measurement within routine health care screening at age 4 years (±6 months) between 2006 and 2013 were included, and followed up to the age of 15, migration out of region, death, or until December 31, 2016. BMI was calculated at age 4 years and classified using WHO growth tables, and fractures were identified using previously validated ICD10 codes in electronic primary care records, divided by anatomical location. Actuarial lifetables were used to calculate cumulative incidence. Cox regression was used to investigate the association of BMI category and fracture risk with adjustment for socioeconomic status, age, sex, and nationality. Median follow‐up was 4.90 years (interquartile range [IQR] 2.50 to 7.61). Cumulative incidence of any fracture during childhood was 9.20% (95% confidence interval [CI] 3.79% to 14.61%) for underweight, 10.06% (9.82% to 10.29%) for normal weight, 11.28% (10.22% to 12.35%) for overweight children, and 13.05% (10.69% to 15.41%) for children with obesity. Compared with children of normal range weight, having an overweight and obese range BMI was associated with an excess risk of lower limb fracture (adjusted hazard ratio [HR] = 1.42 [1.26 to 1.59]; 1.74 [1.46 to 2.06], respectively) and upper limb fracture (adjusted HR = 1.10 [1.03 to 1.17]; 1.19 [1.07 to 1.31]). 
Overall, preschool children with an overweight or obese range BMI had increased incidence of upper and lower limb fractures in childhood compared with contemporaries of normal weight. © 2020 American Society for Bone and Mineral Research
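The cumulative incidence figures above come from actuarial life tables, which chain per-interval risks while crediting censored children with half an interval at risk. A minimal sketch of that estimator (the generic actuarial method, not the study's code; counts are invented):

```python
def actuarial_cumulative_incidence(intervals):
    """Cumulative incidence from (events, at_risk, censored) per interval.

    Each interval's risk uses the actuarial effective denominator
    n - c/2, which assumes censored subjects leave mid-interval
    on average.  Risks are chained multiplicatively.
    """
    surviving = 1.0
    for events, at_risk, censored in intervals:
        risk = events / (at_risk - censored / 2.0)
        surviving *= 1.0 - risk
    return 1.0 - surviving

# Two yearly intervals: 5 fractures among 100 children (10 censored),
# then 4 fractures among the 85 remaining (6 censored).
print(actuarial_cumulative_incidence([(5, 100, 10), (4, 85, 6)]))
```

The Cox regression reported above then compares these fracture hazards across BMI categories with adjustment for the listed covariates.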

    Factors Affecting Fruit Quality in Strawberries Grown in a Subtropical Environment

    Variations of strawberry fruit quality due to year, harvest date, cultivar, and nutrition were studied in Tucumán, Argentina. ‘Camarosa’, ‘Camino Real’, ‘Candonga’, and ‘Ventana’ were fertilized with “C” (control: 120 kg N/ha, 70 kg P2O5/ha, 220 kg K2O/ha, 40 kg CaO/ha, and 20 kg MgO/ha) and “1.5C” in 2006; “C”, “1.33C”, and “1.66C” in 2007; and “C”, “1.5C”, and “2C” in 2008, without affecting fruit quality. Fruit weight decreased with temperature and crop load. Firmness decreased with rainfall and temperature. Redness increased with temperature and decreased with rainfall. Soluble solids content decreased with crop load. Titratable acidity increased with temperature.
    EEA Famaillá
    Fil: Agüero, Juan José. Instituto Nacional de Tecnología Agropecuaria (INTA). Estación Experimental Agropecuaria Famaillá. División Frutihorticultura; Argentina
    Fil: Salazar, Sergio Miguel. Instituto Nacional de Tecnología Agropecuaria (INTA). Estación Experimental Agropecuaria Famaillá. División Frutihorticultura; Argentina
    Fil: Kirschbaum, Daniel Santiago. Instituto Nacional de Tecnología Agropecuaria (INTA). Estación Experimental Agropecuaria Famaillá. División Frutihorticultura; Argentina
    Fil: Jerez, Elena Fanny. Instituto Nacional de Tecnología Agropecuaria (INTA). Estación Experimental Agropecuaria Famaillá. División Frutihorticultura; Argentina