68 research outputs found

    An Economic Assessment of Smokefree Restaurant Establishments in Tennessee: Implications for Other Smoking Establishments

    In 2007 Tennessee enacted and implemented the Nonsmoker Protection Act (NSPA) to protect nonsmokers by creating 100% smoke-free restaurants. Several venues were exempted, including age-restricted ones such as bars, and local tobacco regulation was preempted. Thus, the NSPA is not an equitable smoke-free policy (SFP): it has left vast segments of nonsmokers, such as employees and patrons of bars, unprotected from second-hand smoke (SHS) exposure and has thwarted any local initiative to pursue 100% comprehensive SFPs. This predisposes these nonsmokers to the health dangers of SHS exposure and makes the NSPA incompatible with the objectives of Healthy People 2020 and 2030 as well as the goals of the state health plan. In 2021, the American Lung Association graded the NSPA “C,” and the United Health Foundation ranked it 42nd out of 50 states. This project assessed the effects of smoke-free venues across different economic domains through quantitative and qualitative data review to determine the implications for venues exempted by the NSPA. To delineate any economic effects of SFP across several economic domains, quantitative data gleaned from NAICS, the Census Bureau, and the Tennessee Dept. of Revenue were supplemented with interviews of establishments in Tennessee that voluntarily transitioned to smoke-free environments. A total of 7 such establishments, with capacities ranging from 50 to over 69,000 people and employee counts ranging from 6 to over 1,300, were interviewed. Smoke-free environments were found to have positive economic effects on restaurant establishments in Tennessee. By focusing on the SFP effect on restaurant establishments, the findings can be extrapolated to support the case for 100% smoke-free environments in other hospitality locations such as bars, music venues, and casinos. After analysis of trends for retail sales, number of establishments, employment, and payrolls by size of establishment and Metropolitan Statistical Area, a positive economic effect was identified for the majority of these indicators between 2010 and 2019, the 10-year period following restaurants becoming smoke-free. Highlights include: retail sales in Tennessee eating and drinking establishments increased by 62%; the number of restaurant establishments increased by 16%; and employment in the restaurant sector increased by 23%. The qualitative data from the interviews reinforce these findings, with 100% of respondents supporting smoke-free age-restricted venues in their local communities. Thus, it can be inferred from these Tennessee-specific data with a high degree of confidence that other hospitality venues will benefit economically in some way by becoming smoke-free, with the following considerations: they provide protection from SHS exposure and its health risks to nonsmokers; they do not adversely affect sales or employment in the hospitality, entertainment, or sport industries, including bars, hotels and motels, and restaurants; and they have strong public support and compliance.
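
    The trend analysis above reduces, for each indicator, to a percent change between the endpoints of the 2010-2019 window. A minimal sketch of that arithmetic is given below; the function and the example dollar figures are illustrative assumptions, not values from the NAICS, Census Bureau, or Tennessee Dept. of Revenue data.

```python
def percent_change(start: float, end: float) -> float:
    """Percent change of an economic indicator between the first and last year of the window."""
    return (end - start) / start * 100.0

# Illustrative figures only (not actual Tennessee retail-sales data): a value that
# grows from $10.0B in 2010 to $16.2B in 2019 corresponds to a 62% increase.
print(f"{percent_change(10.0e9, 16.2e9):.0f}%")
```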

    Resolving issues with environmental impact assessment of marine renewable energy installations

    Growing concerns about climate change and energy security have fueled a rapid increase in the development of marine renewable energy installations (MREIs). The potential ecological consequences of increased use of these devices emphasize the need for high-quality environmental impact assessment (EIA). We demonstrate that these processes are hampered severely, primarily because ambiguities in the legislation and a lack of clear implementation guidance mean they do not ensure robust assessment of the significance of impacts and cumulative effects. We highlight why the regulatory framework leads to conceptual ambiguities and propose changes which, for the most part, do not require major adjustments to standard practice. We emphasize the importance of determining the degree of confidence in impacts to permit the likelihood as well as the magnitude of impacts to be quantified, and propose ways in which assessment of population-level impacts could be incorporated into the EIA process. Overall, however, we argue that, instead of trying to ascertain which particular developments are responsible for tipping an already heavily degraded marine environment into an undesirable state, emphasis should be placed on better strategic assessment.

    Periprosthetic Joint Infection After Total Knee Arthroplasty With or Without Antibiotic Bone Cement.

    IMPORTANCE Despite increased use of antibiotic-loaded bone cement (ALBC) in joint arthroplasty over recent decades, current evidence for prophylactic use of ALBC to reduce the risk of periprosthetic joint infection (PJI) is insufficient. OBJECTIVE To compare the rate of revision attributed to PJI following primary total knee arthroplasty (TKA) using ALBC vs plain bone cement. DESIGN, SETTING, AND PARTICIPANTS This international cohort study used data from 14 national or regional joint arthroplasty registries in Australia, Denmark, Finland, Germany, Italy, New Zealand, Norway, Romania, Sweden, Switzerland, the Netherlands, the UK, and the US. The study included primary TKAs for osteoarthritis registered from January 1, 2010, to December 31, 2020, and followed up until December 31, 2021. Data analysis was performed from April to September 2023. EXPOSURE Primary TKA with ALBC vs plain bone cement. MAIN OUTCOMES AND MEASURES The primary outcome was the risk of 1-year revision for PJI. Using a distributed data network analysis method, data were harmonized, a cumulative revision rate (1 - Kaplan-Meier) was calculated, and Cox regression analyses were performed within the 10 registries using both cement types. A meta-analysis was then performed to combine all aggregated data and evaluate the risk of 1-year revision for PJI and all causes. RESULTS Among 2,168,924 TKAs included, 93% were performed with ALBC. Most TKAs were performed in female patients (59.5%) and patients aged 65 to 74 years (39.9%), were fully cemented (92.2%), and were performed in the 2015 to 2020 period (62.5%). All participating registries reported a cumulative 1-year revision rate for PJI of less than 1% following primary TKA with ALBC (range, 0.21%-0.80%) and with plain bone cement (range, 0.23%-0.70%). The meta-analyses based on adjusted Cox regression for 1,917,190 TKAs showed no statistically significant difference at 1 year in the risk of revision for PJI (hazard rate ratio, 1.16; 95% CI, 0.89-1.52) or for all causes (hazard rate ratio, 1.12; 95% CI, 0.89-1.40) among TKAs performed with ALBC vs plain bone cement. CONCLUSIONS AND RELEVANCE In this study, the risk of revision for PJI was similar between ALBC and plain bone cement following primary TKA. Any additional costs of ALBC and its relative value in reducing revision risk should be considered in the context of the overall health care delivery system.
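
    The registry analysis described here combines two generic building blocks: a cumulative revision rate computed as 1 - Kaplan-Meier, and an inverse-variance meta-analysis of the registry-level Cox log hazard ratios. The sketch below illustrates both on toy inputs; the function names and numbers are assumptions for illustration, not the harmonized registry data or the authors' code.

```python
import numpy as np

def cumulative_revision_rate(times, revised, horizon):
    """1 - Kaplan-Meier survival at `horizon`; revision for PJI is the event, other exits are censored."""
    order = np.argsort(times)
    times, revised = np.asarray(times)[order], np.asarray(revised)[order]
    at_risk, surv = len(times), 1.0
    for t, event in zip(times, revised):
        if t > horizon:
            break
        if event:
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return 1.0 - surv

def pool_log_hazard_ratios(log_hrs, ses):
    """Fixed-effect inverse-variance meta-analysis of per-registry log hazard ratios."""
    log_hrs, w = np.asarray(log_hrs), 1.0 / np.square(ses)
    pooled = np.sum(w * log_hrs) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

# Toy inputs only: follow-up years with a revision indicator, and three registries'
# adjusted log hazard ratios (ALBC vs plain cement) with their standard errors.
crr = cumulative_revision_rate([0.2, 0.5, 0.9, 1.5, 2.0], [1, 0, 1, 0, 0], horizon=1.0)
hr, lo, hi = pool_log_hazard_ratios(np.log([1.10, 1.25, 1.05]), [0.15, 0.20, 0.12])
print(f"1-year cumulative revision rate: {crr:.1%}; pooled HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```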

    The use of antibiotic-loaded bone cement and systemic antibiotic prophylactic use in 2,971,357 primary total knee arthroplasties from 2010 to 2020: an international register-based observational study among countries in Africa, Europe, North America, and Oceania.

    BACKGROUND AND PURPOSE Antibiotic-loaded bone cement (ALBC) and systemic antibiotic prophylaxis (SAP) have been used to reduce periprosthetic joint infection (PJI) rates. We investigated the use of ALBC and SAP in primary total knee arthroplasty (TKA). PATIENTS AND METHODS This observational study is based on 2,971,357 primary TKAs reported in 2010-2020 to national/regional joint arthroplasty registries in Australia, Denmark, Finland, Germany, Italy, the Netherlands, New Zealand, Norway, Romania, South Africa, Sweden, Switzerland, the UK, and the USA. Aggregate-level data on trends and types of bone cement, antibiotic agents, and doses and duration of SAP used were extracted from participating registries. RESULTS ALBC was used in 77% of the TKAs, with variation ranging from 100% in Norway to 31% in the USA. Palacos R+G was the most common (62%) ALBC type used. The primary antibiotic used in ALBC was gentamicin (94%). Use of ALBC in combination with SAP was common practice (77%). Cefazolin was the most common (32%) SAP agent. The doses and duration of SAP varied from a single preoperative dose as standard practice in Bolzano, Italy (98%) to 4 doses over 1 day in Norway (83% of the 40,709 TKAs reported to the Norwegian arthroplasty register). CONCLUSION The proportion of ALBC usage in primary TKA varies internationally, with gentamicin being the most common antibiotic. ALBC in combination with SAP was common practice, with cefazolin the most common SAP agent. The type of ALBC and the type, dose, and duration of SAP varied among participating countries.
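
    As a minimal illustration of how aggregate-level registry data yield the overall proportions reported here, the sketch below pools a hypothetical set of per-registry TKA and ALBC counts; the registry names and counts are invented placeholders, not the study's data.

```python
# Hypothetical per-registry counts (not the study's data), showing how a pooled
# ALBC-use proportion is obtained from aggregate registry totals.
registries = {
    "Registry A": {"tka": 600_000, "albc": 186_000},   # low ALBC use
    "Registry B": {"tka": 120_000, "albc": 120_000},   # universal ALBC use
    "Registry C": {"tka": 250_000, "albc": 212_500},
}
total_tka = sum(r["tka"] for r in registries.values())
total_albc = sum(r["albc"] for r in registries.values())
print(f"Pooled ALBC use: {100 * total_albc / total_tka:.0f}% of {total_tka:,} primary TKAs")
```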

    Machining-induced thermal damage in cortical bone: necrosis and micro-mechanical integrity

    In bone cutting, the tissue is exposed to necrosis due to temperature elevation, which can significantly influence postoperative results in orthopaedic surgeries. This damage is usually revealed through histological analysis to show the necrotic extent; however, this technique does not capture mechanical damage, which is essential for a full material integrity assessment. Here, using micro-mechanics, it is demonstrated that machining-induced damage in bone extends beyond the necrotic region. Drilling under different conditions was performed on ex-vivo bovine cortical bone, inducing different degrees of damage. Micro-pillar compression tests were performed in the machined sub-surface to identify changes in properties and failure modes caused by drilling. It was revealed that at high cutting temperatures, the bone near the machined surface exhibits lower modulus (−42%) and strength (−41%) and brittle behaviour, whereas the bulk bone remains undamaged, with pristine properties and ductile behaviour. Histology was also performed to evaluate necrosis and, surprisingly, it was found that the brittle and weaker bone layer is more than three times larger than the necrotic layer, clearly showing that the thermo-mechanical effect of drilling can affect the bone not only biologically but also micro-mechanically. Consequently, these results reveal another kind of bone damage that has so far been neglected.
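
    Micro-pillar compression reduces a load-displacement record to an engineering stress-strain curve, from which an apparent modulus and a strength are extracted; this is the kind of reduction behind the -42% modulus and -41% strength figures. The sketch below shows that processing in generic form, with synthetic pillar dimensions and data that are assumptions for illustration, not the measured bovine-bone records.

```python
import numpy as np

def pillar_stress_strain(force_N, displacement_um, diameter_um, height_um):
    """Convert a micro-pillar load-displacement record to engineering stress (MPa) and strain."""
    area_um2 = np.pi * (diameter_um / 2.0) ** 2
    stress_MPa = np.asarray(force_N) / area_um2 * 1e6   # 1 N/um^2 = 1e6 MPa
    strain = np.asarray(displacement_um) / height_um
    return stress_MPa, strain

def modulus_and_strength(stress_MPa, strain, elastic_window=(0.002, 0.01)):
    """Apparent modulus (GPa) from a linear fit of the initial loading segment; strength as peak stress (MPa)."""
    mask = (strain >= elastic_window[0]) & (strain <= elastic_window[1])
    slope_MPa = np.polyfit(strain[mask], stress_MPa[mask], 1)[0]
    return slope_MPa / 1000.0, float(np.max(stress_MPa))

# Synthetic curve standing in for a measured record (values are illustrative only):
# linear loading at ~20 GPa that plateaus at 300 MPa.
strain = np.linspace(0.0, 0.05, 200)
stress = np.minimum(20_000.0 * strain, 300.0)
E_GPa, sigma_u = modulus_and_strength(stress, strain)
print(f"Apparent modulus ~{E_GPa:.1f} GPa, strength ~{sigma_u:.0f} MPa")
```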

    Developmental trajectories of cerebral blood flow and oxidative metabolism at baseline and during working memory tasks

    The neurobiological interpretation of developmental BOLD fMRI findings remains difficult due to the confounding issues of a potentially varied baseline of brain function and varied strength of neurovascular coupling across age groups. The central theme of the present research is to study the development of brain function and neuronal activity through in vivo assessments of cerebral blood flow (CBF), oxygen extraction fraction (OEF), and cerebral metabolic rate of oxygen (CMRO2), both at baseline and during the performance of a working memory task, in a cohort of typically developing children aged 7 to 18 years. Using a suite of 4 emerging MRI technologies, including MR blood oximetry, phase-contrast MRI, pseudo-continuous arterial spin labeling (pCASL) perfusion MRI, and concurrent CBF/BOLD fMRI, we found: 1) at baseline, both global CBF and CMRO2 showed an age-related decline while global OEF was stable across the age range; 2) during the working memory task, neither BOLD nor CBF responses showed significant variations with age in the activated fronto-parietal brain regions. Nevertheless, detailed voxel-wise analyses revealed sub-regions within the activated fronto-parietal regions that show a significant decline of fractional CMRO2 responses with age. These findings suggest that the brain may become more "energy efficient" with age during development.
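
    The three measured quantities are linked by the Fick principle, CMRO2 = CaO2 × CBF × OEF, which is the standard way to derive oxidative metabolism from blood oximetry and perfusion measurements. The sketch below encodes that relationship using an assumed typical arterial oxygen content rather than subject-level values.

```python
def cmro2(cbf_ml_per_100g_min: float, oef: float, cao2_umol_per_ml: float = 8.5) -> float:
    """Fick principle: CMRO2 = CaO2 * CBF * OEF, in umol O2 / 100 g / min.

    cao2_umol_per_ml is an assumed typical arterial oxygen content, not a measured value.
    """
    return cao2_umol_per_ml * cbf_ml_per_100g_min * oef

# Illustrative global values (not subject data): CBF 60 mL/100g/min, OEF 0.35.
print(f"CMRO2 ~= {cmro2(60.0, 0.35):.0f} umol/100g/min")
```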

    Evaluating the Effects of SARS-CoV-2 Spike Mutation D614G on Transmissibility and Pathogenicity.

    Global dispersal and increasing frequency of the SARS-CoV-2 spike protein variant D614G are suggestive of a selective advantage but may also be due to a random founder effect. We investigate the hypothesis for positive selection of spike D614G in the United Kingdom using more than 25,000 whole genome SARS-CoV-2 sequences. Despite the availability of a large dataset, well represented by both spike 614 variants, not all approaches showed a conclusive signal of positive selection. Population genetic analysis indicates that 614G increases in frequency relative to 614D in a manner consistent with a selective advantage. We do not find any indication that patients infected with the spike 614G variant have higher COVID-19 mortality or clinical severity, but 614G is associated with higher viral load and younger age of patients. Significant differences in growth and size of 614G phylogenetic clusters indicate a need for continued study of this variant
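
    One standard population-genetic way to ask whether 614G grows faster than a neutral founder effect would allow is to fit a logistic growth model to its frequency over time, with the fitted slope interpreted as a growth (selection) advantage per unit time. The sketch below fits such a model to synthetic weekly frequencies; it is a generic illustration of the approach, not the authors' phylogenetic or clinical analyses.

```python
import numpy as np

def fit_logistic_growth(days, freq_614g):
    """Fit logit(p) = a + s*t by least squares; s > 0 indicates a per-day growth advantage of 614G."""
    p = np.clip(np.asarray(freq_614g, dtype=float), 1e-4, 1 - 1e-4)
    logit = np.log(p / (1.0 - p))
    s, a = np.polyfit(np.asarray(days, dtype=float), logit, 1)
    return s, a

# Synthetic weekly 614G frequencies (illustrative only, not the UK sequence data).
days = np.arange(0, 70, 7)
freq = [0.10, 0.15, 0.22, 0.30, 0.42, 0.55, 0.63, 0.72, 0.80, 0.86]
s, _ = fit_logistic_growth(days, freq)
print(f"Estimated logistic growth advantage: {s:.3f} per day")
```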

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study

    Get PDF
    Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1 infected, ART-treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART-treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miRs -21, -122 and -200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miRs -31, -150 and -223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found with any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality or of the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.
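
    The statistical workflow described (logistic regression of mortality on miRNA abundance, plus correlations with soluble biomarkers such as IL-6 and D-dimer) can be sketched generically as below; the simulated arrays are placeholders, not the trial measurements, and the study's exact specification (e.g., handling of the case-control matching) is not reproduced.

```python
import numpy as np
import scipy.stats as st
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Placeholder data for 373 participants (126 cases + 247 controls): log-relative
# miR-21 abundance, IL-6 level, and case status (1 = died). Not the trial data.
mir21 = rng.normal(size=373)
il6 = 0.4 * mir21 + rng.normal(size=373)
died = rng.binomial(1, 126 / 373, size=373)

# Spearman correlation between a miRNA and a soluble biomarker of immune activation.
rho, p = st.spearmanr(mir21, il6)
print(f"miR-21 vs IL-6: rho={rho:.2f}, p={p:.3g}")

# Logistic regression of mortality on miRNA abundance (unconditional sketch).
model = sm.Logit(died, sm.add_constant(mir21)).fit(disp=0)
print("Odds ratios (intercept, miR-21):", np.exp(model.params))
```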

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the following working groups: DAD Study Grp, Royal Free Hosp Clin Cohort, INSIGHT Study Grp, SMART Study Grp, and ESPRIT Study Grp. Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as a confirmed (>3 mo apart) eGFR <= 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with higher risk in the medium and high risk groups (risk score >= 5 in the high risk group, 505 events). Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
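
    Two computations in this abstract can be made concrete: the risk score sums scaled points derived from the adjusted incidence rate ratios of the nine predictors, and the number needed to harm (NNTH) is the reciprocal of the absolute increase in 5-year CKD risk when a potentially nephrotoxic antiretroviral is added. The sketch below shows both in generic form; the point values and example risks are hypothetical, not the published D:A:D coefficients.

```python
def risk_score(points, patient):
    """Sum the scaled points (derived from adjusted incidence rate ratios) for the predictors a patient has."""
    return sum(points[k] for k, present in patient.items() if present)

def nnth(risk_without_drug, risk_with_drug):
    """Number needed to harm over 5 years = 1 / absolute increase in 5-year CKD risk."""
    return 1.0 / (risk_with_drug - risk_without_drug)

# Hypothetical point values and risks (not the published D:A:D coefficients).
points = {"older_age": 4, "hepatitis_c": 1, "hypertension": 1}
patient = {"older_age": True, "hepatitis_c": False, "hypertension": True}
print("Risk score:", risk_score(points, patient))

# A baseline 5-year risk of 1/393 rising to ~0.31% on a nephrotoxic regimen gives an NNTH of roughly 1,700.
print("NNTH:", round(nnth(1 / 393, 0.00313)))
```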