
    Potential impact of the 2019 ACC/AHA guidelines on the primary prevention of cardiovascular disease on the inappropriate routine use of aspirin and aspirin use without a recommended indication for primary prevention of cardiovascular disease in cardiology practices: Insights from the NCDR PINNACLE registry

    Background: Aspirin is recommended for secondary prevention in patients with atherosclerotic cardiovascular disease. In patients without atherosclerotic cardiovascular disease who are not at high 10-year risk, there is no evidence that aspirin reduces adverse cardiovascular events, and it may increase bleeding. The 2019 American College of Cardiology/American Heart Association Guidelines on the Primary Prevention of Cardiovascular Disease state that aspirin may be considered for primary prevention (class IIb) in patients 40 to 70 years of age who are at higher risk of atherosclerotic cardiovascular disease, and that routine use of aspirin should be avoided (class III: Harm) in patients >70 years. We examined the frequency of patients on aspirin for primary prevention whose use would be considered unindicated or potentially harmful under the recent guideline, and for whom aspirin discontinuation may therefore be beneficial. Methods: To assess the potential impact, we studied 855 366 patients from 400 practices in the National Cardiovascular Data Registry Practice Innovation and Clinical Excellence (PINNACLE) Registry with encounters between January 1, 2018 and March 31, 2019 who were receiving aspirin for primary prevention. We defined inappropriate use as aspirin use in patients <40 or >70 years of age, and use without a recommended indication as aspirin use in patients 40 to 70 years of age with low, borderline, or intermediate 10-year atherosclerotic cardiovascular disease risk. Frequencies of inappropriate use and use without a recommended indication were calculated, and practice-level variation was evaluated using the median rate ratio. Results: Inappropriate use occurred in 27.6% (193 674/701 975) and use without a recommended indication in 26.0% (31 810/122 507), with significant practice-level variation in inappropriate use (predicted median practice-level rate 33.5%; interquartile range, 24.1% to 40.8%; median rate ratio, 1.71 [95% CI, 1.67-1.76]). Conclusions: Immediately before the 2019 American College of Cardiology/American Heart Association Guidelines on the Primary Prevention of Cardiovascular Disease, over one-fourth of patients in this national registry were receiving aspirin for primary prevention inappropriately or without a recommended indication, with significant practice-level variation. These findings help quantify the potential impact of the guideline recommendations on contemporary use of aspirin for primary prevention.
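
    The classification used above can be illustrated with a short, hedged sketch. This is not the registry's analysis code; the function name, patient fields, and risk labels are assumptions made for illustration, and only the two reported proportions are taken from the abstract.

        # Minimal sketch (not the PINNACLE registry's actual code): classifying aspirin
        # use for primary prevention into the categories described above.
        def classify_aspirin_use(age: int, ascvd_risk: str) -> str:
            """Return the guideline category for a patient on aspirin for primary prevention.

            ascvd_risk is the 10-year ASCVD risk category:
            'low', 'borderline', 'intermediate', or 'high' (illustrative labels).
            """
            if age < 40 or age > 70:
                return "inappropriate"            # >70 is class III: Harm; <40 is outside the IIb range
            if ascvd_risk in ("low", "borderline", "intermediate"):
                return "no recommended indication"
            return "may be considered"            # class IIb: age 40-70 at higher ASCVD risk

        # The reported frequencies, reproduced as simple proportions:
        inappropriate = 193_674 / 701_975         # ~27.6%
        no_indication = 31_810 / 122_507          # ~26.0%
        print(f"Inappropriate use: {inappropriate:.1%}")
        print(f"Use without a recommended indication: {no_indication:.1%}")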

    Engaging clinicians in evidence-based policy development: the case of nursing documentation

    A lack of consistent policy direction, revealed by a review of nursing and midwifery documentation, presented researchers with an opportunity to engage clinicians in the process of evidence-based policy development. By utilising a framework informed by both practice development and the principles of evidence-based practice, clinicians were taken through an education program and a series of activities to develop their skills in discerning how research evidence and other literature can inform policy development. The clinicians' involvement maximised their investment in the final policy. They synthesised the evidence associated with nursing and midwifery documentation and produced a set of seven guiding principles that formed the basis of an area-wide policy for nursing and midwifery documentation. The strength of this approach to policy development was that the clinicians' direct involvement ensured their concerns were reflected in the policy. Difficulties in completing tasks outside meeting times were also highlighted.

    Leaching of Cryptosporidium parvum Oocysts, Escherichia coli, and a Salmonella enterica Serovar Typhimurium Bacteriophage through Intact Soil Cores following Surface Application and Injection of Slurry

    Increasing amounts of livestock manure are being applied to agricultural soil, but it is unknown to what extent this may contaminate aquatic recipients and groundwater if microorganisms are transported through the soil under natural weather conditions. The objective of this study was therefore to evaluate how injection and surface application of pig slurry on intact sandy clay loam soil cores influenced the leaching of Salmonella enterica serovar Typhimurium bacteriophage 28B, Escherichia coli, and Cryptosporidium parvum oocysts. All three microbial tracers were detected in the leachate on day 1, and the highest relative concentrations were detected on the fourth day (0.1 pore volume). Although the concentration of phage 28B declined over time, the phage was still found in leachate at day 148. C. parvum oocysts and chloride showed an additional rise in relative concentration at 0.5 pore volume, corresponding to the exchange of the total pore volume. The leaching of E. coli was delayed compared with that of the added microbial tracers, indicating a stronger attachment to slurry particles, but E. coli could still be detected up to 3 months. Leaching of phage 28B and oocysts was significantly enhanced by the injection method, whereas leaching of indigenous E. coli was not affected by the application method. Preferential flow was the primary transport pathway, and the diameter of the fractures in the intact soil cores facilitated transport of microbial tracers of all sizes under natural weather conditions.
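
    The way results are reported above, as relative concentration against pore volumes, can be expressed as a small worked sketch. This is not the study's analysis code; the function, sample values, and pore-volume figure are hypothetical, chosen only to show the calculation.

        # Minimal sketch (assumed names and numbers): expressing leachate samples as a
        # breakthrough curve of relative concentration (C/C0) versus cumulative pore volumes.
        def breakthrough_curve(samples, applied_conc, total_pore_volume_ml):
            """samples: list of (leachate_volume_ml, concentration) per sampling event.
            Returns (cumulative pore volumes, C/C0) pairs."""
            curve, cumulative_ml = [], 0.0
            for volume_ml, conc in samples:
                cumulative_ml += volume_ml
                curve.append((cumulative_ml / total_pore_volume_ml, conc / applied_conc))
            return curve

        # Hypothetical phage counts: an early peak near 0.1 pore volume, as reported above,
        # points to preferential (macropore) flow rather than slow matrix flow.
        for pv, rel_c in breakthrough_curve(
            samples=[(50, 2e3), (100, 8e4), (150, 1e4)],
            applied_conc=1e6,
            total_pore_volume_ml=1500,
        ):
            print(f"{pv:.2f} pore volumes: C/C0 = {rel_c:.1e}")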

    Anti-spike antibody response to natural SARS-CoV-2 infection in the general population

    Understanding the trajectory, duration, and determinants of antibody responses after SARS-CoV-2 infection can inform subsequent protection and the risk of reinfection; however, large-scale representative studies are limited. Here we estimated the antibody response after SARS-CoV-2 infection in the general population using representative data from 7,256 United Kingdom COVID-19 Infection Survey participants who had positive swab SARS-CoV-2 PCR tests from 26 April 2020 to 14 June 2021. A latent class model classified 24% of participants as ‘non-responders’ who did not develop anti-spike antibodies; they were older, had higher SARS-CoV-2 cycle threshold values during infection (i.e. lower viral burden), and less frequently reported any symptoms. Among those who seroconverted, using Bayesian linear mixed models, the estimated peak anti-spike IgG level was 7.3-fold higher than the level previously associated with 50% protection against reinfection, with higher peak levels in older participants and those of non-white ethnicity. The estimated anti-spike IgG half-life was 184 days, and was longer in females and those of white ethnicity. We estimate that antibody levels associated with protection against reinfection are likely to last 1.5-2 years on average, with levels associated with protection from severe infection present for several years. These estimates could inform planning of vaccination booster strategies.
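
    The reported figures can be sanity-checked with a simple decay calculation. This assumes plain exponential decay, which is a simplification of the Bayesian mixed-model estimates in the abstract; only the 7.3-fold peak and the 184-day half-life are taken from the text.

        # Back-of-the-envelope check assuming simple exponential decay of anti-spike IgG.
        import math

        peak_fold_above_threshold = 7.3   # peak level relative to the 50%-protection level
        half_life_days = 184              # estimated anti-spike IgG half-life

        days_to_threshold = half_life_days * math.log2(peak_fold_above_threshold)
        print(f"~{days_to_threshold:.0f} days (~{days_to_threshold / 365:.1f} years)")
        # ~528 days (~1.4 years) for an average peak to fall to the 50%-protection level,
        # broadly consistent with the reported 1.5-2 year average once between-person
        # variation in peak levels and half-lives is taken into account.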