Strategies for Primary Prevention of Coronary Heart Disease Based on Risk Stratification by the ACC/AHA Lipid Guidelines, ATP III Guidelines, Coronary Calcium Scoring...
Background: Several approaches have been proposed for risk stratification and primary prevention of coronary heart disease (CHD), but their comparative effectiveness and cost-effectiveness are unknown. Methods: We constructed a state-transition microsimulation model to compare multiple approaches to the primary prevention of CHD in a simulated cohort of men aged 45–75 years and women aged 55–75 years. Risk-stratification strategies included the 2013 American College of Cardiology/American Heart Association (ACC/AHA) guidelines on the treatment of blood cholesterol, the Adult Treatment Panel (ATP) III guidelines, and approaches based on coronary artery calcium (CAC) scoring and C-reactive protein (CRP). Additionally, we assessed a treat-all strategy in which all individuals were prescribed either moderate-dose or high-dose statins and all males received low-dose aspirin. Outcome measures included CHD events, costs, medication-related side effects, radiation-attributable cancers, and quality-adjusted life-years (QALYs) over a 30-year time frame. Results: Treat-all with high-dose statins dominated all other strategies for both men and women, gaining 15.7 million QALYs, preventing 7.3 million myocardial infarctions, and saving over $238 billion compared with the status quo, far outweighing its associated adverse events, including bleeding, hepatitis, myopathy, and new-onset diabetes. The ACC/AHA guidelines were more cost-effective than the ATP III guidelines for both men and women despite placing 8.7 million more people on statins. For women at low CHD risk, treat-all with high-dose statins was more likely to cause a statin-related adverse event than to prevent a CHD event. Conclusions: Despite placing a greater proportion of the population on statin therapy, the ACC/AHA guidelines are more cost-effective than ATP III. Even so, at generic prices, treating all men and women with statins and all men with low-dose aspirin appears to be more cost-effective than all risk-stratification approaches for the primary prevention of CHD. Especially for women at low CHD risk, decisions on the appropriate primary prevention strategy should be based on shared decision making between patients and healthcare providers.
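The core mechanics of a state-transition microsimulation like the one described can be sketched in a few lines: each simulated person moves year by year between health states, and a prevention strategy scales the annual event risk. This is a minimal illustrative sketch only; all rates, utilities, and the 30% risk reduction below are hypothetical placeholders, not the study's calibrated inputs, and discounting, costs, and adverse events are omitted.

```python
import random

def simulate(n, years, base_risk, risk_reduction,
             utility_well=1.0, utility_post_chd=0.8, seed=0):
    """Run an n-person cohort for `years` annual cycles.

    Each person is Well until a CHD event occurs (annual probability
    base_risk, reduced multiplicatively by the strategy), after which
    they accrue a lower annual utility. Returns per-person QALYs and
    the fraction of the cohort experiencing an event.
    """
    rng = random.Random(seed)
    qalys = events = 0
    for _ in range(n):
        had_event = False
        for _ in range(years):
            if not had_event and rng.random() < base_risk * (1 - risk_reduction):
                had_event = True
                events += 1
            qalys += utility_post_chd if had_event else utility_well
    return qalys / n, events / n

# Compare a status-quo arm with a hypothetical treat-all arm that
# cuts annual event risk by 30% (illustrative effect size).
status_quo = simulate(10_000, 30, base_risk=0.01, risk_reduction=0.0)
treat_all = simulate(10_000, 30, base_risk=0.01, risk_reduction=0.30)
print("status quo:", status_quo, "treat-all:", treat_all)
```

A full model would add death as an absorbing state, strategy costs, side-effect disutilities, and discounting; the structure above is just the per-cycle transition loop.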
National trends in emergency room diagnosis of pulmonary embolism, 2001–2010: a cross-sectional study
Background:
Little is known about the diagnosis and burden of pulmonary embolism (PE) in United States emergency departments (EDs), or about how they have evolved over the past decade. We examined nationally representative data to evaluate factors associated with, and trends in, ED diagnosis of PE.
Methods:
We conducted a cross-sectional study using National Hospital Ambulatory Medical Care Survey (NHAMCS) data from January 1, 2001 to December 31, 2010. We identified all ED patient visits where PE was diagnosed and corresponding demographic, hemodynamic, testing and disposition data. Analyses were performed using descriptive statistics and multivariable logistic regression.
Results:
During the study period, 988,000 weighted patient visits with a diagnosis of PE were identified. Among patients with an ED visit, the likelihood of a PE diagnosis increased significantly from 2001 to 2010 (odds ratio [OR] 1.091 per year; 95% confidence interval [CI] 1.034–1.152; P = 0.002 for trend) when adjusted for demographic and hospital characteristics. In contrast, when further adjusted for the use of computed tomography (CT) in the ED, the likelihood of a PE diagnosis per year did not change (OR 1.041; 95% CI 0.987–1.097; P = 0.14). Overall, 75.1% of patients with a diagnosis of PE were hemodynamically stable; 86% were admitted, with an in-hospital death rate under 3%.
Conclusions:
The proportion of ED visits with a diagnosis of PE increased significantly from 2001 to 2010 and this rise can be attributed in large part to the increased availability and use of CT. Most of these patients were admitted with low in-hospital mortality.
Keywords:
Pulmonary embolism; Emergency department; Computed tomography (CT) pulmonary angiography
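The "OR per year" trend statistic above comes from a logistic model of diagnosis on calendar year. A minimal, self-contained way to see the idea is to regress the empirical log-odds of diagnosis on year and exponentiate the slope. The data below are simulated with a true slope of 0.09 log-odds per year (OR ≈ 1.094, near the reported 1.091); this is an unadjusted single-covariate sketch, not the study's multivariable NHAMCS analysis.

```python
import math
import random

rng = random.Random(1)

# Simulated ED visits (NOT NHAMCS data): baseline log-odds -6.0,
# rising 0.09 per year, i.e. a true OR per year of exp(0.09) ~ 1.094.
counts = []
for year in range(10):                       # 0..9 stands in for 2001..2010
    p = 1 / (1 + math.exp(-(-6.0 + 0.09 * year)))
    n, k = 20_000, 0
    for _ in range(n):
        k += rng.random() < p                # Bernoulli draw for a PE diagnosis
    counts.append((year, k, n))

# Empirical-logit regression: the OLS slope of ln(p/(1-p)) on year
# estimates the log odds ratio per year (0.5 is a continuity correction).
xs = [y for y, k, n in counts]
ys = [math.log((k + 0.5) / (n - k + 0.5)) for y, k, n in counts]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(f"estimated OR per year ~ {math.exp(slope):.3f}")
```

In the study, the analogous coefficient was re-estimated with CT use as an additional covariate, which is what made the year trend disappear.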
Worldwide Disparities in Recovery of Cardiac Testing 1 Year Into COVID-19
BACKGROUND
The extent to which health care systems have adapted to the COVID-19 pandemic to provide necessary cardiac diagnostic services is unknown.
OBJECTIVES
The aim of this study was to determine the impact of the pandemic on cardiac testing practices, volumes and types of diagnostic services, and perceived psychological stress to health care providers worldwide.
METHODS
The International Atomic Energy Agency conducted a worldwide survey assessing alterations from baseline in cardiovascular diagnostic care at the pandemic's onset and 1 year later. Multivariable regression was used to determine factors associated with procedure volume recovery.
RESULTS
Surveys were submitted from 669 centers in 107 countries. Worldwide reduction in cardiac procedure volumes of 64% from March 2019 to April 2020 recovered by April 2021 in high- and upper middle-income countries (recovery rates of 108% and 99%) but remained depressed in lower middle- and low-income countries (46% and 30% recovery). Although stress testing was used 12% less frequently in 2021 than in 2019, coronary computed tomographic angiography was used 14% more, a trend also seen for other advanced cardiac imaging modalities (positron emission tomography and magnetic resonance; 22%-25% increases). Pandemic-related psychological stress was estimated to have affected nearly 40% of staff, impacting patient care at 78% of sites. In multivariable regression, only lower-income status and physicians' psychological stress were significant in predicting recovery of cardiac testing.
CONCLUSIONS
Cardiac diagnostic testing has yet to recover to prepandemic levels in lower-income countries. Worldwide, the decrease in standard stress testing is offset by greater use of advanced cardiac imaging modalities. Pandemic-related psychological stress among providers is widespread and associated with poor recovery of cardiac testing.
Current worldwide nuclear cardiology practices and radiation exposure: results from the 65 country IAEA Nuclear Cardiology Protocols Cross-Sectional Study (INCAPS)
AIMS To characterize patient radiation doses from nuclear myocardial perfusion imaging (MPI) and the use of radiation-optimizing 'best practices' worldwide, and to evaluate the relationship between laboratory use of best practices and patient radiation dose. METHODS AND RESULTS We conducted an observational cross-sectional study of protocols used for all 7911 MPI studies performed in 308 nuclear cardiology laboratories in 65 countries during a single week in March–April 2013. Eight 'best practices' relating to radiation exposure were identified a priori by an expert committee, and a radiation-related quality index (QI) was devised indicating the number of best practices used by a laboratory. Patient radiation effective dose (ED) ranged between 0.8 and 35.6 mSv (median 10.0 mSv). Average laboratory ED ranged from 2.2 to 24.4 mSv (median 10.4 mSv); only 91 (30%) laboratories achieved the median ED ≤ 9 mSv recommended by guidelines. Laboratory QIs ranged from 2 to 8 (median 5). Both ED and QI differed significantly between laboratories, countries, and world regions. The lowest median ED (8.0 mSv), in Europe, coincided with high best-practice adherence (mean laboratory QI 6.2); the highest doses (median 12.1 mSv) and a low QI (4.9) occurred in Latin America. In hierarchical regression modelling, patients undergoing MPI at laboratories following more 'best practices' had lower EDs. CONCLUSION Marked worldwide variation exists in radiation safety practices pertaining to MPI, with targeted EDs currently achieved in a minority of laboratories. The significant relationship between best-practice implementation and lower doses indicates numerous opportunities to reduce radiation exposure from MPI globally.
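The quality index described is simply a count of best practices per laboratory, compared against the ≤9 mSv dose target. The sketch below shows that bookkeeping; the practice names and lab records are hypothetical stand-ins, not the study's eight actual criteria or its data.

```python
# Hypothetical lab records: the set of radiation best practices each
# follows (names are illustrative stand-ins for the eight INCAPS
# criteria) and its average patient effective dose (ED) in mSv.
labs = {
    "lab_A": ({"weight_based_dosing", "stress_only_protocols",
               "avoid_thallium", "avoid_dual_isotope",
               "dose_reducing_software", "tc99m_agents"}, 8.2),
    "lab_B": ({"weight_based_dosing", "stress_only_protocols"}, 14.7),
    "lab_C": ({"weight_based_dosing", "stress_only_protocols",
               "avoid_thallium", "avoid_dual_isotope"}, 10.1),
}

def quality_index(practices):
    """QI = number of best practices a laboratory uses (2-8 in the study)."""
    return len(practices)

DOSE_TARGET_MSV = 9.0  # guideline-recommended median ED ceiling

for name, (practices, mean_ed) in sorted(labs.items()):
    qi = quality_index(practices)
    print(f"{name}: QI={qi}, ED={mean_ed} mSv, "
          f"meets target={mean_ed <= DOSE_TARGET_MSV}")
```

The study's hierarchical regression then tested whether higher QI predicted lower ED after accounting for clustering of patients within labs and labs within countries.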
Can Physicians Identify Inappropriate Nuclear Stress Tests? An Examination of Inter-Rater Reliability for the 2009 Appropriate Use Criteria for Radionuclide Imaging
Background—We sought to determine inter-rater reliability of the 2009 Appropriate Use Criteria for radionuclide imaging and whether physicians at various levels of training can effectively identify nuclear stress tests with inappropriate indications.
Methods and Results—Four hundred patients were randomly selected from a consecutive cohort of patients undergoing nuclear stress testing at an academic medical center. Raters with different levels of training (including cardiology attending physicians, cardiology fellows, internal medicine hospitalists, and internal medicine interns) classified individual nuclear stress tests using the 2009 Appropriate Use Criteria. Consensus classification by 2 cardiologists was considered the operational gold standard, and sensitivity and specificity of individual raters for identifying inappropriate tests were calculated. Inter-rater reliability of the Appropriate Use Criteria was assessed using Cohen κ statistics for pairs of different raters. The mean age of patients was 61.5 years; 214 (54%) were female. The cardiologists rated 256 (64%) of 400 nuclear stress tests as appropriate, 68 (18%) as uncertain, and 55 (14%) as inappropriate; 21 (5%) tests were unable to be classified. Inter-rater reliability for noncardiologist raters was modest (unweighted Cohen κ, 0.51; 95% confidence interval, 0.45–0.55). Sensitivity of individual raters for identifying inappropriate tests ranged from 47% to 82%, while specificity ranged from 85% to 97%.
Conclusions—Inter-rater reliability for the 2009 Appropriate Use Criteria for radionuclide imaging is modest, and there is considerable variation in the ability of raters at different levels of training to identify inappropriate tests.
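The unweighted Cohen κ used here corrects observed agreement between two raters for the agreement expected by chance: κ = (p_o − p_e) / (1 − p_e). A short self-contained implementation makes the computation concrete; the ten ratings below are invented for illustration, not the study data.

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Unweighted Cohen's kappa for two raters over the same items."""
    assert len(r1) == len(r2) and r1
    n = len(r1)
    # Observed agreement: fraction of items both raters labelled the same.
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2
    return (po - pe) / (1 - pe)

# Illustrative appropriateness ratings for ten tests by two raters.
a = ["appropriate"] * 6 + ["uncertain", "inappropriate",
                           "appropriate", "uncertain"]
b = ["appropriate"] * 5 + ["uncertain", "uncertain", "inappropriate",
                           "appropriate", "appropriate"]
print(f"kappa = {cohen_kappa(a, b):.3f}")
```

Here the two raters agree on 8 of 10 items (p_o = 0.80) but chance agreement is high (p_e = 0.54) because "appropriate" dominates both raters' marginals, giving κ ≈ 0.565, i.e. "modest" agreement of the kind the study reports.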
Non-maximally entangled states: production, characterization and utilization
Using a spontaneous-downconversion photon source, we produce true non-maximally entangled states, i.e., without the need for post-selection. The degree and phase of entanglement are readily tunable, and are characterized both by a standard analysis using coincidence minima and by quantum state tomography of the two-photon state. Using the latter, we experimentally reconstruct the reduced density matrix for the polarization. Finally, we use these states to measure the Hardy fraction, obtaining a result that deviates significantly from any local-realistic result.
Comment: 4 pages, 4 figures. To appear in Phys. Rev. Lett.
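The reduced density matrix mentioned in the abstract is obtained by tracing the two-photon density matrix over one photon's polarization. The sketch below builds a non-maximally entangled pure state of the form cos θ |HH⟩ + e^{iφ} sin θ |VV⟩ (the specific angles are illustrative, not the experiment's settings) and performs the partial trace numerically.

```python
import numpy as np

# Non-maximally entangled two-photon polarization state:
# |psi> = cos(theta)|HH> + e^{i*phi} sin(theta)|VV>.
# theta = pi/4 would be maximally entangled; here we pick a smaller angle.
theta, phi = np.pi / 6, 0.3
psi = np.zeros(4, dtype=complex)
psi[0] = np.cos(theta)                      # |HH> component
psi[3] = np.exp(1j * phi) * np.sin(theta)   # |VV> component

# Full density matrix, reshaped to indices (a, b, a', b') for the
# two qubits, then traced over photon B's indices (b = b').
rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
rho_A = np.einsum("ikjk->ij", rho)          # partial trace over photon B

purity = np.trace(rho_A @ rho_A).real       # < 1 whenever A and B are entangled
print("diag(rho_A) =", np.real(np.diag(rho_A)), " purity =", purity)
```

For this state the reduced matrix is diagonal with entries cos²θ and sin²θ; its purity (here 0.625) falls below 1 exactly when the two photons are entangled, which is what tomography of the reconstructed state quantifies.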
Thermal conductivity of amorphous carbon thin films
Thermal conductivities of amorphous carbon thin films are measured in the temperature range 80–400 K using the 3ω method. Sample films range from soft a-C:H prepared by remote-plasma deposition to amorphous diamond with a large fraction of sp³-bonded carbon deposited from a filtered-arc source. Effective-medium theory provides a phenomenological description of the variation of conductivity with mass density. The thermal conductivities are in good agreement with the minimum thermal conductivity calculated from the measured atomic density and longitudinal speed of sound.
Comment: 4 pages, 4 figures.
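The "minimum thermal conductivity" comparison can be sketched with the high-temperature limit of the Cahill–Pohl model, k_min = ½ (π/6)^{1/3} k_B n^{2/3} (v_l + 2 v_t). This is a generic form of that model, not necessarily the exact expression the paper evaluates; since the abstract mentions only the longitudinal speed of sound, the transverse speed below is estimated from it, and the input numbers are hypothetical order-of-magnitude values for dense amorphous carbon.

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def k_min_high_T(n, v_l, v_t=None):
    """High-T limit of the Cahill-Pohl minimum thermal conductivity.

    n   : atomic number density, atoms/m^3
    v_l : longitudinal speed of sound, m/s
    v_t : transverse speed of sound, m/s; crudely estimated as v_l/2
          when unmeasured (assumption, not from the paper)
    """
    if v_t is None:
        v_t = v_l / 2
    return 0.5 * (math.pi / 6) ** (1 / 3) * KB * n ** (2 / 3) * (v_l + 2 * v_t)

# Hypothetical inputs of the right order for diamond-like carbon:
n = 1.5e29      # atoms/m^3
v_l = 10_000    # m/s
print(f"k_min ~ {k_min_high_T(n, v_l):.2f} W m^-1 K^-1")
```

The n^{2/3} and sound-speed dependence is why measuring atomic density and longitudinal speed of sound suffices to set the comparison baseline.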
Disparities in Non-invasive Traditional and Advanced Testing for Coronary Artery Disease: Findings from the INCAPS-COVID 2 Study
The COVID-19 pandemic disrupted the delivery of cardiovascular care, including non-invasive testing protocols and test selection for the evaluation of coronary artery disease (CAD). Trends in test selection among traditional versus advanced non-invasive tests for CAD during the pandemic, and among countries of varying income status, have not been well studied. The International Atomic Energy Agency conducted a global survey to assess pandemic-related changes in the practice of cardiovascular diagnostic testing. Site procedural volumes for non-invasive tests to evaluate CAD from March 2019 (pre-pandemic), April 2020 (onset), and April 2021 (initial recovery) were collected. We considered the traditional testing modalities of exercise electrocardiography (ECG), stress echocardiography, and stress single-photon emission computed tomography (SPECT), and the advanced testing modalities of stress cardiac magnetic resonance (CMR), coronary computed tomography angiography (CCTA), and stress positron emission tomography (PET). Survey data were obtained from 669 centers in 107 countries, reporting the performance of 367,933 studies for CAD during the study period. Compared with 2019, traditional tests were performed 14% less frequently in 2021 (recovery rate 82%), whereas advanced tests were performed 15% more frequently (recovery rate 128%). CCTA, stress CMR, and stress PET showed 14%, 25%, and 25% increases in volumes from 2019 to 2021, respectively. The increase in advanced testing was isolated to high- and upper-middle-income countries, with 132% recovery in advanced tests by 2021 as compared with 55% in lower-income nations. The COVID-19 pandemic exacerbated economic disparities in CAD testing practice between wealthy and poorer countries. Greater recovery rates and even new growth were observed for advanced imaging modalities, but this growth was restricted to wealthy countries. Efforts to reduce practice variations in CAD testing due to economic status are warranted.
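The recovery rates quoted in these surveys are just 2021 procedure volume as a fraction of the 2019 baseline, pooled within each modality group. The sketch below computes them on made-up site totals chosen only to echo the reported pattern (traditional below baseline, advanced above); they are not survey data.

```python
# Hypothetical pooled site volumes per modality: (2019, 2020, 2021).
volumes = {
    "exercise_ECG": (1000, 300, 780),
    "stress_echo":  (800,  250, 680),
    "stress_SPECT": (1200, 400, 1020),
    "CCTA":         (500,  200, 600),
    "stress_CMR":   (200,  80,  250),
    "stress_PET":   (150,  60,  238),
}
TRADITIONAL = {"exercise_ECG", "stress_echo", "stress_SPECT"}

def recovery(group):
    """2021 volume as a fraction of the 2019 baseline, pooled over a group."""
    v2019 = sum(volumes[m][0] for m in group)
    v2021 = sum(volumes[m][2] for m in group)
    return v2021 / v2019

advanced = set(volumes) - TRADITIONAL
print(f"traditional recovery: {recovery(TRADITIONAL):.0%}, "
      f"advanced recovery: {recovery(advanced):.0%}")
```

A recovery rate above 100%, as seen for the advanced group here, indicates net growth over the pre-pandemic baseline rather than mere recovery.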
Simulating the influences of groundwater on regional geomorphology using a distributed, dynamic, landscape evolution modelling platform
A dynamic landscape evolution modelling platform (CLiDE) is presented that allows a variety of Earth system interactions to be explored under differing environmental forcing factors. Representation of distributed surface and subsurface hydrology within CLiDE is suited to simulation at sub-annual to centennial time-scales. In this study the hydrological components of CLiDE are evaluated against analytical solutions and recorded datasets. The impact of differing groundwater regimes on sediment discharge is examined for a simple, idealised catchment; sediment discharge is found to be a function of the evolving catchment morphology. Application of CLiDE to the upper Eden Valley catchment, UK, suggests that the addition of baseflow return from groundwater into the fluvial system modifies the total catchment sediment discharge and the spatio-temporal distribution of sediment fluxes during storm events. The occurrence of a storm following a period of appreciable antecedent rainfall is found to increase simulated sediment fluxes.
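The mechanism by which baseflow return raises storm sediment flux can be illustrated with a generic stream-power detachment law, E = K·Q^m·S^n, where adding groundwater baseflow to storm runoff raises the effective discharge Q. This is a standard textbook formulation, not necessarily CLiDE's internal sediment-transport scheme, and all parameter values below are hypothetical.

```python
def stream_power_erosion(runoff, baseflow, slope, K=1e-5, m=0.5, n=1.0):
    """Generic stream-power erosion rate E = K * Q^m * S^n.

    runoff, baseflow : discharge components, m^3/s (summed into Q)
    slope            : local channel slope (dimensionless)
    K, m, n          : erodibility and exponents (hypothetical values)
    """
    q = runoff + baseflow
    return K * q ** m * slope ** n

# Same storm runoff and slope, with and without groundwater baseflow.
storm_only = stream_power_erosion(runoff=12.0, baseflow=0.0, slope=0.02)
with_baseflow = stream_power_erosion(runoff=12.0, baseflow=3.0, slope=0.02)
print(f"without baseflow: {storm_only:.3e}, with baseflow: {with_baseflow:.3e}")
```

Antecedent rainfall acts through the same term: a higher water table sustains a larger baseflow component during the storm, so the same rainfall event produces a larger Q and hence a larger simulated sediment flux.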