Strategies for Primary Prevention of Coronary Heart Disease Based on Risk Stratification by the ACC/AHA Lipid Guidelines, ATP III Guidelines, Coronary Calcium Scoring...
Background: Several approaches have been proposed for risk stratification and primary prevention of coronary heart disease (CHD), but their comparative effectiveness and cost-effectiveness are unknown. Methods: We constructed a state-transition microsimulation model to compare multiple approaches to the primary prevention of CHD in a simulated cohort of men aged 45–75 and women aged 55–75. Risk-stratification strategies included the 2013 American College of Cardiology/American Heart Association (ACC/AHA) guidelines on the treatment of blood cholesterol, the Adult Treatment Panel (ATP) III guidelines, and approaches based on coronary artery calcium (CAC) scoring and C-reactive protein (CRP). Additionally, we assessed a treat-all strategy in which all individuals were prescribed either moderate-dose or high-dose statins and all males received low-dose aspirin. Outcome measures included CHD events, costs, medication-related side effects, radiation-attributable cancers, and quality-adjusted life-years (QALYs) over a 30-year timeframe. Results: Treat-all with high-dose statins dominated all other strategies for both men and women, gaining 15.7 million QALYs, preventing 7.3 million myocardial infarctions, and saving over $238 billion compared to the status quo, far outweighing its associated adverse events, including bleeding, hepatitis, myopathy, and new-onset diabetes. ACC/AHA guidelines were more cost-effective than ATP III guidelines for both men and women despite placing 8.7 million more people on statins. For women at low CHD risk, treat-all with high-dose statins was more likely to cause a statin-related adverse event than to prevent a CHD event. Conclusions: Despite leading to a greater proportion of the population placed on statin therapy, the ACC/AHA guidelines are more cost-effective than ATP III.
Even so, at generic prices, treating all men and women with statins and all men with low-dose aspirin appears to be more cost-effective than all risk-stratification approaches for the primary prevention of CHD. Especially for low-CHD-risk women, decisions on the appropriate primary prevention strategy should be based on shared decision making between patients and healthcare providers.
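As an illustration of the state-transition microsimulation approach the study describes, a minimal sketch is below. All transition probabilities, the statin risk reduction, and the utility weights are hypothetical placeholders, not the model's actual inputs.

```python
import random

# Minimal state-transition microsimulation sketch for CHD prevention.
# Every numeric input here is a hypothetical placeholder.

STATES = ("well", "post_mi", "dead")

def annual_transition(state, on_statin, rng):
    """Advance one simulated individual by one annual cycle."""
    p_mi = 0.010 * (0.7 if on_statin else 1.0)  # hypothetical MI risk; statin RR 0.7 assumed
    p_die_well = 0.005                          # hypothetical background mortality
    p_die_post_mi = 0.030                       # hypothetical post-MI mortality
    u = rng.random()
    if state == "well":
        if u < p_mi:
            return "post_mi"
        if u < p_mi + p_die_well:
            return "dead"
        return "well"
    if state == "post_mi":
        return "dead" if u < p_die_post_mi else "post_mi"
    return "dead"

def simulate(n=10_000, years=30, on_statin=False, seed=1):
    """Mean QALYs per person over the horizon (utility weights hypothetical)."""
    rng = random.Random(seed)
    utility = {"well": 1.0, "post_mi": 0.85, "dead": 0.0}
    total_qaly = 0.0
    for _ in range(n):
        state = "well"
        for _ in range(years):
            state = annual_transition(state, on_statin, rng)
            total_qaly += utility[state]
    return total_qaly / n
```

Comparing strategies then amounts to running `simulate()` once per arm and differencing QALYs (and, in the full model, costs and adverse events).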
Worldwide Disparities in Recovery of Cardiac Testing 1 Year Into COVID-19
BACKGROUND
The extent to which health care systems have adapted to the COVID-19 pandemic to provide necessary cardiac diagnostic services is unknown.
OBJECTIVES
The aim of this study was to determine the impact of the pandemic on cardiac testing practices, volumes and types of diagnostic services, and perceived psychological stress among health care providers worldwide.
METHODS
The International Atomic Energy Agency conducted a worldwide survey assessing alterations from baseline in cardiovascular diagnostic care at the pandemic's onset and 1 year later. Multivariable regression was used to determine factors associated with procedure volume recovery.
RESULTS
Surveys were submitted from 669 centers in 107 countries. Worldwide reduction in cardiac procedure volumes of 64% from March 2019 to April 2020 recovered by April 2021 in high- and upper middle-income countries (recovery rates of 108% and 99%) but remained depressed in lower middle- and low-income countries (46% and 30% recovery). Although stress testing was used 12% less frequently in 2021 than in 2019, coronary computed tomographic angiography was used 14% more, a trend also seen for other advanced cardiac imaging modalities (positron emission tomography and magnetic resonance; 22%-25% increases). Pandemic-related psychological stress was estimated to have affected nearly 40% of staff, impacting patient care at 78% of sites. In multivariable regression, only lower-income status and physicians' psychological stress were significant in predicting recovery of cardiac testing.
CONCLUSIONS
Cardiac diagnostic testing has yet to recover to prepandemic levels in lower-income countries. Worldwide, the decrease in standard stress testing is offset by greater use of advanced cardiac imaging modalities. Pandemic-related psychological stress among providers is widespread and associated with poor recovery of cardiac testing.
National trends in emergency room diagnosis of pulmonary embolism, 2001–2010: a cross-sectional study
Background:
Little is known about the diagnosis and burden of pulmonary embolism (PE) in United States emergency departments (EDs), or about their evolution over the past decade. We examined nationally representative data to evaluate factors associated with and trends in ED diagnosis of PE.
Methods:
We conducted a cross-sectional study using National Hospital Ambulatory Medical Care Survey (NHAMCS) data from January 1, 2001 to December 31, 2010. We identified all ED patient visits where PE was diagnosed and corresponding demographic, hemodynamic, testing and disposition data. Analyses were performed using descriptive statistics and multivariable logistic regression.
Results:
During the study period, 988,000 weighted patient visits with a diagnosis of PE were identified. Among patients with an ED visit, the likelihood of having a diagnosis of PE per year increased significantly from 2001 to 2010 (odds ratio [OR] 1.091, 95% confidence interval [CI] 1.034-1.152, P = 0.002 for trend) when adjusted for demographic and hospital information. In contrast, when further adjusted for the use of computed tomography (CT) among patients in the ED, the likelihood of having a diagnosis of PE per year did not change (OR 1.041, 95% CI 0.987-1.097, P = 0.14). Overall, 75.1% of patients seen with a diagnosis of PE were hemodynamically stable; 86% were admitted, with an in-hospital death rate under 3%.
Conclusions:
The proportion of ED visits with a diagnosis of PE increased significantly from 2001 to 2010 and this rise can be attributed in large part to the increased availability and use of CT. Most of these patients were admitted with low in-hospital mortality.
Keywords:
Pulmonary embolism; Emergency department; Computed tomography (CT) pulmonary angiography
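The per-year odds ratio reported above compounds over the study window. As a rough back-of-envelope reading of the trend (not a reanalysis of the NHAMCS data), the adjusted OR of 1.091 per year implies, over the nine year-to-year steps from 2001 to 2010:

```python
# A per-year OR compounds multiplicatively across years, so the implied
# cumulative change in odds over the window is the per-year OR raised to
# the number of year-to-year steps.
per_year_or = 1.091
steps = 9  # 2001 -> 2010
cumulative_or = per_year_or ** steps
print(f"Implied cumulative OR, 2001-2010: {cumulative_or:.2f}")  # about 2.19
```

That is, roughly a doubling of the odds of a PE diagnosis over the decade, which the abstract attributes largely to increased CT use.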
Increased Regional Epicardial Fat Volume Associated with Reversible Myocardial Ischemia in Patients with Suspected Coronary Artery Disease
Epicardial adipose tissue is a source of pro-inflammatory cytokines and has been linked to the development of coronary artery disease. No study has systematically assessed the relationship between local epicardial fat volume (EFV) and myocardial perfusion defects. We analyzed EFV in patients undergoing SPECT myocardial perfusion imaging combined with computed tomography (CT) for attenuation correction. Low-dose CT without contrast was performed in 396 consecutive patients undergoing SPECT imaging for evaluation of coronary artery disease. Regional thickness, cross-sectional areas, and total EFV were assessed. Of these patients, 295 had normal myocardial perfusion scans and 101 had abnormal perfusion scans. Mean EFVs in normal, ischemic, and infarcted hearts were 99.8 ± 82.3 cm³, 156.4 ± 121.9 cm³, and 96.3 ± 102.1 cm³, respectively (P < 0.001). Reversible perfusion defects were associated with increased local EFV compared to normal perfusion in the distributions of the right coronary artery (69.2 ± 51.5 vs 46.6 ± 32.0 cm³; P = 0.03) and the left anterior descending coronary artery (87.1 ± 76.4 vs 46.7 ± 40.6 cm³; P = 0.005). Our results demonstrate increased regional epicardial fat in patients with active myocardial ischemia compared to patients with myocardial scar or normal perfusion on nuclear perfusion scans, and suggest a potential role for cardiac CT to improve risk stratification in patients with suspected coronary artery disease.
Multimodality Cardiac Imaging in a Patient with Kawasaki Disease and Giant Aneurysms
Kawasaki disease is a well-known cause of acquired cardiac disease in the pediatric and adult populations, most prevalent in Japan but also seen commonly in the United States. In the era of intravenous immunoglobulin (IVIG) treatment, the morbidity associated with this disease has decreased, but it remains a serious illness. Here we present the case of an adolescent, initially diagnosed with Kawasaki disease as an infant, whose disease progressed to giant aneurysm formation and calcification of the coronary arteries. We review his case and the literature, focusing on the integral role of multimodality imaging in managing Kawasaki disease.
Can Physicians Identify Inappropriate Nuclear Stress Tests? An Examination of Inter-Rater Reliability for the 2009 Appropriate Use Criteria for Radionuclide Imaging
Background—We sought to determine inter-rater reliability of the 2009 Appropriate Use Criteria for radionuclide imaging and whether physicians at various levels of training can effectively identify nuclear stress tests with inappropriate indications.
Methods and Results—Four hundred patients were randomly selected from a consecutive cohort of patients undergoing nuclear stress testing at an academic medical center. Raters with different levels of training (including cardiology attending physicians, cardiology fellows, internal medicine hospitalists, and internal medicine interns) classified individual nuclear stress tests using the 2009 Appropriate Use Criteria. Consensus classification by 2 cardiologists was considered the operational gold standard, and the sensitivity and specificity of individual raters for identifying inappropriate tests were calculated. Inter-rater reliability of the Appropriate Use Criteria was assessed using Cohen κ statistics for pairs of different raters. The mean age of patients was 61.5 years; 214 (54%) were female. The cardiologists rated 256 (64%) of 400 nuclear stress tests as appropriate, 68 (18%) as uncertain, and 55 (14%) as inappropriate; 21 (5%) tests could not be classified. Inter-rater reliability for noncardiologist raters was modest (unweighted Cohen κ = 0.51; 95% confidence interval, 0.45–0.55). Sensitivity of individual raters for identifying inappropriate tests ranged from 47% to 82%, while specificity ranged from 85% to 97%.
Conclusions—Inter-rater reliability for the 2009 Appropriate Use Criteria for radionuclide imaging is modest, and there is considerable variation in the ability of raters at different levels of training to identify inappropriate tests.
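The study's reliability statistic, unweighted Cohen κ, corrects observed agreement for the agreement two raters would reach by chance: κ = (p_o − p_e) / (1 − p_e). A self-contained sketch (the ratings below are toy values, not study data):

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters over the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items rated identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    labels = set(count_a) | set(count_b)
    p_e = sum((count_a[lab] / n) * (count_b[lab] / n) for lab in labels)
    return (p_o - p_e) / (1 - p_e)

# Toy ratings using the appropriateness categories from the study:
a = ["appropriate", "appropriate", "uncertain", "inappropriate", "appropriate"]
b = ["appropriate", "uncertain", "uncertain", "inappropriate", "appropriate"]
print(round(cohen_kappa(a, b), 3))  # 0.688
```

On the κ = 0.51 the study reports, agreement beyond chance was only about half of what perfect agreement would yield, which is what "modest" reliability means here.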
Current worldwide nuclear cardiology practices and radiation exposure: results from the 65 country IAEA Nuclear Cardiology Protocols Cross-Sectional Study (INCAPS)
AIMS
To characterize patient radiation doses from nuclear myocardial perfusion imaging (MPI) and the use of radiation-optimizing 'best practices' worldwide, and to evaluate the relationship between laboratory use of best practices and patient radiation dose.
METHODS AND RESULTS
We conducted an observational cross-sectional study of protocols used for all 7911 MPI studies performed in 308 nuclear cardiology laboratories in 65 countries during a single week in March–April 2013. Eight 'best practices' relating to radiation exposure were identified a priori by an expert committee, and a radiation-related quality index (QI) was devised indicating the number of best practices used by a laboratory. Patient radiation effective dose (ED) ranged between 0.8 and 35.6 mSv (median 10.0 mSv). Average laboratory ED ranged from 2.2 to 24.4 mSv (median 10.4 mSv); only 91 (30%) laboratories achieved the median ED ≤ 9 mSv recommended by guidelines. Laboratory QIs ranged from 2 to 8 (median 5). Both ED and QI differed significantly between laboratories, countries, and world regions. The lowest median ED (8.0 mSv), in Europe, coincided with high best-practice adherence (mean laboratory QI 6.2). The highest doses (median 12.1 mSv) and low QI (4.9) occurred in Latin America. In hierarchical regression modelling, patients undergoing MPI at laboratories following more 'best practices' had lower EDs.
CONCLUSION
Marked worldwide variation exists in radiation safety practices pertaining to MPI, with targeted EDs currently achieved in a minority of laboratories. The significant relationship between best-practice implementation and lower doses indicates numerous opportunities to reduce radiation exposure from MPI globally.
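The laboratory quality index and dose target described above reduce to a simple tally and a median check. The practice flags and doses below are hypothetical illustrations, not INCAPS data:

```python
from statistics import median

TARGET_MEDIAN_ED_MSV = 9.0  # guideline-recommended laboratory median dose

def quality_index(practices_followed):
    """QI = number of the eight predefined best practices a lab follows (0-8)."""
    assert len(practices_followed) == 8
    return sum(practices_followed)

def meets_dose_target(patient_eds_msv):
    """True if the lab's median patient effective dose is at or below target."""
    return median(patient_eds_msv) <= TARGET_MEDIAN_ED_MSV

lab_practices = [True, True, False, True, True, False, True, False]  # hypothetical
lab_doses_msv = [8.2, 9.5, 7.8, 10.1, 8.8]                           # hypothetical
print(quality_index(lab_practices), meets_dose_target(lab_doses_msv))  # 5 True
```

The study's finding is the association between the first quantity and the second: labs with higher QI tended to deliver lower effective doses.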
Thermal conductivity of amorphous carbon thin films
Thermal conductivities of amorphous carbon thin films are measured in the temperature range 80–400 K using the method. Sample films range from soft a-C:H prepared by remote-plasma deposition ( W m⁻¹ K⁻¹ at room temperature) to amorphous diamond with a large fraction of bonded carbon deposited from a filtered-arc source ( W m⁻¹ K⁻¹). Effective-medium theory provides a phenomenological description of the variation of conductivity with mass density. The thermal conductivities are in good agreement with the minimum thermal conductivity calculated from the measured atomic density and longitudinal speed of sound.
Comment: 4 pages, 4 figures
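The minimum thermal conductivity invoked in the abstract is conventionally the Cahill–Watson–Pohl model (Cahill, Watson, and Pohl, Phys. Rev. B 46, 6131, 1992), in which the sum runs over one longitudinal and two transverse sound modes. Since the abstract cites only the measured atomic density and longitudinal speed, the transverse speeds were presumably estimated from it; that is an assumption here, not a statement of the paper's procedure.

```latex
% Cahill-Watson-Pohl minimum thermal conductivity; n is the atomic number
% density, v_i the sound speed of mode i, and Theta_i a per-mode cutoff.
\Lambda_{\min}
  = \left(\frac{\pi}{6}\right)^{1/3} k_B\, n^{2/3}
    \sum_{i} v_i \left(\frac{T}{\Theta_i}\right)^{2}
    \int_{0}^{\Theta_i/T} \frac{x^{3} e^{x}}{\left(e^{x}-1\right)^{2}}\, dx,
\qquad
\Theta_i = v_i \frac{\hbar}{k_B}\left(6\pi^{2} n\right)^{1/3}
```

In the high-temperature limit the integral saturates and Λ_min approaches (1/2)(π/6)^{1/3} k_B n^{2/3} (v_l + 2v_t), which is why the atomic density and sound speeds alone suffice for the comparison the abstract describes.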
Non-maximally entangled states: production, characterization and utilization
Using a spontaneous-downconversion photon source, we produce true non-maximally entangled states, i.e., without the need for post-selection. The degree and phase of entanglement are readily tunable, and are characterized both by a standard analysis using coincidence minima and by quantum state tomography of the two-photon state. Using the latter, we experimentally reconstruct the reduced density matrix for the polarization. Finally, we use these states to measure the Hardy fraction, obtaining a result that is from any local-realistic result.
Comment: 4 pages, 4 figures. To appear in Phys. Rev. Lett.
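A two-photon polarization state of the tunable form |ψ⟩ = cos θ|HH⟩ + e^{iφ} sin θ|VV⟩ is a standard way to describe such a source. The sketch below builds the density matrix, traces out one photon to get the reduced density matrix the abstract mentions, and evaluates the degree of entanglement via the concurrence (the θ and φ values are illustrative, not the paper's settings):

```python
import numpy as np

theta, phi = np.pi / 6, 0.3                  # illustrative tuning parameters

# Two-photon ket in the basis |HH>, |HV>, |VH>, |VV>:
ket = np.zeros(4, dtype=complex)
ket[0] = np.cos(theta)
ket[3] = np.exp(1j * phi) * np.sin(theta)

rho = np.outer(ket, ket.conj())              # two-photon density matrix

# Reduced density matrix of one photon: partial trace over the other
# subsystem, done by reshaping to indices (a, b, a', b') and tracing b = b'.
rho_a = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# For this family of pure states the concurrence is |sin(2 theta)|:
# zero for a product state (theta = 0) and maximal at theta = pi/4.
concurrence = abs(np.sin(2 * theta))
print(np.round(np.real(np.diag(rho_a)), 3), round(concurrence, 3))
```

The diagonal of `rho_a` (cos²θ, sin²θ) shows the single-photon polarization statistics, while the off-diagonal structure of the full `rho` is what state tomography reconstructs experimentally.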