Non-invasive imaging of coronary artery disease and its functional consequences: the hybrid SPECT and CCTA approach
For several decades invasive coronary angiography (CA) has been the reference
standard in the assessment of coronary artery disease (CAD) severity. Patients with
a high pre-test likelihood of CAD receive a Class 1 indication for CA. Several factors,
however, motivate the need for a reliable non-invasive imaging modality, especially
in patients with a high pre-test likelihood: not only to rule out coronary stenosis, but
also to reliably diagnose stenosis with functional consequences (significant CAD) and
to guide therapeutic decision making.
Performance of CA in all patients with a high pre-test likelihood of CAD has
undeniable drawbacks. First of all, the prevalence of angiographically significant CAD
in patients with a high pre-test likelihood may be as low as 44-46%. In real life,
the prevalence of significant CAD in patients referred for CA is even lower, at only
36.7% according to the national cardiovascular data registry of the United States.
Secondly, the angiographic severity of CAD does not reliably predict its main functional
consequence: myocardial ischemia [6-8]. Because evidence of myocardial ischemia is
needed before deciding whether revascularization is necessary, additional (non-)invasive
tests are required. Finally, CA comes with high costs, as a result of the need for
admission to a hospital or day-care facility, and carries a low (0.1%) risk of serious
adverse events.
With the advent of computed tomography coronary angiography (CCTA) and
the known reliability of single photon emission computed tomography myocardial
perfusion imaging (SPECT-MPI), a combination of non-invasive imaging modalities
became available that potentially could reliably diagnose significant CAD. The high
diagnostic accuracy of this combination has previously been shown in highly selected
patient populations. Nevertheless, data demonstrating the applicability of hybrid
SPECT and CCTA in diagnosing significant CAD in stable patients with an intermediate
to high pre-test likelihood of CAD are still lacking. Furthermore, the contribution of
these tests to therapeutic decision making in these patients is unknown. Providing
evidence on both of these issues is the main objective of this thesis.
Improving Empirical Scrutiny of the Habitus
Many studies invoke the concept of the Bourdieusian habitus to account for a plethora of stratified patterns uncovered by conventional social-scientific methods. However, as a stratum-specific, embodied and largely non-declarative set of dispositions, the role of the habitus in those stratified patterns is typically not adequately scrutinised empirically. Instead, the habitus is often attributed theoretically to an empirically established link between stratification indicators and an outcome of interest. In this research note
The different risk of new-onset, chronic, worsening, and advanced heart failure: A systematic review and meta-regression analysis
Aims: Heart failure (HF) is a chronic and progressive syndrome associated with a poor prognosis. While it may seem intuitive that the risk of adverse outcomes varies across the different stages of HF, an overview of these risks is lacking. This study aims to determine the risk of all-cause mortality and HF hospitalizations associated with new-onset HF, chronic HF (CHF), worsening HF (WHF), and advanced HF. Methods and results: We performed a systematic review of observational studies from 2012 to 2022 using five different databases. The primary outcomes were 30-day and 1-year all-cause mortality, as well as 1-year HF hospitalization. Studies were pooled using random effects meta-analysis, and mixed-effects meta-regression was used to compare the different HF groups. Among the 15 759 studies screened, 66 were included representing 862 046 HF patients. Pooled 30-day mortality rates did not reveal a significant distinction between hospital-admitted patients, with rates of 10.13% for new-onset HF and 8.11% for WHF (p = 0.10). However, the 1-year mortality risk differed and increased stepwise from CHF to advanced HF, with a rate of 8.47% (95% confidence interval [CI] 7.24–9.89) for CHF, 21.15% (95% CI 17.78–24.95) for new-onset HF, 26.84% (95% CI 23.74–30.19) for WHF, and 29.74% (95% CI 24.15–36.10) for advanced HF. Readmission rates for HF at 1 year followed a similar trend. Conclusions: Our meta-analysis of observational studies confirms the different risk for adverse outcomes across the distinct HF stages. Moreover, it emphasizes the negative prognostic value of WHF as the first progressive stage from CHF towards advanced HF.
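As a hedged illustration of the pooling step described above, the short Python sketch below implements DerSimonian-Laird random-effects pooling of logit-transformed study proportions (e.g. 1-year mortality rates). The abstract does not name the software or exact model specification used, and the event counts in the example are invented rather than taken from the meta-analysis.

import numpy as np
from scipy.stats import norm

def pool_proportions(events, totals):
    # DerSimonian-Laird random-effects pooling on the logit scale.
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    y = np.log(events / (totals - events))            # logit-transformed rates
    v = 1.0 / events + 1.0 / (totals - events)        # within-study variances
    w = 1.0 / v                                       # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)                   # Cochran's Q
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)                           # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    z = norm.ppf(0.975)
    inv_logit = lambda x: 1.0 / (1.0 + np.exp(-x))
    return inv_logit(mu), inv_logit(mu - z * se), inv_logit(mu + z * se)

# Hypothetical example: three studies reporting 1-year mortality.
rate, lo, hi = pool_proportions(events=[85, 210, 40], totals=[1000, 2400, 350])
print(f"pooled rate {rate:.3f} (95% CI {lo:.3f}-{hi:.3f})")

A mixed-effects meta-regression, as used to compare the HF groups, would extend this structure by adding the HF stage as a moderator of the study-level estimates.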
Embedding routine health care data in clinical trials: with great power comes great responsibility
Randomised clinical trials (RCTs) are vital for medical progress. Unfortunately, ‘traditional’ RCTs are expensive and inherently slow. Moreover, their generalisability has been questioned. There is considerable overlap in routine health care data (RHCD) and trial-specific data. Therefore, integration of RHCD in an RCT has great potential, as it would reduce the effort and costs required to collect data, thereby overcoming some of the major downsides of a traditional RCT. However, use of RHCD comes with other challenges, such as privacy issues, as well as technical and practical barriers. Here, we give a current overview of related initiatives on national cardiovascular registries (Netherlands Heart Registration, Heart4Data), showcasing the interrelationships between and the relevance of the different registries for the practicing physician. We then discuss the benefits and limitations of RHCD use in the setting of a pragmatic RCT from a cardiovascular perspective, illustrated by a case study in heart failure.
Round-the-clock performance of coronary CT angiography for suspected acute coronary syndrome: Results from the BEACON trial
Objective: To assess the image quality of coronary CT angiography (CCTA) for suspected acute coronary syndrome (ACS) outside office hours. Methods: Patients with symptoms suggestive of an ACS underwent CCTA at the emergency department 24 hours a day, 7 days a week. A total of 118 patients, of whom 89 (75%) presented during office hours (weekdays between 07:00 and 17:00) and 29 (25%) outside office hours (weekdays between 17:00 and 07:00, weekends and holidays), underwent CCTA. Image quality was evaluated per coronary segment by two experienced readers and graded on an ordinal scale ranging from 1 to 3. Results: There were no significant differences in acquisition parameters, beta-blocker administration or heart rate between patients presenting during office hours and outside office hours. The median quality score per patient was 30.5 [interquartile range 26.0–33.5] for patients presenting during office hours, compared with 27.5 [19.75–32.0] for patients presenting outside office hours (p = 0.043). The number of non-evaluable segments was lower for patients presenting during office hours (0 [0–1.0] vs. 1.0 [0–4.0], p = 0.009). Conclusion: Image quality of CCTA performed outside office hours for suspected ACS is diminished. Key Points: • Quality scores were higher for coronary CTA during office hours. • There were no differences in acquisition parameters. • There was a non-significant trend towards higher heart rates outside office hours. • Coronary CTA on the ED requires state-of-the-art scanner technology and sufficiently trained staff. • Coronary CTA on the ED needs preparation time and optimisation o
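The abstract reports medians, interquartile ranges and p-values for the per-patient quality scores but does not name the statistical test used. A non-parametric comparison such as the Mann-Whitney U test is one common choice for such ordinal sum scores; the sketch below is purely illustrative and uses invented scores, not the trial data.

import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
# Hypothetical per-patient quality scores (sum of per-segment grades, 1-3 each).
office_hours = rng.integers(24, 36, size=89)
out_of_hours = rng.integers(18, 34, size=29)

stat, p = mannwhitneyu(office_hours, out_of_hours, alternative="two-sided")
print(f"medians: {np.median(office_hours)} vs {np.median(out_of_hours)}, p = {p:.3f}")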
Comparison of Outcome After Percutaneous Mitral Valve Repair With the MitraClip in Patients With Versus Without Atrial Fibrillation
Percutaneous mitral valve repair with the MitraClip is an established treatment for patients with mitral regurgitation (MR) who are inoperable or at high risk for surgery. Atrial fibrillation (AF) frequently coincides with MR, but data on the influence of AF on outcome after MitraClip treatment are scarce. The aim of the current study was to compare clinical outcome after MitraClip treatment in patients with versus without AF. Between January 2009 and January 2016, all consecutive patients treated with a MitraClip in 5 Dutch centers were included. Outcome measures were survival, symptoms, MR grade, and stroke incidence. In total, 618 patients were treated with a MitraClip. Patients with AF were older, had higher N-terminal B-type natriuretic peptide levels and more tricuspid regurgitation, less often had coronary artery disease, and had better left ventricular function. Survival of patients treated with the MitraClip was similar for patients with AF (82%) and without AF (non-AF; 85%) after 1 year (p = 0.30), but significantly different after 5-year follow-up (AF 34%; non-AF 47%; p = 0.006). After 1 month, 64% of the patients with AF were in New York Heart Association class I or II, in contrast to 77% of the patients without AF (p = 0.001). The stroke incidence was not significantly different (AF 1.8%; non-AF 1.0%; p = 0.40). In conclusion, patients with AF had similar 1-year survival, MR reduction, and stroke incidence compared with non-AF patients. However, MitraClip patients with AF had reduced long-term survival and remained more symptomatic compared with those without AF.
Curriculum vitae of the LOTOS-EUROS (v2.0) chemistry transport model
The development and application of chemistry transport models have a long
tradition. Within the Netherlands the LOTOS–EUROS model has been developed by
a consortium of institutes, after combining its independently developed
predecessors in 2005. Recently, version 2.0 of the model was released as an
open-source version. This paper presents the curriculum vitae of the model
system, describing the model's history, model philosophy, basic features and a
validation with EMEP stations for the new benchmark year 2012, and presents
cases with the model's most recent and key developments. By setting the model
developments in context and providing an outlook for directions for further
development, the paper goes beyond the common model description. With an
origin in ozone and sulfur modelling for the models LOTOS and EUROS, the
application areas were gradually extended with persistent organic pollutants,
reactive nitrogen, and primary and secondary particulate matter. After the
combination of the models into LOTOS–EUROS in 2005, the model was further
developed to include new source parametrizations (e.g. road resuspension,
desert dust, wildfires), has been applied for operational smog forecasts in the
Netherlands and Europe, and has been used for emission scenarios, source
apportionment, and long-term hindcast and climate change scenarios.
LOTOS–EUROS has been a front-runner in data assimilation of ground-based and
satellite observations and has participated in many model intercomparison
studies. The model is no longer confined to applications over Europe but is
also applied to other regions of the world, e.g. China. The increasing
interaction with emission experts has also contributed to the improvement of
the model's performance. The philosophy for model development has always been
to use knowledge that is state of the art and proven, to keep a good balance
in the level of detail of process description and accuracy of input and
output, and to keep a good record of the effect of model changes using
benchmarking and validation. The performance of v2.0 with respect to EMEP
observations is good, with spatial correlations around 0.8 or higher for
concentrations and wet deposition. Temporal correlations are around 0.5 or
higher. Recent innovative applications include source apportionment and data
assimilation, particle number modelling, and energy transition scenarios
including corresponding land use changes as well as Saharan dust forecasting.
Future developments would enable more flexibility with respect to model
horizontal and vertical resolution and further detailing of model input data.
This includes the use of different sources of land use characterization
(roughness length and vegetation), detailing of emissions in space and time,
and efficient coupling to meteorology from different meteorological models.
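To make the validation figures quoted above concrete, the sketch below shows one common way to compute a spatial correlation (across stations, on time-averaged concentrations) and a mean temporal correlation (per station, over the time series) against observations. It is an assumption-laden illustration; the actual LOTOS–EUROS benchmarking tooling is not described here, and the data are synthetic.

import numpy as np

def spatial_and_temporal_correlation(model, obs):
    # model, obs: arrays of shape (n_stations, n_times).
    # Spatial correlation: correlate time-averaged values across stations.
    spatial_r = np.corrcoef(model.mean(axis=1), obs.mean(axis=1))[0, 1]
    # Temporal correlation: correlate each station's time series, then average.
    per_station = [np.corrcoef(m, o)[0, 1] for m, o in zip(model, obs)]
    return spatial_r, float(np.mean(per_station))

# Synthetic example: 40 stations, one year of daily values.
rng = np.random.default_rng(1)
obs = rng.gamma(shape=2.0, scale=5.0, size=(40, 365))
model = obs * rng.normal(1.0, 0.3, size=obs.shape)   # an imperfect "model"
print(spatial_and_temporal_correlation(model, obs))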
Automatic centerline extraction of coronary arteries in coronary computed tomographic angiography
Coronary computed tomographic angiography (CCTA) is a non-invasive imaging modality for the visualization of the heart and coronary arteries. To fully exploit the potential of CCTA datasets and apply them in clinical practice, an automated coronary artery extraction approach is needed. The purpose of this paper is to present and validate a fully automatic centerline extraction algorithm for coronary arteries in CCTA images. The algorithm is based on an improved version of Frangi's vesselness filter which removes unwanted step-edge responses at the boundaries of the cardiac chambers. Building upon this new vesselness filter, the coronary artery extraction pipeline automatically extracts the centerlines of the main branches as well as side branches. The algorithm was first evaluated with a standardized evaluation framework, the Rotterdam Coronary Artery Algorithm Evaluation Framework, used in the MICCAI Coronary Artery Tracking challenge 2008 (CAT08). It includes 128 manually delineated reference centerlines. The average overlap and accuracy measures of our method were 93.7% and 0.30 mm, respectively, ranking 1st and 3rd compared with five other automatic methods presented in CAT08. Secondly, in 50 clinical datasets, a total of 100 reference centerlines were generated from lumen contours in the transversal planes, which were manually corrected by an expert from the cardiology department. In this evaluation, the average overlap and accuracy were 96.1% and 0.33 mm, respectively. The entire processing time for one dataset is less than 2 min on a standard desktop computer. In conclusion, our newly developed automatic approach can extract coronary arteries in CCTA images with excellent performance in terms of extraction ability and accuracy.
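As an illustration of the classical Hessian-based vesselness measure that the extraction pipeline builds on, the sketch below computes single-scale Frangi vesselness with NumPy/SciPy. It deliberately does not reproduce the paper's modification that suppresses step-edge responses at the cardiac chamber boundaries, and the parameter values are generic defaults rather than those used by the authors.

import numpy as np
from scipy import ndimage

def vesselness_3d(volume, sigma=1.5, alpha=0.5, beta=0.5, c=70.0):
    # Single-scale Frangi vesselness for bright tubular structures.
    v = volume.astype(np.float32)
    # Scale-normalised second-order Gaussian derivatives -> Hessian components.
    H = np.empty(v.shape + (3, 3), dtype=np.float32)
    orders = {(0, 0): (2, 0, 0), (1, 1): (0, 2, 0), (2, 2): (0, 0, 2),
              (0, 1): (1, 1, 0), (0, 2): (1, 0, 1), (1, 2): (0, 1, 1)}
    for (i, j), order in orders.items():
        d = sigma ** 2 * ndimage.gaussian_filter(v, sigma, order=order)
        H[..., i, j] = H[..., j, i] = d
    # Eigenvalues sorted by absolute value: |l1| <= |l2| <= |l3|.
    eig = np.linalg.eigvalsh(H)
    idx = np.argsort(np.abs(eig), axis=-1)
    l1, l2, l3 = np.take_along_axis(eig, idx, axis=-1).transpose(3, 0, 1, 2)
    eps = 1e-10
    ra = np.abs(l2) / (np.abs(l3) + eps)                  # plate vs. line
    rb = np.abs(l1) / (np.sqrt(np.abs(l2 * l3)) + eps)    # blob vs. line
    s = np.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)              # second-order structure
    out = (1 - np.exp(-ra ** 2 / (2 * alpha ** 2))) \
        * np.exp(-rb ** 2 / (2 * beta ** 2)) \
        * (1 - np.exp(-s ** 2 / (2 * c ** 2)))
    out[(l2 > 0) | (l3 > 0)] = 0   # keep bright tubes on a darker background
    return out

A multi-scale response, as used in practice, would take the voxel-wise maximum of this measure over a range of sigma values matched to the expected vessel radii.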