The use of 2-D speckle tracking echocardiography in assessing adolescent athletes with left ventricular hypertrabeculation meeting the criteria for left ventricular non-compaction cardiomyopathy
BACKGROUND: Current echocardiographic criteria cannot accurately differentiate exercise-induced left ventricular (LV) hypertrabeculation in athletes from LV non-compaction cardiomyopathy (LVNC). This study aims to evaluate the role of speckle tracking echocardiography (STE) in characterising LV myocardial mechanics in healthy adolescent athletes with and without LVNC echocardiographic criteria. METHODS: Adolescent athletes evaluated at three sports academies between 2014 and 2019 were considered for this observational study. Those meeting the Jenni criteria for LVNC (end-systolic non-compacted/compacted myocardium ratio > 2 in any short-axis segment) were classified as LVNC+ and the rest as LVNC-. Peak systolic LV longitudinal strain (Sl), circumferential strain (Sc), rotation (Rot), the corresponding strain rates (SRl/SRc) and segmental values were calculated and compared using a non-inferiority approach. RESULTS: A total of 417 participants were included (mean age 14.5 ± 1.7 years), of whom 6.5% were LVNC+ (n = 27). None of the athletes showed any additional LVNC clinical criteria. All average Sl, SRl, Sc, SRc and Rot values were no worse in the LVNC+ group compared to LVNC- (p values ranging 0.0003-0.06), apart from apical SRc (p = 0.2). All 54 segmental measurements (Sl, Sc, SRl, SRc and Rot) had numerically comparable means in both LVNC+ and LVNC-, of which 69% were also statistically non-inferior. CONCLUSIONS: Among healthy adolescent athletes, 6.5% met the echocardiographic criteria for LVNC but showed normal LV STE parameters, in contrast to available data on paediatric LVNC describing abnormal myocardial function. STE could better characterise the myocardial mechanics of athletes with LV hypertrabeculation, allowing a transition from a structural to a functional LVNC diagnosis, especially in suspected physiological remodelling.
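[Editor's note: the abstract classifies athletes by the Jenni criterion (end-systolic NC/C ratio > 2 in any short-axis segment). The sketch below is a minimal illustration of that classification rule, not code from the study; the segment names, thicknesses and function names are hypothetical.]

```python
# Minimal sketch (not from the study): classifying an athlete as LVNC+ under the
# Jenni criterion, i.e. an end-systolic non-compacted/compacted (NC/C) myocardium
# ratio > 2 in any short-axis segment. Segment labels and values are hypothetical.

JENNI_RATIO_THRESHOLD = 2.0

def is_lvnc_positive(nc_thickness_mm, c_thickness_mm):
    """Return True if any segment's end-systolic NC/C ratio exceeds 2.

    nc_thickness_mm, c_thickness_mm: dicts mapping segment name -> thickness (mm).
    """
    for segment, nc in nc_thickness_mm.items():
        c = c_thickness_mm[segment]
        if c > 0 and nc / c > JENNI_RATIO_THRESHOLD:
            return True
    return False

# Hypothetical example: one apical segment with an NC/C ratio of 2.3 flags the athlete.
nc = {"mid_inferior": 6.0, "apical_lateral": 9.2}
c = {"mid_inferior": 7.5, "apical_lateral": 4.0}
print(is_lvnc_positive(nc, c))  # True
```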
The use of 2-D speckle tracking echocardiography in differentiating healthy adolescent athletes with right ventricular outflow tract dilation from patients with arrhythmogenic cardiomyopathy
AIMS: Echocardiographic assessment of adolescent athletes for arrhythmogenic cardiomyopathy (ACM) can be challenging owing to right ventricular (RV) exercise-related remodelling, particularly RV outflow tract (RVOT) dilation. The aim of this study is to evaluate the role of RV 2-D speckle tracking echocardiography (STE) in comparing healthy adolescent athletes with and without RVOT dilation to patients with ACM. METHODS AND RESULTS: A total of 391 adolescent athletes (mean age 14.5 ± 1.7 years) evaluated at three sports academies between 2014 and 2019 were included and compared to previously reported ACM patients (n = 38 definite and n = 39 borderline). Peak systolic RV free wall strain (RVFW-Sl), global and segmental strain (Sl), and the corresponding strain rates (SRl) were calculated. Participants meeting the major modified Task Force Criteria (mTFC) for RVOT dilation were defined as mTFC+ (n = 58, 14.8%) and the rest as mTFC- (n = 333, 85.2%). Mean RVFW-Sl was -27.6 ± 3.4% overall, -28.2 ± 4.1% in the mTFC+ group and -27.5 ± 3.3% in the mTFC- group. mTFC+ athletes had normal RVFW-Sl when compared to the definite (-29% vs -19%, p < 0.001) and borderline ACM (-29% vs -21%, p < 0.001) cohorts. In addition, all mean global and regional Sl and SRl values were no worse in the mTFC+ group compared to the mTFC- group (p values ranging from < 0.0001 to 0.1, with non-inferiority margins of 2% and 0.1 s⁻¹, respectively). CONCLUSIONS: In athletes with RVOT dilation meeting the major mTFC, STE evaluation of the RV can demonstrate normal function and differentiate physiological remodelling from the pathological changes found in ACM, improving screening in grey-area cases.
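[Editor's note: the abstract quotes a non-inferiority comparison with margins of 2% (strain) and 0.1 s⁻¹ (strain rate) but does not describe the test used. The sketch below is one plausible, assumed implementation of such a comparison (a one-sided Welch test on margin-shifted strain magnitudes) using simulated data, not the study's actual analysis code.]

```python
# Minimal sketch (assumptions, not the study's analysis code): a one-sided
# non-inferiority comparison of strain magnitudes between mTFC+ and mTFC-
# athletes, using the 2% margin quoted in the abstract. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
MARGIN = 2.0  # non-inferiority margin in absolute strain percentage points

# Hypothetical strain magnitudes (|Sl|, %) for the two groups.
s_mtfc_pos = rng.normal(28.2, 4.1, 58)
s_mtfc_neg = rng.normal(27.5, 3.3, 333)

# H0: mean(|Sl|, mTFC+) <= mean(|Sl|, mTFC-) - MARGIN  (mTFC+ is inferior)
# H1: mean(|Sl|, mTFC+) >  mean(|Sl|, mTFC-) - MARGIN  (mTFC+ is non-inferior)
# Shifting the mTFC+ values by the margin turns this into a one-sided Welch test.
t_stat, p_value = stats.ttest_ind(
    s_mtfc_pos + MARGIN, s_mtfc_neg, equal_var=False, alternative="greater"
)
print(f"one-sided non-inferiority p = {p_value:.4g}")
```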
Transmission of HIV drug resistance and the predicted effect on current first-line regimens in Europe
Background. Numerous studies have shown that baseline drug resistance patterns may influence the outcome of antiretroviral therapy. Therefore, guidelines recommend drug resistance testing to guide the choice of initial regimen. In addition to optimizing individual patient management, these baseline resistance data enable transmitted drug resistance (TDR) to be surveyed for public health purposes. Since 2001, the SPREAD program has systematically collected data to gain insight into TDR occurring in Europe. Methods. Demographic, clinical, and virological data from 4140 antiretroviral-naive human immunodeficiency virus (HIV)-infected individuals from 26 countries who were newly diagnosed between 2008 and 2010 were analyzed. Evidence of TDR was defined using the WHO list for surveillance of drug resistance mutations. Prevalence of TDR was assessed over time by comparing the results to SPREAD data from 2002 to 2007. Baseline susceptibility to antiretroviral drugs was predicted using the Stanford HIVdb program version 7.0. Results. The overall prevalence of TDR did not change significantly over time and was 8.3% (95% confidence interval, 7.2%-9.5%) in 2008-2010. The most frequent indicators of TDR were nucleoside reverse transcriptase inhibitor (NRTI) mutations (4.5%), followed by nonnucleoside reverse transcriptase inhibitor (NNRTI) mutations (2.9%) and protease inhibitor mutations (2.0%). Baseline mutations were most predictive of reduced susceptibility to initial NNRTI-based regimens: 4.5% and 6.5% of patient isolates were predicted to have resistance to regimens containing efavirenz or rilpivirine, respectively, independent of the current NRTI backbones. Conclusions. Although TDR was highest for NRTIs, the impact of baseline drug resistance patterns on susceptibility was largest for NNRTIs. The prevalence of TDR assessed by epidemiological surveys does not clearly indicate to what degree susceptibility to different drug classes is affected.
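[Editor's note: the abstract reports a TDR prevalence of 8.3% (95% CI 7.2%-9.5%) among 4140 individuals without stating how the interval was obtained. The sketch below is an assumed, unadjusted illustration of a binomial confidence interval for such a prevalence; it will not exactly reproduce the study's interval if the analysis accounted for the sampling design.]

```python
# Minimal sketch (an assumption, not the study's analysis): an unadjusted Wilson
# score interval for a TDR prevalence of ~8.3% among 4140 patients. The study's
# reported 95% CI (7.2%-9.5%) may differ if the analysis accounted for the
# sampling design; this only illustrates the prevalence estimate.
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return centre - half_width, centre + half_width

lo, hi = wilson_interval(successes=344, n=4140)  # 344/4140 ~ 8.3% with TDR (assumed count)
print(f"prevalence ~{344/4140:.1%}, 95% CI {lo:.1%}-{hi:.1%}")
```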
Oxygen targets and 6-month outcome after out of hospital cardiac arrest: a pre-planned sub-analysis of the targeted hypothermia versus targeted normothermia after Out-of-Hospital Cardiac Arrest (TTM2) trial
Background: Optimal oxygen targets in patients resuscitated after cardiac arrest are uncertain. The primary aim of this study was to describe the partial pressure of oxygen (PaO2) values and the episodes of hypoxemia and hyperoxemia occurring within the first 72 h of mechanical ventilation in out-of-hospital cardiac arrest (OHCA) patients. The secondary aim was to evaluate the association of PaO2 with patient outcome. Methods: Preplanned secondary analysis of the targeted hypothermia versus targeted normothermia after OHCA (TTM2) trial. Arterial blood gas values were collected from randomization every 4 h for the first 32 h, and then every 8 h until day 3. Hypoxemia was defined as PaO2 < 60 mmHg and severe hyperoxemia as PaO2 > 300 mmHg. Mortality and poor neurological outcome (defined according to the modified Rankin scale) were collected at 6 months. Results: 1418 patients were included in the analysis. The mean age was 64 ± 14 years, and 292 patients (20.6%) were female. 24.9% of patients had at least one episode of hypoxemia, and 7.6% of patients had at least one episode of severe hyperoxemia. Both hypoxemia and hyperoxemia were independently associated with 6-month mortality, but not with poor neurological outcome. The best cutoff point associated with 6-month mortality was 69 mmHg for hypoxemia (risk ratio, RR = 1.009, 95% CI 0.93-1.09) and 195 mmHg for hyperoxemia (RR = 1.006, 95% CI 0.95-1.06). The time of exposure, i.e., the area under the curve (PaO2-AUC), for hyperoxemia was significantly associated with mortality (p = 0.003). Conclusions: In OHCA patients, both hypoxemia and hyperoxemia are associated with 6-month mortality, with an effect mediated by the duration of exposure to high oxygen values. Precise titration of oxygen levels should be considered in this group of patients. Trial registration: clinicaltrials.gov NCT02908308, registered September 20, 2016.
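[Editor's note: the abstract quantifies hyperoxemia exposure as an area under the PaO2-time curve (PaO2-AUC). The sketch below is an assumed illustration of how such an exposure could be computed from serial blood gas samples, not the trial's analysis code; the sample times and PaO2 values are hypothetical.]

```python
# Minimal sketch (an assumption, not the trial's analysis code): quantifying the
# exposure to hyperoxemia as the area under the PaO2-time curve above a threshold,
# computed with the trapezoidal rule from serial arterial blood gas samples.

HYPEROXEMIA_THRESHOLD_MMHG = 300.0

def pao2_auc_above_threshold(times_h, pao2_mmhg, threshold=HYPEROXEMIA_THRESHOLD_MMHG):
    """Trapezoidal area (mmHg*h) of the PaO2 excess above the threshold."""
    excess = [max(p - threshold, 0.0) for p in pao2_mmhg]
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += 0.5 * (excess[i] + excess[i - 1]) * dt
    return auc

# Hypothetical patient: samples every 4 h for 32 h, then every 8 h until 72 h.
times = [0, 4, 8, 12, 16, 20, 24, 28, 32, 40, 48, 56, 64, 72]
pao2 = [350, 320, 280, 260, 240, 230, 220, 210, 200, 190, 180, 170, 160, 150]
print(f"hyperoxemia exposure: {pao2_auc_above_threshold(times, pao2):.1f} mmHg*h")
```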