
    Myocardial stress perfusion scintigraphy for outcome prediction in patients with severe left ventricular systolic dysfunction

    Abstract: Coronary angiography has been recommended in all patients with suspected chronic coronary syndrome and left ventricular ejection fraction (LVEF) ≤35%. The role of ischemia testing, for example through stress-rest myocardial perfusion scintigraphy (MPS), for risk prediction is not well established. Methods: We evaluated 1576 consecutive patients referred for MPS and stratified into 3 LVEF categories: ≤35%, 36–49%, and ≥50%. Results: Patients with LVEF ≤35% were the oldest, most often men, and had the highest likelihood of prior early (elective or urgent) coronary revascularization. They also had the highest values of summed stress score (SSS), summed rest score (SRS), and summed difference score (SDS), as well as the highest frequency of significant coronary artery disease and a greater number of diseased vessels. Follow-up: In this subgroup, 32 cardiovascular deaths or non-fatal myocardial infarctions (MI) (21%), 35 all-cause deaths (22%), and 37 cardiovascular deaths, non-fatal MIs, or late revascularizations (27%) were recorded, with the shortest survival among all LVEF classes. SRS, SSS, and SDS had very low area under the curve values for the prediction of the 3 endpoints, with very high cut-offs. SRS and SSS cut-offs predicted a worse outcome in Cox regression models including the number of diseased vessels and early revascularization. Conclusions: In patients with LVEF ≤35%, SRS and SSS are less predictive of outcome than in patients with better preserved systolic function, but their cut-offs retain prognostic significance independent of the number of vessels with significant stenoses and of early revascularization
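The discriminative performance the abstract summarizes with area-under-the-curve values can be computed with the rank-based (Mann–Whitney) formulation of the AUC. A minimal sketch, using made-up perfusion scores for illustration only (not the study's data):

```python
def auc(scores_pos, scores_neg):
    """Rank-based AUC: probability that a patient who met the endpoint
    scores higher than one who did not (ties count as 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Illustrative SSS values only, not taken from the study
sss_event    = [12, 9, 15, 7]    # patients who met the endpoint
sss_no_event = [8, 5, 11, 4, 6]  # patients who did not
print(round(auc(sss_event, sss_no_event), 3))  # -> 0.85
```

An AUC near 0.5 indicates no discrimination, which is what "very low area under the curve values" implies for SRS, SSS, and SDS in the LVEF ≤35% subgroup.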

    Prediction and evaluation of resting energy expenditure in a large group of obese outpatients

    The aim of this study was to compare resting energy expenditure (REE) measured (MREE) by indirect calorimetry (IC) and REE predicted (PREE) from established predictive equations in a large sample of obese Caucasian adults
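The abstract does not name the "established predictive equations" compared against indirect calorimetry; as an assumed example, the Mifflin-St Jeor equation is one widely used formula for predicted REE (PREE) in adults, sketched here:

```python
def mifflin_st_jeor(weight_kg, height_cm, age_yr, sex):
    """Mifflin-St Jeor predicted resting energy expenditure (kcal/day).
    One commonly used equation; the study's actual equation set is not
    specified in the abstract."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
    return base + 5 if sex == "male" else base - 161

# Example: a 100 kg, 170 cm, 45-year-old man
print(mifflin_st_jeor(100, 170, 45, "male"))  # -> 1842.5
```

Validation studies of this kind typically compare PREE from such equations against MREE from indirect calorimetry, since predictive equations can be inaccurate in obese subjects.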

    Assessment of body composition in health and disease using bioelectrical impedance analysis (BIA) and dual energy x-ray absorptiometry (DXA): A critical overview

    The measurement of body composition (BC) is a valuable tool to assess nutritional status in health and disease. The methods most used to evaluate BC in clinical practice are based on bicompartment models and measure, directly or indirectly, fat mass (FM) and fat-free mass (FFM). Bioelectrical impedance analysis (BIA) and dual energy X-ray absorptiometry (DXA) (nowadays considered the reference technique in clinical practice) are extensively used in epidemiological (mainly BIA) and clinical (mainly DXA) settings to evaluate BC. DXA is primarily used for the measurement of bone mineral content (BMC) and density to assess bone health and diagnose osteoporosis in defined anatomical regions (femur and spine). However, total body DXA scans can also be used to derive a three-compartment BC model comprising BMC, FM, and FFM. Both methods have limitations: the accuracy of BIA measurements is reduced when specific predictive equations and standardized measurement protocols are not used, whereas the limitations of DXA are concerns about the safety of repeated measurements (no more than two body scans per year are currently advised), cost, and the technical expertise required. This review aims to provide useful insights into the use of BC methods in prevention and clinical practice (ambulatory or bedridden patients). We believe that it will stimulate discussion on the topic and reinvigorate the crucial role of BC evaluation in diagnostic and clinical investigation protocols

    Molecular approaches in the diagnosis of sepsis in neutropenic patients with haematological malignancies

    Introduction. Sepsis is a major cause of morbidity and mortality in neutropenic patients. Blood culture remains the gold standard in the microbiological diagnosis of bacterial or fungal bloodstream infections, but it has clear limits of rapidity and sensitivity. The objective of the study was to compare a real-time polymerase chain reaction (RT-PCR) assay with the automated blood culture (BC) method for the detection of pathogens in whole blood of febrile neutropenic patients with hematological malignancies. Methods. A total of 166 consecutive febrile neutropenic patients were enrolled. Blood samples for cultures and SeptiFast testing were obtained at the onset of fever, before the implementation of empirical antibiotic therapy. Results. Forty (24.1%) of the 166 blood samples tested were positive by at least one method. Twenty-three (13.9%) samples were positive by blood culture and 38 (22.9%) by multiplex real-time PCR. The analysis of concordance evidenced a low correlation between the two methods (n = 21; 52.5%), mainly due to samples found negative by culture but positive with the SeptiFast assay. Sensitivity, specificity, and positive and negative predictive values of RT-PCR were 91.3%, 88.1%, 55.3%, and 98.4%, respectively, compared with BC. Discussion. The multiplex real-time PCR assay improved detection of most of the bacteria associated with febrile neutropenia episodes. Further studies are needed to assess the real advantages and clinical benefits that molecular biology tests can add to the diagnosis of sepsis. The full article is freely available on www.jpmh.or
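The reported accuracy figures follow directly from a 2×2 table against blood culture. Reconstructing the counts implied by the abstract (21 concordant positives out of 23 culture-positive samples, 38 PCR-positive overall, 166 total) reproduces them exactly:

```python
# 2x2 table of SeptiFast RT-PCR vs. blood culture (reference standard),
# with counts reconstructed from the abstract:
# 166 samples, 23 BC-positive, 38 PCR-positive, 21 positive by both.
tp = 21                    # PCR+ / BC+
fp = 38 - tp               # PCR+ / BC-  = 17
fn = 23 - tp               # PCR- / BC+  = 2
tn = 166 - (tp + fp + fn)  # PCR- / BC-  = 126

sensitivity = tp / (tp + fn)  # 21/23
specificity = tn / (tn + fp)  # 126/143
ppv = tp / (tp + fp)          # 21/38
npv = tn / (tn + fn)          # 126/128
print(f"{sensitivity:.1%} {specificity:.1%} {ppv:.1%} {npv:.1%}")
# -> 91.3% 88.1% 55.3% 98.4%
```

The high negative predictive value (98.4%) is what makes a negative PCR result clinically useful for ruling out bloodstream infection, while the modest PPV reflects PCR positives without culture confirmation.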

    Nutritional Screening and Anthropometry in Patients Admitted From the Emergency Department

    Background: Due to the high prevalence of malnutrition among hospitalized patients, screening and assessment of nutritional status should be routinely performed upon hospital admission. The main objective of this observational study was to evaluate the prevalence of and risk for malnutrition, as identified by three nutritional screening tests, and to observe whether some anthropometric and functional parameters used for nutritional evaluation were related to these test scores. Methods: This single-center observational study included 207 patients admitted from the emergency department for hospitalization in either the internal medicine or surgery units of our institution from September 2017 to December 2018. The prevalence of malnutrition in this patient sample was evaluated using the Nutritional Risk Screening (NRS-2002), the Subjective Global Assessment (SGA) and the Global Leadership Initiative on Malnutrition (GLIM) criteria. Body mass index (BMI), bioimpedance analysis (BIA), handgrip strength (HGS) and calf circumference (CC) assessments were also performed. Results: According to the NRS-2002, 93% of the patients were at no or low nutritional risk (NRS score < 3), and 7% were at high nutritional risk (NRS score ≥ 3). According to the SGA, 46.3% of the patients were well-nourished (SGA-a), 49.8% were moderately malnourished (SGA-b), and 3.9% were severely malnourished (SGA-c). Finally, according to the GLIM criteria, 18% of the patients were malnourished. Body weight, BMI, phase angle (PhA), CC and HGS were significantly lower in patients with NRS scores ≥ 3, in those with SGA-c, and in patients with stage 1 or stage 2 malnutrition according to the GLIM criteria. Conclusion: The NRS-2002, the SGA and the GLIM criteria appear to be valuable tools for the screening and assessment of nutritional status. In particular, the worst NRS-2002, SGA and GLIM categories were associated with the lowest PhA and CC. Nevertheless, a weekly re-evaluation of patients with better screening and assessment scores is recommended to facilitate early detection of changes in nutritional status