
    Impact of Chronic Kidney Disease on the Presence and Severity of Aortic Stenosis in Patients at High Risk for Coronary Artery Disease

    Objective: We evaluated the impact of chronic kidney disease (CKD) on the presence and severity of aortic stenosis (AS) in patients at high risk for coronary artery disease (CAD). Methods: One hundred twenty consecutive patients who underwent invasive coronary angiography were enrolled. Aortic valve area (AVA) was calculated by the continuity equation using transthoracic echocardiography and was normalized by body surface area (AVA index). Results: Among all 120 patients, 78% had CAD, 55% had CKD (stage 3: 81%; stage 4: 19%), and 34% had AS (AVA < 2.0 cm²). Patients with AS were older, more often female, and had a higher frequency of CKD than those without AS, but the prevalence of CAD and most other coexisting conventional risk factors was similar between patients with and without AS. Multivariate linear regression analysis indicated that only CKD and CAD were independent determinants of the AVA index, with standardized coefficients of -0.37 and -0.28, respectively. When patients were divided into three groups (group 1: neither CKD nor CAD, n = 16; group 2: either CKD or CAD, n = 51; group 3: both CKD and CAD, n = 53), group 3 had the smallest AVA index (1.19 ± 0.30 cm²/m²; p < 0.05 vs. group 1: 1.65 ± 0.32 cm²/m², and p < 0.05 vs. group 2: 1.43 ± 0.29 cm²/m²) and the highest peak velocity across the aortic valve (1.53 ± 0.41 m/sec; p < 0.05 vs. group 1: 1.28 ± 0.29 m/sec, and p < 0.05 vs. group 2: 1.35 ± 0.27 m/sec). Conclusion: CKD, even pre-stage 5 CKD, has a more powerful impact on the presence and severity of AS than other conventional risk factors for atherosclerosis in patients at high risk for CAD.
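    The continuity-equation calculation named in the Methods can be sketched as follows. The circular-LVOT cross-section and the DuBois body-surface-area formula are common conventions, but they are assumptions here; the abstract does not state which variants the authors used.

    ```python
    import math

    def ava_continuity(lvot_diam_cm, vti_lvot_cm, vti_av_cm):
        """Aortic valve area (cm^2) by the continuity equation:
        AVA = CSA_LVOT * VTI_LVOT / VTI_AV, treating the left
        ventricular outflow tract (LVOT) as a circle of the
        measured diameter."""
        csa_lvot = math.pi * (lvot_diam_cm / 2.0) ** 2
        return csa_lvot * vti_lvot_cm / vti_av_cm

    def bsa_dubois(height_cm, weight_kg):
        """Body surface area (m^2) by the DuBois formula -- one
        common choice; the abstract does not specify the formula."""
        return 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425

    def ava_index(ava_cm2, bsa_m2):
        """AVA normalized by body surface area (cm^2/m^2)."""
        return ava_cm2 / bsa_m2
    ```

    With equal LVOT and transvalvular velocity-time integrals the valve area reduces to the LVOT area itself, which is a quick sanity check on the formula.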

    Carotid artery calcification at the initiation of hemodialysis is a risk factor for cardiovascular events in patients with end-stage renal disease: a cohort study

    Background: Vascular calcification is a recognized risk factor for cardiovascular (CV) events in patients with end-stage renal disease (ESRD), but the association of carotid artery calcification (CAAC) with CV events remains unknown. The aim of this study was to elucidate whether CAAC is associated with composite CV events in ESRD patients. Methods: One hundred thirty-three patients who started hemodialysis between 2004 and 2008 were included in this retrospective cohort study. These patients underwent multi-detector computed tomography to assess CAAC at the initiation of hemodialysis. Composite CV events, including ischemic heart disease, heart failure, cerebrovascular diseases, and CV deaths after the initiation of hemodialysis, were examined in each patient. Results: CAAC was found in 94 patients (71%). At the end of follow-up, composite CV events had occurred in 47 patients: ischemic heart disease in 20, heart failure in 8, cerebrovascular disease in 12, and CV death in 7. The incidence of CAAC was 87% in patients with CV events, significantly higher than the 62% in those without. Kaplan-Meier analysis showed a significant increase in composite CV events in patients with CAAC compared with those without (p = 0.001, log-rank test). Univariate analysis using a Cox hazards model showed that age, smoking, common carotid artery intima-media thickness, and CAAC were risk factors for composite CV events. In multivariate analysis, only CAAC remained a significant risk factor (hazard ratio, 2.85; 95% confidence interval, 1.18-8.00; p = 0.02). Conclusions: CAAC is an independent risk factor for CV events in ESRD patients, and its assessment at the initiation of hemodialysis is useful for predicting prognosis.
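    The event-free survival comparison above rests on the Kaplan-Meier estimator. A minimal sketch of how the survival curve is built from (time, event) pairs follows; the data in the usage check are hypothetical, not taken from the study.

    ```python
    def kaplan_meier(times, events):
        """Kaplan-Meier survival estimator.
        times: follow-up time per patient; events: 1 = event, 0 = censored.
        Returns a list of (event_time, S(t)) points: at each distinct
        event time the survival probability is multiplied by
        (1 - deaths / number_at_risk)."""
        data = sorted(zip(times, events))
        n_at_risk = len(data)
        surv = 1.0
        curve = []
        i = 0
        while i < len(data):
            t = data[i][0]
            deaths = sum(e for tt, e in data if tt == t)
            leaving = sum(1 for tt, _ in data if tt == t)
            if deaths:
                surv *= 1.0 - deaths / n_at_risk
                curve.append((t, surv))
            n_at_risk -= leaving  # events and censorings both leave the risk set
            i += leaving
        return curve
    ```

    Comparing two such curves (e.g. with vs. without CAAC) is what the log-rank test quoted in the Results formalizes.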

    The Role of Protected Areas in the Avoidance of Anthropogenic Conversion in a High-Pressure Region: A Matching Method Analysis in the Core Region of the Brazilian Cerrado

    Global efforts to avoid anthropogenic conversion of natural habitat rely heavily on the establishment of protected areas. Studies that evaluate the effectiveness of these areas with a focus on preserving natural habitat define effectiveness as a measure of the influence of protected areas on total avoided conversion. Differences in estimated effectiveness are related to local and regional conditions, evaluation methods, the restriction categories of the protected areas, and other characteristics. The overall objective of this study was to evaluate the effectiveness of protected areas in preventing the conversion of natural areas in the core region of Brazil's Cerrado Biome, taking into account the influence of the restriction degree, governmental sphere, time since establishment, and size of the protected area units on their performance. The evaluation was conducted using matching methods and addressed two fundamental issues: (1) control of statistical bias caused by the influence of covariates on the likelihood of anthropogenic conversion and by the non-random allocation of protected areas across the territory (spatial correlation effect); and (2) control of statistical bias caused by auto-correlation and leakage effects. A sample design that does not control these biases may underestimate or overestimate the effectiveness of the units. The matching method accounted for a bias reduction of 94-99% in the estimation of the average effect of protected areas on anthropogenic conversion and allowed us to obtain results with a reduced influence of the auto-correlation and leakage effects. Most protected areas had a positive influence on the maintenance of natural habitats, although this effectiveness varied widely with the type, restriction degree, governmental sphere, size, and age group of the unit.
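    The matching logic described above can be illustrated with a minimal 1:1 nearest-neighbour sketch on a single covariate score (e.g. a propensity score). The without-replacement rule, the single-score covariate, and all names here are illustrative assumptions, not the study's exact protocol.

    ```python
    def nearest_neighbor_match(treated, controls):
        """1:1 nearest-neighbour matching without replacement.
        treated, controls: lists of covariate scores (e.g. propensity
        scores) for protected and unprotected sample cells.
        Returns (treated_index, matched_control_index) pairs."""
        available = dict(enumerate(controls))
        pairs = []
        for ti, t_score in enumerate(treated):
            # pick the still-unmatched control closest in score
            ci = min(available, key=lambda c: abs(available[c] - t_score))
            pairs.append((ti, ci))
            del available[ci]
        return pairs

    def att(treated_outcomes, control_outcomes, pairs):
        """Average treatment effect on the treated: mean outcome
        difference across matched pairs (here, e.g., difference in
        conversion rates between protected cells and their matches)."""
        diffs = [treated_outcomes[t] - control_outcomes[c] for t, c in pairs]
        return sum(diffs) / len(diffs)
    ```

    Matching on covariates before comparing outcomes is what delivers the bias reduction the paragraph reports: treated and control cells are compared only where their conversion-relevant covariates are similar.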

    Wildfires in Bamboo-Dominated Amazonian Forest: Impacts on Above-Ground Biomass and Biodiversity

    Fire has become an increasingly important disturbance event in south-western Amazonia. We conducted the first assessment of the ecological impacts of these wildfires in 2008, sampling forest structure and biodiversity along twelve 500 m transects in the Chico Mendes Extractive Reserve, Acre, Brazil. Six transects were placed in unburned forests and six in forests that burned during a series of forest fires from August to October 2005. Normalized Burn Ratio (NBR) calculations, based on Landsat reflectance data, indicate that all transects were similar prior to the fires. We sampled understorey and canopy vegetation, birds (using both mist nets and point counts), coprophagous dung beetles, and the leaf-litter ant fauna. Fire had limited influence on faunal and floral species richness and community structure; stems <10 cm DBH were the only group to show highly significant (p = 0.001) community turnover in burned forests. Mean above-ground live biomass was statistically indistinguishable between the unburned and burned plots, although there was a significant increase in the total abundance of dead stems in burned plots. Comparisons with previous studies suggest that wildfires had much less effect on forest structure and biodiversity in these south-western Amazonian forests than in central and eastern Amazonia, where most fire research has been undertaken to date. We discuss potential reasons for the apparent greater resilience of our study plots to wildfire, examining the role of fire intensity, bamboo dominance, background rates of disturbance, landscape, and soil conditions.
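    The NBR comparison mentioned above uses a standard reflectance index; a minimal sketch follows. The formula is the standard one, but the band assignment is an assumption on my part (for Landsat 5 TM, the sensor of the 2005-era scenes, NIR is band 4 and SWIR is band 7); the abstract does not list the bands used.

    ```python
    def nbr(nir, swir):
        """Normalized Burn Ratio from surface reflectance:
        NBR = (NIR - SWIR) / (NIR + SWIR).
        Healthy vegetation reflects strongly in NIR, burned surfaces
        in SWIR, so lower NBR suggests more burning."""
        return (nir - swir) / (nir + swir)

    def dnbr(pre_fire_nbr, post_fire_nbr):
        """Differenced NBR between pre- and post-fire scenes;
        larger positive values indicate more severe burning."""
        return pre_fire_nbr - post_fire_nbr
    ```

    Comparing pre-fire NBR across transects, as the study does, checks that burned and unburned sites were spectrally similar before the fires, so post-fire differences can be attributed to burning.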