38 research outputs found

    Using Community-Based Participatory Research to Investigate the Effectiveness of HIV/AIDS Risk Reduction Counseling in an Urban African-American Community

    Introduction: Risk reduction counseling is an important component of HIV/AIDS prevention. Community-based participatory research (CBPR) was conducted to determine whether a single counseling session was as effective as a two-session intervention in reducing risk behavior. Methods: Community and academic investigators jointly developed the study design. A convenience sample of 242 persons was randomized to receive either a two-session intervention with Conventional HIV Testing (CHT) or a one-session intervention with HIV Rapid Testing (HRT). Participants completed a risk assessment immediately preceding the test and a risk reduction plan after the test; CHT participants received a second risk reduction session. Results: Of 130 participants completing a one-month follow-up, 86.9% were African American and 72.3% were male. All participants demonstrated a significant decrease in risk behaviors regardless of procedure. Conclusions: Findings suggested that a brief client-centered risk reduction counseling intervention can be equally effective with either CHT or HRT. CBPR allowed the academic partner to answer study questions while the community agency received information to make informed decisions during a transition period from CHT to HRT.

    Efficacy of therapeutic plasma exchange on angiotensin II type‐1 receptor antibodies on two kidney transplant recipients

    Background: Angiotensin II type-1 receptor antibody (AT1RAb) has been reported to cause antibody-mediated rejection (AMR) in kidney transplant recipients, possibly through contraction of the renal arteries. Here we report 2 kidney transplant recipients with elevated AT1RAbs and negative HLA donor-specific antibodies (DSA) and anti-major histocompatibility complex class I chain-related gene A (MICA) Abs who received therapeutic plasma exchange (TPE) treatment followed by IVIG.
    Case 1: A 38-year-old patient received a second kidney transplant for end-stage renal disease (ESRD) with chronic rejection. Three years post-transplant, she developed AMR with an AT1RAb level >40 U/mL. She received 5 TPE treatments and AT1RAb decreased by 20%, and biopsy showed improvement of the AMR. She received another 3 TPE treatments and AT1RAb decreased by 60%. Her creatinine (Cr) stabilized at around 1.4 mg/dL.
    Case 2: A 24-year-old patient received a kidney transplant for ESRD of unclear etiology. Two weeks post-transplant, her Cr rose with an AT1RAb level of 18 U/mL, and biopsy showed possible AMR. She received 6 TPE treatments, AT1RAb decreased by 55%, and biopsy showed improvement of the AMR. She received weekly TPE for subsequently rising AT1RAb, but TPE was discontinued because the AT1RAb level failed to decrease. Her Cr stabilized at around 1.7 mg/dL.
    Conclusion: We reported 2 patients who received TPE treatments to decrease AT1RAbs. A course of TPE successfully decreased AT1RAb, histological improvement was observed quickly, and Cr stabilized following treatment. Further study is necessary to determine the optimal use of TPE in renal transplant recipients with AT1RAbs.
    Peer Reviewed

    The temporal and long‐term impact of donor body mass index on recipient outcomes after kidney transplantation – a retrospective study

    Peer Reviewed

    Access and Outcomes Among Minority Transplant Patients, 1999–2008, with a Focus on Determinants of Kidney Graft Survival

    Coincident with an increasing national interest in equitable health care, a number of studies have described disparities in access to solid organ transplantation for minority patients. In contrast, relatively little is known about differences in posttransplant outcomes between patients of specific racial and ethnic populations. In this paper, we review trends in access to solid organ transplantation and posttransplant outcomes by organ type, race and ethnicity. In addition, we present an analysis of categories of factors that contribute to the racial/ethnic variation seen in kidney transplant outcomes. Disparities in minority access to transplantation among wait-listed candidates are improving, but persist for those awaiting kidney, simultaneous kidney and pancreas and intestine transplantation. In general, graft and patient survival among recipients of solid organ transplants is highest for Asians and Hispanic/Latinos, intermediate for whites and lowest for African Americans. Although much of the difference in outcomes between racial/ethnic groups can be accounted for by adjusting for patient characteristics, important observed differences remain. Age and duration of pretransplant dialysis exposure emerge as the most important determinants of survival in an investigation of the relative impact of center-related versus patient-related variables on kidney graft outcomes.
    Peer Reviewed

    Depressive symptoms, frailty, and adverse outcomes among kidney transplant recipients

    Depressive symptoms and frailty are each independently associated with morbidity and mortality in kidney transplant (KT) recipients. We hypothesized that having both depressive symptoms and frailty would be synergistic and worse than the independent effect of each. In a multicenter cohort study of 773 KT recipients, we measured the Fried frailty phenotype and the modified 18-question Center for Epidemiologic Studies–Depression Scale (CES-D). Using adjusted Poisson regression and survival analysis, we tested whether depressive symptoms (CES-D score > 14) and frailty were associated with KT length of stay (LOS), death-censored graft failure (DCGF), and mortality. At KT admission, 10.0% of patients exhibited depressive symptoms, 16.3% were frail, and 3.6% had both. Recipients with depressive symptoms were more likely to be frail (aOR = 3.97, 95% CI: 2.28–6.91, P < 0.001). Recipients with both depressive symptoms and frailty had a 1.88 times (95% CI: 1.70–2.08, P < 0.001) longer LOS, 6.20-fold (95% CI: 1.67–22.95, P < 0.01) increased risk of DCGF, and 2.62-fold (95% CI: 1.03–6.70, P = 0.04) increased risk of mortality, compared to those who were nonfrail and without depressive symptoms. There was only evidence of a synergistic effect of frailty and depressive symptoms on length of stay (P for interaction < 0.001). Interventions aimed at reducing pre-KT depressive symptoms and frailty should be explored for their impact on post-KT outcomes.
    Peer Reviewed
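    The "adjusted Poisson regression" used in the abstract above refers to the common modified-Poisson approach for estimating relative risks of a binary outcome (a Poisson working model with a robust sandwich variance). The sketch below illustrates that technique only; the data are simulated and all variable names are illustrative assumptions, not the study's actual data or code.

```python
import numpy as np

# Simulated cohort; n matches the abstract's cohort size but the data are fake
rng = np.random.default_rng(0)
n = 773
frail = rng.binomial(1, 0.16, n)
depressed = rng.binomial(1, 0.10, n)
p = 0.05 * np.exp(0.5 * frail + 0.4 * depressed)  # true risk model
y = rng.binomial(1, p)                            # binary outcome

X = np.column_stack([np.ones(n), frail, depressed])

# Fit the Poisson working model by iteratively reweighted least squares
beta = np.zeros(X.shape[1])
for _ in range(50):
    mu = np.exp(X @ beta)
    score = X.T @ (y - mu)
    info = X.T @ (X * mu[:, None])
    step = np.linalg.solve(info, score)
    beta += step
    if np.max(np.abs(step)) < 1e-10:
        break

# Robust (sandwich) variance: A^-1 B A^-1, needed because the outcome is
# binary, not Poisson-distributed
mu = np.exp(X @ beta)
A = X.T @ (X * mu[:, None])
B = X.T @ (X * ((y - mu) ** 2)[:, None])
Ainv = np.linalg.inv(A)
cov = Ainv @ B @ Ainv

rr = np.exp(beta[1:])            # exponentiated coefficients = relative risks
se = np.sqrt(np.diag(cov))[1:]   # robust standard errors (log scale)
print("RR (frailty, depression):", rr)
```

    Exponentiating the non-intercept coefficients yields relative risks directly, which is why this working model is preferred over logistic regression when the outcome is common and risk ratios are the target estimand.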

    Racial differences in inflammation and outcomes of aging among kidney transplant candidates

    Background: Inflammation is more common among African Americans (AAs), and it is associated with frailty, poor physical performance, and mortality in community-dwelling older adults. Given the elevated inflammation levels among end-stage renal disease (ESRD) patients, inflammation may be associated with adverse health outcomes such as frailty, physical impairment, and poor health-related quality of life (HRQOL), and these associations may differ between AA and non-AA ESRD patients. Methods: One thousand three ESRD participants were recruited at kidney transplant evaluation (4/2014–5/2017), and inflammatory markers (interleukin-6 [IL-6], tumor necrosis factor-α receptor-1 [TNFR1], C-reactive protein [CRP]) were measured. We quantified the association with frailty (Fried phenotype), physical impairment (Short Physical Performance Battery [SPPB]), and fair/poor HRQOL at evaluation using adjusted modified Poisson regression and tested whether these associations differed by race (AA vs. non-AA). Results: Non-AAs had lower levels of TNFR1 (9.7 ng/ml vs 14.0 ng/ml, p  0.9) and CRP (4.7 μg/ml vs 4.9 μg/ml, p = 0.4). Non-AAs had an increased risk of frailty with elevated IL-6 (RR = 1.58, 95% CI: 1.27–1.96, p < 0.001), TNFR1 (RR = 1.60, 95% CI: 1.25–2.05, p < 0.001), CRP (RR = 1.41, 95% CI: 1.10–1.82, p < 0.01), and inflammatory index (RR = 1.82, 95% CI: 1.44–2.31, p < 0.001). The associations between elevated inflammatory markers and frailty were not present among AAs. Similar results were seen with SPPB impairment and poor/fair HRQOL. Conclusions: Non-AAs with elevated inflammatory markers may need closer follow-up and may benefit from prehabilitation to improve physical function, reduce frailty burden, and improve quality of life prior to transplant.

    Newly identified climatically and environmentally significant high-latitude dust sources

    Dust particles from high latitudes have a potentially large local, regional, and global significance for climate and the environment as short-lived climate forcers, air pollutants, and nutrient sources. Identifying the locations of local dust sources and their emission, transport, and deposition processes is important for understanding the multiple impacts of high-latitude dust (HLD) on the Earth's systems. Here, we identify, describe, and quantify source intensity (SI) values, which express the potential of soil surfaces for dust emission scaled from 0 to 1 relative to the most productive sources globally, using the Global Sand and Dust Storms Source Base Map (G-SDS-SBM). Our collection includes 64 HLD sources in the northern (Alaska, Canada, Denmark, Greenland, Iceland, Svalbard, Sweden, and Russia) and southern (Antarctica and Patagonia) high latitudes. Activity from most of these HLD sources shows a seasonal character. It is estimated that high-latitude land areas with high (SI ≥ 0.5), very high (SI ≥ 0.7), and the highest potential (SI ≥ 0.9) for dust emission cover >1 670 000 km², >560 000 km², and >240 000 km², respectively. In the Arctic HLD region (≥60° N), the land area with SI ≥ 0.5 is 5.5 % (1 035 059 km²), the area with SI ≥ 0.7 is 2.3 % (440 804 km²), and the area with SI ≥ 0.9 is 1.1 % (208 701 km²). Minimum SI values in the northern HLD region are about 3 orders of magnitude smaller, indicating that the dust sources of this region depend greatly on weather conditions.
    Our spatial dust source distribution analysis and modeling results showed evidence supporting a northern HLD belt, defined as the area north of 50° N, with a "transitional HLD-source area" extending across latitudes 50–58° N in Eurasia and 50–55° N in Canada and a "cold HLD-source area" including areas north of 60° N in Eurasia and north of 58° N in Canada, with a currently "no dust source" area between the HLD and low-latitude dust (LLD) belts, except for British Columbia. Using the global atmospheric transport model SILAM, we estimated that 1.0 % of the global dust emission originated from the high-latitude regions. About 57 % of the dust deposition in snow- and ice-covered Arctic regions was from HLD sources. In the southern HLD region, soil surface conditions are favorable for dust emission throughout the year. Climate change can cause a decrease in the duration of snow cover, retreat of glaciers, and an increase in drought, heatwave intensity, and frequency, leading to an increasing frequency of topsoil conditions favorable for dust emission, which increases the probability of dust storms. Our study provides a step forward in improving the representation of HLD in models and in monitoring, quantifying, and assessing the environmental and climate significance of HLD.
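    The area statistics quoted above (land area and percentage above each SI threshold) are the kind of quantity obtained by summing latitude-weighted grid-cell areas over a gridded SI map. The sketch below illustrates that arithmetic only, using a made-up 1° SI field; the real values come from the G-SDS-SBM, which is not reproduced here.

```python
import numpy as np

# Hypothetical 1°-resolution SI grid for latitudes >= 60° N (fake values)
rng = np.random.default_rng(1)
lats = np.arange(60.5, 90.0, 1.0)               # cell-center latitudes
si = rng.beta(0.2, 2.0, size=(lats.size, 360))  # illustrative SI in [0, 1]

# Approximate spherical cell areas: A = R^2 * dlon * dlat * cos(lat)
R = 6371.0  # Earth radius, km
dlam = dphi = np.deg2rad(1.0)
cell_area = (R ** 2) * dlam * dphi * np.cos(np.deg2rad(lats))
area = np.broadcast_to(cell_area[:, None], si.shape)

# Fraction of regional land area above each SI threshold
total = area.sum()
fracs = []
for thr in (0.5, 0.7, 0.9):
    frac = area[si >= thr].sum() / total
    fracs.append(frac)
    print(f"SI >= {thr}: {frac:.2%} of the region's area")
```

    Weighting by cos(latitude) matters at high latitudes, where a 1°×1° cell near 85° N covers roughly a tenth of the area of one near 60° N.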