Immunological Monitoring of Renal Transplant Recipients to Predict Acute Allograft Rejection Following the Discontinuation of Tacrolimus
BACKGROUND: Transplant patients would benefit from a reduction of immunosuppression, provided that graft rejection is prevented. We evaluated a number of immunological markers in the blood of patients in whom tacrolimus was withdrawn after renal transplantation. The alloreactive precursor frequencies of CD4+ and CD8+ T cells, the frequencies of T cell subsets, and the functional capacity of CD4+CD25+FoxP3+ regulatory T cells (Treg) were analyzed before transplantation and before tacrolimus reduction. In a case-control design, the results were compared between patients with (n = 15) and without (n = 28) acute rejection after tacrolimus withdrawal. PRINCIPAL FINDINGS: Prior to tacrolimus reduction, the ratio between memory CD8+ T cells and Treg was higher in rejectors than in non-rejectors. Rejectors also had a higher ratio between memory CD4+ T cells and Treg, and ratios <20 were observed only in non-rejectors. Between the time of transplantation and the start of tacrolimus withdrawal, an increase in naive T cell frequencies and a reciprocal decrease in effector T cell percentages were observed in rejectors. The proportion of Treg within the CD4+ T cells decreased after transplantation, but the anti-donor regulatory capacity of Treg remained unaltered in both rejectors and non-rejectors. CONCLUSIONS: Immunological monitoring revealed an association between acute rejection following the withdrawal of tacrolimus and 1) the ratio of memory T cells to Treg prior to the start of tacrolimus reduction, and 2) changes in the distribution of naive, effector, and memory T cells over time. The combination of these two biomarkers allowed highly specific identification of patients in whom immunosuppression could be safely reduced.
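As an illustration of how the first biomarker might be screened, the minimal sketch below (Python; the function and variable names are our own, not the study's, and the <20 cut-off reflects the reported observation that ratios below 20 occurred only in non-rejectors) computes the memory-T-cell/Treg ratio from flow-cytometry frequencies:

```python
def memory_treg_ratio(pct_memory_t: float, pct_treg: float) -> float:
    """Ratio of memory T cell frequency to Treg frequency (both expressed
    on the same denominator, e.g. as percentages of CD4+ cells)."""
    if pct_treg <= 0:
        raise ValueError("Treg frequency must be positive")
    return pct_memory_t / pct_treg

# In this cohort, memory-CD4+/Treg ratios < 20 before tacrolimus reduction
# were seen only in patients who did not reject after withdrawal.
print(memory_treg_ratio(pct_memory_t=45.0, pct_treg=3.0))  # 15.0 -> below 20
```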
Prognosis and serum creatinine levels in acute renal failure at the time of nephrology consultation: an observational cohort study
The aim of this study was to evaluate the association between acute serum creatinine changes in acute renal failure (ARF), before specialized treatment begins, and in-hospital mortality, recovery of renal function, and overall mortality at 6 months, at an equal degree of ARF severity as defined by the RIFLE criteria and comorbid illnesses. METHODS: Prospective cohort study of 1008 consecutive patients diagnosed with ARF and admitted to a university-affiliated hospital over 10 years. Demographic and clinical information and outcomes were recorded. Of these, 646 patients whose increase in serum creatinine was sufficient to qualify for the RIFLE criteria were included in the subsequent analysis. The population was divided into two groups using the median serum creatinine change (101%) as the cut-off value. Multivariate non-conditional logistic and linear regression models were used. RESULTS: An increase in creatinine of ≥101% with respect to baseline before nephrology consultation was associated with a significant increase in in-hospital mortality (35.6% vs. 22.6%, p < 0.001), with an adjusted odds ratio of 1.81 (95% CI: 1.08-3.03). Among patients who required continuous renal replacement therapy, the ≥101% increment group also had higher in-hospital mortality (62.7% vs. 46.4%, p = 0.048), with an adjusted odds ratio of 2.66 (95% CI: 1.00-7.21). Patients in the ≥101% increment group had a higher mean serum creatinine level with respect to their baseline level at hospital discharge (114.72% vs. 37.96%), an adjusted 48.92% (95% CI: 13.05-84.79) more than in the <101% increment group. CONCLUSION: In this cohort, patients who presented an increase in serum creatinine of ≥101% with respect to baseline values at the time of nephrology consultation had increased mortality rates and were discharged from hospital with more deteriorated renal function than those with a similar Liaño score and the same RIFLE class but a <101% increment. This finding may provide more information about the factors involved in the prognosis of ARF. Furthermore, the calculation of the relative serum creatinine increase could be used as a practical tool to identify patients at risk who would benefit from intensive therapy.
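The practical tool proposed in the conclusion amounts to a simple calculation. A minimal sketch in Python (the function names are illustrative; the 101% cut-off is the study's median creatinine change, not a general clinical threshold):

```python
def relative_creatinine_increase(baseline_mg_dl: float, current_mg_dl: float) -> float:
    """Percent increase in serum creatinine over the patient's baseline."""
    return (current_mg_dl - baseline_mg_dl) / baseline_mg_dl * 100.0

def high_risk(baseline_mg_dl: float, current_mg_dl: float,
              cutoff_pct: float = 101.0) -> bool:
    """Flag patients at or above the study's median change (>=101%), the
    group with higher in-hospital mortality in this cohort."""
    return relative_creatinine_increase(baseline_mg_dl, current_mg_dl) >= cutoff_pct

# Example: baseline 1.0 mg/dL rising to 2.1 mg/dL is a 110% increase.
print(relative_creatinine_increase(1.0, 2.1))  # 110.0
print(high_risk(1.0, 2.1))                     # True
```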
Progression of kidney disease in type 2 diabetes – beyond blood pressure control: an observational study
BACKGROUND: The risk factors for progression of chronic kidney disease (CKD) in type 2 diabetes mellitus (DM) have not been fully elucidated. Although uncontrolled blood pressure (BP) is known to be deleterious, other factors may become more important once BP is treated. METHODS: All patients seen in the outpatient clinics of our hospital between January 1993 and September 2002 with type 2 DM and clinical evidence of CKD were evaluated. Progression of kidney disease was evaluated by the rate of decline of the glomerular filtration rate (GFR) as estimated from the simplified MDRD formula. Variables associated with progression in univariate analyses were examined by multivariate analysis to determine the factors independently associated with kidney disease progression. RESULTS: 343 patients (mean age 69 years; all male; 77% Caucasian) were studied. Mean BP, glycated hemoglobin, and serum cholesterol during the study period were 138/72 mmHg, 8.1%, and 4.8 mmol/L, respectively. The mean decline in GFR was 4.5 mL/min/1.73 m² per year (range −14 to +32). Low initial serum albumin (p < 0.001), black race (p < 0.001), and degree of proteinuria (p = 0.002), but not blood pressure, glycated hemoglobin, or serum cholesterol, were independently associated with progression. CONCLUSION: In a cohort of diabetic patients with CKD in whom mean BP was < 140/80 mmHg, the potentially remediable factors hypoalbuminemia and proteinuria, but not blood pressure, were independently associated with progression of kidney disease. Further understanding of the relationship between these factors and kidney disease progression may lead to beneficial therapies in such patients.
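Progression here is the slope of eGFR values from the simplified (four-variable) MDRD equation. The sketch below uses the original 186-coefficient form of that equation (serum creatinine in mg/dL); fitting a per-patient least-squares slope is our assumption about how a decline rate of this kind might be derived, not the study's stated procedure:

```python
import numpy as np

def egfr_mdrd(scr_mg_dl: float, age_yr: float, female: bool, black: bool) -> float:
    """Simplified (4-variable) MDRD estimate, in mL/min/1.73 m^2."""
    egfr = 186.0 * scr_mg_dl ** -1.154 * age_yr ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def gfr_decline_rate(times_yr, scr_values, age_yr, female, black):
    """Least-squares slope of eGFR over time (mL/min/1.73 m^2 per year);
    a negative slope indicates progression."""
    egfrs = [egfr_mdrd(s, age_yr + t, female, black)
             for t, s in zip(times_yr, scr_values)]
    slope, _ = np.polyfit(times_yr, egfrs, 1)
    return slope

# Example: creatinine rising from 1.4 to 2.0 mg/dL over four years in a
# 69-year-old non-black man yields a negative eGFR slope (progression).
print(gfr_decline_rate([0, 1, 2, 3], [1.4, 1.6, 1.8, 2.0], 69, False, False))
```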
Bringing the real world into the fMRI scanner: Repetition effects for pictures versus real objects
Our understanding of the neural underpinnings of perception is largely built upon studies employing 2-dimensional (2D) planar images. Here we used slow event-related functional imaging in humans to examine whether neural populations show a characteristic repetition-related change in haemodynamic response for real-world 3-dimensional (3D) objects, an effect commonly observed using 2D images. As expected, trials involving 2D pictures of objects produced robust repetition effects within classic object-selective cortical regions along the ventral and dorsal visual processing streams. Surprisingly, however, repetition effects were weak, if not absent, on trials involving the 3D objects. These results suggest that the neural mechanisms involved in processing real objects may be distinct from those engaged when we encounter a 2D representation of the same items. These preliminary findings point to the need for further research with ecologically valid stimuli in other imaging designs to broaden our understanding of the neural mechanisms underlying human vision.
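A repetition effect of the kind measured here is conventionally quantified as the attenuation of the response to a repeated relative to an initial presentation. A minimal sketch, assuming per-condition response amplitudes (e.g. peak percent signal change or beta estimates) have already been extracted from a region of interest; the function name and example values are hypothetical:

```python
def repetition_suppression_index(initial: float, repeated: float) -> float:
    """Proportional reduction in response amplitude for repeated relative to
    initial presentations; positive values indicate repetition suppression."""
    return (initial - repeated) / initial

# Hypothetical ROI amplitudes: a robust effect for 2D pictures, little
# change for real 3D objects, matching the pattern reported in this study.
print(repetition_suppression_index(initial=0.80, repeated=0.55))  # ~0.31
print(repetition_suppression_index(initial=0.80, repeated=0.78))  # ~0.03
```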
QCD and strongly coupled gauge theories: challenges and perspectives
We highlight the progress, current status, and open challenges of QCD-driven physics, in theory and in experiment. We discuss how the strong interaction is intimately connected to a broad sweep of physical problems, in settings ranging from astrophysics and cosmology to strongly coupled, complex systems in particle and condensed-matter physics, as well as to searches for physics beyond the Standard Model. We also discuss how success in describing the strong interaction impacts other fields, and, in turn, how such subjects can impact studies of the strong interaction. In the course of the work we offer a perspective on the many research streams which flow into and out of QCD, as well as a vision for future developments.
Can human amblyopia be treated in adulthood?
Amblyopia is a common visual disorder that results in a spatial acuity deficit in the affected eye. Orthodox treatment is to occlude the unaffected eye for lengthy periods, largely determined by the severity of the visual deficit at diagnosis. Although this treatment is not without its problems (poor compliance, the potential to reduce binocular function, etc.), it is effective in many children with moderate to severe amblyopia. Diagnosis and initiation of treatment early in life are thought to be critical to the success of this form of therapy. Occlusion is rarely undertaken in older children (more than 10 years old), as the visual benefits are considered to be marginal. Therefore, for subjects in whom occlusion is not effective, or those missed by mass screening programs, there is no alternative therapy available later in life. More recently, burgeoning evidence has begun to reveal previously unrecognized levels of residual neural plasticity in the adult brain, and scientists have developed new genetic, pharmacological, and behavioral interventions to activate these latent mechanisms in order to harness their potential for visual recovery. Prominent amongst these is the concept of perceptual learning: the finding that repeatedly practicing a challenging visual task leads to substantial and enduring improvements in visual performance over time. In the normal visual system the improvements are highly specific to the attributes of the trained stimulus. In the amblyopic visual system, however, learned improvements have been shown to generalize to novel tasks. In this paper we ask whether amblyopic deficits can be reduced in adulthood and explore the pattern of transfer of learned improvements. We also show that training protocols that target the deficit in stereoacuity allow the recovery of normal stereo function even in adulthood. This information will help guide the further development of learning-based interventions in this clinical group.