
    Protocol for the detection and nutritional management of high-output stomas

    Introduction: An issue of recent research interest is excessive stoma output and its relation to electrolyte abnormalities. Some studies have identified this as a precursor of dehydration and renal dysfunction. A prospective study was performed of the complications associated with high-output stomas, to identify their causes, consequences and management. Materials and methods: This study was carried out by a multidisciplinary team of surgeons, gastroenterologists, nutritionists and hospital pharmacists. High-output stoma (HOS) was defined as output ≥1500 ml for two consecutive days. The study population, 43 patients with a new permanent or temporary stoma, was classified according to the time of HOS onset as early HOS (<3 weeks after initial surgery) or late HOS (≥3 weeks after surgery). Circumstances permitting, a specific protocol for response to HOS was applied. Each patient was followed up until the fourth month after surgery. Results: Early HOS was observed in 7 (16 %) of the 43 hospital patients, and late HOS in 6 (16 %) of the 37 patients without early HOS. By type of stoma, nearly all HOS cases affected ileostomy, rather than colostomy, patients. The patients with early HOS remained in hospital for 18 days post surgery, significantly longer than those with no HOS (12 days). The protocol was applied to the majority of early HOS patients and achieved 100 % effectiveness. Half of the readmissions were due to altered electrolyte balance, and hypomagnesaemia was observed in 33 % of the late HOS patients. Conclusion: The protocol developed at our hospital for the detection and management of HOS effectively addresses possible long-term complications arising from poor nutritional status and chronic electrolyte alteration.
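
    The detection rule in this protocol (output ≥1500 ml on two consecutive days, with onset before vs. after three weeks post surgery distinguishing early from late HOS) can be sketched as a simple screening function. The function name, data layout and example values below are illustrative assumptions, not part of the published protocol:

```python
from datetime import date, timedelta

HOS_THRESHOLD_ML = 1500   # output >= 1500 ml ...
CONSECUTIVE_DAYS = 2      # ... on two consecutive days
EARLY_CUTOFF_DAYS = 21    # onset <3 weeks after surgery -> early HOS

def detect_hos(surgery_date, daily_outputs):
    """Return ('early'|'late', onset_date) for the first day on which the
    HOS definition is met, or None if it never is.

    daily_outputs: list of (date, ml) pairs sorted by date, one per day.
    """
    run = 0
    for day, ml in daily_outputs:
        run = run + 1 if ml >= HOS_THRESHOLD_ML else 0
        if run >= CONSECUTIVE_DAYS:
            days_post_op = (day - surgery_date).days
            kind = "early" if days_post_op < EARLY_CUTOFF_DAYS else "late"
            return kind, day
    return None

# Hypothetical example: output crosses the threshold on postoperative days 4 and 5
surgery = date(2023, 1, 1)
outputs = [(surgery + timedelta(days=d), ml)
           for d, ml in enumerate([800, 1200, 1400, 1600, 1700, 900], start=1)]
print(detect_hos(surgery, outputs))  # -> ('early', datetime.date(2023, 1, 6))
```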

    Successful renal re-transplantation in the presence of pre-existing anti-DQ5 antibodies when there was zero mismatch at class I human leukocyte antigen A, B, & C: a case report

    Introduction: Hyperacute rejection may be prevented by avoiding the transplantation of kidneys into patients with pre-existing anti-donor Class I human leukocyte antigen antibodies. However, the role of anti-donor Class II human leukocyte antigen DQ antibodies is not established. The question is ever more relevant as more sensitive cross-matching techniques detect many additional antibodies during the final crossmatch. We now report successful renal transplantation of a patient who had pre-existing antibodies against his donor's human leukocyte antigen-DQ5. Case presentation: Our patient, a Caucasian man, was 34 years of age when he received his first deceased-donor renal transplant. After 8 years, his first transplant failed from chronic allograft dysfunction and an earlier bout of Banff 1A cellular rejection. The second deceased-donor kidney transplant was initially allocated to the patient due to a 0 out of 6 mismatch. The B-cell crossmatch was mildly positive, while the T-cell crossmatch was negative. Subsequent assays showed that the patient had preformed antibodies for human leukocyte antigen DQ5 against his second donor. Despite having preformed antibodies against the donor, the patient continues to have excellent allograft function two years after his second renal transplant. Conclusion: The presence of pre-existing antibodies against human leukocyte antigen DQ5 does not preclude transplantation. The relevance of having other antibodies against class II human leukocyte antigens prior to transplantation remains to be studied.

    No neurocognitive advantage for immediate antiretroviral treatment in adults with greater than 500 CD4+ T-cell counts

    OBJECTIVE: To compare the effect of immediate versus deferred antiretroviral treatment (ART) on neuropsychological test performance in treatment-naive HIV-positive adults with >500 CD4+ cells/μL. DESIGN: Randomized trial. METHODS: The START parent study randomized participants to commence ART immediately or to defer it until CD4+ <350 cells/μL. The START Neurology substudy used 8 neuropsychological tests, at baseline, months 4, 8, 12 and annually, to compare groups for changes in test performance. Test results were internally standardized to z-scores. The primary outcome was the average of the eight test z-scores (QNPZ-8). Mean changes in QNPZ-8 from baseline were compared by intent-to-treat using longitudinal mixed models. Changes from baseline to specific time points were compared using ANCOVA models. RESULTS: The 592 participants had a median age of 34 years and a median baseline CD4+ count of 629 cells/μL; mean follow-up was 3.4 years. ART was used for 94% and 32% of accrued person-years in the immediate and deferred groups, respectively. There was no difference between the immediate and deferred ART groups in QNPZ-8 change through follow-up [-0.018 (95% CI: -0.062 to 0.027, p = 0.44)], or at any visit. However, QNPZ-8 scores increased in both arms during the first year, by 0.22 and 0.24, respectively (p < 0.001 for increase from baseline). CONCLUSIONS: We observed substantial improvement in neurocognitive test performance during the first year in both study arms, underlining the importance of using a control group in studies assessing neurocognitive performance over time. Immediate ART neither benefited nor harmed neurocognitive performance in individuals with CD4+ cell counts above 500 cells/μL.
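
    The primary outcome described above, QNPZ-8, is the plain average of eight internally standardized test z-scores. A minimal sketch, using hypothetical reference means/SDs and scores (the study's actual standardization cohort and test battery are not reproduced here):

```python
import statistics

def z_scores(raw, ref_mean, ref_sd):
    """Standardize raw test scores against a reference mean/SD."""
    return [(x - ref_mean) / ref_sd for x in raw]

def qnpz(test_zs):
    """QNPZ: the average of one z-score per neuropsychological test."""
    return statistics.mean(test_zs)

# One hypothetical participant, eight tests already standardized:
participant = [0.5, -0.2, 0.1, 0.8, -0.4, 0.3, 0.0, 0.1]
print(round(qnpz(participant), 3))  # -> 0.15
```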

    Do brain networks evolve by maximizing their information flow capacity?

    We propose a working hypothesis, supported by numerical simulations, that brain networks evolve based on the principle of maximization of their internal information flow capacity. We find that the synchronous behavior and information flow capacity of the evolved networks reproduce well the behaviors observed in the brain dynamical networks of Caenorhabditis elegans and humans, modelled as networks of Hindmarsh-Rose neurons coupled according to the graphs of these brain networks. We make a strong case for our hypothesis by showing that the neural networks with the closest graph distance to the brain networks of Caenorhabditis elegans and humans are the Hindmarsh-Rose neural networks evolved with coupling strengths that maximize information flow capacity. Surprisingly, we find that global neural synchronization levels decrease during brain evolution, reflecting an underlying globally non-Hebbian evolution process that is driven by non-Hebbian-like learning behaviors in some clusters during evolution, and by Hebbian-like learning rules in clusters where neurons increase their synchronization.
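
    The simulations described above build on Hindmarsh-Rose neurons. For reference, a single uncoupled neuron under constant drive can be integrated with a basic Euler scheme; the parameter values below are the commonly used textbook ones, and the network coupling used in the paper is omitted, so this is only a sketch of the node dynamics:

```python
def hindmarsh_rose_step(state, I_ext, dt=0.01,
                        a=1.0, b=3.0, c=1.0, d=5.0,
                        r=0.006, s=4.0, x_rest=-1.6):
    """One Euler step of the Hindmarsh-Rose neuron model.

    x: membrane potential, y: fast recovery variable, z: slow adaptation.
    """
    x, y, z = state
    dx = y - a * x**3 + b * x**2 - z + I_ext
    dy = c - d * x**2 - y
    dz = r * (s * (x - x_rest) - z)
    return (x + dt * dx, y + dt * dy, z + dt * dz)

# Simulate a single neuron under constant drive and record the x trace
state = (-1.6, -10.0, 2.0)
trace = []
for _ in range(50_000):
    state = hindmarsh_rose_step(state, I_ext=3.0)
    trace.append(state[0])
print(min(trace), max(trace))  # bursting: x swings between rest and spiking values
```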

    A Meta-Analysis of Seaweed Impacts on Seagrasses: Generalities and Knowledge Gaps

    Seagrasses are important habitat-formers and ecosystem engineers that are under threat from bloom-forming seaweeds. These seaweeds have been suggested to outcompete the seagrasses, particularly when facilitated by eutrophication, causing regime shifts where green meadows and clear waters are replaced with unstable sediments, turbid waters, hypoxia, and poor habitat conditions for fishes and invertebrates. Understanding the situations under which seaweeds impact seagrasses on local patch scales can help proactive management and prevent losses at greater scales. Here, we provide a quantitative review of available published manipulative experiments (all conducted at the patch scale), to test which attributes of seaweeds and seagrasses (e.g., their abundances, sizes, morphology, taxonomy, attachment type, or origin) influence impacts. Weighted and unweighted meta-analyses (Hedges' d metric) of 59 experiments showed generally high variability in attribute-impact relationships. Our main significant findings were that (a) abundant seaweeds had stronger negative impacts on seagrasses than sparse seaweeds, (b) unattached and epiphytic seaweeds had stronger impacts than 'rooted' seaweeds, and (c) small seagrass species were more susceptible than larger species. Findings (a) and (c) were rather intuitive. It was more surprising that 'rooted' seaweeds had comparatively small impacts, particularly given that this category included the infamous invasive Caulerpa species. This result may reflect that seaweed biomass and/or shading and metabolic by-products like anoxia and sulphides could be lower for rooted seaweeds. In conclusion, our results represent simple and robust first-order generalities about seaweed impacts on seagrasses. This review also documented a limited number of primary studies. We therefore identified major knowledge gaps that need to be addressed before general predictive models on seaweed-seagrass interactions can be built, in order to effectively protect seagrass habitats from detrimental competition from seaweeds.
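
    The effect-size metric used in this meta-analysis, Hedges' d, is a small-sample-corrected standardized mean difference between treatment and control groups. A sketch with made-up numbers (seagrass response with vs. without seaweed); the correction factor J = 1 − 3/(4·df − 1) follows the standard formulation:

```python
import math

def hedges_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Hedges' d: bias-corrected standardized mean difference between a
    treatment group (e.g. seagrass with seaweed added) and a control."""
    df = n_t + n_c - 2
    s_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / df)
    j = 1 - 3 / (4 * df - 1)  # small-sample bias correction
    return j * (mean_t - mean_c) / s_pooled

# Hypothetical experiment: seagrass shoot density with vs. without seaweed
print(round(hedges_d(40.0, 10.0, 6, 55.0, 12.0, 6), 3))  # -> -1.254
```

    A negative d here indicates that seagrass performed worse in the seaweed treatment, matching the direction of the impacts summarized above.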

    Redrawing the Map of Great Britain from a Network of Human Interactions

    Do regional boundaries defined by governments respect the more natural ways that people interact across space? This paper proposes a novel, fine-grained approach to regional delineation, based on analyzing networks of billions of individual human transactions. Given a geographical area and some measure of the strength of links between its inhabitants, we show how to partition the area into smaller, non-overlapping regions while minimizing the disruption to each person's links. We tested our method on the largest non-Internet human network, inferred from a large telecommunications database in Great Britain. Our partitioning algorithm yields geographically cohesive regions that correspond remarkably well with administrative regions, while unveiling unexpected spatial structures that had previously only been hypothesized in the literature. We also quantify the effects of partitioning, showing for instance that a possible secession of Wales from Great Britain would be twice as disruptive for the human network as that of Scotland. Funding: National Science Foundation (U.S.); AT&T; Audi AG; United States Dept. of Defense (National Defense Science and Engineering Fellowship Program).
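
    The partitioning objective described here, splitting an area while minimizing the total strength of disrupted links, can be illustrated on a toy graph. The paper's actual algorithm scales to billions of links; the brute-force search, area names and link weights below are purely illustrative:

```python
from itertools import combinations

# Symmetric link strengths between areas (e.g. total call minutes);
# names and numbers are made up for illustration.
links = {
    ("A", "B"): 90, ("A", "C"): 75, ("B", "C"): 80,  # tightly linked cluster 1
    ("D", "E"): 85, ("D", "F"): 70, ("E", "F"): 95,  # tightly linked cluster 2
    ("C", "D"): 5,  ("B", "E"): 3,                   # weak bridges between them
}
nodes = sorted({n for pair in links for n in pair})

def disruption(region):
    """Total link weight cut when `region` is split from the rest."""
    region = set(region)
    return sum(w for (u, v), w in links.items()
               if (u in region) != (v in region))

# Exhaustively try every bipartition and keep the least disruptive one
best = min((frozenset(c) for k in range(1, len(nodes))
            for c in combinations(nodes, k)),
           key=disruption)
print(sorted(best), disruption(best))  # -> ['A', 'B', 'C'] 8
```

    The least disruptive split follows the two natural clusters, cutting only the weak bridges, which is the intuition behind the cohesive regions the paper recovers.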

    GDNF Selectively Induces Microglial Activation and Neuronal Survival in CA1/CA3 Hippocampal Regions Exposed to NMDA Insult through Ret/ERK Signalling

    The glial cell line-derived neurotrophic factor (GDNF) is a potent survival factor for several neuronal populations in different brain regions, including the hippocampus. However, no information is available on: (1) the hippocampal subregions involved in GDNF-mediated neuroprotection upon excitotoxicity, (2) the identity of GDNF-responsive hippocampal cells, and (3) the transduction pathways involved in GDNF-mediated neuroprotection in the hippocampus. We addressed these questions in organotypic hippocampal slices exposed to GDNF in the presence of N-methyl-D-aspartate (NMDA) by immunoblotting, immunohistochemistry, and confocal analysis. In hippocampal slices, GDNF acts through activation of the tyrosine kinase receptor Ret, without involving the NCAM-mediated pathway. Both Ret and ERK phosphorylation occurred mainly in the CA3 region, where the two activated proteins co-localized. GDNF protected CA3 to a greater extent than CA1 following NMDA exposure. This neuroprotective effect preferentially targeted neurons, as assessed by NeuN staining. GDNF neuroprotection was associated with a significant increase of Ret phosphorylation in both CA3 and CA1. Interestingly, confocal images revealed that, upon NMDA insult, GDNF-induced Ret activation occurred in microglial cells in CA3 and CA1. Collectively, this study shows that the CA3 and CA1 hippocampal regions are highly responsive to GDNF-induced Ret activation and neuroprotection, and suggests that, upon excitotoxicity, such neuroprotection involves GDNF modulation of microglial cell activity.