    The global role of kidney transplantation

    World Kidney Day on March 8th 2012 provides a chance to reflect on the success of kidney transplantation as a therapy for end-stage kidney disease that surpasses dialysis treatments both for the quality and quantity of life that it provides and for its cost effectiveness. Anything that is both cheaper and better, but is not actually the dominant therapy, must have other drawbacks that prevent replacement of all dialysis treatment by transplantation. The barriers to universal transplantation as the therapy for end-stage kidney disease include the economic limitations which, in some countries, appropriately place transplantation at a lower priority than public health fundamentals such as clean water, sanitation and vaccination. Even in high-income countries the technical challenges of surgery and the consequences of immunosuppression restrict the number of suitable recipients, but the major finite restrictions on kidney transplantation rates are the shortage of donated organs and the limited medical, surgical and nursing workforces with the required expertise. These problems have solutions which involve the full range of societal, professional, governmental and political environments. World Kidney Day is a call to deliver transplantation therapy to the one million people a year who have a right to benefit.

    Reduction of Injection-Related Risk Behaviors After Emergency Implementation of a Syringe Services Program During an HIV Outbreak

    Objective: To describe injection-related HIV risk behaviors preimplementation and postimplementation of an emergency syringe services program (SSP) in Scott County, Indiana, after an HIV outbreak among persons who inject drugs (PWID). Design: Mixed methods retrospective pre–post intervention analysis. Methods: We analyzed routine SSP data collected at first and most recent visit among clients with ≥2 visits, ≥7 days apart, from April 4 to August 30, 2015, to quantify changes in injection-related risk behaviors. We also analyzed qualitative data collected from 56 PWID recruited in Scott County to understand factors contributing to these behaviors. Results: SSP clients included in our analysis (n = 148, 62% of all SSP clients) reported significant (P < 0.001) reductions over a median 10 weeks (range 1–23) in syringe sharing to inject (18%–2%) and divide drugs (19%–4%), sharing other injection equipment (eg, cookers) (24%–5%), and number of uses of the same syringe [2 (interquartile range: 1–4) to 1 (interquartile range: 1–1)]. Qualitative study participants described access to sterile syringes and safer injection education through the SSP as explanatory factors for these reductions. Injection frequency findings were mixed, but overall suggested no change. The number of syringes returned by SSP clients increased from 0 at first visit to a median of 57. All qualitative study participants reported using sharps containers provided by the SSP. Conclusions: Analyses of SSP data and in-depth qualitative interview data showed rapid reduction of injection-related HIV risk behaviors among PWID post-SSP implementation. Sterile syringe access as part of comprehensive HIV prevention is an important tool to control and prevent HIV outbreaks.

    Interleukin 2 Receptor Antagonists for Kidney Transplant Recipients

    Background: Interleukin 2 receptor antagonists (IL2Ra) are used as induction therapy for prophylaxis against acute rejection in kidney transplant recipients. Use of IL2Ra has increased steadily, with 38% of new kidney transplant recipients in the United States and 23% in Australasia receiving IL2Ra in 2002. Objectives: This study aims to systematically identify and summarise the effects of using an IL2Ra as an addition to standard therapy, or as an alternative to other antibody therapy. Search strategy: The Cochrane Renal Group's specialised register (June 2003), the Cochrane Controlled Trials Register (in The Cochrane Library issue 3, 2002), MEDLINE (1966-November 2002) and EMBASE (1980-November 2002) were searched. Reference lists and abstracts of conference proceedings and scientific meetings were hand-searched from 1998-2003. Trial groups, authors of included reports and drug manufacturers were contacted. Selection criteria: Randomised controlled trials (RCTs) in all languages comparing IL2Ra to placebo, no treatment, other IL2Ra or other antibody therapy. Data collection and analysis: Data were extracted and quality assessed independently by two reviewers, with differences resolved by discussion. Dichotomous outcomes are reported as relative risk (RR) with 95% confidence intervals (CI). Main results: One hundred and seventeen reports from 38 trials involving 4893 participants were included. Where IL2Ra were compared with placebo (17 trials; 2786 patients), graft loss was not significantly different at one (RR 0.83, 95% CI 0.66 to 1.04) or three years (RR 0.88, 95% CI 0.64 to 1.22). Acute rejection (AR) was significantly reduced at six months (RR 0.66, 95% CI 0.59 to 0.74) and at one year (RR 0.67, 95% CI 0.60 to 0.75). At one year, cytomegalovirus (CMV) infection (RR 0.82, 95% CI 0.65 to 1.03) and malignancy (RR 0.67, 95% CI 0.33 to 1.36) were not significantly different. Where IL2Ra were compared with other antibody therapy no significant differences in treatment effects were demonstrated, but adverse effects strongly favoured IL2Ra. Reviewers' conclusions: Given a 40% risk of rejection, seven patients would need treatment with IL2Ra to prevent one patient having rejection, with no definite improvement in graft or patient survival. There is no apparent difference between basiliximab and daclizumab. IL2Ra are as effective as other antibody therapies and with significantly fewer side effects.
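
    The number-needed-to-treat figure in the conclusions follows directly from the reported baseline risk and relative risk. Below is a minimal sketch of that arithmetic in Python; the inputs (40% baseline rejection risk, RR 0.66 at six months) are from the abstract, while the helper name and rounding shown are illustrative assumptions.

    ```python
    # Minimal sketch of the number-needed-to-treat (NNT) arithmetic behind
    # the reviewers' conclusion. Inputs are from the abstract; the helper
    # name and rounding convention are illustrative assumptions.

    def nnt(baseline_risk: float, relative_risk: float) -> float:
        """NNT = 1 / ARR, where ARR = baseline_risk * (1 - relative_risk)."""
        absolute_risk_reduction = baseline_risk * (1.0 - relative_risk)
        return 1.0 / absolute_risk_reduction

    # 40% baseline risk of acute rejection, RR 0.66 at six months
    print(round(nnt(0.40, 0.66), 1))  # ~7.4, i.e. roughly seven patients
                                      # treated to prevent one rejection
    ```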

    Economic Values for Perennial Ryegrass Traits in New Zealand Dairy Farm Systems

    Perennial ryegrass (Lolium perenne L.) is the main species used in dairy pastures throughout New Zealand. There are approximately 30 perennial ryegrass cultivars sold commercially in New Zealand, but currently there is no evaluation system which allows farmers to compare the potential impact of different cultivars on the profitability of their farm business. Such an economic evaluation system requires information on performance values (PV) for cultivars, which quantify their performance with respect to the major productivity traits (herbage accumulation (HA, kg DM/ha), nutritive value and persistence) relative to a genetic base, and economic values (EV; Doyle and Elliott 1983), which estimate the additional profit resulting from each unit change in the trait of interest (Equation 1).

    Economic value = Δ operating profit / Δ trait of interest    (1)

    This paper describes a system modelling approach developed to estimate EV for seasonal HA of pasture in the major dairying regions of New Zealand. This information is used in the DairyNZ Forage Value Index system (www.dairynzfvi.co.nz), which is being developed to include information on all three productivity traits for commercially available ryegrass cultivars.
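
    Equation 1 is a simple ratio, but a small worked example may make the units concrete. The sketch below assumes hypothetical numbers (an extra NZ$42/ha of modelled operating profit from an extra 100 kg DM/ha of seasonal herbage accumulation); these are not values from the DairyNZ Forage Value Index.

    ```python
    # Minimal sketch of the economic value (EV) calculation in Equation 1.
    # All numbers are illustrative assumptions.

    def economic_value(delta_operating_profit: float, delta_trait: float) -> float:
        """EV = change in operating profit per unit change in the trait."""
        return delta_operating_profit / delta_trait

    # Hypothetical: +NZ$42/ha operating profit for +100 kg DM/ha of
    # seasonal herbage accumulation
    ev = economic_value(delta_operating_profit=42.0, delta_trait=100.0)
    print(f"EV = NZ${ev:.2f} per kg DM/ha")  # EV = NZ$0.42 per kg DM/ha
    ```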

    Development of a Forage Evaluation System for Perennial Ryegrass Cultivar and Endophyte Combinations in New Zealand Dairy Systems

    An economic index for perennial ryegrass (Lolium perenne L.) cultivars is a relatively new concept, although recently introduced in Ireland (McEvoy et al. 2011). By contrast, in dairy cattle breeding, the concept of an economic index for rating animals, and of the economic values underlying that index, is well entrenched (Philipson et al. 1994; Veerkamp, 1998). Historically, forage evaluation data for individual cultivars were displayed either as absolute numbers for dry matter production within a season or across all seasons, with a notation to indicate statistical differences, or as percentage values where a reference cultivar is set to 100. The adoption of an economic index and routine evaluation approach for perennial ryegrass provides a method to identify traits of economic importance, to better focus plant breeding efforts, and to provide clarity for farmers in identifying cultivars that will maximise farm profit. It also allows for routine tracking of genetic gain in individual traits and in the economic index. In this paper, the economically based forage evaluation techniques now used in New Zealand for perennial ryegrass cultivar/endophyte combinations are presented.

    Community end-of-life care during the COVID-19 pandemic: findings of a UK primary care survey.

    BACKGROUND: Thousands of people in the UK have required end-of-life care in the community during the COVID-19 pandemic. Primary healthcare teams (general practice and community nursing services) have provided the majority of this care, alongside specialist colleagues. There is a need to learn from this experience in order to inform future service delivery and planning. AIM: To understand the views of GPs and community nurses providing end-of-life care during the first wave of the COVID-19 pandemic. DESIGN & SETTING: A web-based, UK-wide questionnaire survey circulated via professional general practice and community nursing networks during September and October 2020. METHOD: Responses were analysed using descriptive statistics and an inductive thematic analysis. RESULTS: Valid responses were received from 559 individuals (387 community nurses, 156 GPs, and 16 unspecified roles), from all regions of the UK. The majority reported increased involvement in providing community end-of-life care. Contrasting and potentially conflicting roles emerged between GPs and community nurses. There was increased use of remote consultations, particularly by GPs. Community nurses took greater responsibility in most aspects of end-of-life care practice, particularly face-to-face care, but reported feeling isolated. For some GPs and community nurses, there has been considerable emotional distress. CONCLUSION: Primary healthcare services are playing a critical role in meeting increased need for end-of-life care in the community during the COVID-19 pandemic. They have adapted rapidly, but the significant emotional impact, especially for community nurses, needs addressing alongside rebuilding trusting and supportive team dynamics.

    Comparative Survival and Economic Benefits of Deceased Donor Kidney Transplantation and Dialysis in People with Varying Ages and Co-Morbidities

    Background: Deceased donor kidneys for transplantation are in most countries allocated preferentially to recipients who have limited co-morbidities. Little is known about the incremental health and economic gain from transplanting those with co-morbidities compared to remaining on dialysis. The aim of our study is to estimate the average and incremental survival benefits and health care costs of listing and transplantation compared to dialysis among individuals with varying co-morbidities. Methods: A probabilistic Markov model was constructed, using current outcomes for patients with defined co-morbidities treated with either dialysis or transplantation, to compare the health and economic benefits of listing and transplantation with dialysis. Findings: Using the current waiting time for deceased donor transplantation, transplanting a potential recipient, with or without co-morbidities, achieves survival gains of between six months and more than three life years compared to remaining on dialysis, with an average incremental cost-effectiveness ratio (ICER) of less than $50,000/LYS, even among those of advanced age. Age at listing and the waiting time for transplantation are the most influential variables within the model. If there were an unlimited supply of organs and no waiting time, transplanting the younger and healthier individuals saves the greatest number of life years and is cost-saving, whereas transplanting middle-aged to older patients still achieves substantial incremental gains in life expectancy compared to being on dialysis. Conclusions: Our modelled analyses suggest transplanting the younger and healthier individuals with end-stage kidney disease maximises survival gains and saves money. Listing and transplanting those with considerable co-morbidities is also cost-effective and achieves substantial survival gains compared with the dialysis alternative. Preferentially excluding the older and sicker individuals cannot be justified on utilitarian grounds.
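
    To illustrate the kind of model the methods describe, here is a deliberately reduced two-state (alive/dead) cohort Markov model in Python comparing dialysis with transplantation. The annual death probabilities, annual costs and time horizon are hypothetical placeholders, not the paper's inputs; the published model is probabilistic, with co-morbidity-specific states.

    ```python
    # Minimal two-state cohort Markov model sketch. All inputs are
    # hypothetical; the published model is probabilistic and far richer.

    def life_years_and_cost(annual_death_prob: float, annual_cost: float,
                            horizon_years: int = 25) -> tuple[float, float]:
        """Return (undiscounted life years, total cost) for one strategy."""
        alive, life_years, cost = 1.0, 0.0, 0.0
        for _ in range(horizon_years):
            life_years += alive           # person-years lived this cycle
            cost += alive * annual_cost   # costs accrue only while alive
            alive *= 1.0 - annual_death_prob
        return life_years, cost

    ly_dx, cost_dx = life_years_and_cost(0.15, 80_000)  # dialysis (assumed)
    ly_tx, cost_tx = life_years_and_cost(0.05, 30_000)  # transplant (assumed)
    icer = (cost_tx - cost_dx) / (ly_tx - ly_dx)        # $ per life year saved
    print(f"Incremental LYS: {ly_tx - ly_dx:.1f}, ICER: ${icer:,.0f}/LYS")
    # A negative ICER here means transplantation is both cheaper and more
    # effective (cost-saving), consistent with the findings above.
    ```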

    Environmental adaptation, phenotypic plasticity, and associative learning in insects: the desert locust as a case study

    The ability to learn and store information should be adapted to the environment in which animals operate to confer a selective advantage. Yet the relationship between learning, memory, and the environment is poorly understood, and further complicated by phenotypic plasticity caused by the very environment in which learning and memory need to operate. Many insect species show polyphenism, an extreme form of phenotypic plasticity, allowing them to occupy distinct environments by producing two or more alternative phenotypes. Yet how the learning and memory capabilities of these alternative phenotypes are adapted to their specific environments remains unknown for most polyphenic insect species. The desert locust can exist as one of two extreme phenotypes or phases, solitarious and gregarious. Recent studies of associative food–odor learning in this locust have shown that aversive but not appetitive learning differs between phases. Furthermore, switching from the solitarious to the gregarious phase (gregarization) prevents locusts from acquiring new learned aversions, enabling them to convert an aversive memory formed in the solitarious phase to an appetitive one in the gregarious phase. This conversion provides a neuroecological mechanism that matches key changes in the behavioral environments of the two phases. These findings emphasize the importance of understanding the neural mechanisms that generate ecologically relevant behaviors and the interactions between different forms of behavioral plasticity.

    Tacrolimus Versus Cyclosporin as Primary Immunosuppression for Kidney Transplant Recipients

    Background: Kidney transplantation is the treatment of choice for most patients with end-stage renal disease (ESRD). Standard protocols in use typically involve three drug groups, each directed at a site in the T-cell activation or proliferation cascade central to the rejection process: calcineurin inhibitors (e.g. cyclosporin, tacrolimus), anti-proliferative agents (e.g. azathioprine, mycophenolate mofetil) and steroids (prednisolone). It remains unclear whether new regimens are more specific or simply more potent immunosuppressants. Objectives: To compare the effects of tacrolimus with cyclosporin as primary therapy for kidney transplant recipients. Search strategy: MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials, the Cochrane Renal Group's specialist register and conference proceedings were searched to identify relevant reports of randomised controlled trials (RCTs). Two reviewers independently assessed trials for eligibility and quality, and extracted data. Selection criteria: All RCTs where tacrolimus was compared with cyclosporin for the initial treatment of kidney transplant recipients. Data collection and analysis: Data were synthesised (random effects model) and results expressed as relative risk (RR), values <1 favouring tacrolimus, with 95% confidence intervals (CI). Subgroup analysis and meta-regression were used to examine potential effect modification by differences in trial design and immunosuppressive co-interventions. Main results: 123 reports from 30 trials (4102 patients) were included. At six months graft loss was significantly reduced in tacrolimus-treated recipients (RR 0.56, 95% CI 0.36 to 0.86), and this effect persisted up to three years. Meta-regression showed that this benefit diminished as higher trough levels of tacrolimus were targeted (P = 0.04), after allowing for differences in cyclosporin formulation (P = 0.97) and cyclosporin target trough level (P = 0.38). At one year, tacrolimus patients suffered less acute rejection (RR 0.69, 95% CI 0.60 to 0.79) and less steroid-resistant rejection (RR 0.49, 95% CI 0.37 to 0.64), but more insulin-requiring diabetes mellitus (RR 1.86, 1.11 to 3.09), tremor, headache, diarrhoea, dyspepsia and vomiting. Cyclosporin-treated recipients experienced significantly more constipation and cosmetic side-effects. We demonstrated no differences in infection or malignancy. Authors' conclusions: Tacrolimus is superior to cyclosporin in improving graft survival and preventing acute rejection after kidney transplantation, but increases post-transplant diabetes, and neurological and gastrointestinal side effects. Treating 100 recipients with tacrolimus instead of cyclosporin would avoid 12 suffering acute rejection and two losing their graft, but would cause an extra five to become insulin-requiring diabetics.
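
    The closing per-100-patients figures can be reproduced from the reported relative risks once cyclosporin-arm baseline risks are assumed. The sketch below uses assumed baselines (39% acute rejection, 4.5% graft loss, 5.8% new diabetes) chosen only to illustrate the arithmetic; they are not stated in the abstract.

    ```python
    # Minimal sketch of the per-100-patients arithmetic. Relative risks are
    # from the abstract; baseline risks are assumed for illustration only.

    def events_changed_per_100(baseline_risk: float, relative_risk: float) -> float:
        """Change in events per 100 treated: 100 * baseline * (RR - 1).
        Negative = events avoided; positive = extra events caused."""
        return 100.0 * baseline_risk * (relative_risk - 1.0)

    print(events_changed_per_100(0.39, 0.69))   # acute rejection: ~ -12
    print(events_changed_per_100(0.045, 0.56))  # graft loss: ~ -2
    print(events_changed_per_100(0.058, 1.86))  # new diabetes: ~ +5
    ```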

    Target of Rapamycin Inhibitors (TOR-I; Sirolimus and Everolimus) for Primary Immunosuppression in Kidney Transplant Recipients

    Background: Target of rapamycin inhibitors (TOR-I) (sirolimus, everolimus) are immunosuppressive agents with a novel mode of action but an uncertain clinical role. Objectives: To investigate the benefits and harms of immunosuppressive regimens containing TOR-I when compared to other regimens as initial therapy for kidney transplant recipients. Search strategy: We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (in The Cochrane Library, issue 2, 2005), MEDLINE (1966-June 2005), EMBASE (1980-June 2005) and the specialised register of the Cochrane Renal Group (June 2005), and contacted authors and pharmaceutical companies to identify relevant studies. Selection criteria: All randomised controlled trials (RCTs) and quasi-RCTs where drug regimens containing TOR-I were compared to alternative drug regimens in the immediate post-transplant period were included, without restriction on age, dosage or language of report. Data collection and analysis: Two reviewers independently assessed trials for eligibility and quality, and extracted data. Results are expressed as relative risk (RR) or weighted mean difference (MD) with 95% confidence intervals (CI). Main results: Thirty-three trials (142 reports) were included (sirolimus (27), everolimus (5), head-to-head (1)). When TOR-I replaced calcineurin inhibitors (CNI) there was no difference in acute rejection, but serum creatinine was lower (MD -18.31 micromol/L, -30.96 to -5.67) and bone marrow more suppressed (leucopenia: RR 2.02, 1.12 to 3.66; thrombocytopenia: RR 6.97, 2.97 to 16.36; anaemia: RR 1.67, 1.27 to 2.20). When TOR-I replaced antimetabolites, acute rejection (RR 0.84, 0.71 to 0.99) and cytomegalovirus infection (CMV) (RR 0.49, 0.37 to 0.65) were reduced, but hypercholesterolaemia was increased (RR 1.65, 1.32 to 2.06). For low versus high-dose TOR-I, with equal CNI dose, rejection was increased (RR 1.23, 1.06 to 1.43) but calculated GFR was higher (MD 4.27 mL/min, 1.12 to 7.41), and for low-dose TOR-I/standard-dose CNI versus higher-dose TOR-I/reduced CNI, acute rejection (RR 0.67, 0.52 to 0.88) and calculated GFR (MD -9.46 mL/min, -12.16 to -6.76) were reduced. There was no significant difference in mortality, graft loss or malignancy risk for TOR-I in any comparison. Authors' conclusions: TOR-I have been evaluated in four different primary immunosuppressive algorithms: as replacement for CNI and for antimetabolites, in combination with CNI at low and high dose, and with variable dose of CNI. Generally, surrogate endpoints for graft survival favour TOR-I (lower risk of acute rejection and higher GFR) and surrogate endpoints for patient outcomes are worsened by TOR-I (bone marrow suppression, lipid disturbance). Long-term hard-endpoint data from methodologically robust RCTs are still needed.