
    Impact of an Early Invasive Strategy versus Conservative Strategy for Unstable Angina and Non-ST Elevation Acute Coronary Syndrome in Patients with Chronic Kidney Disease: A Systematic Review.

    BACKGROUND: Clinical practice guidelines support an early invasive approach after NSTE-ACS in patients with chronic kidney disease (CKD). However, there is no direct randomised controlled trial evidence in the CKD population, and whether the benefit of an early invasive approach is maintained across the spectrum of CKD severity remains controversial. METHODS: We conducted a systematic review to evaluate the association between an early invasive approach and all-cause mortality in patients with CKD. We searched MEDLINE and EMBASE (1990-May 2015) and article reference lists. Data describing study design, participants, invasive management strategies, renal function, all-cause mortality and risk of bias were extracted. RESULTS: 3,861 potentially relevant studies were identified. Ten studies, representing data on 147,908 individuals with NSTE-ACS, met the inclusion criteria. Qualitative heterogeneity existed in the definitions of early invasive approach, comparison groups and renal dysfunction. Meta-analyses of the RCT-derived and observational data were generally supportive of an early invasive approach in CKD (RR 0.76 (95% CI 0.49-1.17) and RR 0.50 (95% CI 0.42-0.59), respectively). Meta-analysis of the observational studies demonstrated a large degree of heterogeneity (I² = 79%), driven in part by study size and by differences across kidney function levels. CONCLUSIONS: The observational data support that an early invasive approach after NSTE-ACS confers a survival benefit in those with early-to-moderate CKD. Local opportunities for quality improvement should be sought. Those with severe CKD and the dialysis population are at high risk and under-studied. Novel and inclusive approaches for enrolling CKD and dialysis patients in cardiovascular clinical trials are needed.
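The pooled risk ratios and the I² statistic quoted above come from standard inverse-variance meta-analysis. As a minimal sketch of that calculation — using hypothetical study values, not the review's actual data — each study's standard error is recovered from its confidence interval, studies are pooled by inverse-variance weighting, and I² is derived from Cochran's Q:

```python
import math

# Hypothetical (RR, 95% CI lower, 95% CI upper) triples for three studies.
# These numbers are illustrative only, not taken from the review above.
studies = [
    (0.50, 0.40, 0.62),
    (0.85, 0.70, 1.03),
    (0.60, 0.45, 0.80),
]

log_rrs, weights = [], []
for rr, lo, hi in studies:
    log_rr = math.log(rr)
    # Recover the standard error of log(RR) from the 95% CI width.
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    log_rrs.append(log_rr)
    weights.append(1 / se**2)  # inverse-variance weight

# Fixed-effect pooled estimate on the log scale, back-transformed to an RR.
pooled_log = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
pooled_rr = math.exp(pooled_log)

# Cochran's Q, then I^2: the proportion of variability beyond chance.
q = sum(w * (x - pooled_log) ** 2 for w, x in zip(weights, log_rrs))
df = len(studies) - 1
i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

print(f"pooled RR = {pooled_rr:.2f}, I^2 = {i_squared:.0f}%")
```

With these illustrative inputs the heterogeneity is high, which is the kind of signal (like the I² of 79% reported above) that prompts reviewers to prefer a random-effects model and to explore subgroup-level explanations.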

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). 
In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, compared with the standard-frequency groups, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs; especially among men, for all listed symptoms), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each). Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.

    Role of Occult and Post-acute Phase Replication in Protective Immunity Induced with a Novel Live Attenuated SIV Vaccine

    To evaluate the role of persisting virus replication during occult-phase immunisation in the live attenuated SIV vaccine model, a novel SIVmac239Δnef variant (SIVrtTA), genetically engineered to replicate in the presence of doxycycline, was evaluated for its ability to protect against wild-type SIVmac239. Indian rhesus macaques were vaccinated either with SIVrtTA or with SIVmac239Δnef. Doxycycline was withdrawn from 4 of 8 SIVrtTA vaccinates before challenge with wild-type virus. Unvaccinated challenge controls exhibited ~10⁷ peak plasma viral RNA copies/ml persisting beyond the acute phase. Six vaccinates (four SIVmac239Δnef and two SIVrtTA) exhibited complete protection, defined by lack of wild-type viraemia post-challenge and by virus-specific PCR analysis of tissues recovered post-mortem, whereas six SIVrtTA vaccinates were protected from high levels of viraemia. Critically, the complete protection in two SIVrtTA vaccinates was associated with enhanced SIVrtTA replication in the immediate post-acute vaccination period but was independent of doxycycline status at the time of challenge. Mutations were identified in the LTR promoter region and rtTA gene that do not affect doxycycline control but were associated with enhanced post-acute-phase replication in protected vaccinates. High frequencies of total circulating CD8+ T effector memory cells and a higher total frequency of SIV-specific mono- and polyfunctional CD8+ T cells on the day of wild-type challenge were associated with complete protection, but these parameters were not predictive of outcome when assessed 130 days after challenge. Moreover, challenge-virus-specific Nef CD8+ polyfunctional T cell responses and antigen were detected in tissues post-mortem in completely protected macaques, indicating post-challenge control of infection. Within the parameters of the study design, ongoing occult-phase replication may not be absolutely required for protective immunity.

    Effects of antiplatelet therapy on stroke risk by brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases: subgroup analyses of the RESTART randomised, open-label trial

    Background Findings from the RESTART trial suggest that starting antiplatelet therapy might reduce the risk of recurrent symptomatic intracerebral haemorrhage compared with avoiding antiplatelet therapy. Brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases (such as cerebral microbleeds) are associated with greater risks of recurrent intracerebral haemorrhage. We did subgroup analyses of the RESTART trial to explore whether these brain imaging features modify the effects of antiplatelet therapy.

    Methods Used in Economic Evaluations of Chronic Kidney Disease Testing — A Systematic Review

    Background: The prevalence of chronic kidney disease (CKD) is high in general populations around the world. Targeted testing and screening for CKD are often conducted to help identify individuals who may benefit from treatment to ameliorate or prevent disease progression. Aims: This systematic review examines the methods used in economic evaluations of testing and screening in CKD, with a particular focus on whether test accuracy has been considered and on how analyses have incorporated issues that may be important to the patient, such as the impact of testing on quality of life and the costs patients incur. Methods: Articles describing model-based economic evaluations of patient testing interventions focused on CKD were identified through searches of electronic databases and hand-searching of the bibliographies of included studies. Results: The initial electronic searches identified 2,671 papers, of which 21 were included in the final review. Eighteen studies focused on proteinuria, three evaluated glomerular filtration rate testing and one included both tests. The full impact of inaccurate test results was frequently not considered in economic evaluations in this setting, as a societal perspective was rarely adopted. The impact of false positive tests on patients, in terms of the costs incurred in re-attending for repeat testing and the anxiety associated with a positive test, was almost always overlooked. In the one study in which the impact of a false positive test on patient quality of life was examined in sensitivity analysis, it had a significant impact on the conclusions drawn from the model. Conclusion: Future economic evaluations of kidney function testing should examine testing and monitoring pathways from the perspective of patients, to ensure that issues that are important to patients, such as the possibility of inaccurate test results, are properly considered in the analysis.

    Virological failure and development of new resistance mutations according to CD4 count at combination antiretroviral therapy initiation

    Objectives: No randomized controlled trials have yet reported an individual patient benefit of initiating combination antiretroviral therapy (cART) at CD4 counts > 350 cells/μL. It has been hypothesized that earlier initiation of cART in asymptomatic and otherwise healthy individuals may lead to poorer adherence and subsequently higher rates of resistance development. Methods: In a large cohort of HIV-positive individuals, we investigated the emergence of new resistance mutations upon virological treatment failure according to the CD4 count at initiation of cART. Results: Of 7918 included individuals, 6514 (82.3%), 996 (12.6%) and 408 (5.2%) started cART with a CD4 count ≤ 350, 351-499 and ≥ 500 cells/μL, respectively. Virological rebound occurred while on cART in 488 (7.5%), 46 (4.6%) and 30 (7.4%) individuals with a baseline CD4 count ≤ 350, 351-499 and ≥ 500 cells/μL, respectively. Only four (13.0%) individuals with a baseline CD4 count > 350 cells/μL who received a resistance test at viral load rebound were found to have developed new resistance mutations, compared with 107 (41.2%) of those with virological failure who had initiated cART with a CD4 count ≤ 350 cells/μL. Conclusions: We found no evidence of increased rates of resistance development when cART was initiated at CD4 counts above 350 cells/μL. HIV Medicine.

    Longer-term efficiency and safety of increasing the frequency of whole blood donation (INTERVAL): extension study of a randomised trial of 20 757 blood donors

    Background: The INTERVAL trial showed that, over a 2-year period, inter-donation intervals for whole blood donation can be safely reduced to meet blood shortages. We extended the INTERVAL trial for a further 2 years to evaluate the longer-term risks and benefits of varying inter-donation intervals, and to compare routine versus more intensive reminders to help donors keep appointments. Methods: The INTERVAL trial was a parallel group, pragmatic, randomised trial that recruited blood donors aged 18 years or older from 25 static donor centres of NHS Blood and Transplant across England, UK. Here we report on the prespecified analyses after 4 years of follow-up. Participants were whole blood donors who agreed to continue trial participation on their originally allocated inter-donation intervals (men: 12, 10, and 8 weeks; women: 16, 14, and 12 weeks). They were further block-randomised (1:1) to routine versus more intensive reminders using computer-generated random sequences. The prespecified primary outcome was units of blood collected per year analysed in the intention-to-treat population. Secondary outcomes related to safety were quality of life, self-reported symptoms potentially related to donation, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin and other factors. This trial is registered with ISRCTN, number ISRCTN24760606, and has completed. Findings: Between Oct 19, 2014, and May 3, 2016, 20 757 of the 38 035 invited blood donors (10 843 [58%] men, 9914 [51%] women) participated in the extension study. 10 378 (50%) were randomly assigned to routine reminders and 10 379 (50%) were randomly assigned to more intensive reminders. Median follow-up was 1·1 years (IQR 0·7–1·3). Compared with routine reminders, more intensive reminders increased blood collection by a mean of 0·11 units per year (95% CI 0·04–0·17; p=0·0003) in men and 0·06 units per year (0·01–0·11; p=0·0094) in women. 
During the extension study, each week shorter inter-donation interval increased blood collection by a mean of 0·23 units per year (0·21–0·25) in men and 0·14 units per year (0·12–0·15) in women (both p<0·0001). More frequent donation resulted in more deferrals for low haemoglobin (odds ratio per week shorter inter-donation interval 1·19 [95% CI 1·15–1·22] in men and 1·10 [1·06–1·14] in women), lower mean haemoglobin (difference per week shorter inter-donation interval −0·84 g/L [95% CI −0·99 to −0·70] in men and −0·45 g/L [−0·59 to −0·31] in women) and lower mean ferritin concentrations (percentage difference per week shorter inter-donation interval −6·5% [95% CI −7·6 to −5·5] in men and −5·3% [−6·5 to −4·2] in women; all p<0·0001). No differences were observed in quality of life, serious adverse events, or self-reported symptoms (p>0·0001 for tests of linear trend by inter-donation intervals), other than a higher reported frequency of doctor-diagnosed low iron concentrations and prescription of iron supplements in men (p<0·0001). Interpretation: During a period of up to 4 years, shorter inter-donation intervals and more intensive reminders resulted in more blood being collected without a detectable effect on donors' mental and physical wellbeing. However, donors had decreased haemoglobin concentrations and more self-reported symptoms compared with the initial 2 years of the trial. Our findings suggest that blood collection services could safely use shorter donation intervals and more intensive reminders to meet shortages, for donors who maintain adequate haemoglobin concentrations and iron stores. Funding: NHS Blood and Transplant, UK National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.

    TRY plant trait database – enhanced coverage and open access

    Plant traits - the morphological, anatomical, physiological, biochemical and phenological characteristics of plants - determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research, spanning evolutionary biology, community and functional ecology, biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait-based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. The best species coverage is achieved for categorical traits, with almost complete coverage for ‘plant growth form’. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait–environment relationships; these traits have to be measured on individual plants in their respective environments. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many respects. We therefore conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurement. This can only be achieved in collaboration with other initiatives.

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome, including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.