
    NXY-059, a Failed Stroke Neuroprotectant, Offers No Protection to Stem Cell-Derived Human Neurons

    Background: Developing new medicines is a complex process where understanding the reasons for both failure and success takes us forward. One gap in our understanding of most candidate stroke drugs before clinical trial is whether they have a protective effect on human tissues. NXY-059 is a spin-trap reagent hypothesized to have activity against the damaging oxidative biology which accompanies ischemic stroke. Re-examination of the preclinical in vivo dataset for this agent in the wake of the failed SAINT-II RCT highlighted the presence of a range of biases leading to overestimation of the magnitude of NXY-059's effects in laboratory animals. Therefore, NXY-059 seemed an ideal candidate to evaluate in human neural tissues to determine whether human tissue testing might improve screening efficiency. Materials and Methods: The aim of this randomized and blinded study was to assess the effects of NXY-059 on human stem cell-derived neurons in the presence of ischemia-like injury induced by oxygen-glucose deprivation or oxidative stress induced by hydrogen peroxide or sodium nitroprusside. Results: In MTT assays of cell survival, lactate dehydrogenase assays of total cell death, and terminal deoxynucleotidyl transferase dUTP nick end labeling (TUNEL) staining of apoptotic-like cell death, NXY-059 at concentrations ranging from 1 µM to 1 mM was completely without activity. Conversely, an antioxidant cocktail comprising 100 µM each of ascorbate, reduced glutathione, and dithiothreitol, used as a positive control, provided marked neuronal protection in these assays. Conclusion: These findings support our hypothesis that stroke drug screening in human neural tissues will be of value and provide an explanation for the failure of NXY-059 as a human stroke drug.

    What has preclinical systematic review ever done for us?

    Systematic review and meta-analysis are a gift to the modern researcher, delivering a crystallised understanding of the existing research data in any given space. This can include whether candidate drugs are likely to work or not and which are better than others, whether our models of disease have predictive value and how this might be improved, and how these all interact with disease pathophysiology. Grappling with the literature needed for such analyses is becoming increasingly difficult as the number of publications grows. However, narrowing the focus of a review to reduce workload runs the risk of diminishing the generalisability of conclusions drawn from such increasingly specific analyses. Moreover, at the same time as we gain greater insight into our topic, we also discover more about the flaws that undermine much scientific research. Systematic review and meta-analysis have also shown that the quality of much preclinical research is inadequate. Systematic review has helped reveal the extent of selection bias, performance bias, detection bias, attrition bias and low statistical power, raising questions about the validity of many preclinical research studies. This is perhaps the greatest virtue of systematic review and meta-analysis: the knowledge generated ultimately helps shed light on the limitations of existing research practice and, in doing so, helps bring reform and rigour to research across the sciences. In this commentary, we explore the lessons that we have identified through the lens of preclinical systematic review and meta-analysis.

    Evolution of ischemic damage and behavioural deficit over 6 months after MCAo in the rat: Selecting the optimal outcomes and statistical power for multi-centre preclinical trials

    Key disparities between the timing and methods of assessment in animal stroke studies and clinical trials may be part of the reason for the failure to translate promising findings. This study investigates the development of ischemic damage after thread-occlusion MCAo in the rat, using histological and behavioural outcomes. Using the adhesive removal test, we investigate the longevity of behavioural deficit after ischemic stroke in rats and examine the practicality of using such measures as the primary outcome for future studies. Ischemic stroke was induced in 132 Spontaneously Hypertensive Rats, which were assessed for behavioural and histological deficits at 1, 3, 7, 14, 21, and 28 days and at 12 and 24 weeks (n>11 per timepoint). The basic behavioural score confirmed induction of stroke, with deficits specific to stroke animals. Within 7 days, these deficits resolved in 50% of animals. The adhesive removal test revealed contralateral neglect for up to 6 months following stroke. Sample size calculations to facilitate the use of this test as the primary experimental outcome resulted in cohort sizes much larger than is the norm for experimental studies. Histological damage progressed from a necrotic infarct to a hypercellular area that cleared to leave a fluid-filled cavity. Whilst the absolute volume of damage changed over time, when corrected for changes in hemispheric volume an equivalent area of damage was lost at all timepoints. Using behavioural measures at chronic timepoints presents significant challenges to the basic science community in terms of the large number of animals required and the practicalities associated with this. Multicentre preclinical randomised controlled trials, as advocated by the MultiPART consortium, may be the only practical way to deal with this issue.
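
    As a rough illustration of why chronic behavioural endpoints drive cohort sizes upward, the sketch below computes the per-group sample size for a two-sided two-sample t-test at 80% power across a range of standardized effect sizes. The effect sizes are hypothetical examples, not values taken from this study.

```python
from math import ceil
from scipy import stats

def samples_per_group(effect_size, alpha=0.05, power=0.80):
    """Per-group n for a two-sided two-sample t-test (normal approximation).

    effect_size is Cohen's d; the values used below are illustrative only.
    """
    z_alpha = stats.norm.ppf(1 - alpha / 2)
    z_beta = stats.norm.ppf(power)
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Noisier outcomes at chronic timepoints mean smaller standardized effects,
# and the required cohort grows with the inverse square of the effect size.
for d in (1.0, 0.5, 0.3):
    print(f"Cohen's d = {d}: n = {samples_per_group(d)} per group")
```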

    Systematic reviews and meta-analysis of preclinical studies: why perform them and how to appraise them critically

    The use of systematic review and meta-analysis of preclinical studies has become more common, including those of studies describing the modeling of cerebrovascular diseases. Empirical evidence suggests that too many preclinical experiments lack methodological rigor, and this leads to inflated treatment effects. The aim of this review is to describe the concepts of systematic review and meta-analysis and consider how these tools may be used to provide empirical evidence to spur the field to improve the rigor of the conduct and reporting of preclinical research, akin to their use in improving the conduct and reporting of randomized controlled trials in clinical research. As with other research domains, systematic reviews are subject to bias. Therefore, we have also suggested guidance for their conduct, reporting, and critical appraisal.
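
    To make the pooling step at the heart of a meta-analysis concrete, here is a minimal sketch of a random-effects analysis using the DerSimonian-Laird estimator, assuming per-study effect sizes and within-study variances have already been extracted. The numbers are invented for illustration and the estimator is one common choice among several, not necessarily the one a given review would use.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) from per-study
    effect sizes and their within-study variances."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                      # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)   # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_re = 1.0 / (variances + tau2)          # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Invented standardized mean differences and variances from four hypothetical studies:
pooled, se, tau2 = dersimonian_laird([0.8, 1.2, 0.4, 0.9], [0.10, 0.15, 0.08, 0.20])
print(f"pooled = {pooled:.2f}, 95% CI = ({pooled - 1.96*se:.2f}, {pooled + 1.96*se:.2f}), tau^2 = {tau2:.2f}")
```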

    Risk of bias reporting in the recent animal focal cerebral ischaemia literature

    BACKGROUND: Findings from in vivo research may be less reliable where studies do not report measures to reduce risks of bias. The experimental stroke community has been at the forefront of implementing changes to improve reporting, but it is not known whether these efforts are associated with continuous improvements. Our aims here were firstly to validate an automated tool to assess risks of bias in published works, and secondly to assess the reporting of measures taken to reduce the risk of bias within recent literature for two experimental models of stroke. METHODS: We developed and used text analytic approaches to automatically ascertain reporting of measures to reduce risk of bias from full-text articles describing animal experiments inducing middle cerebral artery occlusion (MCAO) or modelling lacunar stroke. RESULTS: Compared with previous assessments, there were improvements in the reporting of measures taken to reduce risks of bias in the MCAO literature but not in the lacunar stroke literature. Accuracy of automated annotation of risk of bias in the MCAO literature was 86% (randomization), 94% (blinding) and 100% (sample size calculation); in the lacunar stroke literature, accuracy was 67% (randomization), 91% (blinding) and 96% (sample size calculation). DISCUSSION: There remains substantial opportunity for improvement in the reporting of animal research modelling stroke, particularly in the lacunar stroke literature. Further, automated tools perform sufficiently well to identify whether studies report blinded assessment of outcome, but improvements are required in the tools to ascertain whether randomization and a sample size calculation were reported.
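
    The text-analytic tool itself is not described here, so the following is only a hypothetical sketch of the general approach: scanning full text for phrases that signal reporting of randomization, blinding, or a sample size calculation. The phrase patterns are invented examples rather than the dictionaries used in the published work, and simple keyword matching of this kind cannot handle negation or context, which is one reason automated annotation accuracy can differ between items.

```python
import re

# Hypothetical phrase patterns signalling that a risk-of-bias item was reported.
# These are illustrative only, not the patterns used by the published tool.
RISK_OF_BIAS_PATTERNS = {
    "randomization": re.compile(r"\brandomi[sz](ed|ation)\b", re.IGNORECASE),
    "blinding": re.compile(r"\bblind(ed|ing)\b|\bmasked assessment\b", re.IGNORECASE),
    "sample size calculation": re.compile(
        r"\b(sample size|power) (calculation|analysis)\b", re.IGNORECASE),
}

def annotate(full_text: str) -> dict:
    """Return True/False for each risk-of-bias item mentioned in the article text."""
    return {item: bool(pattern.search(full_text))
            for item, pattern in RISK_OF_BIAS_PATTERNS.items()}

example = ("Animals were randomised to treatment groups and the outcome "
           "assessor was blinded to group allocation.")
print(annotate(example))
# {'randomization': True, 'blinding': True, 'sample size calculation': False}
```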

    Hypothermia protects human neurons

    Background and Aims: Hypothermia provides neuroprotection after cardiac arrest, hypoxic-ischemic encephalopathy, and in animal models of ischemic stroke. However, as drug development for stroke has been beset by translational failure, we sought additional evidence that hypothermia protects human neurons against ischemic injury. Methods: Human embryonic stem cells were cultured and differentiated to provide a source of neurons expressing βIII tubulin, microtubule-associated protein 2, and the Neuronal Nuclei antigen. Oxygen deprivation, oxygen-glucose deprivation, and H2O2-induced oxidative stress were used to induce relevant injury. Results: Hypothermia to 33°C protected these human neurons against H2O2-induced oxidative stress, reducing lactate dehydrogenase release and terminal deoxynucleotidyl transferase dUTP nick end labeling (TUNEL) staining by 53% (P≤0.0001; 95% confidence interval 34.8-71.04) and 42% (P≤0.0001; 95% confidence interval 27.5-56.6), respectively, after 24 h in culture. Hypothermia provided similar protection against oxygen-glucose deprivation (42%, P≤0.001, 95% confidence interval 18.3-71.3 and 26%, P≤0.001, 95% confidence interval 12.4-52.2, respectively) but provided no protection against oxygen deprivation alone. Protection (21%) persisted against H2O2-induced oxidative stress even when hypothermia was initiated six hours after onset of injury (P≤0.05; 95% confidence interval 0.57-43.1). Conclusion: We conclude that hypothermia protects stem cell-derived human neurons against insults relevant to stroke over a clinically relevant time frame. Protection against H2O2-induced injury and combined oxygen and glucose deprivation, but not against oxygen deprivation alone, suggests an interaction in which protection benefits from reduction in available glucose under some but not all circumstances.

    High-resolution ab initio three-dimensional X-ray diffraction microscopy

    Coherent X-ray diffraction microscopy is a method of imaging non-periodic isolated objects at resolutions limited, in principle, only by the largest scattering angles recorded. We demonstrate X-ray diffraction imaging with high resolution in all three dimensions, as determined by a quantitative analysis of the reconstructed volume images. These images are retrieved from the 3D diffraction data using no a priori knowledge about the shape or composition of the object, which has never before been demonstrated on a non-periodic object. We also construct 2D images of thick objects with infinite depth of focus (without loss of transverse spatial resolution). These methods can be used to image biological and materials science samples at high resolution using X-ray undulator radiation, and they establish the techniques to be used in atomic-resolution ultrafast imaging at X-ray free-electron laser sources.
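
    The reconstruction described above rests on iterative phase retrieval: only diffraction magnitudes are measured, and the lost phases are recovered by alternately enforcing the measured Fourier magnitudes and a real-space support constraint. The sketch below is a minimal 2D error-reduction loop on synthetic data, intended only to illustrate the idea rather than to reproduce the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic test object: a small bright patch inside a known support region.
n = 64
obj = np.zeros((n, n))
obj[24:40, 28:36] = rng.random((16, 8))
support = np.zeros((n, n), dtype=bool)
support[20:44, 24:40] = True                      # loose support around the object

measured_mag = np.abs(np.fft.fft2(obj))           # only magnitudes are "measured"

# Error-reduction phase retrieval: alternate Fourier-magnitude and support constraints.
estimate = rng.random((n, n)) * support           # random start inside the support
for _ in range(500):
    F = np.fft.fft2(estimate)
    F = measured_mag * np.exp(1j * np.angle(F))   # impose measured magnitudes
    estimate = np.real(np.fft.ifft2(F))
    estimate[~support] = 0.0                      # impose support constraint
    estimate[estimate < 0] = 0.0                  # impose non-negativity

error = np.linalg.norm(np.abs(np.fft.fft2(estimate)) - measured_mag) / np.linalg.norm(measured_mag)
print(f"relative Fourier-magnitude error after 500 iterations: {error:.3f}")
```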