
    Covariate-adjusted analysis of the Phase 3 REFLECT study of lenvatinib versus sorafenib in the treatment of unresectable hepatocellular carcinoma.

    BACKGROUND: In the Phase 3 REFLECT trial in patients with unresectable hepatocellular carcinoma (uHCC), the multitargeted tyrosine kinase inhibitor lenvatinib was noninferior to sorafenib on the primary outcome of overall survival. Post-hoc review revealed imbalances in prognostic variables between treatment arms. Here, we re-analyse overall survival data from REFLECT to adjust for the imbalance in covariates. METHODS: Univariable and multivariable adjustments were undertaken for a candidate set of covariates that a physician panel indicated could be prognostically associated with overall survival in uHCC. The candidate set included variables observed at baseline and post-randomisation. Univariable analyses were based on a stratified Cox model. The multivariable analysis used a "forwards stepwise" Cox model. RESULTS: Univariable analysis identified alpha-fetoprotein (AFP) as the most influential variable. The chosen multivariable Cox model yielded an estimated adjusted hazard ratio for lenvatinib of 0.814 (95% CI: 0.699-0.948) when only baseline variables were included. Adjusting for post-randomisation treatment variables further increased the estimated superiority of lenvatinib. CONCLUSIONS: Covariate adjustment of REFLECT suggests that the original noninferiority trial likely underestimated the true effect of lenvatinib on overall survival, owing to an imbalance in baseline prognostic covariates and the greater use of post-treatment therapies in the sorafenib arm. TRIAL REGISTRATION: NCT01761266 (submitted January 2, 2013).
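    The "forwards stepwise" selection named in the METHODS can be sketched generically. The loop below is a minimal illustration, not the REFLECT analysis code: `score_model` stands in for fitting a stratified Cox model and returning a likelihood-based score where higher is better (e.g. negative AIC), and the covariate names and weights are invented.

```python
def forward_stepwise(candidates, score_model, min_improvement=0.0):
    """Greedily add the covariate that most improves the model score.

    candidates      -- iterable of covariate names to consider
    score_model     -- callable taking a list of covariates and returning a
                       score where HIGHER is better (e.g. -AIC of a Cox fit)
    min_improvement -- stop when the best addition improves by less than this
    """
    selected = []
    remaining = list(candidates)
    current_score = score_model(selected)
    while remaining:
        # Score each one-covariate extension of the current model.
        trial_scores = {c: score_model(selected + [c]) for c in remaining}
        best = max(trial_scores, key=trial_scores.get)
        if trial_scores[best] - current_score <= min_improvement:
            break  # no remaining candidate improves the fit enough
        selected.append(best)
        remaining.remove(best)
        current_score = trial_scores[best]
    return selected

# Toy stand-in scorer: pretend AFP explains most variation, ECOG a little,
# and each added covariate pays a complexity penalty of 0.5.
weights = {"AFP": 5.0, "ECOG": 1.0, "bodyweight": 0.0}
score = lambda covs: sum(weights[c] for c in covs) - 0.5 * len(covs)
forward_stepwise(weights, score)  # -> ["AFP", "ECOG"]
```

    With this toy scorer the loop picks AFP first, mirroring the abstract's finding that AFP was the most influential single variable, and stops once no candidate repays its complexity penalty.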

    Mendelian randomization: estimation of inpatient hospital costs attributable to obesity.

    BACKGROUND: Mendelian randomization is a type of instrumental variable (IV) analysis that uses inherited genetic variants as instruments to estimate causal effects attributable to genetic factors. This study aims to estimate the impact of obesity on annual inpatient healthcare costs in the UK using linked data from the UK Biobank and Hospital Episode Statistics (HES). METHODS: UK Biobank data for 482,127 subjects were linked with HES inpatient admission records, and costs were assigned to episodes of care. A two-stage least squares (TSLS) IV model and a TSLS two-part cost model were compared with a naïve regression of inpatient healthcare costs on body mass index (BMI). RESULTS: The naïve analysis of annual cost on continuous BMI predicted £21.61 [95% CI £20.33-£22.89] of additional annual cost per unit increase in BMI. The TSLS IV model predicted £14.36 [95% CI £0.31-£28.42] of additional annual cost per unit increase in BMI. Modelled with a binary obesity variable, the naïve analysis predicted that obese subjects incurred £205.53 [95% CI £191.45-£219.60] greater costs than non-obese subjects. The TSLS model predicted costs £201.58 [95% CI £4.32-£398.84] greater for obese subjects than for non-obese subjects. CONCLUSIONS: The IV models provide evidence for a causal relationship between obesity and higher inpatient healthcare costs. Compared with the naïve models, the binary IV model found a slightly smaller marginal effect of obesity, and the continuous IV model found a slightly smaller marginal effect of a single-unit increase in BMI.
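    The two-stage least squares estimator at the heart of the METHODS can be demonstrated on simulated data. Everything below is invented for illustration (a genetic instrument, an unobserved confounder, and a true causal effect of 20 per BMI unit): the point is that the naïve regression is biased upward by the confounder, while TSLS, which uses only the instrument-driven variation in BMI, recovers a slope near the true causal effect.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: a genetic score g shifts BMI but (by the IV assumption)
# affects cost only through BMI; confounder u raises both BMI and cost,
# which biases the naive regression. True causal effect of BMI on cost: 20.
n = 20_000
g = rng.normal(size=n)                  # instrument (e.g. a BMI genetic score)
u = rng.normal(size=n)                  # unobserved confounder
bmi = 27 + 1.5 * g + 2.0 * u + rng.normal(size=n)
cost = 100 + 20.0 * bmi + 50.0 * u + rng.normal(scale=50, size=n)

def ols_slope(x, y):
    """Slope from a simple least-squares regression of y on [1, x]."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Naive regression: confounded, so the slope overshoots 20.
naive = ols_slope(bmi, cost)

# Two-stage least squares:
#   stage 1 -- regress the exposure (BMI) on the instrument, keep fitted values
#   stage 2 -- regress the outcome (cost) on the stage-1 fitted values
X1 = np.column_stack([np.ones(n), g])
b1, *_ = np.linalg.lstsq(X1, bmi, rcond=None)
bmi_hat = X1 @ b1
tsls = ols_slope(bmi_hat, cost)
```

    Note the stage-2 standard error should not be taken from this naïve second regression in practice; dedicated IV routines correct it, which is one reason the study's TSLS confidence intervals are much wider than the naïve ones.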

    Analysis of Human Accelerated DNA Regions Using Archaic Hominin Genomes

    Several previous comparisons of the human genome with other primate and vertebrate genomes identified genomic regions that are highly conserved in vertebrate evolution but fast-evolving on the human lineage. These human accelerated regions (HARs) may be regions of past adaptive evolution in humans. Alternatively, they may be the result of non-adaptive processes, such as biased gene conversion. We captured and sequenced DNA from a collection of previously published HARs using DNA from an Iberian Neandertal. Combining these new data with shotgun sequence from the Neandertal and Denisova draft genomes, we determine at least one archaic hominin allele for 84% of all positions within HARs. We find that 8% of HAR substitutions are not observed in the archaic hominins and are thus recent, in the sense that the derived allele had not come to fixation in the common ancestor of modern humans and archaic hominins. Further, we find that recent substitutions in HARs tend to have come to fixation faster than substitutions elsewhere in the genome and that substitutions in HARs tend to cluster in time, consistent with an episodic rather than a clock-like process underlying HAR evolution. Our catalog of sequence changes in HARs will help prioritize them for functional studies of genomic elements potentially responsible for modern human adaptations.
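    The classification behind the 8% figure can be sketched in a few lines, using invented data: a HAR substitution counts as "recent" if the derived allele is absent from all archaic hominin alleles determined at that position.

```python
def recent_fraction(sites):
    """sites: list of (derived_allele, archaic_alleles_observed) pairs,
    restricted to positions where at least one archaic allele was determined.
    Returns the fraction of substitutions where no archaic hominin carries
    the derived allele (i.e. the substitution postdates the split)."""
    recent = sum(1 for derived, archaic in sites if derived not in archaic)
    return recent / len(sites)

# Invented example: at 1 of 4 covered positions, the derived allele is
# absent from the Neandertal/Denisova reads, so that substitution is recent.
sites = [("A", {"A"}), ("T", {"T", "C"}), ("G", {"A"}), ("C", {"C"})]
recent_fraction(sites)  # -> 0.25
```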

    C3d‐positive donor‐specific antibodies have a role in pretransplant risk stratification of cross‐match‐positive HLA‐incompatible renal transplantation: United Kingdom multicentre study

    Anti‐HLA‐antibody characteristics help to risk‐stratify patients and improve long‐term renal graft outcomes. Complement activation by donor‐specific antibody (DSA) is an important characteristic that may determine renal allograft outcome. Graft outcomes are heterogeneous within moderate‐ to high‐immunological‐risk (cross‐match‐positive) cases. We explored the role of C3d‐positive DSAs in sub‐stratifying cross‐match‐positive cases and related them to graft outcomes. We investigated 139 cross‐match‐positive living‐donor renal transplant recipients from four transplant centres in the United Kingdom. The C3d assay was performed on serum samples obtained pretreatment (predesensitization) and at Day 14 post‐transplant. C3d‐positive DSAs were found in 52 (37%) patients at pretreatment and in 37 (27%) patients at Day 14 post‐transplant. Median follow‐up was 48 months (IQR 20.47–77.57). In the multivariable analysis, pretreatment C3d‐positive DSA was independently associated with reduced overall graft survival, with a hazard ratio of 3.29 (95% CI 1.37–7.86). The relative risk of death‐censored five‐year graft failure was 2.83 (95% CI 1.56–5.13). Patients with C3d‐positive DSAs at both pretreatment and Day 14 had the worst five‐year graft survival, at 45.5% compared with 87.2% in patients C3d‐negative at both time points, with a relative risk of death‐censored five‐year graft failure of 4.26 (95% CI 1.79–10.09). In this multicentre study, we demonstrate for the first time the utility of C3d analysis as a distinctive biomarker to sub‐stratify the risk of poor graft outcome in cross‐match‐positive living‐donor renal transplantation.
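    The relative risks quoted above are a standard two-group comparison; a textbook version with a Wald-type interval on the log scale is sketched below. The event counts in the example are invented for illustration (the study's published counts and exact interval method may differ).

```python
from math import exp, log, sqrt

def relative_risk(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of group A vs group B with a Wald-type 95% CI computed
    on the log scale (a standard textbook interval; the study's exact
    method may differ). Returns (rr, lower, upper)."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) from the four cell counts.
    se = sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lower, upper = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lower, upper

# Invented counts: 20/52 graft failures in C3d-positive vs 12/87 in
# C3d-negative patients.
relative_risk(20, 52, 12, 87)  # ≈ (2.79, 1.49, 5.22)
```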

    Health impacts of environmental and social interventions designed to increase deprived communities' access to urban woodlands: a mixed-methods study

    Background: Contact with natural environments can bring health benefits, but research is lacking on how changes in access to natural environments might improve health, especially for deprived populations. Objective: To evaluate the health impacts of woodland environment interventions intended to increase communities’ engagement with these woodlands. Design: A prospective study of Forestry Commission Scotland’s Woods In and Around Towns (WIAT) programme in deprived communities to enhance public access to natural environments. The study investigated the impact that WIAT had on community-level mental health over time. Setting: Three intervention and three control woodland sites, and associated communities within 1.5 km of the woodlands, located in central Scotland and eligible for WIAT support. Participants: A core community survey was administered at each site in three waves, at baseline and after each phase of intervention (n = 5460, panel A). The completed survey contained a nested longitudinal cohort (n = 609, panel B). Community members also undertook 6-monthly environmental audits at all sites (n = 256) and participated in post-intervention focus groups (n = 34). Interventions: Phase 1 involved physical changes to the woodlands, including footpaths, entrances and vegetation. Phase 2 involved community engagement events promoting woodland use. Main outcome measures: The primary outcome was the Perceived Stress Scale (PSS). Other health measures included health-related quality of life (HRQoL) EuroQol-5 Dimensions (EQ-5D), physical activity (PA) [International Physical Activity Questionnaire (IPAQ)], connectedness to nature [Inclusion of Nature in Self (INS) scale] and social cohesion. Results: The PSS scores significantly increased in the intervention group and marginally decreased in the control group. Multilevel regression models showed a differential impact between the intervention and the control at survey wave 3 in panel A [B (unstandardised coefficient) 3.58, 95% confidence interval (CI) 2.85 to 4.31; p < 0.001] and in panel B [B 3.03, 95% CI 1.54 to 4.52; p < 0.001]. Using the same analytical approach, no significant change in HRQoL was associated with the intervention. Economic assessment included an illustrative cost–utility analysis and a cost–consequences analysis. The differential in stress between the intervention group and the control group was lower or non-significant in those who visited ‘nature’ in the previous year [panel A, B 1.9, 95% CI 0.8 to 3.0; p < 0.001; panel B, B 0.64, 95% CI –1.60 to 2.88; p = 0.57]. The IPAQ score showed a positive association with the intervention for moderate levels of PA [panel B, B 559.3, 95% CI 211.3 to 907.2; p = 0.002] and overall PA [panel B, B 861.5, 95% CI 106.5 to 1616.4; p = 0.025]. The intervention was also associated with increased nature connectedness and social cohesion by wave 3 – significant for panel A only. Qualitative and quantitative evidence showed that interventions increased the perceived quality of the woodland environment and enhanced its enjoyment for different activities, but the increase in use of natural environments post intervention was only 6% (panel B). Limitations: This study was limited to three intervention sites. External factors may be the primary influence on health outcomes. Conclusions: The WIAT interventions did not improve community-level health within 6 months of completion, and hence there was no basis for demonstrating cost-effectiveness. However, the WIAT interventions are low cost (average £11.80 per person in the eligible population) and have potential for cost-effectiveness, if health benefits were found in the longer term. Future work: Using routinely collected data to consider a whole-programme evaluation is recommended.

    Detecting Weak Spectral Lines in Interferometric Data through Matched Filtering

    Modern radio interferometers enable observations of spectral lines with unprecedented spatial resolution and sensitivity. In spite of these technical advances, many lines of interest are still at best weakly detected and therefore necessitate detection and analysis techniques specialized for the low signal-to-noise ratio (S/N) regime. Matched filters can leverage knowledge of the source structure and kinematics to increase the sensitivity of spectral line observations. Application of the filter in the native Fourier domain improves S/N while simultaneously avoiding the computational cost and ambiguities associated with imaging, making matched filtering a fast and robust method for weak spectral line detection. We demonstrate how an approximate matched filter can be constructed from a previously observed line or from a model of the source, and we show how this filter can be used to robustly infer a detection significance for weak spectral lines. When applied to ALMA Cycle 2 observations of CH3OH in the protoplanetary disk around TW Hya, the technique yields a ≈53% S/N boost over aperture-based spectral extraction methods, and we show that an even higher boost will be achieved for observations at higher spatial resolution. A Python-based open-source implementation of this technique is available under the MIT license at http://github.com/AstroChem/VISIBLE.
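    The S/N logic of matched filtering can be shown in a toy one-dimensional setting. The real technique (and the VISIBLE package) operates on interferometric visibilities channel by channel; the sketch below works on a single simulated spectrum with an invented Gaussian line, and is not the package's API. The peak of the normalised filter response is the detection significance, and it comfortably exceeds the per-channel S/N of the raw data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated spectrum: a weak Gaussian line (peak S/N of 2 per channel)
# buried in unit-variance noise. The template encodes the expected shape.
channels = np.arange(256)
template = np.exp(-0.5 * ((channels - 128) / 4.0) ** 2)   # expected profile
signal = 2.0 * np.exp(-0.5 * ((channels - 128) / 4.0) ** 2)
spectrum = signal + rng.normal(scale=1.0, size=channels.size)

def matched_filter_snr(data, kernel):
    """Normalised matched-filter response; the peak value is the S/N."""
    kernel = kernel / np.sqrt(np.sum(kernel ** 2))   # unit-energy template
    response = np.correlate(data, kernel, mode="same")
    # Estimate the noise level of the response away from the peak.
    off_peak = np.abs(np.arange(data.size) - np.argmax(response)) > 20
    return response / np.std(response[off_peak])

snr = matched_filter_snr(spectrum, template)
# The peak response (~5) sits well above the raw per-channel S/N of 2,
# illustrating the boost the abstract reports over direct extraction.
```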

    Examining the Evidence for Chytridiomycosis in Threatened Amphibian Species

    Extinction risks are increasing for amphibians due to rising threats and minimal conservation efforts. Nearly one quarter of all threatened/extinct amphibians in the IUCN Red List are purportedly at risk from the disease chytridiomycosis. However, a closer look at the data reveals that Batrachochytrium dendrobatidis (the causal agent) has been identified and confirmed to cause clinical disease in only 14% of these species. Primary literature surveys confirm these findings, ruling out major discrepancies between Red List assessments and real-time science. Despite widespread interest in chytridiomycosis, little progress has been made between assessment years to acquire evidence for the role of chytridiomycosis in species-specific amphibian declines. Instead, assessment teams invoke the precautionary principle when listing chytridiomycosis as a threat. Precaution is valuable when dealing with the world's most threatened taxa; however, scientific research is needed to distinguish between real and predicted threats in order to better prioritize conservation efforts. Fast-paced, cost-effective, in situ research to confirm or rule out chytridiomycosis in species currently hypothesized to be threatened by the disease would be a step in the right direction. Ultimately, determining the manner in which amphibian conservation resources are utilized is a conversation for the greater conservation community that we hope to stimulate here.

    SARS-CoV-2 evolution during treatment of chronic infection

    The spike protein of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) is critical for virus infection through the engagement of the human ACE2 protein [1] and is a major antibody target. Here we show that chronic infection with SARS-CoV-2 leads to viral evolution and reduced sensitivity to neutralizing antibodies in an immunosuppressed individual treated with convalescent plasma, by generating whole-genome ultra-deep sequences for 23 time points that span 101 days and using in vitro techniques to characterize the mutations revealed by sequencing. There was little change in the overall structure of the viral population after two courses of remdesivir during the first 57 days. However, after convalescent plasma therapy, we observed large, dynamic shifts in the viral population, with the emergence of a dominant viral strain that contained a substitution (D796H) in the S2 subunit and a deletion (ΔH69/ΔV70) in the S1 N-terminal domain of the spike protein. As passively transferred serum antibodies diminished, viruses with the escape genotype were reduced in frequency, before returning during a final, unsuccessful course of convalescent plasma treatment. In vitro, the spike double mutant bearing both ΔH69/ΔV70 and D796H conferred modestly decreased sensitivity to convalescent plasma, while maintaining infectivity levels that were similar to the wild-type virus. The spike substitution mutant D796H appeared to be the main contributor to the decreased susceptibility to neutralizing antibodies, but this mutation resulted in an infectivity defect. The spike deletion mutant ΔH69/ΔV70 had a twofold higher level of infectivity than wild-type SARS-CoV-2, possibly compensating for the reduced infectivity of the D796H mutation. These data reveal strong selection on SARS-CoV-2 during convalescent plasma therapy, which is associated with the emergence of viral variants that show evidence of reduced susceptibility to neutralizing antibodies in immunosuppressed individuals.
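    The "shifts in the viral population" the abstract describes come down to tracking variant-allele frequencies across ultra-deep sequencing time points. The bookkeeping can be sketched as below; the read counts are invented to mimic the qualitative pattern reported (rise under plasma, decline as antibodies wane, re-emergence), and the depth threshold is an assumed quality filter, not the study's pipeline.

```python
def allele_frequency_trajectory(counts, min_depth=100):
    """counts: list of (day, variant_reads, total_reads) tuples for one site
    (e.g. the position of D796H). Returns [(day, frequency), ...], skipping
    time points whose coverage falls below min_depth."""
    return [(day, var / total)
            for day, var, total in counts
            if total >= min_depth]

# Invented counts mimicking the abstract's pattern: the escape allele rises
# under plasma therapy, falls as antibody levels wane, then returns. The
# day-70 sample is dropped by the coverage filter.
counts = [(57, 30, 1500), (66, 1200, 1400), (70, 5, 40),
          (82, 150, 1300), (93, 900, 1100)]
trajectory = allele_frequency_trajectory(counts)
```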