
    Differential effects of lipid biosynthesis inhibitors on Zika and Semliki Forest viruses

    The recent outbreak of infection with Zika virus (ZIKV; Flaviviridae) has attracted attention to this previously neglected mosquito-borne pathogen and the need for efficient therapies. Since flavivirus replication is generally known to be dependent on fatty acid biosynthesis, two inhibitors of this pathway, 5-(tetradecyloxy)-2-furoic acid (TOFA) and cerulenin, were tested for their potential to inhibit virus replication. At concentrations previously shown to inhibit the replication of other flaviviruses, neither drug had a significant antiviral effect against ZIKV, but both reduced the replication of the unrelated mosquito-borne Semliki Forest virus (Togaviridae).

    Burying beetles


    A methodology to extract outcomes from routine healthcare data for patients with locally advanced non-small cell lung cancer

    BACKGROUND: Outcomes for patients in the UK with locally advanced non-small cell lung cancer (LA NSCLC) are amongst the worst in Europe. Assessing outcomes is important for analysing the effectiveness of current practice. However, data quality is inconsistent and regular large-scale analysis is challenging. This project investigates the use of routine healthcare datasets to determine progression-free survival (PFS) and overall survival (OS) of patients treated with primary radical radiotherapy for LA NSCLC. METHODS: All LA NSCLC patients treated with primary radical radiotherapy over a 2-year period were identified, and paired manual and routine data were generated for an initial pilot study. Manual data were extracted from hospital records and considered the gold standard. Key time points were the dates of diagnosis, recurrence, death or last clinical encounter. Routine data were collected from various sources, including Hospital Episode Statistics, the Personal Demographic Service, chemotherapy data and radiotherapy datasets. Relevant event dates were defined by proxy time points and refined using backdating and time-interval optimization. Dataset correlations were then tested on key clinical outcome indicators to establish whether routine data could be used as a reliable proxy measure for manual data. RESULTS: Forty-three patients were identified for the pilot study. The manual data showed a median age of 67 years (range 46-89 years) and all patients had stage IIIA/B disease. Using the manual data, the median PFS was 10.78 months (range 1.58-37.49 months) and the median OS was 16.36 months (range 2.69-37.49 months). Based on routine data, using proxy measures, the estimated median PFS was 10.68 months (range 1.61-31.93 months) and the estimated median OS was 15.38 months (range 2.14-33.71 months). Overall, the routine data underestimated the PFS and OS relative to the manual data, but there was good correlation, with a Pearson correlation coefficient of 0.94 for PFS and 0.97 for OS. CONCLUSIONS: This is a novel approach that uses routine datasets to determine outcome indicators in patients with LA NSCLC as a surrogate for manual data analysis. The ability to enable efficient, large-scale analysis of current lung cancer strategies has a huge potential impact on the healthcare system.
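
    To make the proxy-interval idea concrete, here is a minimal sketch (hypothetical field names and dates; the study's actual proxy rules, backdating and time-interval optimization are more involved) of deriving a routine-data PFS estimate and checking its agreement with manually extracted dates:

```python
# Sketch only: derive PFS intervals from proxy event dates in routine data
# and compare them with gold-standard manual dates. All records are invented.
from datetime import date
import numpy as np

def months_between(start, end):
    """Interval in months, using the mean month length (30.44 days)."""
    return (end - start).days / 30.44

# One tuple per patient: diagnosis date, proxy progression date inferred
# from routine data (e.g. first post-treatment chemotherapy event), and
# the manually extracted progression date treated as the gold standard.
records = [
    (date(2015, 1, 10), date(2015, 11, 2), date(2015, 11, 20)),
    (date(2015, 3, 5),  date(2016, 1, 15), date(2016, 2, 1)),
    (date(2015, 6, 1),  date(2016, 9, 9),  date(2016, 9, 30)),
]

routine_pfs = np.array([months_between(dx, proxy) for dx, proxy, _ in records])
manual_pfs = np.array([months_between(dx, manual) for dx, _, manual in records])

# Pearson correlation between proxy and gold-standard PFS, analogous to the
# agreement check used in the pilot study.
r = np.corrcoef(routine_pfs, manual_pfs)[0, 1]
print(f"median routine PFS: {np.median(routine_pfs):.2f} months, r = {r:.2f}")
```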

    Modeling Spatially and Temporally Complex Range Dynamics When Detection is Imperfect

    Species distributions are determined by the interaction of multiple biotic and abiotic factors, which produces complex spatial and temporal patterns of occurrence. As habitats and climate change due to anthropogenic activities, there is a need to develop species distribution models that can quantify these complex range dynamics. In this paper, we develop a dynamic occupancy model that uses a spatial generalized additive model to estimate non-linear spatial variation in occupancy not accounted for by environmental covariates. The model is flexible and can accommodate data from a range of sampling designs that provide information about both occupancy and detection probability. Output from the model can be used to create distribution maps and to estimate indices of temporal range dynamics. We demonstrate the utility of this approach by modeling long-term range dynamics of 10 eastern North American birds using data from the North American Breeding Bird Survey. We anticipate this framework will be particularly useful for modeling species’ distributions over large spatial scales and for quantifying range dynamics over long temporal scales.
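
    In generic notation (the standard dynamic occupancy formulation; not necessarily the authors' exact parameterization), the model class described combines a latent occupancy process with an observation model for imperfect detection, with the spatial GAM entering through a smooth term:

```latex
% Dynamic occupancy model with imperfect detection (generic notation).
% z_{i,t}: latent occupancy of site i in year t; y_{i,j,t}: detection on visit j;
% gamma = colonization, epsilon = extinction, p = detection probability.
\begin{align*}
  z_{i,1} &\sim \mathrm{Bernoulli}(\psi_{i,1}), \\
  z_{i,t+1} \mid z_{i,t} &\sim \mathrm{Bernoulli}\bigl(z_{i,t}(1-\epsilon_{i,t})
      + (1-z_{i,t})\,\gamma_{i,t}\bigr), \\
  y_{i,j,t} \mid z_{i,t} &\sim \mathrm{Bernoulli}(z_{i,t}\, p_{i,j,t}), \\
  \operatorname{logit}(\psi_{i,1}) &= \mathbf{x}_i^{\top}\boldsymbol{\beta}
      + s(\mathrm{lon}_i, \mathrm{lat}_i).
\end{align*}
```

    Here $s(\cdot,\cdot)$ is the smooth spatial surface from the generalized additive model, capturing spatial variation in occupancy not explained by the covariates $\mathbf{x}_i$.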

    Solid phase micro extraction for organic contamination control throughout assembly and operational phases of space missions

    Space missions concerned with life detection contain highly sensitive instruments for the detection of organics. Terrestrial contamination can interfere with signals of indigenous organics in samples and has the potential to cause false positive biosignature detections, which may lead to incorrect suggestions of the presence of life elsewhere in the Solar System. This study assessed the capability of solid phase micro extraction (SPME) as a method for monitoring organic contamination encountered by spacecraft hardware during assembly and operation. SPME-gas chromatography-mass spectrometry (SPME-GC-MS) analysis was performed on potential contaminant source materials, which are commonly used in spacecraft construction. The sensitivity of SPME-GC-MS to organics was assessed in the context of contaminants identified in molecular wipes taken from hardware surfaces on the ExoMars Rosalind Franklin rover. SPME was found to be effective at detecting a wide range of common organic contaminants, including aromatic hydrocarbons, non-aromatic hydrocarbons, nitrogen-containing compounds, alcohols and carbonyls. A notable example of correlation of contaminant with source material was the detection of benzenamine compounds in an epoxy adhesive analyzed by SPME-GC-MS and in the ExoMars rover surface wipe samples. The current form of SPME-GC-MS does not enable quantitative evaluation of contaminants, nor is it suitable for the detection of every group of organic molecules relevant to astrobiological contamination concerns, namely large and/or polar molecules such as amino acids. However, it nonetheless represents an effective new monitoring method for rapid, easy identification of organic contaminants commonly present on spacecraft hardware and could thus be utilized in future space missions as part of their contamination control and mitigation protocols.

    The development and validation of prognostic models for overall survival in the presence of missing data in the training dataset: a strategy with a detailed example.

    Background: The United Kingdom Myeloma Research Alliance (UK-MRA) Myeloma Risk Profile is a prognostic model for overall survival. It was trained and tested on clinical trial data, aiming to improve the stratification of transplant-ineligible (TNE) patients with newly diagnosed multiple myeloma. Missing data is a common problem affecting the development and validation of prognostic models, and decisions on how to address missingness have implications for the choice of methodology.
    Methods: Model building: the training and test datasets were the TNE pathways from two large randomised, multicentre, phase III clinical trials. Potential prognostic factors were identified by expert opinion. Missing data in the training dataset were imputed using multiple imputation by chained equations. Univariate analysis fitted Cox proportional hazards models in each imputed dataset, with the estimates combined by Rubin’s rules. Multivariable analysis applied penalised Cox regression models, with a fixed penalty term across the imputed datasets. The estimates from each imputed dataset and bootstrap standard errors were combined by Rubin’s rules to define the prognostic model. Model assessment: calibration was assessed by visualising the observed and predicted probabilities across the imputed datasets. Discrimination was assessed by combining the prognostic separation D-statistic from each imputed dataset by Rubin’s rules. Model validation: the D-statistic was applied in a bootstrap internal validation process in the training dataset and an external validation process in the test dataset, where acceptable performance was pre-specified. Development of risk groups: risk groups were defined using the tertiles of the combined prognostic index, obtained by combining the prognostic index from each imputed dataset by Rubin’s rules.
    Results: The training dataset included 1852 patients, 1268 (68.47%) with complete-case data. Ten imputed datasets were generated. Five hundred and twenty patients were included in the test dataset. The D-statistic for the prognostic model was 0.840 (95% CI 0.716-0.964) in the training dataset and 0.654 (95% CI 0.497-0.811) in the test dataset, and the corrected D-statistic was 0.801.
    Conclusion: The decision to impute missing covariate data in the training dataset influenced the methods implemented to train and test the model. To extend the current literature and aid future researchers, we have presented a detailed example of one approach. Whilst our example is not without limitations, a benefit is that all of the patient information available in the training dataset was utilised to develop the model.
    Trial registration: Both trials were registered: Myeloma IX, ISRCTN68454111, registered 21 September 2000; Myeloma XI, ISRCTN49407852, registered 24 June 2009.
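
    As an illustration of the pooling step, here is a minimal sketch of Rubin's rules for combining per-imputation estimates (generic code, not from the UK-MRA analysis; the function and variable names are hypothetical):

```python
# Sketch only: Rubin's rules for pooling a point estimate and its variance
# across m imputed datasets, e.g. a Cox log hazard ratio and its SE**2.
import numpy as np

def pool_rubin(estimates, variances):
    """Pool per-imputation estimates (length-m arrays) by Rubin's rules.

    Returns the pooled point estimate and its total variance, combining
    within-imputation and between-imputation variability.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = estimates.size
    q_bar = estimates.mean()            # pooled point estimate
    w = variances.mean()                # within-imputation variance
    b = estimates.var(ddof=1)           # between-imputation variance
    t = w + (1.0 + 1.0 / m) * b         # total variance
    return q_bar, t

# Example: a log hazard ratio estimated in 10 imputed datasets.
est = [0.42, 0.45, 0.40, 0.44, 0.43, 0.41, 0.46, 0.42, 0.44, 0.43]
se2 = [0.01] * 10
beta, var = pool_rubin(est, se2)
print(f"pooled coefficient {beta:.3f}, pooled SE {var ** 0.5:.3f}")
```

    The same pooling applies whether the per-imputation quantity is a regression coefficient, a D-statistic, or a prognostic index, which is why it recurs throughout the workflow described above.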

    Imaging Cerenkov emission as a quality assurance tool in electron radiotherapy

    A new potential quality assurance (QA) method for clinical electron beams, based on imaging Cerenkov light, is explored, including assessment of depth dose, dose linearity, dose-rate linearity and beam profile. The potential of using a standard commercial camera to image Cerenkov light generated by electrons in water for fast QA measurement of a clinical electron beam was explored and compared with ionization chamber measurements. The new method was found to be linear with dose and independent of dose rate (to within 3%). The uncorrected practical range measured in Cerenkov images overestimated the actual value by 3 mm in the worst case. The field size measurements underestimated the dose at the edges by 5% without applying any correction factor; still, the measured field size could be used to monitor relative changes in the beam profile. Finally, the beam-direction profile measurements were independent of field size to within 2%. The deposited energy and Cerenkov production in water were also simulated using GEANT4. Monte Carlo simulation was used to predict the measured light distribution around the water phantom, to reproduce the Cerenkov images (with the camera modelled as a pinhole camera) and to find the relation between deposited energy and Cerenkov production. Simulations of the deposited energy and of Cerenkov light production agreed with each other for a pencil beam of electrons, while for a realistic field size, Cerenkov production in the build-up region overestimated the dose by 8%.
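
    For context (standard physics, not stated in the abstract): Cerenkov light is emitted only by electrons travelling faster than the phase velocity of light in the medium, which sets a threshold energy in water:

```latex
% Cerenkov emission condition and the electron threshold energy in water.
\begin{align*}
  \beta &> \frac{1}{n}, &
  E_{\mathrm{thr}} &= m_e c^2 \left( \frac{1}{\sqrt{1 - 1/n^2}} - 1 \right)
    \approx 0.26\,\mathrm{MeV} \quad (n \approx 1.33).
\end{align*}
```

    Electrons below this threshold deposit dose without producing any light, one reason the Cerenkov signal and the deposited dose can diverge and hence why relations such as those simulated here are needed.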

    Migratory Behavior and Winter Geography Drive Differential Range Shifts of Eastern Birds in Response to Recent Climate Change

    Over the past half century, migratory birds in North America have shown divergent population trends relative to resident species, with the former declining rapidly and the latter increasing. The role that climate change has played in these observed trends is not well understood, despite significant warming over this period. We used 43 y of monitoring data to fit dynamic species distribution models and quantify the rate of latitudinal range shifts in 32 species of birds native to eastern North America. Since the early 1970s, species that remain in North America throughout the year, including both resident and migratory species, appear to have responded to climate change through both colonization of suitable area at the northern leading edge of their breeding distributions and adaptation in place at the southern trailing edges. Neotropical migrants, in contrast, have shown the opposite pattern: contraction at their southern trailing edges and no measurable shifts in their northern leading edges. As a result, the latitudinal distributions of temperate-wintering species have increased while the latitudinal distributions of neotropical migrants have decreased. These results raise important questions about the mechanisms that determine range boundaries of neotropical migrants and suggest that these species may be particularly vulnerable to future climate change. Our results highlight the potential importance of climate change during the nonbreeding season in constraining the response of migratory species to temperature changes at both the trailing and leading edges of their breeding distributions. Future research on the interactions between breeding and nonbreeding climate change is urgently needed.
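
    As a rough illustration of quantifying a range-edge shift (simulated data and a simple quantile edge definition, both hypothetical; the paper derives shifts from the fitted dynamic species distribution models):

```python
# Sketch only: estimate a northern range-edge latitude from annual occupancy
# values on a latitude grid, then fit its linear trend over years.
import numpy as np

def edge_latitude(lats, occ_prob, quantile=0.95, threshold=0.5):
    """Northern range edge: a high quantile of latitudes deemed occupied."""
    occupied = lats[occ_prob >= threshold]
    return np.quantile(occupied, quantile)

lats = np.linspace(25, 50, 200)             # latitude grid (degrees N)
years = np.arange(1972, 2015)

edges = []
for i, yr in enumerate(years):
    center = 37.0 + 0.02 * i                # simulated 0.02 deg/yr shift
    occ = 1 / (1 + np.exp((lats - center - 5) * 2))  # occupancy falls off northward
    edges.append(edge_latitude(lats, occ))

# Rate of shift (degrees latitude per year) via least squares.
slope = np.polyfit(years, edges, 1)[0]
print(f"estimated northern-edge shift: {slope:.3f} deg/yr")
```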