31 research outputs found

    The Curious Case of HU Aquarii - Dynamically Testing Proposed Planetary Systems

    In early 2011, the discovery of two planets moving on surprisingly extreme orbits around the eclipsing polar cataclysmic variable system HU Aquarii was announced based on variations in the timing of mutual eclipses between the two central stars. We perform a detailed dynamical analysis of the stability of the exoplanet system as proposed in that work, revealing that it is simply dynamically infeasible. We then apply the latest rigorous methods used by the Anglo-Australian Planet Search to analyse radial velocity data to re-examine the data used to make the initial claim. Using that data, we arrive at a significantly different orbital solution for the proposed planets, which we then show through dynamical analysis to be equally infeasible. Finally, we discuss the need for caution in linking eclipse-timing data for cataclysmic variables to the presence of planets, and suggest a more likely explanation for the observed signal.
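
    A minimal sketch of the kind of direct N-body stability test described above is shown below, using the open-source rebound integrator in Python. This is not the code or integrator used in the paper, and the masses and orbital elements are illustrative placeholders rather than the published HU Aquarii solution.

```python
# Minimal sketch of a direct N-body stability test of the kind described above.
# Uses the open-source 'rebound' integrator; the masses and orbital elements
# below are illustrative placeholders, NOT the published HU Aqr solution.
import rebound

sim = rebound.Simulation()
sim.units = ("yr", "AU", "Msun")

sim.add(m=0.9)                      # combined mass of the central binary (placeholder)
sim.add(m=0.006, a=3.6, e=0.2)      # "planet 1" (placeholder elements)
sim.add(m=0.005, a=5.4, e=0.5)      # "planet 2" (placeholder elements)
sim.move_to_com()

sim.integrator = "ias15"            # adaptive high-accuracy integrator
sim.exit_max_distance = 50.0        # treat excursion beyond 50 AU as instability

try:
    sim.integrate(1.0e4)            # integrate for 10^4 yr (short demo span)
    print("System survived the integration.")
except rebound.Escape:
    print("A body was ejected: configuration is dynamically unstable.")
```

    A full analysis of the kind summarised above would repeat such integrations over a grid of plausible orbital parameters to map the stable and unstable regions of the solution space.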

    Digital methodology to implement the ECOUTER engagement process

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) – French for ‘to listen’ – is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available, freely distributed mind-mapping software. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the need for purpose-built software for ECOUTER research purposes.

    DataSHIELD – new directions and dimensions

    In disciplines such as biomedicine and the social sciences, sharing and combining sensitive individual-level data is often prohibited by ethical-legal or governance constraints, or by other barriers such as the control of intellectual property or the sheer size of the data sets involved. DataSHIELD (Data Aggregation Through Anonymous Summary-statistics from Harmonised Individual-levEL Databases) is a distributed approach that allows the analysis of sensitive individual-level data from one study, and the co-analysis of such data from several studies simultaneously, without physically pooling them or disclosing any data. Following an initial proof of principle, a stable DataSHIELD platform has now been implemented in a number of epidemiological consortia. This paper reports three new applications of DataSHIELD: post-publication analysis of sensitive data, text data analysis, and privacy-protected data visualisation. Expansion of DataSHIELD's analytic functionality and its application to additional data types demonstrate the broad applicability of the software beyond the biomedical sciences.

    DataSHIELD: taking the analysis to the data, not the data to the analysis

    Research in modern biomedicine and social science requires sample sizes so large that they can often only be achieved through a pooled co-analysis of data from several studies. But the pooling of information from individuals in a central database that may be queried by researchers raises important ethico-legal questions and can be controversial. In the UK this has been highlighted by recent debate and controversy relating to the UK's proposed 'care.data' initiative, and these issues reflect important societal and professional concerns about privacy, confidentiality and intellectual property. DataSHIELD provides a novel technological solution that can circumvent some of the most basic challenges in facilitating the access of researchers and other healthcare professionals to individual-level data. Commands are sent from a central analysis computer (AC) to several data computers (DCs) storing the data to be co-analysed. The data sets are analysed simultaneously but in parallel. The separate parallelized analyses are linked by non-disclosive summary statistics and commands transmitted back and forth between the DCs and the AC. This paper describes the technical implementation of DataSHIELD using a modified R statistical environment linked to an Opal database deployed behind the computer firewall of each DC. Analysis is controlled through a standard R environment at the AC. Based on this Opal/R implementation, DataSHIELD is currently used by the Healthy Obese Project and the Environmental Core Project (BioSHaRE-EU) for the federated analysis of 10 data sets across eight European countries, and this illustrates the opportunities and challenges presented by the DataSHIELD approach. DataSHIELD facilitates important research in settings where: (i) a co-analysis of individual-level data from several studies is scientifically necessary but governance restrictions prohibit the release or sharing of some of the required data, and/or render data access unacceptably slow; (ii) a research group (e.g. in a developing nation) is particularly vulnerable to loss of intellectual property: the researchers want to fully share the information held in their data with national and international collaborators, but do not wish to hand over the physical data themselves; and (iii) a data set is to be included in an individual-level co-analysis but the physical size of the data precludes direct transfer to a new site for analysis.
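
    The federated pattern described above (commands issued by a central analysis computer, individual-level data held behind each data computer's firewall, only non-disclosive summary statistics returned) can be illustrated with a short conceptual sketch. The Python sketch below is not the DataSHIELD/Opal R API; the class names, the pooled-mean example and the disclosure threshold are invented purely for illustration.

```python
# Conceptual illustration of the federated pattern described above: individual-level
# data never leave each "data computer" (DC); only non-disclosive summaries
# (here, a sum and a count) travel back to the analysis computer (AC).
# This is NOT the DataSHIELD/Opal R API, just a toy sketch of the idea.
from typing import List, Tuple

MIN_GROUP_SIZE = 5  # example disclosure-control threshold (assumed value)

class DataComputer:
    """Holds one study's individual-level data behind its own 'firewall'."""
    def __init__(self, values: List[float]):
        self._values = values  # never exposed directly

    def summarise(self) -> Tuple[float, int]:
        """Return only a non-disclosive summary: (sum, n)."""
        if len(self._values) < MIN_GROUP_SIZE:
            raise PermissionError("Cell count too small to release a summary.")
        return sum(self._values), len(self._values)

def pooled_mean(data_computers: List[DataComputer]) -> float:
    """AC-side co-analysis: combine per-study summaries into a pooled estimate."""
    totals = [dc.summarise() for dc in data_computers]  # parallel in real deployments
    grand_sum = sum(s for s, _ in totals)
    grand_n = sum(n for _, n in totals)
    return grand_sum / grand_n

# Toy example with three 'studies'
studies = [DataComputer([1.2, 0.8, 1.1, 0.9, 1.4, 1.0]),
           DataComputer([2.1, 1.9, 2.2, 2.0, 1.8]),
           DataComputer([1.5, 1.6, 1.4, 1.7, 1.5, 1.6])]
print(f"Pooled mean: {pooled_mean(studies):.3f}")
```

    The real platform applies the same principle to much richer analyses, such as generalised linear models, with the disclosure checks enforced server-side at each DC.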

    The ECOUTER methodology for stakeholder engagement in translational research.

    BACKGROUND: Because no single person or group holds knowledge about all aspects of research, mechanisms are needed to support knowledge exchange and engagement. Expertise in the research setting necessarily includes scientific and methodological expertise, but also expertise gained through the experience of participating in research and/or being a recipient of research outcomes (as a patient or member of the public). Engagement is, by its nature, reciprocal and relational: the process of engaging research participants, patients, citizens and others (the many 'publics' of engagement) brings them closer to the research but also brings the research closer to them. When translating research into practice, engaging the public and other stakeholders is explicitly intended to make the outcomes of translation relevant to its constituency of users. METHODS: In practice, engagement faces numerous challenges and is often time-consuming, expensive and 'thorny' work. We explore the epistemic and ontological considerations and implications of four common critiques of engagement methodologies that contest: representativeness, communication and articulation, impacts and outcome, and democracy. The ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) methodology addresses problems of representation and epistemic foundationalism with an approach that asks, "How could it be otherwise?" ECOUTER affords the possibility of engagement where spatial and temporal constraints are present, relying on saturation as a method of 'keeping open' the possible considerations that might emerge, and including reflexive use of qualitative analytic methods. RESULTS: This paper describes the ECOUTER process, focusing on one worked example and detailing lessons learned from four other pilots. ECOUTER uses mind-mapping techniques to 'open up' engagement, iteratively and organically. ECOUTER aims to balance the breadth, accessibility and user-determination of the scope of engagement. An ECOUTER exercise comprises four stages: (1) engagement and knowledge exchange; (2) analysis of mind-map contributions; (3) development of a conceptual schema (i.e. a map of concepts and their relationships); and (4) feedback, refinement and development of recommendations. CONCLUSION: ECOUTER refuses fixed truths but also refuses a fixed nature. Its promise lies in its flexibility, adaptability and openness. ECOUTER will be formed and re-formed by the needs and creativity of those who use it.

    Quantitative 18F-AV1451 Brain Tau PET Imaging in Cognitively Normal Older Adults, Mild Cognitive Impairment, and Alzheimer's Disease Patients

    Recent developments in tau positron emission tomography (PET) allow assessment of regional neurofibrillary tangle (NFT) deposition in the human brain. Among tau PET molecular probes, 18F-AV1451 is characterized by high selectivity for pathologic tau aggregates over amyloid plaques, limited non-specific binding in white and gray matter, and confined off-target binding. The objectives of the study were (1) to quantitatively characterize regional brain tau deposition measured by 18F-AV1451 PET in cognitively normal older adults (CN), mild cognitive impairment (MCI), and AD participants; (2) to evaluate the correlations between cerebrospinal fluid (CSF) biomarkers or Mini-Mental State Examination (MMSE) scores and 18F-AV1451 PET standardized uptake value ratio (SUVR); and (3) to evaluate the partial volume effects on 18F-AV1451 brain uptake. Methods: The study included a total of 115 participants (CN = 49, MCI = 58, and AD = 8) from the Alzheimer's Disease Neuroimaging Initiative (ADNI). Preprocessed 18F-AV1451 PET images, structural MRIs, and demographic and clinical assessments were downloaded from the ADNI database. A reblurred Van Cittert iteration method was used for voxelwise partial volume correction (PVC) of the PET images. Structural MRIs were used for PET spatial normalization and region of interest (ROI) definition in standard space. Parametric images of 18F-AV1451 SUVR relative to the cerebellum were calculated. ROI SUVR measurements from PVC and non-PVC SUVR images were compared. The correlations between ROI 18F-AV1451 SUVR and MMSE, CSF total tau (t-tau), and phosphorylated tau (p-tau) were also assessed. Results: Prominent specific binding of 18F-AV1451 was found in the amygdala, entorhinal cortex, parahippocampus, fusiform, posterior cingulate, and temporal, parietal, and frontal brain regions. Most regional SUVRs showed significantly higher uptake of 18F-AV1451 in AD than in MCI and CN participants. SUVRs of small regions such as the amygdala, entorhinal cortex and parahippocampus were statistically improved by PVC in all groups (p < 0.01). Although 18F-AV1451 SUVRs tended to be higher in the MCI group than in the CN group, no significant difference in 18F-AV1451 deposition was found between CN and MCI brains with or without PVC (p > 0.05). Declining MMSE scores were observed with increasing 18F-AV1451 binding in the amygdala, entorhinal cortex, parahippocampus, and fusiform. CSF p-tau was positively correlated with 18F-AV1451 deposition. PVC improved the results of the 18F-AV1451 tau deposition and correlation analyses in small brain regions. Conclusion: The typical deposition of 18F-AV1451 tau PET imaging in the AD brain was found in the amygdala, entorhinal cortex, fusiform and parahippocampus, and these regions were strongly associated with cognitive impairment and CSF biomarkers. Although more deposition was observed in the MCI group, 18F-AV1451 PET imaging could not differentiate MCI patients from the CN population. Greater tau deposition was related to decreased MMSE score and increased CSF p-tau, especially in the amygdala, entorhinal cortex and parahippocampus ROIs. PVC improved the results of the tau deposition and correlation analyses in small brain regions and is suggested for routine use in 18F-AV1451 tau PET quantification.
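
    The core SUVR calculation described above (regional uptake divided by mean uptake in a cerebellar reference region) can be sketched as follows. The file names and ROI label values are placeholders, and the sketch omits the spatial normalisation and partial volume correction steps reported in the abstract.

```python
# Minimal sketch of the SUVR calculation described above: regional uptake divided
# by mean uptake in a cerebellar reference region. File names and ROI label values
# are placeholders, and this omits the spatial normalisation and partial volume
# correction steps described in the abstract.
import nibabel as nib
import numpy as np

pet = nib.load("av1451_pet.nii.gz").get_fdata()      # placeholder file name
labels = nib.load("roi_labels.nii.gz").get_fdata()   # placeholder atlas in PET space

CEREBELLUM = 1                                        # placeholder label values
ROIS = {"amygdala": 2, "entorhinal": 3, "parahippocampus": 4, "fusiform": 5}

reference = pet[labels == CEREBELLUM].mean()          # cerebellar reference uptake

suvr = {name: pet[labels == label].mean() / reference for name, label in ROIS.items()}
for name, value in suvr.items():
    print(f"{name:16s} SUVR = {value:.2f}")
```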

    Effect of alirocumab on mortality after acute coronary syndromes. An analysis of the ODYSSEY OUTCOMES randomized clinical trial

    Background: Previous trials of PCSK9 (proprotein convertase subtilisin-kexin type 9) inhibitors demonstrated reductions in major adverse cardiovascular events, but not death. We assessed the effects of alirocumab on death after index acute coronary syndrome (ACS). Methods: ODYSSEY OUTCOMES (Evaluation of Cardiovascular Outcomes After an Acute Coronary Syndrome During Treatment With Alirocumab) was a double-blind, randomized comparison of alirocumab or placebo in 18 924 patients who had an ACS 1 to 12 months previously and elevated atherogenic lipoproteins despite intensive statin therapy. Alirocumab dose was blindly titrated to target achieved low-density lipoprotein cholesterol (LDL-C) between 25 and 50 mg/dL. We examined the effects of treatment on all-cause death and its components, cardiovascular and noncardiovascular death, with log-rank testing. Joint semiparametric models tested associations between nonfatal cardiovascular events and cardiovascular or noncardiovascular death. Results: Median follow-up was 2.8 years. Death occurred in 334 (3.5%) and 392 (4.1%) patients, respectively, in the alirocumab and placebo groups (hazard ratio [HR], 0.85; 95% CI, 0.73 to 0.98; nominal P=0.03). This resulted from nonsignificantly fewer cardiovascular (240 [2.5%] vs 271 [2.9%]; HR, 0.88; 95% CI, 0.74 to 1.05; P=0.15) and noncardiovascular (94 [1.0%] vs 121 [1.3%]; HR, 0.77; 95% CI, 0.59 to 1.01; P=0.06) deaths with alirocumab. In a prespecified analysis of 8242 patients eligible for ≥3 years of follow-up, alirocumab reduced death (HR, 0.78; 95% CI, 0.65 to 0.94; P=0.01). Patients with nonfatal cardiovascular events were at increased risk for cardiovascular and noncardiovascular deaths (P<0.0001 for the associations). Alirocumab reduced total nonfatal cardiovascular events (P<0.001) and thereby may have attenuated the number of cardiovascular and noncardiovascular deaths. A post hoc analysis found that, compared with patients with lower LDL-C, patients with baseline LDL-C ≥100 mg/dL (2.59 mmol/L) had a greater absolute risk of death and a larger mortality benefit from alirocumab (HR, 0.71; 95% CI, 0.56 to 0.90; Pinteraction=0.007). In the alirocumab group, all-cause death declined with achieved LDL-C at 4 months of treatment, to a level of approximately 30 mg/dL (adjusted P=0.017 for linear trend). Conclusions: Alirocumab added to intensive statin therapy has the potential to reduce death after acute coronary syndrome, particularly if treatment is maintained for ≥3 years, if baseline LDL-C is ≥100 mg/dL, or if achieved LDL-C is low. Clinical Trial Registration: URL: https://www.clinicaltrials.gov. Unique identifier: NCT01663402.
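
    As a worked illustration of the absolute effect behind the all-cause death counts quoted above, the sketch below computes the crude absolute risk reduction and number needed to treat, assuming roughly equal arms of about 9 462 patients each (an assumption; the abstract does not give exact arm sizes).

```python
# Worked illustration of absolute risk and number-needed-to-treat from the
# all-cause death counts quoted above. Per-arm denominators are assumed to be
# roughly half of the 18,924 randomized patients (an assumption; the abstract
# does not give exact arm sizes).
deaths_alirocumab, deaths_placebo = 334, 392
n_per_arm = 18_924 // 2  # assumed ~9,462 patients per arm

risk_alirocumab = deaths_alirocumab / n_per_arm   # ~0.035 (3.5%)
risk_placebo = deaths_placebo / n_per_arm         # ~0.041 (4.1%)

arr = risk_placebo - risk_alirocumab              # absolute risk reduction
nnt = 1 / arr                                     # number needed to treat

print(f"ARR = {arr:.3%}, NNT ≈ {nnt:.0f} over a median 2.8 years of follow-up")
```

    Note that the reported hazard ratio of 0.85 comes from a time-to-event analysis that accounts for censoring and follow-up time, unlike this crude risk calculation.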

    Recognizing, reporting and reducing the data curation debt of cohort studies

    Good data curation is integral to cohort studies, but it is not always done to a level necessary to ensure the longevity of the data a study holds. In this opinion paper, we introduce the concept of data curation debt—the data curation equivalent to the software engineering principle of technical debt. Using the context of UK cohort studies, we define data curation debt—describing examples and their potential impact. We highlight that accruing this debt can make it more difficult to use the data in the future. Additionally, the long-running nature of cohort studies means that interest is accrued on this debt and compounded over time—increasing the impact a debt could have on a study and its stakeholders. Primary causes of data curation debt are discussed across three categories: longevity of hardware, software and data formats; funding; and skills shortages. Based on cross-domain best practice, strategies to reduce the debt and preventive measures are proposed—with importance given to the recognition and transparent reporting of data curation debt. Describing the debt in this way, we encapsulate a multi-faceted issue in simple terms understandable by all cohort study stakeholders. Data curation debt is not only confined to the UK, but is an issue the international community must be aware of and address. This paper aims to stimulate a discussion between cohort studies and their stakeholders on how to address the issue of data curation debt. If data curation debt is left unchecked it could become impossible to use highly valued cohort study data, and ultimately represents an existential risk to studies themselves.

    PUblications Metadata Augmentation (PUMA) pipeline.

    Cohort studies collect, generate and distribute data over long periods of time - often over the lifecourse of their participants. It is common for these studies to host a list of publications (which can number many thousands) on their website to demonstrate the impact of the study and to facilitate the search of existing research to which the study data has contributed. The ability to search and explore these publication lists varies greatly between studies. We believe a lack of rich search and exploration functionality for study publications is a barrier to entry for new or prospective users of a study's data, since it may be difficult to find and evaluate previous work in a given area. These lists of publications are also typically manually curated, resulting in a lack of rich metadata to analyse and making bibliometric analysis difficult. We present here a software pipeline that aggregates metadata from a variety of third-party providers to power a web-based search and exploration tool for lists of publications. Alongside core publication metadata (e.g. author lists and keywords), we include geocoding of first authors and citation counts in our pipeline. This allows a characterisation of a study as a whole based on the common locations of authors, frequency of keywords, citation profile, etc. This enriched publication metadata can be useful for generating study impact metrics and web-based graphics for public dissemination. In addition, the pipeline produces a research data set for bibliometric analysis or social studies of science. We use a previously published list of publications from a cohort study as an exemplar input data set to show the output and utility of the pipeline.
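
    A sketch of the kind of aggregation step described above is given below: for each publication, identified by DOI, core metadata and a citation count are fetched from a third-party provider. The Crossref REST API is used here purely as an example provider, and the DOI shown is a placeholder; the PUMA pipeline itself combines several providers and adds first-author geocoding.

```python
# Sketch of the kind of aggregation step described above: for each publication
# (identified by DOI) fetch core metadata and a citation count from a third-party
# provider. The Crossref REST API is used here as one possible provider; the PUMA
# pipeline itself combines several providers and adds geocoding of first authors.
import requests

def fetch_metadata(doi: str) -> dict:
    """Return a small, flat metadata record for one DOI."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    resp.raise_for_status()
    msg = resp.json()["message"]
    return {
        "doi": doi,
        "title": (msg.get("title") or [""])[0],
        "authors": [f"{a.get('given', '')} {a.get('family', '')}".strip()
                    for a in msg.get("author", [])],
        "citations": msg.get("is-referenced-by-count", 0),
    }

# Example: build an enriched record set from a (placeholder) list of DOIs
dois = ["10.1371/journal.pone.0000000"]  # placeholder DOI, not a real publication
records = [fetch_metadata(d) for d in dois]
print(records)
```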

    Predictive milling of pharmaceutical materials using nanoindentation of single crystals

    Five pharmaceutical materials, including two salts and three neutral compounds, have been subjected to nanoindentation analysis at the single-crystal scale. The nanoindentation experiments were used to calculate a brittleness index for each of the five materials. These results were compared with the size reductions obtained on a pilot-plant-scale mill. A good correlation between the single-crystal and pilot-plant-scale results was obtained for the range of materials studied.
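
    As a hedged illustration of the kind of quantity involved, one common definition of a nanoindentation-derived brittleness index is hardness divided by fracture toughness (B = H/Kc). The abstract does not state which definition was used, and the values in the sketch below are placeholders rather than the measured data for the five materials.

```python
# Worked illustration of a brittleness index of the kind derived from
# nanoindentation: one common definition is B = H / Kc (hardness over fracture
# toughness). The numbers below are placeholders, not the measured values for
# the five materials in the study.
def brittleness_index(hardness_gpa: float, toughness_mpa_sqrt_m: float) -> float:
    """B = H / Kc; with H converted to MPa and Kc in MPa*m^0.5, B has units of m^-0.5."""
    hardness_mpa = hardness_gpa * 1000.0
    return hardness_mpa / toughness_mpa_sqrt_m

# Placeholder single-crystal measurements for two hypothetical materials:
# (hardness in GPa, fracture toughness in MPa*m^0.5)
materials = {"compound A": (0.45, 0.05), "compound B": (0.80, 0.20)}
for name, (h, kc) in materials.items():
    print(f"{name}: B = {brittleness_index(h, kc):.0f} m^-0.5")
```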