633 research outputs found

    Chemical information matters: an e-Research perspective on information and data sharing in the chemical sciences

    Recently, a number of organisations have called for open access to scientific information and especially to the data obtained from publicly funded research, among which the Royal Society report and the European Commission press release are particularly notable. It has long been accepted that building research on the foundations laid by other scientists is both effective and efficient. Regrettably, some disciplines, chemistry being one, have been slow to recognise the value of sharing and have thus been reluctant to curate their data and information in preparation for exchanging it. The very significant increases in both the volume and the complexity of the datasets produced have encouraged the expansion of e-Research, and stimulated the development of methodologies for managing, organising, and analysing "big data". We review the evolution of cheminformatics, the amalgam of chemistry, computer science, and information technology, and assess the wider e-Science and e-Research perspective. Chemical information does matter, as do matters of communicating data and collaborating with data. For chemistry, unique identifiers, structure representations, and property descriptors are essential to the activities of sharing and exchange. Open science entails the sharing of more than mere facts: for example, the publication of negative outcomes can facilitate better understanding of which synthetic routes to choose, an aspiration of the Dial-a-Molecule Grand Challenge. The protagonists of open notebook science go even further and exchange their thoughts and plans. We consider the concepts of preservation, curation, provenance, discovery, and access in the context of the research lifecycle, and then focus on the role of metadata, particularly the ontologies on which the emerging chemical Semantic Web will depend. Among our conclusions, we present our choice of the "grand challenges" for the preservation and sharing of chemical information.

    Diverse perceptions of smart spaces

    This is the era of smart technology and of ‘smart’ as a meme, so we have run three workshops to examine the ‘smart’ meme and the exploitation of smart environments. The literature relating to smart spaces focuses primarily on technologies and their capabilities. Our three workshops demonstrated that we require a stronger user focus if we are to exploit spaces ascribed as smart to best advantage: we examined the concept of smartness from a variety of perspectives, in collaboration with a broad range of contributors. We have prepared this monograph mainly to report on the third workshop, held at Bournemouth University in April 2012, but also consider the lessons learned from all three. We conclude with a roadmap for a fourth (and final) workshop, which is intended to emphasise the overarching importance of the humans using the space.

    Scientific and technical data sharing: a trading perspective

    It is arguably a precept that the open sharing of data maximises the scientific utility of the research that generated that data. Indeed, progress depends on individual scientists being able to build on the results produced by others. The means to facilitate sharing undoubtedly exist, but various studies have identified reluctance among researchers to share information with their peers, at least until the professional priorities of the original researchers have been accommodated. With a view to encouraging less inhibited collaboration, we appraise the processes of data exchange from the perspective of a trading environment and consider how data exchanges might promote (or perhaps hinder) collaboration in data-rich scientific research disciplines and how such an exchange might be set up. We suggest an exchange with trusted brokers (akin to the commodity markets) as a way to overcome the challenges of the current environment. We conclude by encouraging the scientific and technical community to debate the merits of a trading perspective on data sharing and exchange.

    Non-Thermal Dark Matter, High Energy Cosmic Rays and Late-Decaying Particles From Inflationary Quantum Fluctuations

    It has been suggested that the origin of cosmic rays above the GZK limit might be explained by the decay of particles, X, with mass of the order of 10^{12} GeV. Generation of heavy particles from inflationary quantum fluctuations is a prime candidate for the origin of the decaying X particles. It has also been suggested that the problem of non-singular galactic halos might be explained if dark matter originates non-thermally from the decay of particles, Y, such that there is a free-streaming length of the order of 0.1 Mpc. Here we explore the possibility that quantum fluctuations might account for the Y particles as well as the X particles. For the case of non-thermal WIMP dark matter with unsuppressed weak interactions we find that there is a general problem with deuterium photo-dissociation, disfavouring WIMP dark matter candidates. For the case of more general dark matter particles, which may have little or no interaction with conventional matter, we discuss the conditions under which X and Y scalars or fermions can account for non-thermal dark matter and cosmic rays. For the case where X and Y scalars are simultaneously produced, we show that galactic halos are likely to have a dynamically significant component of X scalar cold dark matter in addition to the dominant non-thermal dark matter component.
    Comment: 18 pages, LaTeX. Substantially revised with corrected WIMP cross-section.

    Specificity and off-target effects of AAV8-TBG viral vectors for the manipulation of hepatocellular gene expression in mice

    Mice are a widely used pre-clinical model system in large part due to their potential for genetic manipulation. The ability to manipulate gene expression in specific cells under temporal control is a powerful experimental tool. The liver is central to metabolic homeostasis and a site of many diseases, making the targeting of hepatocytes attractive. Adeno-associated virus 8 (AAV8) vectors are valuable instruments for the manipulation of hepatocellular gene expression. However, their off-target effects in mice have not been thoroughly explored. Here, we sought to identify the short-term off-target effects of AAV8 administration in mice. To do this, we injected C57BL/6J wild-type mice with either recombinant AAV8 vectors expressing Cre recombinase or control AAV8 vectors and characterised the changes in general health and in liver physiology, histology and transcriptomics compared to uninjected controls. We observed an acute and transient trend for reduction in homeostatic liver proliferation together with induction of the DNA damage marker γH2AX following AAV8 administration. The latter was enhanced upon Cre recombinase expression by the vector. Furthermore, we observed transcriptional changes in genes involved in circadian rhythm and response to infection. Notably, there were no additional transcriptomic changes upon expression of Cre recombinase by the AAV8 vector. Overall, there was no evidence of liver injury, and only mild T-cell infiltration was observed 14 days following AAV8 infection. These data advance the technique of hepatocellular genome editing through Cre-Lox recombination using Cre-expressing AAV vectors, demonstrating their minimal effects on murine physiology and highlighting the more subtle off-target effects of these systems.

    User-Defined Metadata: Using Cues and Changing Perspectives

    User-defined metadata is useful for curating and helping to provide context for experiment records, but our previous investigations have demonstrated that simply providing the facility to add metadata is not enough to ensure that metadata is added, let alone to ensure that the metadata is of high quality. For metadata to be useful it first has to be present, but enforcing metadata generation is of no benefit if it is low quality, inconsistent, or irrelevant. Researchers need support. One strategy to encourage more effective metadata creation is to design user interfaces that invite users to add metadata by asking them questions. If we ask users specific questions about their experiments and other activities then we could capture more relevant or useful metadata, although there is a risk that asking the wrong questions may lead to loss of valuable metadata terms or the creation of irrelevant material. In this paper we report on a study to investigate how different questions could be used to generate metadata by eliciting information in three different conditions: free recall, changing perspective by thinking about search terms to help someone else, and providing cues by using a set of topic-based questions. We also investigate how responses varied with different information types. The results of the study show that different terms are created under the different conditions, as expected. The use of cues generates the highest numbers of terms and the most diverse range, including elements that are not captured in other conditions. However, important themes generated in other conditions are not produced because the cues to create them are missing. The study also generated a number of unexpected findings, including responses describing information that is not in the original material: personal opinions and experiences, and comments about the information text itself. These unexpected responses have both positive and negative consequences for the generation of metadata and the curation of scientific records. The results of studies using these techniques to capture metadata for chemistry experiments are also discussed.

    A Maximum Eigenvalue Approximation for Crack-Sizing Using Ultrasonic Arrays

    Ultrasonic phased array systems are becoming increasingly popular as tools for the inspection of safety-critical structures within the non-destructive testing industry. The datasets captured by these arrays can be used to image the internal microstructure of individual components, allowing the location and nature of any defects to be deduced. Unfortunately, many of the current imaging algorithms require an arbitrary threshold at which the defect measurements can be taken, and this aspect of subjectivity can lead to varying characterisations of a flaw between different operators. This paper puts forward an objective approach based on the Kirchhoff scattering model and the approximation of the resulting scattering matrices by Toeplitz matrices. A mathematical expression relating the crack size to the maximum eigenvalue of the associated scattering matrix is thus derived. The formula is analysed numerically to assess its sensitivity to the system parameters, and it is shown that the method is most effective for sizing defects that are commensurate with (or just smaller than) the wavelength of the ultrasonic wave. The method is applied to simulated FMC data arising from finite element calculations where the crack length to wavelength ratios range between 0.6 and 1.8. The recovered objective crack size exhibits an error of 12%.
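    The core computational step described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's derivation: it builds a symmetric Toeplitz approximation to a scattering matrix from a single profile and extracts its maximum eigenvalue, the quantity the paper relates to crack size. The sinc-like profile is an assumed stand-in for a Kirchhoff-model scattering response.

```python
import numpy as np

def toeplitz_from_profile(profile):
    """Symmetric Toeplitz matrix whose (i, j) entry is profile[|i - j|]."""
    n = len(profile)
    idx = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return np.asarray(profile)[idx]

def max_eigenvalue(matrix):
    """Largest eigenvalue magnitude of a square matrix."""
    return float(np.max(np.abs(np.linalg.eigvals(matrix))))

# Assumed illustrative scattering profile (NOT from the paper).
n = 16
profile = np.sinc(np.linspace(0.0, 2.0, n))
S = toeplitz_from_profile(S_profile := profile)
lam_max = max_eigenvalue(S)
```

    In the paper's approach, a derived formula would then map `lam_max` back to a crack length; the mapping itself is not reproduced here.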

    The detection of flaws in austenitic welds using the decomposition of the time reversal operator

    The non-destructive testing of austenitic welds using ultrasound plays an important role in the assessment of the structural integrity of safety-critical structures. The internal microstructure of these welds is highly scattering and can lead to the obscuration of defects when investigated by traditional imaging algorithms. This paper proposes an alternative objective method for the detection of flaws embedded in austenitic welds based on the singular value decomposition of the time-frequency domain response matrices. The distribution of the singular values is examined in the cases where a flaw exists and where there is no flaw present. A lower threshold on the singular values, specific to austenitic welds, is derived which, when exceeded, indicates the presence of a flaw. The detection criterion is successfully implemented on both synthetic and experimental data. The datasets arising from welds containing a flaw are further interrogated using the decomposition of the time reversal operator (DORT) method and the total focussing method (TFM), and it is shown that images constructed via the DORT algorithm typically exhibit a higher signal-to-noise ratio than those constructed by the TFM algorithm.
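    The detection idea can be sketched in a few lines. This is a hedged, simplified stand-in (not the paper's weld-specific threshold): a strong scatterer contributes a dominant singular value to the array response matrix, so comparing the largest singular value against a lower threshold flags a flaw. The noise level, flaw strength, and threshold value below are all assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def detect_flaw(K, threshold):
    """Return True when the largest singular value of K exceeds threshold."""
    sigma = np.linalg.svd(K, compute_uv=False)
    return bool(sigma[0] > threshold)

# Synthetic response matrices (assumed parameters, for illustration only).
n = 32
noise = rng.normal(scale=0.05, size=(n, n))  # grain scattering alone
u = rng.normal(size=n)
u /= np.linalg.norm(u)
flaw = 2.0 * np.outer(u, u)                  # rank-one flaw contribution

no_flaw = detect_flaw(noise, threshold=1.0)            # expect False
with_flaw = detect_flaw(noise + flaw, threshold=1.0)   # expect True
```

    The rank-one term mimics the DORT observation that a well-resolved scatterer maps to a single dominant singular vector; the paper derives its threshold from the singular-value distribution of flaw-free weld data rather than fixing it by hand.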