Inverse Methods: a Powerful Tool for Evaluating Aerosol Data, Exemplified on Cases With Relevance for the Atmosphere and the Aerosol Climate Effect
For a complete description of a given aerosol, more than one parameter is necessary, e.g. parameters concerning size distribution, chemical composition, and particle morphology. On the other hand, most instruments measuring aerosol properties are primarily sensitive to one parameter but cross-sensitive to others. These cross-sensitivities are often eliminated by assumptions during data evaluation, inducing systematic uncertainties in the results.
The use of assumptions can be reduced by combining the information of several instruments on the same aerosol and using inverse methods to interpret the data. The presentation focuses on two example applications of these methods. The first example concerns a size distribution inversion algorithm that combines data from several instruments into one size distribution. The second example deals with an algorithm that retrieves the aerosol asymmetry parameter (with respect to particle scattering) from measurements of the aerosol absorption and spectral scattering and hemispheric backscattering coefficients, thereby providing a set of parameters that completely describes an aerosol with respect to its direct climate effect.
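The inversion idea described above can be illustrated generically: measurements from several instruments are stacked into one linear system, and the size distribution is retrieved by a regularized least-squares fit. This is a minimal sketch, not the presenters' actual algorithm; the instrument kernels, the synthetic "true" distribution, and the noise level are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_bins = 30                                # size-distribution bins to retrieve
diameters = np.logspace(-2, 1, n_bins)     # particle diameter in micrometres (hypothetical grid)

def gaussian_kernel(centres, width):
    """Hypothetical instrument response: Gaussian sensitivity in log-diameter."""
    d = np.log10(diameters)
    return np.exp(-((d[None, :] - centres[:, None]) ** 2) / (2 * width ** 2))

# Stack the response kernels of three invented instruments, each sensitive
# to a different (overlapping) part of the size range.
K = np.vstack([
    gaussian_kernel(np.linspace(-2.0, 0.0, 12), 0.15),   # e.g. a mobility sizer
    gaussian_kernel(np.linspace(-0.5, 1.0, 10), 0.20),   # e.g. an optical counter
    gaussian_kernel(np.linspace(-1.5, 0.5, 8), 0.30),    # e.g. impactor stages
])

# Synthetic "true" lognormal-like distribution and noisy combined measurements.
x_true = np.exp(-((np.log10(diameters) + 0.5) ** 2) / (2 * 0.3 ** 2))
y = K @ x_true + rng.normal(0.0, 0.01, K.shape[0])

# Tikhonov regularization stabilizes the ill-posed inversion:
# minimize ||K x - y||^2 + lam * ||x||^2
lam = 1e-2
x_hat = np.linalg.solve(K.T @ K + lam * np.eye(n_bins), K.T @ y)

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative retrieval error: {rel_err:.3f}")
```

The regularization term trades a small bias for stability: without it, measurement noise would be amplified by the near-singular kernel matrix.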
Blockchain-backed analytics. Adding blockchain-based quality gates to data science projects
A typical analytical lifecycle in data science projects starts with data generation and collection, continues with data preparation and preprocessing, and heads towards project-specific analytics, visualizations and presentations. In order to ensure high-quality trusted analytics, every relevant step of the data-model-result linkage needs to meet certain quality standards, which furthermore should be certified by trusted quality-gate mechanisms. We propose "blockchain-backed analytics", a scalable and easy-to-use generic approach for introducing quality gates to data science projects, backed by the immutable records of a blockchain. To that end, data, models and results are stored as cryptographically hashed fingerprints in mutually linked transactions in a public blockchain database. This approach enables stakeholders of data science projects to track and trace the linkage of data, applied models and modeling results without having to trust escrow systems or any other third party. Herrmann, M.; Petzold, J.; Bombatkar, V. (2018). Blockchain-backed analytics: adding blockchain-based quality gates to data science projects. In: 2nd International Conference on Advanced Research Methods and Analytics (CARMA 2018). Editorial Universitat Politècnica de València, 1-9. https://doi.org/10.4995/CARMA2018.2018.8292
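The data-model-result linkage described in the abstract can be sketched with linked cryptographic fingerprints: each artifact's hash incorporates the hash of its predecessor, so the chain is tamper-evident. This is a minimal illustration of the hashing principle only; the paper's actual contribution additionally anchors these fingerprints in public blockchain transactions, which is not reproduced here, and the artifact contents below are invented.

```python
import hashlib

def fingerprint(payload: bytes, parent: str = "") -> str:
    """Hash an artifact together with its parent fingerprint, so that
    data -> model -> result form a tamper-evident chain."""
    h = hashlib.sha256()
    if parent:
        h.update(parent.encode())
    h.update(payload)
    return h.hexdigest()

# Invented stand-ins for the three lifecycle artifacts.
data = b"raw training data"
model = b"serialized model parameters"
result = b"evaluation report"

data_fp = fingerprint(data)
model_fp = fingerprint(model, parent=data_fp)
result_fp = fingerprint(result, parent=model_fp)

# A verifier recomputes the chain from the artifacts and compares it with
# the fingerprints recorded in the (here omitted) blockchain transactions.
recomputed = fingerprint(result, parent=fingerprint(model, parent=fingerprint(data)))
print(recomputed == result_fp)  # True: chain verifies

# Tampering with an upstream artifact breaks every downstream fingerprint.
tampered = fingerprint(model, parent=fingerprint(b"altered data"))
print(tampered == model_fp)  # False: tampering detected
```

Because only hashes are published, the artifacts themselves can stay private while the linkage remains publicly verifiable.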
Array hybridization and whole genome sequencing as new typing tools for Legionella pneumophila
To understand transmissible human diseases, disciplines such as epidemiology and the surveillance of affected cases are as essential as knowledge about the pathogenesis and course of a disease. Epidemiologists categorize and estimate public health risk factors by taking metadata into account, including geographic aspects and health and social states, in order to study disease transmission and prevent further cases. In addition, a focus on the causative agents themselves is necessary in order to understand their ecology and hence their virulence traits. The causative agents of a severe pneumonia named Legionnaires' disease (LD) are bacteria of the genus Legionella. The putative sources of LD infection are any aerosol-generating natural or man-made fresh water systems. Due to this ubiquitous distribution of legionellae, it is difficult to find the source of infection. Therefore, it is necessary to isolate the bacterium from affected patients, to characterize it further in the laboratory, and to compare the clinical isolates with isolates obtained from probable environmental sources.
The predominant species isolated from LD patients is Legionella pneumophila serogroup (Sg) 1. Intensive genotyping of L. pneumophila Sg1 isolates using the current gold standard, the sequence-based typing (SBT) scheme, revealed limitations in the discrimination of several sequence types (STs) that could not be compensated for by additional phenotypic typing schemes. In practical terms, this means that several clones or STs are found disproportionately frequently in both patients and water systems and cannot be distinguished by current methods. This generates a distorted picture of endemic and globally spread clones, and current typing methods cannot add substantial information during the identification of the infectious source. The aim of this thesis is to develop and implement new typing methods for L. pneumophila isolates with a higher resolution than the gold standard methods.
A DNA-DNA hybridization-based microarray was designed and equipped with probes that specifically target L. pneumophila virulence factors and genes involved in the biosynthesis of lipopolysaccharide structures. Legionellae can be subgrouped on the basis of their lipopolysaccharide structures. Here, the usually phenotypic characterization of L. pneumophila Sg1 is successfully translated into a DNA-based genotypic method. Furthermore, detailed validation of the DNA microarray revealed a higher discriminatory power than the gold standard methods. It enables previously indistinguishable clones to be subdivided, providing valuable information about probable sources of infection.
The second new tool for typing L. pneumophila is based on the core genome of the bacteria. An extended SBT scheme was extracted from the core genome and accordingly named core genome multilocus sequence typing (cgMLST). This genome-wide, gene-by-gene typing approach allows a high genomic resolution of L. pneumophila isolates while retaining epidemiological concordance. A major advantage of this genome-based method is the detection of large recombination events within the analysed genomes, which has so far been reserved for whole genome sequencing. The population structure of legionellae is largely driven by recombination and horizontal gene transfer rather than by spontaneous mutations; therefore, the detection of recombination events is essential for typing L. pneumophila isolates. In addition, the cgMLST scheme assigns a core genome sequence type to the analysed isolate and allows backwards compatibility with the current SBT scheme.
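The gene-by-gene principle behind cgMLST can be sketched briefly: each core-gene sequence is reduced to an allele identifier, an isolate becomes a profile of allele identifiers, and isolates are compared by counting the loci at which their alleles differ. This is a generic illustration, not the thesis's scheme; the locus names and sequences are invented, and real cgMLST uses a curated nomenclature server rather than raw hashes.

```python
import hashlib

def allele_id(sequence: str) -> str:
    """Deterministic allele identifier derived from the gene sequence.
    (Real schemes assign curated allele numbers; a hash stands in here.)"""
    return hashlib.sha256(sequence.upper().encode()).hexdigest()[:8]

def allele_profile(genome: dict) -> dict:
    """Map each core locus of an isolate to its allele identifier."""
    return {locus: allele_id(seq) for locus, seq in genome.items()}

def allelic_distance(profile_a: dict, profile_b: dict) -> int:
    """Number of shared loci at which the two isolates carry different alleles."""
    shared = profile_a.keys() & profile_b.keys()
    return sum(profile_a[locus] != profile_b[locus] for locus in shared)

# Invented toy isolates: they differ only in the last base of lpg0002.
isolate_1 = {"lpg0001": "ATGACCGT", "lpg0002": "ATGGGCTT", "lpg0003": "ATGCCATA"}
isolate_2 = {"lpg0001": "ATGACCGT", "lpg0002": "ATGGGCTA", "lpg0003": "ATGCCATA"}

d = allelic_distance(allele_profile(isolate_1), allele_profile(isolate_2))
print(d)  # prints 1: the isolates differ at a single locus
```

Counting differing loci rather than single nucleotide differences is what gives the approach its robustness to recombination: a recombined gene counts as one allele change regardless of how many bases it altered.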
Both methods proved to be fast, reliable and robust typing methods when applied during outbreak investigations. Furthermore, both systems are particularly suited as routine molecular typing tools for the surveillance of single cases. The raw data are verified and translated into uniform portable codes, which enables the easy transfer and comparison of results. The standardized and portable quality of the results of both methods enables the establishment of a curated global database. This qualifies both methods as potential new gold standard methods for the genotyping of L. pneumophila isolates.
Towards Operational Research Infrastructures with FAIR Data and Services
Environmental research infrastructures aim to provide scientists with facilities, resources and services that enable them to perform advanced research effectively. When addressing societal challenges such as climate change and pollution, scientists usually need data, models and methods from different domains to tackle the complexity of the complete environmental system. Research infrastructures are thus required to ensure that all data, including services, products, and virtual research environments, are FAIR for research communities: Findable, Accessible, Interoperable and Reusable. In this last chapter, we conclude and identify future challenges in research infrastructure operation, user support, interoperability, and future evolution.
Time to Go Green? Nature-Based Physical Activity as a Potential Treatment for Mental Disorders
Optical closure for an aerosol column: Method, accuracy, and inferable properties applied to a biomass-burning aerosol and its radiative forcing
Comparative ultrafast spectroscopy and structural analysis of OCP1 and OCP2 from Tolypothrix.
The orange carotenoid protein (OCP) is a structurally and functionally modular photoactive protein involved in cyanobacterial photoprotection. Recently, based on bioinformatic analysis and phylogenetic relationships, new families of OCP have been described, OCP2 and OCPx. The first characterization of OCP2 showed both faster photoconversion and back-conversion, and lower fluorescence quenching of phycobilisomes, relative to the well-characterized OCP1. Moreover, OCP2 is not regulated by the fluorescence recovery protein (FRP). In this work, we present a comprehensive study combining ultrafast spectroscopy and structural analysis to compare the photoactivation mechanisms of OCP1 and OCP2 from Tolypothrix PCC 7601. We show that despite significant differences in their functional characteristics, the spectroscopic properties of OCP1 and OCP2 are comparable. This indicates that OCP functionality is not directly related to the spectroscopic properties of the bound carotenoid. In addition, structural analysis by X-ray footprinting reveals that OCP1 and OCP2 have largely the same photoactivation mechanism. However, OCP2 is less reactive to radiolytic labeling, suggesting that the protein is less flexible than OCP1. This observation could explain the fast photoconversion of OCP2.
Cerebrospinal fluid analyses for the diagnosis of subarachnoid haemorrhage and experience from a Swedish study. What method is preferable when diagnosing a subarachnoid haemorrhage?
Subarachnoid haemorrhage (SAH) has a high mortality and morbidity rate. Early SAH diagnosis allows the early treatment of a ruptured cerebral aneurysm, which improves the prognosis. Diagnostic cerebrospinal fluid (CSF) analyses may be performed after a negative computed tomography scan, but the precise analytical methods to be used have been debated. Here, we summarize the scientific evidence for different CSF methods for SAH diagnosis and describe their implementation in different countries. The principal literature search was conducted using PubMed and Scopus with the search terms "cerebrospinal fluid", "subarachnoid haemorrhage", and "diagnosis". CSF analyses for SAH include visual examination, red blood cell counts, spectrophotometry for oxyhaemoglobin or bilirubin determination, CSF cytology, and ferritin measurement. The methods vary in availability and performance. There is a consensus that spectrophotometry has the highest diagnostic performance, but both oxyhaemoglobin and bilirubin determinations are susceptible to important confounding factors. Visual inspection of CSF for xanthochromia is still frequently used for the diagnosis of SAH, but it is advised against because spectrophotometry has superior diagnostic accuracy. A positive finding of CSF bilirubin is a strong indicator of an intracranial bleeding, whereas a positive finding of CSF oxyhaemoglobin may indicate an intracranial bleeding or a traumatic tap. Where spectrophotometry is not available, the combination of CSF cytology for erythrophages or siderophages and ferritin measurement is a promising alternative.