
    Parallel flow boiler designs to minimise erosion and corrosion from dust loaded flue gases

    Improving power plant performance and availability while reducing operational costs is crucial to remaining competitive in today's energy market. The boiler is a key component in achieving these objectives, particularly when using challenging fuels such as municipal solid waste or exhaust gases with high dust content. This paper describes an innovative boiler design that has been used for the first time in an Energy from Waste plant in Bamberg, Germany. The new boiler design departs from the traditional heating surface arrangement and instead uses tube bundles arranged parallel to the gas flow, which provides several advantages, such as reduced fouling. The paper describes the Bamberg project (boiler design and project highlights) and the first operational results after 30,500 h of operation. Additionally, the paper investigates further options to reduce fouling through the use of dimpled tubes, in particular the ip tube® technology. The technology is presented together with the first test results of such tubes in the Energy from Waste plant Rosenheim, Germany. The paper concludes with further applications for the parallel flow boiler design, such as cement kilns, to outline future markets. Copyright © 2013 by ASME

    Standardization and quality management in next-generation sequencing

    DNA sequencing continues to evolve quickly even after more than 30 years. Many new platforms have suddenly appeared, and formerly established systems have vanished in almost the same manner. Since the establishment of next-generation sequencing devices, this progress has gained momentum due to the continually growing demand for higher throughput, lower costs and better data quality. As a consequence of this rapid development, standardized procedures and data formats as well as comprehensive quality management considerations are still scarce. Here, we list and summarize current standardization efforts and quality management initiatives from companies, organizations and societies in the form of published studies and ongoing projects. These comprise, on the one hand, quality documentation issues such as technical notes, accreditation checklists and guidelines for the validation of sequencing workflows. On the other hand, general standard proposals and quality metrics are being developed and applied to the sequencing workflow steps, with the main focus on upstream processes. Finally, certain standard developments for downstream pipeline data handling, processing and storage are discussed in brief. These standardization approaches represent a first basis for continuing work aimed at implementing next-generation sequencing in important areas such as clinical diagnostics, where reliable results and fast processing are crucial. Additionally, these efforts will exert a decisive influence on the traceability and reproducibility of sequence data.
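
    As an illustrative sketch only (not drawn from the study itself), the Python snippet below computes one simple upstream quality metric of the kind such initiatives aim to standardize: the mean Phred score per read in a FASTQ file. The file name, the Phred+33 quality encoding and the Q30 cutoff are assumptions made for this example.

        # Minimal sketch: mean Phred quality per read from a four-line-per-record
        # FASTQ file. Assumes Phred+33 (Sanger/Illumina 1.8+) quality encoding;
        # the file name and Q30 cutoff below are illustrative, not from the paper.

        def mean_phred(quality_string: str) -> float:
            """Average Phred score of one read under Phred+33 ASCII encoding."""
            return sum(ord(ch) - 33 for ch in quality_string) / len(quality_string)

        def reads_passing(fastq_path: str, min_mean_q: float = 30.0):
            """Yield (read_id, mean_quality) for reads meeting the quality cutoff."""
            with open(fastq_path) as handle:
                while True:
                    header = handle.readline().rstrip()
                    if not header:            # end of file
                        break
                    handle.readline()         # sequence line (not needed here)
                    handle.readline()         # '+' separator line
                    qual = handle.readline().rstrip()
                    q = mean_phred(qual)
                    if q >= min_mean_q:
                        yield header[1:], q   # strip the leading '@'

        if __name__ == "__main__":
            for read_id, q in reads_passing("sample.fastq"):
                print(f"{read_id}\t{q:.1f}")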

    Ten simple rules on how to write a standard operating procedure

    Research publications and data should nowadays be publicly available on the internet and, in principle, usable by everyone to develop further research, products, or services. The long-term accessibility of research data is therefore fundamental to the economics of the research production process. However, the availability of data is not sufficient by itself; their quality must also be verifiable. Measures to ensure reuse and reproducibility need to cover the entire research life cycle, from experimental design to the generation of data, quality control, statistical analysis, interpretation, and validation of the results. Hence, high-quality records, particularly those providing a chain of documentation for the verifiable origin of data, are essential elements that can act as a certificate for potential users (customers). These records also improve the traceability and transparency of data and processes, thereby improving the reliability of results. Standards for data acquisition, analysis, and documentation have been fostered over the last decade, driven by grassroots initiatives of researchers and organizations such as the Research Data Alliance (RDA). Nevertheless, what is still largely missing in academic life science research are agreed procedures for complex routine research workflows. Here, well-crafted documentation such as standard operating procedures (SOPs) offers clear direction and instructions specifically designed to avoid deviations, an absolute necessity for reproducibility. Therefore, this paper provides a standardized workflow that explains step by step how to write an SOP, to be used as a starting point for appropriate research documentation.
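
    As an illustrative sketch only (not drawn from the paper), the Python snippet below generates a plain-text SOP skeleton to start from; the section headings are commonly used SOP elements assumed for this example and should be replaced by the structure the paper actually recommends.

        # Minimal sketch: emit an empty, numbered SOP template as plain text.
        # The section list is an assumption for illustration, not the paper's.

        SOP_SECTIONS = [
            "Title and SOP identifier",
            "Purpose",
            "Scope",
            "Responsibilities",
            "Materials and equipment",
            "Step-by-step procedure",
            "Quality control and acceptance criteria",
            "References",
            "Revision history",
        ]

        def sop_skeleton(title: str, sop_id: str, version: str = "0.1") -> str:
            """Return a plain-text SOP template with empty, numbered sections."""
            lines = [f"{title} ({sop_id}, version {version})", ""]
            for number, section in enumerate(SOP_SECTIONS, start=1):
                lines.append(f"{number}. {section}")
                lines.append("   <to be completed>")
                lines.append("")
            return "\n".join(lines)

        if __name__ == "__main__":
            print(sop_skeleton("DNA extraction from whole blood", "SOP-001"))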

    The ethics of genomic medicine:Redefining values and norms in the UK and France

    This paper presents a joint position of the UK-France Genomics and Ethics Network (UK-FR GENE), which has been set up to reflect on the ethical and social issues arising from the integration of genomics into routine clinical care in the UK and France. In 2018, the two countries announced enhanced cooperation between their national strategies, Genomics England and Plan France Médecine Génomique 2025, which offers a unique opportunity to study the impact of genomic medicine and relevant policies in different national contexts. The paper provides first insights into the two national strategies and the norms, values and principles at stake in each country. It discusses the impact of genomic medicine on established relationships and existing regulations, and examines its effects on solidarity and trust in public healthcare systems. Finally, it uses the social contract as an analytical lens to explore and redefine the balance between individual rights and collective duties in the context of genomic medicine. This paper leads to three key observations: (1) despite each country's strategy being at a different stage of implementation, the two countries face similar ethical issues; (2) each country tries to solve these issues by (re-)defining individual rights and collective duties in its own way; (3) the social contract presents a useful tool for analysing the ways the UK and France address the ethical challenges raised by genomics. This overview lays the groundwork for future in-depth comparison, and for collaborative research, between the UK and France.