82 research outputs found

    Comparative Analyses of De Novo Transcriptome Assembly Pipelines for Diploid Wheat

    Get PDF
    Gene expression and transcriptome analysis are currently among the main focuses of research for a great number of scientists. However, assembling raw sequence data to obtain a draft transcriptome of an organism is a complex, multi-stage process usually composed of pre-processing, assembling, and post-processing. Each of these stages includes multiple steps, such as data cleaning, error correction, and assembly validation. Different combinations of steps, as well as different computational methods for the same step, generate transcriptome assemblies of different accuracy. Thus, using a combination that generates more accurate assemblies is crucial for any novel biological discovery. Implementing accurate transcriptome assembly requires substantial knowledge of the different algorithms, bioinformatics tools, and software that can be used in an analysis pipeline. Many pipelines can be represented as automated, scalable scientific workflows that can be run simultaneously on powerful distributed computational resources, such as campus clusters, grids, and clouds, and thereby speed up the analyses. In this thesis, we 1) compared and optimized de novo transcriptome assembly pipelines for diploid wheat; 2) investigated the impact of a few key parameters for generating accurate transcriptome assemblies, such as digital normalization and error correction methods, de novo assemblers, and k-mer length strategies; 3) built a distributed and scalable scientific workflow for blast2cap3, the protein-guided assembly step of the transcriptome assembly pipeline, using the Pegasus Workflow Management System (WMS); and 4) deployed and examined the scientific workflow for blast2cap3 on two different computational platforms. Based on the analysis performed in this thesis, we conclude that the best transcriptome assembly is produced when error correction is combined with the Velvet/Oases assembler and the “multi-k” strategy. Moreover, the performed experiments show that the Pegasus WMS implementation of blast2cap3 reduces the running time by more than 95% compared to the current serial implementation. The results presented in this thesis provide valuable insight for designing a good de novo transcriptome assembly pipeline and show the importance of using scientific workflows for executing computationally demanding pipelines. Advisor: Jitender S. Deogun
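    The abstract does not detail the “multi-k” strategy; as a rough illustration only, the sketch below loops a Velvet/Oases assembly over several k-mer lengths. The input file name, k-mer range, output layout, and the exact velveth/velvetg/oases flags are assumptions that depend on the installed versions and read types, not the pipeline evaluated in the thesis.

```python
import subprocess
from pathlib import Path

# Illustrative only: reads file and k-mer range are assumptions, not taken from the thesis.
READS = "reads.trimmed.fastq"          # pre-processed / error-corrected reads
K_VALUES = range(21, 64, 10)           # "multi-k": assemble at several k-mer lengths

def assemble_at_k(k: int) -> Path:
    """Run one Velvet/Oases assembly at a single k-mer length."""
    out_dir = Path(f"oases_k{k}")
    # Build the k-mer hash table, run the assembler with read tracking (needed by Oases),
    # then let Oases extract transcripts. Exact flags may differ between versions.
    subprocess.run(["velveth", str(out_dir), str(k), "-fastq", "-short", READS], check=True)
    subprocess.run(["velvetg", str(out_dir), "-read_trkg", "yes"], check=True)
    subprocess.run(["oases", str(out_dir)], check=True)
    return out_dir / "transcripts.fa"

# Per-k transcript sets; a merge/deduplication step (e.g. an Oases merge run or CD-HIT)
# would typically follow to produce the final multi-k assembly.
per_k_assemblies = [assemble_at_k(k) for k in K_VALUES]
print("assemblies:", [str(p) for p in per_k_assemblies])
```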

    Applications Development for the Computational Grid

    Get PDF

    XSEDE: The Extreme Science and Engineering Discovery Environment (OAC 15-48562) Interim Project Report 13: Report Year 5, Reporting Period 2, August 1, 2020 – October 31, 2020

    Get PDF
    This is Interim Project Report 13 (IPR13) for the NSF XSEDE project. It includes Key Performance Indicator data and project highlights for Reporting Year 5, Reporting Period 2 (August 1 – October 31, 2020). NSF OAC 15-48562

    A Comparison of a Campus Cluster and Open Science Grid Platforms for Protein-Guided Assembly using Pegasus Workflow Management System

    Get PDF
    Scientific workflows are a useful tool for managing large and complex computational tasks. Due to their intensive resource requirements, scientific workflows are often executed on distributed platforms, including campus clusters, grids, and clouds. In this paper we build a scientific workflow for blast2cap3, a protein-guided assembly step, using the Pegasus Workflow Management System (Pegasus WMS). The modularity of blast2cap3 allows us to decompose the existing serial approach into multiple tasks, some of which can run in parallel. This workflow is then deployed on two distributed execution platforms: Sandhills, the University of Nebraska Campus Cluster, and the Open Science Grid (OSG). We compare and evaluate the performance of the workflow on both platforms. Furthermore, we investigate the influence of the number of clusters of transcripts in the blast2cap3 workflow on the total running time. The performed experiments show that the Pegasus WMS implementation of blast2cap3 reduces the running time by more than 95% compared to the current serial implementation of blast2cap3. Although OSG provides more computational resources than Sandhills, our experimental workflow runs achieve better running times on Sandhills. Moreover, selecting 300 clusters of transcripts gives the optimum performance with the resources allocated from Sandhills.
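    As a rough sketch of how such a decomposition could be expressed, the snippet below builds a small Pegasus workflow in which each cluster of transcripts becomes an independent job whose outputs feed a final merge job. It uses the current Pegasus 5.x Python API, whereas the paper's implementation predates that API and likely used the older DAX generator; the transformation names (blast2cap3_cluster, merge_contigs), file names, and cluster count are illustrative assumptions, not the authors' code.

```python
# Illustrative Pegasus 5.x workflow sketch; transformation and file names are assumptions.
from Pegasus.api import Workflow, File, Job

NUM_CLUSTERS = 300                      # the cluster count the paper found optimal on Sandhills

wf = Workflow("blast2cap3")
proteins = File("related_proteins.fasta")

merged = File("assembled_contigs.fasta")
merge_job = Job("merge_contigs").add_outputs(merged)

for i in range(NUM_CLUSTERS):
    cluster = File(f"transcript_cluster_{i}.fasta")
    contigs = File(f"contigs_{i}.fasta")
    # One independent protein-guided assembly task per transcript cluster;
    # dependencies on the merge job are inferred from the shared files.
    job = (Job("blast2cap3_cluster")
           .add_args(cluster, proteins, contigs)
           .add_inputs(cluster, proteins)
           .add_outputs(contigs))
    merge_job.add_inputs(contigs)
    wf.add_jobs(job)

wf.add_jobs(merge_job)
wf.write("blast2cap3_workflow.yml")     # planned and submitted separately, e.g. via pegasus-plan
```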

    Emerging Technologies

    Get PDF
    This monograph investigates a multitude of emerging technologies, including 3D printing, 5G, blockchain, and many more, to assess their potential to further humanity’s shared goal of sustainable development. Through case studies detailing how these technologies are already being used at companies worldwide, author Sinan Küfeoğlu explores how emerging technologies can be used to enhance progress toward each of the seventeen United Nations Sustainable Development Goals and to guarantee economic growth even in the face of challenges such as climate change. To assemble this book, the author explored the business models of 650 companies in order to demonstrate how innovations can be converted into value to support sustainable development. To ensure practical application, only technologies currently on the market and in use at actual companies were investigated. This volume will be of great use to academics, policymakers, innovators at the forefront of green business, and anyone else interested in novel and innovative business models and how they could help to achieve the Sustainable Development Goals. This is an open access book.

    Combining SOA and BPM Technologies for Cross-System Process Automation

    Get PDF
    This paper summarizes the results of an industry case study that introduced a cross-system business process automation solution based on a combination of SOA and BPM standard technologies (i.e., BPMN, BPEL, WSDL). Besides discussing major weaknesses of the existing, custom-built solution and comparing them against experiences with the developed prototype, the paper presents a course of action for transforming the current solution into the proposed one. This includes a general approach consisting of four distinct steps, as well as specific action items to be performed for each step. The discussion also covers language and tool support and the challenges arising from the transformation.

    Numerical modeling of thermal bar and stratification pattern in Lake Ontario using the EFDC model

    Get PDF
    The thermal bar is an important phenomenon in large, temperate lakes like Lake Ontario. Spring thermal bar formation reduces horizontal mixing, which in turn inhibits the exchange of nutrients. The evolution of the spring thermal bar across Lake Ontario is simulated using the 3D hydrodynamic model Environmental Fluid Dynamics Code (EFDC). The model is forced with hourly meteorological data from weather stations around the lake, flow data for the Niagara and St. Lawrence rivers, and lake bathymetry. The simulation is performed from April to July 2011 on a 2-km grid. The numerical model has been calibrated by specifying appropriate initial temperature and solar radiation attenuation coefficients. The existing evaporation algorithm in EFDC is updated to a modified mass transfer approach to ensure correct simulation of the evaporation rate and latent heat flux. Reasonable values for the mixing coefficients are specified based on sensitivity analyses. The model simulates overall surface temperature profiles well (RMSEs between 1 and 2°C). The vertical temperature profiles during the lake's mixed phase are also captured well (RMSEs < 0.5°C), indicating that the model sufficiently replicates the thermal bar evolution process. An update of the vertical mixing coefficients is under investigation to improve the simulated summer thermal stratification pattern. Keywords: hydrodynamics, thermal bar, Lake Ontario, GIS
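    The abstract does not give the exact form of the modified mass transfer evaporation formulation; as a generic illustration only, the sketch below uses a standard wind-function form, E = (a + b*U)*(e_s - e_a), with a Magnus-type saturation vapour pressure and the latent heat flux obtained by scaling the evaporation rate. The wind-function coefficients are placeholders, not the values calibrated for Lake Ontario in EFDC.

```python
import math

# Generic mass-transfer (aerodynamic) evaporation sketch; the wind-function
# coefficients A and B are assumed placeholders, not the EFDC/Lake Ontario calibration.
A, B = 1.0e-9, 1.5e-9        # wind function f(U) = A + B*U  [m s^-1 kPa^-1]
RHO_W = 1000.0               # density of water [kg m^-3]
LAMBDA_V = 2.45e6            # latent heat of vaporization [J kg^-1]

def sat_vapour_pressure(temp_c: float) -> float:
    """Magnus/Tetens saturation vapour pressure [kPa] at temperature temp_c [deg C]."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def evaporation_rate(surface_temp_c: float, air_temp_c: float,
                     rel_humidity: float, wind_speed: float) -> float:
    """Evaporation rate [m s^-1] from a mass-transfer (wind-function) formulation."""
    e_s = sat_vapour_pressure(surface_temp_c)             # at the water surface
    e_a = rel_humidity * sat_vapour_pressure(air_temp_c)  # vapour pressure of the air
    return max(0.0, (A + B * wind_speed) * (e_s - e_a))

def latent_heat_flux(evap_rate: float) -> float:
    """Latent heat flux [W m^-2] corresponding to an evaporation rate [m s^-1]."""
    return RHO_W * LAMBDA_V * evap_rate

# Example: 10 deg C surface water, 8 deg C air at 70% relative humidity, 5 m/s wind.
E = evaporation_rate(10.0, 8.0, 0.70, 5.0)
print(f"E = {E:.3e} m/s, latent heat flux = {latent_heat_flux(E):.1f} W/m2")
```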

    Faculty Publications & Presentations, 2004-2005

    Get PDF