
    Software that goes with the flow in systems biology

    A recent article in BMC Bioinformatics describes new advances in workflow systems for computational modeling in systems biology. Such systems can accelerate, and improve the consistency of, modeling through automation not only at the simulation and results-production stages, but also at the model-generation stage. The authors' work is a harbinger of the next generation of more powerful software for systems biologists.

    BioWorkbench: A High-Performance Framework for Managing and Analyzing Bioinformatics Experiments

    Advances in sequencing techniques have led to exponential growth in biological data, demanding the development of large-scale bioinformatics experiments. Because these experiments are computation- and data-intensive, they require high-performance computing (HPC) techniques and can benefit from specialized technologies such as Scientific Workflow Management Systems (SWfMS) and databases. In this work, we present BioWorkbench, a framework for managing and analyzing bioinformatics experiments. This framework automatically collects provenance data, including both performance data from workflow execution and data from the scientific domain of the workflow application. Provenance data can be analyzed through a web application that abstracts a set of queries to the provenance database, simplifying access to provenance information. We evaluate BioWorkbench using three case studies: SwiftPhylo, a phylogenetic tree assembly workflow; SwiftGECKO, a comparative genomics workflow; and RASflow, a RASopathy analysis workflow. We analyze each workflow from both computational and scientific domain perspectives, using queries to a provenance and annotation database. Some of these queries are available as a pre-built feature of the BioWorkbench web application. Through the provenance data, we show that the framework is scalable and achieves high performance, reducing the execution time of the case studies by up to 98%. We also show how the application of machine learning techniques can enrich the analysis process.
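
    The abstract's core mechanism is collecting workflow provenance (task executions, runtimes, domain annotations) into a database and exposing pre-built queries over it through a web application. The sketch below illustrates that pattern only in miniature: the SQLite schema, table, and column names are hypothetical stand-ins, not BioWorkbench's actual provenance model.

        # Toy provenance store queried in the style the abstract describes:
        # aggregate per-workflow runtimes from task-level provenance records.
        # Schema and column names are hypothetical, not BioWorkbench's own.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute(
            """CREATE TABLE task_provenance (
                   workflow   TEXT,   -- e.g. 'SwiftPhylo'
                   task_name  TEXT,   -- activity run by the workflow engine
                   start_time REAL,   -- seconds since epoch
                   end_time   REAL
               )"""
        )
        conn.executemany(
            "INSERT INTO task_provenance VALUES (?, ?, ?, ?)",
            [
                ("SwiftPhylo", "align",      0.0,  42.5),
                ("SwiftPhylo", "build_tree", 42.5, 120.0),
                ("SwiftGECKO", "compare",    0.0,  300.0),
            ],
        )

        # Per-workflow task count and total runtime: the kind of pre-built
        # query the web application is said to expose.
        for row in conn.execute(
            """SELECT workflow,
                      COUNT(*)                   AS n_tasks,
                      SUM(end_time - start_time) AS total_runtime_s
               FROM task_provenance
               GROUP BY workflow"""
        ):
            print(row)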

    VizCom: A Novel Workflow Model for ICU Clinical Decision-Support

    The Intensive Care Unit (ICU) has the highest annual mortality (4.4 million deaths) of any hospital unit, or 25% of all clinical admissions. Studies show a relationship between clinician cognitive load and workflow, and their impact on patient safety and the subsequent occurrence of medical mishaps due to diagnostic error, in spite of advances in health information technology, e.g., bedside and clinical decision support (CDS) systems. The aims of our research are to: 1) investigate the root causes (underlying mechanisms) of ICU error related to the effects of clinical workflow (medical cognition, team communication/collaboration, and the use of diagnostic/CDS systems) and 2) construct and validate a novel workflow model that supports improved clinical workflow, with the goals of decreasing adverse events, increasing safety, and reducing intensivist time, effort, and cognitive resources. Lastly, our long-term objective is to apply data from aims one and two to the design of the next generation of diagnostic visualization-communication (VizCom) systems that improve intensive care workflow, communication, and effectiveness in healthcare.

    TensorLayer: A Versatile Library for Efficient Deep Learning Development

    Deep learning has enabled major advances in the fields of computer vision, natural language processing, and multimedia, among many others. Developing a deep learning system is arduous and complex, as it involves constructing neural network architectures, managing training and trained models, tuning the optimization process, and preprocessing and organizing data. TensorLayer is a versatile Python library that aims at helping researchers and engineers efficiently develop deep learning systems. It offers rich abstractions for neural networks, model and data management, and parallel workflow mechanisms. While boosting efficiency, TensorLayer maintains both performance and scalability. TensorLayer was released in September 2016 on GitHub and has helped people from academia and industry develop real-world applications of deep learning. Comment: ACM Multimedia 2017.
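
    As a concrete illustration of the layer, data, and model-management abstractions the abstract mentions, the sketch below follows the TensorLayer 1.x / TensorFlow 1.x style that was current when the library was released; function names and signatures may differ in other versions, so treat it as an assumed, minimal example rather than canonical usage.

        # Minimal sketch in the TensorLayer 1.x / TensorFlow 1.x style
        # (assumed API; other versions differ).
        import tensorflow as tf
        import tensorlayer as tl

        # Data-management helper: MNIST as flat 784-dimensional vectors.
        X_train, y_train, X_val, y_val, X_test, y_test = \
            tl.files.load_mnist_dataset(shape=(-1, 784))

        sess = tf.InteractiveSession()
        x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
        y_ = tf.placeholder(tf.int64, shape=[None], name='y_')

        # Neural-network abstractions: layers composed into a network object.
        net = tl.layers.InputLayer(x, name='input')
        net = tl.layers.DenseLayer(net, n_units=800, act=tf.nn.relu, name='relu1')
        net = tl.layers.DenseLayer(net, n_units=10, act=tf.identity, name='output')

        cost = tl.cost.cross_entropy(net.outputs, y_, name='xentropy')
        train_op = tf.train.AdamOptimizer(1e-4).minimize(cost, var_list=net.all_params)

        tl.layers.initialize_global_variables(sess)

        # Training utility that wraps the minibatch loop.
        tl.utils.fit(sess, net, train_op, cost, X_train, y_train, x, y_,
                     batch_size=128, n_epoch=5, X_val=X_val, y_val=y_val,
                     eval_train=False)

        # Model management: persist trained parameters to disk.
        tl.files.save_npz(net.all_params, name='model.npz', sess=sess)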

    Designing algorithms to aid discovery by chemical robots

    Recently, automated robotic systems have become very efficient, thanks to improved coupling between sensor systems and algorithms, the latter of which have gained significance thanks to the increase in computing power over the past few decades. However, intelligent automated chemistry platforms for discovery-oriented tasks need to be able to cope with the unknown, which is a profoundly hard problem. In this Outlook, we describe how recent advances in the design and application of algorithms, coupled with the increased amount of chemical data available and with automation and control systems, may allow more productive chemical research and the development of chemical robots able to target discovery. This is shown through examples of workflow and data processing with automation and control, and through the use of both well-established and cutting-edge algorithms, illustrated using recent studies in chemistry. Finally, several algorithms are presented in relation to chemical robots and chemical intelligence for knowledge discovery.
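
    To make the sense-decide-act coupling between sensors and algorithms concrete, the sketch below shows one very simple closed-loop search over reaction conditions. Everything in it is hypothetical: the yield function is simulated, and simulated_yield and hill_climb are illustrative stand-ins for real instrument control and for the more sophisticated algorithms the Outlook discusses.

        # Closed-loop sketch: an algorithm proposes conditions, the "platform"
        # returns a measurement, and better conditions are kept. Hypothetical.
        import random

        def simulated_yield(temperature_c, residence_time_min):
            """Stand-in for running a reaction and measuring its yield."""
            ideal_t, ideal_rt = 80.0, 12.0
            score = (100
                     - 0.05 * (temperature_c - ideal_t) ** 2
                     - 0.8 * (residence_time_min - ideal_rt) ** 2)
            return max(0.0, score + random.gauss(0, 1.0))  # sensor noise

        def hill_climb(n_iterations=50):
            """Greedy local search over reaction conditions."""
            best = {"temperature_c": 25.0, "residence_time_min": 5.0}
            best_yield = simulated_yield(**best)
            for _ in range(n_iterations):
                candidate = {
                    "temperature_c": best["temperature_c"] + random.uniform(-10, 10),
                    "residence_time_min": best["residence_time_min"] + random.uniform(-2, 2),
                }
                y = simulated_yield(**candidate)  # "run" the experiment
                if y > best_yield:                # keep the better conditions
                    best, best_yield = candidate, y
            return best, best_yield

        conditions, best_yield = hill_climb()
        print("best conditions:", conditions, "yield:", round(best_yield, 1))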

    Toward high-content/high-throughput imaging and analysis of embryonic morphogenesis

    In vivo study of embryonic morphogenesis benefits tremendously from recent advances in live microscopy and computational analyses. Quantitative and automated investigation of morphogenetic processes opens the field to high-content and high-throughput strategies. Following the experimental workflows currently developed in cell biology, we identify the key challenges for applying such strategies in developmental biology. We review the recent progress in embryo preparation and manipulation, live imaging, data registration, image segmentation, feature computation, and data mining dedicated to the study of embryonic morphogenesis. We discuss a selection of pioneering studies that tackled the current methodological bottlenecks and illustrated the investigation of morphogenetic processes in vivo using quantitative and automated imaging and analysis of hundreds or thousands of cells simultaneously, paving the way for high-content/high-throughput strategies and systems analysis of embryonic morphogenesis.
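
    Two of the pipeline stages listed above, image segmentation and feature computation, can be illustrated with a short, self-contained example. The sketch below uses standard scikit-image calls on a synthetic image of nucleus-like blobs; it is an assumed, minimal illustration, not the pipeline used in any of the reviewed studies.

        # Segment bright "nuclei" in a synthetic image and compute per-object
        # features of the kind downstream data mining would consume.
        import numpy as np
        from skimage.filters import gaussian, threshold_otsu
        from skimage.measure import label, regionprops

        # Build a synthetic fluorescence-like image with a few bright blobs.
        rng = np.random.default_rng(0)
        image = np.zeros((200, 200), dtype=float)
        yy, xx = np.mgrid[0:200, 0:200]
        for cy, cx in [(50, 60), (120, 140), (160, 40)]:
            image += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 8.0 ** 2))
        image += 0.05 * rng.standard_normal(image.shape)

        # Segmentation: smooth, threshold (Otsu), label connected components.
        smoothed = gaussian(image, sigma=2)
        mask = smoothed > threshold_otsu(smoothed)
        labels = label(mask)

        # Feature computation: area, centroid, and mean intensity per object.
        for region in regionprops(labels, intensity_image=image):
            print(region.label, region.area, region.centroid, region.mean_intensity)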

    Improving the performance of GIS/spatial analysts through novel applications of the Emotiv EPOC EEG headset

    Geospatial information systems are used to analyze spatial data to provide decision makers with relevant, up-to-date information. The processing time required for this information is a critical component of response time. Despite advances in algorithms and processing power, we still have many “human-in-the-loop” factors. Given the limited number of geospatial professionals, it is very important that analysts use their time effectively. Automating common tasks and enabling faster human-computer interactions that do not disrupt analysts' workflow or attention is therefore highly desirable. The following research describes a novel approach to increasing productivity within the geospatial workflow using a wireless, wearable electroencephalography (EEG) headset.