
    Many-Task Computing and Blue Waters

    This report discusses many-task computing (MTC) generically and in the context of the proposed Blue Waters system, which is planned to be the largest NSF-funded supercomputer when it begins production use in 2012. The aim of this report is to inform the BW project about MTC, including the aspects of MTC applications that can be used to characterize the domain and the implications of these aspects for middleware and policies. Many MTC applications do not neatly fit the stereotypes of high-performance computing (HPC) or high-throughput computing (HTC) applications. Like HTC applications, MTC applications are by definition structured as graphs of discrete tasks, with explicit input and output dependencies forming the graph edges. However, MTC applications have significant features that distinguish them from typical HTC applications; in particular, different engineering constraints for hardware and software must be met in order to support them. HTC applications have traditionally run on platforms such as grids and clusters, through either workflow systems or parallel programming systems. MTC applications, in contrast, will often demand a short time to solution, may be communication intensive or data intensive, and may comprise very short tasks. Therefore, hardware and software for MTC must be engineered to support the additional communication and I/O and must minimize task dispatch overheads. The hardware of large-scale HPC systems, with its high degree of parallelism and support for intensive communication, is well suited to MTC applications. However, HPC systems often lack dynamic resource provisioning, are not ideal for task communication via the file system, and have I/O systems that are not optimized for MTC-style applications. Hence, additional software support is likely to be required to gain full benefit from the HPC hardware.
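
    The task-graph structure described above can be made concrete with a small sketch. The following Python snippet is illustrative and not taken from the report: it dispatches a hypothetical dependency graph of short tasks onto a thread pool, submitting each task as soon as its inputs have completed. The task names, the graph, and the work function are placeholders.

```python
# Illustrative sketch (not from the report): running a small many-task graph
# of short tasks on a thread pool. The task names, the dependency graph, and
# the work function are hypothetical placeholders.
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait

# Each task maps to the list of tasks it depends on (the graph edges).
dependencies = {
    "prepare": [],
    "sim_a": ["prepare"],
    "sim_b": ["prepare"],
    "reduce": ["sim_a", "sim_b"],
}

def run_task(name):
    # Stand-in for a short compute- or I/O-bound task.
    return f"{name} done"

def execute(graph, max_workers=4):
    finished, futures = set(), {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        while len(finished) < len(graph):
            # Dispatch every task whose dependencies have all completed.
            for task, deps in graph.items():
                if task not in futures and all(d in finished for d in deps):
                    futures[task] = pool.submit(run_task, task)
            # Block until at least one outstanding task finishes.
            outstanding = [f for t, f in futures.items() if t not in finished]
            done, _ = wait(outstanding, return_when=FIRST_COMPLETED)
            for task, fut in futures.items():
                if fut in done and task not in finished:
                    finished.add(task)
                    print(fut.result())

execute(dependencies)
```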

    11th German Conference on Chemoinformatics (GCC 2015): Fulda, Germany, 8-10 November 2015.


    Aerospace medicine and biology: A continuing bibliography with indexes (supplement 352)

    This bibliography lists 147 reports, articles, and other documents introduced into the NASA Scientific and Technical Information System during July 1991. Subject coverage includes: aerospace medicine and psychology, life support systems and controlled environments, safety equipment, exobiology and extraterrestrial life, and flight crew behavior and performance.

    FiCoS: A fine-grained and coarse-grained GPU-powered deterministic simulator for biochemical networks.

    Mathematical models of biochemical networks can greatly facilitate the comprehension of the mechanisms underlying cellular processes, as well as the formulation of hypotheses that can be tested by means of targeted laboratory experiments. However, two issues might hamper the achievement of fruitful outcomes. On the one hand, detailed mechanistic models can involve hundreds or thousands of molecular species and their intermediate complexes, as well as hundreds or thousands of chemical reactions, a situation that commonly arises in rule-based modeling. On the other hand, the computational analysis of a model typically requires the execution of a large number of simulations for its calibration, or to test the effect of perturbations. As a consequence, the computational capabilities of modern Central Processing Units can easily be exceeded, potentially making the modeling of biochemical networks an impractical or ineffective effort. With the aim of overcoming the limitations of current state-of-the-art simulation approaches, we present in this paper FiCoS, a novel "black-box" deterministic simulator that effectively realizes both fine-grained and coarse-grained parallelization on Graphics Processing Units. In particular, FiCoS exploits two different integration methods, namely the Dormand-Prince and Radau IIA methods, to efficiently solve both non-stiff and stiff systems of coupled Ordinary Differential Equations. We tested the performance of FiCoS against different deterministic simulators, considering models of increasing size and running analyses with increasing computational demands. FiCoS was able to dramatically speed up the computations, by up to 855×, proving to be a promising solution for the simulation and analysis of large-scale models of complex biological processes.
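
    The solver choice mentioned above can be illustrated on the CPU with SciPy, whose "RK45" method implements a Dormand-Prince pair and whose "Radau" method implements Radau IIA. The sketch below is not part of FiCoS; it integrates the Van der Pol oscillator, a standard stiffness test problem, with each method, and all parameter values are illustrative.

```python
# CPU-side illustration (not the FiCoS code itself) of the two integrator
# families named in the abstract: SciPy's "RK45" implements a Dormand-Prince
# pair, and "Radau" implements Radau IIA. The Van der Pol oscillator with a
# large stiffness parameter mu is a standard stiff test problem; all values
# here are illustrative.
from scipy.integrate import solve_ivp

def van_der_pol(t, y, mu):
    x, v = y
    return [v, mu * (1.0 - x**2) * v - x]

y0 = [2.0, 0.0]

# Stiff setting (mu = 1000): an explicit method would need a prohibitive
# number of tiny steps, so an implicit Radau IIA solver is used.
stiff = solve_ivp(van_der_pol, (0.0, 3000.0), y0, method="Radau", args=(1000.0,))

# Non-stiff setting (mu = 1): the explicit Dormand-Prince pair is the usual,
# cheaper choice.
nonstiff = solve_ivp(van_der_pol, (0.0, 30.0), y0, method="RK45", args=(1.0,))

print("Radau steps (stiff, mu=1000):", stiff.t.size)
print("RK45 steps (non-stiff, mu=1):", nonstiff.t.size)
```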

    Model-guided development of an evolutionarily stable yeast chassis.

    First-principles metabolic modelling holds potential for designing microbial chassis that are resilient against phenotype reversal due to adaptive mutations. Yet, the theory of model-based chassis design has rarely been put to rigorous experimental test. Here, we report the development of Saccharomyces cerevisiae chassis strains for dicarboxylic acid production using genome-scale metabolic modelling. The chassis strains, albeit geared for higher flux towards succinate, fumarate and malate, do not appreciably secrete these metabolites. As predicted by the model, introducing product-specific TCA cycle disruptions resulted in the secretion of the corresponding acid. Adaptive laboratory evolution further improved production of succinate and fumarate, demonstrating the evolutionary robustness of the engineered cells. In the case of malate, multi-omics analysis revealed a flux bypass at peroxisomal malate dehydrogenase that was missing from the yeast metabolic model. In all three cases, flux balance analysis integrating transcriptomics, proteomics and metabolomics data confirmed the flux re-routing predicted by the model. Taken together, our modelling and experimental results have implications for the computer-aided design of microbial cell factories.
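
    As a pocket illustration of what the flux balance analysis mentioned above computes, the sketch below maximizes flux through a hypothetical export reaction in a four-reaction toy network, subject to steady-state mass balance S v = 0 and flux bounds. It is not the genome-scale yeast model used in the study; the network, bounds, and objective are invented for illustration.

```python
# Toy flux balance analysis sketch (not the genome-scale yeast model used in
# the study): maximize flux through a hypothetical succinate export reaction
# subject to steady-state mass balance S @ v = 0 and flux bounds.
import numpy as np
from scipy.optimize import linprog

# Reactions (columns): v0 substrate uptake, v1 A -> B, v2 B -> succinate,
# v3 succinate export. Internal metabolites (rows): A, B, succinate.
S = np.array([
    [1, -1,  0,  0],   # A: produced by uptake, consumed by v1
    [0,  1, -1,  0],   # B: produced by v1, consumed by v2
    [0,  0,  1, -1],   # succinate: produced by v2, removed by export
])

bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]  # uptake capped at 10

# linprog minimizes, so negate the objective to maximize the export flux v3.
c = np.array([0, 0, 0, -1])
res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds, method="highs")

print("optimal flux distribution:", res.x)   # all fluxes hit the uptake cap of 10
print("maximum export flux:", -res.fun)
```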

    Why High-Performance Modelling and Simulation for Big Data Applications Matters

    Modelling and Simulation (M&S) offer adequate abstractions to manage the complexity of analysing big data in scientific and engineering domains. Unfortunately, big data problems are often not easily amenable to efficient and effective use of High Performance Computing (HPC) facilities and technologies. Furthermore, M&S communities typically lack the detailed expertise required to exploit the full potential of HPC solutions, while HPC specialists may not be fully aware of specific modelling and simulation requirements and applications. The COST Action IC1406 High-Performance Modelling and Simulation for Big Data Applications has created a strategic framework to foster interaction between M&S experts from various application domains on the one hand and HPC experts on the other, in order to develop effective solutions for big data applications. One of the tangible outcomes of the COST Action is a collection of case studies from various computing domains. Each case study brought together both HPC and M&S experts, bearing witness to the effective cross-pollination facilitated by the COST Action. In this introductory article we argue why joining forces between the M&S and HPC communities is both timely in the big data era and crucial for success in many application domains. Moreover, we provide an overview of the state of the art in the various research areas concerned.

    Kinetic modeling and exploratory numerical simulation of chloroplastic starch degradation

    Background: Higher plants and algae are able to fix atmospheric carbon dioxide through photosynthesis and store this fixed carbon in large quantities as starch, which can be hydrolyzed into sugars serving as feedstock for fermentation to biofuels and precursors. Rational engineering of carbon flow in plant cells requires a greater understanding of how starch breakdown fluxes respond to variations in enzyme concentrations, kinetic parameters, and metabolite concentrations. We have therefore developed and simulated a detailed kinetic ordinary differential equation model of the degradation pathways for starch synthesized in plants and green algae, which to our knowledge is the most complete such model reported to date.
    Results: Simulation with 9 internal metabolites and 8 external metabolites, the concentrations of the latter fixed at reasonable biochemical values, leads to a single reference solution showing β-amylase activity to be the rate-limiting step in carbon flow from starch degradation. Additionally, the response coefficients for stromal glucose to the glucose transporter k_cat and K_M are substantial, whereas those for cytosolic glucose are not, consistent with a kinetic bottleneck due to transport. Response coefficient norms show stromal maltopentaose and cytosolic glucosylated arabinogalactan to be the most and least globally sensitive metabolites, respectively, and β-amylase k_cat and K_M for starch to be the kinetic parameters with the largest aggregate effect on metabolite concentrations as a whole. The latter kinetic parameters, together with those for glucose transport, have the greatest effect on stromal glucose, which is a precursor for biofuel synthetic pathways. Exploration of the steady-state solution space with respect to concentrations of 6 external metabolites and 8 dynamic metabolite concentrations shows that stromal metabolism is strongly coupled to starch levels, and that transport between compartments serves to lower coupling between metabolic subsystems in different compartments.
    Conclusions: We find that in the reference steady state, starch cleavage is the most significant determinant of carbon flux, with turnover of oligosaccharides playing a secondary role. Independence of the stationary point with respect to initial dynamic variable values confirms a unique stationary point in the phase space of dynamically varying concentrations of the model network. Stromal maltooligosaccharide metabolism was highly coupled to the available starch concentration. From the most highly converged trajectories, distances between unique fixed points of phase spaces show that cytosolic maltose levels depend on the total concentrations of arabinogalactan and glucose present in the cytosol. In addition, cellular compartmentalization serves to dampen much, but not all, of the effects of one subnetwork on another, such that kinetic modeling of single compartments would likely capture most dynamics that are fast on the timescale of the transport reactions.
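
    To make the notion of a response coefficient concrete, the toy sketch below integrates a two-step Michaelis-Menten chain (rather than the paper's full starch-degradation network; all parameter values are invented) to steady state and estimates the scaled sensitivity C = (p/x*)·(dx*/dp) of steady-state glucose to the β-amylase k_cat by central differences.

```python
# Toy sketch, not the paper's full starch-degradation network: a two-step
# Michaelis-Menten chain (starch -> maltose -> glucose) is integrated to
# steady state, and the scaled response coefficient of steady-state glucose
# to the beta-amylase k_cat is estimated by central differences. All
# parameter values are invented for illustration.
from scipy.integrate import solve_ivp

def rates(t, y, kcat_amylase):
    maltose, glucose = y
    starch = 50.0                                        # external pool, held constant
    v_amylase = kcat_amylase * starch / (10.0 + starch)  # starch -> maltose
    v_glucosidase = 5.0 * maltose / (2.0 + maltose)      # maltose -> glucose
    v_export = 1.0 * glucose                             # first-order glucose drain
    return [v_amylase - v_glucosidase, v_glucosidase - v_export]

def steady_state_glucose(kcat):
    sol = solve_ivp(rates, (0.0, 500.0), [0.0, 0.0], args=(kcat,),
                    rtol=1e-8, atol=1e-10)
    return sol.y[1, -1]

# Scaled response coefficient C = (p / x*) * dx*/dp via central differences.
# For this toy chain the steady-state glucose is proportional to k_cat,
# so C should come out close to 1.
p, dp = 2.0, 0.01
x_ref = steady_state_glucose(p)
C = (p / x_ref) * (steady_state_glucose(p + dp) - steady_state_glucose(p - dp)) / (2.0 * dp)
print("steady-state glucose:", round(x_ref, 3))
print("response coefficient w.r.t. beta-amylase k_cat:", round(C, 3))
```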

    Synthetic biology tools for environmental protection

    Synthetic biology transforms the way we perceive biological systems. Emerging technologies in this field affect many disciplines of science and engineering. Traditionally, synthetic biology approaches were aimed at developing cost-effective microbial cell factories to produce chemicals from renewable sources. Accordingly, the immediate beneficial impact of synthetic biology on the environment came from reducing our oil dependency. However, synthetic biology is starting to play a more direct role in environmental protection. Toxic chemicals released by industry and agriculture endanger the environment, disrupting ecosystem balance and causing biodiversity loss. This review highlights synthetic biology approaches that can support environmental protection by providing remediation systems capable of sensing and responding to specific pollutants. Remediation strategies based on genetically engineered microbes and plants are discussed. Further, an overview of computational approaches that facilitate the design and application of synthetic biology tools in environmental protection is presented.