    Parameter Scan of an Effective Group Difference Pseudopotential Using Grid Computing

    Computational modeling in the health sciences is still very challenging, and much of its success has come despite the difficulties involved in integrating all of the technologies, software, and other tools necessary to answer complex questions. Very large-scale problems raise questions of spatio-temporal scale, and of whether physico-chemical complexity is matched by biological complexity. For example, for many reasons, many large-scale biomedical computations today still tend to use rather simplified physics and chemistry compared with the state of knowledge of the actual biology and biochemistry. Modern grid technologies enable new paradigms for computing, providing access to resources that facilitate spanning the biological scale.

    Parameter exploration in science and engineering using many-task computing

    Robust scientific methods require exploring the parameter space of a system, which may involve complete state-space exploration, experimental design, or numerical optimization techniques; many of the resulting runs can execute in parallel on distributed resources. Many-Task Computing (MTC) provides a framework for performing robust design because it supports the execution of a large number of otherwise independent processes. Further, scientific workflow engines facilitate the specification and execution of complex software pipelines, such as those found in real science and engineering design problems. However, most existing workflow engines support neither a wide range of experimentation techniques nor a large number of independent tasks. In this paper, we discuss Nimrod/K, a set of add-in components and a new run-time machine for the general-purpose workflow engine Kepler. Nimrod/K provides an execution architecture based on the tagged-dataflow concepts developed in the 1980s for highly parallel machines. This is embodied in a new Kepler "Director" that supports many-task computing by orchestrating the execution of tasks on clusters, Grids, and Clouds. Further, Nimrod/K provides a set of "Actors" that facilitate the various modes of parameter exploration discussed above. We demonstrate the power of Nimrod/K to solve real problems in cardiac science.
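    The tagged-dataflow idea behind Nimrod/K can be sketched as follows: each parameter tuple carries a tag, the independent tasks fire in parallel, and results are matched back to their tags regardless of completion order. The sketch below is our own minimal illustration in Python; the function names are illustrative, not Nimrod/K's actual API.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def run_model(tag, params):
    # Stand-in for one independent simulation task; the tag identifies
    # which point of the parameter space this result belongs to.
    a, b = params
    return tag, a * b  # hypothetical model output

def sweep(space):
    # Tag every parameter combination, then fire all tasks independently;
    # as in tagged dataflow, results may complete in any order.
    tagged = list(enumerate(product(*space.values())))
    tags = [t for t, _ in tagged]
    points = [p for _, p in tagged]
    with ThreadPoolExecutor() as pool:
        return dict(pool.map(run_model, tags, points))
```

Because every task is independent, the same pattern scales from a thread pool on one machine to thousands of jobs on clusters, Grids, or Clouds.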

    Executing large parameter sweep applications on a multi-VO testbed

    Applications that span multiple virtual organizations (VOs) are of great interest to the eScience community. However, recent attempts to execute large-scale parameter sweep applications (PSAs) with the Nimrod/G tool have exposed problems in the areas of fault tolerance, data storage, and trust management. In response, we have implemented a task-splitting approach that breaks large PSAs into a sequence of dependent subtasks, improving fault tolerance; a garbage-collection technique that deletes unnecessary data; and a trust-delegation technique that facilitates flexible third-party data transfers across different VOs.
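    The task-splitting approach can be sketched in miniature: break one large sweep into subtasks and checkpoint each one, so that a restart after a failure only redoes the unfinished chunks. All names below are illustrative, not the actual Nimrod/G implementation.

```python
import json
import os

def split_tasks(points, chunk_size):
    # Break one large parameter-sweep application into a sequence of
    # smaller subtasks, so a failure loses at most one chunk of work.
    return [points[i:i + chunk_size] for i in range(0, len(points), chunk_size)]

def run_with_checkpoints(points, chunk_size, workdir, model):
    results = []
    for k, chunk in enumerate(split_tasks(points, chunk_size)):
        ckpt = os.path.join(workdir, "chunk_%d.json" % k)
        if os.path.exists(ckpt):
            # Subtask already finished in an earlier run: reuse its output.
            with open(ckpt) as f:
                results.extend(json.load(f))
            continue
        out = [model(p) for p in chunk]
        with open(ckpt, "w") as f:
            json.dump(out, f)  # checkpoint this subtask
        results.extend(out)
    return results
```

A garbage-collection step would then delete checkpoint files once their downstream consumers have finished with them.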

    Parameter space exploration using scientific workflows

    In recent years there has been interest in performing parameter space exploration across “scientific workflows”; however, many existing workflow tools are not well suited to this. In this paper we augment existing systems with a small set of special “actors” that implement the parameter estimation logic. Specifically, we discuss a set of new Kepler actors that support both complete and partial sweeps based on experimental design techniques. When combined with a novel parallel execution mechanism, we are able to execute parallel sweeps and searches across workflows that run on distributed “Grid” infrastructure. We illustrate our new system with a case study in cardiac cell modelling.
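    The distinction between complete and partial sweeps can be illustrated with a small sketch (our own simplified take, not the actual Kepler actors): a full factorial enumerates every combination of parameter levels, while a partial design runs only a chosen subset of them.

```python
import random
from itertools import product

def full_factorial(space):
    # Complete sweep: every combination of every parameter level.
    keys = list(space)
    return [dict(zip(keys, combo)) for combo in product(*space.values())]

def random_fraction(space, n, seed=0):
    # One simple partial-sweep strategy: sample n design points from the
    # full factorial. Real experimental designs (e.g. fractional
    # factorials) choose points more carefully.
    rng = random.Random(seed)
    return rng.sample(full_factorial(space), n)
```

Each returned design point is a parameter dictionary that can be bound to one workflow invocation.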

    Applying grid computing to the parameter sweep of a group difference pseudopotential

    Theoretical modeling of chemical and biological processes is key to understanding nature and to predicting experiments. Unfortunately, it is very data- and computation-intensive. However, the worldwide computing grid can now provide the necessary resources. Here, we present a coupling of the GAMESS quantum chemical code to the Nimrod/G grid distribution tool, which is applied to the parameter scan of a group difference pseudopotential (GDP). This represents the initial step in the parameterization of a capping atom for hybrid quantum mechanics-molecular mechanics (QM/MM) calculations. The results provide insight into the physical basis of functional-group distinctions and starting points for later parameter optimizations. The demonstrated technology significantly extends the manageability of accurate but costly quantum chemical calculations and is valuable for many applications involving thousands of independent runs.

    Mixing grids and clouds: high-throughput science using the Nimrod tool family

    The Nimrod tool family facilitates high-throughput science by allowing researchers to explore complex design spaces using computational models. Users are able to describe large experiments in which models are executed across changing input parameters. Different members of the tool family support complete and partial parameter sweeps, numerical search by non-linear optimisation and even workflows. In order to provide timely results and to enable large-scale experiments, distributed computational resources are aggregated to form a logically single high-throughput engine. To date, we have leveraged grid middleware standards to spawn computations on remote machines. Recently, we added an interface to Amazon’s Elastic Compute Cloud (EC2), allowing users to mix conventional grid resources and clouds. A range of schedulers, from round-robin queues to those based on economic budgets, allow Nimrod to mix and match resources. This provides a powerful platform for computational researchers, because they can use a mix of university-level infrastructure and commercial clouds. In particular, the system allows a user to pay money to increase the quality of the research outcomes and to decide exactly how much they want to pay to achieve a given return. In this chapter, we will describe Nimrod and its architecture, and show how this naturally scales to incorporate clouds. We will illustrate the power of the system using a case study and will demonstrate that cloud computing has the potential to enable high-throughput science.
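    A budget-based scheduler of the kind described can be sketched as follows (a deliberately simplified toy, not Nimrod's actual scheduling logic): free grid slots are used first, and cloud instances are purchased only while the budget allows.

```python
def schedule(tasks, grid_slots, cloud_cost, budget):
    # Toy economic scheduler: fill free grid slots first (no marginal
    # cost), then buy cloud instances one task at a time until the
    # budget runs out; anything left waits in the queue for a grid slot.
    placement, spent = {}, 0.0
    for i, task in enumerate(tasks):
        if i < grid_slots:
            placement[task] = "grid"
        elif spent + cloud_cost <= budget:
            placement[task] = "cloud"
            spent += cloud_cost
        else:
            placement[task] = "queued"
    return placement, spent
```

Raising the budget buys more parallelism, which is exactly the pay-for-outcome trade-off the chapter describes.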

    Embedding optimization in computational science workflows

    Workflows support the automation of scientific processes, providing mechanisms that underpin modern computational science. They facilitate access to remote instruments, databases and parallel and distributed computers. Importantly, they allow software pipelines that perform multiple complex simulations (leveraging distributed platforms), with one simulation driving another. Such an environment is ideal for computational science experiments that require the evaluation of a range of different scenarios "in silico" in an attempt to find ones that optimize a particular outcome. However, in general, existing workflow tools do not incorporate optimization algorithms, and thus whilst users can specify simulation pipelines, they need to invoke the workflow as a stand-alone computation within an external optimization tool. Moreover, many existing workflow engines do not leverage parallel and distributed computers, making them unsuitable for executing computational science simulations. To solve this problem, we have developed a methodology for integrating optimization algorithms directly into workflows. We implement a range of generic actors for an existing workflow system called Kepler, and discuss how they can be combined in flexible ways to support various different design strategies. We illustrate the system by applying it to an existing bio-engineering design problem running on a Grid of distributed clusters.
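    The core idea, wrapping a whole simulation pipeline as an objective function that an optimization algorithm calls repeatedly, can be sketched in a few lines. The optimizer below is a deliberately naive stand-in for the generic optimization actors described in the paper.

```python
def workflow(x):
    # Stand-in for a whole simulation pipeline whose scalar outcome we
    # want to optimize; here a quadratic with its minimum at x = 3.
    return (x - 3.0) ** 2

def optimize(objective, lo, hi, iters=60):
    # Minimal derivative-free search: bisect on the sign of a finite-
    # difference slope. A real system would plug in proper optimization
    # algorithms rather than this sketch.
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if objective(mid + 1e-6) > objective(mid - 1e-6):
            hi = mid  # slope positive: minimum lies to the left
        else:
            lo = mid  # slope non-positive: minimum lies to the right
    return (lo + hi) / 2.0
```

The point is the inversion of control: the optimizer drives the workflow, rather than the workflow being invoked once from an external tool.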

    Application of grid computing to parameter sweeps and optimizations in molecular modeling

    In science and engineering in general and in computational chemistry in particular, parameter sweeps and optimizations are of high importance. Such parametric modeling jobs are embarrassingly parallel and thus well suited for grid computing. The Nimrod toolkit significantly simplifies the utilization of computational grids for this kind of research by hiding the complex grid middleware, automating job distribution, and providing easy-to-use user interfaces. Here, we present examples for the usage of Nimrod in molecular modeling. In detail, we discuss the parameterization of a group difference pseudopotential (GDP). Other applications are protein-ligand docking and a high-throughput workflow infrastructure for computational chemistry.

    Incorporating local Ca2+ dynamics into single cell ventricular models

    Understanding the physiological mechanisms underlying the activity of the heart is of great medical importance. Mathematical modeling and numerical simulation have become widely accepted methods of unraveling the underlying mechanisms of the heart. Calcium (Ca2+) dynamics regulate excitation-contraction coupling in heart muscle cells and hence are among the key players in maintaining normal activity of the heart. Many existing ventricular single cell models lack a biophysically detailed description of the Ca2+ dynamics. In this paper we examine how we can improve existing ventricular cell models by replacing their description of Ca2+ dynamics with local Ca2+ control models. When replacing the existing Ca2+ dynamics in a given cell model with a different Ca2+ description, the parameters of the Ca2+ subsystem need to be re-fitted. Moreover, the search through the plausible parameter space is computationally very intensive. Thus, the Grid-enabled Nimrod/O software tools are used for optimizing the cell parameters. Nimrod/O provides a convenient, user-friendly framework for this, as exemplified by the incorporation of local Ca2+ dynamics into the ventricular single cell Noble 1998 model.
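    The parameter re-fitting task can be sketched as a brute-force grid search, the embarrassingly parallel pattern that tools like Nimrod/O accelerate. The transient model and parameter names below are purely illustrative, not the actual Ca2+ subsystem equations.

```python
import math

def ca_transient(t, amp, tau):
    # Toy calcium transient: fast rise then exponential decay. Purely
    # illustrative; real Ca2+ subsystem models are far more detailed.
    return amp * t * math.exp(-t / tau)

def fit(target, times, grid):
    # Brute-force search over a parameter grid: pick the (amp, tau)
    # pair that minimizes the squared error against the target trace.
    # Each grid point is an independent run, so they can all be
    # distributed and evaluated in parallel.
    best, best_err = None, float("inf")
    for amp in grid["amp"]:
        for tau in grid["tau"]:
            err = sum((ca_transient(t, amp, tau) - y) ** 2
                      for t, y in zip(times, target))
            if err < best_err:
                best, best_err = (amp, tau), err
    return best
```

In practice an optimizer replaces the exhaustive grid, but the per-point structure of the computation is the same.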