Using a desktop grid to support simulation modelling
Simulation is characterized by the need to run multiple sets of computationally intensive experiments. We argue that Grid computing can reduce the overall execution time of such experiments by tapping into the typically underutilized network of departmental desktop PCs, collectively known as desktop grids. Commercial-off-the-shelf simulation packages (CSPs) are used in industry to simulate models. To investigate whether Grid computing can benefit simulation, this paper introduces our desktop grid, WinGrid, and discusses how this can be used to support the processing needs of CSPs. Results indicate a linear speedup, and that Grid computing does indeed hold promise for simulation.
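The master/worker pattern behind a desktop grid such as WinGrid is easy to sketch. The following Python fragment is a minimal illustration only, assuming a hypothetical run_experiment stand-in for a CSP run; it is not WinGrid's actual interface, but it shows how independent experiments are farmed out to workers and their results gathered:

    from concurrent.futures import ProcessPoolExecutor

    def run_experiment(params):
        # Placeholder for one CSP run; a real worker would launch the
        # simulation package with this parameter set and return its outputs.
        return {"replication": params["replication"], "result": 0.0}

    if __name__ == "__main__":
        # Independent experiments (here, replications with different seeds).
        experiments = [{"replication": i, "seed": 1000 + i} for i in range(32)]
        # The process pool stands in for the grid's idle desktop PCs.
        with ProcessPoolExecutor() as pool:
            results = list(pool.map(run_experiment, experiments))

Because the experiments share no state, scaling is bounded only by the number of workers and the per-run cost, which is why near-linear speedup is plausible.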
The use of sequencing information in software specification for verification
Software requirements specifications, virtual machine definitions, and algorithmic design all place constraints on the sequence of operations that are permissible during a program's execution. This paper discusses how these constraints can be captured and used to aid in the program verification process. The sequencing constraints can be expressed as a grammar over the alphabet of program operations. Several techniques can be used in support of testing or verification based on these specifications. Dynamic analysis and static analysis are considered here. The automatic generation of some of these aids is feasible; the means of doing so is described.
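To make the idea concrete, consider a hedged example (our own, not the paper's notation): the constraint "a file must be opened, read zero or more times, then closed" is a grammar over the alphabet {open, read, close}, and a dynamic analysis can check a recorded execution trace against it:

    import re

    # Sequencing constraint over the operation alphabet {o, r, c}
    # (open, read, close): a legal execution must match  o r* c .
    constraint = re.compile(r"^or*c$")

    def encode(trace):
        # Map recorded operation names onto the grammar's alphabet.
        symbols = {"open": "o", "read": "r", "close": "c"}
        return "".join(symbols[op] for op in trace)

    trace = ["open", "read", "read", "close"]     # dynamically recorded trace
    print(bool(constraint.match(encode(trace))))  # True: trace obeys the grammar

A static analysis would instead check that every path through the program's control-flow graph spells a string in the same language.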
Integrating BOINC with Microsoft Excel: A case study
The convergence of conventional Grid computing with public resource computing (PRC) offers potential benefits in the enterprise setting. For this work we took the popular PRC toolkit BOINC and used it to execute a previously monolithic Microsoft Excel financial model across several commodity computers. Our experience indicates that speedup approaching linear may be realised for certain scenarios, and that this approach offers a viable route to leveraging idle desktop PCs in the enterprise.
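The decomposition step is independent of BOINC itself: the monolithic model's input scenarios become independent work units. A minimal sketch under that assumption (evaluate_model is a hypothetical stand-in for a call into the Excel model, not part of BOINC's API):

    def evaluate_model(scenario):
        # Hypothetical stand-in for re-running the financial model
        # on one input scenario.
        return scenario["rate"] * 1.05

    # Partition the scenario space into work units of ten scenarios each;
    # each unit can be shipped to a different idle PC and the results
    # merged by the master, mirroring the study's setup.
    scenarios = [{"rate": r / 100} for r in range(1, 101)]
    work_units = [scenarios[i:i + 10] for i in range(0, len(scenarios), 10)]
    results = [[evaluate_model(s) for s in unit] for unit in work_units]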
Mapping Cluster Mass Distributions via Gravitational Lensing of Background Galaxies
We present a new method for measuring the projected mass distributions of galaxy clusters. The gravitational amplification is measured by comparing the joint distribution in redshift and magnitude of galaxies behind the cluster with that of field galaxies. We show that the total amplification is directly related to the surface mass density in the weak field limit, and so it is possible to map the mass distribution of the cluster. The method is shown to be limited by discreteness noise and galaxy clustering behind the lens. Galaxy clustering sets a lower limit to the error along the redshift direction, but a clustering-independent lensing signature may be obtained from the magnitude distribution at fixed redshift. Statistical techniques are developed for estimating the surface mass density of the cluster. We extend these methods to account for any obscuration by cluster halo dust, which may be mapped independently of the dark matter. We apply the method to a series of numerical simulations and show the feasibility of the approach. We consider approximate redshift information, and show how the mass estimates are degraded.
Comment: ApJ in press. 23 pages of LaTeX plus figures. Text and figures available by anonymous ftp from resun03.roe.ac.uk in directory /pub/jap/lens (you need btp.tex and apj.sty).
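The weak-field relation invoked above has the standard weak-lensing form (a textbook reconstruction for orientation, not a quotation of the paper's equations):

    \mu = \frac{1}{(1-\kappa)^2 - |\gamma|^2} \approx 1 + 2\kappa,
    \qquad
    \kappa = \frac{\Sigma}{\Sigma_{\mathrm{crit}}},
    \qquad
    \Sigma_{\mathrm{crit}} = \frac{c^2}{4\pi G}\,\frac{D_s}{D_d\,D_{ds}},

so a measurement of the amplification \mu from the joint redshift-magnitude distribution gives the surface mass density directly as \Sigma \approx \Sigma_{\mathrm{crit}}\,(\mu - 1)/2.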
Methodology for profiling literature in healthcare simulation
The publications that relate to the application of simulation to healthcare have steadily increased over the years. These publications are scattered amongst various journals that belong to several subject categories, including Operational Research, Health Economics and Pharmacokinetics. The simulation techniques that are applied to the study of healthcare problems are also varied. The aim of this study is to present a methodology for profiling literature in healthcare simulation. In our methodology, we have considered papers on healthcare published between 1970 and 2007 in journals with impact factors belonging to various subject categories, reporting on the application of four simulation techniques, namely Monte Carlo Simulation, Discrete-Event Simulation, System Dynamics and Agent-Based Simulation. The methodology has the following objectives: (a) to categorise the papers under the different simulation techniques and identify the healthcare problems that each technique is employed to investigate; (b) to profile, within our dataset, variables such as authors, article citations, etc.; (c) to identify turning point (strategically important) papers and authors through co-citation analysis of references cited by the papers in our dataset. The focus of the paper is on the literature profiling methodology, and not the results that have been derived through the application of this methodology. The authors hope that the methodology presented here will be used to conduct similar work in not only healthcare but also other research domains.
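The co-citation step in objective (c) has a simple computable core: two references are co-cited whenever they appear together in the same paper's reference list, and the most frequently co-cited pairs point towards turning point papers. A minimal sketch with hypothetical data (no bibliometric library assumed):

    from collections import Counter
    from itertools import combinations

    # Each paper in the dataset contributes the set of references it cites.
    reference_lists = [
        {"A", "B", "C"},
        {"A", "B"},
        {"B", "C", "D"},
    ]

    # Count how often each unordered pair of references is cited together.
    cocitations = Counter()
    for refs in reference_lists:
        for pair in combinations(sorted(refs), 2):
            cocitations[pair] += 1

    # Highly co-cited pairs suggest strategically important papers.
    print(cocitations.most_common(3))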
Reviewing past environments in a historic house using building simulation
This paper reviews different heating regimes applied to the same space, using building simulation. The construction of a computer simulation model to investigate past and present environments in a historic house library is described. The model simulated four hypothetical scenarios, based on real data. The simulation outputs were reviewed in terms of the risk of physical and chemical deterioration, and their relationship with an existing national standard for archives. The possibility of simulating past environments to investigate natural ageing is also discussed.
Selection of neutralizing antibody escape mutants with type A influenza virus HA-specific polyclonal antisera: possible significance for antigenic drift
Ten antisera were produced in rabbits by two or three intravenous injections of inactivated whole influenza type A virions. All contained haemagglutination-inhibition (HI) antibody directed predominantly to an epitope in antigenic site B and, in addition, various amounts of antibodies to an epitope in site A and in site D. The ability of untreated antisera to select neutralization escape mutants was investigated by incubating virus possessing the homologous haemagglutinin with antiserum adjusted to contain anti-B epitope HI titres of 100, 1000 and 10000 HIU/ml. Virus-antiserum mixtures were inoculated into embryonated hen's eggs, and progeny virus examined without further selection. Forty percent of the antisera at a titre of 1000 HIU/ml selected neutralizing antibody escape mutants, as defined by their lack of reactivity to Mab HC10 (site B) and unchanged reactivity to other Mabs to site A and site D epitopes. All escape mutant-selecting antisera had a ratio of anti-site B (HC10)-epitope antibody to other antibodies of ≥2·0:1. The antiserum with the highest ratio (7·4:1) selected escape mutants in all eggs tested in four different experiments. No antiserum used at a titre of 10000 HIU/ml allowed multiplication of any virus. All antisera used at a titre of 100 HIU/ml permitted virus growth, but this was wild-type (wt) virus. We conclude that a predominant epitope-specific antibody response, a titre of ≥1000 HIU/ml, and a low absolute titre of other antibodies (≤500 HIU/ml) are three requirements for the selection of escape mutants. None of the antisera in this study could have selected escape mutants without an appropriate dilution factor, so the occurrence of an escape mutant-selecting antiserum in nature is likely to be a rare event.
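The conclusion's three requirements can be restated as a single predicate (an illustrative paraphrase of the abstract's thresholds; the numeric arguments below, apart from the 7·4:1 ratio, are invented for the example):

    def can_select_escape_mutants(epitope_ratio, anti_b_titre, other_titre):
        # The abstract's three conditions: a predominant epitope-specific
        # response (>= 2:1), an anti-B epitope titre of >= 1000 HIU/ml,
        # and a low absolute titre of other antibodies (<= 500 HIU/ml).
        return (epitope_ratio >= 2.0
                and anti_b_titre >= 1000
                and other_titre <= 500)

    print(can_select_escape_mutants(7.4, 1000, 135))  # True
    print(can_select_escape_mutants(1.5, 1000, 700))  # False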