
    BRAHMS: Novel middleware for integrated systems computation

    Biological computational modellers are becoming increasingly interested in building large, eclectic models, including components on many different computational substrates, both biological and non-biological. At the same time, the rise of the philosophy of embodied modelling is generating a need to deploy biological models as controllers for robots in real-world environments. Finally, robotics engineers are beginning to find value in seconding biomimetic control strategies for use on practical robots. Together with the ubiquitous desire to make good on past software development effort, these trends are throwing up new challenges of intellectual and technological integration (for example, across scales, across disciplines, and even across time) - challenges that are unmet by existing software frameworks. Here, we outline these challenges in detail, and go on to describe a newly developed software framework, BRAHMS, that meets them. BRAHMS is a tool for integrating computational process modules into a viable, computable system: its generality and flexibility facilitate integration across barriers, such as those described above, in a coherent and effective way. We go on to describe several cases where BRAHMS has been successfully deployed in practical situations. We also show excellent performance in comparison with a monolithic development approach. Additional benefits of developing in the framework include source-code self-documentation, automatic coarse-grained parallelisation, cross-language integration, data logging, and performance monitoring, and will in future include dynamic load-balancing and 'pause and continue' execution. BRAHMS is built on the nascent, and similarly general-purpose, model markup language, SystemML. This will, in future, also facilitate repeatability and accountability (same answers ten years from now), transparent automatic software distribution, and interfacing with other SystemML tools. (C) 2009 Elsevier Ltd. All rights reserved.
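    The idea of "integrating computational process modules into a viable, computable system" can be illustrated with a minimal sketch. This is not the BRAHMS API; the class and function names below are ours, invented to show the general pattern of modules on different substrates being wired together and evaluated as one system.

    ```python
    # Hypothetical sketch (not the BRAHMS API): process "modules" that
    # pull data from upstream modules and compute a result, in the
    # spirit of the framework's module-integration model.

    class Module:
        """A computational process with upstream inputs and one output."""
        def __init__(self, name, fn):
            self.name = name
            self.fn = fn
            self.inputs = []          # upstream modules feeding this one

        def connect(self, upstream):
            self.inputs.append(upstream)
            return self

        def evaluate(self):
            # Evaluate upstream modules first, then apply this module's
            # computation to their outputs.
            return self.fn(*(m.evaluate() for m in self.inputs))

    # Two toy "substrates": a firing-rate model feeding a robot controller.
    rates = Module("rates", lambda: [0.2, 0.8, 0.5])
    motor = Module("motor", lambda r: max(r)).connect(rates)

    print(motor.evaluate())   # -> 0.8
    ```

    A real framework would add typed ports, scheduling, and cross-language bindings on top of this skeleton; the sketch only shows the wiring idea.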

    MC-TESTER v. 1.23: a universal tool for comparisons of Monte Carlo predictions for particle decays in high energy physics

    Theoretical predictions in high energy physics are routinely provided in the form of Monte Carlo generators. Comparisons of predictions from different programs and/or different initialization set-ups are often necessary. MC-TESTER can be used for such tests of decays of intermediate states (particles or resonances) in a semi-automated way. Since 2002, new functionalities have been introduced into the package. In particular, it now works with the HepMC event record, the standard for C++ programs. The complete set-up for benchmarking interfaces, such as the interface between tau-lepton production and decay, including QED bremsstrahlung effects, is shown. The example is chosen to illustrate the new options introduced into the program. From the technical perspective, our paper documents software updates and supplements previous documentation. As in the past, our test consists of two steps. Distinct Monte Carlo programs are run separately; events with decays of a chosen particle are searched for, and information is stored by MC-TESTER. Then, at the analysis step, information from a pair of runs may be compared and represented in the form of tables and plots. Updates introduced in the program up to version 1.24.3 are also documented. In particular, new configuration scripts, and a script to combine results from a multitude of runs into a single information file for use in the analysis step, are explained. Comment: 27 pages, 4 figures
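    The two-step workflow described above (separate generation runs that store information, followed by a pairwise analysis step) can be sketched generically. This is not MC-TESTER's real interface; the function names and the simple shape-difference measure below are our own, illustrative stand-ins.

    ```python
    # Hypothetical two-step workflow in the spirit of MC-TESTER (not
    # its real API): step one histograms a decay observable per run;
    # step two compares a pair of runs bin by bin.

    from collections import Counter

    def generation_step(events, bins=5, lo=0.0, hi=1.0):
        """Histogram one observable (e.g. an invariant mass) from a run."""
        width = (hi - lo) / bins
        h = Counter()
        for x in events:
            b = min(int((x - lo) / width), bins - 1)
            h[b] += 1
        return h

    def analysis_step(h1, h2, bins=5):
        """Toy shape-difference parameter: summed absolute difference
        of the two normalised histograms (0 means identical shapes)."""
        n1, n2 = sum(h1.values()), sum(h2.values())
        return sum(abs(h1[b] / n1 - h2[b] / n2) for b in range(bins))

    run_a = generation_step([0.1, 0.1, 0.5, 0.9])   # run of program A
    run_b = generation_step([0.1, 0.5, 0.5, 0.9])   # run of program B
    print(analysis_step(run_a, run_b))              # -> 0.5
    ```

    The point of the split is that the expensive generation runs happen once, independently, while comparisons of stored results are cheap and can be repeated for any pair of runs.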

    Textbooks for the web programming course

    .HTML.And.Web.Design.Tips.And.Techniques.Jan.2002.ISBN.0072228253.pd

    MadDM v.1.0: Computation of Dark Matter Relic Abundance Using MadGraph5

    We present MadDM v.1.0, a numerical tool to compute the dark matter relic abundance in a generic model. The code is based on the existing MadGraph 5 architecture and as such is easily integrable into any MadGraph collider study. A simple Python interface offers a level of user-friendliness characteristic of MadGraph 5 without sacrificing functionality. MadDM is able to calculate the dark matter relic abundance in models which include a multi-component dark sector, resonance annihilation channels and co-annihilations. We validate the code in a wide range of dark matter models by comparing the relic density results from MadDM to the existing tools and literature. Comment: 35 pages, 6 figures
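    For context, the quantity such a tool computes is governed by the standard freeze-out picture: the number density \(n_\chi\) of the dark matter candidate evolves according to the Boltzmann equation

    ```latex
    \frac{\mathrm{d}n_\chi}{\mathrm{d}t} + 3 H n_\chi
      = -\langle \sigma v \rangle \left( n_\chi^2 - n_{\chi,\mathrm{eq}}^2 \right),
    ```

    where \(H\) is the Hubble rate, \(\langle \sigma v \rangle\) the thermally averaged annihilation cross section, and \(n_{\chi,\mathrm{eq}}\) the equilibrium density. Integrating this equation yields the relic abundance, which scales roughly as \(\Omega_\chi h^2 \propto 1/\langle \sigma v \rangle\); the model dependence that MadDM handles (multi-component sectors, resonances, co-annihilations) enters through \(\langle \sigma v \rangle\).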

    Reverse engineering to achieve maintainable WWW sites

    The growth of the World Wide Web and the accelerated development of web sites and associated web technologies have resulted in a variety of maintenance problems. The maintenance problems associated with web sites and the WWW are examined. It is argued that web sites and the WWW currently lack both the data abstractions and the structures that could facilitate maintenance. A system to analyse existing web sites and extract duplicated content and style is described here. In designing the system, existing Reverse Engineering techniques have been applied, and a case for further application of these techniques is made in order to prepare sites for their inevitable future evolution.
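    A minimal version of "extract duplicated content" can be sketched as clone detection over hashed text blocks. The names below are ours, not the paper's system; real tools would normalise markup and detect near-duplicates, while this sketch only catches exact repeats.

    ```python
    # Illustrative sketch (not the paper's system): find content
    # duplicated across pages by hashing normalised text blocks,
    # the simplest form of the clone detection used to factor out
    # shared content and style.

    import hashlib
    from collections import defaultdict

    def blocks(page_text):
        """Split a page into candidate blocks (here: paragraphs)."""
        return [b.strip() for b in page_text.split("\n\n") if b.strip()]

    def find_duplicates(site):
        """Map each repeated block's hash to the set of pages containing it."""
        seen = defaultdict(set)
        for url, text in site.items():
            for b in blocks(text):
                digest = hashlib.sha1(b.encode()).hexdigest()
                seen[digest].add(url)
        return {d: pages for d, pages in seen.items() if len(pages) > 1}

    site = {
        "index.html": "Welcome!\n\nCopyright 2002 Example Corp.",
        "about.html": "About us.\n\nCopyright 2002 Example Corp.",
    }
    dups = find_duplicates(site)
    print(len(dups))   # -> 1: the shared copyright block
    ```

    Blocks that appear on many pages are candidates for being factored out into a shared template or stylesheet, which is exactly the abstraction the abstract argues current sites lack.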

    RRS James Clark Ross Cruises JR265 and JR254D, 27 Nov-24 Dec 2011. Part 1: The Drake Passage hydrographic repeat section SR1b

    This report describes the 17th complete occupation of the Drake Passage CTD section, established during the World Ocean Circulation Experiment as repeat section SR1b. It was first occupied by the National Oceanography Centre (previously IOSDL and then SOC) in collaboration with the British Antarctic Survey in 1993, and has been re-occupied most years since then. Thirty-two full-depth stations were performed during JR265: two test stations, and all 30 of the nominal stations for the SR1b Drake Passage section. An initial result is that the estimated total transport measured across the section was 133 Sv, which compares well with the average transport of 135 Sv (standard deviation 7 Sv) measured on the 16 previous UK cruises. In conjunction with the hydrographic cruise, a "Waves Aerosol and Gas Exchange Study" (WAGES) intensive observation cruise, JR254D, was also carried out. WAGES involves continuous measurement of the air-sea turbulent fluxes of CO2, sea spray aerosol, momentum, and sensible and latent heat, plus directional sea-state and whitecap parameters, using systems installed on the ship in May 2010. In addition to the continuous measurements, a number of intensive observation periods (IOPs) have been carried out by WAGES staff on board the ship. These involve deployments of a spar buoy to measure wave breaking and an aerial camera system to measure whitecap fraction. The activities of JR254D are summarised here, but are described in detail in a separate cruise report. Cruise JR264 was carried out by NOC-L staff at the same time as JR265 and JR254D; JR264 is also the subject of a separate cruise report. The CTD was an underwater SBE 9 plus unit equipped with the following sensors: dual temperature and conductivity sensors, a pressure sensor encased in the SBE underwater unit, an SBE-43 oxygen probe, an Aquatracka MKIII fluorometer, a transmissometer, an upward-looking downwelling PAR sensor, and an altimeter. A downward-looking LADCP (RDI Workhorse Monitor 300 kHz) was deployed on all stations. Various underway measurements were obtained, including navigation, VM-ADCP, sea surface temperature and salinity, water depth, and various meteorological parameters. A practical aim during this cruise was to update the detailed guides for each of the hydrographic data streams, which were first written during JR195 in 2009. The hydrographic data analysis was performed using "MSTAR", a suite of Matlab programs developed at NOCS by Brian King and used on the JCR for the first time during JR195.
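    The claim that 133 Sv "compares well" with the historical record can be made concrete with a one-line calculation. The transport figures are from the report; the z-score framing is ours.

    ```python
    # Compare the JR265 transport estimate (133 Sv) against the mean
    # and standard deviation of the 16 earlier UK occupations of SR1b
    # (135 Sv, s.d. 7 Sv), as quoted in the report.

    mean_prev, sd_prev = 135.0, 7.0   # Sv, from 16 previous cruises
    jr265 = 133.0                     # Sv, this occupation

    z = (jr265 - mean_prev) / sd_prev
    print(f"{z:+.2f} standard deviations from the historical mean")
    ```

    The new estimate sits well under one standard deviation from the long-term mean, which is what "compares well" amounts to quantitatively.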