
    Janus II: a new generation application-driven computer for spin-system simulations

    This paper describes the architecture, development and implementation of Janus II, a new-generation application-driven number cruncher optimized for Monte Carlo simulations of spin systems (mainly spin glasses). This domain of computational physics is a recognized grand challenge of high-performance computing: the resources needed to study in detail theoretical models that can make contact with experimental data are far beyond those available on commodity computer systems. On the other hand, several specific features of the associated algorithms suggest that unconventional computer architectures, implementable with available electronics technologies, may yield order-of-magnitude performance increases, reducing to humanly acceptable values the time needed to carry out simulation campaigns that would take centuries on commercially available machines. Janus II is one such machine, recently developed and commissioned, that builds upon and improves the successful JANUS machine, which has been used for physics since 2008 and is still in operation today. This paper describes in detail the motivations behind the project, the computational requirements, and the architecture and implementation of the new machine, and compares its expected performance with that of currently available commercial systems. Comment: 28 pages, 6 figures
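The kernel such machines accelerate is the Metropolis update of a lattice spin glass. The following Python sketch of one Metropolis sweep over a 2D ±J Edwards-Anderson model is purely illustrative (Janus II itself is an FPGA-based design; lattice size, couplings, and temperature here are arbitrary choices, not the paper's):

```python
import numpy as np

def metropolis_sweep(spins, Jh, Jv, beta, rng):
    """One Metropolis sweep over a 2D +/-J Edwards-Anderson spin glass.

    spins : L x L array of +/-1
    Jh, Jv: L x L quenched couplings to the right / downward neighbour
    """
    L = spins.shape[0]
    for i in range(L):
        for j in range(L):
            # local field from the four neighbours (periodic boundaries)
            h = (Jh[i, j] * spins[i, (j + 1) % L]
                 + Jh[i, (j - 1) % L] * spins[i, (j - 1) % L]
                 + Jv[i, j] * spins[(i + 1) % L, j]
                 + Jv[(i - 1) % L, j] * spins[(i - 1) % L, j])
            dE = 2.0 * spins[i, j] * h  # energy cost of flipping this spin
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] = -spins[i, j]

rng = np.random.default_rng(0)
L = 16
spins = rng.choice([-1, 1], size=(L, L))
Jh = rng.choice([-1.0, 1.0], size=(L, L))  # quenched random bonds
Jv = rng.choice([-1.0, 1.0], size=(L, L))
for _ in range(100):
    metropolis_sweep(spins, Jh, Jv, beta=1.0, rng=rng)
```

Because each spin depends only on its four neighbours, the lattice can be split into two interleaved checkerboard sublattices and updated in parallel, which is the kind of fine-grained parallelism custom hardware exploits.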

    From Social Simulation to Integrative System Design

    As the recent financial crisis showed, there is today a strong need to gain an "ecological perspective" of all relevant interactions in socio-economic-techno-environmental systems. For this, we suggest setting up a network of Centers for integrative systems design, which would be able to run all potentially relevant scenarios, identify causality chains, explore feedback and cascading effects for a number of model variants, and determine the reliability of their implications (given the validity of the underlying models). They would be able to detect possible negative side effects of policy decisions before they occur. The Centers belonging to this network of Integrative Systems Design Centers would each focus on a particular field, but they would be part of an attempt to eventually cover all relevant areas of society and the economy and integrate them within a "Living Earth Simulator". The results of all research activities of such Centers would be turned into informative input for political Decision Arenas. For example, Crisis Observatories (for financial instabilities, shortages of resources, environmental change, conflict, spreading of diseases, etc.) would be connected with such Decision Arenas for the purpose of visualization, in order to make complex interdependencies understandable to scientists, decision-makers, and the general public. Comment: 34 pages, Visioneer White Paper, see http://www.visioneer.ethz.c

    Driven interfaces in random media at finite temperature: is there an anomalous zero-velocity phase at small external force?

    The motion of driven interfaces in random media at finite temperature $T$ and small external force $F$ is usually described by a linear displacement $h_G(t) \sim V(F,T)\, t$ at large times, where the velocity vanishes according to the creep formula $V(F,T) \sim e^{-K(T)/F^{\mu}}$ for $F \to 0$. In this paper, we question this picture on the specific example of the directed polymer in a two-dimensional random medium. We have recently shown (C. Monthus and T. Garel, arXiv:0802.2502) that its dynamics for $F=0$ can be analyzed in terms of a strong disorder renormalization procedure, where the distribution of renormalized barriers flows towards some "infinite disorder fixed point". In the present paper, we obtain that for small $F$, this "infinite disorder fixed point" becomes a "strong disorder fixed point" with an exponential distribution of renormalized barriers. The corresponding distribution of trapping times then only decays as a power law $P(\tau) \sim 1/\tau^{1+\alpha}$, where the exponent $\alpha(F,T)$ vanishes as $\alpha(F,T) \propto F^{\mu}$ for $F \to 0$. Our conclusion is that in the small-force region $\alpha(F,T) < 1$, the divergence of the averaged trapping time $\bar{\tau} = +\infty$ induces strong non-self-averaging effects that invalidate the usual creep formula obtained by replacing all trapping times by the typical value. We find instead that the motion grows only sub-linearly in time, $h_G(t) \sim t^{\alpha(F,T)}$, i.e. the asymptotic velocity vanishes, $V = 0$. This analysis is confirmed by numerical simulations of a directed polymer with a metric constraint driven in a landscape of traps. We moreover obtain that the roughness exponent, which is governed by the equilibrium value $\zeta_{eq} = 2/3$ up to some large scale, becomes equal to $\zeta = 1$ at the largest scales. Comment: v3 = final version
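The anomalous sub-linear regime can be illustrated with a toy renewal ("trap") model: draw trapping times from a power law $P(\tau) \sim 1/\tau^{1+\alpha}$ with $\alpha < 1$, let displacement count the traps escaped, and measure the effective growth exponent. This sketch is a generic heavy-tail demonstration, not the paper's directed-polymer simulation, and all numerical values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 0.5          # trapping-time exponent; alpha < 1 is the anomalous regime
n_traps = 200_000

# inverse-CDF sampling: tau = (1-u)^(-1/alpha) gives P(tau) ~ 1/tau^(1+alpha), tau >= 1
tau = (1.0 - rng.random(n_traps)) ** (-1.0 / alpha)
t = np.cumsum(tau)              # elapsed time after escaping each trap
h = np.arange(1, n_traps + 1)   # displacement = number of traps escaped

# effective growth exponent of h(t) between two widely separated times;
# for alpha < 1 it stays close to alpha instead of approaching 1 (linear motion)
i1, i2 = 1_000, 100_000
exponent = np.log(h[i2] / h[i1]) / np.log(t[i2] / t[i1])
```

Because the mean of $\tau$ diverges for $\alpha < 1$, the cumulative time is dominated by the single deepest trap encountered so far, which is exactly the non-self-averaging effect the abstract invokes against the creep formula.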

    Distributed simulation of city inundation by coupled surface and subsurface porous flow for urban flood decision support system

    We present a decision support system for flood early warning and disaster management. It includes models for data-driven meteorological predictions and for simulation of atmospheric pressure, wind, long sea waves and seiches; a module for optimizing the operation of flood barrier gates; and models for stability assessment of levees and embankments, for simulation of city inundation dynamics, and for citizen evacuation scenarios. The novelty of this paper is a coupled distributed simulation of surface and subsurface flows that can predict inundation of low-lying inland zones far from the submerged waterfront areas, as observed in the city of St. Petersburg during floods. All the models are wrapped as software services in the CLAVIRE platform for urgent computing, which provides workflow management and resource orchestration. Comment: Pre-print submitted to the 2013 International Conference on Computational Science

    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society; and the CSE community is at the core of this transformation. However, a combination of disruptive developments---including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers---is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade. Comment: Major revision, to appear in SIAM Review

    Virtual Astronomy, Information Technology, and the New Scientific Methodology

    All sciences, including astronomy, are now entering the era of information abundance. The exponentially increasing volume and complexity of modern data sets promise to transform scientific practice, but also pose a number of common technological challenges. The Virtual Observatory concept is the astronomical community's response to these challenges: it aims to harness the progress in information technology in the service of astronomy, and at the same time provide a valuable testbed for information technology and applied computer science. Challenges broadly fall into two categories: data handling (or "data farming"), including issues such as archives, intelligent storage, databases, interoperability, fast networks, etc.; and data mining, data understanding, and knowledge discovery, which include issues such as automated clustering and classification, multivariate correlation searches, pattern recognition, visualization in highly hyperdimensional parameter spaces, etc., as well as various applications of machine learning in these contexts. Such techniques are forming a methodological foundation for science with massive and complex data sets in general, and are likely to have a much broader impact on modern society, commerce, the information economy, security, etc. There is a powerful emerging synergy between computationally enabled science and science-driven computing, which will drive progress in science, scholarship, and many other venues in the 21st century.

    Evaluation of different sources of uncertainty in climate change impact research using a hydro-climatic model ensemble

    The international research project QBic3 (Quebec-Bavarian Collaboration on Climate Change) aims at investigating the potential impacts of climate change on the hydrology of regional-scale catchments in Southern Quebec (Canada) and Bavaria (Germany). Yet the actual change in river runoff characteristics over the next 70 years is highly uncertain, owing to a multitude of uncertainty sources. The so-called hydro-climatic ensemble constructed to describe the uncertainties of this complex model chain consists of four different global climate models, downscaled by three different regional climate models, an exchangeable bias-correction algorithm, a separate method to scale RCM outputs to the hydrological model scale, and several hydrological models of differing complexity to assess the impact of different hydrological model concepts. This choice of models and scenarios allows for the inter-comparison of the uncertainty ranges of climate and hydrological models, of the natural variability of the climate system, and of the impact of scaling and correction of climate data on mean, high and low flow conditions. A methodology to display the relative importance of each source of uncertainty is proposed, and results for past runoff and potential future changes are presented.
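One simple way to rank uncertainty sources in such a factorial ensemble is to compare main-effect variances: average the ensemble over all factors except one and take the variance of what remains. The sketch below uses entirely synthetic numbers (the factor counts match the abstract's 4 GCMs x 3 RCMs and an assumed 2 hydrological models, but the spreads are invented, not QBic3 data) and an additive toy model with no interactions:

```python
import numpy as np

rng = np.random.default_rng(2)
# hypothetical runoff-change signals (%) for 4 GCMs x 3 RCMs x 2 hydro models;
# the assumed spreads (5, 2, 1) just posit that GCM choice dominates
gcm_eff = rng.normal(0, 5, 4)
rcm_eff = rng.normal(0, 2, 3)
hyd_eff = rng.normal(0, 1, 2)
change = (gcm_eff[:, None, None] + rcm_eff[None, :, None]
          + hyd_eff[None, None, :])

def main_effect_variance(x, axis):
    """Variance of the ensemble mean taken along all axes except `axis`."""
    other = tuple(a for a in range(x.ndim) if a != axis)
    return x.mean(axis=other).var()

total = change.var()
shares = {name: main_effect_variance(change, a) / total
          for a, name in enumerate(["GCM", "RCM", "hydrology"])}
```

For a purely additive ensemble the three shares sum to one; with real model chains the shortfall from one measures interaction effects between the factors, which is one way such a "relative importance" display can be read.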