
    Evolution of shocks and turbulence in major cluster mergers

    We performed a set of cosmological simulations of major mergers in galaxy clusters to study the evolution of merger shocks and the subsequent injection of turbulence in the post-shock region and in the intra-cluster medium (ICM). The computations were done with the grid-based, adaptive mesh refinement hydro code Enzo, using a specially designed refinement criterion for refining turbulent flows in the vicinity of shocks. A substantial amount of turbulence energy is injected into the ICM as a result of the major merger. Our simulations show that the shock launched after a major merger develops an ellipsoidal shape and is broken up by the interaction with the filamentary cosmic web around the merging cluster. The size of the post-shock region along the direction of shock propagation is about 300 kpc h^-1, and the turbulent velocity dispersion in this region is larger than 100 km s^-1. Scaling analysis of the turbulence energy with the cluster mass within our cluster sample is consistent with M^(5/3), i.e. the scaling law for the thermal energy in the self-similar cluster model. This clearly indicates the close relation between virialization and injection of turbulence in the cluster evolution. We found that the ratio of the turbulent to total pressure in the cluster core within 2 Gyr after the major merger is larger than 10%, and it takes about 4 Gyr for the cluster to relax, which is substantially longer than typically assumed in the turbulent re-acceleration models invoked to explain the statistics of observed radio halos. Striking similarities in the morphology and other physical parameters between our simulations and the "symmetrical radio relics" found at the periphery of the merging cluster A3376 are finally discussed.
In particular, the interaction between the merger shock and the filaments surrounding the cluster could explain the presence of "notch-like" features at the edges of the double relics.
Comment: 16 pages, 19 figures, published in the Astrophysical Journal (online); the printed version will be published on 1st January, 201
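The M^(5/3) scaling quoted in the abstract can be illustrated with a minimal sketch. The normalisation constants `M_REF` and `E_REF` below are hypothetical placeholder values, not figures from the paper; only the exponent 5/3 comes from the abstract.

```python
# Illustrative sketch (not from the paper): the self-similar scaling
# E_turb ∝ M^(5/3), normalised to a hypothetical reference cluster.

M_REF = 1.0e15   # reference cluster mass [M_sun], assumed
E_REF = 1.0e63   # turbulent energy at M_REF [erg], assumed

def turbulent_energy(mass_msun: float) -> float:
    """Scale turbulent energy with cluster mass as M^(5/3)."""
    return E_REF * (mass_msun / M_REF) ** (5.0 / 3.0)

# A cluster 8x less massive holds 8^(5/3) = 32x less turbulent energy,
# a steeper drop than the linear scaling of mass itself.
ratio = turbulent_energy(M_REF) / turbulent_energy(M_REF / 8.0)
print(ratio)  # ~32.0
```

The steepness of the exponent is the point: turbulence energy grows faster than mass, consistent with the thermal-energy scaling of the self-similar cluster model.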

    Tagging time in Prolog: the temporality effect project

    This article combines a brief introduction to a particular philosophical theory of "time" with a demonstration of how this theory has been implemented in a Literary Studies oriented Humanities Computing project. The aim of the project was to create a model of text-based time cognition and design customized markup and text analysis tools that help to understand "how time works": more precisely, how narratively organized and communicated information motivates readers to generate the mental image of a chronologically organized world. The approach presented is based on the unitary model of time originally proposed by McTaggart, who distinguished between two perspectives on time, the so-called A- and B-series. The first step towards a functional Humanities Computing implementation of this theoretical approach was the development of TempusMarker—a software tool providing automatic and semi-automatic markup routines for the tagging of temporal expressions in natural language texts. In the second step we discuss the principles underlying TempusParser—an analytical tool that can reconstruct the temporal order of events by way of an algorithm-driven process of analysis and recombination of textual segments, during which the "time stamp" of each segment, as indicated by the temporal tags, is interpreted.
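The reordering step described for TempusParser can be sketched in miniature. The tag format and field names below are hypothetical, far simpler than the project's actual markup; the sketch only shows the core idea of sorting narrated segments by an interpreted time stamp.

```python
# Toy illustration (hypothetical tag format, not TempusParser's actual markup):
# reordering narrated segments into chronological order by a numeric
# "time stamp" attribute attached to each segment.

segments = [
    {"stamp": 3, "text": "He finally posted the letter."},
    {"stamp": 1, "text": "Years earlier, he had written it."},
    {"stamp": 2, "text": "He kept it in a drawer for a decade."},
]

# Narrative order above is 3, 1, 2; chronological order sorts on the stamp.
chronological = sorted(segments, key=lambda s: s["stamp"])
print([s["stamp"] for s in chronological])  # [1, 2, 3]
```

In McTaggart's terms, the narrative sequence corresponds to the order of telling, while the sorted result approximates the B-series ordering of events as earlier-than/later-than.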

    Labelled transition systems as a Stone space

    A fully abstract and universal domain model for modal transition systems and refinement is shown to be a maximal-points space model for the bisimulation quotient of labelled transition systems over a finite set of events. In this domain model we prove that this quotient is a Stone space whose compact, zero-dimensional, and ultra-metrizable Hausdorff topology measures the degree of bisimilarity such that image-finite labelled transition systems are dense. Using this compactness we show that the set of labelled transition systems that refine a modal transition system, its ''set of implementations'', is compact and derive a compactness theorem for Hennessy-Milner logic on such implementation sets. These results extend to systems that also have partially specified state propositions, unify existing denotational, operational, and metric semantics on partial processes, render robust consistency measures for modal transition systems, and yield an abstract interpretation of compact sets of labelled transition systems as Scott-closed sets of modal transition systems.
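The interplay of labelled transition systems and Hennessy-Milner logic can be made concrete with a small sketch. The encoding of states, transitions, and formulas below is hypothetical and far simpler than the paper's domain-theoretic setting; it only shows how HML diamond and box modalities are evaluated on a finite system.

```python
# Illustrative sketch only: evaluating Hennessy-Milner logic (HML) formulas
# on a small finite labelled transition system (LTS).

# Transitions: state -> list of (action, successor) pairs.
trans = {
    "s0": [("a", "s1"), ("a", "s2")],
    "s1": [("b", "s0")],
    "s2": [],
}

def holds(state, formula):
    """Evaluate an HML formula, given as a nested tuple, at a state."""
    op = formula[0]
    if op == "true":
        return True
    if op == "and":
        return holds(state, formula[1]) and holds(state, formula[2])
    if op == "dia":   # <act>phi: SOME act-successor satisfies phi
        _, act, phi = formula
        return any(a == act and holds(t, phi) for a, t in trans[state])
    if op == "box":   # [act]phi: EVERY act-successor satisfies phi
        _, act, phi = formula
        return all(holds(t, phi) for a, t in trans[state] if a == act)
    raise ValueError(f"unknown operator: {op}")

# s0 can do an "a" step, but not every "a" step leads to a state that can do "b":
print(holds("s0", ("dia", "a", ("true",))))                 # True
print(holds("s0", ("box", "a", ("dia", "b", ("true",)))))   # False
```

HML formulas of this kind are exactly the observations that distinguish states up to bisimilarity on image-finite systems, which is the setting the paper's compactness theorem refines.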

    A Survey of Languages for Specifying Dynamics: A Knowledge Engineering Perspective

    A number of formal specification languages for knowledge-based systems have been developed. Characteristic of knowledge-based systems are a complex knowledge base and an inference engine which uses this knowledge to solve a given problem. Specification languages for knowledge-based systems have to cover both aspects: they have to provide the means to specify a complex and large amount of knowledge, and they have to provide the means to specify the dynamic reasoning behavior of a knowledge-based system. We focus on the second aspect. For this purpose, we survey existing approaches for specifying dynamic behavior in related areas of research. Specifically, we examine approaches for the specification of information systems (Language for Conceptual Modeling and TROLL), approaches for the specification of database updates and logic programming (Transaction Logic and Dynamic Database Logic), and the generic specification framework of abstract state machines.

    3D and 4D Simulations for Landscape Reconstruction and Damage Scenarios. GIS Pilot Applications

    The project 3D and 4D Simulations for Landscape Reconstruction and Damage Scenarios: GIS Pilot Applications has been devised to meet the demand for research, innovation and applicable methodology on the part of the international programme, which requires concrete results to increase the capacity to know, anticipate and respond to a natural disaster. This project therefore sets out to develop an experimental methodology, a wide geodatabase, a connected high-performance GIS platform and multifunctional scenarios able to profitably relate the added values deriving from different geotechnologies, aimed at a series of crucial steps regarding landscape reconstruction, event simulation, damage evaluation, emergency management, and multi-temporal analysis. The Vesuvius area has been chosen for the pilot application because the number of people and buildings exposed to volcanic risk is so large that one could speak of a possible national disaster. The steps of the project move around the following core elements: creation of models that reproduce the territorial and anthropic structure of past periods, and reconstruction of the urbanized area, with temporal distinctions; three-dimensional representation of the Vesuvius area in terms of infrastructural-residential aspects; GIS simulation of the expected event; first examination of the healthcare-epidemiological consequences; educational proposals. This paper represents a proactive contribution which describes the aims of the project, the steps which constitute a set of specific procedures for the methodology with which we are experimenting, and some thoughts regarding the geodatabase useful to “package” illustrative elaborations. Since the involvement of the population and adequate hazard preparedness are very important aspects, some educational and communicational considerations are presented in connection with the use of geotechnologies to promote the knowledge of risk.