
    Upgrade of the CEDIT database of earthquake-induced ground effects in Italy

    The database of the Italian Catalogue of Earthquake-Induced Ground Failures (CEDIT) was recently upgraded and updated to 2017 as part of a work in progress focused on the following issues: i) reorganization of the geo-database architecture; ii) revision of the earthquake parameters from the CFTI5 and CPTI15 catalogues by INGV; iii) addition of new data on effects induced by earthquakes that occurred from 2009 to 2017; iv) attribution of a macroseismic intensity value to each effect site, according to the CFTI5 and CPTI15 catalogues by INGV. The revised CEDIT database aims at achieving: i) the optimization of the CEDIT catalogue in order to increase its usefulness for both public institutions and individual users; ii) a new architecture of the geo-database in view of a future implementation of the online catalogue, making it usable via a web app, also to support post-event detection and surveying activities. Here we illustrate the new geo-database design and discuss the statistics that can be derived from the updated database. Statistical analysis of the data recorded in the 2017 update of CEDIT, compared with that of the previous update, shows that:
    - landslides are the most represented ground effects (55%), followed by ground cracks (23%);
    - the MCS intensity (IMCS) distribution of the effect sites peaks at IMCS class 8, although a second frequency peak appears at IMCS class 7 for surface faulting effects only;
    - for all typologies of induced ground effects, frequency decreases with increasing epicentral distance.
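
    Summary statistics of this kind can be reproduced from any tabular export of the catalogue. The following is a minimal sketch, assuming a hypothetical table with columns effect_type, imcs and epicentral_km; these names are illustrative, not the actual CEDIT schema:

```python
# Minimal sketch: summary statistics of the kind discussed above, computed
# from a hypothetical effects table. Column names ("effect_type", "imcs",
# "epicentral_km") are illustrative assumptions, not the CEDIT schema.
import pandas as pd

effects = pd.DataFrame({
    "effect_type":   ["landslide", "ground_crack", "landslide", "surface_faulting"],
    "imcs":          [8, 7, 8, 7],
    "epicentral_km": [5.0, 12.0, 30.0, 2.0],
})

# Share of each induced-effect typology (e.g. landslides vs. ground cracks).
type_share = effects["effect_type"].value_counts(normalize=True) * 100

# Distribution of effect sites over MCS intensity classes.
imcs_dist = effects.groupby("imcs").size()

# Frequency of effects per epicentral-distance bin (expected to decrease).
dist_bins = pd.cut(effects["epicentral_km"], bins=[0, 10, 20, 50, 100])
dist_freq = effects.groupby(dist_bins, observed=True).size()

print(type_share, imcs_dist, dist_freq, sep="\n\n")
```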

    On the Application of Data Mining to Official Data

    Retrieving valuable knowledge and statistical patterns from official data has great potential to support strategic policy making. Data Mining (DM) techniques are well known for providing flexible and efficient analytical tools for data processing. In this paper, we provide an introduction to applications of DM to official statistics and flag the important issues and challenges. Considering recent advancements in software projects for DM, we propose an intelligent data control system design and specifications as an example of DM application in official data processing.
    Keywords: data mining, official data, intelligent data control system

    Geospatial Narratives and their Spatio-Temporal Dynamics: Commonsense Reasoning for High-level Analyses in Geographic Information Systems

    The modelling, analysis, and visualisation of dynamic geospatial phenomena have been identified as a key developmental challenge for next-generation Geographic Information Systems (GIS). In this context, the envisaged paradigmatic extensions to contemporary foundational GIS technology raise fundamental questions concerning the ontological, formal representational, and (analytical) computational methods that would underlie their spatial information theoretic underpinnings. We present the conceptual overview and architecture for the development of high-level semantic and qualitative analytical capabilities for dynamic geospatial domains. Building on formal methods in the areas of commonsense reasoning, qualitative reasoning, spatial and temporal representation and reasoning, reasoning about actions and change, and computational models of narrative, we identify concrete theoretical and practical challenges that accrue in the context of formal reasoning about 'space, events, actions, and change'. With this as a basis, and against the backdrop of an illustrative scenario involving the spatio-temporal dynamics of urban narratives, we address specific problems and solution techniques chiefly involving 'qualitative abstraction', 'data integration and spatial consistency', and 'practical geospatial abduction'. From a broad topical viewpoint, we propose that next-generation dynamic GIS technology demands a transdisciplinary scientific perspective that brings together Geography, Artificial Intelligence, and Cognitive Science.
    Keywords: artificial intelligence; cognitive systems; human-computer interaction; geographic information systems; spatio-temporal dynamics; computational models of narrative; geospatial analysis; geospatial modelling; ontology; qualitative spatial modelling and reasoning; spatial assistance systems
    Comment: pre-print of an article in press at the ISPRS International Journal of Geo-Information (ISSN 2220-9964), Special Issue on Geospatial Monitoring and Modelling of Environmental Change. Editor: Duccio Rocchini.

    The contextual database of the Generations and Gender Program: overview, conceptual framework and the link to the Generations and Gender Survey

    This paper has two aims. First, it gives an overview of the contextual database of the Generations and Gender Program and of how it is linked to the Generations and Gender Survey. Second, it documents the approaches taken towards the conceptual definition and construction of the database. The document consists of two parts. The first gives a brief description of the ideas underlying the database and of the approach taken to develop its conceptual framework and construct the database. The second part is a note on the link between the Generations and Gender Survey and the contextual database. Starting from the GGS questionnaire, the main interfaces between micro data and contextual domains are investigated.
    Keywords: data collection

    MICSIM: Concept, Developments and Applications of a PC-Microsimulation Model for Research and Teaching

    Growing societal interest in individuals and their behaviour in modern societies calls for microanalyses of individual situations. To enable such microanalyses on a quantitative, empirically grounded level, microsimulation models were developed and are increasingly used for economic and social policy impact analyses. Though microsimulation is known and applied (mainly by experts), an easy-to-use and powerful PC microsimulation model is hard to find. The overall aim of this study, and of MICSIM, a PC microsimulation model, is to describe and offer such a user-friendly and powerful general microsimulation model for (almost) any PC, supporting impact microanalyses in both applied research and teaching. Above all, MICSIM is a general microdata handler for a wide range of typical microanalysis requirements. This paper presents the concept, developments and applications of MICSIM. After some brief remarks on microsimulation characteristics in general, the concept and substantive domains of MICSIM (the simulation, the adjustment and aging, and the evaluation of microdata) are described in terms of its mode of operation. The realisations and developments of MICSIM are then portrayed through the different versions of the computer program. Some MICSIM applications and experiences in research and teaching follow, with concluding remarks.
    Keywords: economic and social policy analyses, microsimulation (dynamic and static), simulation, adjustment and evaluation of microdata, PC computer program for microanalyses in general
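
    Adjustment ("static aging") of microdata, one of the substantive domains named above, typically means reweighting records so that weighted totals match external benchmarks. The following is a minimal sketch of proportional reweighting under assumed group labels and benchmark totals; it is not MICSIM's own algorithm:

```python
# Minimal sketch of static adjustment ("aging") of microdata by proportional
# reweighting so that weighted group totals match external benchmarks.
# Group labels and benchmark totals are illustrative; this is NOT MICSIM's
# own algorithm.
import numpy as np

weights = np.ones(6)                      # initial survey weights
group = np.array(["employed", "employed", "unemployed",
                  "employed", "unemployed", "unemployed"])

benchmarks = {"employed": 400.0, "unemployed": 200.0}

adjusted = weights.copy()
for g, target in benchmarks.items():
    mask = group == g
    adjusted[mask] *= target / weights[mask].sum()

print(adjusted)                              # adjusted per-record weights
print(adjusted[group == "employed"].sum())   # 400.0, matches the benchmark
```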

    Persistence Modeling for Assessing Marketing Strategy Performance

    The question of long-run market response lies at the heart of any marketing strategy that tries to create a sustainable competitive advantage for the firm or brand. A key challenge, however, is that only short-run results of marketing actions are readily observable. Persistence modeling addresses the problem of long-run market-response quantification by combining into one measure of “net long-run impact” the chain reaction of consumer response, firm feedback and competitor response that emerges following the initial marketing action. In this paper, we (i) summarize recent marketing-strategic insights that have been accumulated through various persistence modeling applications, (ii) provide an introduction to some of the most frequently used persistence modeling techniques, and (iii) identify some other strategic research questions where persistence modeling may prove to be particularly valuable.
    Keywords: long-run effectiveness; marketing strategy; time-series analysis
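
    Persistence modeling is commonly operationalized with vector-autoregressive (VAR) models, where the cumulative impulse-response function gives the net long-run impact of a marketing shock. The following is a minimal sketch on simulated data, assuming illustrative variable names and lag order; it is not the paper's specific model:

```python
# Minimal sketch: the "net long-run impact" as the cumulative impulse
# response of a VAR. Data, variable names and lag order are illustrative
# assumptions, not the models reviewed in the paper.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 200
advertising = rng.normal(size=n)
sales = np.zeros(n)
for t in range(1, n):  # sales carry over and respond to lagged advertising
    sales[t] = 0.5 * sales[t - 1] + 0.3 * advertising[t - 1] + rng.normal()

data = pd.DataFrame({"advertising": advertising, "sales": sales})
results = VAR(data).fit(maxlags=2)

# Cumulative impulse responses over 20 periods: the long-run effect of a
# one-unit advertising shock on sales, including feedback loops.
irf = results.irf(20)
print(irf.cum_effects[-1])  # matrix of cumulative long-run responses
```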

    Non-Parametric Analysis of Hedge Fund Returns: New Insights from High Frequency Data

    This paper examines four different daily datasets of hedge fund return indexes: MSCI, FTSE, Dow Jones and HFRX, all based on investable hedge funds, and three different monthly datasets of hedge fund return indexes: CSFB, CISDM and HFR, which comprise both investable and non-investable hedge funds. Our study, based on standard statistical analysis, non-parametric analysis of the distribution and non-parametric regressions with respect to the S&P500 index, shows that key data biases and disparate index construction methodologies lead to different statistical properties of hedge fund databases. One key variable that strongly affects the statistical properties of hedge fund index returns is the “investability” of hedge funds.
    Keywords: hedge fund, risk management, high frequency data
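
    A non-parametric regression of hedge fund index returns on the S&P500 can be implemented, for example, with kernel regression. The following is a minimal sketch on simulated returns, assuming an illustrative data-generating process; it is not the paper's datasets or exact methodology:

```python
# Minimal sketch: non-parametric (kernel) regression of hedge fund index
# returns on S&P500 returns. The simulated data and bandwidth selection are
# illustrative assumptions, not the paper's datasets or methodology.
import numpy as np
from statsmodels.nonparametric.kernel_regression import KernelReg

rng = np.random.default_rng(1)
sp500 = rng.normal(0.0, 0.01, size=500)                 # daily index returns
hf = (0.4 * sp500 - 2.0 * np.minimum(sp500, 0.0) ** 2
      + rng.normal(0.0, 0.003, size=500))               # option-like payoff

# Local-linear kernel regression; "c" marks the regressor as continuous,
# with the bandwidth chosen by least-squares cross-validation by default.
kr = KernelReg(endog=hf, exog=sp500, var_type="c", reg_type="ll")
fitted, _ = kr.fit(sp500)
print(fitted[:5])  # a non-linear fit is what this analysis is meant to reveal
```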

    Electroencephalographic field influence on calcium momentum waves

    Macroscopic EEG fields can be an explicit top-down neocortical mechanism that directly drives bottom-up processes describing memory, attention, and other neuronal processes. The top-down mechanism considered is macrocolumnar EEG firings in neocortex, as described by a statistical mechanics of neocortical interactions (SMNI), developed in terms of a magnetic vector potential $\mathbf{A}$. The bottom-up process considered is $\mathrm{Ca}^{2+}$ waves, prominent in synaptic and extracellular processes, which are considered to greatly influence neuronal firings. Here, the complementary effects are considered, i.e., the influence of $\mathbf{A}$ on the $\mathrm{Ca}^{2+}$ momentum $\mathbf{p}$. The canonical momentum of a charged particle in an electromagnetic field, $\mathbf{\Pi} = \mathbf{p} + q\mathbf{A}$ (SI units), is calculated, where the charge of $\mathrm{Ca}^{2+}$ is $q = -2e$ and $e$ is the magnitude of the charge of an electron. Calculations demonstrate that the macroscopic EEG $\mathbf{A}$ can be quite influential on the momentum $\mathbf{p}$ of $\mathrm{Ca}^{2+}$ ions, in both classical and quantum mechanics. Molecular scales of $\mathrm{Ca}^{2+}$ wave dynamics are coupled with $\mathbf{A}$ fields developed at macroscopic regional scales from coherent neuronal firing activity measured by scalp EEG. The project has three main aspects: fitting $\mathbf{A}$ models to EEG data, as reported here; building tripartite models to develop $\mathbf{A}$ models; and studying long coherence times of $\mathrm{Ca}^{2+}$ waves in the presence of $\mathbf{A}$ due to coherent neuronal firings measured by scalp EEG. The SMNI model supports a mechanism wherein the $\mathbf{p} + q\mathbf{A}$ interaction at tripartite synapses, via a dynamic centering mechanism (DCM) to control background synaptic activity, acts to maintain short-term memory (STM) during states of selective attention.
    Comment: Final draft. http://ingber.com/smni14_eeg_ca.pdf may be updated more frequently.
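
    For a sense of scale, one can compare the intrinsic momentum $\mathbf{p}$ of a single $\mathrm{Ca}^{2+}$ ion in a wave against the $q\mathbf{A}$ term. The following back-of-the-envelope sketch uses illustrative magnitudes; in particular, the assumed $|\mathbf{A}|$ is a placeholder, not a value taken from the paper:

```python
# Back-of-the-envelope sketch comparing the two terms of the canonical
# momentum Pi = p + q*A for a single Ca2+ ion. The EEG vector-potential
# magnitude A_MAG is an illustrative placeholder, NOT a value from the paper.
E_CHARGE = 1.602e-19           # elementary charge e, C
CA_MASS = 40.08 * 1.661e-27    # mass of a Ca2+ ion, kg
WAVE_SPEED = 50e-6             # assumed Ca2+ wave speed ~50 um/s, m/s
A_MAG = 1e-4                   # assumed |A| of the EEG field, T*m (placeholder)

p = CA_MASS * WAVE_SPEED       # intrinsic momentum of one ion, kg*m/s
qA = 2 * E_CHARGE * A_MAG      # magnitude of the field term |q*A|, kg*m/s

print(f"p    = {p:.3e} kg*m/s")
print(f"qA   = {qA:.3e} kg*m/s")
print(f"qA/p = {qA / p:.3e}")  # ratio >> 1 means the field term dominates
```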

    Data Warehouse Design and Management: Theory and Practice

    The need to store data and information permanently, for reuse at later stages, is a highly relevant problem in the modern world, and it now affects a large number of people and economic agents. The storage and subsequent use of data can indeed be a valuable source for decision making and for increasing commercial activity. The next step beyond data storage is the efficient and effective use of information, particularly through Business Intelligence, at whose base lies the implementation of a Data Warehouse. In the present paper we analyze Data Warehouses and their theoretical models, and illustrate a practical implementation in a specific case study on a pharmaceutical distribution company.
    Keywords: data warehouse, database, data model
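
    The core of such an implementation is typically a star schema: one fact table referencing several dimension tables. The following is a minimal sketch in SQLite, assuming illustrative table and column names rather than the case study's actual model:

```python
# Minimal sketch of a data-warehouse star schema (one fact table plus
# dimension tables) in SQLite. Table and column names are illustrative,
# not the case study's actual model.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product  (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_pharmacy (pharmacy_id INTEGER PRIMARY KEY, city TEXT);
CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE fact_sales (
    product_id  INTEGER REFERENCES dim_product(product_id),
    pharmacy_id INTEGER REFERENCES dim_pharmacy(pharmacy_id),
    date_id     INTEGER REFERENCES dim_date(date_id),
    quantity    INTEGER,
    revenue     REAL
);
""")
conn.execute("INSERT INTO dim_product VALUES (1, 'aspirin')")
conn.execute("INSERT INTO dim_date VALUES (1, '2013-01-01')")
conn.executemany("INSERT INTO dim_pharmacy VALUES (?, ?)",
                 [(1, "Rome"), (2, "Milan")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?, ?)",
                 [(1, 1, 1, 10, 120.0), (1, 2, 1, 4, 48.0)])

# A typical OLAP-style query: total revenue per city, aggregated from facts.
rows = conn.execute("""
    SELECT p.city, SUM(f.revenue)
    FROM fact_sales AS f JOIN dim_pharmacy AS p USING (pharmacy_id)
    GROUP BY p.city
""").fetchall()
print(rows)  # e.g. [('Milan', 48.0), ('Rome', 120.0)]
```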