
    On the Structural Properties of Social Networks and their Measurement-calibrated Synthetic Counterparts

    Data-driven analysis of large social networks has attracted a great deal of research interest. In this paper, we investigate 120 real social networks and their measurement-calibrated synthetic counterparts generated by four well-known network models. We examine the structural properties of the networks, revealing the correlation profiles of graph metrics across various social domains (friendship networks, communication networks, and collaboration networks). We find that the correlation patterns differ across domains, and we identify a non-redundant set of metrics to describe social networks. We study which topological characteristics of real networks the models can or cannot capture, finding that the goodness-of-fit of the network models depends on the domain. Furthermore, while 2K and stochastic block models cannot generate graphs with both a large diameter and a high clustering coefficient, they can still be used to mimic social networks relatively efficiently.
    Comment: To appear in International Conference on Advances in Social Networks Analysis and Mining (ASONAM '19), Vancouver, BC, Canada

    Proceedings of Abstracts Engineering and Computer Science Research Conference 2019

    © 2019 The Author(s). This is an open-access work distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. For further details please see https://creativecommons.org/licenses/by/4.0/.

    Note: the keynote "Fluorescence visualisation to evaluate effectiveness of personal protective equipment for infection control" is © 2019 Crown copyright and so is licensed under the Open Government Licence v3.0. Under this licence users are permitted to copy, publish, distribute and transmit the Information; adapt the Information; and exploit the Information commercially and non-commercially, for example by combining it with other Information or by including it in your own product or application. Where you do any of the above, you must acknowledge the source of the Information in your product or application by including or linking to any attribution statement specified by the Information Provider(s) and, where possible, provide a link to this licence: http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3/

    This book is the record of abstracts submitted and accepted for presentation at the Inaugural Engineering and Computer Science Research Conference held 17th April 2019 at the University of Hertfordshire, Hatfield, UK. This conference is a local event aiming to bring together research students, staff, and eminent external guests to celebrate Engineering and Computer Science research at the University of Hertfordshire. The ECS Research Conference aims to showcase the broad landscape of research taking place in the School of Engineering and Computer Science. The 2019 conference was articulated around three topical cross-disciplinary themes: Make and Preserve the Future; Connect the People and Cities; and Protect and Care.

    Maximum entropy approach to multivariate time series randomization

    Natural and social multivariate systems are commonly studied through sets of simultaneous and time-spaced measurements of the observables that drive their dynamics, i.e., through sets of time series. Typically, this is done via hypothesis testing: the statistical properties of the empirical time series are tested against those expected under a suitable null hypothesis. This is a very challenging task in complex interacting systems, where statistical stability is often poor due to lack of stationarity and ergodicity. Here, we describe an unsupervised, data-driven framework to perform hypothesis testing in such situations. It consists of a statistical mechanical approach—analogous to the configuration model for networked systems—for ensembles of time series designed to preserve, on average, some of the statistical properties observed on an empirical set of time series. We showcase its possible applications with a case study on financial portfolio selection.
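The hypothesis-testing workflow the abstract describes can be sketched with a deliberately simple null ensemble: Gaussian surrogates that preserve each series' mean and variance (the maximum-entropy ensemble under those constraints). This is a toy stand-in for the paper's configuration-model-style framework, and all data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "empirical" multivariate time series: 5 series, 300 observations.
X = rng.normal(size=(5, 300)).cumsum(axis=1)

def maxent_surrogates(X, n_samples, rng):
    """Draw surrogates from the maximum-entropy ensemble preserving, on
    average, each series' mean and variance (independent Gaussians -- a
    simplified stand-in for the framework in the abstract)."""
    mu = X.mean(axis=1, keepdims=True)
    sigma = X.std(axis=1, keepdims=True)
    return mu + sigma * rng.normal(size=(n_samples, *X.shape))

surr = maxent_surrogates(X, 100, rng)

# Hypothesis test: is the observed cross-correlation between series 0
# and 1 larger than expected under the null ensemble?
obs = np.corrcoef(X[0], X[1])[0, 1]
null = np.array([np.corrcoef(s[0], s[1])[0, 1] for s in surr])
p = (np.abs(null) >= abs(obs)).mean()
print(f"observed corr = {obs:.2f}, empirical p-value = {p:.2f}")
```

A small p-value would indicate that the observed dependence is not explained by the preserved single-series properties alone.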

    Empirical Validation of Agent Based Models: A Critical Survey

    This paper addresses the problem of finding the appropriate method for conducting empirical validation in agent-based (AB) models, which is often regarded as the Achilles’ heel of the AB approach to economic modelling. The paper has two objectives. First, to identify key issues facing AB economists engaged in empirical validation. Second, to critically appraise the extent to which alternative approaches deal with these issues. We identify a first set of issues that are common to both AB and neoclassical modellers and a second set of issues which are specific to AB modellers. This second set of issues is captured in a novel taxonomy, which takes into consideration the nature of the object under study, the goal of the analysis, the nature of the modelling assumptions, and the methodology of the analysis. Having identified the nature and causes of heterogeneity in empirical validation, we examine three important approaches to validation that have been developed in AB economics: indirect calibration, the Werker-Brenner approach, and the history-friendly approach. We also discuss a set of open questions within empirical validation. These include the trade-off between empirical support and tractability of findings, the issue of over-parameterisation, unconditional objects, counterfactuals, and the non-neutrality of data.
    Keywords: empirical validation, agent-based models, calibration, history-friendly modelling
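The indirect-calibration idea mentioned in the abstract can be sketched in miniature: simulate a toy agent-based model over a parameter grid and retain only the parameter values whose simulated moment falls within a tolerance band of an empirical target. The model, the target share, and the tolerance are all hypothetical, chosen only to make the loop concrete.

```python
import numpy as np

rng = np.random.default_rng(2)

def toy_ab_model(p, n_agents=1000, steps=50, rng=rng):
    """Hypothetical toy AB model: each step, every agent samples a random
    peer and adopts with probability p if that peer has adopted.
    Returns the final adoption share."""
    adopted = np.zeros(n_agents, dtype=bool)
    adopted[:10] = True  # initial seed of adopters (1%)
    for _ in range(steps):
        peers = adopted[rng.integers(0, n_agents, n_agents)]
        adopted |= peers & (rng.random(n_agents) < p)
    return adopted.mean()

# Indirect calibration: accept parameters reproducing the empirical
# moment (here a hypothetical adoption share of 0.6) within tolerance.
empirical_share = 0.6
grid = np.linspace(0.01, 0.2, 20)
accepted = [p for p in grid if abs(toy_ab_model(p) - empirical_share) < 0.05]
print(accepted)
```

In the approaches the paper surveys, the accepted region (rather than a single point estimate) is the calibration output, which is one source of the tractability and over-parameterisation issues discussed.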

    Decompositions of Triangle-Dense Graphs

    High triangle density -- the graph property stating that a constant fraction of two-hop paths belong to a triangle -- is a common signature of social networks. This paper studies triangle-dense graphs from a structural perspective. We prove constructively that significant portions of a triangle-dense graph are contained in a disjoint union of dense, radius-2 subgraphs. This result quantifies the extent to which triangle-dense graphs resemble unions of cliques. We also show that our algorithm recovers planted clusterings in approximation-stable k-median instances.
    Comment: 20 pages. Version 1->2: Minor edits. 2->3: Strengthened §3.5, removed appendix
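The triangle density the abstract defines -- the fraction of two-hop paths (wedges) closed by an edge -- is exactly the graph's transitivity, which is easy to compute directly. The examples below (a clique with density 1 and a path with density 0) are illustrative extremes, not from the paper.

```python
import networkx as nx

def triangle_density(G):
    """Fraction of two-hop paths (wedges) that belong to a triangle,
    i.e. the transitivity of G."""
    # nx.triangles counts each triangle once at each of its 3 vertices.
    triangles = sum(nx.triangles(G).values()) / 3
    wedges = sum(d * (d - 1) // 2 for _, d in G.degree())
    return 3 * triangles / wedges if wedges else 0.0

K5 = nx.complete_graph(5)   # clique: every wedge is closed
P5 = nx.path_graph(5)       # path: no triangles at all
print(triangle_density(K5), triangle_density(P5))  # 1.0 0.0
```

Social networks typically sit well above random-graph baselines on this measure, which is why the paper treats it as a structural signature.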

    Alaska University Transportation Center 2012 Annual Report


    Modelling electricity prices: from the state of the art to a draft of a new proposal

    In recent decades, a liberalization of the electricity market has taken place; prices are now determined on the basis of contracts on regular markets, and their behaviour is mainly driven by the usual supply and demand forces. A large body of literature has been developed to analyze and forecast their evolution; it includes works with different aims and methodologies depending on the temporal horizon being studied. In this survey we depict the current state of the art, focusing only on recent papers oriented to the determination of trends in electricity spot prices and to forecasting these prices in the short run. Structural methods of analysis, which are appropriate for the determination of forward and futures values, are left aside. Studies have been divided into three broad classes: autoregressive models, regime-switching models, and volatility models. Six fundamental points arise: the peculiarities of the electricity market, the complex statistical properties of prices, the lack of economic foundations of the statistical models used for price analysis, the primacy of uniequational approaches, the crucial role played by demand and supply in price determination, and the lack of clear-cut evidence in favour of a specific framework of analysis.
    To take these stylized issues into account, we propose the adoption of a methodological framework not yet used to model and forecast electricity prices: a time-varying-parameter Dynamic Factor Model (DFM). Such an eclectic approach, introduced in the late ’70s for macroeconomic analysis, enables the identification of the unobservable dynamics of demand and supply driving electricity prices, the coexistence of short-term and long-term determinants, and the creation of forecasts of future trends. Moreover, it allows us to simulate the impact that mismatches between demand and supply have on prices. In this way it is possible to evaluate whether congestions in the network (eventually leading to blackout phenomena) trigger price reactions that can be considered as warning mechanisms.
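The core DFM idea -- a small number of latent factors (such as aggregate demand) driving many observed price series -- can be sketched with a static, one-factor simplification: simulate prices loaded on a common AR(1) factor and recover that factor as the leading principal component. This is a toy illustration with simulated data, not the time-varying-parameter DFM the survey proposes.

```python
import numpy as np

rng = np.random.default_rng(1)
T, n = 500, 8  # time periods, regional price series

# Simulate a one-factor model: a latent "demand" factor drives all prices.
factor = np.zeros(T)
for t in range(1, T):
    factor[t] = 0.9 * factor[t - 1] + rng.normal()
loadings = rng.uniform(0.5, 1.5, size=n)
prices = np.outer(factor, loadings) + rng.normal(scale=0.5, size=(T, n))

# Extract the common factor as the leading principal component
# (a static stand-in for the DFM's latent-state estimation).
X = prices - prices.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
estimated = X @ Vt[0]

# The estimate should track the true factor up to sign and scale.
corr = abs(np.corrcoef(estimated, factor)[0, 1])
print(f"|corr(estimated, true factor)| = {corr:.2f}")
```

In a full time-varying-parameter DFM the loadings and factor dynamics would evolve over time and be estimated recursively (e.g. via a Kalman filter) rather than by a single decomposition as here.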