
    Towards efficient comparison of change-based models

    Comparison of large models can be time-consuming, since every element has to be visited, matched, and compared with its respective element in other models. This can result in bottlenecks in collaborative modelling environments, where identifying differences between two versions of a model is desirable. Reducing the comparison process to only the elements that have been modified since a previous known state (e.g., the previous version) could significantly reduce the time required to compare large models. This paper presents how change-based persistence can be used to localise model comparison so that only elements affected by recent changes are compared, substantially reducing comparison and differencing time (by up to 90% in some experiments) relative to state-based model comparison.
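    To illustrate the idea of change-localised comparison, the sketch below contrasts a state-based comparison (visit every element) with one restricted to a change log of touched element identifiers. It is a minimal illustration only, assuming models are stored as dictionaries mapping element ids to attributes; the names (full_compare, localised_compare, change_log) are hypothetical and not taken from the paper.

        # Minimal sketch; hypothetical data layout: model = {element_id: {attribute: value}}

        def full_compare(model_a, model_b):
            """State-based comparison: visit every element in both models."""
            diffs = {}
            for eid in model_a.keys() | model_b.keys():
                if model_a.get(eid) != model_b.get(eid):
                    diffs[eid] = (model_a.get(eid), model_b.get(eid))
            return diffs

        def localised_compare(model_a, model_b, change_log):
            """Change-based comparison: visit only elements touched since the last known state."""
            diffs = {}
            for eid in change_log:
                if model_a.get(eid) != model_b.get(eid):
                    diffs[eid] = (model_a.get(eid), model_b.get(eid))
            return diffs

        old = {"c1": {"name": "Order"}, "c2": {"name": "Item"}}
        new = {"c1": {"name": "Order"}, "c2": {"name": "LineItem"}}
        changes = ["c2"]  # touched element ids recorded by change-based persistence
        assert localised_compare(old, new, changes) == full_compare(old, new)
        print(localised_compare(old, new, changes))

    Both functions return the same differences here; the localised version simply skips the untouched elements, which is where the time savings come from on large models.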

    HBIM methodology as a bridge between Italy and Argentina

    The availability of efficient HBIM workflows could represent a very important change towards more efficient management of historical real estate. The present work shows how to obtain accurate and reliable information on heritage buildings through reality capture and 3D modelling to support restoration purposes or knowledge-based applications. Two case studies metaphorically join Italy with Argentina. The article explains the workflows applied at Palazzo Ferretti in Ancona and at the Manzana Histórica de la Universidad Nacional del Litoral, providing a constructive comparison and blending technological and theoretical approaches. In a bottom-up process, the assessment of the two case studies validates a workflow that achieves useful and proper data enrichment of each HBIM model. Another key aspect is the Level of Development (LOD) evaluation of both models: different ranges and scales are defined in America (100–500) and in Italy (A–G); nevertheless, it is possible to obtain standard shared procedures, facilitating HBIM development and diffusion in operating workflows.

    Simulating the deep decarbonisation of residential heating for limiting global warming to 1.5°C

    Whole-economy scenarios for limiting global warming to 1.5°C suggest that direct carbon emissions in the buildings sector should decrease to almost zero by 2050, but leave unanswered the question of how this could be achieved by real-world policies. We take a modelling-based approach to simulate which policy measures could induce an almost-complete decarbonisation of residential heating, by far the largest source of direct emissions in residential buildings. Under which assumptions is it possible, and how long would it take? Policy effectiveness depends strongly on behavioural decision-making by households, especially in a context of deep decarbonisation and rapid transformation. We therefore use the non-equilibrium bottom-up model FTT:Heat to simulate policies for a transition towards low-carbon heating in a context of inertia and bounded rationality, focusing on the uptake of heating technologies. Results indicate that near-zero decarbonisation is achievable by 2050, but requires substantial policy effort. Policy mixes are projected to be more effective and robust for driving the market for efficient low-carbon technologies than reliance on a carbon tax as the only policy instrument. In combination with subsidies for renewables, near-complete decarbonisation could be achieved with a residential carbon tax of 50–200 EUR/tCO2. The policy-induced technology transition would initially increase the average heating costs faced by households, but could also lead to cost reductions in most world regions in the medium term. Model projections illustrate the uncertainty attached to household behaviour regarding the premature replacement of heating systems.
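    As a rough illustration of how a carbon tax can shift heating-technology uptake, the toy model below applies a logit choice over annualised costs that include a carbon price. It is not the FTT:Heat model; all technology costs, emission factors, and the sensitivity parameter are made-up placeholders, and only the 100 EUR/tCO2 tax level is drawn from the paper's 50–200 range.

        import math

        # Hypothetical technologies: (annualised cost in EUR/yr, direct emissions in tCO2/yr)
        technologies = {
            "gas boiler": (1200, 3.0),
            "oil boiler": (1300, 4.0),
            "heat pump":  (1500, 0.0),
        }

        def market_shares(carbon_tax, sensitivity=0.005):
            """Logit shares over effective cost = base cost + carbon tax * direct emissions."""
            costs = {t: c + carbon_tax * e for t, (c, e) in technologies.items()}
            weights = {t: math.exp(-sensitivity * cost) for t, cost in costs.items()}
            total = sum(weights.values())
            return {t: round(w / total, 3) for t, w in weights.items()}

        print("no tax:      ", market_shares(0))
        print("100 EUR/tCO2:", market_shares(100))

    With the tax applied, the effective cost of the fossil options rises relative to the heat pump and its share grows; the real model adds inertia, bounded rationality, and regional detail on top of this kind of choice mechanism.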

    Holistic Influence Maximization: Combining Scalability and Efficiency with Opinion-Aware Models

    The steady growth of graph data from social networks has resulted in widespread research into solutions to the influence maximization problem. In this paper, we propose a holistic solution to the influence maximization (IM) problem. (1) We introduce an opinion-cum-interaction (OI) model that closely mirrors real-world scenarios. Under the OI model, we introduce the novel problem of Maximizing the Effective Opinion (MEO) of influenced users. We prove that the MEO problem is NP-hard and cannot be approximated within a constant ratio unless P=NP. (2) We propose a heuristic algorithm, OSIM, to efficiently solve the MEO problem. To better explain the OSIM heuristic, we first introduce EaSyIM, the opinion-oblivious version of OSIM and a scalable algorithm capable of running within practical compute times on commodity hardware. In addition to serving as a fundamental building block for OSIM, EaSyIM also addresses the scalability aspects of the IM problem, namely memory consumption and running time. Empirically, our algorithms keep the deviation in spread within 5% of the best known methods in the literature. In addition, our experiments show that both OSIM and EaSyIM are effective, efficient, scalable, and significantly enhance the ability to analyze real datasets. Comment: ACM SIGMOD Conference 2016, 18 pages, 29 figures.
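    To make the opinion-aware objective concrete, the sketch below runs a generic Monte Carlo greedy seed selection under the independent cascade model, with node opinions weighting the spread. It is not the OSIM or EaSyIM algorithm; the tiny graph, the opinion values, and the uniform propagation probability are invented for illustration.

        import random

        edges = {1: [2, 3], 2: [4], 3: [4, 5], 4: [], 5: []}  # invented adjacency list
        opinion = {1: 0.9, 2: 0.2, 3: 0.7, 4: 0.5, 5: 0.8}    # invented opinions in [0, 1]
        PROP_PROB = 0.3                                       # uniform activation probability

        def opinion_spread(seeds, runs=500):
            """Average opinion-weighted reach of an independent cascade started from seeds."""
            total = 0.0
            for _ in range(runs):
                active, frontier = set(seeds), list(seeds)
                while frontier:
                    nxt = []
                    for u in frontier:
                        for v in edges[u]:
                            if v not in active and random.random() < PROP_PROB:
                                active.add(v)
                                nxt.append(v)
                    frontier = nxt
                total += sum(opinion[v] for v in active)
            return total / runs

        def greedy_seeds(k):
            """Add, k times, the node with the largest estimated marginal gain."""
            seeds = []
            for _ in range(k):
                best = max((v for v in edges if v not in seeds),
                           key=lambda v: opinion_spread(seeds + [v]))
                seeds.append(best)
            return seeds

        print(greedy_seeds(2))

    The Monte Carlo greedy loop is the expensive part on real graphs; heuristics such as those described in the paper are aimed at avoiding exactly this per-candidate simulation cost.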

    Efficient Monte Carlo methods for continuum radiative transfer

    We discuss the efficiency of Monte Carlo methods in solving continuum radiative transfer problems. The sampling of the radiation field and the convergence of dust temperature calculations in the case of optically thick clouds are both studied. For spherically symmetric clouds we find that the computational cost of Monte Carlo simulations can be reduced, in some cases by orders of magnitude, with simple importance weighting schemes. This is particularly true for models consisting of cells of different sizes, for which the run times would otherwise be determined by the size of the smallest cell. We present a new idea of extending importance weighting to scattered photons. This is found to be useful in calculations of scattered flux and could be important for three-dimensional models when observed intensity is needed only for one general direction of observation. Convergence of dust temperature calculations is studied for models with optical depths of 10–10000. We examine acceleration methods in which radiative interactions inside a cell or between neighbouring cells are treated explicitly. In optically thick clouds with strong self-coupling between dust temperatures, the run times can be reduced by more than one order of magnitude. We also examine the use of a reference field, which eliminates the need to repeat the simulation of constant sources (e.g., background radiation) after the first iteration and significantly reduces sampling errors. The applicability of the methods to three-dimensional models is discussed. Comment: submitted to A&A, 19 pages.
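    A minimal example of the kind of importance weighting discussed here (not the authors' code): estimating the fraction of photons that escape a slab of optical depth tau, whose exact value is exp(-tau). Sampling path lengths from a stretched exponential and re-weighting by the ratio of the true to the biased density keeps the estimator unbiased while greatly reducing variance at high optical depth; the slab setup and parameter values are illustrative assumptions.

        import math
        import random

        def escape_naive(tau, n=100_000):
            """Sample free paths from exp(-s); very few samples exceed a large optical depth."""
            return sum(1 for _ in range(n) if random.expovariate(1.0) > tau) / n

        def escape_weighted(tau, n=100_000):
            """Sample from a stretched exponential with mean tau and re-weight by p/q."""
            total = 0.0
            for _ in range(n):
                s = random.expovariate(1.0 / tau)                    # biased density q(s) = exp(-s/tau)/tau
                if s > tau:
                    total += tau * math.exp(-s * (1.0 - 1.0 / tau))  # weight p(s)/q(s)
            return total / n

        tau = 10.0
        print("exact    :", math.exp(-tau))
        print("naive    :", escape_naive(tau))
        print("weighted :", escape_weighted(tau))

    At tau = 10 the naive estimator sees only a handful of escaping samples out of 100,000, while the weighted estimator uses a large fraction of them, which is the same variance-reduction principle the paper applies to small cells and scattered photons.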