    Single-Board-Computer Clusters for Cloudlet Computing in Internet of Things

    The number of connected sensors and devices is expected to grow into the billions in the near future. However, centralised cloud-computing data centres struggle to meet the requirements inherent to Internet of Things (IoT) workloads, such as low latency, high throughput and bandwidth constraints. Edge computing is becoming the standard computing paradigm for latency-sensitive, real-time IoT workloads, since it addresses these limitations of centralised cloud-computing models. The paradigm relies on bringing computation close to the source of the data, which presents serious operational challenges for large-scale cloud-computing providers. In this work, we present an architecture composed of low-cost single-board-computer clusters placed near data sources, combined with centralised cloud-computing data centres. The proposed cost-efficient model may be employed as an alternative to fog computing to meet real-time IoT workload requirements while preserving scalability. We include an extensive empirical analysis to assess the suitability of single-board-computer clusters as cost-effective edge-computing micro data centres. Additionally, we compare the proposed architecture with traditional cloudlet and cloud architectures and evaluate them through extensive simulation. We finally show that acquisition costs can be drastically reduced while maintaining performance levels in data-intensive IoT use cases.
    Funding: Ministerio de Economía y Competitividad TIN2017-82113-C2-1-R; Ministerio de Economía y Competitividad RTI2018-098062-A-I00; European Union’s Horizon 2020 No. 754489; Science Foundation Ireland grant 13/RC/209
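
    The trade-off evaluated in the paper's simulations can be illustrated with a minimal queueing sketch: an SBC cloudlet offers modest compute one network hop from the sensors, while the central cloud offers far more compute behind a longer path. This is a rough sketch assuming an M/M/1 service model; all parameter values are illustrative assumptions, not figures from the paper.

```python
# Back-of-the-envelope comparison of an SBC cloudlet and a remote
# cloud data centre. All figures are illustrative assumptions.

def response_time(net_latency_s: float, service_rate: float, load: float) -> float:
    """M/M/1 sojourn time: network latency plus queueing/service delay."""
    if load >= 1.0:
        raise ValueError("system is unstable at load >= 1")
    return net_latency_s + 1.0 / (service_rate * (1.0 - load))

# Assumed parameters: the edge cluster is slower per node but close by;
# the cloud is faster but reached over a wide-area network.
edge_rt  = response_time(net_latency_s=0.002, service_rate=50.0,  load=0.6)
cloud_rt = response_time(net_latency_s=0.080, service_rate=500.0, load=0.6)

print(f"edge cloudlet : {edge_rt * 1000:.1f} ms")   # ~52 ms
print(f"central cloud : {cloud_rt * 1000:.1f} ms")  # ~85 ms
```

    Under these assumed numbers the cloudlet wins on response time despite its much lower service rate, which is the intuition behind placing micro data centres near the data.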

    Current Trends in Simheuristics: from smart transportation to agent-based simheuristics

    Simheuristics extend metaheuristics by adding a simulation layer that allows the optimization component to deal efficiently with scenarios under uncertainty. This presentation reviews both initial and recent applications of simheuristics, mainly in the area of logistics and transportation. We also discuss a novel agent-based simheuristic (ABSH) approach that combines simheuristics and multi-agent systems to efficiently solve stochastic combinatorial optimization problems. The presentation is based on papers [1], [2], and [3], which have already been accepted at the prestigious Winter Simulation Conference.
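
    As a rough illustration of the simheuristic idea (not the authors' algorithms), the sketch below couples a simple swap-move metaheuristic with a Monte Carlo simulation layer that scores candidate job orders under stochastic processing times. The scheduling problem and all data are invented for the example.

```python
import random

# Invented problem: order jobs to minimise expected weighted completion
# time when processing times are noisy (lognormal multiplicative noise).
JOBS = [(4.0, 1.0), (2.0, 3.0), (6.0, 2.0), (3.0, 5.0)]  # (mean time, weight)

def simulate_cost(order, n_runs=200):
    """Simulation layer: estimate expected weighted completion time."""
    total = 0.0
    for _ in range(n_runs):
        t = cost = 0.0
        for idx in order:
            mean, weight = JOBS[idx]
            t += mean * random.lognormvariate(0.0, 0.3)
            cost += weight * t
        total += cost
    return total / n_runs

def simheuristic(n_jobs, iters=300):
    """Metaheuristic layer: hill climbing over job orders."""
    best = list(range(n_jobs))
    best_cost = simulate_cost(best)
    for _ in range(iters):
        cand = best[:]
        i, j = random.sample(range(n_jobs), 2)
        cand[i], cand[j] = cand[j], cand[i]   # swap move
        cost = simulate_cost(cand)            # evaluated by simulation
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

print(simheuristic(len(JOBS)))
```

    The defining feature is that the objective is never computed analytically: every candidate the metaheuristic proposes is scored by simulation, so uncertainty enters the search directly.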

    Simulating the deep decarbonisation of residential heating for limiting global warming to 1.5 °C

    Whole-economy scenarios for limiting global warming to 1.5 °C suggest that direct carbon emissions in the buildings sector should decrease to almost zero by 2050, but they leave unanswered the question of how this could be achieved by real-world policies. We take a modelling-based approach to simulate which policy measures could induce an almost-complete decarbonisation of residential heating, by far the largest source of direct emissions in residential buildings. Under which assumptions is it possible, and how long would it take? Policy effectiveness depends strongly on behavioural decision-making by households, especially in a context of deep decarbonisation and rapid transformation. We therefore use the non-equilibrium bottom-up model FTT:Heat to simulate policies for a transition towards low-carbon heating under inertia and bounded rationality, focusing on the uptake of heating technologies. Results indicate that near-complete decarbonisation is achievable by 2050, but requires substantial policy effort. Policy mixes are projected to be more effective and robust for driving the market uptake of efficient low-carbon technologies than reliance on a carbon tax as the only policy instrument. In combination with subsidies for renewables, near-complete decarbonisation could be achieved with a residential carbon tax of 50–200 EUR/tCO2. The policy-induced technology transition would initially increase the average heating costs faced by households, but could also lead to cost reductions in most world regions in the medium term. Model projections illustrate the uncertainty attached to household behaviour regarding the premature replacement of heating systems.
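
    A stylised sketch of the kind of mechanism involved (not the actual FTT:Heat formulation, which also models inertia and bounded rationality): a multinomial-logit choice model in which a carbon tax shifts market shares from fossil boilers toward heat pumps. Costs, emission factors and the sensitivity parameter below are illustrative assumptions.

```python
import math

def logit_shares(costs, sensitivity=0.15):
    """Multinomial-logit market shares from levelised heating costs."""
    weights = [math.exp(-sensitivity * c) for c in costs]
    total = sum(weights)
    return [w / total for w in weights]

# Assumed levelised costs in EUR/MWh: gas boiler, oil boiler, heat pump.
base_costs = [60.0, 75.0, 90.0]
emission_factors = [0.20, 0.27, 0.0]   # tCO2 per MWh of heat (illustrative)

for tax in (0, 50, 100, 200):          # carbon tax in EUR/tCO2
    taxed = [c + tax * e for c, e in zip(base_costs, emission_factors)]
    shares = logit_shares(taxed)
    print(f"tax {tax:>3} EUR/tCO2 -> shares {[round(s, 2) for s in shares]}")
```

    Even in this toy model the heat-pump share only becomes dominant at the upper end of the tax range, echoing the finding that a tax alone is less effective than a policy mix.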

    A review of applied methods in Europe for flood-frequency analysis in a changing environment

    The report presents a review of methods used in Europe for trend analysis, climate change projections and non-stationary analysis of extreme precipitation and flood frequency. In addition, the main findings of the analyses are presented, including a comparison of trend analysis results and climate change projections. Existing European guidelines on design flood and design rainfall estimation that incorporate climate change are reviewed. The report concludes with a discussion of research needs in non-stationary frequency analysis for considering the effects of climate change and for inclusion in design guidelines.
    Trend analyses are reported for 21 countries in Europe, with results for extreme precipitation, extreme streamflow or both. A large number of national and regional trend studies have been carried out. Most are based on statistical methods applied to individual time series of extreme precipitation or extreme streamflow, using the non-parametric Mann-Kendall trend test or regression analysis. Some studies use field significance or regional consistency tests to analyse trends over larger areas, and some also include analysis of trend attribution. The studies reviewed indicate some evidence of a general increase in extreme precipitation, whereas there are no clear indications of significant increasing trends in extreme streamflow at regional or national level. For some smaller regions, increases in extreme streamflow are reported. Several studies from regions dominated by snowmelt-induced peak flows report decreases in extreme streamflow and earlier spring snowmelt peak flows.
    Climate change projections have been reported for 14 countries in Europe, with results for extreme precipitation, extreme streamflow or both. The review shows various approaches to producing climate projections of extreme precipitation and flood frequency, based on alternative climate forcing scenarios, climate projections from available global and regional climate models, methods for statistical downscaling and bias correction, and alternative hydrological models. Many of the reported studies use an ensemble modelling approach with several climate forcing scenarios and climate model projections in order to address the uncertainty in the projections of extreme precipitation and flood frequency. Some studies also include alternative statistical downscaling and bias correction methods and hydrological modelling approaches. Most studies reviewed indicate an increase in extreme precipitation under a future climate, which is consistent with the observed trends. Hydrological projections of peak flows and flood frequency show both positive and negative changes: large increases in peak flows are reported for some catchments with rainfall-dominated peak flows, whereas a general decrease in flood magnitude and earlier spring floods are reported for catchments with snowmelt-dominated peak flows, consistent with the observed trends.
    The review of existing European guidelines on design floods and design rainfalls shows that only a few countries explicitly address climate change. These guidelines are based on climate change adjustment factors to be applied to current design estimates, which may depend on the design return period and projection horizon. The review thus indicates a gap between the need to consider climate change impacts in design and the published guidelines that actually incorporate climate change in extreme precipitation and flood frequency. Most of the reported studies are based on frequency analysis assuming stationary conditions within a certain time window (typically 30 years) representing the current or future climate. There is a need to develop more consistent non-stationary frequency analysis methods that can account for the transient nature of a changing climate.
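
    Since the non-parametric Mann-Kendall test is the workhorse of the trend studies reviewed, a minimal sketch may be useful. This version uses the standard normal approximation and omits the correction for ties; the sample series is invented.

```python
import math

def mann_kendall(series):
    """Return (S, Z): the Mann-Kendall statistic and its normal score."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0    # variance without ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)          # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Invented annual maximum flows (m3/s) with a mild upward drift.
flows = [310, 295, 330, 340, 325, 360, 355, 372, 365, 390]
s, z = mann_kendall(flows)
print(f"S = {s}, Z = {z:.2f}  (|Z| > 1.96: significant trend at the 5% level)")
```

    In the studies reviewed, this test is applied to individual station records; field-significance tests then ask whether the number of significant stations across a region exceeds what chance alone would produce.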

    The EnTrak system : supporting energy action planning via the Internet

    Recent energy policy is designed to foster better energy efficiency and assist with the deployment of clean energy systems, especially those derived from renewable energy sources. Attaining the envisaged targets will require action at all levels and effective collaboration between disparate groups (e.g. policy makers, developers, local authorities, energy managers, building designers, consumers, etc.) impacting on energy and environment. To support such actions and collaborations, an Internet-enabled energy information system called 'EnTrak' was developed. The aim was to provide decision-makers with information on energy demands, supplies and impacts by sector, time, fuel type and so on, in support of energy action plan formulation and enactment. This paper describes the structure and capabilities of the EnTrak system.
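
    As a rough illustration of the kind of query such a system supports (the record layout and figures are invented, not EnTrak's actual data model), demand can be aggregated along any of the dimensions mentioned above:

```python
from collections import defaultdict

# Invented records: one entry per sector/fuel/month combination.
records = [
    {"sector": "domestic",   "fuel": "gas",      "month": "2024-01", "kwh": 1200.0},
    {"sector": "domestic",   "fuel": "electric", "month": "2024-01", "kwh":  450.0},
    {"sector": "commercial", "fuel": "gas",      "month": "2024-01", "kwh": 5200.0},
]

def demand_by(records, key):
    """Total energy demand grouped by an arbitrary record field."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[key]] += rec["kwh"]
    return dict(totals)

print(demand_by(records, "sector"))  # {'domestic': 1650.0, 'commercial': 5200.0}
print(demand_by(records, "fuel"))    # {'gas': 6400.0, 'electric': 450.0}
```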

    Parallel and Distributed Simulation from Many Cores to the Public Cloud (Extended Version)

    In this tutorial paper, we first review some basic simulation concepts and then introduce parallel and distributed simulation techniques in view of the new challenges of today and tomorrow. In particular, recent years have seen a wide diffusion of many-core architectures, and we can expect this trend to continue. At the same time, the success of cloud computing is strongly promoting the 'everything as a service' paradigm. Is parallel and distributed simulation ready for these new challenges? Current approaches present many limitations in terms of usability and adaptivity: there is a strong need for new evaluation metrics and for revising the currently implemented mechanisms. In the last part of the paper, we propose a new approach based on multi-agent systems for the simulation of complex systems. It is possible to implement advanced techniques, such as the migration of simulated entities, in order to build mechanisms that are both adaptive and very easy to use. Adaptive mechanisms can significantly reduce the communication cost in parallel/distributed architectures, implement load-balancing techniques, and cope with execution environments that are both variable and dynamic. Finally, such mechanisms will be used to build simulations on top of unreliable cloud services.
    Comment: Tutorial paper published in the Proceedings of the International Conference on High Performance Computing and Simulation (HPCS 2011), Istanbul (Turkey), IEEE, July 2011. ISBN 978-1-61284-382-
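
    The entity-migration idea can be sketched as a simple greedy heuristic: move each simulated entity to the partition it exchanges the most messages with, subject to a load cap. This toy version illustrates the general approach, not the mechanism implemented in the paper; names and thresholds are invented.

```python
from collections import Counter

def group_peers(interactions):
    """Map each entity to the list of peers it has messaged."""
    peers = {}
    for a, b in interactions:
        peers.setdefault(a, []).append(b)
    return peers

def migrate(partition, interactions, max_load):
    """Greedily move entities toward their main communication partners."""
    loads = Counter(partition.values())
    for ent, peers in group_peers(interactions).items():
        # Partition this entity talks to most often.
        target = Counter(partition[p] for p in peers).most_common(1)[0][0]
        here = partition[ent]
        if target != here and loads[target] < max_load:
            partition[ent] = target      # migration cuts remote messages
            loads[here] -= 1
            loads[target] += 1
    return partition

parts = {"e1": 0, "e2": 1, "e3": 1}
msgs = [("e1", "e2"), ("e1", "e3"), ("e1", "e2")]
print(migrate(parts, msgs, max_load=3))  # e1 moves to partition 1
```

    Run periodically, a policy of this shape adapts entity placement to the observed communication pattern, which is what makes such mechanisms suitable for variable and dynamic execution environments.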

    Oscillations, metastability and phase transitions in brain and models of cognition

    Neuroscience is being practiced in many different forms and at many different organizational levels of the nervous system. Which of these levels, and which associated conceptual frameworks, is most informative for elucidating the association of neural processes with processes of cognition is an empirical question and subject to pragmatic validation. In this essay, I select the framework of Dynamic System Theory. In recent years, several investigators have applied tools and concepts of this theory to the interpretation of observational data and to the design of neuronal models of cognitive functions. I will first trace the essentials of the conceptual developments and hypotheses separately, in order to discern observational tests and criteria for the functional realism and conceptual plausibility of the alternatives they offer. I will then show that the statistical mechanics of phase transitions in brain activity, and some of its models, provide a new and possibly revealing perspective on brain events in cognition.