
    Not Always Sparse: Flooding Time in Partially Connected Mobile Ad Hoc Networks

In this paper we study mobile ad hoc wireless networks using the notion of evolving connectivity graphs. In such systems, connectivity changes over time due to the intermittent contacts of mobile terminals. In particular, we are interested in the expected flooding time when full connectivity cannot be ensured at every point in time. Even then, due to finite contact durations, connected components may appear in the connectivity graph. This represents the intermediate case between the two extremes of fully mobile and fully static ad hoc networks. Using a generalization of edge-Markovian graphs, we extend existing models based on sparse scenarios to this intermediate case and calculate the expected flooding time. We also propose bounds with reduced computational complexity. Finally, numerical results validate our models.
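To make the edge-Markovian setting concrete, here is a minimal Monte Carlo sketch (not the paper's generalized model): every potential edge flips on with probability `p_up` and off with probability `p_down` at each step, and flooding spreads from node 0 over edges currently present. All parameter values are illustrative assumptions.

```python
import random

def flood_time(n, p_up, p_down, seed=None, max_steps=10_000):
    """Steps until all n nodes are informed, flooding from node 0
    over an edge-Markovian dynamic graph."""
    rng = random.Random(seed)
    # start each edge in its stationary distribution
    p_stat = p_up / (p_up + p_down) if (p_up + p_down) > 0 else 0.0
    edges = {(i, j): rng.random() < p_stat
             for i in range(n) for j in range(i + 1, n)}
    informed = {0}
    for step in range(1, max_steps + 1):
        # evolve every edge by one Markov step
        for e, up in edges.items():
            edges[e] = (rng.random() >= p_down) if up else (rng.random() < p_up)
        # informed nodes inform neighbors over currently-present edges
        new = set(informed)
        for (i, j), up in edges.items():
            if up and ((i in informed) ^ (j in informed)):
                new.update((i, j))
        informed = new
        if len(informed) == n:
            return step
    return max_steps

# Monte Carlo estimate of the expected flooding time for an
# intermediate regime: edges exist ~29% of the time at stationarity
times = [flood_time(20, 0.2, 0.5, seed=s) for s in range(50)]
avg = sum(times) / len(times)
```

With `p_up` low and `p_down` high the instantaneous graph is sparse but not empty, which is exactly the partially connected regime the abstract describes.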

    Estimation of indirect cost and evaluation of protective measures for infrastructure vulnerability: A case study on the transalpine transport corridor

Infrastructure vulnerability is a topic of rising interest in the scientific literature, owing both to the general increase in unexpected events and to the strategic importance of certain links. Protective investments are extremely costly, and risks are distributed in space and time, which poses important decision problems for public sector decision makers. From an economic perspective, the evaluation of infrastructure vulnerability centers on the estimation of the direct and indirect costs of hazards. While the estimation of direct costs is straightforward, the evaluation of indirect costs involves factors that are not directly observable, making their approximation difficult. This paper provides an estimate of the indirect costs caused by a two-week closure of the north-south Gotthard road corridor, one of the most important infrastructure links in Europe, and implements a cost-benefit analysis tool for evaluating measures that ensure full protection along the corridor. The identification of the indirect cost relies on generalized cost estimation, whose parameters come from two stated preference experiments: the first based on actual conditions, the second assuming a road closure. The procedure outlined in this paper proposes a methodology to identify and quantify the economic vulnerability associated with a road transport infrastructure and to evaluate the economic and social efficiency of vulnerability reduction through protective measures.
Keywords: infrastructure vulnerability, choice experiment, cost-benefit analysis, freight transport
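The logic of the indirect-cost estimate can be sketched in a few lines: compute the generalized cost of a shipment under normal conditions and under the closure-induced detour, and aggregate the difference over the closure period. All numbers below are hypothetical placeholders, not the paper's estimates.

```python
def generalized_cost(time_h, distance_km, vot, vod):
    """Generalized transport cost: value-of-time (per hour) times travel
    time plus a per-km operating cost times distance."""
    return vot * time_h + vod * distance_km

# Hypothetical parameters for one freight shipment (NOT the paper's
# stated-preference estimates): vot in EUR/h, vod in EUR/km
normal = generalized_cost(time_h=4.0, distance_km=280, vot=45.0, vod=1.2)
detour = generalized_cost(time_h=6.5, distance_km=410, vot=45.0, vod=1.2)
extra_per_shipment = detour - normal   # indirect cost per shipment

# Aggregate over an assumed corridor freight volume and a two-week closure
shipments_per_day = 3000
closure_days = 14
indirect_cost = extra_per_shipment * shipments_per_day * closure_days
```

In the paper, the parameters of the generalized cost function come from the two stated preference experiments (normal conditions vs. closure), which is what lets the unobservable indirect component be quantified.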

    Accounting for WTP/WTA discrepancy in discrete choice models: Discussion of policy implications based on two freight transport stated choice experiments

A key input to cost-benefit analysis is the marginal rate of substitution, which expresses the willingness to pay, or its counterpart willingness to accept, for both market and non-market goods. The consistent discrepancy between these two measures observed in the literature suggests the need to estimate reference-dependent models capable of capturing loss aversion by distinguishing the value attached to a gain from the value attached to a loss, in line with reference-dependent theory. This paper compares willingness-to-pay and willingness-to-accept measures estimated from models with both symmetric and reference-dependent utility specifications within two different freight transport stated choice experiments. The results show that the reference-dependent specification outperforms the symmetric one, and they prove the robustness of the reference-dependent specification across datasets designed with different attribute-level ranges. Moreover, we demonstrate the policy relevance of asymmetric specifications by illustrating their strong implications for cost-benefit analysis in two case studies.
Keywords: WTP/WTA discrepancy, freight choice, policy evaluation
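A reference-dependent (piecewise-linear) utility specification can be sketched as follows. The coefficients below are illustrative assumptions, not the paper's estimates; the point is only that separate gain/loss coefficients mechanically produce WTP < WTA.

```python
def asym_utility(delta, beta_gain, beta_loss):
    """Reference-dependent utility of a change `delta` relative to the
    reference alternative; loss aversion means beta_loss > beta_gain."""
    return beta_gain * delta if delta >= 0 else beta_loss * delta

# Hypothetical marginal utilities for travel-time and cost changes
bt_gain, bt_loss = 0.8, 2.0   # a time saving vs. a time increase
bc_gain, bc_loss = 1.0, 2.5   # a cost saving vs. a cost increase

# WTP for a time saving trades a time gain against a cost loss;
# WTA for a time increase trades a time loss against a cost gain
wtp = bt_gain / bc_loss
wta = bt_loss / bc_gain
```

A symmetric specification forces `bt_gain == bt_loss` and `bc_gain == bc_loss`, collapsing WTP and WTA to the same value; the asymmetric model lets the data reveal the discrepancy instead of averaging it away.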

    Multi-Path Alpha-Fair Resource Allocation at Scale in Distributed Software Defined Networks

The performance of computer networks relies on how bandwidth is shared among different flows. Fair resource allocation is a challenging problem, particularly when flows evolve over time. To address this issue, bandwidth sharing techniques that react quickly to traffic fluctuations are of interest, especially in large-scale settings with hundreds of nodes and thousands of flows. In this context, we propose a distributed algorithm based on the Alternating Direction Method of Multipliers (ADMM) that tackles the multi-path fair resource allocation problem in a distributed SDN control architecture. Our ADMM-based algorithm continuously generates a sequence of resource allocation solutions converging to the fair allocation while always remaining feasible, a property that standard primal-dual decomposition methods often lack. Thanks to the distribution of all compute-intensive operations, we demonstrate that we can handle large instances at scale.
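The alpha-fairness objective underlying such allocators has a simple closed form on a single link, which is worth seeing before the multi-path, distributed ADMM machinery. This sketch is not the paper's algorithm, only the single-link special case it generalizes.

```python
def alpha_fair_share(weights, capacity, alpha):
    """Closed-form alpha-fair allocation on one link of given capacity.

    Maximizes sum_i w_i * x_i**(1-alpha) / (1-alpha) (for alpha=1 the
    objective is sum_i w_i * log(x_i)) subject to sum_i x_i = capacity.
    The KKT conditions give x_i = (w_i / lam)**(1/alpha), and the
    capacity constraint pins down the multiplier lam.
    """
    assert alpha > 0
    s = sum(w ** (1.0 / alpha) for w in weights)
    return [capacity * (w ** (1.0 / alpha)) / s for w in weights]

# alpha=1 recovers weighted proportional fairness: a flow with weight 2
# gets twice the share of a weight-1 flow
x = alpha_fair_share([1.0, 1.0, 2.0], capacity=10.0, alpha=1.0)
```

Large `alpha` pushes the allocation toward max-min fairness; the paper's contribution is solving the multi-path, network-wide version of this problem with ADMM while every iterate stays inside the capacity region.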

    Reducing the Environmental Impact of Wireless Communication via Probabilistic Machine Learning

Machine learning methods are increasingly adopted in communications problems, particularly those arising in next-generation wireless settings. Though communications is seen as a key enabler of climate mitigation and societal adaptation, its energy consumption is high and, despite anticipated efficiency gains in 6G, is expected to grow in future networks because of exponential growth in communications traffic. To make a meaningful climate mitigation impact in the communications sector, a mindset shift is needed, away from maximizing throughput at all costs and towards prioritizing energy efficiency. Moreover, this shift must be adopted in both existing network infrastructure (without incurring further embodied carbon costs through equipment replacement) and future infrastructure, given the long development time of mobile generations. To that end, we present summaries of two such problems, from both current and next-generation network specifications, where probabilistic inference methods were used to great effect: using Bayesian parameter tuning, we safely reduce the energy consumption of existing hardware on a live communications network by 11% while maintaining operator-specified performance envelopes; through spatiotemporal Gaussian process surrogate modeling, we reduce the overhead in a next-generation hybrid beamforming system by over 60%, greatly improving the network's ability to target highly mobile users such as autonomous vehicles. The Bayesian paradigm is itself helpful in terms of energy usage, since training a Bayesian optimization model can require much less computation than, say, training a deep neural network.
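The Bayesian parameter tuning mentioned above follows the standard surrogate-model loop: fit a Gaussian process to the evaluations made so far, then pick the next parameter setting via an acquisition function. Below is a generic minimal sketch (not the authors' system) on a hypothetical one-dimensional "energy" objective, using a lower-confidence-bound acquisition.

```python
import numpy as np

def rbf(a, b, ls=0.2):
    """Squared-exponential kernel between 1-D point sets a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(xs, ys, grid, noise=1e-6):
    """GP posterior mean and std on `grid` given observations (xs, ys)."""
    K = rbf(xs, xs) + noise * np.eye(len(xs))
    Ks = rbf(grid, xs)
    mu = Ks @ np.linalg.solve(K, ys)
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.sum(Ks * v.T, axis=1)
    return mu, np.sqrt(np.clip(var, 0.0, None))

def bayes_opt(f, steps=15, seed=0):
    """Minimal BO loop: fit a GP surrogate, evaluate the minimizer of
    the lower confidence bound mu - 2*std, repeat."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, 201)
    xs = rng.uniform(0.0, 1.0, 3)          # a few random initial evaluations
    ys = np.array([f(x) for x in xs])
    for _ in range(steps):
        mu, sd = gp_posterior(xs, ys, grid)
        x_next = grid[np.argmin(mu - 2.0 * sd)]
        xs = np.append(xs, x_next)
        ys = np.append(ys, f(x_next))
    return xs[np.argmin(ys)]

# hypothetical energy-vs-parameter curve with its optimum at 0.7
best = bayes_opt(lambda x: (x - 0.7) ** 2)
```

Each outer iteration costs one evaluation of the expensive objective (here a toy quadratic; on a live network, an energy measurement), which is why a GP surrogate can be far cheaper than training a deep model for the same tuning task.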