
    Load Balancing via Random Local Search in Closed and Open systems

    In this paper, we analyze the performance of random load resampling and migration strategies in parallel server systems. Clients initially attach to an arbitrary server, but may independently switch servers at random instants of time in an attempt to improve their service rate. This approach to load balancing contrasts with traditional approaches where clients make smart server selections upon arrival (e.g., the Join-the-Shortest-Queue policy and variants thereof). Load resampling is particularly relevant in scenarios where clients cannot predict the load of a server before actually being attached to it. An important example is wireless spectrum sharing, where clients try to share a set of frequency bands in a distributed manner. Comment: Accepted to Sigmetrics 201
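    The resampling rule above is straightforward to simulate. The sketch below is a hypothetical illustration, not the paper's code: it assumes equal-capacity servers, unit-rate exponential migration clocks per client (so the next client to wake up is uniform at random), and that a migration improves a client's service rate exactly when the sampled server would be strictly less loaded after the move.

```python
# Hypothetical simulation sketch of random load resampling (not the paper's code).
# Assumptions: equal-capacity servers, unit-rate exponential migration clocks per
# client, and a client's service rate proportional to 1 / (load of its server).
import random

def simulate_resampling(num_clients=200, num_servers=20, steps=10_000, seed=0):
    rng = random.Random(seed)
    # Clients initially attach to an arbitrary (here: uniformly random) server.
    assignment = [rng.randrange(num_servers) for _ in range(num_clients)]
    load = [0] * num_servers
    for s in assignment:
        load[s] += 1
    for _ in range(steps):
        # With i.i.d. rate-1 clocks, the next client to wake up is uniform.
        c = rng.randrange(num_clients)
        current = assignment[c]
        candidate = rng.randrange(num_servers)
        # Migrate only if the sampled server would yield a better service rate,
        # i.e. its load after joining is smaller than the current server's load.
        if load[candidate] + 1 < load[current]:
            load[current] -= 1
            load[candidate] += 1
            assignment[c] = candidate
    return max(load) - min(load)

if __name__ == "__main__":
    print("final load imbalance:", simulate_resampling())
```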

    Tight Load Balancing via Randomized Local Search

    We consider the following balls-into-bins process with $n$ bins and $m$ balls: each ball is equipped with a mutually independent exponential clock of rate 1. Whenever a ball's clock rings, the ball samples a random bin and moves there if the number of balls in the sampled bin is smaller than in its current bin. This simple process models a typical load balancing problem where users (balls) seek a selfish improvement of their assignment to resources (bins). From a game-theoretic perspective, this is a randomized approach to the well-known Koutsoupias-Papadimitriou model, while it is known as randomized local search (RLS) in the load balancing literature. Up to now, the best bound on the expected time to reach perfect balance was $O\left((\ln n)^2+\ln(n)\cdot n^2/m\right)$ due to Ganesh, Lilienthal, Manjunath, Proutiere, and Simatos (Load balancing via random local search in closed and open systems, Queueing Systems, 2012). We improve this to an asymptotically tight $O\left(\ln(n)+n^2/m\right)$. Our analysis is based on the crucial observation that performing "destructive moves" (reversals of RLS moves) cannot decrease the balancing time. This allows us to simplify problem instances and to ignore "inconvenient moves" in the analysis. Comment: 24 pages, 3 figures, preliminary version appeared in proceedings of the 2017 IEEE International Parallel and Distributed Processing Symposium (IPDPS'17)
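    A minimal sketch of the RLS process as stated above, written as the obvious discrete-event simulation; the parameter values and the stopping criterion (perfect balance read as a max-min load gap of at most one ball) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the randomized local search (RLS) balls-into-bins process.
# Parameter values and the stopping criterion are assumptions for illustration.
import random

def rls_balancing_time(n=50, m=500, seed=0):
    rng = random.Random(seed)
    bins = [0] * n
    balls = [rng.randrange(n) for _ in range(m)]   # arbitrary initial assignment
    for b in balls:
        bins[b] += 1
    t = 0.0
    while max(bins) - min(bins) > 1:               # not yet perfectly balanced
        # m independent rate-1 exponential clocks: their superposition rings at
        # rate m, and the ringing ball is uniform among the m balls.
        t += rng.expovariate(m)
        i = rng.randrange(m)
        src = balls[i]
        dst = rng.randrange(n)                     # sample a random bin
        if bins[dst] < bins[src]:                  # move only if strictly less loaded
            bins[src] -= 1
            bins[dst] += 1
            balls[i] = dst
    return t

if __name__ == "__main__":
    print("time to reach perfect balance:", rls_balancing_time())
```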

    A Statistical Mechanical Load Balancer for the Web

    The maximum entropy principle from statistical mechanics states that a closed system attains an equilibrium distribution that maximizes its entropy. We first show that for graphs with a fixed number of edges one can define a stochastic edge dynamic that can serve as an effective thermalization scheme, and hence the underlying graphs are expected to attain their maximum-entropy states, which turn out to be Erdős-Rényi (ER) random graphs. We next show that (i) a rate-equation based analysis of the node degree distribution does indeed confirm the maximum-entropy principle, and (ii) the edge dynamic can be effectively implemented using short random walks on the underlying graphs, leading to a local algorithm for the generation of ER random graphs. The resulting statistical mechanical system can be adapted to provide a distributed and local (i.e., without any centralized monitoring) mechanism for load balancing, which can have a significant impact in increasing the efficiency and utilization of both the Internet (e.g., efficient web mirroring) and large-scale computing infrastructure (e.g., cluster and grid computing). Comment: 11 pages, 5 PostScript figures; added references, expanded on protocol discussion
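    For illustration only, the sketch below shows one plausible local edge dynamic of the kind described above: a uniformly random edge has one endpoint rewired to the node reached by a short random walk, keeping the total number of edges fixed. The walk length, the rewiring rule, and the choice of initial graph are assumptions for the sketch, not the paper's protocol.

```python
# Illustrative sketch of an edge-rewiring dynamic driven by short random walks.
# Walk length and rewiring rule are assumptions, not the paper's exact protocol.
import random

def rewire_step(adj, rng, walk_len=3):
    """One local move on an undirected graph stored as {node: set(neighbors)}."""
    # Pick a uniformly random edge (u, v).
    edges = [(u, v) for u in adj for v in adj[u] if u < v]
    u, v = rng.choice(edges)
    # Take a short random walk from u to find a candidate new endpoint w.
    w = u
    for _ in range(walk_len):
        w = rng.choice(sorted(adj[w]))
    # Rewire (u, v) -> (u, w) if the graph stays simple (no loops or multi-edges).
    if w != u and w != v and w not in adj[u]:
        adj[u].discard(v)
        adj[v].discard(u)
        adj[u].add(w)
        adj[w].add(u)

def ring_lattice(n, k):
    """Deterministic start: each node i connects to i+1, ..., i+k (mod n)."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    return adj

if __name__ == "__main__":
    rng = random.Random(0)
    adj = ring_lattice(n=100, k=3)     # 300 edges, every degree exactly 6
    for _ in range(20_000):
        rewire_step(adj, rng)          # edge count stays fixed at 300
    degrees = sorted(len(s) for s in adj.values())
    print("min/median/max degree after rewiring:",
          degrees[0], degrees[len(degrees) // 2], degrees[-1])
```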

    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years involving Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. For future networks to overcome the limitations of current cellular systems, it is clear that more intelligence needs to be deployed, so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Networks (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also a classification of each paper in terms of its learning solution, together with examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, a comparison of the most commonly found ML algorithms in terms of certain SON metrics is performed, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work also provides future research directions and new paradigms that the use of more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.