6 research outputs found

    Resolving Braess's Paradox in Random Networks

    Braess's paradox states that removing a part of a network may improve the players' latency at equilibrium. In this work, we study the approximability of the best subnetwork problem for the class of random G(n,p) instances proven prone to Braess's paradox by Valiant and Roughgarden (Random Struct. Algorithms 37(4):495–515, 2010), Chung and Young (WINE '10, LNCS 6484:194–208, 2010), and Chung et al. (Random Struct. Algorithms 41(4):451–468, 2012). Our main contribution is a polynomial-time approximation-preserving reduction of the best subnetwork problem for such instances to the corresponding problem in a simplified network where all neighbors of the source s and the destination t are directly connected by zero-latency edges. Building on this, we consider two cases: when the total traffic rate r is sufficiently low, and when it is sufficiently high. In the first case of low r = O(n+), where n+ denotes the maximum degree of {s, t}, we obtain an approximation scheme that, for any constant ε > 0 and with high probability, computes a subnetwork and an ε-Nash flow with maximum latency at most (1 + ε)L* + ε, where L* is the equilibrium latency of the best subnetwork. Our approximation scheme runs in polynomial time if the random network has average degree O(poly(ln n)) and the traffic rate is O(poly(ln ln n)), and in quasipolynomial time for average degrees up to o(n) and traffic rates of O(poly(ln n)). Finally, in the second case of high r = Ω(n+), we compute in strongly polynomial time a subnetwork and an ε-Nash flow with maximum latency at most (1 + 2ε + o(1))L*.
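
    For intuition, here is a minimal sketch (in Python; it is the textbook four-node network, not an example from the paper) of the paradox the abstract refers to: with latency l(x) = x on two edges, constant latency 1 on two others, and a zero-latency shortcut, removing the shortcut lowers the equilibrium latency from 2 to 3/2 at unit traffic rate.

        # Classic Braess network: s->v has latency x, v->t latency 1,
        # s->w latency 1, w->t latency x, plus a zero-latency shortcut v->w.
        # One unit of nonatomic (infinitesimal) traffic travels from s to t.

        def eq_latency_with_shortcut():
            # The path s->v->w->t is never worse than the two-edge paths,
            # so at equilibrium all traffic takes it and both variable
            # edges carry the full unit of flow.
            x = 1.0
            return x + 0.0 + x      # = 2.0

        def eq_latency_without_shortcut():
            # By symmetry the flow splits evenly over s->v->t and s->w->t.
            x = 0.5
            return x + 1.0          # = 1.5

        print(eq_latency_with_shortcut(), eq_latency_without_shortcut())
        # 2.0 1.5 -- deleting the zero-latency edge improves equilibrium latency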

    Induced Litigation

    If justice delayed is justice denied, justice is often denied in American courts. Delay in the courts is a ceaseless and unremitting problem of modern civil justice that has an irreparable effect on both plaintiffs and defendants. To combat this seemingly intractable problem, judges and court administrators routinely clamor for additional judicial resources to enable them to manage their dockets more effectively and efficiently. By building new courthouses and adding new judgeships, a court should be able to manage its caseload more efficiently. Trial judges should be able to hold motion hearings, host settlement conferences, and conduct trials in a timely fashion; appellate judges should be able to review briefs, hear oral arguments, and issue opinions expeditiously; and the crippling delay that often hobbles litigants and lawyers should give way to the speedy (or at least speedier) resolution of disputes.

    Crowd dynamics

    Crowd dynamics are complex. This thesis examines the nature of the crowd and its dynamics, with specific reference to issues of crowd safety. A model (Legion) was developed that simulates the crowd as an emergent phenomenon using simulated annealing and mobile cellular automata. We outline the elements of that model, based on the interaction of four parameters: Objective, Motility, Constraint and Assimilation. The model treats every entity as an individual and can simulate how people read and react to their environment under a variety of conditions, which allows the user to study a wide range of crowd dynamics in different geometries and highlights the interactions of the crowd with its environment. We demonstrate that the model runs in polynomial time and can be used to assess the limits of crowd safety during normal and emergency egress. Over the last ten years there have been many crowd-related disasters. We highlight deficiencies in the existing guidelines relating to crowds, compare and contrast the model with the safety guidelines, and identify specific areas where the guides may be improved. We demonstrate that the model is capable of reproducing known crowd dynamics without additional parameters, satisfying Occam's razor. The model is tested against crowd dynamics known from field studies, including Wembley Stadium, Balham Station and the Hong Kong Jockey Club. We propose an alternative approach to assessing the dynamics of the crowd through the simulation and analysis of least-effort behaviour. Finally, we test the model in a variety of applications where crowd-related incidents warrant structural alterations at client sites. We demonstrate that the model explains the variance in a variety of field measurements, that it is robust, and that it can be applied to future designs where safety, crowd comfort and cost savings are design criteria.
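
    Legion itself is proprietary and the abstract does not publish its update rules, but a generic mobile-cellular-automaton step with a least-effort rule of the kind described might look like the following sketch (the grid, the distance-to-exit field, and the random tie-breaking are all assumptions made for illustration).

        import random

        # Move each agent to the free neighbouring cell that most reduces a
        # precomputed distance-to-exit field: a least-effort choice.
        MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]

        def step(agents, dist, occupied):
            """agents: list of (x, y) cells; dist: dict cell -> distance to exit
            (defined on every walkable cell); occupied: set of occupied cells."""
            for i, (x, y) in enumerate(agents):
                # Candidates: stay put, or move to a free walkable neighbour.
                options = [(x, y)] + [
                    (x + dx, y + dy) for dx, dy in MOVES
                    if (x + dx, y + dy) in dist
                    and (x + dx, y + dy) not in occupied
                ]
                # Least effort with random tie-breaking.
                best = min(options, key=lambda c: (dist[c], random.random()))
                occupied.discard((x, y))
                occupied.add(best)
                agents[i] = best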

    Optimization of urban traffic control strategies by a network design model

    The efficiency of congested urban transportation networks can be improved by implementing appropriate traffic control strategies, such as signal timing, turning movement control, one-way traffic policies, and lane distribution controls. In this dissertation, the following strategies are addressed: 1) intersection left-turn addition/deletion, 2) lane designation, and 3) signal optimization. The analogy between the network design problem (NDP) and the optimization of traffic control strategies motivated the formulation of an urban transportation network design problem (UTNDP) to optimize traffic control strategies. The UTNDP is a typical bilevel program, where the lower-level problem is a user equilibrium (UE) traffic assignment problem and the upper-level problem is a 0-1 integer program. The upper level of the UTNDP model represents the choices of the transportation authority; the lower level captures the travelers' behavior. The objective of the UTNDP is to minimize the total UE travel time. In this dissertation, a realistic travel time estimation procedure based on the 1997 HCM, which takes into account the effects of the above factors, is proposed. The UTNDP is solved through a hybrid simulated annealing-tabu heuristic search strategy developed specifically for this problem. Tabu lists are used to avoid cycling, and the simulated annealing step is used to select moves such that an annealing equilibrium state is achieved, so that a reasonably good solution is guaranteed. Computational experiments are conducted on four test networks to demonstrate the feasibility and effectiveness of the UTNDP search strategy. Sensitivity analyses are also conducted on the tabu list length, the Markov chain increase rate, the control parameter dropping rate, and the weight coefficients of the HEF, which is composed of the current link v/c ratio, a historical contribution factor, and a random factor.
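
    As an illustration of the hybrid search described above (the function names and parameters below are placeholders, not the dissertation's actual interface), a simulated annealing loop with a tabu list over 0-1 design vectors could look like this:

        import math
        import random
        from collections import deque

        def sa_tabu_search(evaluate, x0, flip, n_bits,
                           t0=1.0, cooling=0.95, tabu_len=7, iters=2000):
            """evaluate(x): total UE travel time of design x (a black box that
            would solve the lower-level UE assignment); flip(x, i): copy of x
            with decision bit i flipped. Assumes n_bits > tabu_len."""
            x, fx = x0, evaluate(x0)
            best, fbest = x, fx
            tabu = deque(maxlen=tabu_len)   # recently flipped bits are tabu
            t = t0
            for _ in range(iters):
                candidates = [i for i in range(n_bits) if i not in tabu]
                i = random.choice(candidates)
                y = flip(x, i)
                fy = evaluate(y)
                # Metropolis rule: accept improvements outright, and
                # worsening moves with probability exp(-delta / t).
                if fy <= fx or random.random() < math.exp((fx - fy) / t):
                    x, fx = y, fy
                    tabu.append(i)          # forbid undoing this move for a while
                if fx < fbest:
                    best, fbest = x, fx
                t *= cooling                # geometric cooling schedule
            return best, fbest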

    Dynamic traffic congestion pricing mechanism with user-centric considerations

    Thesis: S.M. in Transportation, Massachusetts Institute of Technology, Department of Civil and Environmental Engineering, 2013. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 85-95). In this thesis, we consider the problem of designing real-time traffic routing systems in urban areas. Optimal dynamic routing for multiple passengers is known to be computationally hard due to its combinatorial nature. To overcome this difficulty, we propose a novel mechanism called User-Centric Dynamic Pricing (UCDP), based on recent advances in algorithmic mechanism design. The mechanism allows for congestion-free traffic in general road networks with heterogeneous users while satisfying each user's travel preference. The mechanism first informs a passenger whether he should use public transportation or the road network. In the latter case, the passenger reports his maximum accepted travel time, subject to a lower bound announced publicly by the road authority. The mechanism then assigns the passenger a path that matches his preference given the current traffic conditions in the network. The proposed mechanism introduces a fairness-constrained shortest path (FCSP) problem with a special structure, enabling polynomial-time computation of a path allocation that maximizes the sequential social surplus and guarantees fairness among passengers. The tolls of paths are then computed according to marginal-cost payments. We show that reporting one's true preference is a weakly dominant strategy. The performance of the proposed mechanism is demonstrated on several simulated routing experiments in comparison to user equilibrium and system optimum. By Kim Thien Bui. S.M. in Transportation.
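
    For the marginal-cost payments mentioned above, the standard congestion-pricing rule charges each edge its flow times the derivative of its latency, tau_e = x_e * l_e'(x_e). A small sketch with a BPR-style latency function (the function and its parameters are illustrative assumptions; the thesis computes per-passenger payments from the FCSP allocation, which is more involved):

        # BPR latency: l(x) = t0 * (1 + a * (x / c)^b), a common volume-delay form.
        def bpr_latency(free_flow_time, capacity, x, a=0.15, b=4):
            return free_flow_time * (1 + a * (x / capacity) ** b)

        def marginal_cost_toll(free_flow_time, capacity, x, a=0.15, b=4):
            # l'(x) = t0 * a * b * x^(b-1) / c^b, so the toll is x * l'(x).
            dldx = free_flow_time * a * b * x ** (b - 1) / capacity ** b
            return x * dldx

        # Example: an edge at 80% of capacity.
        print(marginal_cost_toll(free_flow_time=10.0, capacity=100.0, x=80.0))
        # ~2.46 time units of toll on top of the ~10.6 of latency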

    Strategic algorithms

    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009. Cataloged from the PDF version of the thesis. Includes bibliographical references (p. 193-201). Classical algorithms from theoretical computer science arise time and again in practice. However, practical situations typically do not fit precisely into the traditional theoretical models; additional necessary components are, for example, uncertainty and economic incentives. Therefore, modern algorithm design calls for more interdisciplinary approaches, as well as for deeper theoretical understanding, so that algorithms can apply to more realistic settings and complex systems. Consider, for instance, the classical shortest path algorithm, which, given a graph with specified edge weights, seeks the path minimizing the total weight from a source to a destination. In practice, the edge weights are often uncertain, and it is not even clear what we mean by a shortest path anymore: is it the path that minimizes the expected weight? Its variance? Some other metric? With a risk-averse objective function that takes into account both mean and standard deviation, we run into nonconvex optimization challenges that require new theory beyond classical shortest path algorithm design. Yet another shortest path application, routing of packets in the Internet, needs to further incorporate economic incentives to reflect the various business relationships among the Internet service providers that affect the choice of packet routes. Strategic algorithms are algorithms that integrate optimization, uncertainty and economic modeling into algorithm design, with the goal of bringing about new theoretical developments and solving practical applications arising in complex computational-economic systems. In short, this thesis contributes new algorithms and their underlying theory at the interface of optimization, uncertainty and economics. Although the interplay of these disciplines is present in various forms in our work, for the sake of presentation we have divided the material into three categories. 1. In Part I we investigate algorithms at the intersection of optimization and uncertainty. The key conceptual contribution in this part is the discovery of a novel connection between stochastic and nonconvex optimization. Traditional algorithm design has not taken into account the risk inherent in stochastic optimization problems. We consider natural objectives that incorporate risk, which turn out to be equivalent to certain nonconvex problems from the realm of continuous optimization. As a result, our work advances the state of the art in both stochastic and nonconvex optimization, presenting new complexity results and proposing general-purpose efficient approximation algorithms, some of which have shown promising practical performance and have been implemented in a real traffic prediction and navigation system. 2. Part II proposes new algorithm and mechanism designs at the intersection of uncertainty and economics. In Part I we postulate that the random variables in our models come from given distributions; however, determining those distributions or their parameters is a challenging and fundamental problem in itself. A tool from economics that has recently gained momentum for measuring the probability distribution of a random variable is an information or prediction market. Such markets, most popularly known for predicting the outcomes of political elections or other events of interest, have shown remarkable accuracy in practice, though they have left open the theoretical and strategic analysis of current implementations, as well as the need for new and improved designs that handle more complex outcome spaces (probability distribution functions) as opposed to binary or n-ary valued distributions. The contributions of this part include a unified strategic analysis of different prediction market designs that have been implemented in practice. We also offer new market designs for handling exponentially large outcome spaces stemming from ranking or permutation-type outcomes, together with algorithmic and complexity analysis. 3. In Part III we consider the interplay of optimization and economics in the context of network routing. This part is motivated by the network of autonomous systems in the Internet, where each portion of the network is controlled by an Internet service provider, namely a self-interested economic agent. Business incentives do not exist merely in addition to the computer protocols governing the network: although they are not currently integrated into those protocols and are decided largely via private contracting and negotiations, these economic considerations are a principal factor determining how packets are routed; and vice versa, the demand and flow of network traffic fundamentally affect provider contracts and prices. The contributions of this part are the design and analysis of economic mechanisms for network routing. The mechanisms are based on first- and second-price auctions (the so-called Vickrey-Clarke-Groves, or VCG, mechanisms). We first analyze the equilibria and prices resulting from these mechanisms. We then investigate the compatibility of the better-understood VCG mechanisms with current inter-domain routing protocols, and we demonstrate the critical importance of correct modeling and how it affects the complexity and algorithms necessary to implement the economic mechanisms. By Evdokia Velinova Nikolova. Ph.D.
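
    To make the risk-averse shortest path objective from Part I concrete: with independent edge delays, path variances add, and minimizing mean + lambda * stddev is nonconvex in the (mean, variance) plane, so plain Dijkstra does not apply. The brute-force sketch below (illustrative only; the thesis develops far more efficient algorithms) enumerates simple paths on a toy graph:

        # Minimize mean + lam * sqrt(variance) over s-t paths by enumeration.
        def simple_paths(graph, s, t, seen=()):
            if s == t:
                yield (t,)
                return
            for v in graph[s]:
                if v not in seen:
                    for rest in simple_paths(graph, v, t, seen + (s,)):
                        yield (s,) + rest

        def risk_averse_path(graph, mean, var, s, t, lam=1.0):
            def cost(p):
                edges = list(zip(p, p[1:]))
                m = sum(mean[e] for e in edges)
                v = sum(var[e] for e in edges)   # independence: variances add
                return m + lam * v ** 0.5
            return min(simple_paths(graph, s, t), key=cost)

        graph = {"s": ["a", "b"], "a": ["t"], "b": ["t"], "t": []}
        mean = {("s", "a"): 2, ("a", "t"): 2, ("s", "b"): 1, ("b", "t"): 1}
        var  = {("s", "a"): 0, ("a", "t"): 0, ("s", "b"): 9, ("b", "t"): 9}
        print(risk_averse_path(graph, mean, var, "s", "t", lam=1.0))
        # ('s', 'a', 't'): cost 4 + 0, beating the cheaper-in-expectation but
        # riskier ('s', 'b', 't') at 2 + sqrt(18) ~ 6.24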