11 research outputs found

    Crowd dynamics

    Crowd dynamics are complex. This thesis examines the nature of the crowd and its dynamics with specific reference to the issues of crowd safety. A model (Legion) was developed that simulates the crowd as an emergent phenomenon using simulated annealing and mobile cellular automata. We outline the elements of that model based on the interaction of four parameters: Objective, Motility, Constraint and Assimilation. The model treats every entity as an individual and can simulate how people read and react to their environment in a variety of conditions, which allows the user to study a wide range of crowd dynamics in different geometries and highlights the interactions of the crowd with their environment. We demonstrate that the model runs in polynomial time and can be used to assess the limits of crowd safety during normal and emergency egress. Over the last 10 years there have been many incidents of crowd-related disasters. We highlight deficiencies in the existing guidelines relating to crowds, compare and contrast the model with the safety guidelines, and highlight specific areas where the guides may be improved. We demonstrate that the model is capable of reproducing these dynamics without additional parameters, satisfying Occam's Razor. The model is tested against known crowd dynamics from field studies, including Wembley Stadium, Balham Station and the Hong Kong Jockey Club. We propose an alternative approach to assessing the dynamics of the crowd through the simulation and analysis of least-effort behaviour. Finally, we test the model in a variety of applications where crowd-related incidents warrant structural alterations at client sites. We demonstrate that the model explains the variance in a variety of field measurements, that it is robust, and that it can be applied to future designs where safety and crowd comfort are criteria for design and cost savings.
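The interplay of objective-seeking movement and simulated-annealing acceptance described above can be illustrated with a minimal sketch. This is not the Legion model itself, whose parameters and update rules are far richer; the grid, goal, blocked cell and temperature below are invented for illustration:

```python
import math
import random

def step_agent(pos, goal, occupied, temperature):
    """Move one agent a single cell, simulated-annealing style: moves that
    reduce distance-to-goal are always taken, worse moves only with
    probability exp(-delta / temperature)."""
    x, y = pos
    candidates = [(x + dx, y + dy)
                  for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                  if (dx, dy) != (0, 0) and (x + dx, y + dy) not in occupied]
    if not candidates:
        return pos  # fully constrained by neighbours: stay put
    dist = lambda p: math.hypot(p[0] - goal[0], p[1] - goal[1])
    proposal = random.choice(candidates)
    delta = dist(proposal) - dist(pos)
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        return proposal
    return pos

random.seed(0)
pos, goal = (0, 0), (10, 10)
occupied = {(1, 0)}  # a cell blocked by another agent (a Constraint)
for _ in range(200):
    pos = step_agent(pos, goal, occupied, temperature=0.5)
print(pos)  # wanders to the vicinity of the goal
```

Lowering the temperature makes the movement more strictly "least effort"; raising it lets an agent explore around obstacles.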

    REINFORCEMENT-LEARNING-BASED CROSS LAYER DESIGN IN MOBILE AD-HOC NETWORKS

    Ph.D., Doctor of Philosophy

    Network Maintenance and Capacity Management with Applications in Transportation

    abstract: This research develops heuristics to manage both mandatory and optional network capacity reductions to better serve network flows. The main application discussed relates to transportation networks, and flow cost relates to the travel cost of users of the network. Temporary mandatory capacity reductions are required by maintenance activities. The objective of managing maintenance activities and the attendant temporary network capacity reductions is to schedule the required segment closures so that all maintenance work can be completed on time and the total flow cost over the maintenance period is minimized for different types of flows. The goal of optional network capacity reduction is to selectively reduce the capacity of some links to improve the overall efficiency of user-optimized flows, where each traveler takes the route that minimizes the traveler's trip cost. In this dissertation, the management of both mandatory and optional network capacity reductions is addressed, with consideration of network-wide flow diversions due to changed link capacities. This research first investigates maintenance scheduling in transportation networks with service vehicles (e.g., truck fleets and passenger transport fleets), where these vehicles are assumed to take the system-optimized routes that minimize the total travel cost of the fleet. This problem is solved with the randomized fixed-and-optimize heuristic developed here. This research also investigates maintenance scheduling in networks with multi-modal traffic consisting of (1) regular human-driven cars with user-optimized routing and (2) self-driving vehicles with system-optimized routing. An iterative mixed flow assignment algorithm is developed to obtain the multi-modal traffic assignment resulting from a maintenance schedule. A genetic algorithm with multi-point crossover is applied to obtain a good schedule.
Based on Braess's paradox, in which removing some links may alleviate the congestion of user-optimized flows, this research generalizes the paradox by reducing the capacity of selected links to improve the efficiency of the resultant user-optimized flows. A heuristic is developed to identify the links whose capacity should be reduced, and the corresponding capacity reduction amounts, to obtain more efficient total flows. Experiments on real networks demonstrate that the generalized Braess's paradox exists in reality, and the heuristic developed solves real-world test cases even when commercial solvers fail. Dissertation/Thesis. Doctoral Dissertation, Industrial Engineering, 201
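Braess's paradox, the starting point of the capacity-reduction idea above, can be checked by hand on the classic four-node example (the numbers below are the standard textbook ones, not data from this dissertation):

```python
# Classic Braess network: 4000 drivers travel A -> B. Links A->C and D->B
# cost t/100 minutes at flow t; links A->D and C->B cost a flat 45 minutes.
N = 4000

def ue_cost_with_shortcut():
    # With a zero-cost shortcut C->D, the unique user equilibrium sends
    # everyone along A->C->D->B: no driver can do better unilaterally.
    return N / 100 + 0 + N / 100

def ue_cost_without_shortcut():
    # Without the shortcut, flow splits evenly over A->C->B and A->D->B.
    return (N / 2) / 100 + 45

with_shortcut = ue_cost_with_shortcut()        # 80 minutes per driver
without_shortcut = ue_cost_without_shortcut()  # 65 minutes per driver
print(with_shortcut, without_shortcut)
```

Removing (or throttling) the shortcut lowers every driver's equilibrium trip time from 80 to 65 minutes, which is precisely the effect a capacity-reduction heuristic can exploit.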

    A combined simulated annealing and TABU search strategy to solve network design problem with two classes of users

    A methodology to solve a transportation network design problem (TCNDP) with two classes of users (passenger cars and trucks) is developed. Given an existing highway system with a capital investment budget constraint, the methodology selects the best links to be expanded by an extra lane, considering one of three types of traffic operation: exclusive for passenger cars, exclusive for trucks, or shared by passenger cars and trucks, such that the network's total user equilibrium (UE) travel time is minimized. The problem is formulated as an NP-hard combinatorial nonlinear integer programming problem. The classical branch-and-bound methodology for integer programming is very inefficient in solving this computationally hard problem. A combined simulated annealing and tabu search strategy (SA-TABU) was developed, which is shown to perform in a robust and efficient manner on five networks ranging from 36 to 332 links. A comprehensive heuristic evaluation function (HEF), the core of the heuristic search strategy, was developed; it can be adjusted to the characteristics of the problem and the search strategy used. It is composed of three elements: the link volume-to-capacity ratio, the historical contribution of the link to the objective function, and a random variable that resembles the error term of the HEF. The principal parameters of the SA-TABU are the HEF, the Markov chain length, the "temperature" dropping rate and the tabu list length. Sensitivity analysis was conducted to identify the best values for these parameters. Sufficiently "good" solutions were found for all the problems within a rather short computational time. The solution results suggest that in most of the scenarios, the shared-lane option (passenger cars and trucks) was the most favored selection.
Expanding approximately 10% of the links results in a very high percentage improvement, ranging from 73% to 97% for the five test networks.
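A minimal sketch of how a combined SA-TABU acceptance loop can look (the toy objective and all parameter values are invented; the thesis's HEF, Markov chain handling and network evaluation are far more elaborate):

```python
import math
import random

def sa_tabu(cost, n_links, budget, iters=3000, t0=1.0, cooling=0.995, tabu_len=4):
    """Toy combined simulated-annealing / tabu search over a 0/1 vector of
    link-expansion choices with at most `budget` ones; `cost` is the
    objective to minimise (standing in for UE travel time)."""
    random.seed(1)
    current = tuple(0 for _ in range(n_links))
    best, best_cost = current, cost(current)
    tabu, temp = [], t0
    for _ in range(iters):
        i = random.randrange(n_links)
        cand = list(current)
        cand[i] ^= 1
        if sum(cand) > budget:
            continue  # respect the capital-investment budget
        cand = tuple(cand)
        c = cost(cand)
        if i in tabu and c >= best_cost:
            continue  # tabu move, unless it would beat the best (aspiration)
        if c < cost(current) or random.random() < math.exp((cost(current) - c) / temp):
            current = cand
            tabu = (tabu + [i])[-tabu_len:]  # remember recently flipped links
            if c < best_cost:
                best, best_cost = current, c
        temp *= cooling  # "temperature" dropping rate
    return best, best_cost

# Illustrative objective: expanding exactly links 2, 5 and 8 is optimal.
toy_cost = lambda s: sum((s[i] - (1 if i in (2, 5, 8) else 0)) ** 2
                         for i in range(len(s)))
best, best_cost = sa_tabu(toy_cost, n_links=10, budget=3)
print(best, best_cost)
```

The tabu list prevents immediate cycling while the cooling schedule gradually turns the search from exploration into pure descent.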

    Dynamic traffic congestion pricing mechanism with user-centric considerations

    Thesis: S.M. in Transportation, Massachusetts Institute of Technology, Department of Civil and Environmental Engineering, 2013. Cataloged from PDF version of thesis. Includes bibliographical references (pages 85-95). In this thesis, we consider the problem of designing real-time traffic routing systems in urban areas. Optimal dynamic routing for multiple passengers is known to be computationally hard due to its combinatorial nature. To overcome this difficulty, we propose a novel mechanism called User-Centric Dynamic Pricing (UCDP) based on recent advances in algorithmic mechanism design. The mechanism allows for congestion-free traffic in general road networks with heterogeneous users, while satisfying each user's travel preference. The mechanism first informs whether a passenger should use public transportation or the road network. In the latter case, a passenger reports his maximum accepted travel time, with a lower bound announced publicly by the road authority. The mechanism then assigns the passenger a path that matches his preference given the current traffic condition in the network. The proposed mechanism introduces a fairness-constrained shortest path (FCSP) problem with a special structure, thus enabling polynomial-time computation of a path allocation that maximizes the sequential social surplus and guarantees fairness among passengers. The tolls of paths are then computed according to marginal cost payments. We show that reporting true preference is a weakly dominant strategy. The performance of the proposed mechanism is demonstrated on several simulated routing experiments in comparison to user equilibrium and system optimum. by Kim Thien Bui. S.M. in Transportation
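The time-bound path filtering at the heart of the FCSP idea can be sketched as follows (a toy enumeration on an invented four-node network; the actual FCSP formulation, fairness constraints and marginal-cost tolls are considerably more involved):

```python
# Hypothetical toy network: edge -> (travel_time, congestion_cost); the
# numbers are illustrative, not taken from the thesis.
edges = {
    ("s", "a"): (4, 1), ("a", "t"): (4, 1),   # slow but lightly used
    ("s", "b"): (2, 5), ("b", "t"): (2, 5),   # fast but congested
}

def simple_paths(src, dst):
    """Enumerate simple src->dst paths in the tiny example graph."""
    found = []
    def walk(path):
        if path[-1] == dst:
            found.append(tuple(path))
            return
        for (u, v) in edges:
            if u == path[-1] and v not in path:
                walk(path + [v])
    walk([src])
    return found

def assign_path(max_time):
    """Among paths meeting the rider's reported time bound, return the
    (cost, time, path) triple with the least congestion cost."""
    feasible = []
    for p in simple_paths("s", "t"):
        legs = list(zip(p, p[1:]))
        time = sum(edges[e][0] for e in legs)
        cost = sum(edges[e][1] for e in legs)
        if time <= max_time:
            feasible.append((cost, time, p))
    return min(feasible) if feasible else None

print(assign_path(max_time=10))  # patient rider -> cheap route via a
print(assign_path(max_time=5))   # hurried rider -> congested route via b
```

A rider who reports a loose time bound is routed onto the cheap, slow path; a tight bound forces the fast, congested one, which is where tolls come into play.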

    Self-Organized Dynamics of Power Grids: Smart Grids, Fluctuations and Cascades

    Climate change is one of the most pressing issues of our time and mitigating it requires a reduction of CO2 emissions. A big step towards achieving this goal is increasing the share of renewable energy sources, as the energy sector currently contributes 35% to all greenhouse gas emissions. However, integrating these renewable energy sources challenges the current power system in two major ways. Firstly, renewable generation consists of more spatially distributed and smaller power plants than conventional generation by nuclear or coal plants, questioning the established hierarchical structures and demanding a new grid design. Restructuring becomes necessary because wind and solar plants have to be placed at favorable sites, e.g., close to coasts in the case of wind. Secondly, renewables do not provide a deterministic and controllable power output but introduce power fluctuations that have to be controlled adequately. Many solutions to these challenges are built on the concept of smart grids, which require an extensive information technology (IT) infrastructure communicating between consumers and generators to coordinate efficient actions. However, an intertwined power and IT system raises great privacy and security concerns. Is it possible to forgo a large IT infrastructure in future power grids and instead operate them purely based on local information? How would such a decentrally organized system work? What is the impact of fluctuations on short time scales on the dynamical stability? Which grid topologies are robust against random failures or targeted attacks? This thesis aims to establish a framework of such a self-organized dynamics of a power grid, analyzing its benefits and limitations with respect to fluctuations and discrete events. Instead of a centrally monitored and controlled smart grid, we propose the concept of Decentral Smart Grid Control, translating local power grid frequency information into actions to stabilize the grid.
This is not limited to power generators but applies equally to consumers, naturally introducing a demand response. We analyze the dynamical stability properties of this framework using linear stability methods as well as numerical simulations to determine the size of the basin of attraction. To do so, we investigate general stability effects and sample network motifs to find that this self-organized grid dynamics is stable for large parameter regimes. However, when the actors of the power grid react to a frequency signal, this reaction has to be sufficiently fast, since reaction delays are shown to destabilize the grid. We derive expressions for a maximum delay, which always desynchronizes the system based on a rebound effect, and for destabilizing delays based on resonance effects. These resonance instabilities are cured when the frequency signal is averaged over a few seconds (low-pass filter). Overall, we propose an alternative smart grid model without any IT infrastructure and analyze its stable operating space. Furthermore, we analyze the impact of fluctuations on the power grid. First, we determine the escape time of the grid, i.e., the time until the grid desynchronizes when subject to stochastic perturbations. We simulate these events and derive an analytical expression using Kramers' method, obtaining the scaling of the escape time as a function of the grid inertia, transmitted power, damping, etc. Thereby, we identify weak links in networks, which have to be enhanced to guarantee stable operation. Second, we collect power grid frequency measurements from different regions across the world and evaluate their statistical properties. Distributions are found to be heavy-tailed, so that large disturbances are more common than predicted by Gaussian statistics.
We model the grid dynamics using a stochastic differential equation to derive the scaling of the fluctuations based on power grid parameters, identifying effective damping as essential in reducing fluctuation risks. This damping may be provided by increased demand control as proposed by Decentral Smart Grid Control. Finally, we investigate discrete events, in particular the failure of a single transmission line, as a complementary form of disturbance. An initial failure of a transmission line leads to additional load on other lines, potentially overloading them and thereby causing secondary outages. Hence, a cascade of failures is induced that propagates through the network, resulting in a large-scale blackout. We investigate these cascades in a combined dynamical and event-driven framework, which includes transient dynamics, in contrast to the often-used steady-state analysis that only solves static flows in the grid while neglecting any dynamics. Concluding, we identify critical lines, prone to cause cascades when failing, and observe a nearly constant speed of propagation of the cascade in an appropriate metric. Overall, we investigate the self-organized dynamics of power grids, demonstrating its benefits and limitations. We provide tools to improve current grid operation and outline a smart grid solution that is not reliant on IT. Thereby, we support establishing a 100% renewable energy system.
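The dynamical backbone of such stability analyses is the swing equation. A minimal single-machine sketch against an infinite bus (a standard reduced model with illustrative parameter values, not the networked model of the thesis) shows convergence to the synchronous fixed point:

```python
import math

# Single machine against an infinite bus:
#   dtheta/dt = w,   M * dw/dt = P - D*w - K*sin(theta)
M, D, K, P = 1.0, 0.5, 2.0, 1.0   # inertia, damping, coupling, injected power
theta, w = 0.0, 0.0                # phase angle and frequency deviation
dt = 0.01
for _ in range(20000):             # 200 s of simulated time, forward Euler
    dw = (P - D * w - K * math.sin(theta)) / M
    theta += w * dt
    w += dw * dt
print(round(theta, 4), round(w, 6))  # settles at theta* = asin(P/K), w = 0
```

Adding stochastic forcing to this loop is how escape times are probed, and adding a delayed frequency-dependent term to P is how the delay instabilities of Decentral Smart Grid Control arise.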

    Acta Polytechnica Hungarica 2016


    Strategic algorithms

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009. Cataloged from PDF version of thesis. Includes bibliographical references (p. 193-201). Classical algorithms from theoretical computer science arise time and again in practice. However, practical situations typically do not fit precisely into the traditional theoretical models. Additional necessary components are, for example, uncertainty and economic incentives. Therefore, modern algorithm design calls for more interdisciplinary approaches, as well as for deeper theoretical understanding, so that the algorithms can apply to more realistic settings and complex systems. Consider, for instance, the classical shortest path algorithm, which, given a graph with specified edge weights, seeks the path minimizing the total weight from a source to a destination. In practice, the edge weights are often uncertain and it is not even clear what we mean by shortest path anymore: is it the path that minimizes the expected weight? Or its variance, or some other metric? With a risk-averse objective function that takes into account both mean and standard deviation, we run into nonconvex optimization challenges that require new theory beyond classical shortest path algorithm design. Yet another shortest path application, routing of packets in the Internet, needs to further incorporate economic incentives to reflect the various business relationships among the Internet Service Providers that affect the choice of packet routes. Strategic Algorithms are algorithms that integrate optimization, uncertainty and economic modeling into algorithm design, with the goal of bringing about new theoretical developments and solving practical applications arising in complex computational-economic systems. In short, this thesis contributes new algorithms and their underlying theory at the interface of optimization, uncertainty and economics.
Although the interplay of these disciplines is present in various forms in our work, for the sake of presentation we have divided the material into three categories: 1. In Part I we investigate algorithms at the intersection of Optimization and Uncertainty. The key conceptual contribution in this part is discovering a novel connection between stochastic and nonconvex optimization. Traditional algorithm design has not taken into account the risk inherent in stochastic optimization problems. We consider natural objectives that incorporate risk, which turn out to be equivalent to certain nonconvex problems from the realm of continuous optimization. As a result, our work advances the state of the art in both stochastic and nonconvex optimization, presenting new complexity results and proposing general-purpose efficient approximation algorithms, some of which have shown promising practical performance and have been implemented in a real traffic prediction and navigation system. 2. Part II proposes new algorithm and mechanism designs at the intersection of Uncertainty and Economics. In Part I we postulate that the random variables in our models come from given distributions. However, determining those distributions or their parameters is a challenging and fundamental problem in itself. A tool from Economics that has recently gained momentum for measuring the probability distribution of a random variable is an information or prediction market. Such markets, most popularly known for predicting the outcomes of political elections or other events of interest, have shown remarkable accuracy in practice, though at the same time have left open the theoretical and strategic analysis of current implementations, as well as the need for new and improved designs which handle more complex outcome spaces (probability distribution functions) as opposed to binary or n-ary valued distributions.
The contributions of this part include a unified strategic analysis of different prediction market designs that have been implemented in practice. We also offer new market designs for handling exponentially large outcome spaces stemming from ranking or permutation-type outcomes, together with algorithmic and complexity analysis. 3. In Part III we consider the interplay of optimization and economics in the context of network routing. This part is motivated by the network of autonomous systems in the Internet, where each portion of the network is controlled by an Internet service provider, namely by a self-interested economic agent. The business incentives do not exist merely in addition to the computer protocols governing the network. Although they are not currently integrated in those protocols and are decided largely via private contracting and negotiations, these economic considerations are a principal factor that determines how packets are routed. And vice versa, the demand and flow of network traffic fundamentally affect provider contracts and prices. The contributions of this part are the design and analysis of economic mechanisms for network routing. The mechanisms are based on first- and second-price auctions (the so-called Vickrey-Clarke-Groves, or VCG mechanisms). We first analyze the equilibria and prices resulting from these mechanisms. We then investigate the compatibility of the better-understood VCG mechanisms with the current inter-domain routing protocols, and we demonstrate the critical importance of correct modeling and how it affects the complexity and algorithms necessary to implement the economic mechanisms. by Evdokia Velinova Nikolova. Ph.D.
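The mean-standard-deviation objective mentioned in Part I can be illustrated on a two-route toy example (edge means and variances are invented; the thesis treats general graphs, where this objective makes the optimization nonconvex):

```python
import math

# Hypothetical edge data: (mean_delay, variance). A path's cost under the
# risk-averse objective is mean + c * sqrt(variance), with variances
# adding over independent edges.
edges = {
    ("s", "a"): (3.0, 0.1), ("a", "t"): (3.0, 0.1),  # slower but reliable
    ("s", "t"): (5.0, 9.0),                          # faster on average, risky
}
paths = [("s", "a", "t"), ("s", "t")]

def risk_cost(path, c):
    legs = list(zip(path, path[1:]))
    mean = sum(edges[e][0] for e in legs)
    var = sum(edges[e][1] for e in legs)
    return mean + c * math.sqrt(var)

best_neutral = min(paths, key=lambda p: risk_cost(p, c=0.0))  # direct edge
best_averse = min(paths, key=lambda p: risk_cost(p, c=2.0))   # reliable detour
print(best_neutral, best_averse)
```

The risk-neutral traveler takes the direct edge (expected delay 5 vs. 6), while the risk-averse one prefers the detour; because the square root does not decompose over edges, this objective defeats plain Dijkstra-style shortest path algorithms on larger graphs.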

    Spontaneous changes of human behaviors and intervention strategies: human and animal diseases

    Doctor of Philosophy, Department of Industrial & Manufacturing Systems Engineering. Chih-Hang Wu. The topic of infectious disease epidemics has recently attracted substantial attention in research communities, and it has been shown that changes in human behaviors have significant impacts on the dynamics of disease transmission. However, the study and understanding of human reactions to the spread of infectious disease are still at an early stage, and how human behaviors change during the spread of infectious disease has not been systematically investigated. Moreover, the study of human behaviors includes not only various measures enforced by public authorities, such as school closure, quarantine, and vaccination, but also the spontaneous self-protective actions triggered by risk perception and fear of disease. Hence, the goal of this research is to study the impacts of human behaviors on the epidemic from these two perspectives: spontaneous behavioral changes and public intervention strategies. To study spontaneous changes of human behaviors, this research applied, for the first time, evolutionary spatial games to the study of human reactions to the spread of infectious disease. This method integrates contact structures and epidemic information into individuals' decision processes by adding two different types of information to the payoff functions: local information and global information. The new method advances not only the field of game theory, but also the field of epidemiology. In addition, this method was also applied to a classic compartmental dynamic system, a widely used model for studying disease transmission. With extensive numerical studies, the results first proved the consistency of the two models, validating the effectiveness of the spatial evolutionary game.
Then the impacts of changes in human behaviors on the dynamics of disease transmission, and how information impacts human behaviors, were discussed temporally and spatially. In addition to the spontaneous behavioral changes, the corresponding intervention strategies by policy-makers play a key role in mitigating the spread of infectious disease. To minimize the total loss, including the social costs and the number of infected individuals, the intervention strategies should be optimized. Sensitivity analysis, stability analysis, bifurcation analysis, and optimal control methods are possible tools to understand the effects of different combinations of intervention strategies, or even to find an appropriate policy to mitigate disease transmission. One zoonotic disease, Zoonotic Visceral Leishmaniasis (ZVL), was studied by adopting different methods and assumptions. In particular, a special case, backward bifurcation, was discussed for the transmission of ZVL. Last but not least, the methodology and modeling framework used in this dissertation can be expanded to other disease situations and intervention applications, and have a broad impact on research areas related to mathematical modeling, epidemiology, decision-making processes, and industrial engineering. Further studies can combine the changes of human behaviors and intervention strategies by policy-makers so as to seek an optimal information dissemination that minimizes the social costs and the number of infected individuals. If successful, this research should aid policy-makers by improving communication between them and the public, by directing educational efforts, and by predicting public response to infectious diseases and new risk management strategies (regulations, vaccination, quarantine, etc.).
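The compartmental side of this modeling can be sketched with a minimal SIR system in which prevalence feeds back into the contact rate (all parameter values, and the specific feedback form beta/(1 + k*I), are illustrative assumptions, not the dissertation's model):

```python
def run_sir(beta, gamma, k, days=300, dt=0.1):
    """Forward-Euler SIR with a prevalence-driven behavioural response:
    the effective contact rate shrinks as infection rises."""
    s, i, r = 0.99, 0.01, 0.0
    peak = i
    for _ in range(int(days / dt)):
        beta_eff = beta / (1.0 + k * i)  # spontaneous self-protection
        ds = -beta_eff * s * i
        dr = gamma * i
        s += ds * dt
        i += (-ds - dr) * dt
        r += dr * dt
        peak = max(peak, i)
    return peak, r

peak_plain, _ = run_sir(beta=0.4, gamma=0.1, k=0.0)   # no behavioural change
peak_react, _ = run_sir(beta=0.4, gamma=0.1, k=20.0)  # fearful population
print(round(peak_plain, 3), round(peak_react, 3))
```

Even this crude feedback flattens the epidemic peak substantially, which is the qualitative effect that coupling a spatial game to the compartmental dynamics aims to capture and quantify.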