
    Changing Bases: Multistage Optimization for Matroids and Matchings

    This paper is motivated by the fact that many systems need to be maintained continually while the underlying costs change over time. The challenge is to continually maintain near-optimal solutions to the underlying optimization problems, without creating too much churn in the solution itself. We model this as a multistage combinatorial optimization problem where the input is a sequence of cost functions (one for each time step); while we can change the solution from step to step, we incur an additional cost for every such change. We study the multistage matroid maintenance problem, where we need to maintain a base of a matroid in each time step under the changing cost functions and acquisition costs for adding new elements. The online version of this problem generalizes online paging. E.g., given a graph, we need to maintain a spanning tree $T_t$ at each step: we pay $c_t(T_t)$ for the cost of the tree at time $t$, and also $|T_t \setminus T_{t-1}|$ for the number of edges changed at this step. Our main result is an $O(\log m \log r)$-approximation, where $m$ is the number of elements/edges and $r$ is the rank of the matroid. We also give an $O(\log m)$-approximation for the offline version of the problem. These bounds hold when the acquisition costs are non-uniform, in which case both results are the best possible unless P=NP. We also study the perfect matching version of the problem, where we must maintain a perfect matching at each step under changing cost functions and costs for adding new elements. Surprisingly, the hardness drastically increases: for any constant $\epsilon > 0$, there is no $O(n^{1-\epsilon})$-approximation to the multistage matching maintenance problem, even in the offline case.
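
    As a concrete illustration of the objective described above (not code from the paper), the minimal sketch below evaluates the multistage cost of a given sequence of spanning trees: the per-step tree cost plus a unit acquisition cost for every edge that newly enters the solution. The graph encoding and helper names are assumptions made only for this example.

        # Minimal sketch (not from the paper): evaluate the multistage objective
        # sum_t c_t(T_t) + |T_t \ T_{t-1}| for a given sequence of spanning trees.
        # Edges are hashable identifiers; cost_fns[t] maps an edge to its cost at time t.

        def multistage_cost(trees, cost_fns, acquisition_cost=1.0):
            """trees: list of sets of edges, one spanning tree per time step.
            cost_fns: list of dicts edge -> cost, one per time step."""
            total = 0.0
            prev = set()  # empty solution before the first step
            for tree, costs in zip(trees, cost_fns):
                total += sum(costs[e] for e in tree)          # c_t(T_t)
                total += acquisition_cost * len(tree - prev)  # edges newly acquired
                prev = tree
            return total

        # Example: a triangle graph on {a, b, c}; one edge is swapped at step 2.
        trees = [{"ab", "bc"}, {"ab", "ac"}]
        cost_fns = [{"ab": 1, "bc": 1, "ac": 5}, {"ab": 1, "bc": 4, "ac": 1}]
        print(multistage_cost(trees, cost_fns))  # (1+1) + 2  +  (1+1) + 1 = 7.0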

    Algorithm Engineering in Robust Optimization

    Robust optimization is a young and emerging field of research that has received a considerable increase of interest over the last decade. In this paper, we argue that the algorithm engineering methodology fits the field of robust optimization very well and yields a rewarding new perspective on both the current state of research and open research directions. To this end, we go through the algorithm engineering cycle of design and analysis of concepts, development and implementation of algorithms, and theoretical and experimental evaluation. We show that many ideas of algorithm engineering have already been applied in publications on robust optimization. Most work on robust optimization is devoted to the analysis of concepts and the development of algorithms, some papers deal with the evaluation of a particular concept in case studies, and work on comparing concepts is only just beginning. What remains a drawback in many papers on robustness is the missing link of feeding the experimental results back into the design.

    Energy Management in Microgrids: A Combination of Game Theory and Big Data‐Based Wind Power Forecasting

    The energy internet provides an open framework for integrating every piece of equipment involved in energy generation, transmission, transformation, distribution, and consumption with novel information and communication technologies. In this chapter, the authors adopt a combination of game theory and big data to address the coordinated management of renewable and traditional energy, a typical issue in energy interconnection. The authors formulate the energy management problem as a three-stage Stackelberg game and employ the backward induction method to derive closed-form expressions for the optimal strategies. Next, the authors study big data‐based power generation forecasting techniques and introduce a wind power forecasting scheme that can assist the microgrid in making strategies. Simulation results show that more accurate wind power predictions lead to better energy management.
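
    The chapter's specific three-stage game is not reproduced here, but the backward-induction idea can be sketched on a toy two-player Stackelberg pricing game: the follower's best response is derived first, then substituted into the leader's problem. All functional forms and parameter values below are illustrative assumptions, not the chapter's model.

        # Toy two-stage Stackelberg game solved by backward induction (illustrative
        # assumptions only, not the chapter's three-stage model).
        # Follower (consumer): max_d  a*ln(1+d) - p*d   ->  d*(p) = a/p - 1  (for p < a)
        # Leader (energy seller): max_p  (p - c) * d*(p), with unit generation cost c.
        import math

        def follower_best_response(p, a=10.0):
            """Closed-form best response of the consumer to the announced price p."""
            return max(a / p - 1.0, 0.0)

        def leader_optimal_price(a=10.0, c=2.0):
            """Backward induction: substitute d*(p) into the leader's profit
            (p - c) * (a/p - 1) and set the derivative to zero -> p* = sqrt(a*c)."""
            return math.sqrt(a * c)

        p_star = leader_optimal_price()
        d_star = follower_best_response(p_star)
        print(f"price {p_star:.3f}, demand {d_star:.3f}")  # price 4.472, demand 1.236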

    Accelerating Monte Carlo simulations with an NVIDIA® graphics processor

    Modern graphics cards, commonly used in desktop computers, have evolved beyond a simple interface between processor and display to incorporate sophisticated calculation engines that can be applied to general purpose computing. The Monte Carlo algorithm for modelling photon transport in turbid media has been implemented on an NVIDIA® 8800 GT graphics card using the CUDA toolkit. The Monte Carlo method relies on following the trajectories of millions of photons through the sample, often taking hours or days to complete. The graphics-processor implementation, processing roughly 110 million scattering events per second, was found to run more than 70 times faster than a similar, single-threaded implementation on a 2.67 GHz desktop computer.
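
    For orientation, a minimal single-threaded sketch of the photon random walk that such simulations parallelise is shown below: standard Monte Carlo photon transport with exponential step lengths, absorption handled by weight reduction, and isotropic scattering. The optical coefficients are placeholders, and the paper's actual CUDA kernel is not reproduced.

        # Minimal CPU sketch of Monte Carlo photon transport in a turbid medium
        # (illustrative only; the paper's GPU kernel and parameters are not reproduced).
        import math, random

        def simulate_photon(mu_a=0.1, mu_s=10.0, weight_cutoff=1e-3):
            """Follow one photon: exponential step lengths, absorption via weight
            reduction, isotropic scattering. Returns the total path length travelled."""
            mu_t = mu_a + mu_s                       # total interaction coefficient
            x = y = z = 0.0                          # photon position (tracked, unused here)
            ux, uy, uz = 0.0, 0.0, 1.0               # launch along the z axis
            weight, path = 1.0, 0.0
            while weight > weight_cutoff:
                step = -math.log(1.0 - random.random()) / mu_t   # sample free path length
                x, y, z = x + ux * step, y + uy * step, z + uz * step
                path += step
                weight *= mu_s / mu_t                # deposit the absorbed fraction
                # isotropic scattering: draw a new direction uniformly on the sphere
                cos_t = 2.0 * random.random() - 1.0
                sin_t = math.sqrt(1.0 - cos_t * cos_t)
                phi = 2.0 * math.pi * random.random()
                ux, uy, uz = sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t
            return path

        mean_path = sum(simulate_photon() for _ in range(500)) / 500
        print(f"mean path length: {mean_path:.2f}")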

    Applications of network optimization

    Ravindra K. Ahuja ... [et al.]. Includes bibliographical references (p. 41-48).

    Routing and scheduling optimisation under uncertainty for engineering applications

    The thesis aims to develop a viable computational approach suitable for solving large vehicle routing and scheduling optimisation problems affected by uncertainty. The modelling framework is built upon recent advances in Stochastic Optimisation, Robust Optimisation and Distributionally Robust Optimisation. The utility of the methodology is presented on two classes of discrete optimisation problems: scheduling satellite communication, which is a variant of Machine Scheduling, and the Vehicle Routing Problem with Time Windows and Synchronised Visits. For each problem class, a practical engineering application is formulated using data coming from the real world. The significant size of the problem instances reinforced the need to apply a different computational approach for each problem class. Satellite communication is scheduled using a Mixed-Integer Programming solver. In contrast, the vehicle routing problem with synchronised visits is solved using a hybrid method that combines Iterated Local Search, Constraint Programming and the Guided Local Search metaheuristic.

    The featured application of scheduling satellite communication is Satellite Quantum Key Distribution for a system that consists of one spacecraft placed in Low Earth Orbit and a network of optical ground stations located in the United Kingdom. The satellite generates cryptographic keys and transmits them to individual ground stations. Each ground station should receive a number of keys in proportion to its importance in the network. As clouds containing water attenuate the signal, reliable scheduling needs to account for cloud cover predictions, which are naturally affected by uncertainty. A new uncertainty set tailored to modelling uncertainty in predictions of atmospheric phenomena is the main contribution to the methodology. The uncertainty set models the evolution of uncertain parameters using a Multivariate Vector Auto-Regressive Time Series, which preserves correlations over time and space. The problem formulation employing the new uncertainty set compares favourably to a suite of alternative models adapted from the literature, considering both the computational time and the cost-effectiveness of the schedule evaluated in the cloud cover conditions observed in the real world. The other contribution of the thesis in the satellite scheduling domain is the formulation of the Satellite Quantum Key Distribution problem; a proof of computational complexity and a thorough performance analysis of an example Satellite Quantum Key Distribution system accompany the formulation.

    The Home Care Scheduling and Routing Problem, whose instances are solved for the largest provider of such services in Scotland, is the application of the Vehicle Routing Problem with Time Windows and Synchronised Visits. The problem instances contain over 500 visits, around 20% of which require two carers simultaneously. Such problem instances are well beyond the scalability limitations of the exact method and considerably larger than instances of similar problems considered in the literature. The optimisation approach proposed in the thesis found effective solutions in attractive computational time (i.e., less than 30 minutes), and the solutions reduced the total travel time threefold compared to alternative schedules computed by human planners. The Essential Riskiness Index Optimisation was incorporated into the Constraint Programming model to address uncertainty in visits' duration. Besides solving large problem instances from the real world, the solution method reproduced the majority of the best results reported in the literature and strictly improved the solutions for several instances of a well-known benchmark for the Vehicle Routing Problem with Time Windows and Synchronised Visits.
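
    The thesis' exact uncertainty set is not reproduced here, but the idea of modelling correlated cloud-cover forecast errors with a vector auto-regressive process can be sketched roughly as below: a VAR(1) model (with placeholder coefficients) generates correlated error trajectories that can seed scenarios or uncertainty sets for the scheduling model. All names and parameter values are illustrative assumptions.

        # Rough sketch (not the thesis' formulation): sample correlated cloud-cover
        # error trajectories from a VAR(1) process e_t = A @ e_{t-1} + w_t, where the
        # coordinates of e_t are forecast errors at different ground stations.
        import numpy as np

        rng = np.random.default_rng(0)

        A = np.array([[0.6, 0.2],            # placeholder AR matrix coupling two
                      [0.1, 0.7]])           # ground stations over time
        noise_cov = np.array([[0.02, 0.01],  # placeholder innovation covariance
                              [0.01, 0.03]])

        def sample_error_trajectory(horizon, A=A, noise_cov=noise_cov):
            """Draw one trajectory of correlated forecast errors over `horizon` steps."""
            d = A.shape[0]
            errors = np.zeros((horizon, d))
            e = np.zeros(d)
            for t in range(horizon):
                e = A @ e + rng.multivariate_normal(np.zeros(d), noise_cov)
                errors[t] = e
            return errors

        # Perturb a nominal 24-step cloud-cover forecast (fractions in [0, 1]) to
        # obtain a set of scenarios for two ground stations.
        nominal = np.full((24, 2), 0.3)
        scenarios = [np.clip(nominal + sample_error_trajectory(24), 0.0, 1.0)
                     for _ in range(100)]
        print(scenarios[0].shape)  # (24, 2)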