Contingency-Constrained Unit Commitment With Intervening Time for System Adjustments
The N-1-1 contingency criterion considers the consecutive loss of two
components in a power system, with intervening time for system adjustments. In
this paper, we consider the problem of optimizing generation unit commitment
(UC) while ensuring N-1-1 security. Due to the coupling of time periods
associated with consecutive component losses, the resulting problem is a very
large-scale mixed-integer linear optimization model. For efficient solution, we
introduce a novel branch-and-cut algorithm using a temporally decomposed
bilevel separation oracle. The model and algorithm are assessed using multiple
IEEE test systems, and a comprehensive analysis is performed to compare system
performances across different contingency criteria. Computational results
demonstrate the value of considering intervening time for system adjustments in
terms of total cost and system robustness.
Comment: 8 pages, 5 figures
Contingency-Constrained Unit Commitment with Post-Contingency Corrective Recourse
We consider the problem of minimizing costs in the generation unit commitment
problem, a cornerstone in electric power system operations, while enforcing an
N-k-ε reliability criterion. This reliability criterion is a generalization of
the well-known N-k criterion, and dictates that at least a (1 - ε)
fraction of the total system demand must be met following the failures of k
or fewer system components. We refer to this problem as the
Contingency-Constrained Unit Commitment problem, or CCUC. We present a
mixed-integer programming formulation of the CCUC that accounts for both
transmission and generation element failures. We propose novel cutting plane
algorithms that avoid the need to explicitly consider an exponential number of
contingencies. Computational studies are performed on several IEEE test systems
and a simplified model of the Western US interconnection network, which
demonstrate the effectiveness of our proposed methods relative to current
state-of-the-art approaches.
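The computational idea behind the cutting-plane algorithms above is separation: rather than enumerating an exponential number of contingencies up front, test the current commitment against the worst case and generate a cut only when it fails. A minimal, capacity-only sketch of such a screening step (hypothetical function name; the paper's actual model is a mixed-integer program that also covers transmission failures):

```python
def worst_case_shortfall(capacities, demand, k, epsilon):
    """Implicit contingency screening for an N-k-epsilon style check.

    With generator failures only, the worst contingency is losing the
    k largest committed units, so one check replaces enumerating all
    C(n, k) contingencies. Returns the unmet portion of the required
    (1 - epsilon) fraction of demand; a positive value means the
    commitment violates the criterion and a cut would be added.
    """
    survivors = sorted(capacities)[:-k] if k else sorted(capacities)
    required = (1 - epsilon) * demand
    return max(0.0, required - sum(survivors))

# Toy screening: five committed units, demand 100, up to 10% shed allowed.
caps = [40, 30, 20, 15, 10]
print(worst_case_shortfall(caps, 100, k=1, epsilon=0.1))  # 15.0 shortfall
print(worst_case_shortfall(caps, 100, k=0, epsilon=0.1))  # 0.0, feasible
```

In a branch-and-cut loop, a positive shortfall would trigger a feasibility cut against the current commitment rather than an explicit constraint per contingency.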
A Flexible, Natural Formulation for the Network Design Problem with Vulnerability Constraints
Given a graph, a set of origin-destination (OD) pairs with communication requirements, and an integer k >= 2, the network design problem with vulnerability constraints (NDPVC) is to identify a subgraph with the minimum total edge costs such that, between each OD pair, there exist a hop-constrained primary path and a hop-constrained backup path after any k - 1 edges of the graph fail. Formulations exist for single-edge failures (i.e., k = 2). To solve the NDPVC for an arbitrary number of edge failures, we develop two natural formulations based on the notion of length-bounded cuts. We compare their strengths and flexibilities in solving the problem for k >= 3. We study different methods to separate infeasible solutions by computing length-bounded cuts of a given size. Experimental results show that, for single-edge failures, our formulation increases the number of solved benchmark instances from 61% (obtained within a two-hour limit by the best published algorithm) to more than 95%, thus increasing the number of solved instances by 1,065. Our formulation also accelerates the solution process for larger hop limits and efficiently solves the NDPVC for general k. We test our best algorithm for two to five simultaneous edge failures and investigate the impact of multiple failures on the network design.
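The feasibility check underlying the NDPVC, whether a hop-constrained path survives a given set of edge failures, can be sketched with a plain breadth-first search (illustrative names only; the paper's formulations work with length-bounded cuts rather than explicit path searches):

```python
from collections import deque

def hop_limited_reachable(adj, src, dst, hop_limit, failed=frozenset()):
    """BFS that reports whether dst is reachable from src using at
    most hop_limit edges, skipping edges listed in `failed` (checked
    in both orientations)."""
    if src == dst:
        return True
    seen = {src}
    queue = deque([(src, 0)])
    while queue:
        node, depth = queue.popleft()
        if depth == hop_limit:
            continue
        for nxt in adj.get(node, ()):
            if (node, nxt) in failed or (nxt, node) in failed:
                continue
            if nxt == dst:
                return True
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return False

# Tiny example: a 4-cycle. Primary path 0-1-2 meets a 2-hop limit,
# and after edge (0, 1) fails the backup path 0-3-2 still does.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(hop_limited_reachable(adj, 0, 2, 2))                        # True
print(hop_limited_reachable(adj, 0, 2, 2, frozenset({(0, 1)})))   # True
```

A length-bounded cut certifies the opposite situation: a set of edges whose removal makes every path between the OD pair exceed the hop limit, which is exactly what the separation routines above search for.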
Risk-based resilient network design
This paper presents a risk-based approach to resilient network design. The basic design problem considered is: given a working network and a fixed budget, how best to allocate the budget for deploying a survivability technique in different parts of the network based on managing the risk. The term risk measures two related quantities: the likelihood of failure or attack, and the amount of damage caused by the failure or attack. Various designs with different risk-based design objectives are considered, for example, minimizing the expected damage, minimizing the maximum damage, and minimizing a measure of the variability of damage that could occur in the network. A design methodology for the proposed risk-based survivable network design approach is presented within an optimization model framework. Numerical results and analysis illustrating the different risk-based designs and the tradeoffs among the schemes are presented.
Synthesis, Interdiction, and Protection of Layered Networks
This research developed the foundation, theory, and framework for a set of analysis techniques to assist decision makers in analyzing questions regarding the synthesis, interdiction, and protection of infrastructure networks. This includes the extension of traditional network interdiction to directly model nodal interdiction; new techniques to identify potential targets in social networks based on extensions of shortest path network interdiction; the extension of traditional network interdiction to include layered network formulations; and the development of models/techniques to design robust layered networks while considering trade-offs with cost. These approaches identify the maximum protection/disruption possible across layered networks with limited resources, find the most robust layered network design possible given the budget limitations while ensuring that the demands are met, include traditional social network analysis, and incorporate new techniques to model the interdiction of nodes and edges throughout the formulations. In addition, the importance and effects of multiple optimal solutions for these (and similar) models are investigated. All the models developed are demonstrated on notional examples and were tested on a range of sample problem sets.
The distance-based critical node detection problem: models and algorithms
In the wake of terrorism and natural disasters, assessing networked systems for vulnerability to failures that arise from these events is essential to maintaining the operations of the systems. This is crucial given the heavy dependence of daily social and economic activities on networked systems such as transport, telecommunication and energy networks, as well as the interdependence of these networks. In this thesis, we explore methods to assess the vulnerability of networked systems to element failures, employing connectivity as the performance measure for vulnerability. The associated optimisation problem, termed the critical node (edge) detection problem, seeks to identify a subset of nodes (edges) of a network whose deletion (failure) optimises a network connectivity objective. Traditional connectivity measures employed in most studies of the critical node detection problem overlook the internal cohesiveness of networks and the extent of connectivity in the network. This limits the effectiveness of the developed methods in uncovering vulnerability with regard to network connectivity. Our work therefore focuses on distance-based connectivity, a fairly new class of connectivity measures introduced for studying the critical node detection problem to overcome the limitations of traditional measures.
In Chapter 1, we provide an introduction outlining the motivations and the methods related to our study. In Chapter 2, we review the literature on the critical node detection problem as well as its application areas and related problems. Following this, we formally introduce the distance-based critical node detection problem in Chapter 3, where we propose new integer programming models for the case of hop-based distances and an efficient algorithm for the separation problems associated with the models. We also propose two families of valid inequalities. In Chapter 4, we consider the distance-based critical node detection problem using a heuristic approach, in which we propose a centrality-based heuristic that employs a backbone crossover and a centrality-based neighbourhood search. In Chapter 5, we present generalisations of the methods proposed in Chapter 3 to edge-weighted graphs. We also introduce the edge-deletion version of the problem, which we term the distance-based critical edge detection problem. Throughout Chapters 3, 4 and 5, we provide computational experiments.
Finally, in Chapter 6 we present conclusions as well as future research directions.
Keywords: Network Vulnerability, Critical Node Detection Problem, Distance-based Connectivity, Integer Programming, Lazy Constraints, Branch-and-cut, Heuristics.
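The distance-based connectivity objective described above can be illustrated with a small sketch: count the node pairs that remain within a hop limit after a candidate critical set is deleted (hypothetical helper names; the thesis solves this with integer programming and branch-and-cut, not enumeration):

```python
from collections import deque
from itertools import combinations

def hop_distances(adj, src, removed):
    """BFS hop distances from src, skipping removed nodes."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        for nxt in adj.get(node, ()):
            if nxt in removed or nxt in dist:
                continue
            dist[nxt] = dist[node] + 1
            queue.append(nxt)
    return dist

def close_pairs(adj, removed, hop_limit):
    """Number of surviving node pairs still within hop_limit hops --
    the hop-based connectivity objective a critical-node set should
    drive down."""
    alive = [v for v in adj if v not in removed]
    count = 0
    for u, v in combinations(alive, 2):
        if hop_distances(adj, u, removed).get(v, hop_limit + 1) <= hop_limit:
            count += 1
    return count

# Path graph 0-1-2-3-4 with hop limit 2.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(close_pairs(adj, set(), 2))   # 7 pairs within 2 hops
print(close_pairs(adj, {2}, 2))     # deleting the centre node leaves 2
```

Unlike plain connected-component counting, this measure still penalises solutions that leave the graph connected but stretched, which is the limitation of traditional measures the thesis targets.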
BeWith: A Between-Within Method to Discover Relationships between Cancer Modules via Integrated Analysis of Mutual Exclusivity, Co-occurrence and Functional Interactions
The analysis of the mutational landscape of cancer, including mutual
exclusivity and co-occurrence of mutations, has been instrumental in studying
the disease. We hypothesized that exploring the interplay between
co-occurrence, mutual exclusivity, and functional interactions between genes
will further improve our understanding of the disease and help to uncover new
relations between cancer driving genes and pathways. To this end, we designed a
general framework, BeWith, for identifying modules with different combinations
of mutation and interaction patterns. We focused on three different settings of
the BeWith schema: (i) BeME-WithFun in which the relations between modules are
enriched with mutual exclusivity while genes within each module are
functionally related; (ii) BeME-WithCo which combines mutual exclusivity
between modules with co-occurrence within modules; and (iii) BeCo-WithMEFun
which ensures co-occurrence between modules while the within module relations
combine mutual exclusivity and functional interactions. We formulated the
BeWith framework using Integer Linear Programming (ILP), enabling us to find
optimally scoring sets of modules. Our results demonstrate the utility of
BeWith in providing novel information about mutational patterns, driver genes,
and pathways. In particular, BeME-WithFun helped identify functionally coherent
modules that might be relevant for cancer progression. In addition to finding
previously well-known drivers, the identified modules pointed to the importance
of the interaction between NCOR and NCOA3 in breast cancer. Additionally, an
application of the BeME-WithCo setting revealed that gene groups differ with
respect to their vulnerability to different mutagenic processes, and helped us
to uncover pairs of genes with potentially synergistic effects, including a
potential synergy between mutations in TP53 and the metastasis-related gene DCC.
Identification of critical combination of vulnerable links in transportation networks – a global optimisation approach
This paper presents a global optimisation framework for identifying the most critical combination of vulnerable links in a transportation network. The problem is formulated as a mixed-integer non-linear programme with equilibrium constraints, aiming to determine the combination of links whose deterioration would induce the greatest increase in total travel cost in the network. A global optimisation solution method applying a piecewise linearisation approach and a range-reduction technique is developed to solve the model. From the numerical results, it is interesting and counterintuitive to note that the set of most vulnerable links when simultaneous multiple-link failure occurs is not simply the combination of the most vulnerable links under single-link failure, and the links in the critical combination of vulnerable links are not necessarily connected or even in the neighbourhood of each other. The numerical results also show that the ranking of vulnerable links is significantly affected by certain input parameters.
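The counterintuitive finding above, that the worst pair of failed links need not be the two worst single links, can be reproduced on a toy network. The sketch below uses total shortest-path cost over OD pairs as a simplified stand-in for the paper's equilibrium-based total travel cost (hypothetical names; brute-force enumeration is only tractable on toy instances):

```python
import heapq
from itertools import combinations

def total_cost(edges, nodes, od_pairs, failed=frozenset()):
    """Sum of shortest-path costs over all OD pairs, a simplified
    proxy for equilibrium total travel cost."""
    adj = {v: [] for v in nodes}
    for (u, v), w in edges.items():
        if (u, v) in failed:
            continue
        adj[u].append((v, w))
        adj[v].append((u, w))

    def dijkstra(src):
        dist = {src: 0.0}
        heap = [(0.0, src)]
        while heap:
            d, node = heapq.heappop(heap)
            if d > dist[node]:
                continue
            for nxt, w in adj[node]:
                nd = d + w
                if nd < dist.get(nxt, float("inf")):
                    dist[nxt] = nd
                    heapq.heappush(heap, (nd, nxt))
        return dist

    return sum(dijkstra(o).get(d, float("inf")) for o, d in od_pairs)

def worst_pair(edges, nodes, od_pairs):
    """Enumerate all two-link failure combinations and return the one
    that maximises the increase in total cost."""
    base = total_cost(edges, nodes, od_pairs)
    return max(combinations(edges, 2),
               key=lambda pair: total_cost(edges, nodes, od_pairs,
                                           frozenset(pair)) - base)

# Three parallel routes from 0 to 4 with costs 2, 4 and 6.
nodes = range(5)
edges = {(0, 1): 1, (1, 4): 1, (0, 2): 2, (2, 4): 2, (0, 3): 3, (3, 4): 3}
od = [(0, 4)]
print(total_cost(edges, nodes, od))   # 2.0, via the cheapest route 0-1-4
print(worst_pair(edges, nodes, od))
```

On this instance the two most damaging single-link failures are (0, 1) and (1, 4), yet failing them together only forces traffic onto the second route; the worst pair instead knocks out one link from each of the two cheapest routes, mirroring the paper's observation.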