    A contribution to the evaluation and optimization of networks reliability

    Evaluating network reliability is a highly complex combinatorial problem that demands powerful computing resources, and efficient reliability computation is required in many sensitive networks. Despite the increasing power of computers and the proliferation of algorithms, finding good solutions quickly for large systems remains an open problem: the efficient computation techniques recognized in recent years as significant advances apply only to particular categories of networks, and more effort is still needed to arrive at a unified method giving exact solutions. Several methods have been proposed in the literature; some, notably minimal-set enumeration and factoring methods, have been implemented, while others have remained purely theoretical. This thesis addresses the evaluation and optimization of network reliability. Several problems are treated, including the development of a methodology for modeling networks with a view to evaluating their reliability; this methodology was validated on a wide-area radio communication network recently deployed to cover the needs of the whole province of Quebec. Several algorithms were also devised to generate the minimal paths and minimal cuts of a given network, an important contribution to the reliability evaluation and optimization process. These algorithms handled several test networks, as well as the provincial radio communication network, quickly and efficiently, and were subsequently used to evaluate reliability with a method based on binary decision diagrams. Several theoretical contributions also made it possible to establish an exact solution for the reliability of imperfect stochastic networks, in which both edges and nodes are subject to failure, within the framework of factoring methods. From this research, several tools were implemented to evaluate and optimize network reliability; the results clearly show a significant gain in execution time and memory usage compared with many other implementations.
    Keywords: reliability, networks, optimization, binary decision diagrams, minimal path and cut sets, algorithms, Birnbaum index, radio-telecommunication systems, programs.
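
    As a rough illustration of how minimal path sets feed a reliability computation, the sketch below enumerates the minimal paths of a small hypothetical network and combines them by inclusion-exclusion under independent edge failures. It is not the thesis's implementation: node failures, the factoring theorem, and the binary-decision-diagram evaluation are omitted, and the network, edge availabilities, and function names are invented for the example.

```python
# A toy sketch, not the thesis's implementation: two-terminal reliability of a
# small hypothetical network with independent, imperfect edges, computed by
# enumerating minimal path sets and applying inclusion-exclusion.
from itertools import combinations

import networkx as nx


def minimal_paths(graph, source, target):
    """All simple s-t paths, each returned as a frozenset of edges (a minimal path set)."""
    return [frozenset(zip(p, p[1:]))
            for p in nx.all_simple_paths(graph, source, target)]


def two_terminal_reliability(graph, source, target, edge_availability):
    """P(at least one minimal path has all of its edges working), by inclusion-exclusion."""
    paths = minimal_paths(graph, source, target)
    reliability = 0.0
    for k in range(1, len(paths) + 1):
        for subset in combinations(paths, k):
            union = frozenset().union(*subset)          # edges that must all be up
            prob = 1.0
            for edge in union:
                prob *= edge_availability[edge]
            reliability += (-1) ** (k + 1) * prob
    return reliability


# Hypothetical 4-node bridge network, every edge available with probability 0.9.
G = nx.DiGraph([("s", "a"), ("s", "b"), ("a", "b"), ("a", "t"), ("b", "t")])
availability = {edge: 0.9 for edge in G.edges()}
print(two_terminal_reliability(G, "s", "t", availability))
```

    Inclusion-exclusion grows exponentially with the number of minimal paths, which is one reason the thesis turns to binary decision diagrams for larger networks.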

    Chance-Constrained Outage Scheduling using a Machine Learning Proxy

    Outage scheduling aims at defining, over a horizon of several months to years, when different components needing maintenance should be taken out of operation. Its objective is to minimize operation-cost expectation while satisfying reliability-related constraints. We propose a distributed scenario-based chance-constrained optimization formulation for this problem. To tackle tractability issues arising in large networks, we use machine learning to build a proxy for predicting outcomes of power system operation processes in this context. On the IEEE-RTS79 and IEEE-RTS96 networks, our solution obtains cheaper and more reliable plans than other candidates.
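
    A minimal sketch of the scenario-based chance-constrained selection step described above, assuming a toy stand-in for the machine-learning proxy: for each candidate outage schedule, the proxy predicts cost and feasibility per scenario, and the cheapest schedule whose predicted violation frequency stays below the chance-constraint level is retained. The proxy, cost model, and data below are hypothetical, not the paper's.

```python
# A toy sketch under stated assumptions, not the paper's formulation or code:
# scenario-based chance-constrained selection of an outage schedule, where a
# hypothetical "proxy" stands in for the machine-learning model that predicts,
# per scenario, the operating cost and whether reliability limits are respected.
import numpy as np

rng = np.random.default_rng(0)


def proxy(schedule, scenario):
    """Hypothetical ML proxy: (predicted cost, reliability-feasible?) for one scenario."""
    capacity_out = schedule.sum()                             # total capacity on outage
    cost = 10.0 * scenario["load"] + 2.0 * capacity_out       # toy cost model
    feasible = scenario["load"] + capacity_out < 1.5          # toy reliability check
    return cost, feasible


def pick_schedule(candidates, scenarios, epsilon=0.05):
    """Cheapest candidate whose predicted violation frequency is at most epsilon."""
    best, best_cost = None, np.inf
    for schedule in candidates:
        results = [proxy(schedule, s) for s in scenarios]
        expected_cost = float(np.mean([c for c, _ in results]))
        violation_freq = float(np.mean([not ok for _, ok in results]))
        if violation_freq <= epsilon and expected_cost < best_cost:
            best, best_cost = schedule, expected_cost
    return best, best_cost


candidate_schedules = [np.array(s) for s in ([0.1, 0.2], [0.3, 0.1], [0.4, 0.4])]
load_scenarios = [{"load": rng.uniform(0.5, 1.2)} for _ in range(200)]
print(pick_schedule(candidate_schedules, load_scenarios))
```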

    High-Dimensional Stochastic Design Optimization by Adaptive-Sparse Polynomial Dimensional Decomposition

    This paper presents a novel adaptive-sparse polynomial dimensional decomposition (PDD) method for stochastic design optimization of complex systems. The method entails an adaptive-sparse PDD approximation of a high-dimensional stochastic response for statistical moment and reliability analyses; a novel integration of the adaptive-sparse PDD approximation and score functions for estimating the first-order design sensitivities of the statistical moments and failure probability; and standard gradient-based optimization algorithms. New analytical formulae are presented for the design sensitivities that are simultaneously determined along with the moments or the failure probability. Numerical results stemming from mathematical functions indicate that the new method provides more computationally efficient design solutions than the existing methods. Finally, stochastic shape optimization of a jet engine bracket with 79 variables was performed, demonstrating the power of the new method to tackle practical engineering problems.
    Comment: 18 pages, 2 figures, to appear in Sparse Grids and Applications -- Stuttgart 2014, Lecture Notes in Computational Science and Engineering 109, edited by J. Garcke and D. Pflüger, Springer International Publishing, 201
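
    The score-function ingredient mentioned above can be pictured with a one-dimensional Monte Carlo toy (this is not the paper's adaptive-sparse PDD machinery): the sensitivity of a moment with respect to a design variable that parameterizes the input density equals the expectation of the response times the derivative of the log-density, so moment and sensitivity are estimated from the same samples. The response function and distribution below are invented for the example.

```python
# A toy sketch of the score-function idea only, not the paper's adaptive-sparse
# PDD machinery: for X ~ N(mu, sigma^2), the design sensitivity of a moment is
#   d/d(mu) E[y(X)] = E[ y(X) * d/d(mu) log f(X; mu) ],
# so the moment and its sensitivity are estimated from the same Monte Carlo samples.
import numpy as np


def response(x):
    """Toy performance function standing in for the high-dimensional response."""
    return x ** 2 + 0.5 * x


def moment_and_sensitivity(mu, sigma=1.0, n_samples=200_000, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, n_samples)
    y = response(x)
    score = (x - mu) / sigma ** 2            # d/d(mu) of log N(x; mu, sigma^2)
    return y.mean(), (y * score).mean()      # estimates of E[y] and dE[y]/d(mu)


mean, dmean_dmu = moment_and_sensitivity(mu=0.3)
# Analytic values for this toy case: mu^2 + sigma^2 + 0.5*mu and 2*mu + 0.5.
print(mean, dmean_dmu)
```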

    Quantile-based optimization under uncertainties using adaptive Kriging surrogate models

    Uncertainties are inherent to real-world systems. Taking them into account is crucial in industrial design problems and this might be achieved through reliability-based design optimization (RBDO) techniques. In this paper, we propose a quantile-based approach to solve RBDO problems. We first transform the safety constraints usually formulated as admissible probabilities of failure into constraints on quantiles of the performance criteria. In this formulation, the quantile level controls the degree of conservatism of the design. Starting with the premise that industrial applications often involve high-fidelity and time-consuming computational models, the proposed approach makes use of Kriging surrogate models (a.k.a. Gaussian process modeling). Thanks to the Kriging variance (a measure of the local accuracy of the surrogate), we derive a procedure with two stages of enrichment of the design of computer experiments (DoE) used to construct the surrogate model. The first stage globally reduces the Kriging epistemic uncertainty and adds points in the vicinity of the limit-state surfaces describing the system performance to be attained. The second stage locally checks, and if necessary, improves the accuracy of the quantiles estimated along the optimization iterations. Applications to three analytical examples and to the optimal design of a car body subsystem (minimal mass under mechanical safety constraints) show the accuracy and the remarkable efficiency brought by the proposed procedure.
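
    A small sketch of the quantile reformulation, under stated assumptions rather than the paper's implementation: a Kriging (Gaussian-process) surrogate is fit on a small design of experiments, Monte Carlo samples are pushed through the surrogate, and the admissible failure probability is checked as a quantile constraint on the predicted performance. The performance function, failure-probability level, and sample sizes are hypothetical, and the two-stage enrichment of the DoE is omitted.

```python
# A toy sketch under stated assumptions, not the paper's implementation: the
# admissible failure probability P[g(X, d) <= 0] <= p_f is recast as a quantile
# constraint q_{p_f}(g) >= 0, estimated by Monte Carlo on a Kriging surrogate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF


def g(x):
    """Expensive performance function (negative values mean failure) -- toy stand-in."""
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]


rng = np.random.default_rng(2)
X_doe = rng.normal(size=(30, 2))                     # small design of experiments
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X_doe, g(X_doe))

X_mc = rng.normal(size=(50_000, 2))                  # Monte Carlo sample of the inputs
g_hat, g_std = gp.predict(X_mc, return_std=True)

p_f = 0.05                                           # admissible failure probability
q = np.quantile(g_hat, p_f)                          # p_f-quantile of the performance
print("quantile:", q, "constraint satisfied:", q >= 0.0)
print("mean Kriging std:", g_std.mean())
```

    The Kriging standard deviation printed at the end is the surrogate-uncertainty measure that the two enrichment stages described above use to decide where to add new runs of the expensive model.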

    Inexact Convex Relaxations for AC Optimal Power Flow: Towards AC Feasibility

    Convex relaxations of AC optimal power flow (AC-OPF) problems have attracted significant interest as in several instances they provably yield the global optimum to the original non-convex problem. If, however, the relaxation is inexact, the obtained solution is not AC-feasible. The quality of the obtained solution is essential for several practical applications of AC-OPF, but detailed analyses are lacking in existing literature. This paper aims to cover this gap. We provide an in-depth investigation of the solution characteristics when convex relaxations are inexact, we assess the most promising AC feasibility recovery methods for large-scale systems, and we propose two new metrics that lead to a better understanding of the quality of the identified solutions. We perform a comprehensive assessment on 96 different test cases, ranging from 14 to 3120 buses, and we show the following: (i) Despite an optimality gap of less than 1%, several test cases still exhibit substantial distances to both AC feasibility and local optimality and the newly proposed metrics characterize these deviations. (ii) Penalization methods fail to recover an AC-feasible solution in 15 out of 45 cases, and using the proposed metrics, we show that most failed test instances exhibit substantial distances to both AC-feasibility and local optimality. For failed test instances with small distances, we show how our proposed metrics inform a fine-tuning of penalty weights to obtain AC-feasible solutions. (iii) The computational benefits of warm-starting non-convex solvers have significant variation, but a computational speedup exists in over 75% of the cases.
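
    One of the quantities discussed above, the distance of a relaxation solution to AC feasibility, can be pictured with a toy check (not the paper's metrics or test cases): plug the recovered bus voltages into the AC power-flow equations and measure how far the implied injections are from the scheduled ones. The 3-bus admittance matrix and candidate solution below are invented for the example.

```python
# A toy sketch, not the paper's metrics or test cases: gauge how far a relaxation
# solution is from AC feasibility by plugging its bus voltages into the AC
# power-flow equations, S_i = V_i * conj(sum_j Y_ij V_j), and measuring the
# mismatch with the injections the solution claims to schedule.
import numpy as np

# Bus admittance matrix of a hypothetical 3-bus network (per unit).
Y = np.array([[10 - 30j, -5 + 15j, -5 + 15j],
              [-5 + 15j, 10 - 30j, -5 + 15j],
              [-5 + 15j, -5 + 15j, 10 - 30j]])

# Candidate solution recovered from an (inexact) relaxation: complex bus voltages
# and the net injections it claims to realize.
V = np.array([1.02, 0.99 * np.exp(-1j * 0.03), 1.00 * np.exp(-1j * 0.05)])
S_scheduled = np.array([0.80 + 0.30j, -0.50 - 0.20j, -0.30 - 0.10j])

S_implied = V * np.conj(Y @ V)        # injections implied by the candidate voltages
mismatch = S_implied - S_scheduled
print("max |power mismatch| (a distance-to-AC-feasibility proxy):", np.abs(mismatch).max())
```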