    Chance-Constrained Outage Scheduling using a Machine Learning Proxy

    Outage scheduling aims at defining, over a horizon of several months to years, when different components needing maintenance should be taken out of operation. Its objective is to minimize the expected operating cost while satisfying reliability-related constraints. We propose a distributed, scenario-based, chance-constrained optimization formulation for this problem. To tackle the tractability issues arising in large networks, we use machine learning to build a proxy that predicts the outcomes of power system operation processes in this context. On the IEEE-RTS79 and IEEE-RTS96 networks, our solution obtains cheaper and more reliable plans than the other candidates.
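    As a minimal sketch of the scenario-approximation idea behind a chance constraint (not the paper's actual pipeline), a candidate plan can be accepted when its empirical violation frequency over sampled scenarios stays below a tolerance. Here `plan`, `scenarios`, `proxy`, and `epsilon` are illustrative names, with the proxy standing in for a learned predictor of operation outcomes:

```python
import numpy as np

def empirical_chance_check(plan, scenarios, proxy, epsilon=0.05):
    """Scenario approximation of a chance constraint (illustrative).

    proxy(plan, scenario) -> True if the predicted operation outcome
    violates a reliability constraint; it stands in for running a full
    power-system operation simulation on that scenario.
    """
    violation_rate = np.mean([proxy(plan, s) for s in scenarios])
    return violation_rate <= epsilon  # constraint holds empirically
```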

    Optimal staffing under an annualized hours regime using Cross-Entropy optimization

    This paper discusses staffing under annualized hours. Staffing is the selection of the most cost-efficient workforce to cover workforce demand. Annualized hours measure working time per year instead of per week, relaxing the restriction that employees work the same number of hours every week. To solve the underlying combinatorial optimization problem, this paper develops a Cross-Entropy optimization implementation that includes a penalty function and a repair function to guarantee feasible solutions. Our experimental results show that Cross-Entropy optimization is efficient across a broad range of instances, solving real-life-sized instances in seconds and thereby significantly outperforming an MILP formulation solved with CPLEX. In addition, the solution quality of Cross-Entropy closely approaches the optimal solutions obtained by CPLEX. Our Cross-Entropy implementation is thus well suited to real-time decision making, for example in response to unexpected staff illnesses, and to scenario analysis.
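    The abstract does not detail the implementation, but the generic Cross-Entropy method it builds on is standard: sample candidate solutions from a parameterized distribution, rank them by (penalized) cost, and shift the distribution toward the elite fraction. A minimal sketch for binary decision vectors, with all names and parameter values illustrative; `cost_fn` would incorporate the penalty described in the abstract:

```python
import numpy as np

def cross_entropy_binary(cost_fn, n_vars, n_samples=200, elite_frac=0.1,
                         alpha=0.7, iters=50, seed=0):
    """Generic Cross-Entropy loop over binary vectors (illustrative)."""
    rng = np.random.default_rng(seed)
    p = np.full(n_vars, 0.5)              # Bernoulli sampling probabilities
    n_elite = max(1, int(elite_frac * n_samples))
    best_x, best_c = None, np.inf
    for _ in range(iters):
        X = (rng.random((n_samples, n_vars)) < p).astype(int)
        costs = np.array([cost_fn(x) for x in X])
        if costs.min() < best_c:          # track the incumbent solution
            best_c, best_x = costs.min(), X[costs.argmin()].copy()
        elite = X[np.argsort(costs)[:n_elite]]
        p = alpha * elite.mean(axis=0) + (1 - alpha) * p  # smoothed update
    return best_x, best_c
```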

    Optimal design of water distribution systems based on entropy and topology

    A new multi-objective evolutionary optimization approach for joint topology and pipe size design of water distribution systems is presented. The proposed algorithm simultaneously considers the adequacy of flow and pressure at the demand nodes; the initial construction cost; the network topology; and a measure of hydraulic capacity reliability. The optimization procedure is based on a general measure of hydraulic performance that combines statistical entropy, network connectivity and hydraulic feasibility. The topological properties of the solutions are accounted for, and arbitrary assumptions regarding the quality of infeasible solutions are not applied. In other words, both feasible and infeasible solutions participate in the evolutionary processes; solutions survive and reproduce or perish strictly according to their Pareto-optimality. Removing artificial barriers in this way frees the algorithm to evolve optimal solutions quickly. Furthermore, any redundant binary codes that result from crossover or mutation are eliminated gradually in a seamless and generic way that avoids the arbitrary loss of potentially useful genetic material and preserves the quality of the information that is transmitted from one generation to the next. The approach proposed is entirely generic: we have not introduced any additional parameters that require calibration on a case-by-case basis. Detailed and extensive results for two test problems are included that suggest the approach is highly effective. In general, the frontier-optimal solutions achieved include topologies that are fully branched, partially- and fully-looped and, for networks with multiple sources, completely separate sub-networks.
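    The paper defines its own hydraulic performance measure; as a rough illustration of the entropy component alone, Shannon entropy rewards designs that spread flow evenly, a common surrogate for redundancy in entropy-based water distribution system design. The helper below is hypothetical and not the authors' formulation:

```python
import numpy as np

def flow_entropy(flows):
    """Shannon entropy of a set of pipe flows (illustrative surrogate)."""
    q = np.asarray(flows, dtype=float)
    q = q[q > 0]                  # zero flows contribute nothing
    frac = q / q.sum()            # flow fractions
    return -np.sum(frac * np.log(frac))
```

    For instance, flow_entropy([30, 30, 40]) exceeds flow_entropy([5, 5, 90]): the more even split carries more entropy, reflecting greater tolerance to the failure of any single path.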

    Algorithms for the continuous nonlinear resource allocation problem---new implementations and numerical studies

    Patriksson (2008) provided a then up-to-date survey of the continuous, separable, differentiable and convex resource allocation problem with a single resource constraint. Since the publication of that paper, interest in the problem has grown: several new applications have arisen in which the problem constitutes a subproblem, and several new algorithms have been developed for its efficient solution. This paper therefore serves three purposes. First, it provides an up-to-date extension of the survey of the literature in the field, complementing Patriksson (2008) with more than 20 books and articles. Second, it contributes improvements to some of these algorithms, in particular an improved pegging (that is, variable fixing) process in the relaxation algorithm and an improved means of evaluating subsolutions. Third, it numerically evaluates several relaxation (primal) and breakpoint (dual) algorithms, incorporating a variety of pegging strategies, as well as a quasi-Newton method. Our conclusion is that our modification of the relaxation algorithm performs best. At least for problem sizes up to 30 million variables, the practical time complexity of the breakpoint and relaxation algorithms is linear.
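    As a concrete instance of the problem class, consider quadratic objectives under a single resource constraint with bounds. The sketch below recovers the optimal allocation by bisection on the dual multiplier, in the spirit of the breakpoint (dual) algorithms the paper evaluates; it illustrates the problem structure and is not the paper's improved pegging implementation:

```python
import numpy as np

def quad_alloc(c, d, l, u, b, tol=1e-10):
    """Solve  min  sum_i 0.5*c_i*x_i^2 - d_i*x_i
       s.t.   sum_i x_i = b,  l_i <= x_i <= u_i,  c_i > 0.

    KKT conditions give x_i(lmb) = clip((d_i - lmb)/c_i, l_i, u_i),
    which is nonincreasing in lmb, so the multiplier solving
    sum_i x_i(lmb) = b can be found by bisection (assumes the
    problem is feasible, i.e. sum(l) <= b <= sum(u))."""
    c, d, l, u = (np.asarray(v, dtype=float) for v in (c, d, l, u))
    x = lambda lmb: np.clip((d - lmb) / c, l, u)
    lo, hi = float((d - c * u).min()), float((d - c * l).max())
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if x(mid).sum() > b:
            lo = mid              # total too large: raise the multiplier
        else:
            hi = mid
    return x(0.5 * (lo + hi))
```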

    Multiscale Information Decomposition: Exact Computation for Multivariate Gaussian Processes

    Exploiting the theory of state space models, we derive exact expressions for the information transfer, as well as the redundant and synergistic transfer, for coupled Gaussian processes observed at multiple temporal scales. All of the terms constituting the frameworks known as interaction information decomposition and partial information decomposition can thus be obtained analytically, at different time scales, from the parameters of the VAR model that fits the processes. We first apply the proposed methodology to benchmark Gaussian systems, showing that this class of systems may generate patterns of information decomposition characterized by predominantly redundant or synergistic information transfer persisting across multiple time scales, or even by an alternating prevalence of redundant and synergistic source interaction depending on the time scale. We then apply our method to an important topic in neuroscience, i.e., the detection of causal interactions in human epilepsy networks, for which we show the relevance of partial information decomposition to detecting multiscale information transfer spreading from the seizure onset zone.
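    The paper's exact multiscale expressions are derived from state-space/VAR parameters; as a minimal single-scale illustration, the interaction information of jointly Gaussian variables can be computed directly from their covariance matrix. The function names and index lists below are illustrative:

```python
import numpy as np

def gaussian_mi(cov, ix, iy):
    """Mutual information between two blocks of a jointly Gaussian
    vector, from its covariance matrix: I = 0.5*ln(|Sxx||Syy|/|Sxy|).
    ix and iy are lists of variable indices."""
    S = np.asarray(cov)
    det = lambda idx: np.linalg.det(S[np.ix_(idx, idx)])
    return 0.5 * np.log(det(ix) * det(iy) / det(ix + iy))

def interaction_information(cov, s1, s2, t):
    """II = I(S1,S2;T) - I(S1;T) - I(S2;T); under one common sign
    convention, II > 0 indicates net synergy, II < 0 net redundancy."""
    return (gaussian_mi(cov, s1 + s2, t)
            - gaussian_mi(cov, s1, t)
            - gaussian_mi(cov, s2, t))
```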