
    Optimal Design and Operation of Heat Exchanger Network

    Heat exchanger networks (HENs) are the backbone of heat integration because of their role in energy and environmental management. This thesis addresses two issues concerning HENs. The first is the design of an economically optimal heat exchanger network (HEN); the second is the optimal operation of a HEN in the presence of uncertainties and disturbances within the network. For the first issue, a pinch-technology-based optimal HEN design is first implemented on a 3-stream heat recovery case study to design a simple HEN; a more complex HEN is then designed for a coal-fired power plant retrofitted with a CO2 capture unit, with the objective of minimising the energy penalty that integration with the capture plant imposes on the power plant. The benchmark for this case study is the stream data of Khalilpour and Abbas (2011). Improvements over their work include: (1) the use of economic data to evaluate achievable trade-offs between energy, capital and utility costs when determining the minimum temperature difference; (2) redesign of the HEN based on the new minimum temperature difference; and (3) comparison with the base-case design. The results show that the energy burden imposed on the power plant by CO2 capture is significantly reduced through the HEN, maximising utility cost savings. The cost of adding the HEN is recovered within a short payback period of about 2.8 years. For the second issue, optimal HEN operation is considered under a range of uncertainties and disturbances in flowrates and inlet stream temperatures, minimising utility consumption at constant target temperatures through a self-optimising control (SOC) strategy. The new SOC method developed in this thesis is data-driven: it uses process data collected over time during plant operation to select controlled variables (CVs). This contrasts with existing SOC strategies, in which CV selection requires the process model to be linearised for nonlinear processes, leading to unaccounted losses from linearisation errors. The new approach selects CVs such that the necessary condition of optimality (NCO) is directly approximated by the CV through a single regression step. This work was inspired by the regression-based, globally optimal CV selection of Ye et al. (2013), which requires no model linearisation, and the two-step regression-based data-driven CV selection of Ye et al. (2012), whose optimality suffers from regression errors accumulated across the two steps. An advantage of this work is that it does not require the evaluation of derivatives, so CVs can be obtained even with commercial simulators such as HYSYS and UNISIM, among others. The effectiveness of the proposed method is demonstrated on the 3-stream HEN case study and on the HEN for the coal-fired power plant with CO2 capture. The case studies show that the proposed methodology provides better optimal operation under uncertainty than existing model-based SOC techniques.
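    The core of the data-driven SOC idea, selecting a CV whose value directly approximates the NCO via one regression step, can be illustrated in a few lines. The sketch below is a minimal, hypothetical illustration on synthetic data; the variable names and the linear stand-in for the NCO are assumptions, not the thesis's actual plant model.

```python
import numpy as np

# Synthetic "plant data": rows are operating points collected over time.
# Y holds measurements (e.g. temperatures, flowrates); G holds the
# corresponding NCO values (cost gradient w.r.t. the input), generated
# here from an assumed linear relation purely for illustration.
rng = np.random.default_rng(0)
n_points, n_meas = 500, 4
Y = rng.normal(size=(n_points, n_meas))
true_h = np.array([1.0, -0.5, 0.2, 0.8])
G = Y @ true_h + 0.05 * rng.normal(size=n_points)

# Single regression step: choose weights h so that the CV c = h^T y
# approximates the NCO directly.  Holding c at zero then drives the
# plant (approximately) to its optimum -- no derivative evaluations and
# no model linearisation, so any simulator that can supply samples of
# Y and G could be used.
h, *_ = np.linalg.lstsq(Y, G, rcond=None)
c = Y @ h
print("CV weights:", h)
print("mean |c - NCO| approximation error:", np.abs(c - G).mean())
```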

    Solving the G-problems in less than 500 iterations: Improved efficient constrained optimization by surrogate modeling and adaptive parameter control

    Constrained optimization of high-dimensional numerical problems plays an important role in many scientific and industrial applications. In many industrial applications, function evaluations are severely limited and no analytical information about the objective and constraint functions is available. For such expensive black-box optimization tasks, the constrained optimization algorithm COBRA was proposed, which uses RBF surrogate models for both the objective and the constraint functions. COBRA has shown remarkable success in reliably solving complex benchmark problems in fewer than 500 function evaluations. Unfortunately, COBRA requires careful parameter adjustment in order to do so. In this work we present a new self-adjusting algorithm, SACOBRA, which is based on COBRA and capable of achieving high-quality results with very few function evaluations and no parameter tuning. Using performance profiles on a set of benchmark problems (G-problems, MOPTA08), we show that SACOBRA consistently outperforms any COBRA variant with a fixed parameter setting. We analyze the importance of the several new elements in SACOBRA and find that each of them contributes to the overall optimization performance. We discuss the reasons behind this and thereby gain a better understanding of high-quality RBF surrogate modeling.
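    COBRA's central loop, fitting cheap RBF surrogates to the expensive objective and constraints, optimizing the surrogate problem, then spending one true evaluation at the proposed point, can be sketched compactly. The following is a minimal toy illustration using SciPy's RBF interpolator; the kernel choice, budget, and test functions are assumptions, and none of SACOBRA's self-adjusting machinery is included.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize, NonlinearConstraint

# Toy expensive black-box problem (stand-ins for the G-problems).
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2   # objective
g = lambda x: x[0] + x[1] - 1.0                       # constraint g(x) <= 0

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(8, 2))                   # initial design
F = np.array([f(x) for x in X])
G = np.array([g(x) for x in X])

for _ in range(20):                                   # tiny evaluation budget
    # Cheap RBF surrogates of objective and constraint; a small smoothing
    # term guards against near-duplicate sample points late in the run.
    sf = RBFInterpolator(X, F, kernel="cubic", smoothing=1e-8)
    sg = RBFInterpolator(X, G, kernel="cubic", smoothing=1e-8)
    con = NonlinearConstraint(lambda x: sg(x[None, :])[0], -np.inf, 0.0)
    res = minimize(lambda x: sf(x[None, :])[0], X[np.argmin(F)],
                   method="SLSQP", bounds=[(-2, 2)] * 2, constraints=[con])
    # One true (expensive) evaluation at the surrogate optimum.
    X = np.vstack([X, res.x])
    F = np.append(F, f(res.x))
    G = np.append(G, g(res.x))

print("best feasible value found:", F[G <= 1e-6].min())
```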

    A goal programming methodology for multiobjective optimization of distributed energy hubs operation

    This paper addresses the problem of optimal energy flow management in multicarrier energy networks in the presence of interconnected energy hubs. The overall problem is formalized as a nonlinear constrained multiobjective optimization problem and solved by a goal-attainment-based methodology. This solution approach allows the analyst to identify the optimal operating state of the distributed energy hubs, ensuring effective and reliable operation of the multicarrier energy network despite large variations in load demands and energy prices. Simulation results obtained on the IEEE 30-bus test network are presented and discussed to demonstrate the significance and validity of the proposed method.
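    The goal-attainment formulation reduces a multiobjective problem to a single scalar program: minimize an attainment factor gamma subject to f_i(x) - w_i * gamma <= goal_i. A minimal SciPy sketch of that reduction follows; the two hub objectives, goals, and weights are invented for illustration and are not the paper's network model.

```python
import numpy as np
from scipy.optimize import minimize

# Two illustrative hub objectives of a dispatch vector x (stand-ins):
f1 = lambda x: x[0] ** 2 + 2.0 * x[1]          # e.g. energy cost
f2 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2   # e.g. emissions

goals = np.array([1.0, 0.5])   # analyst-chosen goals for f1, f2
w = np.array([1.0, 1.0])       # weights on under-/over-attainment

# Goal attainment: minimise gamma subject to f_i(x) - w_i*gamma <= goal_i,
# so gamma measures the worst weighted deviation from the goals.
# Decision vector z = [x0, x1, gamma].
cons = [
    {"type": "ineq", "fun": lambda z: goals[0] + w[0] * z[2] - f1(z[:2])},
    {"type": "ineq", "fun": lambda z: goals[1] + w[1] * z[2] - f2(z[:2])},
]
res = minimize(lambda z: z[2], x0=[0.5, 0.5, 1.0],
               method="SLSQP", constraints=cons)
print("dispatch:", res.x[:2], "attainment factor:", res.x[2])
```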

    Optimal Networks from Error Correcting Codes

    To address growth challenges facing large data centers and supercomputing clusters, a new construction is presented for scalable, high-throughput, low-latency networks. The resulting networks require 1.5-5 times fewer switches and 2-6 times fewer cables, and have 1.2-2 times lower latency, with correspondingly lower congestion and packet losses, than the best present or proposed networks providing the same number of ports at the same total bisection. These advantage ratios increase with network size. The key new ingredient is the exact equivalence discovered between the problem of maximizing network bisection for large classes of practically interesting Cayley graphs and the problem of maximizing codeword distance for linear error-correcting codes. The resulting translation recipe converts existing optimal error-correcting codes into optimal-throughput networks.
    Comment: 14 pages, accepted at the ANCS 2013 conference
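    The translation rests on the minimum distance of a linear code, the quantity the paper shows maps to network bisection for the Cayley-graph families considered. As a small, self-contained illustration (not the paper's construction), the brute-force check below computes the minimum distance of the binary [7,4] Hamming code from its generator matrix:

```python
import itertools
import numpy as np

# Generator matrix G = [I | A] of the binary [7,4] Hamming code.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=int)

k, n = G.shape
# For a linear code, minimum distance = minimum Hamming weight over all
# nonzero codewords; brute force is fine for small k (2^k codewords).
d_min = min(int(((np.array(m) @ G) % 2).sum())
            for m in itertools.product([0, 1], repeat=k)
            if any(m))
print(f"[{n},{k}] code, minimum distance d = {d_min}")   # prints d = 3
```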

    Data-Driven Robust Optimization

    The last decade witnessed an explosion in the availability of data for operations research applications. Motivated by this growing availability, we propose a novel schema for utilizing data to design uncertainty sets for robust optimization using statistical hypothesis tests. The approach is flexible and widely applicable, and robust optimization problems built from our new sets are computationally tractable, both theoretically and practically. Furthermore, optimal solutions to these problems enjoy a strong, finite-sample probabilistic guarantee. We describe concrete procedures for choosing an appropriate set for a given application and for applying our approach to multiple uncertain constraints. Computational evidence in portfolio management and queuing confirms that our data-driven sets significantly outperform traditional robust optimization techniques whenever data is available.
    Comment: 38 pages, 15-page appendix, 7 figures. This version updated as of Oct. 201
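    To make the idea concrete, here is a generic sketch of building an ellipsoidal uncertainty set from data and evaluating the resulting worst-case (robust) value of a linear constraint. It uses a standard chi-squared confidence ellipsoid rather than the paper's specific hypothesis-test constructions, and all numbers are synthetic.

```python
import numpy as np
from scipy import stats

# Hypothetical setting: an uncertain vector a enters a constraint a^T x <= b.
# Build an ellipsoidal set for a from i.i.d. samples (generic construction).
rng = np.random.default_rng(2)
A = rng.multivariate_normal([0.05, 0.08],
                            [[1e-3, 2e-4], [2e-4, 2e-3]], size=200)

a_bar = A.mean(axis=0)                      # empirical center
Sigma = np.cov(A, rowvar=False)             # empirical covariance
eps = 0.05                                  # target violation probability
kappa = np.sqrt(stats.chi2.ppf(1 - eps, df=A.shape[1]))

L = np.linalg.cholesky(Sigma)               # Sigma = L @ L.T

def robust_lhs(x):
    # sup of a^T x over the ellipsoid {a_bar + kappa * L u : ||u|| <= 1}
    # equals a_bar^T x + kappa * ||L^T x||  (a second-order-cone term).
    return a_bar @ x + kappa * np.linalg.norm(L.T @ x)

x = np.array([0.6, 0.4])
print("nominal a^T x :", a_bar @ x)
print("robust worst case:", robust_lhs(x))
```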

    Optimizing I/O for Big Array Analytics

    Big array analytics is becoming indispensable in answering important scientific and business questions. Most analysis tasks consist of multiple steps, each making one or more passes over the arrays to be analyzed and generating intermediate results. In the big data setting, I/O optimization is key to efficient analytics. In this paper, we develop a framework and techniques for capturing a broad range of analysis tasks expressible in nested-loop form, representing them in a declarative way, and optimizing their I/O by identifying sharing opportunities. Experimental results show that our optimizer is capable of finding execution plans that exploit nontrivial I/O sharing opportunities, with significant savings.
    Comment: VLDB201
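    The kind of I/O sharing such an optimizer targets can be seen in miniature: two aggregates that would each scan a chunked array can be fused into a single pass. The sketch below is an assumed toy, not the paper's framework; each full iteration over `chunks()` stands in for one expensive scan of an out-of-core array.

```python
import numpy as np

# Toy stand-in for an out-of-core array: an iterator of chunks, where one
# full iteration represents one expensive pass over the data on disk.
def chunks(n_chunks=100, chunk=10_000, seed=3):
    rng = np.random.default_rng(seed)
    for _ in range(n_chunks):
        yield rng.normal(size=chunk)

# Unshared plan: each task makes its own full pass over the array.
s1 = n1 = 0.0
for c in chunks():                 # pass 1: mean
    s1 += c.sum(); n1 += c.size
s2 = 0.0
for c in chunks():                 # pass 2: mean of squares
    s2 += (c * c).sum()
mean_two_pass, msq_two_pass = s1 / n1, s2 / n1

# Shared plan: the optimizer fuses both aggregates into a single pass,
# halving the number of scans (and thus the I/O) for identical results.
s = sq = n = 0.0
for c in chunks():
    s += c.sum(); sq += (c * c).sum(); n += c.size

print(mean_two_pass == s / n, msq_two_pass == sq / n)   # True True
```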