
    A Comparison of Archiving Strategies for Characterization of Nearly Optimal Solutions under Multi-Objective Optimization

    In a multi-objective optimization problem, in addition to optimal solutions, multimodal and/or nearly optimal alternatives can also provide useful information for the decision maker. However, obtaining all nearly optimal solutions entails an excessive number of alternatives. Therefore, when considering nearly optimal solutions, it is convenient to obtain a reduced set that focuses on the potentially useful alternatives: solutions that are close to the optimal solutions in objective space but differ significantly in decision space. To characterize this set, it is essential to analyze the decision and objective spaces simultaneously. One of the crucial points in an evolutionary multi-objective optimization algorithm is the archiving strategy, which is in charge of keeping the solution set, called the archive, updated during the optimization process. The motivation of this work is to analyze the three existing archiving strategies proposed in the literature (ArchiveUpdateP(Q,epsilon)D(xy), Archive_nevMOGA, and targetSelect) that aim to characterize the potentially useful solutions. The archivers are evaluated on two benchmarks and on a real engineering example. The contribution clearly shows the main differences between the three archivers. This analysis is useful for the design of evolutionary algorithms that consider nearly optimal solutions.

    This work was supported in part by the Ministerio de Ciencia, Innovación y Universidades (Spain) (grant number RTI2018-096904-B-I00), by the Generalitat Valenciana regional government through project AICO/2019/055, and by the Universitat Politècnica de València (grant number SP20200109).

    Pajares-Ferrando, A.; Blasco, X.; Herrero Durá, JM.; Martínez Iranzo, MA. (2021). A Comparison of Archiving Strategies for Characterization of Nearly Optimal Solutions under Multi-Objective Optimization. Mathematics, 9(9):1-28. https://doi.org/10.3390/math9090999
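    The common thread of such archivers is that a candidate is kept only if it is nearly optimal in objective space and sufficiently distinct in decision space from what is already stored. The sketch below illustrates that acceptance rule; it is a minimal, generic archive update written for this summary, not a reimplementation of ArchiveUpdateP(Q,epsilon)D(xy), Archive_nevMOGA, or targetSelect, and the epsilon and distance thresholds are illustrative assumptions.

```python
import numpy as np

def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimization)."""
    return np.all(f_a <= f_b) and np.any(f_a < f_b)

def update_archive(archive, candidate, eps=0.1, min_dist=0.5):
    """Generic nearly-optimal archive update (illustrative, not one of the three
    archivers compared in the paper). 'archive' is a list of (x, f) pairs of
    decision and objective vectors (NumPy arrays); 'candidate' is one such pair."""
    x_c, f_c = candidate
    for x_a, f_a in archive:
        # discard if more than eps worse than an archived point in every objective
        if dominates(f_a + eps, f_c):
            return archive
        # discard if dominated by a nearby archived point (same region of decision space)
        if np.linalg.norm(x_a - x_c) < min_dist and dominates(f_a, f_c):
            return archive
    # drop archived members made redundant by the candidate
    archive = [(x_a, f_a) for x_a, f_a in archive
               if not (dominates(f_c, f_a) and np.linalg.norm(x_a - x_c) < min_dist)]
    archive.append((x_c, f_c))
    return archive
```

    Tightening min_dist prunes near-duplicates in decision space, while eps controls how far behind the non-dominated front an alternative may lag and still be considered potentially useful.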

    Meta-parametric design: Developing a computational approach for early stage collaborative practice

    Computational design is the study of how programmable computers can be integrated into the process of design. It is not simply the use of pre-compiled computer-aided design software that aims to replicate the drawing board, but rather the development of computer algorithms as an integral part of the design process. Programmable machines have begun to challenge traditional modes of thinking in architecture and engineering, placing further emphasis on process ahead of the final result. Just as Darwin and Wallace had to think beyond form and inquire into the development of biological organisms to understand evolution, so computational methods enable us to rethink how we approach the design process itself. The subject is broad and multidisciplinary, with influences from design, computer science, mathematics, biology and engineering. This thesis begins similarly wide in its scope, addressing both the technological aspects of computational design and its application on several case study projects in professional practice. By learning through participant observation in combination with secondary research, it is found that design teams can be most effective at the early stage of projects by engaging with the additional complexity this entails. At this concept stage, computational tools such as parametric models are found to have insufficient flexibility for wide design exploration. In response, an approach called Meta-Parametric Design is proposed, inspired by developments in genetic programming (GP). By moving to a higher level of abstraction as computational designers, a Meta-Parametric approach is able to adapt to changing constraints and requirements whilst maintaining an explicit record of process for collaborative working.

    Asymmetric Release Planning: Compromising Satisfaction against Dissatisfaction

    Maximizing the satisfaction gained from offering features as part of the upcoming release(s) is different from minimizing the dissatisfaction caused by not offering features. This asymmetric behavior has never been utilized for product release planning. We study Asymmetric Release Planning (ARP) by accommodating asymmetric feature evaluation, formulating and solving ARP as a bi-criteria optimization problem. In essence, it is the search for optimized trade-offs between maximum stakeholder satisfaction and minimum dissatisfaction. Different techniques, including a continuous variant of Kano analysis, are available to predict the impact on satisfaction and dissatisfaction of offering or not offering a feature in a product release. As a proof of concept, we validated the proposed solution approach, called Satisfaction-Dissatisfaction Optimizer (SDO), on a real-world case study project. From running three replications with varying effort capacities, we demonstrate that SDO generates optimized trade-off solutions that are (i) of different value profiles and structures, (ii) superior to random search and heuristics in terms of quality and completeness, and (iii) superior to solutions generated manually by managers of the case study company. A survey with 20 stakeholders evaluated the applicability and usefulness of the generated results.
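    The trade-off structure described here can be made concrete with a small sketch. Assuming each feature has a satisfaction score (earned if it is offered), a dissatisfaction score (incurred if it is not), and an integer effort, a weighted-sum scan reduces each weighting of the two criteria to a knapsack problem, because dissatisfaction is only avoided by selecting the feature. This is an illustration of the bi-criteria formulation only, not the authors' SDO or their continuous Kano estimation; the names and weight grid are assumptions, and a weighted-sum scan recovers only supported (convex) trade-off points.

```python
def knapsack(values, efforts, capacity):
    """0/1 knapsack by dynamic programming over integer effort capacities."""
    best = [(0.0, frozenset())] * (capacity + 1)
    for i, (v, e) in enumerate(zip(values, efforts)):
        for c in range(capacity, e - 1, -1):
            cand = best[c - e][0] + v
            if cand > best[c][0]:
                best[c] = (cand, best[c - e][1] | {i})
    return best[capacity][1]

def release_tradeoffs(sat, dis, effort, capacity, steps=11):
    """Scan weights w between satisfaction and dissatisfaction. Maximizing
    w*S(X) - (1-w)*D(X) over feature selections X reduces to a knapsack with
    per-feature value w*sat[i] + (1-w)*dis[i]. Returns (S, D, selection) points."""
    points = set()
    for k in range(steps):
        w = k / (steps - 1)
        values = [w * s + (1 - w) * d for s, d in zip(sat, dis)]
        chosen = knapsack(values, effort, capacity)
        S = sum(sat[i] for i in chosen)
        D = sum(dis[i] for i in range(len(sat)) if i not in chosen)
        points.add((S, D, chosen))
    return sorted(points, key=lambda p: (p[0], p[1]))

# e.g. three features under an effort capacity of 5 units:
# release_tradeoffs(sat=[8, 5, 3], dis=[2, 6, 1], effort=[3, 2, 2], capacity=5)
```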

    Managing complexity in marketing: from a design Weltanschauung


    Many-objective design of reservoir systems - Applications to the Blue Nile

    This work proposes a multi-criteria optimization-based approach for supporting the negotiated design of multi-reservoir systems. The research addresses the multi-reservoir system design problem (selecting among alternative options, reservoir sizing), the capacity expansion problem (timing the activation of new assets and the filling of new large reservoirs), and the management of multi-reservoir systems at various expansion stages. The aim is to balance multiple long- and short-term performance objectives of relevance to stakeholders with differing interests. The work also investigates how problem re-formulations can be used to improve computational efficiency at the design and assessment stage, and proposes a framework for post-processing many-objective optimization results to facilitate negotiation among multiple stakeholders. The proposed methods are demonstrated on the Blue Nile in a suite of proof-of-concept studies. Results take the form of Pareto-optimal trade-offs in which each point on the curve or surface represents a water resource system design (i.e., asset choice, size, implementation dates of reservoirs, and operating policy) and coordination strategy (e.g., cost sharing and power trade), such that further benefit in one measure necessarily comes at the expense of another. Technical chapters aim to offer practical Nile management and/or investment recommendations derived from the analysis, which could be refined in future, more detailed studies.
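    The Pareto-optimal trade-offs mentioned above rest on the standard notion of non-dominance. The sketch below shows that filter for a minimization setting; it is a generic illustration, not the many-objective algorithm used in the thesis, and the array layout is an assumption.

```python
import numpy as np

def nondominated(objectives):
    """Return indices of non-dominated rows of an (n_solutions x n_objectives)
    array, assuming all objectives are to be minimized. Each retained design
    (e.g. reservoir sizes, activation dates, operating-policy parameters) is a
    trade-off: improving one objective means worsening another."""
    objs = np.asarray(objectives, dtype=float)
    keep = []
    for i, f in enumerate(objs):
        others = np.delete(objs, i, axis=0)
        dominated = np.any(np.all(others <= f, axis=1) & np.any(others < f, axis=1))
        if not dominated:
            keep.append(i)
    return keep
```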

    Interactive optimization for supporting multicriteria decisions in urban and energy system planning

    Climate change and growing urban populations are increasingly putting pressure on cities to reduce their carbon emissions and transition towards efficient and renewable energy systems. This challenges in particular urban planners, who are expected to integrate technical energy aspects and balance them with the conflicting and often elusive needs of other urban actors. This thesis explores how multicriteria decision analysis, and in particular multiobjective optimization techniques, can support this task. While multiobjective optimization is particularly suited for generating efficient and original alternatives, it presents two shortcomings when targeted at large, intractable problems. First, the problem size prevents a complete identification of all solutions. Second, the preferences required to narrow the problem size are difficult to know and formulate precisely before seeing the possible alternatives. Interactive optimization addresses both of these gaps by involving the human decision-maker in the calculation process, incorporating their preferences at the same time as the generated alternatives enrich their understanding of acceptable tradeoffs and important criteria. For interactive optimization methods to be adopted in practice, computational frameworks are required that can handle and visualize many objectives simultaneously and provide optimal solutions quickly and representatively, all while remaining simple and intuitive for practitioners to use and understand. Accordingly, the main objective of this thesis is to develop a decision support methodology which enables the integration of energy issues in the early stages of urban planning. The proposed response and main contribution is SAGESSE (Systematic Analysis, Generation, Exploration, Steering and Synthesis Experience), an interactive multiobjective optimization decision support methodology, which addresses the practical and technical shortcomings above. Its innovative aspect resides in the combination of (i) parallel coordinates as a means to simultaneously explore and steer the alternative-generation process, (ii) a quasi-random sampling technique to efficiently explore the solution space in areas specified by the decision maker, and (iii) the integration of multiattribute decision analysis, cluster analysis and linked data visualization techniques to facilitate the interpretation of the Pareto front in real time. Developed in collaboration with urban and energy planning practitioners, the methodology was applied to two Swiss urban planning case studies: one greenfield project, in which all buildings and energy technologies are conceived ex nihilo, and one brownfield project, in which an existing urban neighborhood is redeveloped. These applications led to the progressive development of computational methods based on mathematical programming and data modeling (in the context of another thesis) which, applied with SAGESSE, form the planning support system URBio. Results indicate that the methodology is effective in exploring hundreds of plans and revealing tradeoffs and synergies between multiple objectives. The concrete outcomes of the calculations provide inputs for specifying political targets and deriving urban master plans.
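    Of the three ingredients above, the steering step can be illustrated compactly: quasi-random samples are drawn only inside the sub-ranges the decision maker has brushed on the parallel-coordinates plot. The sketch below is a minimal illustration of that idea, not the SAGESSE or URBio implementation; it assumes SciPy's Sobol sampler is available, and all variable names and bounds are hypothetical.

```python
import numpy as np
from scipy.stats import qmc  # assumed available; any Sobol implementation would do

def steered_samples(lower, upper, brushed_ranges, n=128):
    """Quasi-random (Sobol) sampling of a decision space, restricted to the
    sub-ranges a decision maker has brushed on a parallel-coordinates plot.
    'brushed_ranges' maps a variable index to a (lo, hi) interval; unbrushed
    variables keep their full range. Illustrative only."""
    lower, upper = np.array(lower, float), np.array(upper, float)
    for i, (lo, hi) in brushed_ranges.items():
        lower[i], upper[i] = lo, hi
    sampler = qmc.Sobol(d=len(lower), scramble=True)
    unit = sampler.random(n)                 # points in the unit hypercube
    return qmc.scale(unit, lower, upper)     # rescale to the steered bounds

# e.g. restrict variable 0 (say, a hypothetical PV roof fraction) to [0.4, 0.8]:
# plans = steered_samples([0, 0, 0], [1, 5, 10], {0: (0.4, 0.8)}, n=64)
```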

    Using and Interpreting the Bayesian Optimization Algorithm to Improve Early Stage Design of Marine Structures.

    Early stage naval structural design continues to advance as designers seek to improve the quality and speed of the design process. The early stages of design produce preliminary dimensions or scantlings, which control the cost and structural performance of a vessel. Increased complexity in the evaluation of structural response has led to a need for efficient algorithms well suited to solving structural design specific optimization problems. As problem sizes increase, existing optimizers can become slow or inaccurate. The Bayesian Optimization Algorithm (BOA) is presented as one solution for efficiently solving problems in the structural design optimization process. The BOA is an Estimation of Distribution Algorithm (EDA) that uses a statistical sample of potential design solutions to create and train a Bayesian network (BN). BNs are well suited to nearly decomposable problem compositions, which closely match rules-based structural design evaluation. This makes the BOA well suited to solving complex early stage structural optimization problems. Additionally, the learning processes used to create and train the BNs can be analyzed and interpreted to capture design knowledge. This return of knowledge to the designer helps to improve designer intuition and model synthesis in the face of more complex and intricate models. The BNs are thus analyzed to augment design problem understanding and explore trade-offs within the design space. The result is a paradigm shift in early stage optimization of naval structures: designers gain a better understanding of critical design variables and their interactions, as compared to the previous focus on the single most optimal solution. This leads to efficient simulations which rapidly explore design spaces, document critical design variable relationships and enable the designer to create better early stage design solutions.

    PhD thesis, Naval Architecture and Marine Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/133317/1/tedevine_1.pd
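    The EDA loop described here is compact enough to sketch. The code below uses independent per-variable probabilities as its model (a univariate EDA in the spirit of UMDA), which is a deliberate simplification: the BOA itself fits a Bayesian network to the selected solutions, so interactions between design variables are modeled and can later be inspected for design knowledge. Function and parameter names are illustrative assumptions, not the dissertation's implementation.

```python
import numpy as np

def umda(fitness, n_vars, pop_size=100, n_select=50, generations=50, rng=None):
    """Univariate EDA loop standing in for the BOA described above: sample a
    population of binary designs, select the best, fit a probabilistic model to
    them and resample. Here the model is independent per-variable probabilities;
    the BOA instead learns a Bayesian network over the selected solutions."""
    if rng is None:
        rng = np.random.default_rng()
    p = np.full(n_vars, 0.5)                         # initial model: uniform bits
    best_x, best_f = None, -np.inf
    for _ in range(generations):
        pop = (rng.random((pop_size, n_vars)) < p).astype(int)
        scores = np.array([fitness(x) for x in pop])
        elite = pop[np.argsort(scores)[-n_select:]]  # best designs (maximization)
        p = elite.mean(axis=0).clip(0.05, 0.95)      # refit the model
        if scores.max() > best_f:
            best_f, best_x = scores.max(), pop[scores.argmax()].copy()
    return best_x, best_f

# e.g. maximize the number of ones (toy stand-in for a structural objective):
# x, f = umda(lambda x: x.sum(), n_vars=30)
```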

    Corridor Location: Generating Competitive and Efficient Route Alternatives

    The problem of transmission line corridor location can be considered, at best, a "wicked" public systems decision problem. It requires the consideration of numerous objectives while balancing the priorities of a variety of stakeholders, and designers should be prepared to develop diverse non-inferior route alternatives that must be defensible under the scrutiny of a public forum. Political elements aside, the underlying geographical computational problems that must be solved to provide a set of high quality alternatives are no less easy, as they require solving difficult spatial optimization problems on massive GIS terrain-based raster data sets.

    Transmission line siting methodologies have previously been developed to guide designers in this endeavor, but close scrutiny of these methodologies shows that there are many shortcomings in their approaches. The main goal of this dissertation is to take a fresh look at the process of corridor location and develop a set of algorithms that compute path alternatives on a foundation of solid geographical theory, in order to offer designers better tools for developing quality alternatives that consider the entire spectrum of viable solutions. Just as importantly, as data sets become increasingly massive and present challenging computational elements, it is important that algorithms be efficient and able to take advantage of parallel computing resources.

    A common approach to simplify a problem with numerous objectives is to combine the cost layers into a composite, a priori weighted, single-objective raster grid. This dissertation examines new methods for determining a spatially diverse set of near-optimal alternatives, and develops parallel computing techniques for brute-force near-optimal path enumeration, as well as more elegant methods that take advantage of the hierarchical structure of the underlying path-tree computation to select sets of spatially diverse near-optimal paths.

    Another approach to corridor location is to simultaneously consider all objectives and determine the set of Pareto-optimal solutions between them. This amounts to solving a discrete multi-objective shortest path problem, which is NP-hard when computing the full set of non-inferior solutions. Given the difficulty of solving for the complete Pareto-optimal set, this dissertation develops an approximation heuristic to compute path sets that are nearly exact-optimal in a fraction of the time required by exact algorithms. This method is then applied as an upper bound to an exact enumerative approach, resulting in significant performance speedups. As analytic computing continues to move toward distributed clusters, it is important to optimize algorithms to take full advantage of parallel computing. To that end, this dissertation develops a scalable parallel framework that efficiently solves for the supported/convex solutions of a biobjective shortest path problem. This framework is equally applicable to other biobjective network optimization problems, providing a powerful tool for solving the next generation of location analysis and geographical optimization models.
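    The supported/convex solutions mentioned in the last paragraph can be found by a sequence of single-objective weighted-sum shortest path solves, recursively choosing the weight at which the two current endpoint solutions score equally. The sketch below shows that dichotomic scheme (in the spirit of Aneja and Nair) sequentially; it is an illustration, not the parallel framework developed in the dissertation, and the graph encoding is an assumption.

```python
import heapq

def dijkstra(graph, src, dst, w):
    """Single-objective shortest path where each edge carries two costs (c1, c2)
    and the scalar cost is w*c1 + (1-w)*c2. 'graph' maps node -> [(nbr, c1, c2)].
    Returns the (total_c1, total_c2) of a best path, or None if unreachable."""
    pq, seen = [(0.0, src, 0.0, 0.0)], set()
    while pq:
        d, u, c1, c2 = heapq.heappop(pq)
        if u == dst:
            return (c1, c2)
        if u in seen:
            continue
        seen.add(u)
        for v, e1, e2 in graph.get(u, []):
            if v not in seen:
                heapq.heappush(pq, (d + w * e1 + (1 - w) * e2, v, c1 + e1, c2 + e2))
    return None

def supported_paths(graph, src, dst):
    """Dichotomic search for supported solutions of a biobjective shortest path:
    solve the two single-objective extremes, then recursively solve weighted-sum
    problems at the weight where the current pair of endpoints is equally good,
    keeping any new objective vector that appears."""
    a, b = dijkstra(graph, src, dst, 1.0), dijkstra(graph, src, dst, 0.0)
    solutions = {s for s in (a, b) if s}

    def recurse(p, q):
        denom = (q[1] - p[1]) + (p[0] - q[0])
        if denom == 0:
            return
        w = (q[1] - p[1]) / denom           # weight making p and q score equally
        if not (0.0 < w < 1.0):
            return
        r = dijkstra(graph, src, dst, w)
        if r and r not in solutions:         # a new supported point between p and q
            solutions.add(r)
            recurse(p, r)
            recurse(r, q)

    if a and b and a != b:
        recurse(a, b)
    return sorted(solutions)

# toy graph: graph = {'s': [('a', 1, 4), ('b', 3, 1)], 'a': [('t', 1, 4)],
#                     'b': [('t', 3, 1)], 't': []}
# supported_paths(graph, 's', 't') -> [(2, 8), (6, 2)]
```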