Multiple objective optimal control of integrated urban wastewater systems
Copyright © 2008 Elsevier. NOTICE: this is the author's version of a work that was accepted for publication in Environmental Modelling and Software. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Environmental Modelling and Software, Vol. 23, Issue 2 (2008). DOI: 10.1016/j.envsoft.2007.06.003
Integrated modelling of the urban wastewater system has received increasing attention in recent years, and it has been clearly demonstrated, at least at a theoretical level, that system performance can be enhanced through optimized, integrated control. However, most research to date has focused on simple, single-objective control. This paper proposes consideration of multiple objectives to more readily tackle complex real-world situations. The water quality indicators of the receiving water are considered as control objectives directly, rather than by reference to surrogate criteria in the sewer system or treatment plant. A powerful multi-objective optimization genetic algorithm, NSGA II, is used to derive the Pareto optimal solutions, which illustrate the full trade-off relationships between objectives. A case study is used to demonstrate the benefits of multiple-objective control, and a significant improvement in each of the objectives can be observed in comparison with a conventional base case scenario. The simulation results also show the effectiveness of NSGA II for the integrated urban wastewater system despite its complexity.
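The Pareto-dominance relation that NSGA II relies on can be sketched in a few lines; the function names below are illustrative and not taken from the paper's implementation:

```python
# A minimal sketch of Pareto dominance and front extraction, the core
# selection idea in NSGA II (minimisation assumed for all objectives).
def dominates(a, b):
    """True if objective vector a Pareto-dominates b: a is no worse in
    every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

For example, with objective vectors `[(1, 5), (2, 2), (5, 1), (3, 3)]`, the point `(3, 3)` is dominated by `(2, 2)` and drops out, while the other three form the trade-off front.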
A bi-objective genetic algorithm approach to risk mitigation in project scheduling
A problem of risk mitigation in project scheduling is formulated as a bi-objective optimization problem, where the expected makespan and the expected total cost are both to be minimized. The expected total cost is the sum of four cost components: overhead cost, activity execution cost, cost of reducing risks and penalty cost for tardiness. Risks for activities are predefined. For each risk at an activity, various levels are defined, which correspond to the results of different preventive measures. Only those risks with a probable impact on the duration of the related activity are considered here. Impacts of risks are not only accounted for through the expected makespan but are also translated into cost and thus have an impact on the expected total cost. An MIP model and a heuristic solution approach based on genetic algorithms (GAs) are proposed. The experiments conducted indicate that GAs provide a fast and effective solution approach to the problem. For smaller problems, the results obtained by the GA are very good; for larger problems, there is room for improvement.
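A toy sketch of how such a bi-objective evaluation might look for one individual; the mitigation levels, costs, and rates below are invented for illustration and are not taken from the paper:

```python
# Hypothetical illustration: each gene selects a mitigation level per risk.
# Higher levels cost more up front but reduce the expected delay, trading
# the two objectives (expected makespan, expected total cost) against
# each other.
RISK_LEVELS = {  # level -> (mitigation_cost, expected_delay)
    0: (0.0, 5.0),   # do nothing
    1: (2.0, 2.0),   # partial preventive measure
    2: (5.0, 0.5),   # strong preventive measure
}

def evaluate(chromosome, base_makespan=20.0, overhead_rate=1.0,
             exec_cost=30.0, due_date=24.0, penalty_rate=4.0):
    """Return (expected makespan, expected total cost) where the cost is
    the sum of the four components named in the abstract: overhead,
    execution, risk mitigation, and tardiness penalty."""
    mitigation = sum(RISK_LEVELS[g][0] for g in chromosome)
    delay = sum(RISK_LEVELS[g][1] for g in chromosome)
    makespan = base_makespan + delay
    overhead = overhead_rate * makespan
    tardiness = penalty_rate * max(0.0, makespan - due_date)
    return makespan, overhead + exec_cost + mitigation + tardiness
```

A GA would evolve chromosomes like `[0, 0]` (no mitigation, late and penalised) toward trade-offs like `[2, 2]` (expensive mitigation, but on time).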
Search based software engineering: Trends, techniques and applications
© ACM, 2012. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version is available from the link below.
In the past five years there has been a dramatic increase in work on Search-Based Software Engineering (SBSE), an approach to Software Engineering (SE) in which Search-Based Optimization (SBO) algorithms are used to address problems in SE. SBSE has been applied to problems throughout the SE lifecycle, from requirements and project planning to maintenance and reengineering. The approach is attractive because it offers a suite of adaptive automated and semiautomated solutions in situations typified by large complex problem spaces with multiple competing and conflicting objectives.
This article provides a review and classification of literature on SBSE. The work identifies research trends and relationships between the techniques applied and the applications to which they have been applied, and highlights gaps in the literature and avenues for further research.
Evolutionary improvement of programs
Most applications of genetic programming (GP) involve the creation of an entirely new function, program or expression to solve a specific problem. In this paper, we propose a new approach that applies GP to improve existing software by optimizing its non-functional properties such as execution time, memory usage, or power consumption. In general, satisfying non-functional requirements is a difficult task, often achieved in part by optimizing compilers. However, modern compilers are generally not able to produce semantically equivalent alternatives that optimize non-functional properties, even if such alternatives are known to exist: this is usually due to the limited local nature of such optimizations. In this paper, we discuss how best to combine and extend the existing evolutionary methods of GP, multiobjective optimization, and coevolution in order to improve existing software. Given as input the implementation of a function, we attempt to evolve a semantically equivalent version, in this case optimized to reduce execution time subject to a given probability distribution of inputs. We demonstrate, on eight example functions, that our framework is able to produce non-obvious optimizations that compilers are not yet able to generate. We employ a coevolved population of test cases to encourage the preservation of the function's semantics. We exploit the original program both through seeding of the population in order to focus the search, and as an oracle for testing purposes. As well as discussing the issues that arise when attempting to improve software, we employ a rigorous experimental method to provide interesting and practical insights and to suggest how to address these issues.
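The oracle idea described above can be sketched minimally; the toy `target` function and the fitness definition are assumptions for illustration, not the paper's framework:

```python
# Hedged sketch of using the original program as an oracle: an evolved
# variant's semantic fitness is the fraction of test inputs on which it
# agrees with the original. A coevolving test population would try to
# find inputs that expose disagreements.
def target(x):
    """Stand-in for the original, unoptimised implementation (the oracle)."""
    return x * x + 1

def semantic_fitness(candidate, tests):
    """Fraction of oracle test cases the evolved variant matches."""
    return sum(candidate(t) == target(t) for t in tests) / len(tests)
```

A semantically equivalent variant scores 1.0 and survives; a variant that broke the semantics is penalised in proportion to how many coevolved tests catch it.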
Darwinian Data Structure Selection
Data structure selection and tuning is laborious but can vastly improve an application's performance and memory footprint. Some data structures share a common interface and enjoy multiple implementations. We call them Darwinian Data Structures (DDS), since we can subject their implementations to survival of the fittest. We introduce ARTEMIS, a multi-objective, cloud-based search-based optimisation framework that automatically finds optimal, tuned DDS modulo a test suite, then changes an application to use that DDS. ARTEMIS achieves substantial performance improvements for \emph{every} project in a set of Java projects drawn from the DaCapo benchmark, popular projects, and uniformly sampled projects from GitHub. For execution time, CPU usage, and memory consumption, ARTEMIS finds at least one solution that improves \emph{all} measures for () of the projects. The median improvement across the best solutions is , , for runtime, memory, and CPU usage.
These aggregate results understate ARTEMIS's potential impact. Some of the benchmarks it improves are libraries or utility functions. Two examples are gson, a ubiquitous Java serialization framework, and xalan, Apache's XML transformation tool. ARTEMIS improves gson by \%, and for memory, runtime, and CPU; ARTEMIS improves xalan's memory consumption by \%. \emph{Every} client of these projects will benefit from these performance improvements.
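The core idea, pitting interchangeable implementations of a shared interface against each other under a representative workload, can be sketched as follows; the workload, candidates, and timing harness are illustrative, not ARTEMIS itself:

```python
import time
from collections import deque

# Hypothetical sketch (not the ARTEMIS implementation): two candidate
# structures share the mutable-sequence interface this workload needs,
# and a simple timing harness picks the fittest one for that workload.
CANDIDATES = {"list": list, "deque": deque}

def benchmark(factory, n=10_000):
    """Time a left-insert-heavy workload against one implementation."""
    xs = factory()
    start = time.perf_counter()
    for i in range(n):
        xs.insert(0, i)  # O(n) per call for list, O(1) for deque
    return time.perf_counter() - start

def select_darwinian(candidates, n=10_000):
    """Survival of the fittest: return the name of the fastest candidate."""
    return min(candidates, key=lambda name: benchmark(candidates[name], n))
```

For this left-insert-heavy workload the harness picks `deque`; a read-by-index-heavy workload would instead favour `list`, which is exactly the trade-off a search over real test suites automates.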
Engineering failure analysis and design optimisation with HiP-HOPS
The scale and complexity of computer-based safety critical systems, like those used in the transport and manufacturing industries, pose significant challenges for failure analysis. Over the last decade, research has focused on automating this task. In one approach, predictive models of system failure are constructed from the topology of the system and local component failure models using a process of composition. An alternative approach employs model-checking of state automata to study the effects of failure and verify system safety properties. In this paper, we discuss these two approaches to failure analysis. We then focus on Hierarchically Performed Hazard Origin & Propagation Studies (HiP-HOPS) - one of the more advanced compositional approaches - and discuss its capabilities for automatic synthesis of fault trees, combinatorial Failure Modes and Effects Analyses, and reliability versus cost optimisation of systems via application of automatic model transformations. We summarise these contributions and demonstrate the application of HiP-HOPS on a simplified fuel oil system for a ship engine. In light of this example, we discuss strengths and limitations of the method in relation to other state-of-the-art techniques. In particular, because HiP-HOPS is deductive in nature, relating system failures back to their causes, it is less prone to combinatorial explosion and can more readily be iterated. For this reason, it enables exhaustive assessment of combinations of failures and design optimisation using computationally expensive meta-heuristics. © 2010 Elsevier Ltd. All rights reserved.
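The compositional fault-tree evaluation at the heart of such analyses can be sketched as follows; the gate encoding and the toy fuel-system tree are illustrative assumptions, not HiP-HOPS output:

```python
# Hedged sketch of the compositional idea: a fault tree is built from
# AND/OR gates over basic component failure events and evaluated
# bottom-up for a given set of failed components.
def evaluate(tree, failed):
    """Return True if the top event occurs given the set of failed components."""
    kind = tree[0]
    if kind == "basic":
        return tree[1] in failed
    results = [evaluate(child, failed) for child in tree[1]]
    return all(results) if kind == "and" else any(results)

# Toy tree loosely inspired by the paper's ship-engine fuel oil example:
# fuel is lost if BOTH redundant pumps fail, or the supply valve fails.
pump_loss = ("and", [("basic", "pump_a"), ("basic", "pump_b")])
fuel_loss = ("or", [pump_loss, ("basic", "supply_valve")])
```

Enumerating minimal combinations of basic events that make the top event true yields the minimal cut sets ({pump_a, pump_b} and {supply_valve} here), which is what tools like HiP-HOPS synthesise automatically from the system topology.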
A Hierarchical Evolutionary Algorithm for Multiobjective Optimization in IMRT
Purpose: Current inverse planning methods for IMRT are limited because they
are not designed to explore the trade-offs between the competing objectives
between the tumor and normal tissues. Our goal was to develop an efficient
multiobjective optimization algorithm that was flexible enough to handle any
form of objective function and that resulted in a set of Pareto optimal plans.
Methods: We developed a hierarchical evolutionary multiobjective algorithm
designed to quickly generate a diverse Pareto optimal set of IMRT plans that
meet all clinical constraints and reflect the trade-offs in the plans. The top
level of the hierarchical algorithm is a multiobjective evolutionary algorithm
(MOEA). The genes of the individuals generated in the MOEA are the parameters
that define the penalty function minimized during an accelerated deterministic
IMRT optimization that represents the bottom level of the hierarchy. The MOEA
incorporates clinical criteria to restrict the search space through protocol
objectives and then uses Pareto optimality among the fitness objectives to
select individuals.
Results: Acceleration techniques implemented on both levels of the
hierarchical algorithm resulted in short, practical runtimes for optimizations.
The MOEA improvements were evaluated for example prostate cases with one target
and two OARs. The modified MOEA dominated 11.3% of plans using a standard
genetic algorithm package. By implementing domination advantage and protocol
objectives, small diverse populations of clinically acceptable plans that were
only dominated 0.2% by the Pareto front could be generated in a fraction of an
hour.
Conclusions: Our MOEA produces a diverse Pareto optimal set of plans that
meet all dosimetric protocol criteria in a feasible amount of time. It
optimizes not only beamlet intensities but also objective function parameters
on a patient-specific basis.
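The two-level structure can be illustrated with a toy scalarised problem; the closed-form inner solve below merely stands in for the accelerated deterministic IMRT optimization and is not the paper's model:

```python
# Hypothetical two-level sketch: the outer loop plays the role of the
# MOEA, varying penalty-function parameters (the "genes"); the inner
# step is a cheap deterministic minimisation of the weighted penalty
# w1*(x - 1)^2 + w2*x^2, solved here in closed form at x = w1/(w1 + w2).
def inner_optimise(w_target, w_oar):
    """Return the two competing objectives (target miss, OAR dose proxy)
    at the optimum of the weighted penalty for the given gene values."""
    x = w_target / (w_target + w_oar)
    return ((x - 1) ** 2, x ** 2)

def outer_sweep(n):
    """Stand-in for the outer MOEA: vary the weights and collect the
    resulting trade-off set of plans."""
    return [inner_optimise(i / (n - 1), 1 - i / (n - 1)) for i in range(n)]
```

Each weight vector yields one Pareto-optimal point of the toy problem, mirroring how each MOEA individual's penalty parameters yield one deterministically optimised plan.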
Exploiting the Design Freedom of RM
This paper details how Rapid Manufacturing (RM) can overcome the restrictions imposed by the
inherent process limitations of conventional manufacturing techniques and become the enabling
technology in fabricating optimal products. A new design methodology capable of exploiting
RM’s increased design freedom is therefore needed. Inspired by natural world structures of trees
and bones, a multi-objective, genetic algorithm based topology optimisation approach is
presented. This combines multiple unit cell structures and varying volume fractions to create a
heterogeneous part structure that exhibits a uniform stress distribution.
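A minimal sketch of the two objectives such a topology optimiser might trade off; the scoring function below is an assumption for illustration, not the paper's formulation:

```python
# Hypothetical sketch: score a lattice part by (mass, stress
# non-uniformity). A multi-objective GA would minimise both, driving
# the design toward light parts with an even stress distribution.
def evaluate_part(volume_fractions, stresses):
    """volume_fractions: per-cell material fractions; stresses: per-cell
    stress samples. Returns (mass proxy, worst deviation from mean stress)."""
    mass = sum(volume_fractions)
    mean = sum(stresses) / len(stresses)
    nonuniformity = max(abs(s - mean) for s in stresses)
    return mass, nonuniformity
```

Varying the unit-cell type and volume fraction per region changes both scores, which is the search space the GA explores.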