An Overview of Schema Theory
The purpose of this paper is to give an introduction to the field of Schema
Theory written by a mathematician and for mathematicians. In particular, we
endeavor to highlight areas of the field which might be of interest to a
mathematician, to point out some related open problems, and to suggest some
large-scale projects. Schema theory seeks to give a theoretical justification
for the efficacy of the field of genetic algorithms, so readers who have
studied genetic algorithms stand to gain the most from this paper. However,
nothing beyond basic probability theory is assumed of the reader, and for this
reason we write in a fairly informal style.
Because the mathematics behind the theorems in schema theory is relatively
elementary, we focus more on the motivation and philosophy. Many of these
results have been proven elsewhere, so this paper is designed to serve a
primarily expository role. We attempt to cast known results in a new light,
which makes the suggested future directions natural. This involves devoting a
substantial amount of time to the history of the field.
We hope that this exposition will entice some mathematicians to do research
in this area, that it will serve as a road map for researchers new to the
field, and that it will help explain how schema theory developed. Furthermore,
we hope that the results collected in this document will serve as a useful
reference. Finally, as far as the author knows, the questions raised in the
final section are new.

Comment: 27 pages. Originally written in 2009 and hosted on my website, I've decided to put it on the arXiv as a more permanent home. The paper is primarily expository, so I don't really know where to submit it, but perhaps one day I will find an appropriate journal.
A Kullback-Leibler Divergence Exploration into a Look-Ahead Simulation Optimization of the Extended Compact Genetic Algorithm
The Kullback-Leibler divergence of gene distributions between successive generations of the Extended Compact Genetic Algorithm (ECGA) is explored. This analysis suggests that the algorithm's reliability is fragile with respect to the biasing introduced in the early generations. A novel approach within the scope of the ECGA for choosing a better bias, by allowing the ECGA to simulate itself, is presented. It is shown that, by simulating itself, the ECGA is able to use a smaller population and perform fewer fitness evaluations while maintaining the same ability to find optimal solutions.
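The quantity at the heart of this abstract can be illustrated with a short sketch. The gene distributions below are hypothetical data, not taken from the paper; the sketch only shows how the KL divergence between the gene distributions of two successive ECGA generations might be computed:

```python
import numpy as np

def gene_kl_divergence(p, q, eps=1e-12):
    """KL divergence D(p || q) between two gene-frequency distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()                 # normalize to proper distributions
    q = q / q.sum()
    # eps guards against log(0) for genes absent from one generation
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Gene distributions from two successive generations (hypothetical data):
gen_t  = [0.25, 0.25, 0.25, 0.25]   # uniform initial distribution
gen_t1 = [0.40, 0.30, 0.20, 0.10]   # biased after one round of selection
d = gene_kl_divergence(gen_t1, gen_t)
```

A large divergence between early generations would signal exactly the kind of strong initial biasing the abstract describes.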
Explicit Building Block Multiobjective Evolutionary Computation: Methods and Applications
This dissertation presents principles, techniques, and performance of evolutionary computation optimization methods. Concentration is on concepts, design formulation, and prescription for multiobjective problem solving and explicit building block (BB) multiobjective evolutionary algorithms (MOEAs). Current state-of-the-art explicit BB MOEAs are addressed in the innovative design, execution, and testing of a new multiobjective explicit BB MOEA. Evolutionary computation concepts examined are algorithm convergence, population diversity and sizing, genotype and phenotype partitioning, archiving, BB concepts, parallel evolutionary algorithm (EA) models, robustness, visualization of evolutionary process, and performance in terms of effectiveness and efficiency. The main result of this research is the development of a more robust algorithm where MOEA concepts are implicitly employed. Testing shows that the new MOEA can be more effective and efficient than previous state-of-the-art explicit BB MOEAs for selected test suite multiobjective optimization problems (MOPs) and U.S. Air Force applications. Other contributions include the extension of explicit BB definitions to clarify the meanings for good single and multiobjective BBs. A new visualization technique is developed for viewing genotype, phenotype, and the evolutionary process in finding Pareto front vectors while tracking the size of the BBs. The visualization technique is the result of a BB tracing mechanism integrated into the new MOEA that enables one to determine the required BB sizes and assign an approximation epistasis level for solving a particular problem. The culmination of this research is explicit BB state-of-the-art MOEA technology based on the MOEA design, BB classifier type assessment, solution evolution visualization, and insight into MOEA test metric validation and usage as applied to test suite, deception, bioinformatics, unmanned vehicle flight pattern, and digital symbol set design MOPs.
Protecting the infrastructure: 3rd Australian information warfare & security conference 2002
The conference is hosted by the We-B Centre (working with a-business) in the School of Management Information System, the School of Computer & Information Sciences at Edith Cowan University. This year's conference is being held at the Sheraton Perth Hotel in Adelaide Terrace, Perth. Papers for this conference have been written by a wide range of academics and industry specialists. We have attracted participation from both national and international authors and organisations.
The papers cover many topics, all within the field of information warfare and its applications, now and into the future.
The papers have been grouped into six streams:
• Networks
• IWAR Strategy
• Security
• Risk Management
• Social/Education
• Infrastructure
Black box search: framework and methods
A theoretical framework is constructed to analyze the behavior of all deterministic non-repeating search algorithms as they apply to all possible functions of a given finite domain and range. A population table data structure is introduced for this purpose, and many properties of the framework are discovered, including the number of deterministic non-repeating search algorithms. Canonical forms are presented for all elements of the framework, as well as methods for converting between the objects and their canonical numbers and back again. The theorems regarding population tables allow for a simple, alternate form of the No Free Lunch (NFL) theorem, an important theorem regarding search algorithm performance over all functions. Previously, this theorem has only been proven in an overly complicated, confusing fashion. Other statements of the NFL theorem are shown in the light of this framework, and the theorem is extended to non-complete sets of functions and to a non-trivial definition of stochastic search. The framework allows for an extensive study of minimax distinctions between search algorithms. A change of representation is easily expressed in the framework with obvious performance implications. The expected performance of random search with replacement, random search without replacement, and enumeration is studied in some detail. Claims in the field regarding search algorithm robustness are tested empirically. Experiments were performed to determine how the compressibility of a function impacts its performance, with an emphasis on randomly selected functions. A genetic algorithm was run on two sets of functions: one set contained functions that were known to be compressible, and the other contained functions that had a high probability of being incompressible. Performance was found to be the same for both sets.
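The comparison between random search with and without replacement can be sketched on a toy finite domain. The objective function, domain, and budget below are illustrative assumptions, not the thesis's actual experiments:

```python
import random

def random_search(f, domain, budget, replacement=True, rng=None):
    """Return the best objective value found within a fixed evaluation budget."""
    rng = rng or random.Random(0)
    if replacement:
        # may re-evaluate the same point, wasting budget
        samples = [rng.choice(domain) for _ in range(budget)]
    else:
        # non-repeating: each point evaluated at most once
        samples = rng.sample(domain, min(budget, len(domain)))
    return max(f(x) for x in samples)

# Hypothetical objective on a small finite domain, maximum at x = 20:
domain = list(range(32))
f = lambda x: -(x - 20) ** 2

best_with    = random_search(f, domain, budget=16, replacement=True)
best_without = random_search(f, domain, budget=16, replacement=False)
```

With a budget equal to the domain size, search without replacement degenerates to enumeration and is guaranteed to find the optimum, which is one reason the framework treats the non-repeating case separately.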
NATURAL ALGORITHMS IN DIGITAL FILTER DESIGN
Digital filters are an important part of Digital Signal Processing (DSP), which plays
vital roles within the modern world, but their design is a complex task requiring a great
deal of specialised knowledge. An analysis of this design process is presented, which
identifies opportunities for the application of optimisation.
The Genetic Algorithm (GA) and Simulated Annealing are problem-independent
and increasingly popular optimisation techniques. They do not require detailed prior
knowledge of the nature of a problem, and are unaffected by a discontinuous search
space, unlike traditional methods such as calculus and hill-climbing.
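As an illustration of the problem-independence described above, a minimal bit-string GA can be sketched. The discontinuous objective below is a hypothetical stand-in for a filter-design cost, not one of the thesis's actual problems, and the operator choices (tournament selection, one-point crossover, bit-flip mutation) are common defaults rather than the thesis's configuration:

```python
import random

def genetic_algorithm(fitness, n_bits=16, pop_size=30, generations=50,
                      p_mut=0.02, rng=None):
    """Minimal bit-string GA: tournament selection, one-point crossover,
    bit-flip mutation. Needs only fitness values, no gradients."""
    rng = rng or random.Random(1)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            # tournament selection: best of three random individuals, twice
            a = max(rng.sample(pop, 3), key=fitness)
            b = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)                # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# A discontinuous objective that would defeat calculus or hill-climbing (hypothetical):
def fitness(bits):
    x = int("".join(map(str, bits)), 2)
    return (x % 255) + (100 if x % 7 == 0 else 0)

best = genetic_algorithm(fitness)
best_score = fitness(best)
```

Because the GA only ever compares fitness values, the jumps and plateaus in the objective cause it no difficulty, which is the property the abstract contrasts with traditional calculus-based methods.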
Potential applications of these techniques to the filter design process are discussed,
and presented with practical results. Investigations into the design of Frequency Sampling
(FS) Finite Impulse Response (FIR) filters using a hybrid GA/hill-climber proved
especially successful, improving on published results. An analysis of the search space
for FS filters provided useful information on the performance of the optimisation technique.
The ability of the GA to trade off a filter's performance with respect to several design
criteria simultaneously, without intervention by the designer, is also investigated.
Methods of simplifying the design process by using this technique are presented, together
with an analysis of the difficulty of the non-linear FIR filter design problem from
a GA perspective. This gave an insight into the fundamental nature of the optimisation
problem, and also suggested future improvements.
The results gained from these investigations allowed the framework for a potential
'intelligent' filter design system to be proposed, in which embedded expert knowledge,
Artificial Intelligence techniques and traditional design methods work together. This
could deliver a single tool capable of designing a wide range of filters with minimal
human intervention, and of proposing solutions to incomplete problems. It could also
provide the basis for the development of tools for other areas of DSP system design.