461 research outputs found

    New experimentalism applied to evolutionary computation

    This thesis develops a solid statistical methodology for analyzing search heuristics such as evolutionary algorithms, based on the concept of the new experimentalism. The new experimentalism is an influential discipline in the modern philosophy of science. The new experimentalists seek a relatively secure basis for science, not in theory or observation, but in experiment. Deborah Mayo, one of its prominent proponents, developed a detailed account of how scientific claims can be validated by experiment. First, the concept of the new experimentalism for computer experiments is introduced. The difference between significant and meaningful results is detailed. Following Mayo, a re-interpretation of the Neyman-Pearson theory of testing for computer experiments is given. Since statistical tests can be used as learning tools, they provide means to extend widely accepted Popperian paradigms. Models are characterized as central elements of science. We claim that experiment dominates theory. Many, even conflicting, theories can co-exist independently for one unique experimental result. Maybe there is no theory applying to every phenomenon, but many simple theories describing what happens from case to case. Basic definitions from computational statistics, classical design of experiments (DOE), and modern design and analysis of computer experiments (DACE) are explained to provide the reader with the required statistical background. An elevator group control model, developed in cooperation with one of the world's leading elevator manufacturers, is introduced as an example of a complex real-world optimization problem. It is used to illustrate the difference between artificial functions from test suites and real-world problems. Problems related to these commonly used test suites are discussed. Experimenters have to decide where to place sample points. Classical and modern experimental designs are compared to describe the difference between space-filling designs and designs that place experimental points at the boundaries of the experimental region. In many situations, it may be beneficial to generate the design points not all at once, but sequentially. A sequential design, which provides the basis for a parameter tuning method, is developed. Exogenous strategy parameters, which have to be specified before an optimization algorithm can be started, are presented for deterministic and stochastic search algorithms. The discussion of the concept of optimization provides the foundation for defining performance measures for search heuristics. Optimization relies on a number of very restrictive assumptions that are not met in many real-world settings. Efficiency and effectiveness are introduced with respect to these problems as two important categories for classifying performance measures. Once these prerequisites have been introduced, experiments can be performed and analyzed within the framework of the new experimentalism. A classical approach, based on DOE, is presented first. Then, sequential parameter optimization (SPO) is developed as a modern methodology to improve ('tune') and compare the performance of algorithms. It is demonstrated how the tuning process, which requires only a relatively small number of experiments, can improve an algorithm's performance significantly. Moreover, the new experimentalism, as introduced and applied in this thesis, provides means to understand an algorithm's performance. Various schemes for selection under noise are introduced to demonstrate this feature. As an example, it is demonstrated how threshold selection can improve the local and global performance of search heuristics under noise. Threshold selection can be characterized as a smart and simple heuristic that performs relatively well in certain environments. These heuristics are interpreted in Herbert Simon's framework of bounded rationality. Finally, a commonly accepted model that describes the relation between experiment and theory is revised and enhanced.
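    As a concrete illustration of the threshold selection scheme discussed in the abstract, the sketch below applies it to a (1+1) evolution strategy on a noisy test function: an offspring replaces its parent only if its observed fitness is better by more than a threshold. The test function, parameter values, and function names are illustrative choices, not taken from the thesis:

```python
import random

def noisy_sphere(x, noise=0.5, rng=random):
    """Sphere function with additive Gaussian observation noise."""
    return sum(v * v for v in x) + rng.gauss(0.0, noise)

def one_plus_one_es(threshold=0.0, dim=5, sigma=0.3, budget=2000, seed=1):
    """(1+1)-ES with threshold selection: the offspring replaces the
    parent only if its noisy fitness is better by more than `threshold`."""
    rng = random.Random(seed)
    parent = [rng.uniform(-2, 2) for _ in range(dim)]
    f_parent = noisy_sphere(parent, rng=rng)
    for _ in range(budget):
        child = [v + rng.gauss(0.0, sigma) for v in parent]
        f_child = noisy_sphere(child, rng=rng)
        if f_child < f_parent - threshold:  # accept only clear improvements
            parent, f_parent = child, f_child
    return sum(v * v for v in parent)       # true (noise-free) quality

# tau = 0 is plain selection; a positive tau guards against accepting
# offspring that only look better because of noise.
plain = one_plus_one_es(threshold=0.0)
thresh = one_plus_one_es(threshold=1.0)
```

    A positive threshold trades away some true improvements for protection against noise-induced acceptance errors, which is why its benefit depends on the noise level of the environment.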

    Complexity Theory, Adaptation, and Administrative Law

    Recently, commentators have applied insights from complexity theory to legal analysis generally and to administrative law in particular. This Article focuses on one of the central problems that complexity theory addresses: the importance and mechanisms of adaptation within complex systems. In Part I, the Article takes three features of complex adaptive systems (emergence from self-assembly, nonlinearity, and sensitivity to initial conditions) and explores the extent to which they may add value, as a matter of positive analysis, to the understanding of change within legal systems. In Part II, the Article focuses on three normative claims in public law scholarship that depend explicitly or implicitly on notions of adaptation: that states offer advantages over the federal government because experimentation can make them more adaptive, that federal agencies should themselves become more experimentalist using the tool of adaptive management, and that administrative agencies should adopt collaborative mechanisms in policymaking. Using two analytic tools found in the complexity literature, the genetic algorithm and evolutionary game theory, the Article tests the extent to which these three normative claims are borne out.
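    The genetic algorithm named as one of the Article's analytic tools can be sketched in miniature: a population of candidate "policies" adapts through selection, crossover, and mutation. The toy fitness function (counting 1-bits, a stand-in for policy quality) and all parameter values below are hypothetical, not drawn from the Article:

```python
import random

def one_max(bits):
    """Toy fitness: number of 1-bits in the candidate."""
    return sum(bits)

def genetic_algorithm(n_bits=20, pop_size=30, generations=60,
                      mutation_rate=0.02, seed=0):
    """Minimal generational GA with tournament selection,
    one-point crossover, and bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if one_max(a) >= one_max(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - b if rng.random() < mutation_rate else b
                     for b in child]                # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(one_max(ind) for ind in pop)

best = genetic_algorithm()
```

    Even this minimal setup exhibits the adaptive dynamics the complexity literature studies: small random variations plus selection pressure drive the population toward better solutions without any central designer.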

    Towards the Evolution of Novel Vertical-Axis Wind Turbines

    Renewable and sustainable energy is one of the most important challenges currently facing mankind. Wind has made an increasing contribution to the world's energy supply mix, but still remains a long way from reaching its full potential. In this paper, we investigate the use of artificial evolution to design vertical-axis wind turbine prototypes that are physically instantiated and evaluated under approximated wind tunnel conditions. An artificial neural network is used as a surrogate model to assist learning and is found to reduce the number of fabrications required to reach a higher aerodynamic efficiency, resulting in an important cost reduction. Unlike in other approaches, such as computational fluid dynamics simulations, no mathematical formulations are used and no model assumptions are made. Comment: 14 pages, 11 figures
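    The surrogate-assisted loop described above (pre-screen candidates on a cheap model, physically fabricate only the most promising one) can be sketched as follows. For a dependency-free illustration, a 1-nearest-neighbour surrogate stands in for the paper's neural network, and a simple formula stands in for the wind-tunnel evaluation; all names and values are assumptions:

```python
import random

def expensive_eval(x):
    """Stand-in for a physical wind-tunnel test (here: a cheap formula
    whose 'aerodynamic efficiency' peaks at x = 0.7)."""
    return -(x - 0.7) ** 2

def surrogate_predict(x, archive):
    """1-nearest-neighbour surrogate built from already-evaluated designs.
    (The paper uses a neural network; k-NN keeps this sketch minimal.)"""
    nearest = min(archive, key=lambda pair: abs(pair[0] - x))
    return nearest[1]

def surrogate_assisted_search(budget=10, candidates_per_round=20, seed=3):
    rng = random.Random(seed)
    archive = [(x, expensive_eval(x)) for x in (0.0, 1.0)]  # initial designs
    while len(archive) < budget:
        cands = [rng.random() for _ in range(candidates_per_round)]
        # Pre-screen on the surrogate; "fabricate" only the best candidate.
        best = max(cands, key=lambda x: surrogate_predict(x, archive))
        archive.append((best, expensive_eval(best)))
    return max(archive, key=lambda pair: pair[1])[0]

best_design = surrogate_assisted_search()
```

    The cost saving comes from the budget accounting: many candidates are scored on the free surrogate per round, but only one consumes an expensive physical evaluation.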

    Neyman-Pearson theory of testing and Mayo's extensions applied to evolutionary computing

    Evolutionary computation (EC) is a relatively new discipline in computer science (Eiben & Smith, 2003). It tackles hard real-world optimization problems, e.g., problems from chemical engineering, airfoil optimization, or bioinformatics, where classical methods from mathematical optimization fail. Many theoretical results in this field are too abstract; they do not match reality. To develop problem-specific algorithms, experimentation is necessary. During the first phase of experimental research in EC (before 1980), which can be characterized as 'foundation and development', the comparison of different algorithms was mostly based on mean values; nearly no further statistics were used. In the second phase, when EC 'moved to mainstream' (1980-2000), classical statistical methods were introduced. There is a strong need to compare EC algorithms to mainstream methods from mathematical optimization. Adequate statistical tools for EC are developed in the third phase (since 2000). They should be able to cope with problems like small sample sizes, non-normal distributions, noisy results, etc. However, even while these tools are under development, they do not bridge the gap between the statistical significance of an experimental result and its scientific meaning. Based on Mayo's learning model (NPT), we propose some ideas for bridging this gap (Mayo, 1983, 1996). We present plots of the observed significance level and discuss the sequential parameter optimization (SPO) approach. SPO is a heuristic, but implementable, approach which provides a framework for a sound statistical methodology in EC (Bartz-Beielstein, 2006).
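    An observed significance level of the kind plotted in such analyses can be computed from run data with a one-sided z-test: for each hypothesized performance difference delta, how plausible is it that the true difference exceeds delta? This is a generic sketch, not the authors' code; the sample data and function names are invented for illustration:

```python
import math
import random

def observed_significance(sample, delta):
    """Observed significance level for the claim that the true mean
    difference is at most `delta`, via a one-sided z-test with the
    variance estimated from the sample."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    z = (mean - delta) / math.sqrt(var / n)
    # P(Z > z) for a standard normal Z, via the error function.
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

rng = random.Random(0)
# Hypothetical per-run performance differences (algorithm A minus B).
diffs = [rng.gauss(0.5, 1.0) for _ in range(50)]
# The observed significance rises as the claimed improvement delta grows:
curve = [observed_significance(diffs, d) for d in (0.0, 0.25, 0.5, 0.75)]
```

    Plotting such a curve over delta separates statistical significance from scientific meaning: it shows not merely whether a difference exists, but how large a difference the data actually support.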

    Identity Problems (An Interview with John B. Davis)

    In this interview, Professor Davis discusses the evolution of his career and research interests as a philosopher-economist and gives his perspective on a number of important issues in the field. He argues that historians and methodologists of economics should be engaged in the practice of economics, and that historians should be more open to philosophical analysis of the content of economic ideas. He suggests that the history of recent economics is a particularly fruitful and important area for research exactly because it is an open-ended story that is very relevant to understanding the underlying concerns and concepts of contemporary economics. He discusses his engagement with heterodox economics schools, and their engagement with a rapidly changing mainstream economics. He argues that the theory of the individual is “the central philosophical issue in economics” and discusses his extensive contributions to the issue.

    An open contribution to the understanding of the OMC: changing the conventions supporting national policies

    I discuss what the Open Method of Coordination does, not from the point of view of its procedures, but via the cognitive instruments that serve as tools for coordination, benchmarking, and adjustment of national policies. This helps to connect with the analysis of conventions with regard to employment and unemployment.

    Making sense of institutional change in China: The cultural dimension of economic growth and modernization

    Building on a new model of institutions proposed by Aoki and the systemic approach to economic civilizations outlined by Kuran, this paper attempts an analysis of the cultural foundations of recent Chinese economic development. I argue that the cultural impact needs to be conceived as a creative process that involves linguistic entities and other public social items in order to provide integrative meaning to economic interactions and identities to the different agents involved. I focus on three phenomena that stand at the center of economic culture in China: networks, localism, and modernism. I eschew the standard dualism of individualism vs. collectivism in favour of a more detailed view of the self in social relationships. The Chinese pattern of social relations, guanxi, is also a constituent of localism, i.e., a peculiar arrangement and resulting dynamics of central-local interactions in governing the economy. Localism is balanced by culturalist controls of the center, which in contemporary China build on the worldview of modernism. Thus, economic modernization is a cultural phenomenon in its own right. I summarize these interactions in a process analysis based on Aoki's framework. Keywords: Aoki, culture and the economy, emics/etics, guanxi, relational collectivism, central/local government relations, culturalism, population quality, consumerism
