
    Myths and Legends of the Baldwin Effect

    This position paper argues that the Baldwin effect is widely misunderstood by the evolutionary computation community. The misunderstandings appear to fall into two general categories. Firstly, it is commonly believed that the Baldwin effect is concerned with the synergy that results when there is an evolving population of learning individuals. This is only half of the story; the full story is more complicated and more interesting. The Baldwin effect is concerned with the costs and benefits of lifetime learning by individuals in an evolving population. Several researchers have focussed exclusively on the benefits, but there is much to be gained from attention to the costs. This paper explains the two sides of the story and enumerates ten of the costs and benefits of lifetime learning by individuals in an evolving population. Secondly, there is a cluster of misunderstandings about the relationship between the Baldwin effect and Lamarckian inheritance of acquired characteristics: the Baldwin effect is not Lamarckian; a Lamarckian algorithm is not better for most evolutionary computing problems than a Baldwinian algorithm; and, finally, Lamarckian inheritance is not a better model of memetic (cultural) evolution than the Baldwin effect.
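
    To make the distinction concrete: in a Baldwinian algorithm, lifetime learning improves the fitness that selection sees but leaves the genome untouched, whereas a Lamarckian algorithm writes the learned phenotype back into the genome. The toy Python sketch below contrasts the two on a bitstring problem; the problem, learning rule, and parameter values are illustrative assumptions, not the paper's experiments.

```python
import random

TARGET_LEN = 20  # toy problem: evolve an all-ones bitstring

def fitness(genome):
    return sum(genome)

def learn(genome, steps=5):
    """Lifetime learning: set up to `steps` randomly chosen bits to 1."""
    phenotype = list(genome)
    for _ in range(steps):
        phenotype[random.randrange(TARGET_LEN)] = 1
    return phenotype

def mutate(genome, rate=0.05):
    return [1 - b if random.random() < rate else b for b in genome]

def evolve(lamarckian, pop_size=50, generations=100):
    pop = [[random.randint(0, 1) for _ in range(TARGET_LEN)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = []
        for genome in pop:
            phenotype = learn(genome)
            # Baldwinian: learning shapes selection only; the genome is untouched.
            # Lamarckian: the learned phenotype is inherited directly.
            inherited = phenotype if lamarckian else genome
            scored.append((fitness(phenotype), inherited))
        scored.sort(reverse=True)
        parents = [g for _, g in scored[: pop_size // 2]]
        pop = [mutate(random.choice(parents)) for _ in range(pop_size)]
    return max(fitness(g) for g in pop)  # innate (genome) fitness, learning excluded

if __name__ == "__main__":
    print("Baldwinian innate fitness:", evolve(lamarckian=False))
    print("Lamarckian innate fitness:", evolve(lamarckian=True))
```

    The sketch only illustrates the mechanism; it does not by itself settle which variant performs better, which is precisely the cost-benefit question the paper addresses.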

    Full Lensing Analysis of Abell 1703: Comparison of Independent Lens-Modelling Techniques

    The inner mass profile of the relaxed cluster Abell 1703 is analysed by two very different strong-lensing techniques applied to deep ACS and WFC3 imaging. Our parametric method has the accuracy required to reproduce the many sets of multiple images, based on the assumption that mass approximately traces light. We test this assumption with a fully non-parametric, adaptive-grid method that uses no knowledge of the galaxy distribution. Differences between the methods are seen on fine scales due to member galaxies, which must be included in models designed to search for lensed images, but on larger scales the general distribution of dark matter is in good agreement, with very similar radial mass profiles. We add undiluted weak-lensing measurements from deep multi-colour Subaru imaging to obtain a fully model-independent mass profile out to the virial radius and beyond. Consistency is found in the region of overlap between the weak and strong lensing, and the full mass profile is well described by an NFW model with concentration parameter $c_{\rm vir}\simeq 7.15\pm0.5$ (and $M_{\rm vir}\simeq 1.22\pm0.15\times10^{15}\,M_{\odot}/h$). Abell 1703 lies above the $c$--$M$ relation predicted for the standard $\Lambda$CDM model, similar to other massive relaxed clusters with accurately determined lensing-based profiles. Comment: 12 pages, 17 figures, 1 table; accepted for publication in MNRAS. V2 includes minor changes and a revised figure.
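
    For readers who want to use the quoted fit, the NFW enclosed-mass profile has a closed form, so the numbers above translate directly into $M(<r)$. A minimal sketch, assuming only the $c_{\rm vir}$ and $M_{\rm vir}$ values quoted in the abstract (the function name and sample radii are illustrative):

```python
import numpy as np

def nfw_mass_fraction(r_over_rvir, c_vir):
    """Enclosed NFW mass M(<r) as a fraction of M_vir.

    Uses the standard NFW mass integral m(y) = ln(1 + y) - y / (1 + y),
    so that M(<r) / M_vir = m(c * x) / m(c) with x = r / r_vir.
    """
    m = lambda y: np.log1p(y) - y / (1.0 + y)
    x = np.asarray(r_over_rvir, dtype=float)
    return m(c_vir * x) / m(c_vir)

# Best-fit values quoted in the abstract for Abell 1703.
C_VIR = 7.15
M_VIR = 1.22e15  # M_sun / h

for x in (0.1, 0.5, 1.0):
    print(f"r = {x:.1f} r_vir: M(<r) = {nfw_mass_fraction(x, C_VIR) * M_VIR:.3e} M_sun/h")
```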

    Optimisation of composite boat hulls using first principles and design rules

    The design process is becoming increasingly complex, with designers balancing societal, environmental and political issues. Composite materials are attractive to designers due to their excellent strength-to-weight ratio, low corrosion and ability to be tailored to the application. One drawback of composite materials is their low stiffness, so for many applications they are stiffened. These stiffened structures create a complex engineering problem: they must be designed for the lowest cost and mass while still withstanding the applied loads. This paper therefore examines how rapid assessment of stiffened boat structures can be performed at the concept design stage. The Navier grillage method is combined with genetic algorithms to produce panels optimised for mass and cost. These models are constrained using design rules, in this case ISO 12215 and Lloyd's Register Rules for Special Service Craft. The results show a method that rapidly produces a reasonable stiffened structure and could be used in advanced concept design or early detailed design to reduce design time.
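
    A minimal sketch of the kind of rule-constrained genetic-algorithm panel optimization described above. The mass and cost models and the minimum-thickness rule are placeholders, not actual ISO 12215 or Lloyd's Register formulae; only the overall structure (GA search, rule check as a constraint, weighted mass/cost objective) reflects the abstract.

```python
import random

# Hypothetical design variables: plate thickness t (mm), stiffener spacing s (mm).

def mass(t, s):
    return 0.05 * t + 200.0 / s   # thicker plate is heavier; wider spacing means less framing

def cost(t, s):
    return 0.8 * t + 500.0 / s    # stiffener fabrication dominates labour cost

def feasible(t, s):
    t_min = 4.0 + 0.004 * s       # placeholder rule: required thickness grows with spacing
    return t >= t_min

def objective(t, s, w_mass=0.6, w_cost=0.4):
    if not feasible(t, s):
        return float("inf")       # death penalty for design-rule violations
    return w_mass * mass(t, s) + w_cost * cost(t, s)

def ga(pop_size=40, generations=200):
    pop = [(random.uniform(3, 20), random.uniform(200, 1000)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda d: objective(*d))
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            (t1, s1), (t2, s2) = random.sample(parents, 2)
            t = 0.5 * (t1 + t2) + random.gauss(0, 0.5)    # blend crossover + Gaussian mutation
            s = 0.5 * (s1 + s2) + random.gauss(0, 20.0)
            children.append((max(t, 3.0), min(max(s, 200.0), 1000.0)))
        pop = parents + children
    return min(pop, key=lambda d: objective(*d))

t_best, s_best = ga()
print(f"t = {t_best:.1f} mm, s = {s_best:.0f} mm, objective = {objective(t_best, s_best):.2f}")
```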

    Energy-Aware Cloud Management through Progressive SLA Specification

    Novel energy-aware cloud management methods dynamically reallocate computation across geographically distributed data centers to exploit regional differences in electricity price and temperature. As a result, a managed VM may suffer occasional downtimes. Current cloud providers offer only high-availability VMs, without enough flexibility to apply such energy-aware management. In this paper we show how to analyse past traces of dynamic cloud management actions, based on electricity prices and temperatures, to estimate VM availability and price values. We propose a novel SLA specification approach for offering VMs with different guaranteed availability and price values over multiple SLAs, enabling flexible energy-aware cloud management. We determine the optimal number of such SLAs as well as their guaranteed availability and price values. We evaluate our approach in a user SLA-selection simulation using Wikipedia and Grid'5000 workloads. The results show higher customer conversion and 39% average energy savings per VM. Comment: 14 pages, conference.
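
    A minimal sketch of the trace-analysis step: estimate per-VM availability from historical downtime and group VMs into a few SLA classes, each with a guaranteed availability and a price. The trace values, number of classes, and pricing rule are illustrative assumptions, not the paper's method for determining the optimal number of SLAs.

```python
# Availability = fraction of the period the VM was up.
def availability(downtime_hours, period_hours=24 * 365):
    return 1.0 - downtime_hours / period_hours

# Hypothetical per-VM downtime (hours/year) observed from past energy-aware migrations.
trace = [1.0, 4.0, 9.0, 20.0, 45.0, 90.0, 130.0]
avails = sorted(availability(d) for d in trace)

BASE_PRICE = 0.10   # $/hour for a conventional high-availability VM (assumed)
N_CLASSES = 3       # number of SLA classes to offer (assumed, not optimized here)

size = len(avails) // N_CLASSES
for k in range(N_CLASSES):
    end = (k + 1) * size if k < N_CLASSES - 1 else len(avails)
    chunk = avails[k * size:end]
    guaranteed = min(chunk)                # class guarantees its worst member
    price = BASE_PRICE * guaranteed        # simple proportional pricing assumption
    print(f"SLA {k + 1}: availability >= {guaranteed:.4f}, price = ${price:.4f}/h")
```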

    Reliability-based optimization design of geosynthetic reinforced embankment slopes

    This study examines the optimization design of geosynthetic reinforced embankment slopes (GRES) considering both economic benefits and technical safety requirements. In engineering design, cost is always a major concern. To minimize cost, engineers seek an optimal combination of design parameters among the considered alternatives while ensuring the optimal solution is safe. Reliability-based optimization (RBO) is a technique that provides engineers with the minimum-cost design while satisfying all technical design requirements. The research goal of this study is to implement a mathematical formulation algorithm of the RBO technique in GRES design. To achieve this goal, slope stability is studied using the limit equilibrium method (LEM). To account for geotechnical uncertainties, the first-order reliability method (FORM) is adopted to perform probabilistic slope stability analysis, locate the critical slip surfaces, and assess the reliability of the slope system. The slope stability and reliability results are then used as the crucial constraints in the subsequent RBO procedure, in which the constrained optimization problem is solved with a genetic algorithm (GA). Sensitivity analysis is carried out on the basis of the probabilistic slope stability analysis to highlight the influence of each random variable on the probabilistic performance of the slope system, and thereby infer the corresponding impact on the optimization design. A framework for implementing RBO in GRES design is proposed, and an engineering case history is studied to demonstrate its practical application. Compared to the conventional (manual) process, the proposed design framework is more systematic and effective, especially given the large number of design variables involved in geosynthetic reinforced slopes.
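
    A minimal sketch of coupling a reliability constraint to a GA, in the spirit of the procedure described above. For a linear limit state g = R - S with independent normal variables, the FORM reliability index reduces to beta = mu_g / sigma_g, which keeps the example self-contained; the resistance and cost models and the target beta are placeholder assumptions, and a real design would evaluate beta on LEM slip surfaces.

```python
import math, random

def reliability_index(spacing_m):
    # Hypothetical: closer geosynthetic spacing raises mean resistance.
    mu_R, sigma_R = 120.0 / spacing_m, 15.0   # resistance (placeholder units)
    mu_S, sigma_S = 80.0, 12.0                # load effect
    return (mu_R - mu_S) / math.hypot(sigma_R, sigma_S)  # exact FORM result, linear-normal case

def cost(spacing_m):
    return 50.0 / spacing_m                   # more reinforcement layers cost more (placeholder)

BETA_TARGET = 3.0  # a common geotechnical target reliability index

def penalized_cost(spacing_m):
    beta = reliability_index(spacing_m)
    return cost(spacing_m) + 1e6 * max(0.0, BETA_TARGET - beta) ** 2

# Tiny GA over one design variable: vertical reinforcement spacing (m).
pop = [random.uniform(0.2, 1.0) for _ in range(30)]
for _ in range(100):
    pop.sort(key=penalized_cost)
    elite = pop[:15]
    pop = elite + [min(max(random.choice(elite) + random.gauss(0, 0.03), 0.2), 1.0)
                   for _ in range(15)]
best = min(pop, key=penalized_cost)
print(f"spacing = {best:.2f} m, beta = {reliability_index(best):.2f}, cost = {cost(best):.1f}")
```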

    Half a billion simulations: evolutionary algorithms and distributed computing for calibrating the SimpopLocal geographical model

    Multi-agent geographical models integrate very large numbers of spatial interactions. Validating such models requires a large amount of computing for their simulation and calibration. Here a new data-processing chain, including an automated calibration procedure, is tested on a computational grid using evolutionary algorithms. This is applied for the first time to a geographical model designed to simulate the evolution of an early urban settlement system. The method reduces computing time and provides robust results. Using it, we identify several parameter settings that minimise three objective functions quantifying how closely the model results match a reference pattern. As the values of each parameter in the different settings are very close, this estimation considerably reduces the initially possible domain of variation of the parameters. The model is thus a useful tool for further multiple applications to empirical historical situations.
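
    A minimal sketch of the calibration pattern described above: candidate parameter sets are simulated in parallel (a process pool standing in for the computational grid) and retained by Pareto dominance on three objectives. The toy objective functions and parameter ranges are illustrative assumptions, not SimpopLocal's.

```python
import random
from multiprocessing import Pool

def simulate(params):
    a, b, c = params
    # Placeholder objectives; the real ones quantify mismatch with the reference pattern.
    return (abs(a - 0.3), abs(b - 1.7), abs(c - 0.05))

def dominates(u, v):
    return all(x <= y for x, y in zip(u, v)) and any(x < y for x, y in zip(u, v))

def mutate(p, scale=0.1):
    return tuple(max(0.0, x + random.gauss(0, scale)) for x in p)

if __name__ == "__main__":
    pop = [(random.uniform(0, 1), random.uniform(0, 3), random.uniform(0, 0.2))
           for _ in range(40)]
    with Pool() as pool:
        for _ in range(50):
            scores = pool.map(simulate, pop)          # parallel "grid" evaluation
            front = [p for p, s in zip(pop, scores)   # keep non-dominated candidates
                     if not any(dominates(t, s) for t in scores if t is not s)]
            pop = front + [mutate(random.choice(front)) for _ in range(40 - len(front))]
    print("parameter sets on the final Pareto front:")
    for p in front[:5]:
        print(tuple(round(x, 3) for x in p))
```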

    Optimización del diseño estructural de pavimentos asfálticos para calles y carreteras

    The construction of asphalt pavements in streets and highways is an activity that requires optimizing the consumption of significant economic and natural resources. Pavement design optimization must reconcile contradictory objectives arising from the availability of resources and users' needs. This dissertation explores the application of metaheuristics to optimize the design of asphalt pavements using an incremental design based on the prediction of damage and vehicle operating costs (VOC). The costs are proportional to energy and resource consumption and to polluting emissions. The evolution of asphalt pavement design and of metaheuristic optimization techniques on this topic is reviewed. Four computer programs were developed: (1) UNLEA, a program for the structural analysis of multilayer systems; (2) PSO-UNLEA, a program that uses the particle swarm optimization (PSO) metaheuristic for the backcalculation of pavement moduli; (3) UNPAVE, an incremental pavement design program based on the equations of the North American MEPDG that includes the computation of construction costs and of vehicle operating costs based on the IRI; (4) PSO-PAVE, a PSO program that searches for layer thicknesses that optimize the design considering construction and vehicle operating costs. The case studies show that the backcalculation and structural design of pavements can be optimized by PSO under restrictions on the thicknesses and the selection of materials. Future developments should reduce the computational cost and calibrate the pavement performance and VOC models.
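
    A minimal sketch of PSO-based backcalculation in the spirit of PSO-UNLEA: a particle swarm searches for layer moduli whose predicted deflection basin matches measured deflections. The one-line forward model is a placeholder for a layered-elastic analysis; the sensor offsets, bounds, and synthetic "measured" data are illustrative assumptions.

```python
import random

OFFSETS = [0.0, 0.3, 0.6, 0.9]  # deflection sensor offsets (m), hypothetical

def forward(E1, E2):
    # Placeholder basin: stiffer layers and larger offsets give smaller deflections.
    # A real layered-elastic model would constrain each layer separately.
    return [1.6e6 / (E1 + 0.5 * E2) / (1.0 + 2.0 * r) for r in OFFSETS]

TRUE = (3000.0, 200.0)          # "unknown" moduli (MPa) used to synthesize the data
MEASURED = forward(*TRUE)

def rmse(p):
    return (sum((a - b) ** 2 for a, b in zip(forward(*p), MEASURED)) / len(MEASURED)) ** 0.5

def pso(n=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    lo, hi = (500.0, 50.0), (10000.0, 1000.0)   # per-layer modulus bounds (MPa)
    pos = [[random.uniform(lo[d], hi[d]) for d in range(2)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=rmse)
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo[d]), hi[d])
            if rmse(pos[i]) < rmse(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=rmse)
    return gbest

E1, E2 = pso()
print(f"backcalculated E1 = {E1:.0f} MPa, E2 = {E2:.0f} MPa (true: {TRUE})")
```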

    An Evolutionary Neural Network Approach for Slopes Stability Assessment

    A major current challenge for developed and developing countries is how to keep large-scale transportation infrastructure networks operational under all conditions. Network extensions and budgetary constraints on maintenance are among the main factors that make transportation network management a non-trivial task. At the same time, the high number of parameters affecting the stability of engineered slopes makes their assessment complex and difficult to accomplish. Aiming to help achieve more efficient management of such an important element of modern society, a first attempt at a classification system for rock and soil cuttings, as well as embankments, based on visual features was made in this paper using soft computing algorithms. The achieved results, although interesting, nevertheless have some important limitations to their successful use as auxiliary tools for transportation network management tasks. Accordingly, we carried out new experiments combining modern optimization and soft computing algorithms. One of the main challenges is the selection of the best set of input features for a feedforward neural network for earthwork hazard category (EHC) identification; we applied a genetic algorithm (GA) for this purpose. Another challenging task is the asymmetric distribution of the data, since good conditions are typically much more common than bad ones. To address this, three training sampling approaches were explored: no resampling, the synthetic minority oversampling technique (SMOTE), and oversampling. Some relevant observations were taken from the optimization process, namely the identification of which variables are most frequently selected for EHC identification. After finding the most efficient models, a detailed sensitivity analysis was applied to the selected models, allowing us to measure the relative importance of each attribute in EHC identification.
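
    A minimal sketch of the GA-driven feature selection with SMOTE rebalancing described above, assuming scikit-learn and imbalanced-learn are installed. The synthetic dataset, network size, and GA settings are illustrative assumptions; only the pipeline shape (a binary feature mask evolved by a GA, SMOTE applied to the training split, a feedforward network scored on held-out data) follows the abstract.

```python
import random
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import f1_score
from imblearn.over_sampling import SMOTE  # from the imbalanced-learn package

# Synthetic stand-in for the visual-feature dataset: 20 features, 90/10 class imbalance.
X, y = make_classification(n_samples=600, n_features=20, n_informative=6,
                           weights=[0.9], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

def fitness(mask):
    cols = [i for i, b in enumerate(mask) if b]
    if not cols:
        return 0.0
    # Rebalance the minority class on the training split only, then train the network.
    Xr, yr = SMOTE(random_state=0).fit_resample(X_tr[:, cols], y_tr)
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                        random_state=0).fit(Xr, yr)
    return f1_score(y_te, clf.predict(X_te[:, cols]))

def ga(pop_size=12, generations=10):
    pop = [[random.randint(0, 1) for _ in range(X.shape[1])] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        keep = pop[: pop_size // 2]
        pop = keep + [[1 - b if random.random() < 0.1 else b   # bit-flip mutation
                       for b in random.choice(keep)] for _ in range(pop_size - len(keep))]
    return max(pop, key=fitness)

best = ga()
print("selected features:", [i for i, b in enumerate(best) if b])
```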