232 research outputs found

    Denoising Autoencoders for fast Combinatorial Black Box Optimization

    Full text link
    Estimation of Distribution Algorithms (EDAs) require flexible probability models that can be efficiently learned and sampled. Autoencoders (AE) are generative stochastic networks with these desired properties. We integrate a special type of AE, the Denoising Autoencoder (DAE), into an EDA and evaluate the performance of DAE-EDA on several combinatorial optimization problems with a single objective. We assess the number of fitness evaluations as well as the required CPU times. We compare the results to those of the Bayesian Optimization Algorithm (BOA) and of RBM-EDA, another EDA based on a generative neural network that has proven competitive with BOA. For the considered problem instances, DAE-EDA is considerably faster than BOA and RBM-EDA, sometimes by orders of magnitude. The number of fitness evaluations is higher than for BOA, but competitive with RBM-EDA. These results show that DAEs can be useful tools for problems with low but non-negligible fitness evaluation costs. Comment: corrected typos and small inconsistencies
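    The EDA loop described above alternates sampling, selection, and model building. A minimal sketch of that loop on the OneMax problem, where a univariate marginal model stands in for the denoising autoencoder (all names and parameters are illustrative, not from the paper):

```python
import numpy as np

def onemax(x):
    """Fitness of a bit string: number of ones."""
    return int(x.sum())

def eda_onemax(n_bits=20, pop_size=100, generations=30, seed=0):
    """Minimal EDA loop; a per-bit marginal model stands in for the DAE."""
    rng = np.random.default_rng(seed)
    probs = np.full(n_bits, 0.5)  # model parameters, start uniform
    best = None
    for _ in range(generations):
        # Sample a population from the current model.
        pop = (rng.random((pop_size, n_bits)) < probs).astype(int)
        order = np.argsort([onemax(ind) for ind in pop])[::-1]
        elite = pop[order[: pop_size // 2]]
        # Model building: DAE-EDA trains a denoising autoencoder on `elite`
        # and samples offspring by corrupting and denoising; here we simply
        # refit per-bit marginals (clipped to keep sampling stochastic).
        probs = elite.mean(axis=0).clip(0.05, 0.95)
        if best is None or onemax(pop[order[0]]) > onemax(best):
            best = pop[order[0]].copy()
    return best

best = eda_onemax()
```

    The trade-off the abstract reports lives in the model-building line: a DAE captures dependencies between bits that marginals miss, at the cost of training time per generation.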

    The cohesive frictional crack model applied to the analysis of the dam-foundation joint

    Get PDF
    The mechanical behaviour of dam-foundation joints plays a key role in concrete dam engineering, since the joint is the weakest part of the structure and the evolutionary crack process occurring along it determines the global load-bearing capacity. The reference volume involved in the above-mentioned process is so large that it cannot be tested in a laboratory: structural analysis has to be carried out by numerical modelling. The use of the asymptotic expansions proposed by Karihaloo and Xiao at the tip of a crack with normal cohesion and Coulomb friction can overcome the numerical difficulties that appear in large-scale problems when the Newton-Raphson procedure is applied to a set of equilibrium equations based on ordinary shape functions (Standard Finite Element Method). In this way it is possible to analyze problems with friction and crack propagation under the constant load induced by hydro-mechanical coupling. For each position of the fictitious crack tip, the condition K1=K2=0 allows us to obtain the external load level and the tangential stress at the tip. If the joint tangential strength is larger than the value obtained, the solution is acceptable, because the tensile strength is assumed negligible and the condition K1=0 is sufficient to cause the crack growth. Otherwise, the load level obtained can be considered as an overestimation of the critical value and a special form of contact problem has to be solved along the fictitious process zone. For the boundary condition analyzed (ICOLD benchmark on gravity dam model), after an initial increasing phase, the water lag remains almost constant and the maximum value of load-carrying capacity is achieved when the water lag reaches its constant value.

    Phenomenological Comparison of Models with Extended Higgs Sectors

    Full text link
    Beyond the Standard Model (SM) extensions usually include extended Higgs sectors. Models with singlet or doublet fields are the simplest ones that are compatible with the ρ parameter constraint. The discovery of new non-SM Higgs bosons and the identification of the underlying model require dedicated Higgs property analyses. In this paper, we compare several Higgs sectors featuring 3 CP-even neutral Higgs bosons that are also motivated by their simplicity and their capability to solve some of the flaws of the SM. They are: the SM extended by a complex singlet field (CxSM), the singlet extension of the 2-Higgs-Doublet Model (N2HDM), and the Next-to-Minimal Supersymmetric SM extension (NMSSM). In addition, we analyse the CP-violating 2-Higgs-Doublet Model (C2HDM), which provides 3 neutral Higgs bosons with a pseudoscalar admixture. This allows us to compare the effects of singlet and pseudoscalar admixtures. Through dedicated scans of the allowed parameter space of the models, we analyse the phenomenologically viable scenarios from the viewpoint of the SM-like Higgs boson and of the signal rates of the non-SM-like Higgs bosons to be found. In particular, we analyse the effect of singlet/pseudoscalar admixture, and the potential to differentiate these models in the near future. This is supported by a study of coupling sums of the Higgs bosons to massive gauge bosons and to fermions, where we identify features that allow us to distinguish the models, in particular when only part of the Higgs spectrum is discovered. Our results can be taken as guidelines for future LHC data analyses, by the ATLAS and CMS experiments, to identify specific benchmark points aimed at revealing the underlying model. Comment: Matches journal version; figures for NMSSM changed; conclusions unchanged
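    The coupling sums mentioned above follow from unitarity of the scalar mixing matrix: in a singlet extension, the h_iVV couplings (normalized to the SM value) are the doublet components of the mass eigenstates, and orthogonality forces their squares to sum to one. A hedged numeric check of that sum rule, with a random orthogonal matrix standing in for a scanned mixing matrix:

```python
import numpy as np

# A random 3x3 orthogonal matrix (via QR) stands in for the neutral-Higgs
# mixing matrix of a singlet extension. The effective h_i VV couplings,
# normalized to the SM value, are the doublet-component entries, taken
# here to be the first column (an illustrative convention).
rng = np.random.default_rng(42)
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
c_vv = R[:, 0]                       # couplings of h_1, h_2, h_3 to VV
sum_rule = float((c_vv ** 2).sum())  # orthogonality forces this to 1
```

    This is why discovering only part of the spectrum is still informative: a measured deficit in the squared couplings of the observed states bounds the couplings of the states not yet found.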

    Hyperspace geography: Visualizing fitness landscapes beyond 4D

    Get PDF
    Human perception is finely tuned to extract structure about the 4D world of time and space as well as properties such as color and texture. Developing intuitions about spatial structure beyond 4D requires exploiting other perceptual and cognitive abilities. One of the most natural ways to explore complex spaces is for a user to actively navigate through them, using local explorations and global summaries to develop intuitions about structure, and then testing the developing ideas by further exploration. This article provides a brief overview of a technique for visualizing surfaces defined over moderate-dimensional binary spaces, by recursively unfolding them onto a 2D hypergraph. We briefly summarize the uses of a freely available Web-based visualization tool, Hyperspace Graph Paper (HSGP), for exploring fitness landscapes and search algorithms in evolutionary computation. HSGP provides a way for a user to actively explore a landscape, from simple tasks such as mapping the neighborhood structure of different points, to seeing global properties such as the size and distribution of basins of attraction or how different search algorithms interact with landscape structure. It has been most useful for exploring recursive and repetitive landscapes, and its strength is that it allows intuitions to be developed through active navigation by the user, and exploits the visual system's ability to detect pattern and texture. The technique is most effective when applied to continuous functions over Boolean variables using 4 to 16 dimensions.
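    The core idea, flattening a binary hypercube onto a 2D grid, can be sketched with a simple alternating-bits layout. This is an illustrative scheme in the spirit of recursive unfolding; HSGP's exact recursive layout may differ:

```python
def unfold(bits):
    """Map a binary string to a 2D cell by alternating its bits between
    the x and y axes (an illustrative unfolding; HSGP's actual layout
    may differ). Nested bit pairs give nested 2x2 blocks, so low-order
    neighbourhood structure recurs as repeated local patterns."""
    x = int("".join(bits[0::2]) or "0", 2)
    y = int("".join(bits[1::2]) or "0", 2)
    return x, y

# Lay out all points of a 4-bit Boolean space on a 4x4 grid of cells;
# a fitness landscape would then be drawn as a value per cell.
grid = {}
for n in range(16):
    b = format(n, "04b")
    grid[unfold(b)] = b
```

    Plotting a fitness value in each cell turns Hamming-space regularities into visible texture, which is what the abstract credits for the tool's effectiveness on recursive and repetitive landscapes.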

    Symbiotic Tabu Search

    Get PDF

    Evolvable Neuronal Paths: A Novel Basis for Information and Search in the Brain

    Get PDF
    We propose a previously unrecognized kind of informational entity in the brain that is capable of acting as the basis for unlimited hereditary variation in neuronal networks. This unit is a path of activity through a network of neurons, analogous to a path taken through a hidden Markov model. To prove in principle the capabilities of this new kind of informational substrate, we show how a population of paths can be used as the hereditary material for a neuronally implemented genetic algorithm (the Swiss-army knife of black-box optimization techniques), which we have proposed elsewhere could operate at somatic timescales in the brain. We compare this to the same genetic algorithm that uses a standard 'genetic' informational substrate, i.e. non-overlapping discrete genotypes, on a range of optimization problems. A path evolution algorithm (PEA) is defined as any algorithm that implements natural selection of paths in a network substrate. A PEA is a previously unrecognized type of natural selection that is well suited for implementation by biological neuronal networks with structural plasticity. The important similarities and differences between a standard genetic algorithm and a PEA are considered. Whilst most experiments are conducted on an abstract network model, at the conclusion of the paper a slightly more realistic neuronal implementation of a PEA is outlined based on Izhikevich spiking neurons. Finally, experimental predictions are made for the identification of such informational paths in the brain.
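    The defining feature of a PEA is that the heritable unit is a path constrained by a network's connectivity, so variation can only rewire and re-grow along existing edges. A minimal sketch of that selection loop on a toy graph (the network, objective, and parameters are all illustrative, not taken from the paper):

```python
import random

# Toy network: a ring of 10 nodes with cross-chords (illustrative only).
NODES = list(range(10))
ADJ = {n: [(n + 1) % 10, (n - 1) % 10, (n + 5) % 10] for n in NODES}

def random_path(length, rng):
    """A random walk through the network serves as an initial genotype."""
    path = [rng.choice(NODES)]
    while len(path) < length:
        path.append(rng.choice(ADJ[path[-1]]))
    return path

def fitness(path):
    """Illustrative objective: reward paths visiting many distinct nodes."""
    return len(set(path))

def mutate(path, rng):
    """Rewire at a random point and re-grow by a random walk, so
    variation always stays within the network's connectivity."""
    cut = rng.randrange(1, len(path))
    child = path[:cut]
    while len(child) < len(path):
        child.append(rng.choice(ADJ[child[-1]]))
    return child

def pea(pop_size=20, path_len=12, generations=40, seed=1):
    rng = random.Random(seed)
    pop = [random_path(path_len, rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]            # selection of paths
        pop = survivors + [mutate(rng.choice(survivors), rng)
                           for _ in range(pop_size - len(survivors))]
    return max(pop, key=fitness)

best = pea()
```

    Unlike the discrete genotypes of a standard genetic algorithm, two paths here can overlap on shared nodes and edges, which is the property the paper highlights as suited to neuronal substrates with structural plasticity.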

    The C2HDM revisited

    Get PDF
    The complex two-Higgs-doublet model is one of the simplest ways to extend the scalar sector of the Standard Model to include a new source of CP-violation. The model has been used as a benchmark model to search for CP-violation at the LHC and as a possible explanation for the matter-antimatter asymmetry of the Universe. In this work, we re-analyse in full detail the softly broken Z2-symmetric complex two-Higgs-doublet model (C2HDM). We provide the code C2HDM_HDECAY implementing the C2HDM in the well-known HDECAY program, which calculates the decay widths including the state-of-the-art higher-order QCD corrections and the relevant off-shell decays. Using C2HDM_HDECAY together with the most relevant theoretical and experimental constraints, including electric dipole moments (EDMs), we review the parameter space of the model and discuss its phenomenology. In particular, we find cases where large CP-odd couplings to fermions are still allowed and provide benchmark points for these scenarios. We examine the prospects of discovering CP-violation at the LHC and show how theoretically motivated measures of CP-violation correlate with observables.
    The work of D.F., J.C.R. and J.P.S. is supported in part by the Portuguese Fundação para a Ciência e a Tecnologia (FCT) under contracts CERN/FIS-NUC/0010/2015 and UID/FIS/00777/2013. MM acknowledges financial support from the DFG project "Precision Calculations in the Higgs Sector - Paving the Way to the New Physics Landscape" (ID: MU 3138/1-1).