
    A Feature-Based Analysis on the Impact of Set of Constraints for e-Constrained Differential Evolution

    Different types of evolutionary algorithms have been developed for constrained continuous optimization. We carry out a feature-based analysis of evolved constrained continuous optimization instances to understand the characteristics of constraints that make problems hard for evolutionary algorithms. In our study, we examine how various sets of constraints influence the behaviour of e-Constrained Differential Evolution. Investigating the evolved instances, we gain insight into which types of constraints, and which of their features, make a problem difficult for the examined algorithm. Comment: 17 Pages
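
    As a rough illustration of the comparison rule underlying e-constrained differential evolution: candidates are ranked by total constraint violation unless both violations lie below a relaxation level that shrinks over the run, in which case the objective value decides. The Python sketch below uses a placeholder sphere objective, a single linear constraint and a simple assumed relaxation schedule, none of which come from the paper:

```python
import numpy as np

def objective(x):
    # Placeholder objective (sphere function), not one of the evolved instances
    return float(np.sum(x ** 2))

def violation(x):
    # Placeholder single linear constraint g(x) = sum(x) - 1 <= 0
    return max(0.0, float(np.sum(x) - 1.0))

def eps_better(fa, va, fb, vb, eps):
    # e-level comparison: if both violations are within eps, compare objectives;
    # otherwise the smaller constraint violation wins (ties broken by objective)
    if (va <= eps and vb <= eps) or va == vb:
        return fa < fb
    return va < vb

def eps_de(dim=5, pop_size=30, gens=200, F=0.7, CR=0.9, eps0=1.0, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    fit = np.array([objective(x) for x in pop])
    vio = np.array([violation(x) for x in pop])
    for g in range(gens):
        eps = eps0 * (1.0 - g / gens) ** 4       # shrinking e-level (assumed schedule)
        for i in range(pop_size):
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = rng.choice(idx, 3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])          # DE/rand/1 mutation
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True                  # binomial crossover
            trial = np.where(cross, mutant, pop[i])
            tf, tv = objective(trial), violation(trial)
            if eps_better(tf, tv, fit[i], vio[i], eps):
                pop[i], fit[i], vio[i] = trial, tf, tv
    best = min(range(pop_size), key=lambda i: (vio[i] > 0.0, vio[i], fit[i]))
    return pop[best], fit[best], vio[best]

x_best, f_best, v_best = eps_de()
print("best objective:", f_best, "violation:", v_best)
```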

    A Feature-Based Comparison of Evolutionary Computing Techniques for Constrained Continuous Optimisation

    Evolutionary algorithms have been frequently applied to constrained continuous optimisation problems. We carry out feature-based comparisons of different types of evolutionary algorithms, such as evolution strategies, differential evolution and particle swarm optimisation, for constrained continuous optimisation. In our study, we examine how sets of constraints influence the difficulty of obtaining close-to-optimal solutions. Using a multi-objective approach, we evolve constrained continuous problems with sets of linear and/or quadratic constraints on which the different evolutionary approaches show a significant difference in performance. Afterwards, we discuss the features of the constraints that lead to this difference in performance among the evolutionary approaches under consideration. Comment: 16 Pages, 2 Figures
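
    A feature-based comparison of this kind relies on numerical descriptors of the constraint set. The sketch below computes two such descriptors for a set of linear constraints A x <= b: an estimated feasibility ratio and the pairwise angles between constraint normals. The specific features, sampling box and example constraints are illustrative assumptions, not the paper's feature set:

```python
import numpy as np

def feasibility_ratio(A, b, lower=-5.0, upper=5.0, n_samples=100_000, seed=0):
    """Fraction of uniform random points in [lower, upper]^d satisfying A x <= b."""
    rng = np.random.default_rng(seed)
    d = A.shape[1]
    x = rng.uniform(lower, upper, size=(n_samples, d))
    feasible = np.all(x @ A.T <= b, axis=1)
    return feasible.mean()

def constraint_angles(A):
    """Pairwise angles (degrees) between the normal vectors of the linear constraints."""
    n = A / np.linalg.norm(A, axis=1, keepdims=True)
    cos = np.clip(n @ n.T, -1.0, 1.0)
    iu = np.triu_indices(len(A), k=1)
    return np.degrees(np.arccos(cos[iu]))

# Example: two linear constraints in three dimensions
A = np.array([[1.0, 1.0, 0.0],
              [1.0, -1.0, 0.5]])
b = np.array([1.0, 2.0])
print("feasibility ratio:", feasibility_ratio(A, b))
print("angles between constraint normals:", constraint_angles(A))
```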

    Solving the G-problems in less than 500 iterations: Improved efficient constrained optimization by surrogate modeling and adaptive parameter control

    Constrained optimization of high-dimensional numerical problems plays an important role in many scientific and industrial applications. In many of these applications, function evaluations are severely limited and no analytical information about the objective function or the constraint functions is available. For such expensive black-box optimization tasks, the constrained optimization algorithm COBRA was proposed, making use of RBF surrogate modeling for both the objective and the constraint functions. COBRA has shown remarkable success in reliably solving complex benchmark problems in less than 500 function evaluations. Unfortunately, it requires careful parameter adjustment in order to do so. In this work we present a new self-adjusting algorithm, SACOBRA, which is based on COBRA and capable of achieving high-quality results with very few function evaluations and no parameter tuning. With the help of performance profiles on a set of benchmark problems (G-problems, MOPTA08), it is shown that SACOBRA consistently outperforms any COBRA variant with a fixed parameter setting. We analyze the importance of the new elements in SACOBRA and find that each of them contributes to the overall optimization performance. We discuss the underlying reasons and thereby gain a better understanding of high-quality RBF surrogate modeling.
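
    The surrogate-assisted loop described above can be pictured as: fit one RBF model to the objective and one to each constraint from the points evaluated so far, solve the cheap surrogate problem, then spend one true function evaluation on the proposed point. The sketch below illustrates that generic loop with SciPy's RBFInterpolator and SLSQP on a toy problem; it is not the COBRA/SACOBRA implementation and omits their distance requirements, repair mechanism and adaptive parameter control:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

# Toy expensive problem (placeholders, not the G-problems): minimize f s.t. g(x) <= 0
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def g(x):
    return x[0] + x[1] - 2.0

rng = np.random.default_rng(1)
bounds = [(-3.0, 3.0), (-3.0, 3.0)]
X = rng.uniform(-3, 3, size=(10, 2))          # initial design (here simply random)
F = np.array([f(x) for x in X])
G = np.array([g(x) for x in X])

for _ in range(30):                            # budget of 30 additional true evaluations
    f_hat = RBFInterpolator(X, F, kernel="cubic")   # surrogate of the objective
    g_hat = RBFInterpolator(X, G, kernel="cubic")   # surrogate of the constraint
    # start from the best feasible point found so far (or the least infeasible one)
    x0 = X[np.argmin(np.where(G <= 0, F, np.inf))] if np.any(G <= 0) else X[np.argmin(G)]
    res = minimize(lambda x: float(f_hat(x[None])[0]), x0, method="SLSQP", bounds=bounds,
                   constraints=[{"type": "ineq",
                                 "fun": lambda x: -float(g_hat(x[None])[0])}])
    x_new = res.x
    if np.min(np.linalg.norm(X - x_new, axis=1)) < 1e-6:
        x_new = x_new + rng.normal(scale=1e-3, size=2)   # avoid duplicate RBF nodes
    X = np.vstack([X, x_new])                  # one true evaluation of f and g
    F = np.append(F, f(x_new))
    G = np.append(G, g(x_new))

best = np.argmin(np.where(G <= 0, F, np.inf))
print("best feasible point:", X[best], "objective:", F[best])
```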

    Statistical constraints on the IR galaxy number counts and cosmic IR background from the Spitzer GOODS survey

    We perform fluctuation analyses on the data from the Spitzer GOODS survey (epoch one) in the Hubble Deep Field North (HDF-N). We fit a parameterised power-law number count model of the form dN/dS = N_0 S^{-\delta} to data from each of the four Spitzer IRAC bands, using Markov Chain Monte Carlo (MCMC) sampling to explore the posterior probability distribution in each case. We obtain best-fit reduced chi-squared values of (3.43, 0.86, 1.14, 1.13) in the four IRAC bands. From this analysis we determine the likely differential faint source counts down to 10^{-8} Jy, over two orders of magnitude fainter in flux than has previously been determined. From these constrained number count models, we estimate a lower bound on the contribution to the infrared (IR) background light arising from faint galaxies. We estimate the total integrated background IR light in the Spitzer GOODS HDF-N field due to faint sources. By adding the estimates of integrated light given by Fazio et al. (2004), we calculate the total integrated background light in the four IRAC bands. We compare our 3.6 micron results with previous background estimates in similar bands and conclude that, subject to our assumptions about the noise characteristics, our analyses are able to account for the vast majority of the 3.6 micron background. Our analyses are sensitive to a number of potential systematic effects; we discuss our assumptions with regard to noise characteristics, flux calibration and flat-fielding artifacts. Comment: 10 pages; 29 figures (Figure added); correction made to flux scale of Fazio points in Figure
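
    For a fitted power law dN/dS = N_0 S^{-\delta}, the contribution of sources between S_min and S_max to the background intensity follows from integrating S dN/dS, which gives N_0 (S_max^{2-\delta} - S_min^{2-\delta}) / (2 - \delta) for \delta != 2. A short sketch of that integral with arbitrary placeholder parameters (not the paper's fitted values):

```python
import numpy as np

def integrated_background(N0, delta, s_min, s_max):
    """Integral of S * dN/dS = N0 * S^(1 - delta) from s_min to s_max.

    Returns the summed flux per unit solid angle contributed by sources
    in that flux range; units follow those chosen for N0 and S.
    """
    if np.isclose(delta, 2.0):
        return N0 * np.log(s_max / s_min)      # special case delta = 2
    expo = 2.0 - delta
    return N0 * (s_max ** expo - s_min ** expo) / expo

# Placeholder parameters: N0 in sources / sr / Jy^(delta - 1), fluxes in Jy
print(integrated_background(N0=1.0e3, delta=1.8, s_min=1e-8, s_max=1e-4))
```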

    The velocity function in the local environment from LCDM and LWDM constrained simulations

    Using constrained simulations of the local Universe for generic cold dark matter and for 1 keV warm dark matter, we investigate the difference in the abundance of dark matter halos in the local environment. We find that the mass function within 20 Mpc/h of the Local Group is ~2 times larger than the universal mass function in the 10^9-10^13 M_\odot/h mass range. Imposing the field of view of the ongoing HI blind survey ALFALFA on our simulations, we predict that the velocity function in the Virgo-direction region exceeds the universal velocity function by a factor of 3. Furthermore, employing a scheme to translate the halo velocity function into a galaxy velocity function, we compare the simulation results with a sample of galaxies from the early catalog release of ALFALFA. We find that our simulations are able to reproduce the velocity function in the 80-300 km/s velocity range, which has a value ~10 times larger than the universal velocity function in the Virgo-direction region. In the low-velocity regime, 35-80 km/s, the warm dark matter simulation reproduces the observed flattening of the velocity function. In contrast, the simulation with cold dark matter predicts a steep rise in the velocity function towards lower velocities; for V_max = 35 km/s, it forecasts ~10 times more sources than observed. If confirmed by the complete ALFALFA survey, our results indicate a potential problem for the cold dark matter paradigm or for the conventional assumptions about energetic feedback in dwarf galaxies. Comment: 24 pages, 14 figures, 1 table, accepted for publication in Ap
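
    The cumulative velocity function being compared is simply the number density of objects above a given circular velocity V_max within the surveyed volume. A minimal sketch of how it can be tabulated from a catalogue of V_max values; the mock catalogue, volume and velocity bins are illustrative assumptions, not the paper's data:

```python
import numpy as np

def velocity_function(vmax, volume_mpc3, v_bins):
    """Cumulative velocity function n(>V) in objects per Mpc^3,
    from a list of V_max values [km/s] and the survey volume."""
    vmax = np.asarray(vmax)
    return np.array([(vmax > v).sum() / volume_mpc3 for v in v_bins])

# Illustrative mock catalogue: V_max drawn from a toy power-law-like distribution
rng = np.random.default_rng(0)
mock_vmax = 35.0 * (1.0 - rng.random(5000)) ** (-1.0 / 3.0)   # km/s, toy model
v_bins = np.array([35, 50, 80, 120, 200, 300], dtype=float)
volume = 4.0 / 3.0 * np.pi * 20.0 ** 3                        # sphere of radius 20 Mpc
for v, n in zip(v_bins, velocity_function(mock_vmax, volume, v_bins)):
    print(f"n(>{v:.0f} km/s) = {n:.3e} per Mpc^3")
```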

    Improved constraints on the expansion rate of the Universe up to z~1.1 from the spectroscopic evolution of cosmic chronometers

    We present new improved constraints on the Hubble parameter H(z) in the redshift range 0.15 < z < 1.1, obtained from the differential spectroscopic evolution of early-type galaxies as a function of redshift. We extract a large sample of early-type galaxies (\sim11000) from several spectroscopic surveys, spanning almost 8 billion years of cosmic lookback time (0.15 < z < 1.42). We select the most massive, red elliptical galaxies, passively evolving and without signatures of ongoing star formation. These galaxies can be used as standard cosmic chronometers, as first proposed by Jimenez & Loeb (2002), whose differential age evolution as a function of cosmic time directly probes H(z). We analyze the 4000 {\AA} break (D4000) as a function of redshift, use stellar population synthesis models to theoretically calibrate the dependence of the differential age evolution on the differential D4000, and estimate the Hubble parameter taking into account both statistical and systematic errors. We provide 8 new measurements of H(z) (see Tab. 4), determining its evolution to a precision of 5-12% while homogeneously mapping the redshift range up to z \sim 1.1. For the first time, we place a constraint on H(z) at z \neq 0 with a precision comparable to that achieved for the Hubble constant (about 5-6% at z \sim 0.2), and cover a redshift range (0.5 < z < 0.8) that is crucial for distinguishing between many different quintessence cosmologies. These measurements are found to best match a \Lambda CDM model, clearly providing a statistically robust indication that the Universe is undergoing an accelerated expansion. This method shows the potential to open a new avenue for constraining a variety of alternative cosmologies, especially when future surveys (e.g. Euclid) open the possibility of extending it up to z \sim 2. Comment: 34 pages, 15 figures, 6 tables, published in JCAP. It is a companion to Moresco et al. (2012b, http://arxiv.org/abs/1201.6658) and Jimenez et al. (2012, http://arxiv.org/abs/1201.3608). The H(z) data can be downloaded at http://www.physics-astronomy.unibo.it/en/research/areas/astrophysics/cosmology-with-cosmic-chronometer
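
    The method rests on the differential-age relation H(z) = -[1/(1+z)] dz/dt: the age difference between passively evolving galaxy populations at two nearby redshifts gives dt, their redshift separation gives dz, and the ratio yields H at the mean redshift. A minimal sketch of that arithmetic with invented numbers, purely to show the unit conversion (they are not the paper's D4000-calibrated measurements):

```python
# Cosmic-chronometer estimator: H(z) = -1/(1+z) * dz/dt
# Illustrative numbers only; not taken from the paper.

GYR_TO_S = 3.1557e16          # seconds per gigayear
MPC_TO_KM = 3.0857e19         # kilometres per megaparsec

def hubble_from_differential_age(z1, z2, age1_gyr, age2_gyr):
    """H at the mean redshift, in km/s/Mpc, from two differential age measurements."""
    z_mean = 0.5 * (z1 + z2)
    dz = z2 - z1
    dt_s = (age2_gyr - age1_gyr) * GYR_TO_S     # higher-z galaxies are younger, so dt < 0
    h_per_s = -dz / ((1.0 + z_mean) * dt_s)     # H in 1/s
    return h_per_s * MPC_TO_KM                  # convert to km/s/Mpc

# Hypothetical example: samples at z = 0.40 and z = 0.45 with mean ages 9.0 and 8.6 Gyr
print(hubble_from_differential_age(0.40, 0.45, 9.0, 8.6))   # roughly 86 km/s/Mpc
```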