
    An Evolutionary Approach for Solving the Rubik’s Cube Incorporating Exact Methods

    Solutions calculated by Evolutionary Algorithms have come to surpass exact methods for various problems, and the Rubik’s Cube multiobjective optimization problem is one such area. In this work we present an evolutionary approach that solves the Rubik’s Cube in a low number of moves by building upon the classic Thistlethwaite approach. We provide a group-theoretic analysis of the subproblem complexity induced by Thistlethwaite’s group transitions and design an Evolutionary Algorithm from the ground up, including a detailed derivation of our custom fitness functions. The resulting implementation is thoroughly tested for integrity and on random scrambles, revealing performance that is competitive with exact methods without the need for pre-calculated lookup tables.
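
    As a rough illustration of how such an approach can be organised, the sketch below shows a generic evolutionary loop whose fitness rewards progress through Thistlethwaite's nested subgroups and penalises sequence length. The cube model is deliberately stubbed out (phase_reached always returns 0 here), and the move set, operators and scoring weights are illustrative assumptions, not the fitness functions derived in the paper.

```python
# Sketch of an evolutionary loop whose fitness rewards progress through
# Thistlethwaite's nested subgroups G0 > G1 > G2 > G3 > {identity}.
# The cube model itself is abstracted away: `phase_reached` is a placeholder
# that a real implementation would replace with edge-orientation,
# corner-orientation and permutation checks on the cube state.
import random

MOVES = ["U", "U'", "U2", "D", "D'", "D2", "F", "F'", "F2",
         "B", "B'", "B2", "L", "L'", "L2", "R", "R'", "R2"]

def phase_reached(scramble, solution):
    """Placeholder: return how many Thistlethwaite phases (0-4) the sequence
    `scramble + solution` completes.  A real version would test membership of
    the resulting cube state in G1, G2, G3 and the solved state."""
    return 0  # stub so the skeleton runs end to end

def fitness(scramble, solution):
    # Reward completed phases heavily, penalise long move sequences.
    return 100 * phase_reached(scramble, solution) - len(solution)

def mutate(solution):
    sol = list(solution)
    if sol and random.random() < 0.5:
        sol[random.randrange(len(sol))] = random.choice(MOVES)  # point mutation
    else:
        sol.append(random.choice(MOVES))                         # extend sequence
    return sol

def evolve(scramble, pop_size=50, generations=200):
    population = [[random.choice(MOVES)] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda s: fitness(scramble, s), reverse=True)
        elite = population[: pop_size // 5]                      # truncation selection
        population = elite + [mutate(random.choice(elite))
                              for _ in range(pop_size - len(elite))]
    return max(population, key=lambda s: fitness(scramble, s))

if __name__ == "__main__":
    print(evolve(["R", "U", "F'"]))
```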

    Bayesian lasso binary quantile regression

    In this paper, a Bayesian hierarchical model for variable selection and estimation in the context of binary quantile regression is proposed. Existing approaches to variable selection in a binary classification context are sensitive to outliers, heteroskedasticity or other anomalies of the latent response. The method proposed in this study overcomes these problems in an attractive and straightforward way. A Laplace likelihood and Laplace priors for the regression parameters are proposed and estimated with Bayesian Markov chain Monte Carlo, and the resulting model is equivalent to the frequentist lasso procedure. A conceptual result is that, by doing so, the binary regression model is moved from a Gaussian to a full Laplacian framework without sacrificing much computational efficiency. In addition, an efficient Gibbs sampler to estimate the model parameters is proposed that is superior to the Metropolis algorithm used in previous studies on Bayesian binary quantile regression. Both the simulation studies and the real data analysis indicate that the proposed method performs well in comparison to other methods. Moreover, as the base model is binary quantile regression, the approach provides a much more detailed insight into the effects of the covariates. An implementation of the lasso procedure for binary quantile regression models is available in the R package bayesQR.
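
    The connection between the Laplace likelihood/prior formulation and the frequentist lasso can be sketched through the penalised objective it implies: a sum of asymmetric Laplace "check" losses plus an L1 penalty on the coefficients. The snippet below is a minimal illustration of that objective on a simulated latent response; it is not the paper's Gibbs sampler, and the variable names and the penalty lam are assumptions made for the example.

```python
import numpy as np

def check_loss(u, tau):
    """Asymmetric Laplace 'check' loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def penalized_objective(beta, X, z, tau, lam):
    """Negative log-posterior (up to constants) under an asymmetric Laplace
    likelihood for the latent response z and independent Laplace priors on
    beta: sum of check losses plus an L1 (lasso) penalty."""
    return np.sum(check_loss(z - X @ beta, tau)) + lam * np.sum(np.abs(beta))

# Toy usage on simulated data.  In binary quantile regression z is latent and
# only its sign (the 0/1 outcome) is observed; a Gibbs sampler alternates
# between imputing z and drawing beta, which this sketch does not implement.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
z = X @ np.array([1.0, 0.0, -0.5]) + rng.normal(size=100)
print(penalized_objective(np.zeros(3), X, z, tau=0.5, lam=1.0))
```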

    Exact computation of max weighted score estimators

    International audience

    Data for: Forecasting Solar Flares using magnetogram-based predictors and Machine Learning

    Replication Files for the paper "Forecasting Solar Flares using magnetogram-based predictors and Machine Learning". The source code is in R and the data files are simple text files. Abstract of the associated paper: We propose a forecasting approach for solar flares based on data from Solar Cycle 24, taken by the Helioseismic and Magnetic Imager (HMI) on board the Solar Dynamics Observatory (SDO) mission. In particular, we use the Space-weather HMI Active Region Patches (SHARP) product, which provides cut-out magnetograms of solar active regions (ARs) in near-realtime (NRT), taken over a five-year interval (2012 - 2016). Our approach utilizes a set of thirteen predictors, not included in the SHARP data, extracted from line-of-sight and vector photospheric magnetograms. We exploit several Machine Learning (ML) and Conventional Statistics techniques to predict flares of class >M1 and >C1 within a 24 h forecast window. The ML methods used are Multi-Layer Perceptrons (MLP), Support Vector Machines (SVM) and Random Forests (RF). We conclude that Random Forests could be the prediction technique of choice for our sample, with the second-best method being Multi-Layer Perceptrons subject to an entropy objective function. A Monte Carlo simulation showed that the best-performing method gives accuracy ACC=0.93(0.00), true skill statistic TSS=0.74(0.02) and Heidke skill score HSS=0.49(0.01) for >M1-class flare prediction with a probability threshold of 15%, and ACC=0.84(0.00), TSS=0.60(0.01) and HSS=0.59(0.01) for >C1-class flare prediction with a probability threshold of 35%.
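
    A minimal sketch of the evaluation logic described above, assuming synthetic stand-in data rather than the SHARP-derived predictors: a Random Forest is trained, class probabilities are thresholded (15% as in the >M1 case), and TSS and HSS are computed from the resulting contingency table. Feature construction and the actual flare catalogues are omitted.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def skill_scores(y_true, y_pred):
    """True Skill Statistic and Heidke Skill Score from a 2x2 contingency table."""
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    tss = tp / (tp + fn) - fp / (fp + tn)
    hss = 2 * (tp * tn - fn * fp) / ((tp + fn) * (fn + tn) + (tp + fp) * (fp + tn))
    return tss, hss

# Synthetic stand-in for the matrix of thirteen magnetogram-based predictors.
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 13))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 1.5).astype(int)  # rare "flare" class
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

clf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]
y_hat = (proba >= 0.15).astype(int)   # probability threshold, as in the >M1 case
print(skill_scores(y_te, y_hat))
```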

    Solving the bi-objective multi-dimensional knapsack problem exploiting the concept of core

    This paper deals with the bi-objective multi-dimensional knapsack problem. We propose an adaptation of the core concept that is effectively used in single-objective multi-dimensional knapsack problems. The main idea of the core concept is based on the “divide and conquer” principle: instead of solving one problem with n variables we solve several sub-problems with a fraction of the n variables (the core variables). The quality of the obtained solution can be adjusted according to the size of the core, and there is always a trade-off between solution time and solution quality. In this study we define the core problem for the multi-objective multi-dimensional knapsack problem. After defining the core, we solve the bi-objective integer programming problem that comprises only the core variables using the Multicriteria Branch and Bound algorithm, which can generate the complete Pareto set for small and medium-size multi-objective integer programming problems. A small example is used to illustrate the method, while computational and economic issues are also discussed. Computational experiments are also presented, using available or appropriately modified benchmarks, in order to examine the quality of the Pareto set approximation with respect to solution time. Extensions to the general multi-objective case, as well as to the computation of the exact solution, are also mentioned. A simplified sketch of the core principle is given below.
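
    The sketch illustrates the core idea on a deliberately simplified case: a single-constraint knapsack with the two objectives collapsed into one weighted sum. Items are ranked by efficiency, the clearly attractive items outside the core are fixed to 1, the clearly unattractive ones to 0, and only a small window of core items around the break item is enumerated exactly. The paper itself handles multiple constraints and generates the full Pareto set with a Multicriteria Branch and Bound, which this snippet does not attempt; all parameter values are invented for the example.

```python
from itertools import combinations
import numpy as np

def core_subproblem(v1, v2, w, capacity, core_size=8, alpha=0.5):
    """Toy 'core' heuristic for a bi-objective knapsack, scalarised with weight
    alpha: rank items by efficiency, fix the most efficient non-core items to 1
    (greedily, while they fit), fix the rest to 0, and enumerate only the
    `core_size` items around the break item exactly."""
    eff = (alpha * v1 + (1 - alpha) * v2) / w
    order = np.argsort(-eff)                      # most efficient first

    used, break_pos = 0.0, len(order)             # locate the greedy break item
    for pos, i in enumerate(order):
        if used + w[i] > capacity:
            break_pos = pos
            break
        used += w[i]

    lo = max(0, break_pos - core_size // 2)
    hi = min(len(order), break_pos + core_size // 2)
    core = list(order[lo:hi])
    fixed_in = list(order[:lo])                   # efficient items kept in the knapsack
    base_w = sum(w[i] for i in fixed_in)

    best_set, best_score = fixed_in, -np.inf
    for r in range(len(core) + 1):                # exact enumeration on the core only
        for sub in combinations(core, r):
            if base_w + sum(w[i] for i in sub) <= capacity:
                sel = fixed_in + list(sub)
                score = (alpha * sum(v1[i] for i in sel)
                         + (1 - alpha) * sum(v2[i] for i in sel))
                if score > best_score:
                    best_set, best_score = sel, score
    return sorted(best_set), best_score

rng = np.random.default_rng(2)
v1, v2, w = rng.integers(1, 100, 30), rng.integers(1, 100, 30), rng.integers(1, 50, 30)
print(core_subproblem(v1, v2, w, capacity=300))
```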

    Energy planning of a hospital using Mathematical Programming and Monte Carlo simulation for dealing with uncertainty in the economic parameters

    For more than 40 years, Mathematical Programming has been the traditional tool for energy planning at the national or regional level, aiming at cost minimization subject to specific technological, political and demand-satisfaction constraints. The liberalization of the energy market, along with ongoing technical progress, has increased the level of competition and forced energy consumers, even at the unit level, to choose among a large number of alternative or complementary energy technologies, fuels and/or suppliers. In the present work we develop a modelling framework for energy planning in units of the tertiary sector, with special emphasis on model reduction and on the uncertainty of the economic parameters. In the case study, the energy rehabilitation of a hospital in Athens is examined, and the installation of cogeneration, absorption and compression units is considered for supplying the electricity, heating and cooling loads. The main innovation of the energy model lies in the treatment of uncertainty through the combined use of Mathematical Programming (namely Mixed Integer Linear Programming, MILP) and Monte Carlo simulation, which permits risk management for the most volatile parameters of the objective function, such as the fuel costs and the interest rate. The results come in the form of probability distributions that provide fruitful information to the decision maker. The effect of model reduction through appropriate compression of the load data is also addressed.
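
    A minimal sketch of the Monte Carlo layer, assuming a drastically simplified closed-form cost model in place of the MILP: fuel price, grid electricity price and interest rate are sampled, the annualised cost of a hypothetical cogeneration (CHP) option is compared with a grid-plus-boiler reference, and the distribution of savings is reported. All loads, efficiencies, prices and the capital cost are invented for the example and are not the case-study data.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000                                    # Monte Carlo draws

# Hypothetical load and technology data (illustrative only).
elec_load, heat_load = 4_000.0, 6_000.0       # MWh/year
chp_capex, life = 2.0e6, 15                   # EUR, years
chp_el_eff, chp_th_eff, boiler_eff = 0.35, 0.45, 0.90

# Uncertain economic parameters sampled from assumed distributions.
gas_price  = rng.normal(40.0, 8.0,  N)        # EUR/MWh fuel
grid_price = rng.normal(110.0, 15.0, N)       # EUR/MWh electricity
rate       = rng.uniform(0.03, 0.08, N)       # interest rate

# Capital recovery factor turns the investment into an equivalent annual cost.
crf = rate * (1 + rate) ** life / ((1 + rate) ** life - 1)

# Option A: grid electricity plus gas boiler (no investment).
cost_ref = grid_price * elec_load + gas_price * heat_load / boiler_eff

# Option B: CHP sized to the electric load; the boiler covers remaining heat.
fuel_chp = elec_load / chp_el_eff
heat_chp = min(heat_load, fuel_chp * chp_th_eff)
cost_chp = (crf * chp_capex
            + gas_price * fuel_chp
            + gas_price * (heat_load - heat_chp) / boiler_eff)

savings = cost_ref - cost_chp                 # distribution over all draws
print(f"P(CHP cheaper) = {np.mean(savings > 0):.2f}")
print(f"mean / 5th-percentile annual savings: "
      f"{savings.mean():,.0f} / {np.percentile(savings, 5):,.0f} EUR")
```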