An Evolutionary Approach for Solving the Rubik’s Cube Incorporating Exact Methods
Abstract. Solutions calculated by Evolutionary Algorithms have come to surpass exact methods for solving various problems. The Rubik's Cube multiobjective optimization problem is one such area. In this work we present an evolutionary approach to solve the Rubik's Cube with a low number of moves by building upon the classic Thistlethwaite's approach. We provide a group-theoretic analysis of the subproblem complexity induced by Thistlethwaite's group transitions and design an Evolutionary Algorithm from the ground up, including a detailed derivation of our custom fitness functions. The resulting implementation is thoroughly tested for integrity and on random scrambles, revealing performance that is competitive with exact methods without the need for pre-calculated lookup tables.
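The phase-wise idea can be illustrated with a toy sketch (not the paper's implementation): in Thistlethwaite's first phase the 12 edge orientations form a bit vector, and here each hypothetical move simply flips a fixed subset of bits, ignoring the edge permutation a real move also causes. A minimal evolutionary loop with a two-part fitness (misoriented edges first, sequence length second) then searches for a short move sequence that clears all bits. The move-to-edge map and fitness weighting are invented for illustration.

```python
import random

random.seed(1)  # seeded so the sketch is reproducible

# Toy stand-in for Thistlethwaite's first phase (edge orientation): each
# hypothetical move flips a fixed subset of the 12 edge-orientation bits.
# Real moves also permute edges, which this sketch deliberately ignores.
MOVES = {
    "F": (0, 4, 8, 9),
    "B": (2, 6, 10, 11),
    "U": (0, 1, 2, 3),
    "D": (4, 5, 6, 7),
}
NAMES = list(MOVES)

def apply_seq(state, seq):
    s = list(state)
    for m in seq:
        for i in MOVES[m]:
            s[i] ^= 1
    return s

def fitness(state, seq):
    # Primary objective: misoriented edges remaining; secondary: move count.
    return (sum(apply_seq(state, seq)), len(seq))

def evolve(state, pop_size=50, gens=200, max_len=10):
    pop = [[random.choice(NAMES) for _ in range(random.randint(1, max_len))]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda s: fitness(state, s))
        if fitness(state, pop[0])[0] == 0:
            break
        survivors = pop[:pop_size // 2]
        children = []
        for parent in survivors:
            child = list(parent)
            r = random.random()
            if r < 0.4 and len(child) < max_len:   # insert a random move
                child.insert(random.randrange(len(child) + 1),
                             random.choice(NAMES))
            elif r < 0.8 and len(child) > 1:       # delete a move
                del child[random.randrange(len(child))]
            else:                                  # point mutation
                child[random.randrange(len(child))] = random.choice(NAMES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda s: fitness(state, s))

# Scramble generated by F then U from the solved state, so a 2-move fix exists.
scramble = apply_seq([0] * 12, ["F", "U"])
best = evolve(scramble)
```

The length term in the fitness mirrors the paper's goal of a low move count: among sequences that fully orient the edges, shorter ones sort first.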
Bayesian lasso binary quantile regression
In this paper, a Bayesian hierarchical model for variable selection and estimation in the context of binary quantile regression is proposed. Existing approaches to variable selection in a binary classification context are sensitive to outliers, heteroskedasticity or other anomalies of the latent response. The method proposed in this study overcomes these problems in an attractive and straightforward way. A Laplace likelihood and Laplace priors for the regression parameters are proposed and estimated with Bayesian Markov chain Monte Carlo. The resulting model is equivalent to the frequentist lasso procedure. A conceptual result is that, by doing so, the binary regression model is moved from a Gaussian to a fully Laplacian framework without sacrificing much computational efficiency. In addition, an efficient Gibbs sampler to estimate the model parameters is proposed that is superior to the Metropolis algorithm used in previous studies on Bayesian binary quantile regression. Both the simulation studies and the real data analysis indicate that the proposed method performs well in comparison to the other methods. Moreover, as the base model is binary quantile regression, the approach provides a much more detailed insight into the effects of the covariates. An implementation of the lasso procedure for binary quantile regression models is available in the R package bayesQR.
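The asymmetric Laplace working likelihood behind Bayesian quantile regression admits a standard normal-exponential mixture representation, which is what makes a Gibbs sampler feasible: conditional on the exponential latent variable, the model is Gaussian. A small sketch verifying the representation numerically (the mixture constants are the standard ones, not taken from this paper):

```python
import numpy as np

rng = np.random.default_rng(0)
tau = 0.25       # quantile of interest
n = 200_000

# Standard mixture constants for the asymmetric Laplace distribution ALD(tau):
#   eps = theta * v + psi * sqrt(v) * z,  with v ~ Exp(1), z ~ N(0, 1)
theta = (1 - 2 * tau) / (tau * (1 - tau))
psi = np.sqrt(2 / (tau * (1 - tau)))

v = rng.exponential(1.0, size=n)
z = rng.standard_normal(n)
eps = theta * v + psi * np.sqrt(v) * z

# The tau-quantile of ALD(tau) located at zero is zero, which is exactly
# why minimizing its negative log-likelihood recovers quantile regression.
print(np.quantile(eps, tau))  # close to 0
```

In a full sampler, drawing the latent v's and then the regression coefficients from their Gaussian conditional replaces the Metropolis step the abstract criticizes.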
The flare likelihood and region eruption forecasting (FLARECAST) project: flare forecasting in the big data & machine learning era
The European Union funded the FLARECAST project, which ran from January 2015 until February 2018. FLARECAST had a research-to-operations (R2O) focus and accordingly introduced several innovations into the discipline of solar flare forecasting. The FLARECAST innovations were: first, the treatment of hundreds of physical properties viewed as promising flare predictors on an equal footing, extending multiple previous works; second, the use of fourteen (14) different machine learning techniques, also on an equal footing, to explore the immense Big Data parameter space created by these many predictors; third, the establishment of a robust, three-pronged communication effort oriented toward policy makers, space-weather stakeholders and the wider public. FLARECAST pledged to make all its data, codes and infrastructure openly available worldwide. The combined use of 170+ properties (a total of 209 predictors are now available) in multiple machine-learning algorithms, some of which were designed exclusively for the project, gave rise to changing sets of best-performing predictors for the forecasting of different flaring levels, at least for major flares. At the same time, FLARECAST reaffirmed the importance of rigorous training and testing practices to avoid overly optimistic pre-operational prediction performance. In addition, the project has (a) tested new and revisited physically intuitive flare predictors and (b) provided meaningful clues toward the transition from flares to eruptive flares, namely, events associated with coronal mass ejections (CMEs). These leads, along with the FLARECAST data, algorithms and infrastructure, could help facilitate integrated space-weather forecasting efforts that take steps to avoid effort duplication.
In spite of being one of the most intensive and systematic flare forecasting efforts to date, FLARECAST has not managed to convincingly lift the barrier of stochasticity in solar flare occurrence and forecasting: solar flare prediction thus remains inherently probabilistic.
Data for: Forecasting Solar Flares using magnetogram-based predictors and Machine Learning
Replication Files for the paper "Forecasting Solar Flares using magnetogram-based predictors and Machine Learning". The source code is in R and the data files are simple text files.
Abstract of the associated paper:
We propose a forecasting approach for solar flares based on data from Solar Cycle 24, taken by the Helioseismic and Magnetic Imager (HMI) on board the Solar Dynamics Observatory (SDO) mission. In particular, we use the Space-weather HMI Active Region Patches (SHARP) product, which provides cut-out magnetograms of solar active regions (ARs) in near-realtime (NRT), taken over a five-year interval (2012-2016). Our approach utilizes a set of thirteen predictors, which are not included in the SHARP data, extracted from line-of-sight and vector photospheric magnetograms. We exploit several Machine Learning (ML) and conventional statistics techniques to predict flares of class >M1 and >C1, with a 24 h forecast window. The ML methods used are Multi-Layer Perceptrons (MLP), Support Vector Machines (SVM) and Random Forests (RF). We conclude that Random Forests could be the prediction technique of choice for our sample, with the second-best method being Multi-Layer Perceptrons, subject to an entropy objective function. A Monte Carlo simulation showed that the best-performing method gives accuracy ACC = 0.93(0.00), true skill statistic TSS = 0.74(0.02) and Heidke skill score HSS = 0.49(0.01) for >M1 class flare prediction with a probability threshold of 15%, and ACC = 0.84(0.00), TSS = 0.60(0.01) and HSS = 0.59(0.01) for >C1 class flare prediction with a probability threshold of 35%.
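The skill scores quoted above have standard definitions in terms of the 2x2 contingency table of binary forecasts (after thresholding the predicted probability) against observed flares. A small sketch with illustrative counts, not values from the paper:

```python
def skill_scores(tp, fn, fp, tn):
    """ACC, TSS and HSS from a 2x2 forecast contingency table."""
    total = tp + fn + fp + tn
    acc = (tp + tn) / total
    # True Skill Statistic: hit rate minus false-alarm rate.
    tss = tp / (tp + fn) - fp / (fp + tn)
    # Heidke Skill Score: accuracy improvement over random chance.
    hss = (2 * (tp * tn - fn * fp)
           / ((tp + fn) * (fn + tn) + (tp + fp) * (fp + tn)))
    return acc, tss, hss

# Illustrative counts: 30 hits, 10 misses, 20 false alarms, 140 correct rejections.
acc, tss, hss = skill_scores(30, 10, 20, 140)  # -> 0.85, 0.625, ~0.571
```

Unlike plain accuracy, TSS is insensitive to the strong class imbalance typical of flare catalogs, which is why both scores are reported alongside ACC.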
Solving the bi-objective multi-dimensional knapsack problem exploiting the concept of core
This paper deals with the bi-objective multi-dimensional knapsack problem. We propose an adaptation of the core concept that is effectively used in single-objective multi-dimensional knapsack problems. The main idea of the core concept is based on the “divide and conquer” principle: instead of solving one problem with n variables, we solve several sub-problems with a fraction of n variables (the core variables). The quality of the obtained solution can be adjusted according to the size of the core, and there is always a trade-off between solution time and solution quality. In this study we define the core problem for the multi-objective multi-dimensional knapsack problem. After defining the core, we solve the bi-objective integer programming problem that comprises only the core variables, using the Multicriteria Branch and Bound algorithm, which can generate the complete Pareto set for small and medium-size multi-objective integer programming problems. A small example is used to illustrate the method, while computational and economic issues are also discussed. Computational experiments are also presented, using available or appropriately modified benchmarks, in order to examine the quality of the Pareto set approximation with respect to solution time. Extensions to the general multi-objective case as well as to the computation of the exact solution are also mentioned.
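The core idea can be sketched on a tiny bi-objective instance with a single capacity constraint (the paper treats the multi-dimensional case and an exact Branch and Bound): rank items by an efficiency ratio, freeze the clearly attractive items to 1, enumerate only the uncertain core variables, and keep the nondominated outcomes. All data and the ranking rule below are invented for illustration.

```python
from itertools import product

# Tiny bi-objective knapsack: maximize two profit vectors under one
# weight constraint.  Data is invented for illustration.
p1 = [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]     # first objective
p2 = [1, 3, 2, 8, 5, 9, 4, 10, 6, 7]     # second objective
w = [4, 5, 3, 6, 4, 5, 3, 6, 4, 5]
CAP = 22

n = len(w)
# Efficiency by combined profit per unit weight (one of several plausible
# ranking rules); the top items are fixed in, the middle ones form the core.
order = sorted(range(n), key=lambda i: -(p1[i] + p2[i]) / w[i])
fixed_in = order[:2]          # greedily fixed to 1
core = order[2:8]             # 6 "uncertain" core variables; rest fixed to 0

solutions = []
for bits in product([0, 1], repeat=len(core)):
    chosen = list(fixed_in) + [i for i, b in zip(core, bits) if b]
    if sum(w[i] for i in chosen) <= CAP:
        solutions.append((sum(p1[i] for i in chosen),
                          sum(p2[i] for i in chosen)))

def nondominated(points):
    # Keep points not dominated (>= in both objectives, different point).
    return sorted({q for q in points
                   if not any(a >= q[0] and b >= q[1] and (a, b) != q
                              for a, b in points)})

pareto = nondominated(solutions)
```

Enumerating 2^6 core assignments instead of 2^10 full assignments is the "divide and conquer" payoff; the paper replaces this brute-force step with Multicriteria Branch and Bound on the core problem.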
Energy planning of a hospital using Mathematical Programming and Monte Carlo simulation for dealing with uncertainty in the economic parameters
For more than 40 years, Mathematical Programming has been the traditional tool for energy planning at the national or regional level, aiming at cost minimization subject to specific technological, political and demand-satisfaction constraints. The liberalization of the energy market, along with ongoing technical progress, has increased the level of competition and forced energy consumers, even at the unit level, to make their choices among a large number of alternative or complementary energy technologies, fuels and/or suppliers. In the present work we develop a modelling framework for energy planning in units of the tertiary sector, giving special emphasis to model reduction and to the uncertainty of the economic parameters. In the case study, the energy rehabilitation of a hospital in Athens is examined, and the installation of a cogeneration, absorption and compression unit is considered for supplying the electricity, heating and cooling loads. The basic innovation of the energy model lies in the uncertainty modelling through the combined use of Mathematical Programming (namely, Mixed Integer Linear Programming, MILP) and Monte Carlo simulation, which permits risk management for the most volatile parameters of the objective function, such as the fuel costs and the interest rate. The results come in the form of probability distributions that provide fruitful information to the decision maker. The effect of model reduction through appropriate compression of the load data is also addressed.
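The combination of optimization and Monte Carlo simulation can be sketched as follows: draw the volatile parameters (fuel price, interest rate) from assumed distributions, solve the deterministic planning problem for each draw, and report the resulting cost distribution. In this sketch a brute-force choice between two hypothetical supply options stands in for the MILP; all technology data and distributions are invented.

```python
import random
import statistics

random.seed(42)  # seeded so the sketch is reproducible

def annuity_factor(rate, years):
    # Converts an investment into equal annual payments at the given rate.
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def annual_cost(invest, fuel_use, fuel_price, rate, years=15):
    return invest * annuity_factor(rate, years) + fuel_use * fuel_price

# Two hypothetical supply options: a CHP (cogeneration) unit vs. grid + boiler.
OPTIONS = {
    "CHP": {"invest": 2_000_000, "fuel_use": 9_000},        # MWh fuel / year
    "grid+boiler": {"invest": 300_000, "fuel_use": 14_000},
}

results = []
for _ in range(5_000):
    fuel_price = random.gauss(60, 12)      # EUR/MWh, assumed distribution
    rate = random.uniform(0.03, 0.09)      # interest rate, assumed range
    # A real model would re-solve the MILP here; with only two discrete
    # options, picking the cheaper one is equivalent.
    costs = {name: annual_cost(o["invest"], o["fuel_use"], fuel_price, rate)
             for name, o in OPTIONS.items()}
    best = min(costs, key=costs.get)
    results.append((best, costs[best]))

chp_share = sum(1 for name, _ in results if name == "CHP") / len(results)
mean_cost = statistics.mean(c for _, c in results)
```

The share of draws in which each technology wins, and the spread of the cost distribution, are exactly the kind of risk information the abstract says the decision maker receives.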