Optimization of Trading Physics Models of Markets
We describe an end-to-end real-time S&P futures trading system. Inner-shell
stochastic nonlinear dynamic models are developed, and Canonical Momenta
Indicators (CMI) are derived from a fitted Lagrangian used by outer-shell
trading models dependent on these indicators. Recursive and adaptive
optimization using Adaptive Simulated Annealing (ASA) is used for fitting
parameters shared across these shells of dynamic and trading models.
Facility layout problem: Bibliometric and benchmarking analysis
The facility layout problem concerns the location of departments in a facility area, with the aim of determining the most effective configuration. Research based on different approaches has been published over the last six decades and, to prove the effectiveness of the results obtained, several benchmark instances have been developed. This paper presents a general overview of the extant literature on facility layout problems in order to identify the main research trends and propose future research questions. First, to give the reader an overview of the literature, a bibliometric analysis is presented. Then, a clustering of the papers referring to the main instances reported in the literature was carried out in order to create a database that can be a useful tool in the benchmarking procedure for researchers approaching this kind of problem.
Best-case performance of quantum annealers on native spin-glass benchmarks: How chaos can affect success probabilities
Recent tests performed on the D-Wave Two quantum annealer have revealed no
clear evidence of speedup over conventional silicon-based technologies. Here,
we present results from classical parallel-tempering Monte Carlo simulations
combined with isoenergetic cluster moves of the archetypal benchmark problem-an
Ising spin glass-on the native chip topology. Using realistic uncorrelated
noise models for the D-Wave Two quantum annealer, we study the best-case
resilience, i.e., the probability that the ground-state configuration is not
affected by random fields and random-bond fluctuations found on the chip. We
thus compute classical upper-bound success probabilities for different types of
disorder used in the benchmarks and predict that an increase in the number of
qubits will require either error correction schemes or a drastic reduction of
the intrinsic noise found in these devices. We outline strategies to develop
robust as well as hard benchmarks for quantum annealing devices, and for any
other computing paradigm affected by noise.
Comment: 8 pages, 5 figures
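The parallel-tempering Monte Carlo baseline described above can be sketched in a few lines. This is an illustrative toy rather than the authors' code: it uses a small one-dimensional Ising chain in place of the native chip topology, omits the isoenergetic cluster moves, and every function name and parameter value is invented for the example.

```python
import math
import random

def energy(spins, J):
    """Ising chain energy E = -sum_i J_i * s_i * s_{i+1}."""
    return -sum(J[i] * spins[i] * spins[i + 1] for i in range(len(spins) - 1))

def metropolis_sweep(spins, J, beta, rng):
    """One Metropolis sweep: attempt to flip every spin once."""
    n = len(spins)
    for i in range(n):
        dE = 0.0  # energy change of flipping spin i
        if i > 0:
            dE += 2 * J[i - 1] * spins[i] * spins[i - 1]
        if i < n - 1:
            dE += 2 * J[i] * spins[i] * spins[i + 1]
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i] = -spins[i]

def parallel_tempering(J, betas, sweeps, seed=0):
    """One replica per inverse temperature; neighbouring replicas attempt
    an exchange after every sweep with probability min(1, exp(dbeta*dE))."""
    rng = random.Random(seed)
    n = len(J) + 1
    replicas = [[rng.choice([-1, 1]) for _ in range(n)] for _ in betas]
    best_e, best_spins = float("inf"), None
    for _ in range(sweeps):
        for spins, beta in zip(replicas, betas):
            metropolis_sweep(spins, J, beta, rng)
        for k in range(len(betas) - 1):
            dE = energy(replicas[k], J) - energy(replicas[k + 1], J)
            dbeta = betas[k] - betas[k + 1]
            if rng.random() < math.exp(min(0.0, dbeta * dE)):
                replicas[k], replicas[k + 1] = replicas[k + 1], replicas[k]
        e = energy(replicas[-1], J)  # track the coldest replica
        if e < best_e:
            best_e, best_spins = e, list(replicas[-1])
    return best_e, best_spins
```

On a small ferromagnetic chain the coldest replica reaches the ground state quickly; on a frustrated spin glass the replica exchanges are what keep the cold replicas from freezing into local minima.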
Applying Statistical Mechanics to Improve Computational Sampling Algorithms and Interatomic Potentials
In this dissertation, statistical mechanics is applied to improve classical simulated annealing and machine-learning-based interatomic potentials.
Classical simulated annealing is known to be among the most robust global optimization methods, and many variations of it have been developed over the last few decades. This dissertation introduces simulated annealing with adaptive cooling and demonstrates its efficiency relative to classical simulated annealing. Adaptive-cooling simulated annealing uses on-the-fly evaluation of statistical mechanical properties to adjust the cooling rate: the rate is adapted based on instantaneous evaluations of the heat capacity, with a possible future extension to the density of states. Results are presented for Lennard-Jones clusters optimized by both adaptive-cooling simulated annealing and classical simulated annealing; the adaptive cooling approach proved to be more efficient.
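The adaptive-cooling idea can be illustrated with a minimal sketch: the heat capacity is estimated on the fly from energy fluctuations, C = (<E^2> - <E>^2) / T^2, and cooling slows down when C is large (as it is near a transition). This is a hypothetical toy, not the dissertation's code; the specific cooling-rate formula and all parameter values are invented for illustration.

```python
import math
import random

def adaptive_sa(f, x0, step, t0=5.0, t_min=1e-3, samples=50, seed=0):
    """Simulated annealing whose cooling factor moves toward 1 (slow
    cooling) when the instantaneous heat-capacity estimate is large."""
    rng = random.Random(seed)
    x, e = list(x0), f(x0)
    best_x, best_e = list(x), e
    t = t0
    while t > t_min:
        energies = []
        for _ in range(samples):
            cand = [xi + rng.uniform(-step, step) for xi in x]
            ec = f(cand)
            if ec <= e or rng.random() < math.exp((e - ec) / t):
                x, e = cand, ec
                if e < best_e:
                    best_x, best_e = list(x), e
            energies.append(e)
        # heat capacity from energy fluctuations at this temperature
        mean = sum(energies) / len(energies)
        var = sum((v - mean) ** 2 for v in energies) / len(energies)
        c = var / (t * t)
        # invented schedule: alpha in [0.90, 0.99], slower when C is large
        alpha = 0.90 + 0.09 * c / (1.0 + c)
        t *= alpha
    return best_x, best_e
```

The fixed geometric schedule of classical simulated annealing corresponds to holding `alpha` constant; the only change here is that `alpha` is recomputed at each temperature stage from the sampled energies.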
Statistical mechanics was also used to improve the quality and transferability of machine-learning-based interatomic potentials. Machine learning (ML)-based interatomic potentials are currently garnering a lot of attention, as they strive to achieve the accuracy of electronic structure methods at the computational cost of empirical potentials. Given their generic functional forms, the transferability of these potentials is highly dependent on the quality of the training set, the generation of which is a highly labor-intensive activity. A good training set should contain a very diverse set of configurations while avoiding redundancies that incur cost without providing benefit. We formalize these requirements in a local-entropy-maximization framework and propose an automated scheme to sample from this objective function. We show that this approach generates much more diverse training sets than unbiased sampling and is competitive with hand-crafted training sets [1].
Stochastic optimization methods for extracting cosmological parameters from CMBR power spectra
The reconstruction of the CMBR power spectrum from a map represents a major
computational challenge to which much effort has been applied. However, once
the power spectrum has been recovered there still remains the problem of
extracting cosmological parameters from it. Doing this involves optimizing a
complicated function in a many-dimensional parameter space. Therefore, efficient
algorithms are necessary in order to make this feasible. We have tested several
different types of algorithms and found that the technique known as simulated
annealing is very effective for this purpose. It is shown that simulated
annealing is able to extract the correct cosmological parameters from a set of
simulated power spectra, but even with such fast optimization algorithms, a
substantial computational effort is needed.
Comment: 7 pages revtex, 3 figures, to appear in PR
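As a schematic of the approach, the sketch below fits two parameters of a toy power-law "spectrum" by minimizing a chi-square with textbook simulated annealing. The power-law model, the multipole range, the error bar, and all annealing parameters are invented stand-ins; a real CMBR likelihood is far more expensive to evaluate, which is the abstract's point about computational effort.

```python
import math
import random

def model(amplitude, tilt, ells):
    """Toy stand-in for a theoretical power spectrum: a power law in ell."""
    return [amplitude * (l / 100.0) ** tilt for l in ells]

def chi2(params, ells, data, sigma):
    """Chi-square misfit between the model and a 'measured' spectrum."""
    return sum(((d - m) / sigma) ** 2
               for d, m in zip(data, model(params[0], params[1], ells)))

def anneal(objective, x0, t0=10.0, cooling=0.998, steps=6000, seed=1):
    """Textbook simulated annealing over a continuous parameter vector."""
    rng = random.Random(seed)
    x, e = list(x0), objective(x0)
    best_x, best_e = list(x), e
    t = t0
    for _ in range(steps):
        cand = [xi + rng.gauss(0.0, 0.1) for xi in x]
        ec = objective(cand)
        # Metropolis criterion: always accept downhill, sometimes uphill
        if ec <= e or rng.random() < math.exp((e - ec) / t):
            x, e = cand, ec
            if e < best_e:
                best_x, best_e = list(x), e
        t *= cooling  # geometric cooling schedule
    return best_x, best_e
```

Run on noise-free synthetic data, the annealer recovers the generating parameters from a deliberately wrong starting point, which mirrors the abstract's test on simulated power spectra.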
The robust selection of predictive genes via a simple classifier
Identifying genes that direct the mechanism of a disease from expression data is extremely useful in understanding how that mechanism works.
This in turn may lead to better diagnoses and potentially can lead to a cure for that disease. This task becomes extremely challenging when the
data are characterised by only a small number of samples and a high number of dimensions, as is often the case with gene expression data.
Motivated by this challenge, we present a general framework that focuses on simplicity and data perturbation. These are the keys for the robust
identification of the most predictive features in such data. Within this framework, we propose a simple selective naïve Bayes classifier discovered using a global search technique, and combine it with data perturbation to
increase its robustness to small sample sizes.
An extensive validation of the method was carried out using two applied datasets from the field of microarrays and a simulated dataset, all
confounded by small sample sizes and high dimensionality. The method has been shown capable of identifying genes previously confirmed or associated with prostate cancer and viral infections.
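A minimal sketch of the selective-classifier-plus-perturbation idea, assuming a Gaussian naive Bayes and bootstrap resampling as the data perturbation. Greedy forward selection stands in here for the paper's global search technique, and every function name, parameter, and dataset is invented for the example; features are ranked by how often they are selected across perturbed datasets, which is the robustness mechanism the abstract describes.

```python
import math
import random
from collections import Counter

def gnb_fit(X, y, feats):
    """Per-class mean/variance for the selected features, plus class priors."""
    classes = set(y)
    stats, priors = {}, {}
    for c in classes:
        rows = [x for x, yi in zip(X, y) if yi == c]
        stats[c] = []
        for f in feats:
            vals = [r[f] for r in rows]
            mu = sum(vals) / len(vals)
            var = sum((v - mu) ** 2 for v in vals) / len(vals) + 1e-6
            stats[c].append((mu, var))
        priors[c] = len(rows) / len(y)
    return stats, priors

def gnb_predict(stats, priors, feats, x):
    """Class with the highest Gaussian naive Bayes log-posterior."""
    best_c, best_lp = None, -float("inf")
    for c, params in stats.items():
        lp = math.log(priors[c])
        for (mu, var), f in zip(params, feats):
            lp += -0.5 * math.log(2 * math.pi * var) - (x[f] - mu) ** 2 / (2 * var)
        if lp > best_lp:
            best_c, best_lp = c, lp
    return best_c

def accuracy(X, y, feats):
    stats, priors = gnb_fit(X, y, feats)
    return sum(gnb_predict(stats, priors, feats, x) == yi
               for x, yi in zip(X, y)) / len(y)

def robust_select(X, y, k, boots=20, seed=0):
    """Greedy forward selection repeated over bootstrap resamples;
    returns the k features chosen most often across perturbations."""
    rng = random.Random(seed)
    votes = Counter()
    n = len(X)
    for _ in range(boots):
        idx = [rng.randrange(n) for _ in range(n)]
        Xb, yb = [X[i] for i in idx], [y[i] for i in idx]
        chosen = []
        for _ in range(k):
            scores = [(accuracy(Xb, yb, chosen + [f]), f)
                      for f in range(len(X[0])) if f not in chosen]
            chosen.append(max(scores)[1])
        votes.update(chosen)
    return [f for f, _ in votes.most_common(k)]
```

On gene expression data the candidate pool would be thousands of features rather than the handful shown here, which is why the paper resorts to a global search rather than exhaustive or purely greedy selection.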
A hybrid CFGTSA based approach for scheduling problem: a case study of an automobile industry
In today's globally competitive world, swift, reliable, and cost-effective production under uncertain conditions, achieved through appropriate management of the available resources, has become a necessity for surviving in the market. This has inspired the development of more efficient and robust methods to counteract the complexities prevailing in the market. The present paper proposes a hybrid CFGTSA algorithm inheriting the salient features of GA, TS, SA, and chaotic theory to solve the complex scheduling problems commonly faced by manufacturing industries. The proposed CFGTSA algorithm has been tested on a scheduling problem from an automobile industry, and its efficacy is shown by comparing the results with GA, SA, TS, GTS, and hybrid TSA algorithms.