
    Tuning Kinetic Magnetism of Strongly Correlated Electrons via Staggered Flux

    We explore the kinetic magnetism of the infinite-$U$ repulsive Hubbard models at low hole densities on various lattices with nearest-neighbor hopping integrals modulated by a staggered magnetic flux $\pm\phi$. Tuning $\phi$ from 0 to $\pi$ makes the ground state (GS) change from a Nagaoka-type ferromagnetic state to a Haerter-Shastry-type antiferromagnetic state at a critical $\phi_c$, with both states being of kinetic origin. The intra-plaquette spin correlation, as well as the GS energy, signals this quantum criticality. This tunable kinetic magnetism is generic and appears in chains, ladders, and two-dimensional lattices with squares or triangles as elementary constituents. Comment: 4 pages, 5 figures, 1 table
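
    For orientation only (a generic form, not copied from the paper), the infinite-$U$ limit is usually written as a Gutzwiller-projected hopping Hamiltonian, with the staggered flux entering through Peierls phases $A_{ij}$ on the bonds:

        H = -t \sum_{\langle ij\rangle,\sigma} \left( e^{i A_{ij}}\, \mathcal{P}\, c^{\dagger}_{i\sigma} c_{j\sigma}\, \mathcal{P} + \mathrm{h.c.} \right),
        \qquad \sum_{\langle ij\rangle \in \text{plaquette}} A_{ij} = \pm\phi ,

    where $\mathcal{P}$ projects out doubly occupied sites and the sign of the flux alternates between neighbouring plaquettes. Because the projection removes the interaction energy scale entirely, the hopping term alone selects the magnetic order, which is why both the Nagaoka-type and Haerter-Shastry-type states are described as kinetic in origin.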

    A new approach to the study of the ground-state properties of 2D Ising spin glass

    A new approach known as the flat histogram method is used to study the $\pm J$ Ising spin glass in two dimensions. The temperature dependence of the energy, the entropy, and other physical quantities can be calculated easily, and we give results in the zero-temperature limit. For the ground-state energy and entropy in the infinite-size limit, we estimate $e_0 = -1.4007 \pm 0.0085$ and $s_0 = 0.0709 \pm 0.006$, respectively. Both agree well with previous calculations. The time to find the ground states, as well as the tunneling times of the algorithm, are also reported and compared with other methods. Comment: 11 pages, 4 figures
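
    The abstract does not spell out the algorithm, but a flat-histogram calculation in the Wang-Landau style estimates the density of states $g(E)$ directly, from which the energy and entropy follow at any temperature. The sketch below is a minimal illustration for a small 2D $\pm J$ sample, not the authors' implementation; the lattice size, random bond sample, single-spin-flip moves, flatness criterion, and stopping tolerance are all illustrative choices.

import numpy as np

# Minimal Wang-Landau-style flat-histogram sketch for a 2D +/-J Ising spin glass.
# Illustrative only: small lattice, periodic boundaries, single-spin-flip moves.

rng = np.random.default_rng(0)
L = 8
Jx = rng.choice([-1, 1], size=(L, L))   # bond to the right neighbour
Jy = rng.choice([-1, 1], size=(L, L))   # bond to the lower neighbour

def energy(s):
    """Total energy E = -sum_<ij> J_ij s_i s_j with periodic boundaries."""
    return -(np.sum(Jx * s * np.roll(s, -1, axis=1)) +
             np.sum(Jy * s * np.roll(s, -1, axis=0)))

def delta_e(s, i, j):
    """Energy change from flipping spin (i, j)."""
    nb = (Jx[i, j] * s[i, (j + 1) % L] + Jx[i, (j - 1) % L] * s[i, (j - 1) % L] +
          Jy[i, j] * s[(i + 1) % L, j] + Jy[(i - 1) % L, j] * s[(i - 1) % L, j])
    return 2 * s[i, j] * nb

s = rng.choice([-1, 1], size=(L, L))
E = energy(s)

# Energies lie on even integers in [-2*L*L, 2*L*L]; index levels directly.
n_levels = 2 * L * L + 1
log_g = np.zeros(n_levels)               # running estimate of ln g(E)
hist = np.zeros(n_levels)
idx = lambda e: (e + 2 * L * L) // 2

f = 1.0                                  # ln of the modification factor
while f > 1e-3:                          # production runs use far smaller tolerances
    for _ in range(2000 * L * L):
        i, j = rng.integers(L), rng.integers(L)
        dE = delta_e(s, i, j)
        # Accept with probability min(1, g(E_old)/g(E_new)) to flatten the histogram.
        if np.log(rng.random()) < log_g[idx(E)] - log_g[idx(E + dE)]:
            s[i, j] *= -1
            E += dE
        log_g[idx(E)] += f
        hist[idx(E)] += 1
    visited = hist > 0
    if hist[visited].min() > 0.8 * hist[visited].mean():   # crude flatness check
        f /= 2.0
        hist[:] = 0

# Ground-state energy per spin from the lowest visited level; the zero-temperature
# entropy per spin would follow from ln g at that level after normalising
# sum_E g(E) = 2**(L*L).
e_min = np.nonzero(log_g > 0)[0].min() * 2 - 2 * L * L
print("estimated ground-state energy per spin:", e_min / (L * L))

    For a single disorder sample this gives a rough estimate; results such as the quoted $e_0$ and $s_0$ require averaging over many bond configurations and an extrapolation in system size.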

    Theoretical Analysis of Bayesian Optimisation with Unknown Gaussian Process Hyper-Parameters

    Bayesian optimisation has gained great popularity as a tool for optimising the parameters of machine learning algorithms and models. Somewhat ironically, setting up the hyper-parameters of Bayesian optimisation methods is notoriously hard. While reasonable practical solutions have been advanced, they can often fail to find the best optima. Surprisingly, there is little theoretical analysis of this crucial problem in the literature. To address this, we derive a cumulative regret bound for Bayesian optimisation with Gaussian processes and unknown kernel hyper-parameters in the stochastic setting. The bound, which applies to the expected improvement acquisition function and sub-Gaussian observation noise, provides us with guidelines on how to design hyper-parameter estimation methods. A simple simulation demonstrates the importance of following these guidelines. Comment: 16 pages, 1 figure
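
    The setting described here, expected improvement with Gaussian-process kernel hyper-parameters re-estimated from data, can be sketched in a few lines of scikit-learn. The toy objective, the Matern-plus-white-noise kernel, the candidate grid, and the noise level below are placeholders, and the per-iteration maximum-likelihood fitting is the common practical default rather than the estimation scheme the paper's guidelines recommend.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, Matern, WhiteKernel

# Toy 1-D objective (placeholder), observed with Gaussian (hence sub-Gaussian) noise.
def f(x):
    return -np.sin(3 * x) - x**2 + 0.7 * x

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 2.0, size=(3, 1))           # initial design
y = f(X).ravel() + 0.05 * rng.standard_normal(3)

candidates = np.linspace(-1.0, 2.0, 500).reshape(-1, 1)

for it in range(15):
    # Signal variance, length-scale, and noise level are unknown and are
    # re-estimated by maximum marginal likelihood at every iteration.
    kernel = ConstantKernel(1.0) * Matern(nu=2.5) + WhiteKernel(1e-2)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True,
                                  n_restarts_optimizer=5)
    gp.fit(X, y)

    mu, sigma = gp.predict(candidates, return_std=True)
    best = y.max()
    # Expected improvement acquisition (maximisation convention).
    z = (mu - best) / np.maximum(sigma, 1e-12)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

    x_next = candidates[np.argmax(ei)].reshape(1, 1)
    y_next = f(x_next).ravel() + 0.05 * rng.standard_normal(1)
    X = np.vstack([X, x_next])
    y = np.concatenate([y, y_next])

print("best observed value:", y.max(), "at x =", X[np.argmax(y)])

    Naively re-fitting the hyper-parameters in this way is exactly the step whose effect on the cumulative regret the paper analyses.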