
    Entanglement, quantum phase transitions, and density matrix renormalization

    We investigate the role of entanglement in quantum phase transitions, and show that the success of the density matrix renormalization group (DMRG) in understanding such phase transitions is due to the way it preserves entanglement under renormalization. We provide a reinterpretation of the DMRG in terms of the language and tools of quantum information science, which allows us to rederive the DMRG in a physically transparent way. Motivated by our reinterpretation, we suggest a modification of the DMRG which manifestly takes account of the entanglement in a quantum system. This modified renormalization scheme is shown, in certain special cases, to preserve more entanglement in a quantum system than traditional numerical renormalization methods. Comment: 5 pages, 1 eps figure, revtex4; added reference and qualifying remark
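    The quantity the abstract centers on, entanglement between two halves of a system, is the von Neumann entropy of a reduced density matrix. A minimal sketch for the two-qubit case (the function name and the restriction to real amplitudes are illustrative choices, not from the paper):

```python
import math

def entanglement_entropy(state):
    """Von Neumann entropy of qubit A for a two-qubit pure state.

    `state` is [c00, c01, c10, c11], the real amplitudes of |00>, |01>,
    |10>, |11>. Viewing the state as a 2x2 matrix psi[a][b] = c_{ab},
    the reduced density matrix of qubit A is rho_A = psi psi^T.
    """
    psi = [[state[0], state[1]], [state[2], state[3]]]
    rho = [[sum(psi[i][k] * psi[j][k] for k in range(2)) for j in range(2)]
           for i in range(2)]
    # Eigenvalues of a symmetric 2x2 matrix via the quadratic formula.
    tr = rho[0][0] + rho[1][1]
    det = rho[0][0] * rho[1][1] - rho[0][1] * rho[1][0]
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    eigs = [(tr + disc) / 2, (tr - disc) / 2]
    return -sum(p * math.log(p) for p in eigs if p > 1e-12)

# A Bell state is maximally entangled: entropy = ln 2.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
# A product state |00> carries no entanglement: entropy = 0.
product = [1.0, 0.0, 0.0, 0.0]
```

    A DMRG-style truncation keeps the largest eigenvalues of such a reduced density matrix, which is why it preserves entanglement well.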

    Preconditioning Kernel Matrices

    The computational and storage complexity of kernel machines presents the primary barrier to their scaling to large, modern datasets. A common way to tackle the scalability issue is to use the conjugate gradient algorithm, which relieves the constraints on both storage (the kernel matrix need not be stored) and computation (both stochastic gradients and parallelization can be used). Even so, conjugate gradient is not without its own issues: the conditioning of kernel matrices is often such that conjugate gradients will have poor convergence in practice. Preconditioning is a common approach to alleviating this issue. Here we propose preconditioned conjugate gradients for kernel machines, and develop a broad range of preconditioners particularly useful for kernel matrices. We describe a scalable approach to both solving kernel machines and learning their hyperparameters. We show this approach is exact in the limit of iterations and outperforms state-of-the-art approximations for a given computational budget.
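    The core routine the abstract builds on can be sketched with a simple Jacobi (diagonal) preconditioner; the paper develops far richer preconditioners, so this is only the skeleton, with an RBF kernel and the jitter value as illustrative assumptions:

```python
import math

def rbf_kernel_matrix(xs, lengthscale=1.0, jitter=1e-6):
    """Squared-exponential kernel matrix with a small diagonal jitter."""
    n = len(xs)
    return [[math.exp(-((xs[i] - xs[j]) ** 2) / (2 * lengthscale ** 2))
             + (jitter if i == j else 0.0) for j in range(n)] for i in range(n)]

def pcg(A, b, precond, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradients for A x = b, A symmetric PD.

    `precond(r)` applies M^{-1} to the residual; for the Jacobi
    preconditioner that is elementwise division by diag(A)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual b - A x, with x = 0
    z = precond(r)
    p = z[:]
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if math.sqrt(sum(ri * ri for ri in r)) < tol:
            break
        z = precond(r)
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

xs = [0.0, 0.5, 1.0, 2.0]
K = rbf_kernel_matrix(xs)
y = [1.0, 0.5, -0.2, 0.3]
diag = [K[i][i] for i in range(len(xs))]
alpha = pcg(K, y, lambda r: [ri / d for ri, d in zip(r, diag)])
```

    In the kernel-machine setting, solving K alpha = y this way needs only matrix-vector products with K, which is what makes stochastic and parallel variants possible.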

    Efficient Bayesian Nonparametric Modelling of Structured Point Processes

    This paper presents a Bayesian generative model for dependent Cox point processes, alongside an efficient inference scheme which scales as if the point processes were modelled independently. We can handle missing data naturally, infer latent structure, and cope with large numbers of observed processes. A further novel contribution enables the model to work effectively in higher dimensional spaces. Using this method, we achieve vastly improved predictive performance on both 2D and 1D real data, validating our structured approach. Comment: Presented at UAI 2014. Bibtex: @inproceedings{structcoxpp14_UAI, Author = {Tom Gunter and Chris Lloyd and Michael A. Osborne and Stephen J. Roberts}, Title = {Efficient Bayesian Nonparametric Modelling of Structured Point Processes}, Booktitle = {Uncertainty in Artificial Intelligence (UAI)}, Year = {2014}}
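    A Cox process is a Poisson process whose intensity function is itself random. The generative half of that picture, sampling points given an intensity, can be done by Lewis-Shedler thinning; the fixed sinusoidal intensity below is an illustrative stand-in for the (e.g. log-Gaussian) random intensity a Cox model would draw:

```python
import math
import random

def sample_inhomogeneous_poisson(intensity, t_max, lam_max, rng):
    """Lewis-Shedler thinning on [0, t_max].

    Sample a dominating homogeneous process of rate `lam_max`, then keep
    each candidate point t with probability intensity(t) / lam_max.
    Requires intensity(t) <= lam_max everywhere."""
    points, t = [], 0.0
    while True:
        t += rng.expovariate(lam_max)   # next candidate arrival
        if t > t_max:
            break
        if rng.random() < intensity(t) / lam_max:
            points.append(t)
    return points

rng = random.Random(0)
# Illustrative deterministic intensity, bounded above by 5.0.
intensity = lambda t: 5.0 * (1.0 + math.sin(t)) / 2.0
events = sample_inhomogeneous_poisson(intensity, t_max=10.0, lam_max=5.0, rng=rng)
```

    The paper's contribution sits on top of this picture: sharing latent structure across many dependent intensities while keeping inference as cheap as treating each process alone.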

    Probabilistic Numerics and Uncertainty in Computations

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numerical algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimisers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. Comment: Author Generated Postprint. 17 pages, 4 Figures, 1 Table
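    The defining feature, a numerical routine that reports an uncertainty alongside its answer, can be illustrated in its simplest form with Monte Carlo integration returning a standard error. This is a toy stand-in: the paper's methods place priors (e.g. Gaussian processes) on the unknown quantity itself, which this sketch does not do.

```python
import math
import random

def integrate_with_uncertainty(f, a, b, n, rng):
    """Monte Carlo estimate of the integral of f on [a, b].

    Returns (estimate, standard_error): the routine reports not just an
    answer but how uncertain that answer is, given the computation spent."""
    samples = [f(a + (b - a) * rng.random()) for _ in range(n)]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    width = b - a
    return width * mean, width * math.sqrt(var / n)

rng = random.Random(42)
est, err = integrate_with_uncertainty(lambda x: x * x, 0.0, 1.0, 20000, rng)
# True value is 1/3; est should lie within a few multiples of err.
```

    Propagating such per-stage uncertainties through a pipeline of solvers is exactly the composition problem the abstract's final sentence raises.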

    Raiders of the Lost Architecture: Kernels for Bayesian Optimization in Conditional Parameter Spaces

    In practical Bayesian optimization, we must often search over structures with differing numbers of parameters. For instance, we may wish to search over neural network architectures with an unknown number of layers. To relate performance data gathered for different architectures, we define a new kernel for conditional parameter spaces that explicitly includes information about which parameters are relevant in a given structure. We show that this kernel improves model quality and Bayesian optimization results over several simpler baseline kernels. Comment: 6 pages, 3 figures. Appeared in the NIPS 2013 workshop on Bayesian optimization
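    The idea of a kernel that knows which parameters are active can be sketched as follows. This is an illustrative simplification, not the paper's actual kernel: shared active dimensions contribute an RBF term, while a dimension active in only one point contributes a fixed discount `rho` (both the function name and the discount scheme are assumptions of this sketch, and positive-definiteness is not guaranteed in general).

```python
import math

def conditional_rbf(x, y, lengthscale=1.0, rho=0.5):
    """Similarity between two points in a conditional parameter space.

    `x` and `y` map parameter names to values; a name absent from a dict
    is inactive in that structure (e.g. 'units2' for a 1-layer net).
    Dimensions active in both points are compared with an RBF factor;
    dimensions active in only one contribute the discount `rho`."""
    k = 1.0
    for name in set(x) | set(y):
        if name in x and name in y:
            d = x[name] - y[name]
            k *= math.exp(-d * d / (2 * lengthscale ** 2))
        else:
            k *= rho
    return k

# Two 2-layer nets share both size parameters; a 1-layer net lacks 'units2'.
a = {"units1": 32.0, "units2": 64.0}
b = {"units1": 32.0, "units2": 64.0}
c = {"units1": 32.0}
```

    The point of such a construction is that observations from a 1-layer net still inform the surrogate model about 2-layer nets through their shared parameters, rather than being treated as incomparable.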

    AN EXAMINATION OF ECONOMIC EFFICIENCY OF RUSSIAN CROP OUTPUT IN THE REFORM PERIOD

    This paper examines the economic efficiency of Russian corporate farms over 1995-98. Economic efficiency declined over the period, owing to declines in both technical and allocative efficiency. According to the average technical efficiency scores, Russian agricultural production could improve by 17 percent (DEA analysis) to 43 percent (SFA analysis). The efficiency scores show that Russian agriculture presently uses relatively too much fertilizer and fuel and too little land and labor. Russian agriculture inherited machinery-intensive technology from the Soviet era, which may be inappropriate given the relative abundance of labor in the post-reform environment. Investment constraints have prevented the replacement of the old machinery-intensive technology with labor-intensive technology.
    Keywords: Crop Production/Industries, Productivity Analysis
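    A technical efficiency score of the kind reported here measures each producer against the best observed practice. The single-input, single-output special case makes the idea concrete; full DEA, as used in the paper, generalizes this to multiple inputs and outputs via linear programming, and the farm data below are invented for illustration:

```python
def technical_efficiency(farms):
    """Single-input, single-output technical efficiency scores.

    Each farm is (input, output); its efficiency is its output/input
    ratio relative to the best observed ratio (the 'frontier'), so the
    best farm scores 1.0 and every other farm scores below it."""
    ratios = {name: out / inp for name, (inp, out) in farms.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Hypothetical farms: (input used, output produced).
farms = {"A": (10.0, 20.0), "B": (10.0, 10.0), "C": (5.0, 15.0)}
scores = technical_efficiency(farms)
# C attains the best ratio (3.0) and defines the frontier; A and B score
# 2/3 and 1/3, i.e. they could expand output by 50% and 200% at best practice.
```

    An average score of, say, 0.83 across farms is what underlies a statement like "production could improve by 17 percent".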