628 research outputs found

    T3NS: three-legged tree tensor network states

    We present a new variational tree tensor network state (TTNS) ansatz, the three-legged tree tensor network state (T3NS). Physical tensors are interspersed with branching tensors. Physical tensors have one physical index and at most two virtual indices, as in the matrix product state (MPS) ansatz of the density matrix renormalization group (DMRG). Branching tensors have no physical index, but up to three virtual indices. In this way, advantages of DMRG, in particular a low computational cost and a simple implementation of symmetries, are combined with advantages of TTNS, namely incorporating more entanglement. Our code is capable of simulating quantum chemical Hamiltonians, and we present several proof-of-principle calculations on LiF, N₂, and the bis(μ-oxo) and μ-η²:η² peroxo isomers of [Cu₂O₂]²⁺.
    Comment: 14 pages, 8 figures
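
    As a rough illustration of the tensor structure described in this abstract, the NumPy sketch below (not the authors' code) builds one physical tensor and one branching tensor and contracts them along a shared virtual bond; the physical dimension d and bond dimension D are assumed values chosen for illustration.

```python
import numpy as np

d, D = 2, 10  # physical and virtual bond dimensions (assumed values)

# Physical tensor: one physical index, at most two virtual indices,
# exactly as for a site tensor in an MPS.
physical = np.random.rand(D, d, D)

# Branching tensor: no physical index, up to three virtual indices.
branching = np.random.rand(D, D, D)

# Contract the physical tensor's right virtual leg with one leg of the
# branching tensor; einsum makes the remaining open legs explicit.
merged = np.einsum('adb,bce->adce', physical, branching)
print(merged.shape)  # (D, d, D, D)
```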

    New Approaches for ab initio Calculations of Molecules with Strong Electron Correlation

    Reliable quantum chemical methods for the description of molecules with dense-lying frontier orbitals are needed in the context of many chemical compounds and reactions. Here, we review the developments that led to our new computational toolbox, which implements the quantum chemical density matrix renormalization group in a second-generation algorithm. We present an overview of the different components of this toolbox.
    Comment: 19 pages, 1 table

    Fluctuating Currents in Stochastic Thermodynamics II. Energy Conversion and Nonequilibrium Response in Kinesin Models

    Unlike macroscopic engines, the molecular machinery of living cells is strongly affected by fluctuations. Stochastic thermodynamics uses Markovian jump processes to model the random transitions between the chemical and configurational states of these biological macromolecules. A recently developed theoretical framework [Wachtel, Vollmer, Altaner: "Fluctuating Currents in Stochastic Thermodynamics I. Gauge Invariance of Asymptotic Statistics"] provides a simple algorithm for the determination of macroscopic currents and correlation integrals of arbitrary fluctuating currents. Here, we use it to discuss energy conversion and nonequilibrium response in different models for the molecular motor kinesin. Methodologically, our results demonstrate the effectiveness of the algorithm in dealing with parameter-dependent stochastic models. For the concrete biophysical problem, our results reveal two interesting features in experimentally accessible parameter regions: the validity of a nonequilibrium Green-Kubo relation at mechanical stalling, as well as negative differential mobility for superstalling forces.
    Comment: PACS numbers: 05.70.Ln, 05.40.-a, 87.10.Mn, 87.16.Nn. An accompanying publication, "Fluctuating Currents in Stochastic Thermodynamics I. Gauge Invariance of Asymptotic Statistics", is available at http://arxiv.org/abs/1407.206
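
    For readers unfamiliar with the Markovian-jump-process picture used here, the sketch below sets up a toy three-state cycle (a generic stand-in, not the authors' kinesin model), solves for its stationary distribution, and evaluates the mean probability current on one transition; all rates are assumed values.

```python
import numpy as np

# Transition rates w[i, j] from state i to state j (hypothetical values).
w = np.array([[0.0, 2.0, 0.5],
              [0.5, 0.0, 2.0],
              [2.0, 0.5, 0.0]])

# Generator matrix: off-diagonal rates; diagonal fixed by probability
# conservation so each row sums to zero.
L = w.copy()
np.fill_diagonal(L, -w.sum(axis=1))

# Stationary distribution pi solves pi @ L = 0 with sum(pi) = 1;
# solve it as a least-squares problem with the normalization appended.
A = np.vstack([L.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Mean (macroscopic) probability current on the edge 0 -> 1.
J = pi[0] * w[0, 1] - pi[1] * w[1, 0]
print(pi, J)
```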

    Tensor Networks for Dimensionality Reduction and Large-Scale Optimizations. Part 2 Applications and Future Perspectives

    Part 2 of this monograph builds on the introduction to tensor networks and their operations presented in Part 1. It focuses on tensor network models for super-compressed higher-order representation of data/parameters and related cost functions, while providing an outline of their applications in machine learning and data analytics. A particular emphasis is on the tensor train (TT) and Hierarchical Tucker (HT) decompositions, and their physically meaningful interpretations which reflect the scalability of the tensor network approach. Through a graphical approach, we also elucidate how, by virtue of the underlying low-rank tensor approximations and sophisticated contractions of core tensors, tensor networks have the ability to perform distributed computations on otherwise prohibitively large volumes of data/parameters, thereby alleviating or even eliminating the curse of dimensionality. The usefulness of this concept is illustrated over a number of applied areas, including generalized regression and classification (support tensor machines, canonical correlation analysis, higher order partial least squares), generalized eigenvalue decomposition, Riemannian optimization, and in the optimization of deep neural networks. Part 1 and Part 2 of this work can be used either as stand-alone separate texts, or indeed as a conjoint comprehensive review of the exciting field of low-rank tensor networks and tensor decompositions.
    Comment: 232 pages
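
    As a concrete illustration of the tensor train (TT) format emphasized in this abstract, the sketch below implements the standard TT-SVD scheme (sequential truncated SVDs); the input tensor and the rank cap are assumed values, and this is not code from the monograph.

```python
import numpy as np

def tt_svd(tensor, r_max):
    """Decompose `tensor` into a list of TT cores, capping ranks at r_max."""
    dims = tensor.shape
    cores, r = [], 1
    mat = tensor.reshape(r * dims[0], -1)
    for k in range(len(dims) - 1):
        # Split off one core via a truncated SVD of the unfolding matrix.
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r_new = min(r_max, len(S))
        cores.append(U[:, :r_new].reshape(r, dims[k], r_new))
        # Carry the remainder forward, unfolded for the next mode.
        mat = (S[:r_new, None] * Vt[:r_new]).reshape(r_new * dims[k + 1], -1)
        r = r_new
    cores.append(mat.reshape(r, dims[-1], 1))
    return cores

X = np.random.rand(4, 5, 6, 7)
cores = tt_svd(X, r_max=3)
print([c.shape for c in cores])  # [(1,4,3), (3,5,3), (3,6,3), (3,7,1)]
```

    Contracting the cores back together recovers the original tensor up to the error introduced by the rank truncation, which is the sense in which TT yields a "super-compressed" representation.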
