
    Quantum electric-dipole liquid on a triangular lattice

    Geometric frustration and quantum mechanical fluctuations may prohibit the formation of long-range order even at the lowest temperatures, so that liquid-like ground states can be expected. A well-known example is the quantum spin liquid in frustrated magnets, an exotic phase of matter that is attracting enormous interest. Geometric frustration and quantum fluctuations can arise beyond magnetic systems. Here we propose that quantum electric-dipole liquids, analogs of quantum spin liquids, could emerge in frustrated dielectrics where antiferroelectrically coupled small electric dipoles reside on a triangular lattice. The quantum paraelectric hexaferrite BaFe12O19, in which small electric dipoles originating from the off-center displacement of Fe3+ in the FeO5 bipyramids constitute a two-dimensional triangular lattice, is a promising candidate for realizing the anticipated electric-dipole liquid. We present a series of experimental results, including dielectric permittivity, heat capacity, and thermal conductivity measured down to 66 mK, that reveal the existence of a nontrivial ground state in BaFe12O19, characterized by itinerant low-energy excitations with a small gap, which we interpret as an exotic liquid-like quantum phase. Quantum electric-dipole liquids in frustrated dielectrics open up a fresh playground for fundamental physics and may also find applications in quantum information and computation. Comment: 13 pages, 6 figures
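    The dipole-spin analogy described in this abstract can be made concrete with a pseudospin model. The sketch below is an illustrative assumption rather than the Hamiltonian used in the paper: it writes the antiferroelectrically coupled dipoles on a triangular lattice as a transverse-field Ising model, where the Ising variable labels the two off-center displacement states of each FeO5 bipyramid and the transverse field encodes quantum tunneling between them.

```latex
% Illustrative pseudospin sketch (an assumption, not the paper's exact model):
% each FeO5 bipyramid dipole is mapped onto an Ising pseudospin \sigma^z_i = \pm 1,
% with antiferroelectric coupling J > 0 on the triangular lattice and a
% transverse field \Omega describing quantum tunneling of the off-center Fe^{3+} ion.
\begin{equation}
  H = J \sum_{\langle i,j \rangle} \sigma^{z}_{i}\,\sigma^{z}_{j}
      \;-\; \Omega \sum_{i} \sigma^{x}_{i}, \qquad J > 0 .
\end{equation}
```

    In this picture the antiferroelectric coupling J is frustrated on the triangular lattice, and a sufficiently large tunneling term can suppress dipole ordering, which is the qualitative scenario behind the proposed liquid-like ground state.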

    Exposing the Functionalities of Neurons for Gated Recurrent Unit Based Sequence-to-Sequence Model

    The goal of this paper is to report certain scientific findings about a Seq2Seq model. Analyzing the behavior of RNN-based models at the neuron level is considered more challenging than analyzing DNN or CNN models because of their recursive nature. This paper provides a neuron-level analysis to explain why a vanilla GRU-based Seq2Seq model without attention can achieve token positioning. We found four different types of neurons: storing, counting, triggering, and outputting, and further uncovered the mechanism by which these neurons work together to produce the right token in the right position. Comment: 9 pages (excluding references), 10 figures
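    The kind of neuron-level inspection described here requires access to the GRU hidden state at every decoding step. The sketch below is a minimal, assumed setup, not the authors' implementation: a vanilla GRU encoder-decoder without attention (the class name TinyGRUSeq2Seq, the sizes, and the probing routine are all illustrative) that records the decoder hidden states so individual neurons can be examined over time.

```python
# Minimal sketch of a vanilla GRU Seq2Seq model (no attention) that exposes
# its decoder hidden states for neuron-level inspection. All names and sizes
# here are illustrative assumptions.
import torch
import torch.nn as nn


class TinyGRUSeq2Seq(nn.Module):
    def __init__(self, vocab_size: int, hidden_size: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.encoder = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.decoder = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, src: torch.Tensor, tgt: torch.Tensor):
        # Encode the source; the final hidden state initializes the decoder.
        _, h = self.encoder(self.embed(src))
        # Decode step by step (teacher forcing), keeping every hidden state
        # so that single neurons can be plotted over decoding time.
        states, logits = [], []
        for t in range(tgt.size(1)):
            step_in = self.embed(tgt[:, t : t + 1])
            _, h = self.decoder(step_in, h)
            states.append(h.squeeze(0))            # (batch, hidden_size)
            logits.append(self.out(h.squeeze(0)))  # (batch, vocab_size)
        return torch.stack(logits, dim=1), torch.stack(states, dim=1)


if __name__ == "__main__":
    model = TinyGRUSeq2Seq(vocab_size=20)
    src = torch.randint(0, 20, (2, 7))  # toy source batch
    tgt = torch.randint(0, 20, (2, 7))  # toy target batch
    _, states = model(src, tgt)
    # states[:, :, k] is neuron k's trajectory across decoding steps; plotting
    # such trajectories is one way to look for counting- or triggering-like
    # behavior of the kind the paper categorizes.
    print(states.shape)  # torch.Size([2, 7, 64])
```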