127,319 research outputs found

    Random dispersion approximation for the Hubbard model

    We use the Random Dispersion Approximation (RDA) to study the Mott-Hubbard transition in the Hubbard model at half band filling. The RDA becomes exact for the Hubbard model in infinite dimensions. We implement the RDA on finite chains and employ the Lanczos exact diagonalization method in real space to calculate the ground-state energy, the average double occupancy, the charge gap, the momentum distribution, and the quasi-particle weight. We find satisfactory agreement with perturbative results in the weak- and strong-coupling limits. A straightforward extrapolation of the RDA data for lattices with $L \leq 14$ sites results in a continuous Mott-Hubbard transition at $U_{\rm c} \approx W$. We discuss the significance of a possible signature of a coexistence region between insulating and metallic ground states in the RDA that would correspond to the scenario of a discontinuous Mott-Hubbard transition as found in numerical investigations of the Dynamical Mean-Field Theory for the Hubbard model.
    Comment: 10 pages, 11 figures
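The Lanczos exact diagonalization mentioned in this abstract can be sketched as follows. This is the generic textbook Lanczos iteration for the lowest eigenvalue of a symmetric matrix, not the authors' code; the tight-binding test Hamiltonian and all parameter choices are purely illustrative.

```python
import numpy as np

def lanczos_ground_energy(H, k=30, seed=0):
    """Approximate the lowest eigenvalue of a real symmetric matrix H
    using at most k Lanczos steps (no re-orthogonalization)."""
    n = H.shape[0]
    rng = np.random.default_rng(seed)
    v = rng.normal(size=n)
    v /= np.linalg.norm(v)           # random normalized start vector
    v_prev = np.zeros(n)
    alphas, betas = [], []
    beta = 0.0
    for _ in range(min(k, n)):
        w = H @ v - beta * v_prev    # apply H, remove previous direction
        alpha = v @ w                # diagonal entry of the tridiagonal T
        w -= alpha * v
        alphas.append(alpha)
        beta = np.linalg.norm(w)     # off-diagonal entry of T
        if beta < 1e-12:             # exact invariant subspace found
            break
        betas.append(beta)
        v_prev, v = v, w / beta
    m = len(alphas)
    T = np.diag(alphas) + np.diag(betas[:m - 1], 1) + np.diag(betas[:m - 1], -1)
    # Extremal eigenvalues of the small tridiagonal T converge quickly
    # to the extremal eigenvalues of H.
    return np.linalg.eigvalsh(T)[0]
```

In a real Hubbard-model calculation, `H @ v` would be replaced by an on-the-fly matrix-vector product over the many-body basis, since the Hamiltonian is too large to store densely.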

    RDA: an innovation in cataloguing

    With effect from 31 March 2013, Resource Description and Access (RDA) became the cataloguing content standard used by the British Library and the Library of Congress. Alongside these institutions, other libraries, principally in the English-speaking world, have also adopted, or are planning to adopt, RDA. This article discusses what RDA is, how and why it is an innovation in cataloguing, and then examines its adoption by libraries. It also addresses implications for library catalogues. Particular emphasis is placed on the pattern of adoption, applying Everett Rogers' categorization to libraries as they implement RDA.

    Empowering indigenous communities to preserve and pass on cultural values (an assessment of the traditional village revitalization program)

    Empowering traditional villages has become an important step for the Directorate of Beliefs and Traditions in preserving Indonesian culture, and since 2013 a traditional village revitalization program (revitalisasi desa adat, RDA) has been carried out and continues to the present. The Center for Education and Culture Policy Research intends to examine the process and achievements of the RDA program in terms of how the program is implemented by the Directorate of Beliefs and Traditions, Directorate General of Culture, Ministry of Education and Culture; the extent to which the program's objectives have been achieved; and what strategies can be employed to optimize the achievement of those objectives.

    Discourse cues: Further evidence for the core contributor distinction

    Moser and Moore (1995, to appear) carried out a corpus study of discourse cues in tutorial dialogue. Their annotation uses Relational Discourse Analysis (RDA), which distinguishes core elements (nuclei-like) from contributors (satellite-like). In their discussion of these results, Moser and Moore propose that clauses in the contributor-core order are harder to understand than clauses in core-contributor order, but do not attempt to explain why the "hard" order is ever used. Here, we recruit evidence from work by Stevenson and her collaborators, which substantiates the empirical claim. We then suggest that by distinguishing information structure (given-new) from intentional structure (core-contributor), we can explain why hard orders are surprisingly frequent. We note, however, that this cannot be the whole story, and show how the hierarchical RDA structure helps account for differences between discourse cues such as since, so, this means, and therefore.

    The Extended Regularized Dual Averaging Method for Composite Optimization

    We present a new algorithm, extended regularized dual averaging (XRDA), for solving composite optimization problems; it generalizes the regularized dual averaging (RDA) method. The main novelty of the method is that it allows more flexible control of the backward step size. For instance, the backward step size for RDA grows without bound, while for XRDA it can be kept bounded.
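For context, the baseline RDA update for an l1-regularized problem has a closed form: soft-threshold the running average of the gradients. The sketch below follows the standard RDA scheme with $\beta_t = \gamma\sqrt{t}$; it is not the XRDA method of this paper, and the objective, regularization weight `lam`, and step constant `gamma` are illustrative choices.

```python
import numpy as np

def rda_l1(grad, x0, lam=0.1, gamma=1.0, steps=500):
    """l1-regularized dual averaging:
    x_{t+1} = argmin_x { <gbar_t, x> + lam*||x||_1 + (beta_t / t) * ||x||^2 / 2 },
    with beta_t = gamma*sqrt(t), solved in closed form by soft-thresholding."""
    x = x0.copy()
    gbar = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad(x)
        gbar += (g - gbar) / t     # running average of all past gradients
        beta = gamma * np.sqrt(t)
        # Soft-threshold the averaged gradient; components with
        # |gbar_i| <= lam are set exactly to zero (sparsity).
        x = -(t / beta) * np.sign(gbar) * np.maximum(np.abs(gbar) - lam, 0.0)
    return x
```

The effective backward step size $t/\beta_t = \sqrt{t}/\gamma$ grows without bound as $t$ increases, which is exactly the behavior the abstract says XRDA relaxes by allowing the backward step size to stay bounded.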