13,493 research outputs found

    Fractional Quantum Hall Physics in Jaynes-Cummings-Hubbard Lattices

    Get PDF
    Jaynes-Cummings-Hubbard arrays provide unique opportunities for quantum emulation, as they exhibit convenient state preparation and measurement, and in-situ tuning of parameters. We show how to realise strongly correlated states of light in Jaynes-Cummings-Hubbard arrays under the introduction of an effective magnetic field. The effective field is realised by dynamic tuning of the cavity resonances. We demonstrate the existence of Fractional Quantum Hall states by computing topological invariants, phase transitions between topologically distinct states, and Laughlin wavefunction overlap.
    Comment: 5 pages, 3 figures
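    The topological invariant referred to above is a Chern number. As a minimal, generic sketch of how such an invariant is computed numerically (using the standard Fukui-Hatsugai lattice method on a simple two-band model, not the paper's Jaynes-Cummings-Hubbard system; the model and mesh size here are illustrative assumptions):

```python
import numpy as np

def chern_number(m, n=20):
    """Chern number of the lower band of a two-band lattice model
    (Qi-Wu-Zhang-type), via the Fukui-Hatsugai lattice field strength.
    The plaquette phases are gauge invariant, so the arbitrary phase of
    each eigenvector cancels in the product of link variables."""
    ks = 2 * np.pi * np.arange(n) / n
    vecs = np.empty((n, n, 2), dtype=complex)
    for i, kx in enumerate(ks):
        for j, ky in enumerate(ks):
            # h(k) = sin(kx) sx + sin(ky) sy + (m + cos kx + cos ky) sz
            h = (np.sin(kx) * np.array([[0, 1], [1, 0]])
                 + np.sin(ky) * np.array([[0, -1j], [1j, 0]])
                 + (m + np.cos(kx) + np.cos(ky)) * np.array([[1, 0], [0, -1]]))
            w, v = np.linalg.eigh(h)
            vecs[i, j] = v[:, 0]            # lower-band eigenvector
    c = 0.0
    for i in range(n):
        for j in range(n):
            ip, jp = (i + 1) % n, (j + 1) % n
            u1 = np.vdot(vecs[i, j], vecs[ip, j])
            u2 = np.vdot(vecs[ip, j], vecs[ip, jp])
            u3 = np.vdot(vecs[ip, jp], vecs[i, jp])
            u4 = np.vdot(vecs[i, jp], vecs[i, j])
            c += np.angle(u1 * u2 * u3 * u4)  # field strength per plaquette
    return round(c / (2 * np.pi))
```

    For this model the invariant is nonzero (|C| = 1) in the topological phase |m| &lt; 2 and zero for |m| &gt; 2, so the jump in the integer marks the phase transition between topologically distinct states.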

    Bayesian Analysis of Continuous Time Models of the Australian Short Rate

    Get PDF
    This paper provides an empirical analysis of a range of alternative single-factor continuous time models for the Australian short-term interest rate. The models are indexed by the level effect parameter for the volatility in the short rate process. The inferential approach adopted is Bayesian, with estimation of the models proceeding via a Markov Chain Monte Carlo simulation scheme. Discrimination between the alternative models is based on Bayes factors, estimated from the simulation output using the Savage-Dickey density ratio. A data augmentation approach is used to improve the accuracy of the discrete time approximation of the continuous time models. An empirical investigation is conducted using weekly observations on the Australian 90 day interest rate from January 1990 to July 2000. The Bayes factors indicate that the square root diffusion model has the highest posterior probability of all the nested models.
    Keywords: Interest Rate Models, Markov Chain Monte Carlo, Data Augmentation
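    The Savage-Dickey device estimates the Bayes factor for a point restriction as the ratio of posterior to prior density at the restricted value, with the posterior density read off the MCMC output. A minimal sketch (the parameter values, prior, and the stand-in "draws" below are illustrative assumptions, not the paper's):

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

def savage_dickey_bf(draws, theta0, prior_pdf):
    """Bayes factor in favour of the restriction theta = theta0:
    posterior density at theta0 (kernel estimate from MCMC draws)
    divided by the prior density at theta0."""
    post_density = gaussian_kde(draws)(theta0)[0]
    return post_density / prior_pdf(theta0)

# Hypothetical example: posterior draws of the level-effect parameter
# gamma, testing gamma = 0.5 (the square-root diffusion restriction).
rng = np.random.default_rng(0)
draws = rng.normal(0.55, 0.1, size=5000)   # stand-in for real MCMC output
bf = savage_dickey_bf(draws, 0.5, lambda x: norm.pdf(x, 1.0, 1.0))
```

    A Bayes factor above one favours the restricted (square-root diffusion) model over the unrestricted one.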

    Simulation-Based Bayesian Estimation of Affine Term Structure Models

    Get PDF
    This paper demonstrates the application of Bayesian simulation-based estimation to a class of interest rate models known as Affine Term Structure (ATS) models. The technique used is based on a Markov Chain Monte Carlo algorithm, with the discrete observations on yields augmented by additional higher frequency latent data. The introduction of augmented yield data reduces the bias associated with estimating a continuous time model using discretely observed data. The technique is demonstrated using a one-factor ATS model, with the latent factor process that underlies the yields sampled via a single-move algorithm. Numerical application of the method is demonstrated using both simulated and empirical data. Extension of the method to a three-factor ATS model is also discussed, as well as the application of a multi-move sampler based on a Kalman Filtering and Smoothing algorithm.
    Keywords: Interest Rate Models, Markov Chain Monte Carlo, Data Augmentation, Nonlinear State Space Models, Kalman Filtering.
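    The discretisation bias mentioned above arises because the Euler transition density is only exact in the limit of small time steps; inserting latent high-frequency points between observations makes the step size small. A sketch of the Euler scheme for a square-root diffusion, with sub-steps between weekly observations (all parameter values are illustrative assumptions):

```python
import numpy as np

def euler_path(r0, kappa, theta, sigma, dt, n_steps, rng):
    """Euler discretisation of dr = kappa*(theta - r) dt + sigma*sqrt(r) dW
    (the square-root diffusion). Finer dt means a more accurate Gaussian
    approximation to the true transition density."""
    r = np.empty(n_steps + 1)
    r[0] = r0
    for i in range(n_steps):
        drift = kappa * (theta - r[i]) * dt
        diff = sigma * np.sqrt(max(r[i], 0.0)) * np.sqrt(dt) * rng.standard_normal()
        r[i + 1] = r[i] + drift + diff
    return r

rng = np.random.default_rng(1)
# One week between observations, split into 10 latent sub-steps:
path = euler_path(0.05, 0.5, 0.06, 0.05, (1 / 52) / 10, 10, rng)
```

    In the data-augmentation scheme the intermediate points are treated as unknowns and sampled alongside the parameters, rather than simulated forward as here.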

    Bayesian Inference for Heterogeneous Event Counts

    Full text link
    This article presents an integrated set of Bayesian tools one can use to model heterogeneous event counts. While models for event count cross sections are now widely used, little has been written about how to model counts when contextual factors introduce heterogeneity. The author begins with a discussion of Bayesian cross-sectional count models and discusses an alternative model for counts with overdispersion. To illustrate the Bayesian framework, the author fits the model to the number of women's rights cosponsorships for each member of the 83rd to 102nd House of Representatives. The model is generalized to allow for contextual heterogeneity. The hierarchical model allows one to explicitly model contextual factors and test alternative contextual explanations, even with a small number of contextual units. The author compares the estimates from this model with traditional approaches and discusses software one can use to easily implement these Bayesian models with little start-up cost.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/116234/1/smr03.pd
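    Overdispersion means the count variance exceeds what a Poisson model implies (variance equal to the mean); a standard way to capture it is a Poisson-gamma mixture, whose marginal is negative binomial. A small simulation sketch (the parameter values are illustrative assumptions, not estimates from the article):

```python
import numpy as np

rng = np.random.default_rng(2)

# Poisson-gamma mixture: each unit's Poisson rate is itself random,
# which inflates the variance of the counts beyond the mean.
alpha, mu = 2.0, 3.0
rates = mu * rng.gamma(alpha, 1 / alpha, size=10_000)  # E[rate]=mu, Var=mu^2/alpha
counts = rng.poisson(rates)
# Marginally: E[count] = mu, Var[count] = mu + mu^2/alpha > mu
```

    A pure Poisson fit to such data understates uncertainty; the hierarchical extension in the article lets the rate heterogeneity depend on contextual covariates.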

    The Scythe Statistical Library: An Open Source C++ Library for Statistical Computation

    Get PDF
    The Scythe Statistical Library is an open source C++ library for statistical computation. It includes a suite of matrix manipulation functions, a suite of pseudo-random number generators, and a suite of numerical optimization routines. Programs written using Scythe are generally much faster than those written in commonly used interpreted languages, such as R and MATLAB, and can be compiled on any system with the GNU GCC compiler (and perhaps with other C++ compilers). One of the primary design goals of the Scythe developers has been ease of use for non-expert C++ programmers. Ease of use is provided through three primary mechanisms: (1) operator and function over-loading, (2) numerous pre-fabricated utility functions, and (3) clear documentation and example programs. Additionally, Scythe is quite flexible and entirely extensible because the source code is available to all users under the GNU General Public License.

    LATEX For the Rest of Us

    Full text link
    http://deepblue.lib.umich.edu/bitstream/2027.42/116268/1/tpm02.pd

    Introduction: Empirical Research on Decision-Making in the Federal Courts

    Get PDF
    The invention of the desktop computer and the widespread use of the Internet have revolutionized life in many ways over the last twenty-five years. They have also fundamentally changed the way in which we are able to study law. Since the 1980s, a vast amount of data has been collected about courts, both in the United States and abroad. We now have the requisite computing power to process and analyze these data, ultimately with the goal of learning how law works. Many law schools now house scholars who conduct empirical legal scholarship: scholarship that uses data and modern social scientific methods to understand law. At Washington University in St. Louis, the Center for Empirical Research in the Law provides infrastructure for this type of scholarship. Volume 29 of the Washington University Journal of Law and Policy contains eight excellent examples of this type of work conducted by scholars working in the legal academy, the social sciences, and their intersection. The articles focus broadly on decision-making in the federal courts. The authors use innovative data sources and state-of-the-art methodology to study all three levels of the federal judiciary. Three of the Articles in the volume look at decision-making in the federal district courts. The importance of ideology and the methodology used to detect it in the Federal Courts of Appeals is the focus of two articles in the volume. Finally, three articles look at the United States Supreme Court.
    • ā€¦
    corecore