
    Coding against synchronisation and related errors

    In this thesis, we study aspects of coding against synchronisation errors, such as deletions and replications, and related errors. Synchronisation errors are a source of fundamental open problems in information theory, because they introduce correlations between output symbols even when input symbols are independently distributed. We focus on random errors and consider two complementary problems. The first is the optimal rate of reliable information transmission through channels with synchronisation and related errors (the channel capacity); unlike for simpler error models, the capacity of such channels is unknown. We begin with the geometric sticky channel, which replicates input bits according to a geometric distribution. Previously, bounds on its capacity were known only via numerical methods, which do not aid our conceptual understanding of this quantity. We derive sharp analytical capacity upper bounds that approach, and sometimes surpass, the numerical bounds, opening the door to a mathematical treatment of its capacity. We then consider the geometric deletion channel, which combines deletions and geometric replications. We derive analytical capacity upper bounds and, notably, prove that the capacity is bounded away from the maximum when the deletion probability is small, meaning that this channel behaves differently from related well-studied channels in this regime. We also adapt techniques developed to handle synchronisation errors to derive improved upper bounds and structural results on the capacity of the discrete-time Poisson channel, a model of optical communication. The second problem, motivated by portable DNA-based storage and trace reconstruction, is coded trace reconstruction, where the goal is to design efficiently encodable high-rate codes whose codewords can be efficiently reconstructed from few reads corrupted by deletions. Remarkably, we design such n-bit codes with rate 1 - O(1/log n) that require exponentially fewer reads than average-case trace reconstruction algorithms.
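
    The channel models mentioned above can be made concrete with a short simulation. The sketch below is an illustrative Python implementation, not taken from the thesis: the parametrisation of the geometric replication law and the parameter names p (replication parameter) and d (deletion probability) are assumptions made for the example.

    import random

    def geometric_sticky_channel(bits, p, rng=random):
        """Replicate each input bit k times, with k drawn from a geometric
        law on {1, 2, ...} (assumed parametrisation: each extra copy is
        emitted with probability 1 - p)."""
        out = []
        for b in bits:
            k = 1
            while rng.random() < 1 - p:  # one more copy with prob. 1 - p
                k += 1
            out.extend([b] * k)
        return out

    def geometric_deletion_channel(bits, d, p, rng=random):
        """Delete each bit with probability d; otherwise replicate it
        geometrically as above (again, an illustrative parametrisation)."""
        out = []
        for b in bits:
            if rng.random() < d:
                continue  # deletion: the bit leaves no trace in the output
            out.extend(geometric_sticky_channel([b], p, rng))
        return out

    x = [random.randint(0, 1) for _ in range(20)]
    print(x)
    print(geometric_sticky_channel(x, p=0.7))
    print(geometric_deletion_channel(x, d=0.1, p=0.7))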

    Expressions for the entropy of binomial-type distributions

    We develop a general method for computing logarithmic and log-gamma expectations of distributions. As a result, we derive series expansions and integral representations of the entropy for several fundamental distributions, including the Poisson, binomial, beta-binomial, negative binomial, and hypergeometric distributions. Our results also establish connections between these entropy functions and the Riemann zeta function and its generalizations.
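
    As a reference point for the quantity the paper studies, the sketch below (illustrative Python, not the paper's method) evaluates the Shannon entropy of a Binomial(n, p) distribution by direct summation over its probability mass function and compares it with the classical Gaussian-type asymptotic (1/2) log(2*pi*e*n*p*(1-p)).

    from math import comb, log, pi, e

    def binomial_entropy(n, p):
        """Shannon entropy (in nats) of Binomial(n, p), computed by direct
        summation of -P(X = k) log P(X = k).  This brute-force evaluation is
        only a reference point; the paper derives series expansions and
        integral representations for the same quantity."""
        h = 0.0
        for k in range(n + 1):
            pk = comb(n, k) * p**k * (1 - p)**(n - k)
            if pk > 0:
                h -= pk * log(pk)
        return h

    n, p = 200, 0.3
    print(binomial_entropy(n, p))                      # exact (numerical) value
    print(0.5 * log(2 * pi * e * n * p * (1 - p)))     # classical asymptotic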

    Monetary Policy Under Uncertainty in Micro-Founded Macroeconometric Models

    We use a micro-founded macroeconometric modeling framework to investigate the design of monetary policy when the central bank faces uncertainty about the true structure of the economy. We apply Bayesian methods to estimate the parameters of the baseline specification using postwar U.S. data, and then determine the policy under commitment that maximizes household welfare. We find that the performance of the optimal policy is closely matched by a simple operational rule that focuses solely on stabilizing nominal wage inflation. Furthermore, this simple wage stabilization rule is remarkably robust to uncertainty about the model parameters and to various assumptions regarding the nature and incidence of the innovations. However, the characteristics of optimal policy are very sensitive to the specification of the wage contracting mechanism, thereby highlighting the importance of additional research regarding the structure of labor markets and wage determination.
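
    To make the notion of a simple operational rule concrete, the sketch below is a hypothetical Python illustration of an interest-rate rule that reacts only to nominal wage inflation; the functional form and the names r_star, pi_star, and phi_w are assumptions for the example, not the rule estimated in the paper.

    def wage_stabilization_rule(wage_inflation, r_star=0.02, pi_star=0.0, phi_w=1.5):
        """Illustrative policy rule: i = r* + pi_w + phi_w * (pi_w - pi_w*).
        Coefficients and functional form are assumed for the example only."""
        return r_star + wage_inflation + phi_w * (wage_inflation - pi_star)

    print(wage_stabilization_rule(0.01))  # implied policy rate when wage inflation is 1%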

    Imperfect Information and Aggregate Supply

    This paper surveys the research in the past decade on imperfect information models of aggregate supply and the Phillips curve. This new work has emphasized that information is dispersed and disseminates slowly across a population of agents who strategically interact in their use of information. We discuss the foundations on which models of aggregate supply rest, as well as the micro-foundations for two classes of imperfect information models: models with partial information, where agents observe economic conditions with noise, and models with delayed information, where they observe economic conditions with a lag. We derive the implications of these two classes of models for the existence of a non-vertical aggregate supply curve, the persistence of the real effects of monetary policy, the difference between idiosyncratic and aggregate shocks, the dynamics of disagreement, and the role of transparency in policy. Finally, we present some of the topics on the research frontier in this area.
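
    To illustrate the two classes of models, the sketch below is a toy Python simulation (the gain, noise level, and updating probability are arbitrary assumptions, not taken from the survey) contrasting average expectations of an aggregate shock under noisy signals (partial information) and under infrequent observation (delayed information).

    import random

    def noisy_info_expectations(shock, noise_sd, n_agents, gain, rng=random):
        """Partial-information sketch: each agent sees the aggregate shock
        with idiosyncratic noise and scales its signal by a fixed gain
        (a stand-in for optimal signal extraction).  The average expectation
        responds only partially to the true shock."""
        estimates = []
        for _ in range(n_agents):
            signal = shock + rng.gauss(0.0, noise_sd)
            estimates.append(gain * signal)
        return sum(estimates) / n_agents

    def delayed_info_expectations(shock, update_prob, n_agents, rng=random):
        """Delayed-information sketch: each period only a fraction of agents
        observes the shock at all; the rest keep a stale expectation of zero."""
        estimates = [shock if rng.random() < update_prob else 0.0
                     for _ in range(n_agents)]
        return sum(estimates) / n_agents

    print(noisy_info_expectations(1.0, noise_sd=1.0, n_agents=10000, gain=0.5))
    print(delayed_info_expectations(1.0, update_prob=0.25, n_agents=10000))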