    Iterative Approximate Consensus in the presence of Byzantine Link Failures

    This paper explores the problem of reaching approximate consensus in synchronous point-to-point networks, where each directed link of the underlying communication graph represents a communication channel between a pair of nodes. We adopt the transient Byzantine link failure model [15, 16], where an omniscient adversary controls a subset of the directed communication links, but the nodes are assumed to be fault-free. Recent work has addressed the problem of reaching approximate consensus in incomplete graphs with Byzantine nodes using a restricted class of iterative algorithms that maintain only a small amount of memory across iterations [22, 21, 23, 12]. However, to the best of our knowledge, we are the first to consider approximate consensus in the presence of Byzantine links. We extend our past work that provided an exact characterization of graphs in which the iterative approximate consensus problem in the presence of Byzantine node failures is solvable [22, 21]. In particular, we prove a tight necessary and sufficient condition on the underlying communication graph for the existence of iterative approximate consensus algorithms under the transient Byzantine link model. The condition answers (part of) the open problem stated in [16].
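
    The abstract refers to iterative algorithms that keep only a small amount of state across iterations. As a purely illustrative sketch (this is the standard trimmed-mean update from the cited node-failure literature [22, 21], not the paper's link-failure algorithm; the parameter f and the function name are assumptions), one round of such an iteration can look like this in Python:

    def trimmed_mean_update(own_value, received_values, f):
        # Drop the f smallest and f largest received values, then average
        # the survivors together with the node's own value.
        vals = sorted(received_values)
        if len(vals) <= 2 * f:
            return own_value  # too few values to trim safely; keep the current state
        survivors = vals[f:len(vals) - f]
        pool = survivors + [own_value]
        return sum(pool) / len(pool)

    # Example: a node holding 0.4 receives one corrupted value (100.0) among its neighbours' reports.
    print(trimmed_mean_update(0.4, [0.1, 0.5, 100.0, 0.3], f=1))  # stays close to the honest values

    Because each node discards the extremes before averaging, a bounded number of corrupted values per round cannot pull its state outside the range spanned by the honest values, which is what allows such memory-light iterations to converge to approximate agreement on suitable graphs.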

    Asymptotic behaviour of random tridiagonal Markov chains in biological applications

    Discrete-time, discrete-state random Markov chains with a tridiagonal generator are shown to have a random attractor consisting of singleton subsets, essentially a random path, in the simplex of probability vectors. The proof uses the Hilbert projection metric and the fact that the linear cocycle generated by the Markov chain is a uniformly contractive mapping of the positive cone into itself. The proof does not involve probabilistic properties of the sample path and is thus equally valid in the nonautonomous deterministic context of Markov chains with, say, periodically varying transition probabilities, in which case the attractor is a periodic path.
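
    A minimal numerical illustration of the contraction described above (the 4-state chain and its random transition probabilities are assumptions made for the example, not taken from the paper): two different initial probability vectors driven by the same sequence of random tridiagonal stochastic matrices quickly collapse onto a single path.

    import numpy as np

    def random_tridiagonal_stochastic(n, rng):
        # Row-stochastic matrix whose nonzero entries lie on the main diagonal
        # and the two adjacent diagonals.
        P = np.zeros((n, n))
        for i in range(n):
            lo, hi = max(0, i - 1), min(n, i + 2)
            row = rng.random(hi - lo)
            P[i, lo:hi] = row / row.sum()
        return P

    rng = np.random.default_rng(0)
    p = np.array([1.0, 0.0, 0.0, 0.0])  # two different initial distributions
    q = np.array([0.0, 0.0, 0.0, 1.0])
    for _ in range(50):
        P = random_tridiagonal_stochastic(4, rng)  # a fresh random transition matrix each step
        p, q = p @ P, q @ P
    print(np.abs(p - q).max())  # essentially zero: both trajectories follow the same (random) path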

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made the numerical evaluation of sophisticated statistical models possible, these models are still designed by humans because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to think as humans do when faced with uncertainty is challenging in several major ways: (1) finding optimal statistical models has yet to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations, and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite-dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification and Information Based Complexity.
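
    To make the notion of an "optimal statistical estimator" concrete, here is a deliberately toy Wald-style minimax calculation; the model (estimating a mean m from a single observation X with E[X] = m, Var[X] <= 1 and |m| <= 1, using shrinkage estimators x -> c*x) is an assumption chosen for illustration, not the framework developed in the paper.

    def worst_case_risk(c):
        # Mean-squared error of the estimator x -> c*x is c^2*Var + (c-1)^2*m^2;
        # maximise it over the admissible scenarios Var <= 1 and |m| <= 1.
        return c**2 * 1.0 + (c - 1.0)**2 * 1.0

    # Grid search for the minimax shrinkage factor.
    grid = [i / 1000 for i in range(1001)]
    best_c = min(grid, key=worst_case_risk)
    print(best_c, worst_case_risk(best_c))  # ~0.5 and 0.5: the minimax estimator halves the observation

    Beyond such toy cases, posing and discretizing this kind of worst-case optimisation over infinite-dimensional sets of scenarios is exactly where the difficulties (1) and (2) above arise.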

    Quantum channels and memory effects

    Any physical process can be represented as a quantum channel mapping an initial state to a final state. Hence it can be characterized from the point of view of communication theory, i.e., in terms of its ability to transfer information. Quantum information provides a theoretical framework and the proper mathematical tools to accomplish this. In this context the notions of codes and communication capacities have been introduced by generalizing them from the classical Shannon theory of information transmission and error correction. The underlying assumption of this approach is to consider the channel as acting not on a single system but on sequences of systems which, when properly initialized, allow one to overcome the noisy effects induced by the physical process under consideration. While most of the work produced so far has focused on the case in which a given channel transformation acts identically and independently on the various elements of the sequence (the memoryless configuration, in jargon), correlated error models appear to be a more realistic way to approach the problem. A slightly different, yet conceptually related, notion of correlated errors applies to a single quantum system which evolves continuously in time under the influence of an external disturbance that acts on it in a non-Markovian fashion. This leads to the study of memory effects in quantum channels: a fertile ground where interesting novel phenomena emerge at the intersection of quantum information theory and other branches of physics. This article surveys the field of quantum channel theory while also embracing these specific and complex settings.
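
    As a small illustration of the memoryless versus correlated ("memory") error models contrasted above, the following sketch compares two uses of a phase-flip channel in the Kraus-operator picture; the flip probability p and the maximally entangled input are assumptions made for the example, not taken from the review.

    import numpy as np

    I2 = np.eye(2)
    Z = np.diag([1.0, -1.0])
    p = 0.3
    K = [np.sqrt(1 - p) * I2, np.sqrt(p) * Z]  # single-use Kraus operators of a phase-flip channel

    def apply_channel(kraus, rho):
        # Apply a channel given by Kraus operators {A_k}: rho -> sum_k A_k rho A_k^dagger.
        return sum(A @ rho @ A.conj().T for A in kraus)

    # Two channel uses acting on a maximally entangled two-qubit input.
    bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
    rho_in = np.outer(bell, bell)

    memoryless = [np.kron(A, B) for A in K for B in K]  # errors hit each use independently
    correlated = [np.sqrt(1 - p) * np.kron(I2, I2), np.sqrt(p) * np.kron(Z, Z)]  # same error on both uses

    print(np.trace(apply_channel(memoryless, rho_in) @ rho_in).real)  # overlap with the input: ~0.58
    print(np.trace(apply_channel(correlated, rho_in) @ rho_in).real)  # overlap with the input: 1.0

    Perfectly correlated dephasing leaves this entangled input untouched while the memoryless model degrades it, a simple instance of how memory effects can change a channel's ability to transfer information.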

    From Wald to Savage: homo economicus becomes a Bayesian statistician

    Bayesian rationality is the paradigm of rational behavior in neoclassical economics. A rational agent in an economic model is one who maximizes her subjective expected utility and consistently revises her beliefs according to Bayes’s rule. The paper raises the question of how, when and why this characterization of rationality came to be endorsed by mainstream economists. Though no definitive answer is provided, it is argued that the question is far from trivial and of great historiographic importance. The story begins with Abraham Wald’s behaviorist approach to statistics and culminates with Leonard J. Savage’s elaboration of subjective expected utility theory in his 1954 classic, The Foundations of Statistics. It is the latter’s acknowledged failure to achieve its intended goal, the reinterpretation of traditional inferential techniques along subjectivist and behaviorist lines, that raises the puzzle of how a failed project in statistics could turn into such a tremendous hit in economics. A couple of tentative answers are offered, involving the role of the consistency requirement in neoclassical analysis and the impact of the postwar transformation of US business schools.