
    Statistical Decisions Using Likelihood Information Without Prior Probabilities

    This is a short nine-page version of a longer working paper titled "Decision Making on the Sole Basis of Statistical Likelihood," School of Business Working Paper, Revised November 2004. This paper presents a decision-theoretic approach to statistical inference that satisfies the Likelihood Principle (LP) without using prior information. Unlike the Bayesian approach, which also satisfies LP, we do not assume knowledge of the prior distribution of the unknown parameter. With respect to information that can be obtained from an experiment, our solution is more efficient than Wald's minimax solution. However, with respect to information assumed to be known before the experiment, our solution demands less input than the Bayesian solution.
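    The contrast drawn here is about how much must be known before the experiment. The sketch below is a toy illustration of my own (a two-hypothesis problem I am assuming, not the paper's construction) of the three information requirements: the Bayes rule needs a prior, while the minimax and likelihood-only rules do not.

# Toy sketch (assumed setup, not the paper's method): choosing between two simple
# hypotheses theta = 0 and theta = 1 from one observation x ~ N(theta, 1) under 0-1 loss.
import numpy as np
from scipy.stats import norm

def bayes_decision(x, prior_theta1):
    """Requires a prior: decide theta = 1 when the posterior odds exceed 1."""
    lr = norm.pdf(x, loc=1.0) / norm.pdf(x, loc=0.0)          # likelihood ratio
    return int(lr * prior_theta1 / (1.0 - prior_theta1) > 1.0)

def minimax_decision(x):
    """No prior needed; for this symmetric problem the minimax rule thresholds at 0.5."""
    return int(x > 0.5)

def likelihood_only_decision(x):
    """Uses the likelihood alone: pick the hypothesis with the larger likelihood."""
    return int(norm.pdf(x, loc=1.0) > norm.pdf(x, loc=0.0))

x = 0.8
print(bayes_decision(x, prior_theta1=0.2), minimax_decision(x), likelihood_only_decision(x))

    In this symmetric toy case the likelihood-only and minimax rules happen to coincide; the point of the illustration is only the difference in inputs, since the Bayes rule needs prior_theta1 and the other two do not.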

    Coherent frequentism

    By representing the range of fair betting odds according to a pair of confidence set estimators, dual probability measures on parameter space called frequentist posteriors secure the coherence of subjective inference without any prior distribution. The closure of the set of expected losses corresponding to the dual frequentist posteriors constrains decisions without arbitrarily forcing optimization under all circumstances. This decision theory reduces to those that maximize expected utility when the pair of frequentist posteriors is induced by an exact or approximate confidence set estimator or when an automatic reduction rule is applied to the pair. In such cases, the resulting frequentist posterior is coherent in the sense that, as a probability distribution of the parameter of interest, it satisfies the axioms of the decision-theoretic and logic-theoretic systems typically cited in support of the Bayesian posterior. Unlike the p-value, the confidence level of an interval hypothesis derived from such a measure is suitable as an estimator of the indicator of hypothesis truth, since it converges in sample-space probability to 1 if the hypothesis is true or to 0 otherwise under general conditions. Comment: The confidence-measure theory of inference and decision is explicitly extended to vector parameters of interest. The derivation of upper and lower confidence levels from valid and nonconservative set estimators is formalized.
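    The convergence property claimed in the last sentence can be seen numerically. The sketch below is a minimal assumed example (a normal mean with known variance, not the paper's general construction): the confidence-distribution mass assigned to an interval hypothesis approaches 1 when the hypothesis is true and 0 when it is false as the sample size grows.

# Minimal sketch (assumed normal-mean setting, not the paper's formal derivation):
# confidence attached to the interval hypothesis H: mu in [a, b] for data x_i ~ N(mu, 1).
import numpy as np
from scipy.stats import norm

def confidence_of_interval(xbar, n, a, b, sigma=1.0):
    # Mass the confidence distribution of the mean assigns to [a, b].
    se = sigma / np.sqrt(n)
    return norm.cdf((b - xbar) / se) - norm.cdf((a - xbar) / se)

rng = np.random.default_rng(0)
true_mu, a, b = 0.3, 0.0, 0.5                  # H is true for this choice of mu
for n in (10, 100, 1000, 10000):
    xbar = rng.normal(true_mu, 1.0, size=n).mean()
    print(n, round(confidence_of_interval(xbar, n, a, b), 3))
# With true_mu inside [a, b] the printed level tends to 1; move true_mu outside and it tends to 0.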

    Classes of decision analysis

    The ultimate task of an engineer is to develop a consistent decision procedure for the planning, design, construction, use, and management of a project. Moreover, the utility over the entire lifetime of the project should be maximized, considering requirements with respect to the safety of individuals and the environment as specified in regulations. Because the information on design parameters is usually incomplete or uncertain, decisions are made under uncertainty. To cope with this, Bayesian statistical decision theory can be used to incorporate objective as well as subjective information (e.g. engineering judgement). In this factsheet, the decision tree is presented and answers are given to the questions of how new data can be combined with prior probabilities that have been assigned, and whether it is beneficial to collect more information before the final decision is made. Decision making based on prior analysis and posterior analysis is briefly explained. Pre-posterior analysis is considered in more detail, and the Value of Information (VoI) is defined.
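    To make the pre-posterior idea concrete, a minimal numerical sketch follows. The two-action, two-state setting and the payoff numbers are assumed for illustration and do not come from the factsheet; the quantity computed is the expected value of perfect information (EVPI), a simple upper bound on the Value of Information of any real experiment.

# Minimal sketch (assumed numbers, not from the factsheet): prior analysis versus
# acting after perfect information, and the resulting expected value of perfect information.
import numpy as np

prior = np.array([0.7, 0.3])                  # assumed P(favourable state), P(unfavourable state)
# utility[action, state]; action 0 = build as planned, action 1 = reinforce first (assumed payoffs)
utility = np.array([[100.0, -50.0],
                    [ 60.0,  40.0]])

expected_u = utility @ prior                  # prior analysis: expected utility of each action now
best_without_info = expected_u.max()
best_with_perfect_info = (utility.max(axis=0) * prior).sum()   # learn the state first, then act

evpi = best_with_perfect_info - best_without_info
print(best_without_info, best_with_perfect_info, evpi)         # 55.0 82.0 27.0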

    Topics in inference and decision-making with partial knowledge

    Two essential elements needed in the process of inference and decision-making are prior probabilities and likelihood functions. When both of these components are known accurately and precisely, the Bayesian approach provides a consistent and coherent solution to the problems of inference and decision-making. In many situations, however, one or both of these components may not be known, or at least may not be known precisely. This problem of partial knowledge about prior probabilities and likelihood functions is addressed here. There are at least two ways to cope with this lack of precise knowledge: robust methods and interval-valued methods. First, ways of modeling imprecision and indeterminacy in prior probabilities and likelihood functions are examined; then the way imprecision in these components carries over to the posterior probabilities is examined. Finally, the problem of decision making with imprecise posterior probabilities and the consequences of such actions are addressed. The above problems arise, for example, in statistical pattern recognition, such as the classification of high-dimensional multispectral remote sensing image data.
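    A small sketch of the interval-valued idea follows, using an assumed two-hypothesis problem of my own rather than one of the thesis examples: when the prior P(H1) is only known to lie in an interval, sweeping it over that interval yields an interval of posterior probabilities rather than a single number.

# Minimal sketch (assumed setup): imprecise prior -> interval-valued posterior.
import numpy as np
from scipy.stats import norm

def posterior_h1(x, prior_h1):
    # Two simple hypotheses: x ~ N(0, 1) under H0 and x ~ N(1, 1) under H1.
    l0, l1 = norm.pdf(x, loc=0.0), norm.pdf(x, loc=1.0)
    return prior_h1 * l1 / (prior_h1 * l1 + (1.0 - prior_h1) * l0)

x = 0.9
prior_range = np.linspace(0.2, 0.6, 101)       # the prior P(H1) is only known to lie in [0.2, 0.6]
post = posterior_h1(x, prior_range)
print(post.min(), post.max())                  # lower and upper posterior probability of H1

    If the resulting posterior interval straddles the decision threshold, the choice of action is left indeterminate, which is exactly the decision-making difficulty addressed above.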

    Keep Ballots Secret: On the Futility of Social Learning in Decision Making by Voting

    We show that social learning is not useful in a model of team binary decision making by voting, where each vote carries equal weight. Specifically, we consider Bayesian binary hypothesis testing where agents have any conditionally independent observation distribution and their local decisions are fused by any L-out-of-N fusion rule. The agents make local decisions sequentially, with each allowed to use its own private signal and all precedent local decisions. Though social learning generally occurs, in that precedent local decisions affect an agent's belief, optimal team performance is obtained when all precedent local decisions are ignored. Thus, social learning is futile, and secret ballots are optimal. This contrasts with typical studies of social learning because we include a fusion center rather than concentrating on the performance of the latest-acting agents.
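    The sketch below is a toy Monte Carlo of my own (it assumes Gaussian signals, equal priors, and an ad hoc herding rule; it is not the paper's model or proof). It compares majority fusion of secret-ballot votes with a variant in which an agent copies a strong majority of precedent votes, one crude form the social learning described above can take.

# Toy Monte Carlo sketch (assumed model, not the paper's): N agents vote on a binary
# hypothesis from conditionally independent Gaussian signals; a fusion center applies
# an L-out-of-N rule. The herding variant lets an agent follow a strong precedent majority.
import numpy as np

rng = np.random.default_rng(1)
N, L, trials = 9, 5, 20000

def team_error_rate(herding):
    errors = 0
    for _ in range(trials):
        h = rng.integers(2)                             # true hypothesis, equal priors
        signals = rng.normal(loc=h, scale=1.0, size=N)  # conditionally independent observations
        votes = []
        for i in range(N):
            vote = int(signals[i] > 0.5)                # each agent's own likelihood-ratio test
            if herding and len(votes) >= 3 and abs(sum(votes) - len(votes) / 2) >= 2:
                vote = int(sum(votes) > len(votes) / 2) # copy a strong precedent majority
            votes.append(vote)
        decision = int(sum(votes) >= L)                 # L-out-of-N fusion
        errors += int(decision != h)
    return errors / trials

print("secret ballots:", team_error_rate(False), "herding:", team_error_rate(True))

    The herding rule here is deliberately crude and discards the agent's own signal, whereas the paper's result concerns optimally designed local rules; the sketch only makes the two information patterns concrete (own signal only versus own signal plus precedent votes), and does not by itself establish the optimality claim.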