Plausibility functions and exact frequentist inference
In the frequentist program, inferential methods with exact control on error
rates are a primary focus. The standard approach, however, is to rely on
asymptotic approximations, which may not be suitable. This paper presents a
general framework for the construction of exact frequentist procedures based on
plausibility functions. It is shown that the plausibility function-based tests
and confidence regions have the desired frequentist properties in finite
samples---no large-sample justification needed. An extension of the proposed
method is also given for problems involving nuisance parameters. Examples
demonstrate that the plausibility function-based method is both exact and
efficient in a wide variety of problems.
Comment: 21 pages, 5 figures, 3 tables
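The core construction can be sketched numerically. A minimal illustration, assuming an i.i.d. Exponential(theta) model (an illustrative choice, not an example from the paper): the plausibility function pl_x(theta) is the probability, under theta, that the relative likelihood falls at or below its observed value, and rejecting H0: theta = theta0 when pl_x(theta0) < alpha gives an exact size-alpha test.

```python
import math
import random

def rel_lik_exp(x, theta):
    # Relative likelihood L(theta)/L(theta_hat) for i.i.d. Exponential(rate=theta).
    n, s = len(x), sum(x)
    theta_hat = n / s  # MLE of the rate
    loglik = lambda t: n * math.log(t) - t * s
    return math.exp(loglik(theta) - loglik(theta_hat))

def plausibility(x, theta, n_mc=2000, seed=0):
    # Monte Carlo estimate of pl_x(theta) = P_theta{ R(X, theta) <= R(x, theta) }.
    rng = random.Random(seed)
    r_obs = rel_lik_exp(x, theta)
    n = len(x)
    hits = 0
    for _ in range(n_mc):
        sim = [rng.expovariate(theta) for _ in range(n)]
        if rel_lik_exp(sim, theta) <= r_obs:
            hits += 1
    return hits / n_mc

# Data from the true rate 2.0; the test rejects theta0 when plausibility < alpha.
rng_data = random.Random(1)
data = [rng_data.expovariate(2.0) for _ in range(30)]
pl_true = plausibility(data, 2.0)   # plausibility of the true rate
pl_far = plausibility(data, 10.0)   # plausibility of a badly wrong rate
```

By construction, pl_x(theta) at the true theta is (approximately) uniformly distributed, which is what makes the resulting test exact in finite samples rather than asymptotically.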
EEMCS final report for the causal modeling for air transport safety (CATS) project
This document reports on the work realized by the DIAM in relation to the completion of the CATS model as presented in Figure 1.6 and explains some of the steps taken for its completion. The project spans a period of three years. Intermediate reports have been presented throughout the project's progress; these are collected in Appendix 1. In this report the continuous-discrete distribution-free BBNs are briefly discussed, the human reliability models developed for dealing with dependence in the model variables are described, and the software application UniNet is presented.
A Knowledge Gradient Policy for Sequencing Experiments to Identify the Structure of RNA Molecules Using a Sparse Additive Belief Model
We present a sparse knowledge gradient (SpKG) algorithm for adaptively
selecting the targeted regions within a large RNA molecule to identify which
regions are most amenable to interactions with other molecules. Experimentally,
such regions can be inferred from fluorescence measurements obtained by binding
a complementary probe with fluorescence markers to the targeted regions. We use
a biophysical model which shows that the fluorescence ratio on a log scale
has a sparse linear relationship with the coefficients describing the
accessibility of each nucleotide, since not all sites are accessible (due to
the folding of the molecule). The SpKG algorithm uniquely combines the Bayesian
ranking and selection problem with the frequentist regularized
regression approach Lasso. We use this algorithm to identify the sparsity
pattern of the linear model as well as sequentially decide the best regions to
test before the experimental budget is exhausted. In addition, we develop two
other new algorithms: the batch SpKG algorithm, which generates several
suggestions at a time so that experiments can be run in parallel; and batch
SpKG with a procedure we call length mutagenesis, which dynamically adds new
alternatives, in the form of new probe types created by inserting, deleting,
or mutating nucleotides within existing probes. In simulation, we demonstrate these
algorithms on the Group I intron (a mid-size RNA molecule), showing that they
efficiently learn the correct sparsity pattern, identify the most accessible
region, and outperform several other policies.
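The Lasso step that recovers the sparsity pattern can be sketched in isolation. Below is a minimal coordinate-descent Lasso on synthetic data (a generic sketch, not the SpKG implementation; all names and the test problem are illustrative):

```python
import random

def lasso_cd(X, y, lam, n_sweeps=200):
    # Coordinate-descent Lasso: minimizes (1/2n)||y - Xb||^2 + lam * ||b||_1
    # via soft-thresholding updates, one coordinate at a time.
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_sweeps):
        for j in range(p):
            # Partial residual with coordinate j left out.
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            # Soft-thresholding: small correlations are set exactly to zero.
            if rho > lam:
                b[j] = (rho - lam) / z
            elif rho < -lam:
                b[j] = (rho + lam) / z
            else:
                b[j] = 0.0
    return b

# Sparse ground truth: only coefficients 0 and 3 are active.
rng = random.Random(0)
n, p = 60, 6
true_b = [2.0, 0.0, 0.0, -1.5, 0.0, 0.0]
X = [[rng.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [sum(X[i][j] * true_b[j] for j in range(p)) + rng.gauss(0, 0.1)
     for i in range(n)]
b_hat = lasso_cd(X, y, lam=0.1)
support = [j for j in range(p) if abs(b_hat[j]) > 1e-6]
```

In the abstract's setting, the recovered support plays the role of the accessibility pattern, and the knowledge-gradient policy decides which region to probe next given the current sparse belief.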
Bayesian Inference in Processing Experimental Data: Principles and Basic Applications
This report introduces general ideas and some basic methods of the Bayesian
probability theory applied to physics measurements. Our aim is to make the
reader familiar, through examples rather than rigorous formalism, with concepts
such as: model comparison (including the automatic Ockham's Razor filter
provided by the Bayesian approach); parametric inference; quantification of the
uncertainty about the value of physical quantities, also taking into account
systematic effects; role of marginalization; posterior characterization;
predictive distributions; hierarchical modelling and hyperparameters; Gaussian
approximation of the posterior and recovery of conventional methods, especially
maximum likelihood and chi-square fits under well defined conditions; conjugate
priors, transformation invariance and maximum entropy motivated priors; Monte
Carlo estimates of expectation, including a short introduction to Markov Chain
Monte Carlo methods.
Comment: 40 pages, 2 figures, invited paper for Reports on Progress in Physics
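As a minimal illustration of the conjugate-prior idea listed above (a generic Beta-Binomial example, not taken from the report): with a Beta prior on a detection efficiency and binomial data, the posterior is again Beta, so the update is a one-line bookkeeping step.

```python
def beta_update(alpha, beta, successes, failures):
    # Conjugate update: Beta(alpha, beta) prior + binomial data
    # -> Beta(alpha + successes, beta + failures) posterior.
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    # Posterior mean of the efficiency parameter.
    return alpha / (alpha + beta)

a, b = 1.0, 1.0                   # uniform prior on the efficiency
a, b = beta_update(a, b, 45, 5)   # 45 detected out of 50 events (toy numbers)
post_mean = beta_mean(a, b)       # (1 + 45) / (1 + 45 + 1 + 5) = 46/52
```

The same pattern (prior hyperparameters incremented by sufficient statistics) is what makes conjugate families attractive for the physics measurements the report discusses.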
Belief Evolution Network-based Probability Transformation and Fusion
Smets proposes the Pignistic Probability Transformation (PPT) as the decision
layer in the Transferable Belief Model (TBM), which argues that when no
further information is available, a decision has to be made using a
Probability Mass Function (PMF). In this paper, the Belief Evolution Network (BEN) and the full causality
(PMF). In this paper, the Belief Evolution Network (BEN) and the full causality
function are proposed by introducing causality in Hierarchical Hypothesis Space
(HHS). Based on BEN, we interpret the PPT from an information fusion view and
propose a new Probability Transformation (PT) method called Full Causality
Probability Transformation (FCPT), which has better performance under
Bi-Criteria evaluation. In addition, we heuristically propose a new probability
fusion method based on FCPT. Compared with Dempster's Rule of Combination (DRC),
the proposed method produces more reasonable results when fusing the same evidence.
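The pignistic transformation that this line of work builds on has a compact closed form: BetP(x) is the sum over focal sets A containing x of m(A)/|A|. A minimal sketch of that baseline (a generic illustration with toy masses; the BEN/FCPT machinery itself is not reproduced here):

```python
def pignistic(m):
    # Pignistic transformation: BetP(x) = sum over focal sets A with x in A
    # of m(A) / |A|. Input m maps frozensets (focal elements, nonempty)
    # to masses summing to 1.
    betp = {}
    for focal_set, mass in m.items():
        share = mass / len(focal_set)  # mass split equally over the set
        for x in focal_set:
            betp[x] = betp.get(x, 0.0) + share
    return betp

# Toy belief assignment on the frame {a, b, c}.
m = {
    frozenset({"a"}): 0.5,
    frozenset({"a", "b"}): 0.3,
    frozenset({"a", "b", "c"}): 0.2,
}
p = pignistic(m)  # p["a"] = 0.5 + 0.3/2 + 0.2/3
```

The resulting BetP is an ordinary probability mass function, which is exactly the decision-layer object the TBM calls for; FCPT, as described in the abstract, replaces this equal split with causality-weighted shares.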
Computational statistics using the Bayesian Inference Engine
This paper introduces the Bayesian Inference Engine (BIE), a general
parallel, optimised software package for parameter inference and model
selection. This package is motivated by the analysis needs of modern
astronomical surveys and the need to organise and reuse expensive derived data.
The BIE is the first platform for computational statistics designed explicitly
to enable Bayesian update and model comparison for astronomical problems.
Bayesian update is based on the representation of high-dimensional posterior
distributions using metric-ball-tree based kernel density estimation. Among its
algorithmic offerings, the BIE emphasises hybrid tempered MCMC schemes that
robustly sample multimodal posterior distributions in high-dimensional
parameter spaces. Moreover, the BIE implements a full persistence or
serialisation system that stores the full byte-level image of the running
inference and previously characterised posterior distributions for later use.
Two new algorithms to compute the marginal likelihood from the posterior
distribution, developed for and implemented in the BIE, enable model comparison
for complex models and data sets. Finally, the BIE was designed to be a
collaborative platform for applying Bayesian methodology to astronomy. It
includes an extensible object-oriented framework that implements every
aspect of Bayesian inference. By providing a variety of
statistical algorithms for all phases of the inference problem, a scientist may
explore a variety of approaches with a single model and data implementation.
Additional technical details and download details are available from
http://www.astro.umass.edu/bie. The BIE is distributed under the GNU GPL.
Comment: Resubmitted version
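A single random-walk Metropolis chain, the building block underlying tempered MCMC schemes like those the BIE offers, can be sketched on a toy bimodal posterior (a generic, untempered illustration, not BIE code):

```python
import math
import random

def metropolis(logpost, x0, step, n_steps, rng):
    # Random-walk Metropolis: propose x' ~ N(x, step), accept with
    # probability min(1, exp(logpost(x') - logpost(x))).
    x, lp = x0, logpost(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0, step)
        lp_prop = logpost(prop)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy bimodal posterior: equal mixture of N(-3, 1) and N(3, 1).
def logpost(x):
    return math.log(0.5 * math.exp(-0.5 * (x + 3) ** 2) +
                    0.5 * math.exp(-0.5 * (x - 3) ** 2))

rng = random.Random(0)
draws = metropolis(logpost, 0.0, 4.0, 20000, rng)
frac_right = sum(d > 0 for d in draws) / len(draws)  # near 0.5 if both modes are visited
```

Here a wide proposal lets one chain hop between modes; in high dimensions that fails, which is precisely why the BIE emphasises hybrid tempered schemes that flatten the posterior at high temperatures and exchange states between chains.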