RMCMC: A System for Updating Bayesian Models
A system to update estimates from a sequence of probability distributions is
presented. The aim of the system is to quickly produce estimates with a
user-specified bound on the Monte Carlo error. The estimates are based upon
weighted samples stored in a database. The stored samples are maintained so
that the accuracy of the estimates and the quality of the samples remain satisfactory.
This maintenance involves varying the number of samples in the database and
updating their weights. New samples are generated, when required, by a Markov
chain Monte Carlo algorithm. The system is demonstrated using a football league
model that predicts the end-of-season table. The correctness and accuracy of the
estimates are verified in a simulation using a linear Gaussian model.
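The accuracy check described above can be sketched as follows: compute a weighted Monte Carlo estimate, a rough standard-error proxy based on the effective sample size, and trigger new MCMC sampling when the error exceeds the user-specified bound. The formulas here are a generic illustration of this kind of maintenance loop, not the paper's actual criteria.

```python
import math

def weighted_estimate(samples, weights):
    """Weighted Monte Carlo estimate with a rough standard-error proxy
    based on the effective sample size (ESS). Illustrative only."""
    W = sum(weights)
    mean = sum(w * x for x, w in zip(samples, weights)) / W
    var = sum(w * (x - mean) ** 2 for x, w in zip(samples, weights)) / W
    # ESS: number of equally weighted samples carrying the same information
    ess = W * W / sum(w * w for w in weights)
    se = math.sqrt(var / ess)
    return mean, se

def needs_more_samples(se, bound):
    # Request fresh MCMC samples when the error proxy exceeds the bound.
    return se > bound
```

With equal weights the estimate reduces to the ordinary sample mean and the ESS equals the number of samples, which is a useful sanity check on the formulas.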
The chopthin algorithm for resampling
Resampling is a standard step in particle filters and, more generally, in
sequential Monte Carlo methods. We present an algorithm, called chopthin, for
resampling weighted particles. In contrast to standard resampling methods, the
algorithm does not produce a set of equally weighted particles; instead it
merely enforces an upper bound on the ratio between the weights. Simulation
studies show that the chopthin algorithm consistently outperforms standard
resampling methods. The algorithm chops up particles with large weight and
thins out particles with low weight, hence its name. It implicitly guarantees a
lower bound on the effective sample size. The algorithm can be implemented
efficiently, making it practically useful. We show that the expected
computational effort is linear in the number of particles. Implementations for
C++, R (on CRAN), Python and Matlab are available.
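A minimal illustration of the chop/thin idea (a simplified sketch, not the published algorithm: the thresholds and the thinning rule here are assumptions): heavy particles are split into equally weighted copies, light particles are kept stochastically at a floor weight, and the result respects an upper bound on the weight ratio.

```python
import random

def chopthin_sketch(particles, weights, eta=4.0):
    """Illustrative chop/thin resampler (assumed thresholds, eta >= 4):
    split heavy particles, stochastically keep light ones, so that
    max(w) / min(w) <= eta afterwards."""
    mean_w = sum(weights) / len(weights)
    hi = mean_w * eta ** 0.5   # chop threshold (assumption)
    lo = mean_w / eta ** 0.5   # thin threshold (assumption)
    new_p, new_w = [], []
    for x, w in zip(particles, weights):
        if w > hi:
            # chop: split into k equally weighted copies, each <= hi
            k = int(w / hi) + 1
            new_p.extend([x] * k)
            new_w.extend([w / k] * k)
        elif w < lo:
            # thin: keep with probability w/lo at floor weight lo
            # (unbiased: expected weight is w)
            if random.random() < w / lo:
                new_p.append(x)
                new_w.append(lo)
        else:
            new_p.append(x)
            new_w.append(w)
    return new_p, new_w
```

Unlike multinomial or systematic resampling, untouched particles keep their exact weights, which is the source of the variance reduction the abstract reports.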
Managing data quality in the health care industry: Some critical issues
Data quality analysis in health care is a large and ongoing problem. Because the scope of health care information systems is extremely broad, and because most such systems have either a direct or indirect impact on the provision of health care, their quality is a topic of critical importance for information management professionals. Of vital concern is the data flow from the point of patient service delivery to an end point of decision support and analysis of the data derived from the encounter. The historical use of these data, their current uses, and how industry dynamics have created a need for improved data quality provide the context for a discussion of health care data quality. We address the definition of health care data, the assessment of data quality, and the need for effective data quality management. Next, the paper reviews some popular methods employed in the health care industry. We then propose an integrated framework to improve data quality in health care, and finally summarize the conclusions of this study.
Plateau Proposal Distributions for Adaptive Component-wise Multiple-Try Metropolis
Markov chain Monte Carlo (MCMC) methods are sampling methods that have become
a commonly used tool in statistics, for example to perform Monte Carlo
integration. As a consequence of the increase in computational power, many
variations of MCMC methods exist for generating samples from arbitrary,
possibly complex, target distributions. The performance of an MCMC method is
predominantly governed by the choice of the so-called proposal distribution. In
this paper, we introduce a new type of proposal distribution for use in MCMC
methods that operates component-wise and with multiple trials per iteration.
Specifically, the proposal distributions in the novel class, called Plateau
distributions, do not overlap, thus ensuring that the multiple trials are drawn
from different regions of the state space. Furthermore, the Plateau proposal
distributions allow for a bespoke adaptation procedure that lends itself to a
Markov chain with efficient, problem-dependent state-space exploration and
improved burn-in properties. Simulation studies show that our novel MCMC
algorithm outperforms competitors when sampling from distributions with a
complex shape, highly correlated components or multiple modes.
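The non-overlapping property can be pictured with uniform "plateau" bands: each trial is drawn from a distinct band of offsets around the current component, so no two trials come from the same region of the state space. The band widths and the symmetric placement below are illustrative assumptions, not the paper's construction.

```python
import random

def plateau_trials(x, widths=(0.5, 1.0, 2.0)):
    """Sketch of non-overlapping 'plateau' proposals for one component:
    trial k is drawn uniformly from the band of offsets
    [edges[k], edges[k+1]) around x, on a randomly chosen side.
    Band widths are illustrative assumptions."""
    edges = [0.0]
    for w in widths:
        edges.append(edges[-1] + w)   # cumulative band boundaries
    trials = []
    for k in range(len(widths)):
        offset = random.uniform(edges[k], edges[k + 1])
        side = random.choice([-1.0, 1.0])
        trials.append(x + side * offset)
    return trials
```

Because the bands are disjoint by construction, the trials are guaranteed to probe different distances from the current state, which mirrors the multiple-try behaviour the abstract describes.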
CORPORATE SOCIAL RESPONSIBILITY AND GREEN IT: THE LINKAGE AND CASE ANALYSIS
Corporate social responsibility (CSR) and Green information technology (Green IT) are two important disciplines that can work cooperatively toward a common goal of achieving environmental sustainability and, ultimately, sustainability in society. This study discusses a method of value model analysis that combines the operational procedures of CSR and Green IT. A case study is adopted to illustrate the four-stage value creation process.
Service Fairness and Customer Satisfaction in Internet Bank: From a Trust and Perceived Customer Value Perspective
Recent research has found that fairness plays a key role in customer satisfaction. How fairness operates in an online context and influences customer satisfaction, however, has yet to be investigated. This research examines satisfaction through a fairness lens and explores the mediators between fairness and satisfaction in the internet banking context. A total of 131 surveys were analyzed, and the results show that in internet banking, fairness (comprising distributive, procedural and informational fairness) is positively related to customer satisfaction. Trust and perceived value are identified as two key mediators between fairness and customer satisfaction.