Sufficient burn-in for Gibbs samplers for a hierarchical random effects model
We consider Gibbs and block Gibbs samplers for a Bayesian hierarchical
version of the one-way random effects model. Drift and minorization conditions
are established for the underlying Markov chains. The drift and minorization
are used in conjunction with results from J. S. Rosenthal [J. Amer. Statist.
Assoc. 90 (1995) 558-566] and G. O. Roberts and R. L. Tweedie [Stochastic
Process. Appl. 80 (1999) 211-229] to construct analytical upper bounds on the
distance to stationarity. These lead to upper bounds on the amount of burn-in
that is required to get the chain within a prespecified (total variation)
distance of the stationary distribution. The results are illustrated with a
numerical example.
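As a toy illustration of the kind of chain such burn-in bounds apply to, the sketch below runs a two-block Gibbs sampler for a simplified one-way random effects model and discards a fixed burn-in. This is not the authors' sampler: the variances are assumed known and the prior on mu is taken flat, so both full conditionals are Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simplified one-way random effects model (illustrative stand-in for the
# paper's hierarchy): y_ij = theta_i + e_ij, theta_i ~ N(mu, tau2),
# e_ij ~ N(0, sig2), flat prior on mu, variances known.
K, n_i = 10, 20
mu_true, sig2, tau2 = 2.0, 1.0, 1.0
theta_true = mu_true + rng.normal(scale=np.sqrt(tau2), size=K)
y = theta_true[:, None] + rng.normal(scale=np.sqrt(sig2), size=(K, n_i))
ybar = y.mean(axis=1)

# Two-block Gibbs sampler: draw theta | mu, y, then mu | theta.
burn_in, n_keep = 500, 2000
mu, theta = 0.0, np.zeros(K)
mu_draws = np.empty(n_keep)
prec = n_i / sig2 + 1.0 / tau2               # conditional precision of each theta_i
for t in range(burn_in + n_keep):
    mean_theta = (n_i * ybar / sig2 + mu / tau2) / prec
    theta = mean_theta + rng.normal(size=K) / np.sqrt(prec)
    mu = theta.mean() + rng.normal() * np.sqrt(tau2 / K)
    if t >= burn_in:                         # keep only post-burn-in draws
        mu_draws[t - burn_in] = mu
```

The analytical bounds in the paper tell you how large `burn_in` must be for a prescribed total variation distance; the fixed value above is purely illustrative.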
Insurance: an R-Program to Model Insurance Data
Data sets from car insurance companies often have a high-dimensional complex dependency structure. The use of classical statistical methods such as generalized linear models or Tweedie's compound Poisson model can lead to problems in this case. Christmann (2004) proposed a general approach to model the pure premium by exploiting characteristic features of such data sets. In this paper we describe a program to use this approach based on a combination of multinomial logistic regression and ε-support vector regression from modern statistical machine learning. Keywords: claim size, insurance tariff, logistic regression, statistical machine learning, support vector regression.
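A minimal sketch of the two-stage pure-premium idea, on synthetic data. To keep the example self-contained, the multinomial logistic step is reduced to a binary claim/no-claim logit fitted by Newton-Raphson, and ordinary least squares on log claim size stands in for the ε-support vector regression; all data and parameters are illustrative assumptions, not the authors' R program.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic portfolio: one rating feature x, claim indicator, claim size.
n = 2000
x = rng.normal(size=n)
p_claim = 1.0 / (1.0 + np.exp(-(-2.0 + 1.0 * x)))   # true claim probability
claim = rng.random(n) < p_claim
size = np.where(claim, np.exp(1.0 + 0.5 * x + rng.normal(scale=0.3, size=n)), 0.0)

# Stage 1: claim-probability model (binary logit as a stand-in for the
# multinomial logistic step), fitted by Newton-Raphson.
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-(X @ beta)))
    W = p * (1.0 - p)
    beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (claim - p))

# Stage 2: severity model on policies with a claim (OLS on log size as a
# self-contained stand-in for eps-support vector regression).
gamma = np.linalg.lstsq(X[claim], np.log(size[claim]), rcond=None)[0]

# Pure premium per policy = P(claim | x) * E[claim size | claim, x].
p_hat = 1.0 / (1.0 + np.exp(-(X @ beta)))
pure_premium = p_hat * np.exp(X @ gamma)
```

The point is the decomposition: a classification model for claim occurrence and a regression model for claim size, combined multiplicatively into the premium.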
Alien Registration- Tweedie, S. Louise (Greenville, Piscataquis County)
https://digitalmaine.com/alien_docs/8505/thumbnail.jp
Causality and Association: The Statistical and Legal Approaches
This paper discusses different needs and approaches to establishing
``causation'' that are relevant in legal cases involving statistical input
based on epidemiological (or more generally observational or population-based)
information. We distinguish between three versions of ``cause'': the first
involves negligence in providing or allowing exposure, the second involves
``cause'' as it is shown through a scientifically proved increased risk of an
outcome from the exposure in a population, and the third considers ``cause'' as
it might apply to an individual plaintiff based on the first two. The
population-oriented ``cause'' is that commonly addressed by statisticians, and
we propose a variation on the Bradford Hill approach to testing such causality
in an observational framework, and discuss how such a systematic series of
tests might be considered in a legal context. We review some current legal
approaches to using probabilistic statements, and link these with the
scientific methodology as developed here. In particular, we provide an approach
both to the idea of individual outcomes being caused on a balance of
probabilities, and to the idea of material contribution to such outcomes.
Statistical terminology and legal usage of terms such as ``proof on the balance
of probabilities'' or ``causation'' can easily become confused, largely due to
similar language describing dissimilar concepts; we conclude, however, that a
careful analysis can identify and separate those areas in which a legal
decision alone is required and those areas in which scientific approaches are
useful.

Comment: Published at http://dx.doi.org/10.1214/07-STS234 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
On Poisson–Tweedie mixtures
Poisson-Tweedie mixtures are the Poisson mixtures for which the mixing measure is generated by those members of the family of Tweedie distributions whose support is non-negative. This class of non-negative integer-valued distributions comprises the Neyman type A, back-shifted negative binomial, compound Poisson-negative binomial, discrete stable and exponentially tilted discrete stable laws. For a specific value of the "power" parameter associated with the corresponding Tweedie distributions, such mixtures comprise an additive exponential dispersion model. We derive closed-form expressions for the related variance functions in terms of the exponential tilting invariants and particular special functions. We compare specific Poisson-Tweedie models with the corresponding Hinde-Demétrio exponential dispersion models, which possess a comparable unit variance function. We construct numerous local approximations for specific subclasses of Poisson-Tweedie mixtures and identify the Lévy measure for all the members of this three-parameter family.
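The gamma member of the Tweedie family (power parameter p = 2) gives the best-known case: a Poisson mixture with a gamma mixing distribution is negative binomial, with the overdispersed variance mu + mu²/a. A minimal simulation sketch with illustrative parameters checks this relation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Poisson mixture with a gamma mixing distribution (the Tweedie member
# with power p = 2): N | Lambda ~ Poisson(Lambda), Lambda ~ Gamma(a, scale=s).
a, s = 3.0, 2.0
m = 200_000
lam = rng.gamma(shape=a, scale=s, size=m)
counts = rng.poisson(lam)

# The counts are negative binomial with mean mu = a*s and
# variance mu + mu^2 / a (the extra term comes from the mixing).
mu = a * s
var_theory = mu + mu**2 / a
print(counts.mean(), mu)          # sample mean vs. theoretical mean
print(counts.var(), var_theory)   # sample variance vs. mu + mu^2/a
```

The other mixing laws listed in the abstract (discrete stable, tilted discrete stable, etc.) arise in the same way from other Tweedie power values.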
A generalized Fellner-Schall method for smoothing parameter estimation with application to Tweedie location, scale and shape models
We consider the estimation of smoothing parameters and variance components in
models with a regular log likelihood subject to quadratic penalization of the
model coefficients, via a generalization of the method of Fellner (1986) and
Schall (1991). In particular: (i) we generalize the original method to the case
of penalties that are linear in several smoothing parameters, thereby covering
the important cases of tensor product and adaptive smoothers; (ii) we show why
the method's steps increase the restricted marginal likelihood of the model and
why it tends to converge faster than the EM algorithm and obvious accelerations
thereof, and investigate its relation to Newton optimization;
(iii) we generalize the method to any Fisher regular likelihood. The method
represents a considerable simplification over existing methods of estimating
smoothing parameters in the context of regular likelihoods, without sacrificing
generality: for example, it is only necessary to compute with the same first
and second derivatives of the log-likelihood required for coefficient
estimation, and not with the third or fourth order derivatives required by
alternative approaches. Examples are provided which would have been impossible
or impractical with pre-existing Fellner-Schall methods, along with an example
of a Tweedie location, scale and shape model which would be a challenge for
alternative methods.
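The original Fellner (1986)/Schall (1991) fixed point that the paper generalises can be sketched for the simplest case, a single ridge penalty in a Gaussian model. The data and starting value below are hypothetical; the update re-estimates the two variance components from the current fit and sets lam = sigma²/tau², using only the quantities already needed for coefficient estimation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ridge-penalised Gaussian regression: y = X b + e, penalty lam * b'b,
# equivalently b ~ N(0, tau2 * I) with lam = sigma2 / tau2.
n, p = 300, 20
X = rng.normal(size=(n, p))
b_true = rng.normal(scale=0.5, size=p)
y = X @ b_true + rng.normal(scale=1.0, size=n)

XtX, Xty = X.T @ X, X.T @ y
lam = 1.0                                    # hypothetical starting value
for _ in range(100):
    A = np.linalg.inv(XtX + lam * np.eye(p))
    b_hat = A @ Xty
    edf = np.trace(A @ XtX)                  # effective degrees of freedom
    sigma2 = np.sum((y - X @ b_hat) ** 2) / (n - edf)   # residual variance
    tau2 = (b_hat @ b_hat) / edf             # Fellner-Schall variance update
    lam_new = sigma2 / tau2
    if abs(lam_new - lam) < 1e-8 * lam:      # stop at the fixed point
        lam = lam_new
        break
    lam = lam_new
```

The paper's contribution is to extend this kind of update to penalties that are linear in several smoothing parameters and to any Fisher regular likelihood, still using only first and second derivatives.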