54,446 research outputs found

    A Maximum Entropy Procedure to Solve Likelihood Equations

    In this article, we provide initial findings on the problem of solving likelihood equations by means of a maximum entropy (ME) approach. Unlike standard procedures that require setting the score function of the maximum likelihood problem to zero, we propose an alternative strategy in which the score is instead used as an external informative constraint in the maximization of Shannon's entropy function (a convex optimization problem). The approach reparameterizes the score parameters as expected values of discrete probability distributions whose probabilities need to be estimated. This leads to a simpler problem in which the parameters are sought in a smaller (hyper)simplex space. We assessed our proposal by means of empirical case studies and a simulation study, the latter involving the most critical case of logistic regression under data separation. The results suggested that the maximum entropy reformulation of the score problem solves the likelihood equations. Moreover, when maximum likelihood estimation is difficult, as in logistic regression under separation, the maximum entropy proposal achieved results numerically comparable to those obtained by Firth's bias-corrected approach. Overall, these first findings indicate that a maximum entropy solution can be considered an alternative technique for solving the likelihood equations.
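
    As a concrete illustration of the reparameterization described above, here is a minimal Python sketch (not the authors' implementation): each coefficient is rewritten as the mean of a discrete pmf over an assumed support grid, and Shannon entropy is maximized subject to the score equations as constraints. The logistic model, the grid z, and the toy data are all assumptions made for illustration.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        X = np.column_stack([np.ones(50), rng.normal(size=50)])  # toy design matrix
        y = (rng.random(50) < 0.5).astype(float)                 # toy binary response

        z = np.linspace(-5.0, 5.0, 7)  # assumed support grid for each parameter
        d = X.shape[1]

        def theta_from_p(p):
            # Reparameterization: theta_j = sum_k p_jk z_k, each row of p a pmf.
            return p.reshape(d, -1) @ z

        def score(theta):
            # Score (gradient of the log-likelihood) for logistic regression.
            mu = 1.0 / (1.0 + np.exp(-X @ theta))
            return X.T @ (y - mu)

        def neg_entropy(p):
            # Maximizing Shannon entropy H(p) = -sum p log p <=> minimizing -H(p).
            return np.sum(p * np.log(p + 1e-12))

        cons = [{"type": "eq", "fun": lambda p: score(theta_from_p(p))},              # score = 0
                {"type": "eq", "fun": lambda p: p.reshape(d, -1).sum(axis=1) - 1.0}]  # pmf rows
        p0 = np.full(d * z.size, 1.0 / z.size)  # uniform start: the max-entropy point
        res = minimize(neg_entropy, p0, method="SLSQP",
                       bounds=[(0.0, 1.0)] * p0.size, constraints=cons)
        print("ME estimate:", theta_from_p(res.x))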

    Selection of proposal distributions for generalized importance sampling estimators

    The standard importance sampling (IS) estimator generally does not work well in examples involving simultaneous inference on several targets, as the importance weights can take arbitrarily large values, making the estimator highly unstable. In such situations, alternative generalized IS estimators involving samples from multiple proposal distributions are preferred. Just like standard IS, the success of these multiple-proposal IS estimators crucially depends on the choice of the proposal distributions, and that choice is the focus of this article. We propose three methods based on (i) a geometric space-filling coverage criterion, (ii) a minimax variance approach, and (iii) a maximum entropy approach. The first two methods are applicable to any multi-proposal IS estimator, whereas the third is described in the context of Doss's (2010) two-stage IS estimator. For the first method we propose a suitable measure of coverage based on the symmetric Kullback-Leibler divergence, while the second and third approaches use estimates of the asymptotic variances of Doss's (2010) IS estimator and Geyer's (1994) reverse logistic estimator, respectively. Accordingly, we provide consistent spectral variance estimators for these asymptotic variances. The proposed methods for selecting proposal densities are illustrated with various detailed examples.
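
    To give a feel for the first criterion, here is a toy Python sketch (an illustration, not the authors' algorithm): Gaussian proposal means are chosen greedily so that each new proposal lies as far as possible, in symmetric Kullback-Leibler divergence, from those already selected. The Gaussian family, candidate grid, and greedy maximin rule are all assumptions.

        import numpy as np

        def sym_kl(m1, s1, m2, s2):
            # Symmetric KL divergence between N(m1, s1^2) and N(m2, s2^2).
            kl12 = np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5
            kl21 = np.log(s1 / s2) + (s2**2 + (m1 - m2)**2) / (2 * s1**2) - 0.5
            return kl12 + kl21

        def greedy_maximin(candidates, k, sigma=1.0):
            # Space-filling selection: repeatedly add the candidate mean
            # farthest (in symmetric KL) from the proposals already chosen.
            chosen = [candidates[0]]
            while len(chosen) < k:
                gap = [min(sym_kl(c, sigma, m, sigma) for m in chosen)
                       for c in candidates]
                chosen.append(candidates[int(np.argmax(gap))])
            return chosen

        print(greedy_maximin(np.linspace(-4.0, 4.0, 81), k=5))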

    Partially Adaptive Estimation via Maximum Entropy Densities

    We propose a partially adaptive estimator based on information-theoretic maximum entropy estimates of the error distribution. Maximum entropy (maxent) densities have simple yet flexible functional forms that nest most common mathematical distributions. Unlike nonparametric, fully adaptive estimators, our parametric estimators do not involve choosing a bandwidth or trimming, and only require estimating a small number of nuisance parameters, which is desirable when the sample size is small. Monte Carlo simulations suggest that the proposed estimators fare well with non-normal error distributions. When the errors are normal, the efficiency loss due to redundant nuisance parameters is negligible because the proposed error densities nest the normal. The proposed partially adaptive estimator compares favorably with existing methods, especially when the sample size is small. We apply the estimator to a bio-pharmaceutical example and a stochastic frontier model.
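
    To give a rough sense of what a maxent density looks like in practice, here is a small Python sketch (an illustration, not the paper's estimator): an exponential-family density proportional to exp(sum_j lambda_j x^j), fitted by matching the first few sample moments through the convex dual problem. The polynomial constraints, integration grid, and Student-t toy errors are assumptions.

        import numpy as np
        from scipy.integrate import trapezoid
        from scipy.optimize import minimize

        def fit_maxent(sample, order=4):
            # Fit f(x) ~ exp(sum_j lam_j x^j) by matching the first `order`
            # sample moments; the dual objective is log-partition minus moments.
            grid = np.linspace(sample.min() - 1.0, sample.max() + 1.0, 2001)
            powers = np.arange(1, order + 1)
            m_hat = np.array([np.mean(sample**j) for j in powers])  # empirical moments
            feats = grid[:, None] ** powers                         # x^j on the grid

            def dual(lam):
                a = feats @ lam
                a_max = a.max()  # log-sum-exp shift for numerical stability
                return a_max + np.log(trapezoid(np.exp(a - a_max), grid)) - lam @ m_hat

            lam = minimize(dual, np.zeros(order), method="BFGS").x
            dens = np.exp(feats @ lam - (feats @ lam).max())
            return grid, dens / trapezoid(dens, grid)               # normalized density

        rng = np.random.default_rng(1)
        grid, dens = fit_maxent(rng.standard_t(df=5, size=500))     # heavy-tailed errors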

    Heisenberg-style bounds for arbitrary estimates of shift parameters including prior information

    A rigorous lower bound is obtained for the average resolution of any estimate of a shift parameter, such as an optical phase shift or a spatial translation. The bound has the asymptotic form k_I/\langle 2|G| \rangle, where G is the generator of the shift (with an arbitrary discrete or continuous spectrum), and hence establishes a universally applicable bound of the same form as the usual Heisenberg limit. The scaling constant k_I depends on prior information about the shift parameter. For example, in phase sensing regimes, where the phase shift is confined to some small interval of length L, the relative resolution \delta\hat{\Phi}/L has the strict lower bound (2\pi e^3)^{-1/2}/\langle 2m|G_1| + 1 \rangle, where m is the number of probes, each with generator G_1, and entangling joint measurements are permitted. Generalisations using other resource measures and including noise are briefly discussed. The results rely on the derivation of general entropic uncertainty relations for continuous observables, which are of interest in their own right.
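
    Since the angle brackets above denote expectation values, the phase-sensing bound can be unpacked to show its Heisenberg-type scaling in the probe number m explicitly (a restatement of the abstract's formula, not an additional result):

        \[ \frac{\delta\hat{\Phi}}{L} \;\geq\; \frac{(2\pi e^{3})^{-1/2}}{\langle 2m|G_{1}| + 1 \rangle} \;=\; \frac{(2\pi e^{3})^{-1/2}}{2m\,\langle |G_{1}| \rangle + 1} \;\sim\; \frac{1}{m}. \]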

    An extended orthogonal forward regression algorithm for system identification using entropy

    In this paper, a fast algorithm for nonlinear dynamic stochastic system identification is presented. The algorithm extends the classical Orthogonal Forward Regression (OFR) algorithm so that, instead of using the Error Reduction Ratio (ERR) for term selection, a new optimality criterion, Shannon's Entropy Power Reduction Ratio (EPRR), is introduced to deal with both Gaussian and non-Gaussian signals. It is shown that the new algorithm is both fast and reliable, and examples are provided to illustrate the effectiveness of the new approach.
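
    A minimal Python sketch of the general idea follows (a paraphrase; the paper's EPRR is a specific reduction-ratio statistic, simplified here to "keep the term that most lowers the residual's entropy power," with the entropy estimated by a crude histogram plug-in):

        import numpy as np

        def entropy_power(e, bins=32):
            # Entropy power N(e) = exp(2 h(e)) / (2 pi e), with the differential
            # entropy h(e) estimated by a histogram plug-in.
            p, edges = np.histogram(e, bins=bins, density=True)
            w = np.diff(edges)
            mask = p > 0
            h = -np.sum(p[mask] * np.log(p[mask]) * w[mask])
            return np.exp(2.0 * h) / (2.0 * np.pi * np.e)

        def ofr_entropy(X, y, n_terms):
            # Greedy forward selection: orthogonalize each unused candidate term
            # against those already chosen (Gram-Schmidt), then keep the term
            # whose inclusion leaves the residual with the least entropy power.
            selected, Q, resid = [], [], y.astype(float).copy()
            for _ in range(n_terms):
                best_j, best_n, best_qr = None, np.inf, None
                for j in range(X.shape[1]):
                    if j in selected:
                        continue
                    q = X[:, j].astype(float).copy()
                    for qi in Q:
                        q -= (qi @ q) / (qi @ qi) * qi
                    if q @ q < 1e-10:
                        continue
                    r = resid - (q @ resid) / (q @ q) * q
                    n = entropy_power(r)
                    if n < best_n:
                        best_j, best_n, best_qr = j, n, (q, r)
                selected.append(best_j)
                Q.append(best_qr[0])
                resid = best_qr[1]
            return selected

        rng = np.random.default_rng(2)
        X = rng.normal(size=(200, 8))
        y = 2 * X[:, 1] - X[:, 4] + 0.1 * rng.standard_t(df=3, size=200)
        print(ofr_entropy(X, y, n_terms=2))  # should recover terms 1 and 4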

    Entropic Heisenberg limits and uncertainty relations from the Holevo information bound

    Strong and general entropic and geometric Heisenberg limits are obtained for estimates of multiparameter unitary displacements in quantum metrology, such as the estimation of a magnetic field from the induced rotation of a probe state in three dimensions. A key ingredient is the Holevo bound on the Shannon mutual information of a quantum communication channel. This leads to a Bayesian bound on performance, in terms of the prior distribution of the displacement and the asymmetry of the input probe state with respect to the displacement group. A geometric measure of performance related to entropy is proposed for general parameter estimation. It is also shown how strong entropic uncertainty relations for mutually unbiased observables, such as number and phase, position and momentum, energy and time, and orthogonal spin-1/2 directions, can be obtained from elementary applications of Holevo's bound. A geometric interpretation of the results is emphasised, in terms of the 'volumes' of quantum and classical statistical ensembles.
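
    For reference, the Holevo bound invoked above has the standard form (a textbook statement, not quoted from the paper): for an ensemble \{p_x, \rho_x\} of signal states and any measurement with outcome Y,

        \[ I(X\!:\!Y) \;\leq\; \chi \;=\; S\Big(\sum_x p_x \rho_x\Big) - \sum_x p_x\, S(\rho_x), \]

    where S denotes the von Neumann entropy.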