
    On the "Poisson Trick" and its Extensions for Fitting Multinomial Regression Models

    This article is concerned with the fitting of multinomial regression models using the so-called "Poisson Trick". The work is motivated by Chen & Kuo (2001) and Malchow-Møller & Svarer (2003), which have been criticized for being computationally inefficient and for sometimes producing nonsensical results. We first discuss the case of independent data and offer a parsimonious fitting strategy when all covariates are categorical. We then propose a new approach for modelling correlated responses based on an extension of the Gamma-Poisson model, in which the likelihood can be expressed in closed form. The parameters are estimated via an Expectation/Conditional Maximization (ECM) algorithm, which can be implemented using the functions for fitting generalized linear models readily available in standard statistical software packages. Compared with existing methods, our approach avoids the need to approximate intractable integrals, so the inference is exact with respect to the approximating Gamma-Poisson model. The proposed method is illustrated via a reanalysis of the yogurt data discussed by Chen & Kuo (2001).
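
    For reference, the classical Poisson trick for independent data with a single categorical covariate can be sketched very compactly: the baseline-category multinomial logit of the response given the covariate is equivalent to a Poisson log-linear model on the corresponding contingency table, with the interaction terms acting as the logit coefficients. The sketch below is a minimal Python illustration of that baseline idea only, using toy data and the statsmodels formula API; it is not the Gamma-Poisson/ECM extension proposed in the article.

        # Minimal sketch of the classical "Poisson Trick" for independent data
        # with one categorical covariate x and a 3-level response y. Toy data;
        # not the article's Gamma-Poisson/ECM method for correlated responses.
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        df = pd.DataFrame({
            "x": ["a", "a", "b", "b", "a", "b", "a", "b"],
            "y": ["c1", "c2", "c1", "c3", "c3", "c2", "c1", "c1"],
        })

        # Aggregate to counts per covariate pattern and response category
        # (zero cells are kept, which matters for the equivalence).
        counts = (pd.crosstab(df["x"], df["y"])
                    .stack().rename("freq").reset_index())

        # Surrogate Poisson log-linear fit; the C(x):C(y) interaction terms
        # correspond to the covariate effects of the multinomial logit with
        # baseline category c1.
        fit = smf.glm("freq ~ C(x) * C(y)", data=counts,
                      family=sm.families.Poisson()).fit()
        print(fit.params.filter(like=":"))

    With continuous covariates the surrogate Poisson model instead needs one nuisance intercept per observation, which is where much of the computational burden in the general formulation comes from; the aggregation over covariate patterns shown here is presumably the kind of parsimony the abstract alludes to for all-categorical covariates.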

    Using ultra-short pulses to determine particle size and density distributions

    We analyze the time-dependent response of strongly scattering media (SSM) to ultra-short pulses of light. A random-walk technique is used to model the optical scattering of ultra-short pulses of light propagating through media with random particle shapes and various packing densities. The pulse spreading was found to depend strongly on the average particle size, the particle size distribution, and the packing fraction. We also show that the intensity as a function of time delay can be used to analyze the particle size distribution and packing fraction of an optically thick sample, independently of the presence of absorption features. Finally, we propose a new way to measure the shape of ultra-short pulses that have propagated through an SSM. Comment: 15 pages, 29 figures; accepted for publication in Optics Express; the full reference will be added when it is available.
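
    The random-walk picture behind this abstract can be illustrated with a toy Monte Carlo: photons take exponentially distributed free paths between scattering events, and the distribution of total path length through a slab sets the arrival-time spread of a transmitted ultra-short pulse. The slab thickness, mean free path, and isotropic scattering used below are placeholder assumptions for illustration, not the paper's model of particle shapes, size distributions, and packing.

        # Toy isotropic random walk of photons through a scattering slab.
        # Mean free path and thickness are illustrative placeholders; denser
        # packing or larger particles would change the mean free path.
        import numpy as np

        rng = np.random.default_rng(0)
        c = 3e8             # speed of light in the medium (m/s), assumed constant
        mfp = 50e-6         # mean free path (m)
        thickness = 0.5e-3  # slab thickness (m)
        n_photons = 5000

        arrival_times = []
        for _ in range(n_photons):
            z, path, mu = 0.0, 0.0, 1.0      # depth, path length, cos(direction)
            while 0.0 <= z < thickness:
                step = rng.exponential(mfp)  # free path to the next scatterer
                z += mu * step
                path += step
                mu = rng.uniform(-1.0, 1.0)  # isotropic redirection
            if z >= thickness:               # keep transmitted photons only
                arrival_times.append(path / c)

        arrival_times = np.array(arrival_times)
        print(f"mean delay {1e12 * arrival_times.mean():.1f} ps, "
              f"rms spread {1e12 * arrival_times.std():.1f} ps")

    Histogramming arrival_times gives the time-of-flight profile whose shape the abstract relates to the particle size distribution and packing fraction.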

    Prototype selection for parameter estimation in complex models

    Parameter estimation in astrophysics often requires the use of complex physical models. In this paper we study the problem of estimating the parameters that describe the star formation history (SFH) of galaxies. Here, high-dimensional spectral data from galaxies are appropriately modeled as linear combinations of physical components, called simple stellar populations (SSPs), plus some nonlinear distortions. Theoretical data for each SSP are produced for a fixed parameter vector via computer modeling. Although the parameters that define each SSP are continuous, optimizing the signal model over a large set of SSPs on a fine parameter grid is computationally infeasible and inefficient. The goal of this study is to estimate the set of parameters that describes the SFH of each galaxy. These target parameters, such as the average ages and chemical compositions of the galaxy's stellar populations, are derived from the SSP parameters and the component weights in the signal model. Here, we introduce a principled approach for choosing a small basis of SSP prototypes for SFH parameter estimation. The basic idea is to quantize the vector space and effective support of the model components. In addition to greater computational efficiency, we achieve better estimates of the SFH target parameters. In simulations, our proposed quantization method obtains a substantial improvement in estimating the target parameters over the common method of employing a parameter grid. Sparse coding techniques are not appropriate for this problem without proper constraints, while constrained sparse coding methods perform poorly for parameter estimation because their objective is signal reconstruction, not estimation of the target parameters. Comment: Published at http://dx.doi.org/10.1214/11-AOAS500 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
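
    The quantization idea can be illustrated with a generic stand-in: compress a dense grid of SSP-like template spectra into a few prototypes by vector quantization, then fit non-negative component weights to an observed spectrum. The k-means/NNLS combination and the synthetic "spectra" below are assumptions for illustration, not the paper's specific quantization procedure.

        # Generic illustration of prototype selection by vector quantization:
        # k-means compresses a dense template grid into a few prototypes, and
        # NNLS fits non-negative weights to an observed spectrum. Synthetic
        # Gaussian-bump "spectra" stand in for real SSP templates.
        import numpy as np
        from scipy.optimize import nnls
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        n_grid, n_wave = 500, 200                 # dense grid size, wavelength bins
        wave = np.linspace(0.0, 1.0, n_wave)
        peak = rng.random((n_grid, 1))            # toy stand-in for SSP parameters
        ssp_grid = np.exp(-0.5 * ((wave - peak) / 0.05) ** 2)  # template "spectra"

        # Quantize the dense template grid into k prototype spectra.
        k = 10
        prototypes = KMeans(n_clusters=k, random_state=0).fit(ssp_grid).cluster_centers_

        # Observed spectrum as a noisy non-negative mix of two prototypes
        # (the nonlinear distortions mentioned in the abstract are ignored here).
        true_w = np.zeros(k)
        true_w[[2, 7]] = [0.6, 0.4]
        observed = true_w @ prototypes + 0.01 * rng.standard_normal(n_wave)

        weights, _ = nnls(prototypes.T, observed)
        print("recovered weights:", np.round(weights, 2))

    The point made in the abstract is that a small, well-chosen prototype basis of this kind keeps the subsequent optimization tractable while improving the estimates of the derived SFH parameters relative to a fine parameter grid.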

    Biometric surveillance in schools: cause for concern or case for curriculum?

    This article critically examines the draft consultation paper on the use of biometric technologies in schools issued by the Scottish Government to local authorities in September 2008 (see http://www.scotland.gov.uk/Publications/2008/09/08135019/0). Coming at a time when a number of schools are considering biometric systems to register and confirm the identity of pupils in various settings (cashless catering, automated registration of pupils' arrival in school, and school library automation), this guidance is undoubtedly welcome. The present focus seems to be on fingerprints, but, as the guidance acknowledges, the debate may in future encompass iris prints, voice prints and facial recognition systems, which are already in use in non-educational settings. The article notes broader developments in school surveillance in Scotland and in the rest of the UK and argues that serious attention must be given to the educational considerations that arise. Schools must prepare pupils for life in the newly emergent 'surveillance society', not by uncritically habituating them to the surveillance systems installed in their schools, but by critically engaging them in thought about how surveillance technologies work in the wider world, the rationales given for them, and the implications - in terms of privacy, safety and inclusion - of being a 'surveilled subject'.