
    Lambert W random variables - a new family of generalized skewed distributions with applications to risk estimation

    Originating from a system-theory and an input/output point of view, I introduce a new class of generalized distributions. A parametric nonlinear transformation converts a random variable X into a so-called Lambert W random variable Y, which allows a very flexible approach to model skewed data. Its shape depends on the shape of X and a skewness parameter γ. In particular, for symmetric X and nonzero γ the output Y is skewed. Its distribution and density function are particular variants of their input counterparts. Maximum likelihood and method of moments estimators are presented, and simulations show that in the symmetric case additional estimation of γ does not affect the quality of other parameter estimates. Applications in finance and biomedicine show the relevance of this class of distributions, which is particularly useful for slightly skewed data. A practical by-product of the Lambert W framework: data can be "unskewed." The R package LambertW (http://cran.r-project.org/web/packages/LambertW) developed by the author is publicly available on CRAN (http://cran.r-project.org).
    Comment: Published at http://dx.doi.org/10.1214/11-AOAS457 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
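    To make the transformation concrete, here is a minimal Python sketch of the skewing transform and its inverse, assuming the location-scale form Y = U·exp(γU)·σ + μ with U = (X − μ)/σ; the function names and the round-trip check are illustrative choices of mine, not the API of the author's R package.
    ```python
    import numpy as np
    from scipy.special import lambertw

    def skew_lambert_w(x, gamma, mu=0.0, sigma=1.0):
        """Map data x to its skewed Lambert W counterpart (name is mine)."""
        u = (np.asarray(x, dtype=float) - mu) / sigma   # standardize the input
        return u * np.exp(gamma * u) * sigma + mu

    def unskew_lambert_w(y, gamma, mu=0.0, sigma=1.0):
        """Invert the transform ("unskew" the data) via the Lambert W function."""
        y = np.asarray(y, dtype=float)
        if gamma == 0.0:
            return y                                     # transform is the identity
        z = (y - mu) / sigma
        # principal branch of W; real-valued whenever gamma * z >= -1/e
        u = np.real(lambertw(gamma * z)) / gamma
        return u * sigma + mu

    # Round trip: skewing and then unskewing recovers the original sample.
    rng = np.random.default_rng(0)
    x = rng.normal(size=1000)
    y = skew_lambert_w(x, gamma=0.2)
    assert np.allclose(unskew_lambert_w(y, gamma=0.2), x)
    ```
    Unskewing works because the map u ↦ u·exp(γu) is inverted by the principal branch of the Lambert W function whenever γu ≥ −1, which for small γ covers all but extreme observations.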

    Neural networks in geophysical applications

    Neural networks are increasingly popular in geophysics. Because they are universal approximators, these tools can approximate any continuous function with arbitrary precision. Hence, they may yield important contributions to finding solutions to a variety of geophysical applications. However, knowledge of the many methods and techniques recently developed to increase the performance and to facilitate the use of neural networks does not seem to be widespread in the geophysical community. Therefore, the power of these tools has not yet been explored to its full extent. In this paper, techniques are described for faster training, better overall performance, i.e., generalization, and the automatic estimation of network size and architecture.
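    As an illustration of the universal-approximation idea (my own minimal sketch, not code from the paper), the following one-hidden-layer network is fit to a smooth one-dimensional target with plain batch gradient descent; the layer size and learning rate are arbitrary assumptions.
    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
    t = np.sin(x)                                   # target function to approximate

    n_hidden = 20
    W1 = rng.normal(scale=0.5, size=(1, n_hidden))  # input -> hidden weights
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))  # hidden -> output weights
    b2 = np.zeros(1)

    lr = 0.05
    for step in range(20000):
        h = np.tanh(x @ W1 + b1)                    # hidden activations
        y = h @ W2 + b2                             # network output
        err = y - t
        # backpropagate the mean-squared-error gradient
        g_y = 2 * err / len(x)
        g_W2 = h.T @ g_y
        g_b2 = g_y.sum(axis=0)
        g_h = (g_y @ W2.T) * (1 - h**2)             # tanh'(a) = 1 - tanh(a)^2
        g_W1 = x.T @ g_h
        g_b1 = g_h.sum(axis=0)
        W1 -= lr * g_W1; b1 -= lr * g_b1
        W2 -= lr * g_W2; b2 -= lr * g_b2

    print("final MSE:", float((err**2).mean()))     # decreases as training proceeds
    ```
    With enough hidden units and training steps the fit can be made arbitrarily close, which is exactly the property the universal approximation theorem guarantees for continuous targets on a compact interval.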

    The Hyper Suprime-Cam Software Pipeline

    In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs, as well as low-level detrending and image characterizations.
    Comment: 39 pages, 21 figures, 2 tables. Submitted to Publications of the Astronomical Society of Japan.
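    The stage ordering the abstract describes can be sketched structurally as follows; this is a hypothetical skeleton with placeholder names of my own, not the actual HSC/LSST pipeline API.
    ```python
    from typing import List

    def detrend(raw_exposure):
        """Low-level step: remove instrument signatures from a raw exposure (placeholder)."""
        ...

    def characterize(exposure):
        """Low-level step: measure PSF, background, and calibrations (placeholder)."""
        ...

    def coadd(calibrated_exposures: List):
        """High-level step: warp and stack calibrated exposures into a coadded image (placeholder)."""
        ...

    def build_catalog(coadded_image):
        """High-level step: detect and measure sources into a science-ready catalog (placeholder)."""
        ...

    def run_pipeline(raw_exposures: List):
        # per-exposure low-level work, then survey-level high-level work
        calibrated = [characterize(detrend(e)) for e in raw_exposures]
        return build_catalog(coadd(calibrated))
    ```
    The design point the abstract highlights is this split between per-exposure low-level processing (detrending, image characterization) and survey-level high-level processing (coaddition, catalog generation).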