Skellam shrinkage: Wavelet-based intensity estimation for inhomogeneous Poisson data
The ubiquity of integrating detectors in imaging and other applications
implies that a variety of real-world data are well modeled as Poisson random
variables whose means are in turn proportional to an underlying vector-valued
signal of interest. In this article, we first show how the so-called Skellam
distribution arises from the fact that Haar wavelet and filterbank transform
coefficients corresponding to measurements of this type are distributed as sums
and differences of Poisson counts. We then provide two main theorems on Skellam
shrinkage, one showing the near-optimality of shrinkage in the Bayesian setting
and the other providing for unbiased risk estimation in a frequentist context.
These results serve to yield new estimators in the Haar transform domain,
including an unbiased risk estimate for shrinkage of Haar-Fisz
variance-stabilized data, along with accompanying low-complexity algorithms for
inference. We conclude with a simulation study demonstrating the efficacy of
our Skellam shrinkage estimators both for the standard univariate wavelet test
functions as well as a variety of test images taken from the image processing
literature, confirming that they offer substantial performance improvements
over existing alternatives.
Comment: 27 pages, 8 figures, slight formatting changes; submitted for publication
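As a rough illustration of the abstract's starting observation (this sketch is not from the paper, and the intensities `mu1`, `mu2` are arbitrary illustrative values): an unnormalized Haar detail coefficient of Poisson measurements is a difference of two independent Poisson counts, which follows a Skellam distribution.

```python
# Empirically confirm that the difference of two independent Poisson counts
# -- the form an unnormalized Haar detail coefficient takes for Poisson
# data -- follows a Skellam distribution.
import numpy as np
from scipy.stats import skellam

rng = np.random.default_rng(0)
mu1, mu2 = 4.0, 2.0          # illustrative intensities, not from the paper
n = 200_000

# Haar-style difference of adjacent Poisson measurements
d = rng.poisson(mu1, n) - rng.poisson(mu2, n)

# Compare the empirical frequency of each outcome with the Skellam pmf
ks = np.arange(-5, 11)
emp = np.array([(d == k).mean() for k in ks])
theo = skellam.pmf(ks, mu1, mu2)
max_err = np.abs(emp - theo).max()
print(f"max |empirical - Skellam pmf| = {max_err:.4f}")
```

The agreement of the empirical histogram with the Skellam pmf is what licenses shrinkage rules derived for the Skellam family in the Haar transform domain.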
A Noise-Aware Coding Scheme for Texture Classification
Texture-based analysis of images is a common and widely discussed problem in computer vision and image processing. Several methods have been proposed to codify texture micro-patterns (texlets) in images. Most of these methods perform well when a given image is noise-free, but real-world images contain different types of signal-independent as well as signal-dependent noise originating from various sources, including the camera sensor itself. Hence, it is necessary to distinguish false textures that appear due to noise and thus achieve a reliable representation of texlets. In this proposal, we define an adaptive noise band (ANB) to approximate, up to a certain extent, the amount of noise contamination around a pixel. Based on this ANB, we generate reliable codes, named the noise tolerant ternary pattern (NTTP), to represent the texlets in an image. Extensive experiments on several datasets from renowned texture databases, such as the Outex and Brodatz databases, show that NTTP performs much better than state-of-the-art methods.
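A minimal sketch of the underlying idea of noise-banded ternary coding: a neighbor is coded +1 or -1 only when it differs from the center pixel by more than a tolerance band, so small noise fluctuations do not produce false texture codes. Here a fixed band `t` stands in for the paper's adaptive noise band (ANB), which is estimated per pixel, so this is an illustration rather than the NTTP algorithm itself.

```python
# Ternary coding of a 3x3 neighborhood with a fixed noise band `t`
# (a stand-in for the paper's adaptive noise band).
import numpy as np

def ternary_code(patch: np.ndarray, t: float) -> list[int]:
    """Code each of the 8 neighbors relative to the center pixel:
    +1 if clearly brighter, -1 if clearly darker, 0 if within the band."""
    c = patch[1, 1]
    neighbors = np.delete(patch.flatten(), 4)  # drop the center pixel
    return [1 if p > c + t else -1 if p < c - t else 0 for p in neighbors]

# A flat patch with small sensor noise:
noisy_flat = np.array([[100, 101,  99],
                       [100, 100, 102],
                       [ 99, 100, 101]])
print(ternary_code(noisy_flat, t=3))  # all zeros: fluctuations stay in the band
print(ternary_code(noisy_flat, t=0))  # spurious +/-1 codes from noise alone
```

With `t=0` the coding degenerates to a sign pattern that treats every noise fluctuation as texture, which is exactly the failure mode the noise band is meant to suppress.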
Fractional Calculus and the Future of Science
Newton foresaw the limitations of geometry’s description of planetary behavior and developed fluxions (differentials) as the new language for celestial mechanics and as the way to implement his laws of mechanics. Two hundred years later, Mandelbrot introduced the notion of fractals into the scientific lexicon of geometry, dynamics, and statistics, and in so doing suggested ways to see beyond the limitations of Newton’s laws. Mandelbrot’s mathematical essays suggest how fractals may lead to an understanding of turbulence and viscoelasticity, and ultimately to the end of the dominance of Newton’s macroscopic world view. Fractional Calculus and the Future of Science examines the nexus of these two game-changing contributions to our scientific understanding of the world. It addresses how non-integer differential equations replace Newton’s laws to describe the many guises of complexity, most of which lie beyond Newton’s experience and many of which had eluded even Mandelbrot’s powerful intuition. The book’s authors look behind the mathematics and examine what must be true about a phenomenon’s behavior to justify the replacement of an integer-order derivative with a noninteger-order (fractional) one. This window into the future of specific scientific disciplines through the fractional-calculus lens suggests how what is seen entails a difference in scientific thinking and understanding.
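To make the idea of a noninteger-order derivative concrete, here is a minimal numerical sketch using the Grünwald–Letnikov definition (one standard discretization among several the fractional-calculus literature covers; the choice here is illustrative, not the book's):

```python
# Grunwald-Letnikov fractional derivative of order alpha on a grid of step h:
#   D^alpha f(t) ~= h^(-alpha) * sum_{k=0}^{n} w_k * f(t - k*h),
# where w_k = (-1)^k * binom(alpha, k). At alpha = 1 it reduces to the
# ordinary backward difference.
import numpy as np

def gl_weights(alpha: float, n: int) -> np.ndarray:
    """Weights w_k = (-1)^k * binom(alpha, k) via the stable recurrence."""
    w = np.empty(n + 1)
    w[0] = 1.0
    for k in range(1, n + 1):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return w

def gl_derivative(f, t: float, alpha: float, h: float = 1e-3) -> float:
    n = int(t / h)
    w = gl_weights(alpha, n)
    return h ** (-alpha) * sum(w[k] * f(t - k * h) for k in range(n + 1))

# Sanity check: at alpha = 1 this is the ordinary derivative, so
# D^1 of f(t) = t^2 at t = 1 should be close to 2.
print(gl_derivative(lambda t: t * t, 1.0, alpha=1.0))
# Half-derivative of f(t) = t at t = 1; the exact value is 2/sqrt(pi).
print(gl_derivative(lambda t: t, 1.0, alpha=0.5), 2 / np.sqrt(np.pi))
```

The half-derivative example shows the characteristic nonlocality: every past value f(t - k*h) contributes through the slowly decaying weights, which is the memory effect fractional models exploit.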
The Skellam Distribution revisited: Estimating the unobserved incoming and outgoing ICU COVID-19 patients on a regional level in Germany
With the beginning of the COVID-19 pandemic, we became aware of the need for
comprehensive data collection and its provision to scientists and experts for
proper data analyses. In Germany, the Robert Koch Institute (RKI) has tried to
keep up with this demand for data on COVID-19, but there were (and still are)
relevant data missing that are needed to understand the whole picture of the
pandemic. In this paper, we take a closer look at the severity of the course of
COVID-19 in Germany, for which ideal information would be the number of
incoming patients to ICU units. This information was (and still is) not
available. Instead, the current occupancy of ICU units on the district level
was reported daily. We demonstrate how this information can be used to predict
the number of incoming as well as released COVID-19 patients using a stochastic
version of the Expectation Maximisation algorithm (SEM). This, in turn, allows
for estimating the influence of district-specific and age-specific infection
rates as well as further covariates, including spatial effects, on the number
of incoming patients. The paper demonstrates that even if relevant data are not
recorded or provided officially, statistical modelling allows for
reconstructing them. This also includes the quantification of uncertainty which
naturally results from the application of the SEM algorithm.
Comment: 30 pages, 10 figures
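A toy sketch of the stochastic EM idea on synthetic data (the rates, sample size, and conditional-sampling scheme below are illustrative; the paper fits a far richer covariate-, age-, and space-dependent model): the observed daily change in occupancy is d = X - Y, where the incoming count X ~ Poisson(lam_in) and the outgoing count Y ~ Poisson(lam_out) are latent. The S-step simulates X from its conditional distribution given d; the M-step updates the rates with their Poisson maximum-likelihood estimates.

```python
# Stochastic EM (SEM) for latent incoming/outgoing Poisson counts observed
# only through their difference d = X - Y (a Skellam-type observation).
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(1)
lam_in_true, lam_out_true = 5.0, 3.0       # illustrative true rates
d = rng.poisson(lam_in_true, 2000) - rng.poisson(lam_out_true, 2000)

def sample_incoming(d, lam_in, lam_out, kmax=60):
    """Draw X | d, where X ~ Poi(lam_in) and Y = X - d ~ Poi(lam_out)."""
    lo = np.maximum(0, d)                  # smallest feasible X per day
    ks = lo[:, None] + np.arange(kmax)     # candidate values of X
    y = ks - d[:, None]                    # implied outgoing counts
    logw = (ks * np.log(lam_in) - gammaln(ks + 1)
            + y * np.log(lam_out) - gammaln(y + 1))
    w = np.exp(logw - logw.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    u = rng.random(len(d))[:, None]
    idx = (w.cumsum(axis=1) < u).sum(axis=1)  # inverse-CDF draw per row
    return ks[np.arange(len(d)), idx]

lam_in, lam_out = 1.0, 1.0                 # crude starting values
for _ in range(50):
    x = sample_incoming(d, lam_in, lam_out)     # S-step: simulate latent counts
    lam_in, lam_out = x.mean(), (x - d).mean()  # M-step: Poisson MLEs
print(f"estimated lam_in ~ {lam_in:.2f}, lam_out ~ {lam_out:.2f}")
```

Because the E-step is replaced by simulation, the iterates fluctuate around the maximum-likelihood solution rather than converging deterministically; this simulation spread is one source of the uncertainty quantification the abstract mentions.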