Modeling for seasonal marked point processes: An analysis of evolving hurricane occurrences
Seasonal point processes refer to stochastic models for random events which
are only observed in a given season. We develop nonparametric Bayesian
methodology to study the dynamic evolution of a seasonal marked point process
intensity. We assume the point process is a nonhomogeneous Poisson process and
propose a nonparametric mixture of beta densities to model dynamically evolving
temporal Poisson process intensities. Dependence structure is built through a
dependent Dirichlet process prior for the seasonally-varying mixing
distributions. We extend the nonparametric model to incorporate time-varying
marks, resulting in flexible inference for both the seasonal point process
intensity and for the conditional mark distribution. The motivating application
involves the analysis of hurricane landfalls with reported damages along the
U.S. Gulf and Atlantic coasts from 1900 to 2010. We focus on studying the
evolution of the intensity of the process of hurricane landfall occurrences,
and the respective maximum wind speed and associated damages. Our results
indicate an increase in the number of hurricane landfall occurrences and a
decrease in the median maximum wind speed at the peak of the season.
Introducing standardized damage as a mark, such that reported damages are
comparable both in time and space, we find that there is no significant rising
trend in hurricane damages over time.
Comment: Published at http://dx.doi.org/10.1214/14-AOAS796 in the Annals of
Applied Statistics (http://www.imstat.org/aoas/) by the Institute of
Mathematical Statistics (http://www.imstat.org).
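The intensity model described above can be illustrated with a minimal sketch: a nonhomogeneous Poisson process whose intensity over the season, rescaled to (0, 1), is a weighted mixture of beta densities. The weights and shape parameters below are purely hypothetical; in the paper they arise from a dependent Dirichlet process prior rather than being fixed.

```python
import math

def beta_pdf(t, a, b):
    """Density of a Beta(a, b) distribution at t in (0, 1)."""
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(log_norm + (a - 1) * math.log(t) + (b - 1) * math.log(1 - t))

def nhpp_intensity(t, gamma, weights, params):
    """Mixture-of-betas intensity: lambda(t) = gamma * sum_j w_j Beta(t; a_j, b_j).

    t       : time within the season, rescaled to (0, 1)
    gamma   : total expected number of events in the season
    weights : mixture weights summing to 1 (fixed here for illustration)
    params  : list of (a, b) beta shape pairs
    """
    return gamma * sum(w * beta_pdf(t, a, b) for w, (a, b) in zip(weights, params))

# Hypothetical two-component season: a broad early bump and a sharp late peak.
lam = nhpp_intensity(0.6, gamma=20.0, weights=[0.3, 0.7], params=[(2, 5), (8, 3)])
```

Because each beta density integrates to one, the intensity integrates to `gamma` over the season, so `gamma` controls the expected number of landfalls while the mixture controls their timing.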
BASS: An R Package for Fitting and Performing Sensitivity Analysis of Bayesian Adaptive Spline Surfaces
We present the R package BASS as a tool for nonparametric regression. The primary focus of the package is fitting fully Bayesian adaptive spline surface (BASS) models and performing global sensitivity analyses of these models. The BASS framework is similar to that of Bayesian multivariate adaptive regression splines (BMARS) from Denison, Mallick, and Smith (1998), but with many added features. The software is built to efficiently handle significant amounts of data with many continuous or categorical predictors and with functional response. Under our Bayesian framework, most priors are automatic, but the user can modify them to encourage parsimony and avoid overfitting. If directed to do so, the software uses parallel tempering to improve the reversible jump Markov chain Monte Carlo (RJMCMC) methods used to perform inference. We discuss the implementation of these features and present the performance of BASS in a number of analyses of simulated and real data.
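The spline surfaces underlying BASS and BMARS are sums of tensor products of one-sided hinge functions. The following sketch, with purely illustrative signs, variable indices, knots, and coefficients, shows how such a fitted surface is evaluated; it is not the package's actual API.

```python
def hinge(s, x, t):
    """One-sided hinge [s * (x - t)]_+ used in MARS/BMARS-style bases."""
    return max(s * (x - t), 0.0)

def bass_basis(x, terms):
    """Tensor-product basis B(x) = prod_k [s_k * (x[v_k] - t_k)]_+.

    terms: list of (sign, variable index, knot) triples (illustrative names).
    """
    prod = 1.0
    for s, v, t in terms:
        prod *= hinge(s, x[v], t)
    return prod

def bass_predict(x, intercept, coefs, basis_terms):
    """Adaptive spline surface: intercept + sum_m beta_m * B_m(x)."""
    return intercept + sum(b * bass_basis(x, terms)
                           for b, terms in zip(coefs, basis_terms))

# Hypothetical fitted model: two basis functions over two predictors.
y = bass_predict([0.4, 0.9], intercept=1.0, coefs=[2.0, -1.5],
                 basis_terms=[[(1, 0, 0.2)], [(1, 0, 0.3), (-1, 1, 0.95)]])
```

In the Bayesian framework, RJMCMC moves add, delete, or modify these basis functions, so the number of terms and their knots are themselves sampled rather than fixed as above.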
A Heterogeneous Spatial Model for Soil Carbon Mapping of the Contiguous United States Using VNIR Spectra
The Rapid Carbon Assessment, conducted by the U.S. Department of Agriculture,
was implemented in order to obtain a representative sample of soil organic
carbon across the contiguous United States. In conjunction with a statistical
model, the dataset allows for prediction and mapping of soil carbon across the
U.S.; however, there are two primary challenges to such an effort. First, there
exists a large degree of heterogeneity in the data, whereby both the first and
second moments of the data generating process seem to vary both spatially and
for different land-use categories. Second, the majority of the sampled
locations do not actually have lab measured values for soil organic carbon.
Rather, visible and near-infrared (VNIR) spectra were measured at most
locations, which act as a proxy to help predict carbon content. Thus, we
develop a heterogeneous model to analyze this data that allows both the mean
and the variance to vary as a function of space as well as land-use category,
while incorporating VNIR spectra as covariates. After a cross-validation study
that establishes the effectiveness of the model, we construct a complete map of
soil organic carbon for the contiguous U.S. along with uncertainty
quantification.
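The core idea of letting both moments vary can be shown in a minimal sketch: a Gaussian log-likelihood whose mean and log-variance are each linear in a covariate (standing in for a VNIR spectral summary) with land-use-specific coefficients. Spatially varying effects are omitted here, and all parameter values are hypothetical.

```python
import math

def hetero_loglik(y, x, land_use, beta, alpha):
    """Gaussian log-likelihood with heterogeneous mean and variance.

    beta[c]  : (intercept, slope) for the mean within land-use category c
    alpha[c] : (intercept, slope) for the log-variance within category c
    Both the first and second moments thus vary with the covariate x and
    with land use, mirroring the heterogeneity described in the abstract.
    """
    b0, b1 = beta[land_use]
    a0, a1 = alpha[land_use]
    mu = b0 + b1 * x
    log_var = a0 + a1 * x  # variance varies with the covariate too
    return -0.5 * (math.log(2 * math.pi) + log_var
                   + (y - mu) ** 2 / math.exp(log_var))

# Hypothetical coefficients for two land-use categories.
beta = {"cropland": (2.0, 0.8), "forest": (1.0, 0.0)}
alpha = {"cropland": (0.5, 0.2), "forest": (0.0, 0.0)}
ll = hetero_loglik(2.5, 1.0, "cropland", beta, alpha)
```

Modeling the log-variance, rather than the variance itself, keeps the variance positive without constraining the regression coefficients.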
Nonparametric Dark Energy Reconstruction from Supernova Data
Understanding the origin of the accelerated expansion of the Universe poses
one of the greatest challenges in physics today. Lacking a compelling
fundamental theory to test, observational efforts are targeted at a better
characterization of the underlying cause. If a new form of mass-energy, dark
energy, is driving the acceleration, the redshift evolution of the equation of
state parameter w(z) will hold essential clues as to its origin. To best
exploit data from observations it is necessary to develop a robust and accurate
reconstruction approach, with controlled errors, for w(z). We introduce a new,
nonparametric method for solving the associated statistical inverse problem
based on Gaussian Process modeling and Markov chain Monte Carlo sampling.
Applying this method to recent supernova measurements, we reconstruct the
continuous history of w out to redshift z = 1.5.
Comment: 4 pages, 2 figures, accepted for publication in Physical Review
Letters.
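The Gaussian Process step can be sketched in simplified form. The paper infers w(z) through the supernova distance-redshift relation; the sketch below skips that inverse problem and conditions a GP, centered on the cosmological constant value w = -1, directly on mock noisy w estimates. The kernel choice and hyperparameters are illustrative assumptions.

```python
import numpy as np

def sq_exp_kernel(z1, z2, amp=1.0, ell=0.5):
    """Squared-exponential covariance (amp, ell are illustrative)."""
    d = z1[:, None] - z2[None, :]
    return amp ** 2 * np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior_mean(z_train, w_train, z_test, noise=0.05, prior_mean=-1.0):
    """Posterior mean of a GP prior centered on w = -1 (cosmological constant)."""
    K = sq_exp_kernel(z_train, z_train) + noise ** 2 * np.eye(len(z_train))
    K_star = sq_exp_kernel(z_test, z_train)
    return prior_mean + K_star @ np.linalg.solve(K, w_train - prior_mean)

# Mock data consistent with a constant w(z) = -1 out to z = 1.5.
z_obs = np.linspace(0.05, 1.5, 10)
w_obs = np.full(10, -1.0)
w_rec = gp_posterior_mean(z_obs, w_obs, np.array([0.0, 0.75, 1.5]))
```

In the full method, MCMC sampling over the GP hyperparameters and the cosmological nuisance parameters propagates their uncertainty into the reconstruction, which is what yields controlled errors.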
Nonparametric Reconstruction of the Dark Energy Equation of State from Diverse Data Sets
The cause of the accelerated expansion of the Universe poses one of the most
fundamental questions in physics today. In the absence of a compelling theory
to explain the observations, a first task is to develop a robust phenomenology.
If the acceleration is driven by some form of dark energy, then, the
phenomenology is determined by the dark energy equation of state w. A major aim
of ongoing and upcoming cosmological surveys is to measure w and its time
dependence at high accuracy. Since w(z) is not directly accessible to
measurement, powerful reconstruction methods are needed to extract it reliably
from observations. We have recently introduced a new reconstruction method for
w(z) based on Gaussian process modeling. This method can capture nontrivial
time-dependences in w(z) and, most importantly, it yields controlled and
unbiased error estimates. In this paper we extend the method to include a
diverse set of measurements: baryon acoustic oscillations, cosmic microwave
background measurements, and supernova data. We analyze currently available
data sets and present the resulting constraints on w(z), finding that current
observations are in very good agreement with a cosmological constant. In
addition we explore how well our method captures nontrivial behavior of w(z) by
analyzing simulated data assuming high-quality observations from future
surveys. We find that the baryon acoustic oscillation measurements by
themselves already lead to remarkably good reconstruction results and that the
combination of different high-quality probes allows us to reconstruct w(z) very
reliably with small error bounds.
Comment: 14 pages, 9 figures, 3 tables.
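Combining diverse, independent probes means their likelihoods multiply, i.e. their chi-square terms add. The toy sketch below illustrates only that combination step for a single parameter; the predictor functions, observed values, and errors are entirely made up and bear no relation to the actual SN, BAO, or CMB data vectors.

```python
def chi2(pred, obs, sigma):
    """Single-measurement chi-square contribution."""
    return ((pred - obs) / sigma) ** 2

def joint_log_likelihood(w0, probes):
    """Independent probes combine multiplicatively, so chi^2 terms add.

    probes: list of (predict_fn, observed, sigma) triples (illustrative).
    """
    return -0.5 * sum(chi2(pred(w0), obs, s) for pred, obs, s in probes)

# Toy setup: each "probe" constrains a different transformation of w0.
probes = [(lambda w: w, -1.0, 0.1),       # SN-like direct constraint
          (lambda w: 2 * w, -2.0, 0.3),   # BAO-like constraint
          (lambda w: w + 1, 0.0, 0.05)]   # CMB-like constraint
ll = joint_log_likelihood(-1.0, probes)
```

Even in this toy form, the tightest probe dominates the joint constraint, which echoes the paper's finding that BAO measurements alone already drive much of the reconstruction quality.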
Nonparametric Reconstruction of the Dark Energy Equation of State
A basic aim of ongoing and upcoming cosmological surveys is to unravel the
mystery of dark energy. In the absence of a compelling theory to test, a
natural approach is to better characterize the properties of dark energy in
search of clues that can lead to a more fundamental understanding. One way to
view this characterization is the improved determination of the
redshift-dependence of the dark energy equation of state parameter, w(z). To do
this requires a robust and bias-free method for reconstructing w(z) from data
that does not rely on restrictive expansion schemes or assumed functional forms
for w(z). We present a new nonparametric reconstruction method that solves for
w(z) as a statistical inverse problem, based on a Gaussian Process
representation. This method reliably captures nontrivial behavior of w(z) and
provides controlled error bounds. We demonstrate the power of the method on
different sets of simulated supernova data; the approach can be easily extended
to include diverse cosmological probes.
Comment: 16 pages, 11 figures, accepted for publication in Physical Review.
Bayesian Non-Parametric Inference for Multivariate Peaks-over-Threshold Models
We consider a constructive definition of the multivariate Pareto that factorizes the random vector into a radial component and an independent angular component. The former follows a univariate Pareto distribution, and the latter is defined on the surface of the positive orthant of the infinity-norm unit hypercube. We propose a method for inferring the distribution of the angular component by identifying its support as the limit of the positive orthants of the unit p-norm spheres, and introduce a projected gamma family of distributions defined through the normalization of a vector of independent random gammas onto this space. This serves to construct a flexible family of distributions obtained as a Dirichlet process mixture of projected gammas. For model assessment, we discuss scoring methods appropriate to distributions on the unit hypercube. In particular, working with the energy score criterion, we develop a kernel metric that produces a proper scoring rule, and we present a simulation study comparing different modeling choices under the proposed metric. Using our approach, we describe the dependence structure of extreme values in integrated vapor transport (IVT) data, which describe the flow of atmospheric moisture along the coast of California. We find clear but heterogeneous geographical dependence.
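The projected gamma construction can be sketched directly: draw a vector of independent gamma variables and normalize by its p-norm, giving a point on the positive orthant of the unit p-norm sphere. The shape parameters below are illustrative, and the rates are fixed at one for simplicity.

```python
import random

def projected_gamma_sample(shapes, p=1.0, rng=random):
    """Draw independent gammas and normalize by the p-norm.

    The result lies on the positive orthant of the unit p-norm sphere;
    with p = 1 this is the simplex (a Dirichlet draw), and larger p
    approaches the infinity-norm geometry used for the angular component.
    """
    g = [rng.gammavariate(a, 1.0) for a in shapes]
    norm = sum(x ** p for x in g) ** (1.0 / p)
    return [x / norm for x in g]

# Hypothetical three-dimensional angular sample with p = 1.
sample = projected_gamma_sample([2.0, 1.0, 3.0], p=1.0)
```

Mixing such projected gamma components under a Dirichlet process prior is what gives the flexible angular model described above.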