Regularized multivariate von Mises distribution
Regularization is necessary to avoid overfitting when the
number of data samples is low compared to the number of parameters
of the model. In this paper, we introduce a flexible L1 regularization
for the multivariate von Mises distribution. We also propose a circular
distance that can be used to estimate the Kullback-Leibler divergence
between two circular distributions by means of sampling, and that also serves
as a goodness-of-fit measure. We compare the models on synthetic data
and real morphological data from human neurons and show that the
regularized model achieves better results than the non-regularized von Mises
model.
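The sampling-based Kullback-Leibler estimation mentioned in the abstract can be sketched in the univariate case. This is a generic Monte Carlo KL estimate between two von Mises densities, not the specific circular distance proposed in the paper; all function names are hypothetical:

```python
import numpy as np
from scipy.special import i0

def vonmises_logpdf(x, mu, kappa):
    """Log density of the univariate von Mises distribution on the circle."""
    return kappa * np.cos(x - mu) - np.log(2.0 * np.pi * i0(kappa))

def kl_vonmises_mc(mu_p, kappa_p, mu_q, kappa_q, n=100_000, seed=0):
    """Monte Carlo estimate of KL(p || q): sample from p and average
    the log density ratio log p(x) - log q(x)."""
    rng = np.random.default_rng(seed)
    x = rng.vonmises(mu_p, kappa_p, size=n)
    return np.mean(vonmises_logpdf(x, mu_p, kappa_p)
                   - vonmises_logpdf(x, mu_q, kappa_q))
```

The estimate is zero when both distributions coincide and positive otherwise, which is what makes it usable as a goodness-of-fit measure between a fitted and a reference distribution.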
Frobenius norm regularization for the multivariate von Mises distribution
Penalizing the model complexity is necessary to avoid overfitting when the number of data samples
is low with respect to the number of model parameters. In this paper, we introduce a penalization
term that places an independent prior distribution on each parameter of the multivariate von Mises
distribution. We also propose a circular distance that can be used to estimate the Kullback-Leibler
divergence between any two circular distributions and serves as a goodness-of-fit measure. We compare the
resulting regularized von Mises models on synthetic data and real neuroanatomical data and show
that the distribution fitted using the penalized estimator generally achieves better results than the
non-penalized multivariate von Mises estimator.
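As a rough illustration of penalized maximum-likelihood estimation in this family (a univariate stand-in, not the paper's multivariate Frobenius penalty: here a ridge-style penalty shrinks the concentration parameter kappa, and all names are hypothetical):

```python
import numpy as np
from scipy.special import i0
from scipy.optimize import minimize

def fit_vonmises_penalized(x, lam=0.0):
    """Fit a univariate von Mises by minimizing
    negative log-likelihood(mu, kappa) + lam * kappa**2.
    kappa is optimized on the log scale to keep it positive."""
    n = len(x)

    def neg_pen_loglik(theta):
        mu, log_kappa = theta
        kappa = np.exp(log_kappa)
        nll = n * np.log(2.0 * np.pi * i0(kappa)) - kappa * np.sum(np.cos(x - mu))
        return nll + lam * kappa ** 2

    res = minimize(neg_pen_loglik, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
    mu_hat = np.arctan2(np.sin(res.x[0]), np.cos(res.x[0]))  # wrap to (-pi, pi]
    return mu_hat, np.exp(res.x[1])
```

With `lam=0` this reduces to the ordinary maximum-likelihood estimator; increasing `lam` shrinks the fitted concentration toward zero (i.e., toward the uniform circular distribution), which is the qualitative effect a penalized estimator has when data are scarce.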
Recent advances in directional statistics
Mainstream statistical methodology is generally applicable to data observed
in Euclidean space. There are, however, numerous contexts of considerable
scientific interest in which the natural supports for the data under
consideration are Riemannian manifolds like the unit circle, torus, sphere and
their extensions. Typically, such data can be represented using one or more
directions, and directional statistics is the branch of statistics that deals
with their analysis. In this paper we provide a review of the many recent
developments in the field since the publication of Mardia and Jupp (1999),
still the most comprehensive text on directional statistics. Many of those
developments have been stimulated by interesting applications in fields as
diverse as astronomy, medicine, genetics, neurology, aeronautics, acoustics,
image analysis, text mining, environmetrics, and machine learning. We begin by
considering developments for the exploratory analysis of directional data
before progressing to distributional models, general approaches to inference,
hypothesis testing, regression, nonparametric curve estimation, methods for
dimension reduction, classification and clustering, and the modelling of time
series, spatial and spatio-temporal data. An overview of currently available
software for analysing directional data is also provided, and potential future
developments discussed.
Local polynomial regression for circular predictors
We consider local smoothing of datasets where the design space is the d-dimensional (d >= 1) torus and the response variable is real-valued. Our purpose is to extend least squares local polynomial fitting to this setting. We give both theoretical and empirical results.
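A minimal sketch of local linear fitting with a single circular predictor (d = 1), assuming a von Mises kernel and the local basis sin(x - x0), one common choice in the circular-smoothing literature; this is not necessarily the exact estimator of the paper, and all names are hypothetical:

```python
import numpy as np

def local_linear_circular(x, y, x0, kappa=5.0):
    """Local linear estimate at angle x0 for a circular predictor x
    (radians) and real response y. The von Mises kernel and the
    sin(x - x0) basis both respect the periodicity of the circle."""
    w = np.exp(kappa * (np.cos(x - x0) - 1.0))       # von Mises kernel weights
    X = np.column_stack([np.ones_like(x), np.sin(x - x0)])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # weighted least squares
    return beta[0]                                    # fitted value at x0
```

The kernel concentration `kappa` plays the role of an inverse bandwidth: larger values localize the fit more tightly around `x0`, trading variance for bias as in ordinary local polynomial regression.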
- …