Supervised learning on graphs of spatio-temporal similarity in satellite image sequences
High resolution satellite image sequences are multidimensional signals
composed of spatio-temporal patterns associated with numerous and varied
phenomena. Bayesian methods were previously proposed by Heas and Datcu (2005)
to encode the information contained in satellite image sequences in a graph
representation. Building on this representation, this paper presents a
supervised learning methodology for the semantics associated with
spatio-temporal patterns occurring in satellite image sequences, enabling the
recognition and probabilistic retrieval of similar events. Graphs are attached
to statistical models of spatio-temporal processes, which in turn describe
physical changes in the observed scene. We therefore adjust a parametric model
evaluating similarity types between graph patterns in order to represent
user-specific semantics attached to spatio-temporal phenomena. The learning
step proceeds by incrementally defining similarity types from user-provided
examples of spatio-temporal patterns labelled with positive and/or negative
semantics. From these examples, probabilities are inferred using a Bayesian
network and a Dirichlet model, linking user interest to a specific similarity
model between graph patterns. According to the current state of learning,
semantic posterior probabilities are updated for all possible graph patterns,
so that similar spatio-temporal phenomena can be recognized and retrieved from
the image sequence. Experiments performed on a multi-spectral SPOT image
sequence illustrate the proposed spatio-temporal recognition method.
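The Dirichlet inference step described above can be illustrated with a minimal sketch (the helper below is hypothetical, not the authors' implementation): under a symmetric Dirichlet prior, the posterior mean probability of each semantic label is its smoothed relative frequency among the user-provided examples.

```python
import numpy as np

def dirichlet_posterior(counts, alpha=1.0):
    """Posterior mean of label probabilities under a symmetric
    Dirichlet(alpha) prior, given observed example counts
    (illustrative helper, not the paper's full Bayesian network)."""
    counts = np.asarray(counts, dtype=float)
    return (counts + alpha) / (counts.sum() + alpha * counts.size)

# Hypothetical example: for one similarity type, the user has
# labelled 4 pattern pairs as positive and 1 as negative.
posterior = dirichlet_posterior([4, 1])  # ≈ [0.714, 0.286]
```

The prior pseudo-count `alpha` keeps posterior probabilities away from 0 and 1 when few examples are available, which matches the incremental, example-driven learning setting of the abstract.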
Monitoring a PGD solver for parametric power flow problems with goal-oriented error assessment
This is the peer reviewed version of the following article: [García-Blanco, R., Borzacchiello, D., Chinesta, F., and Diez, P. (2017) Monitoring a PGD solver for parametric power flow problems with goal-oriented error assessment. Int. J. Numer. Meth. Engng, 111: 529–552. doi: 10.1002/nme.5470], which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1002/nme.5470/full. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.

The parametric analysis of electric grids requires carrying out a large number of power flow computations, where the parameters describe loading conditions and grid properties. In this framework, the Proper Generalized Decomposition (PGD) provides a numerical solution that explicitly accounts for the parametric dependence. Once the PGD solution is available, exploring the multidimensional parametric space is computationally inexpensive. The aim of this paper is to provide tools to monitor the error associated with this significant computational gain and to guarantee the quality of the PGD solution. Here, the PGD algorithm consists of three nested loops corresponding to 1) the iterations of the algebraic solver, 2) the number of terms in the separable greedy expansion, and 3) the alternated directions for each term. In the proposed approach, the three loops are controlled by stopping criteria based on residual goal-oriented error estimates, so that only the computational resources necessary to achieve the accuracy prescribed by the end user are spent. The paper discusses how to compute the goal-oriented error estimates: this requires linearizing the error equation and the quantity of interest to derive an efficient error representation based on an adjoint problem. The efficiency of the proposed approach is demonstrated on benchmark problems.
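The nested loop structure described in the abstract can be sketched on a toy problem: a greedy rank-one enrichment with alternating directions, each loop ended by a residual-based stopping criterion. This is only a minimal analogue of loops 2) and 3) for a plain matrix (the algebraic-solver loop 1) and the power flow equations are omitted), not the paper's solver.

```python
import numpy as np

def greedy_separable(A, tol=1e-8, max_terms=20, max_alt=100):
    """Approximate matrix A as a sum of separable (rank-one) terms.

    Outer loop: greedy enrichment (number of terms).
    Inner loop: alternating directions for each term.
    Both loops stop on residual-based criteria."""
    rng = np.random.default_rng(0)
    R = A.astype(float).copy()          # current residual
    norm_A = np.linalg.norm(A)
    terms = []
    for _ in range(max_terms):          # enrichment loop
        v = rng.random(A.shape[1])
        for _ in range(max_alt):        # alternating-directions loop
            u = R @ v / (v @ v)         # best u for fixed v (least squares)
            v_new = R.T @ u / (u @ u)   # best v for fixed u
            converged = np.linalg.norm(v_new - v) <= tol * np.linalg.norm(v_new)
            v = v_new
            if converged:
                break
        terms.append((u, v))
        R = R - np.outer(u, v)          # deflate the captured term
        if np.linalg.norm(R) <= tol * norm_A:   # global stopping criterion
            break
    return terms

# Toy usage: recover a rank-two matrix term by term
A = np.outer([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]) \
    + np.outer([1.0, 0.0, -1.0], [1.0, 1.0, 1.0])
terms = greedy_separable(A)
```

In the paper the stopping tests are driven by goal-oriented (adjoint-based) error estimates on a quantity of interest rather than the plain residual norm used in this sketch.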
Copulas in finance and insurance
Copulas provide a potentially useful modeling tool to represent the dependence structure
among variables and to generate joint distributions by combining given marginal
distributions. Simulations play a relevant role in finance and insurance. They are used to
replicate efficient frontiers or extremal values, to price options, to estimate joint risks, and so
on. Using copulas, it is easy to construct and simulate from multivariate distributions based
on almost any choice of marginals and any type of dependence structure. In this paper we
outline recent contributions of statistical modeling using copulas in finance and insurance.
We review issues related to the notion of copulas, copula families, copula-based dynamic and
static dependence structure, copulas and latent factor models and simulation of copulas.
Finally, we outline hot topics in copulas with a special focus on model selection and
goodness-of-fit testing.
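The simulation idea mentioned above, combining an arbitrary choice of marginals with a given dependence structure, can be sketched for a bivariate Gaussian copula (an illustrative helper, with hypothetical marginals chosen for the example):

```python
import numpy as np
from scipy import stats

def gaussian_copula_sample(n, rho, marginals, seed=0):
    """Draw n samples whose dependence follows a bivariate Gaussian
    copula with correlation rho and whose marginals are the given
    frozen scipy.stats distributions."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(np.zeros(2), cov, size=n)
    u = stats.norm.cdf(z)   # correlated uniforms: the copula sample
    # Map each uniform margin through the inverse CDF of its marginal
    return np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(marginals)])

# Hypothetical insurance example: lognormal loss sizes coupled
# with exponential settlement delays, dependence rho = 0.7
x = gaussian_copula_sample(10_000, 0.7,
                           [stats.lognorm(s=1.0), stats.expon()])
```

Swapping the marginals or replacing the Gaussian copula with another family changes only the corresponding line, which is exactly the modularity the abstract emphasizes.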
Reducing and meta-analysing estimates from distributed lag non-linear models.
BACKGROUND: The two-stage time series design represents a powerful analytical tool in environmental epidemiology. Recently, models for both stages have been extended with the development of distributed lag non-linear models (DLNMs), a methodology for investigating simultaneously non-linear and lagged relationships, and multivariate meta-analysis, a methodology to pool estimates of multi-parameter associations. However, the application of both methods in two-stage analyses is prevented by the high-dimensional definition of DLNMs.
METHODS: In this contribution we propose a method to synthesize DLNMs to simpler summaries, expressed by a reduced set of parameters of one-dimensional functions, which are compatible with current multivariate meta-analytical techniques. The methodology and modelling framework are implemented in R through the packages dlnm and mvmeta.
RESULTS: As an illustrative application, the method is adopted for the two-stage time series analysis of temperature-mortality associations using data from 10 regions in England and Wales. R code and data are available as supplementary online material.
DISCUSSION AND CONCLUSIONS: The methodology proposed here extends the use of DLNMs in two-stage analyses, obtaining meta-analytical estimates of easily interpretable summaries from complex non-linear and delayed associations. The approach relaxes the assumptions and avoids simplifications required by simpler modelling approaches.
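The second-stage pooling of the reduced, low-dimensional coefficient vectors can be illustrated with a minimal fixed-effect multivariate meta-analysis sketch (in Python rather than the R packages the paper uses, and without the random-effects component that mvmeta also provides): the pooled estimate is the inverse-variance weighted average of the per-region vectors.

```python
import numpy as np

def fixed_effect_mvmeta(betas, covs):
    """Pool per-study coefficient vectors beta_i with covariance
    matrices S_i by multivariate inverse-variance weighting:
    beta = (sum S_i^-1)^-1 * sum S_i^-1 beta_i  (fixed-effect sketch)."""
    W = [np.linalg.inv(S) for S in covs]
    V = np.linalg.inv(sum(W))                    # pooled covariance
    beta = V @ sum(w @ b for w, b in zip(W, betas))
    return beta, V

# Hypothetical two-region example with two reduced parameters each
beta, V = fixed_effect_mvmeta(
    [np.array([1.0, 2.0]), np.array([1.0, 2.0])],
    [np.eye(2), 2 * np.eye(2)],
)
```

The point of the reduction proposed in the paper is precisely that each region contributes a short vector like `beta_i` (parameters of a one-dimensional summary function) instead of the full high-dimensional DLNM coefficient set.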