Hybrid group recommendations for a travel service
Recommendation techniques have proven their usefulness as a tool to cope with the information overload problem in many classical domains such as movies, books, and music. Additional challenges for recommender systems emerge in the domain of tourism, such as acquiring metadata and feedback, the sparsity of the rating matrix, user constraints, and the fact that traveling is often a group activity. This paper proposes a recommender system that offers personalized recommendations for travel destinations to individuals and groups. These recommendations are based on the users' rating profiles, personal interests, and specific demands for their next destination. The recommendation algorithm is a hybrid approach combining a content-based, collaborative filtering, and knowledge-based solution. For groups of users, such as families or friends, individual recommendations are aggregated into group recommendations, with an additional opportunity for users to give feedback on these group recommendations. A group of test users evaluated the recommender system using a prototype web application. The results prove the usefulness of individual and group recommendations and show that users prefer the hybrid algorithm over each individual technique. This paper demonstrates the added value of various recommendation algorithms in terms of different quality aspects, compared to an unpersonalized list of the most popular destinations.
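The aggregation step described above can be illustrated with a minimal sketch. This is not the paper's exact hybrid algorithm; it assumes each user already has predicted ratings per destination (from any underlying recommender) and uses simple score averaging, one common aggregation strategy. All names below are hypothetical.

```python
# Hypothetical sketch: aggregate individual predicted ratings into a group
# recommendation by averaging per-destination scores across group members.
# (The paper's hybrid content/collaborative/knowledge-based scoring is not
# reproduced here; any rating-prediction method could feed this step.)

def group_recommendations(individual_scores, top_n=3):
    """individual_scores: dict user -> dict destination -> predicted rating."""
    totals, counts = {}, {}
    for scores in individual_scores.values():
        for dest, rating in scores.items():
            totals[dest] = totals.get(dest, 0.0) + rating
            counts[dest] = counts.get(dest, 0) + 1
    averaged = {d: totals[d] / counts[d] for d in totals}
    # Highest average predicted rating first
    return sorted(averaged, key=averaged.get, reverse=True)[:top_n]

users = {
    "alice": {"Rome": 4.5, "Oslo": 3.0, "Lisbon": 4.0},
    "bob":   {"Rome": 3.5, "Oslo": 4.5, "Lisbon": 4.5},
}
print(group_recommendations(users, top_n=2))  # ['Lisbon', 'Rome']
```

Averaging treats all members equally; least-misery or weighted schemes are common alternatives when some members' preferences should dominate.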
Approximate Bayesian Computation in State Space Models
A new approach to inference in state space models is proposed, based on approximate Bayesian computation (ABC). ABC avoids evaluation of the likelihood function by matching observed summary statistics with statistics computed from data simulated from the true process; exact inference is feasible only if the statistics are sufficient. With finite-sample sufficiency unattainable in the state space setting, we seek asymptotic sufficiency via the maximum likelihood estimator (MLE) of the parameters of an auxiliary model. We prove that this auxiliary model-based approach achieves Bayesian consistency, and that, in a precise limiting sense, the proximity to (asymptotic) sufficiency yielded by the MLE is replicated by the score. In multiple-parameter settings, a separate treatment of scalar parameters, based on integrated likelihood techniques, is advocated as a way of avoiding the curse of dimensionality. Some attention is given to a structure in which the state variable is driven by a continuous-time process, with exact inference typically infeasible in this case as a result of intractable transitions. The ABC method is demonstrated using the unscented Kalman filter as a fast and simple way of producing an approximation in this setting, with a stochastic volatility model for financial returns used for illustration.
Estimating the uncertainty of areal precipitation using data assimilation
We present a method to estimate spatially and temporally variable uncertainty of areal precipitation data. The aim of the method is to merge measurements from different sources, remote sensing and in situ, into a combined precipitation product and to provide an associated dynamic uncertainty estimate. This estimate should provide an accurate representation of uncertainty both in time and space, an adjustment to additional observations merged into the product through data assimilation, and flow dependency. Such a detailed uncertainty description is important, for example, to generate precipitation ensembles for probabilistic hydrological modelling or to specify accurate error covariances when using precipitation observations for data assimilation into numerical weather prediction models. The presented method uses the Local Ensemble Transform Kalman Filter and an ensemble nowcasting model. The model provides information about the precipitation displacement over time and is continuously updated by assimilation of observations. In this way, the precipitation product and its uncertainty estimate provided by the nowcasting ensemble evolve consistently in time and become flow-dependent. The method is evaluated in a proof-of-concept study focusing on weather radar data of four precipitation events. The study demonstrates that the dynamic areal uncertainty estimate outperforms a constant benchmark uncertainty value in all cases for one of the evaluated scores, and in half of the cases for the other score. Thus, the flow dependency introduced by the coupling of data assimilation and nowcasting enables a more accurate spatial and temporal distribution of uncertainty. The mixed results achieved in the second score point out the importance of a good probabilistic nowcasting scheme for the performance of the method.
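The core mechanism, using the ensemble spread as a dynamic, observation-adjusted uncertainty estimate, can be sketched in a toy setting. This is not the paper's LETKF/nowcasting system: it uses a plain stochastic (perturbed-observation) ensemble Kalman update on a synthetic 1-D field, and all values are invented for illustration.

```python
# Toy sketch (not the paper's LETKF + nowcasting scheme): one stochastic
# ensemble Kalman update of a synthetic 1-D precipitation field. The ensemble
# standard deviation serves as the dynamic uncertainty estimate; assimilating
# a single gauge observation shrinks the spread at that grid point.
import numpy as np

rng = np.random.default_rng(0)
n_grid, n_ens = 20, 50
ensemble = 5.0 + rng.normal(0.0, 2.0, size=(n_ens, n_grid))  # mm/h forecasts

obs_idx, obs_val, obs_err = 10, 6.0, 0.5   # one rain-gauge observation

Hx = ensemble[:, obs_idx]                  # forecast values at the gauge
x_mean = ensemble.mean(axis=0)
cov_xy = ((ensemble - x_mean).T @ (Hx - Hx.mean())) / (n_ens - 1)
var_y = Hx.var(ddof=1)
K = cov_xy / (var_y + obs_err**2)          # Kalman gain (one value per cell)

# Perturbed-observation (stochastic EnKF) update of every member
perturbed = obs_val + rng.normal(0.0, obs_err, size=n_ens)
ensemble += np.outer(perturbed - Hx, K)

spread = ensemble.std(axis=0, ddof=1)      # dynamic uncertainty estimate
print(spread[obs_idx] < spread[0])         # spread reduced at the gauge
```

In the paper's setting the forecast step is a nowcasting model that displaces precipitation in time, which is what makes the resulting spread flow-dependent rather than static.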
Fast Detection of Curved Edges at Low SNR
Detecting edges is a fundamental problem in computer vision with many applications, some involving very noisy images. While most edge detection methods are fast, they perform well only on relatively clean images. Indeed, edges in such images can be reliably detected using only local filters. Detecting faint edges under high levels of noise cannot be done locally at the individual pixel level, and requires more sophisticated global processing. Unfortunately, existing methods that achieve this goal are quite slow. In this paper we develop a novel multiscale method to detect curved edges in noisy images. While our algorithm searches for edges over a huge set of candidate curves, it does so in a practical runtime, nearly linear in the total number of image pixels. As we demonstrate experimentally, our algorithm is orders of magnitude faster than previous methods designed to deal with high noise levels. Nevertheless, it obtains comparable, if not better, edge detection quality on a variety of challenging noisy images.
Comment: 9 pages, 11 figures
Auxiliary Likelihood-Based Approximate Bayesian Computation in State Space Models
A computationally simple approach to inference in state space models is proposed, using approximate Bayesian computation (ABC). ABC avoids evaluation of an intractable likelihood by matching summary statistics for the observed data with statistics computed from data simulated from the true process, based on parameter draws from the prior. Draws that produce a 'match' between observed and simulated summaries are retained, and used to estimate the inaccessible posterior. With no reduction to a low-dimensional set of sufficient statistics being possible in the state space setting, we define the summaries as the maximum of an auxiliary likelihood function, and thereby exploit the asymptotic sufficiency of this estimator for the auxiliary parameter vector. We derive conditions under which this approach, including a computationally efficient version based on the auxiliary score, achieves Bayesian consistency. To reduce the well-documented inaccuracy of ABC in multi-parameter settings, we propose the separate treatment of each parameter dimension using an integrated likelihood technique. Three stochastic volatility models for which exact Bayesian inference is either computationally challenging, or infeasible, are used for illustration. We demonstrate that our approach compares favorably against an extensive set of approximate and exact comparators. An empirical illustration completes the paper.
Comment: This paper is forthcoming at the Journal of Computational and Graphical Statistics. It also supersedes the earlier arXiv paper "Approximate Bayesian Computation in State Space Models" (arXiv:1409.8363).
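The ABC rejection scheme both of the abstracts above rely on (prior draws, simulation, summary matching, retention) can be sketched in a deliberately simple setting. The model here is a plain Gaussian with unknown mean and the summary is the sample mean, standing in for the auxiliary-model MLE the papers advocate for state space models; tolerance and prior range are invented for illustration.

```python
# Generic ABC rejection sketch: draw parameters from the prior, simulate
# data, keep draws whose summary statistic is close to the observed summary.
# Here the summary (sample mean) is actually sufficient; in the state space
# setting the papers replace it with an auxiliary-model MLE or score.
import numpy as np

rng = np.random.default_rng(2)
T = 200
theta_true = 1.5
y_obs = rng.normal(theta_true, 1.0, size=T)
s_obs = y_obs.mean()                              # observed summary

n_draws, tol = 20000, 0.05
theta_prior = rng.uniform(-5.0, 5.0, size=n_draws)          # prior draws
y_sim = rng.normal(theta_prior[:, None], 1.0, size=(n_draws, T))
s_sim = y_sim.mean(axis=1)                        # simulated summaries

accepted = theta_prior[np.abs(s_sim - s_obs) < tol]         # retained draws
print(len(accepted), accepted.mean())             # approximate posterior
```

The accepted draws approximate the posterior increasingly well as the tolerance shrinks; the papers' contribution is choosing summaries that remain informative (asymptotically sufficient) when no sufficient statistics exist.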
Extending the square root method to account for additive forecast noise in ensemble methods
A square root approach is considered for the problem of accounting for model noise in the forecast step of the ensemble Kalman filter (EnKF) and related algorithms. The primary aim is to replace the method of simulated, pseudo-random additive noise so as to eliminate the associated sampling errors. The core method is based on the analysis step of ensemble square root filters, and consists in the deterministic computation of a transform matrix. The theoretical advantages regarding dynamical consistency are surveyed, applying equally well to the square root method in the analysis step. A fundamental problem due to the limited size of the ensemble subspace is discussed, and novel solutions that complement the core method are suggested and studied. Benchmarks from twin experiments with simple, low-order dynamics indicate improved performance over standard approaches such as additive, simulated noise and multiplicative inflation.
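A simplified version of the deterministic transform idea can be sketched as follows. This is a hedged illustration, not the paper's full method: it right-multiplies the forecast anomalies by a symmetric square-root transform so that the ensemble covariance gains the model-noise covariance Q without sampling noise, and it ignores the mean-preservation and subspace refinements the paper develops.

```python
# Simplified sketch of a deterministic square-root treatment of additive
# model noise: instead of adding simulated noise to each member, transform
# the anomalies A by T = (I + (N-1) A^+ Q (A^+)^T)^{1/2} so the ensemble
# covariance becomes A A^T/(N-1) + Q (within the ensemble subspace).
# Caveat: this basic form does not address mean preservation or subspace
# deficiency, which the full method handles separately.
import numpy as np

rng = np.random.default_rng(3)
n, N = 4, 10                                  # state dim, ensemble size
E = rng.normal(size=(n, N))                   # forecast ensemble (columns)
Q = 0.2 * np.eye(n)                           # model-noise covariance

x_bar = E.mean(axis=1, keepdims=True)
A = E - x_bar                                 # anomaly matrix
A_pinv = np.linalg.pinv(A)

M = np.eye(N) + (N - 1) * A_pinv @ Q @ A_pinv.T
w, V = np.linalg.eigh(M)                      # M is symmetric positive definite
T = V @ np.diag(np.sqrt(w)) @ V.T             # symmetric square root

A_new = A @ T
cov_old = A @ A.T / (N - 1)
cov_new = A_new @ A_new.T / (N - 1)
# Here rank(A) = n, so the projection is the identity and cov_new = cov_old + Q
print(np.allclose(cov_new, cov_old + Q))
```

Because the transform is deterministic, repeated forecasts add exactly the intended covariance, avoiding the sampling error that pseudo-random additive noise introduces.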