Perseus: Randomized Point-based Value Iteration for POMDPs
Partially observable Markov decision processes (POMDPs) form an attractive
and principled framework for agent planning under uncertainty. Point-based
approximate techniques for POMDPs compute a policy based on a finite set of
points collected in advance from the agent's belief space. We present a
randomized point-based value iteration algorithm called Perseus. The algorithm
performs approximate value backup stages, ensuring that in each backup stage
the value of each point in the belief set is improved; the key observation is
that a single backup may improve the value of many belief points. Unlike
other point-based methods, Perseus backs up only a (randomly selected) subset
of points in the belief set, sufficient for improving the value of each belief
point in the set. We show how the same idea can be extended to dealing with
continuous action spaces. Experimental results show the potential of Perseus in
large-scale POMDP problems.
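To make the backup stage concrete, here is a minimal Python sketch of one Perseus stage, assuming a discrete POMDP supplied as numpy arrays T[a, s, s'] = P(s'|s,a), O[a, s', z] = P(z|a,s') and R[a, s]; all function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def point_backup(b, V, T, O, R, gamma):
    """Standard point-based backup at belief b; V is a list of alpha-vectors."""
    best_alpha, best_val = None, -np.inf
    for a in range(R.shape[0]):
        g = R[a].astype(float)
        for z in range(O.shape[2]):
            # g_az[i](s) = sum_{s'} O(z|a,s') T(s'|s,a) alpha_i(s')
            g_az = np.array([T[a] @ (O[a, :, z] * alpha) for alpha in V])
            g = g + gamma * g_az[np.argmax(g_az @ b)]
        if g @ b > best_val:
            best_alpha, best_val = g, g @ b
    return best_alpha

def perseus_stage(B, V, T, O, R, gamma):
    """One Perseus value-update stage over the belief set B."""
    values = np.array([max(alpha @ b for alpha in V) for b in B])
    V1, todo = [], list(range(len(B)))
    while todo:
        i = todo[np.random.randint(len(todo))]        # pick a belief at random
        alpha = point_backup(B[i], V, T, O, R, gamma)
        if alpha @ B[i] < values[i]:
            # backup did not improve this point: keep its old maximizing vector
            alpha = max(V, key=lambda a: a @ B[i])
        V1.append(alpha)
        # one backup may improve many points; keep only still-unimproved ones
        todo = [j for j in todo if max(a @ B[j] for a in V1) < values[j]]
    return V1
```

The loop ends once every belief in B has a value no worse than before, which is the per-stage improvement guarantee described above; in practice only a small random subset of B is actually backed up.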
Gaussian Mixture Model of Heart Rate Variability
Heart rate variability (HRV) is an important measure of the sympathetic and parasympathetic functions of the autonomic nervous system and a key indicator of cardiovascular condition. This paper proposes a novel method to investigate HRV, namely modelling it as a linear combination of Gaussians. Results show that three Gaussians are enough to describe the stationary statistics of heart rate variability and to provide a straightforward interpretation of the HRV power spectrum. Comparisons have also been made with synthetic data generated from different physiologically based models, showing the plausibility of the Gaussian mixture parameters.
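As a rough sketch of the modelling step, the following fits a three-component mixture with scikit-learn; the RR-interval file, its units, and the plain 1-D mixture fit are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rr_intervals = np.loadtxt("rr_intervals.txt")   # hypothetical data: seconds between beats
X = rr_intervals.reshape(-1, 1)                 # sklearn expects a 2-D array

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(X)

# Each component is summarized by a weight, mean, and variance; under the
# paper's interpretation these relate to bands of the HRV power spectrum.
for w, mu, var in zip(gmm.weights_, gmm.means_.ravel(), gmm.covariances_.ravel()):
    print(f"weight={w:.3f}  mean={mu:.3f} s  std={np.sqrt(var):.3f} s")
```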
Local Dimensionality Reduction for Non-Parametric Regression
Locally-weighted regression is a computationally-efficient technique for
non-linear regression. However, for high-dimensional data, this technique becomes numerically
brittle and computationally too expensive if many local models need to be maintained
simultaneously. Thus, local linear dimensionality reduction combined with locally-weighted
regression seems to be a promising solution. In this context, we review linear dimensionality-reduction
methods, compare their performance on non-parametric locally-linear regression,
and discuss their ability to extend to incremental learning. The considered methods belong to
the following three groups: (1) reducing dimensionality only on the input data, (2) modeling
the joint input-output data distribution, and (3) optimizing the correlation between projection
directions and output data. Group 1 contains principal component regression (PCR);
group 2 contains principal component analysis (PCA) in joint input and output space, factor
analysis, and probabilistic PCA; and group 3 contains reduced rank regression (RRR) and
partial least squares (PLS) regression. Among the tested methods, only group 3 managed
to achieve robust performance even for a non-optimal number of components (factors or
projection directions). In contrast, groups 1 and 2 failed when too few components were
used, since these methods rely on a correct estimate of the true intrinsic dimensionality. In group 3, PLS is
the only method for which a computationally-efficient incremental implementation exists
- …
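Since PLS is the method the review singles out, here is a minimal batch PLS1 (NIPALS-style) sketch in Python; it illustrates the projection-direction idea only, not the incremental variant the abstract refers to, and all names are illustrative.

```python
import numpy as np

def pls1_fit(X, y, k):
    """Fit univariate PLS with k projection directions; returns coefficients."""
    Xr, yr = X - X.mean(0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(k):
        w = Xr.T @ yr                      # direction maximizing covariance with y
        w = w / np.linalg.norm(w)
        t = Xr @ w                         # scores along this direction
        p = Xr.T @ t / (t @ t)             # input loadings
        c = (yr @ t) / (t @ t)             # output loading
        Xr = Xr - np.outer(t, p)           # deflate inputs
        yr = yr - c * t                    # deflate output
        W.append(w); P.append(p); q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    beta = W @ np.linalg.solve(P.T @ W, q) # coefficients in original input space
    return beta, X.mean(0), y.mean()

def pls1_predict(Xnew, beta, x_mean, y_mean):
    return (Xnew - x_mean) @ beta + y_mean
```

In a locally-weighted setting, a fit like this would be maintained per local model on weighted data; the sketch above omits the weighting for brevity.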