9,486 research outputs found
Variance-constrained H∞ filtering for a class of nonlinear time-varying systems with multiple missing measurements: The finite-horizon case
Copyright [2010] IEEE. This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of Brunel University's products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to [email protected].
By choosing to view this document, you agree to all provisions of the copyright laws protecting it.
This paper is concerned with the robust H∞ finite-horizon filtering problem for a class of uncertain nonlinear discrete time-varying stochastic systems with multiple missing measurements and error variance constraints. All the system parameters are time-varying, and the uncertainty enters into the state matrix. The measurement missing phenomenon occurs in a random way, and the missing probability for each sensor is governed by an individual random variable satisfying a certain probabilistic distribution on a given interval. The stochastic nonlinearities under consideration are described by statistical means and cover several classes of well-studied nonlinearities. Sufficient conditions are derived for a finite-horizon filter to satisfy both the estimation error variance constraints and the prescribed H∞ performance requirement. These conditions are expressed in terms of the feasibility of a series of recursive linear matrix inequalities (RLMIs). Simulation results demonstrate the effectiveness of the developed filter design scheme.
This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the U.K. by Grant GR/S27658/01, the Royal Society of the U.K., the National Natural Science Foundation of China by Grants 60825303 and 60834003, the National 973 Project of China by Grant 2009CB320600, the Fok Ying Tung Education Foundation by Grant
111064, the Youth Science Fund of Heilongjiang Province of China by Grant
QC2009C63, and by the Alexander von Humboldt Foundation of Germany
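The missing-measurement model in the abstract above can be illustrated with a toy simulation: each measurement arrives according to a Bernoulli variable, and a naive fixed-gain estimator corrects only when data arrives. This is a minimal sketch, not the paper's RLMI-based filter; the dynamics, gain, and noise levels are all illustrative assumptions.

```python
import random

# Scalar time-varying system x_{k+1} = a_k x_k + w_k with measurement
# y_k = gamma_k x_k + v_k, where gamma_k is Bernoulli(p_receive): the
# sensor output is missing (gamma_k = 0) with probability 1 - p_receive.
def empirical_error_variance(p_receive, seed=1, steps=200):
    rnd = random.Random(seed)
    x, xhat = 1.0, 0.0
    sq_errors = []
    for k in range(steps):
        a = 0.9 + 0.05 * (-1) ** k          # simple time-varying dynamics
        x = a * x + rnd.gauss(0, 0.1)       # state evolution
        gamma = 1 if rnd.random() < p_receive else 0
        y = gamma * x + rnd.gauss(0, 0.1)   # measurement, possibly missing
        xhat = a * xhat                     # time update (prediction)
        if gamma:                           # correct only when data arrives
            xhat += 0.6 * (y - xhat)        # fixed gain, illustrative only
        sq_errors.append((x - xhat) ** 2)
    return sum(sq_errors) / len(sq_errors)  # empirical error variance
```

Under the same noise realisation, reducing the arrival probability leaves the estimator uncorrected for longer stretches, so the empirical error variance grows; a variance-constrained design must guarantee a bound despite this.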
A spectral mean for point sampled closed curves
We propose a spectral mean for closed curves described by sample points on
their boundaries, subject to mis-alignment and noise. First, we ignore mis-alignment
and derive maximum likelihood estimators of the model and noise parameters in
the Fourier domain. We estimate the unknown curve by back-transformation and
derive the distribution of the integrated squared error. Then, we model
mis-alignment by means of a shifted parametric diffeomorphism and minimise a
suitable objective function simultaneously over the unknown curve and the
mis-alignment parameters. Finally, the method is illustrated on simulated data
as well as on photographs of Lake Tana taken by astronauts during a Shuttle
mission.
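The first step described above, estimating the curve by back-transforming low-frequency Fourier coefficients, can be sketched as follows. This is a crude low-pass stand-in, not the paper's maximum likelihood estimator, and the cutoff `n_freq` is an arbitrary illustrative choice.

```python
import numpy as np

def spectral_smooth(points, n_freq=5):
    """Estimate a closed curve from noisy samples by keeping only the
    low-frequency Fourier coefficients and back-transforming."""
    z = points[:, 0] + 1j * points[:, 1]            # curve as a complex signal
    coef = np.fft.fft(z)
    freqs = np.fft.fftfreq(len(z), d=1.0 / len(z))  # integer frequencies
    coef[np.abs(freqs) > n_freq] = 0.0              # discard high frequencies
    smooth = np.fft.ifft(coef)
    return np.column_stack([smooth.real, smooth.imag])

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
clean = np.column_stack([np.cos(t), np.sin(t)])     # unit circle
noisy = clean + rng.normal(0, 0.05, clean.shape)    # noisy sample points
estimate = spectral_smooth(noisy)
```

Because the noise is spread across all Fourier modes while a smooth closed curve concentrates in the lowest few, truncation removes most of the noise energy.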
Black-Litterman model with intuitionistic fuzzy posterior return
The main objective is to present a variant of the Black-Litterman model. We
consider the canonical case, in which the a priori return is determined by
means of the excess return of the CAPM market portfolio, derived using the
reverse optimization method. The a priori return is thus under risk, i.e.
uncertainty that can be quantified. On the other hand, an intensive discussion
shows that the experts' views are under Knightian uncertainty. For this
reason, we propose a variant of the Black-Litterman model in which the
experts' views are described as intuitionistic fuzzy numbers. The existence of
the posterior return is proved for this case. We show that the posterior
return is then an intuitionistic fuzzy probabilistic set.
Comment: SSRN Electronic Journal 201
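For context, the canonical (crisp) Black-Litterman posterior that the abstract generalises can be sketched as follows. The covariance, weights, and view numbers are invented for illustration, and the intuitionistic-fuzzy treatment of views is not reproduced here.

```python
import numpy as np

def black_litterman(Sigma, w_mkt, delta, tau, P, Q, Omega):
    """Posterior expected returns from CAPM-implied priors and crisp views."""
    pi = delta * Sigma @ w_mkt                   # reverse-optimised prior
    A = np.linalg.inv(tau * Sigma)               # prior precision
    B = P.T @ np.linalg.inv(Omega) @ P           # view precision
    rhs = A @ pi + P.T @ np.linalg.inv(Omega) @ Q
    return np.linalg.solve(A + B, rhs)           # precision-weighted blend

Sigma = np.diag([0.04, 0.09])     # asset return covariance (illustrative)
w_mkt = np.array([0.6, 0.4])      # market-cap weights
delta, tau = 2.5, 0.05            # risk aversion, prior scaling
P = np.array([[1.0, 0.0]])        # one absolute view on asset 0
Q = np.array([0.10])              # the view: asset 0 returns 10%
Omega = np.array([[0.01]])        # view uncertainty

mu_post = black_litterman(Sigma, w_mkt, delta, tau, P, Q, Omega)
```

With a diagonal covariance, the posterior for the viewed asset is a precision-weighted average of its implied prior (here 6%) and the view (10%), while the other asset keeps its prior; the paper's contribution is to let `Q` be intuitionistic fuzzy rather than crisp.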
Evolving Ensemble Fuzzy Classifier
The concept of ensemble learning offers a promising avenue in learning from
data streams under complex environments because it addresses the bias-variance
dilemma better than its single-model counterpart and features a
reconfigurable structure, which is well suited to the given context. While
various extensions of ensemble learning for mining non-stationary data streams
can be found in the literature, most of them are crafted with a static base
classifier and revisit preceding samples in a sliding window for the
retraining step. This makes them computationally prohibitive and not flexible
enough to cope with rapidly changing environments. Their complexity is often
demanding because they involve a large collection of offline classifiers,
owing to the absence of a structural complexity reduction mechanism and the
lack of an online feature selection mechanism. A novel evolving
ensemble classifier, namely Parsimonious Ensemble (pENsemble), is proposed in
this paper. pENsemble differs from existing architectures in that it
is built upon an evolving classifier from data streams, termed Parsimonious
Classifier (pClass). pENsemble is equipped with an ensemble pruning mechanism,
which estimates a localized generalization error of a base classifier. A
dynamic online feature selection scenario is integrated into the pENsemble.
This method allows for dynamic selection and deselection of input features on
the fly. pENsemble adopts a dynamic ensemble structure to output a final
classification decision where it features a novel drift detection scenario to
grow the ensemble structure. The efficacy of pENsemble has been demonstrated
through rigorous numerical studies with dynamic and evolving data streams,
where it delivers the most encouraging performance in attaining a tradeoff
between accuracy and complexity.
Comment: this paper has been published in IEEE Transactions on Fuzzy Systems
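The growing-and-pruning behaviour described above can be caricatured with a toy online ensemble. This is not pENsemble or pClass: the base learner is a plain online perceptron, and the window size, drift threshold, and member budget are invented thresholds for illustration only.

```python
import random

class OnlinePerceptron:
    """Minimal online linear classifier used as an illustrative base learner."""
    def __init__(self, dim):
        self.w = [0.0] * dim
        self.b = 0.0
    def predict(self, x):
        s = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
        return 1 if s >= 0 else 0
    def update(self, x, y):
        err = y - self.predict(x)
        if err:
            self.w = [wi + err * xi for wi, xi in zip(self.w, x)]
            self.b += err

class EvolvingEnsemble:
    """Toy evolving ensemble: grow on drift (high windowed error),
    prune when over budget. All thresholds are illustrative."""
    def __init__(self, dim, max_members=5, drift_threshold=0.4, window=50):
        self.dim, self.max_members = dim, max_members
        self.drift_threshold, self.window = drift_threshold, window
        self.members = [OnlinePerceptron(dim)]
        self.recent_errors = []
    def predict(self, x):
        votes = sum(m.predict(x) for m in self.members)
        return 1 if 2 * votes >= len(self.members) else 0   # majority vote
    def learn(self, x, y):
        self.recent_errors.append(int(self.predict(x) != y))
        if len(self.recent_errors) > self.window:
            self.recent_errors.pop(0)
        # drift detection (simplified): grow when windowed error is high
        if (len(self.recent_errors) == self.window
                and sum(self.recent_errors) / self.window > self.drift_threshold):
            self.members.append(OnlinePerceptron(self.dim))
            self.recent_errors.clear()
        # pruning (simplified): drop the oldest member when over budget
        if len(self.members) > self.max_members:
            self.members.pop(0)
        for m in self.members:
            m.update(x, y)

# usage sketch on a separable 2-D stream
rnd = random.Random(0)
ens = EvolvingEnsemble(dim=2)
for _ in range(600):
    x = (rnd.uniform(-1, 1), rnd.uniform(-1, 1))
    ens.learn(x, 1 if x[0] + x[1] > 0 else 0)
accuracy = sum(
    ens.predict(x) == (1 if x[0] + x[1] > 0 else 0)
    for x in [(rnd.uniform(-1, 1), rnd.uniform(-1, 1)) for _ in range(200)]
) / 200
```

The sketch prunes the oldest member rather than estimating a localized generalization error, and detects drift by a simple windowed error threshold; the paper's mechanisms for both are considerably more principled.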
The entropy of keys derived from laser speckle
Laser speckle has been proposed in a number of papers as a high-entropy
source of unpredictable bits for use in security applications. Bit strings
derived from speckle can be used for a variety of security purposes such as
identification, authentication, anti-counterfeiting, secure key storage, random
number generation and tamper protection. The choice of laser speckle as a
source of random keys is quite natural, given the chaotic properties of
speckle. However, this same chaotic behaviour also causes reproducibility
problems. Cryptographic protocols require either zero noise or very low noise
in their inputs; hence the issue of error rates is critical to applications of
laser speckle in cryptography. Most of the literature uses an error reduction
method based on Gabor filtering. Though the method is successful, it has not
been thoroughly analysed.
In this paper we present a statistical analysis of Gabor-filtered speckle
patterns. We introduce a model in which perturbations are described as random
phase changes in the source plane. Using this model we compute the second and
fourth order statistics of Gabor coefficients. We determine the mutual
information between perturbed and unperturbed Gabor coefficients and the bit
error rate in the derived bit string. The mutual information provides an
absolute upper bound on the number of secure bits that can be reproducibly
extracted from noisy measurements.
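The bit-derivation pipeline discussed above, Gabor filtering followed by binarisation, with perturbations modelled as random phase changes, can be sketched in one dimension. The "speckle" here is a crude random-phase stand-in rather than a physical speckle model, and all filter parameters are illustrative.

```python
import numpy as np

def gabor_bits(field, n_bits=64, sigma=8.0, freq=0.3):
    """Derive a bit string from a 1-D profile: one bit per position,
    the sign of the real part of the Gabor coefficient
    (Gaussian-windowed Fourier component)."""
    x = np.arange(-3 * int(sigma), 3 * int(sigma) + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2)) * np.exp(1j * 2 * np.pi * freq * x)
    coef = np.convolve(field, kernel, mode="same")
    centres = np.linspace(0, len(field) - 1, n_bits).astype(int)
    return (coef.real[centres] >= 0).astype(int)

rng = np.random.default_rng(0)
phase = rng.uniform(0, 2 * np.pi, 512)
field = np.cos(phase)                       # crude stand-in for a speckle profile
bits = gabor_bits(field)
# small perturbation, as in the paper's model: random phase changes
perturbed = np.cos(phase + rng.normal(0, 0.2, 512))
bit_error_rate = float(np.mean(bits != gabor_bits(perturbed)))
```

Because the Gaussian window averages over many samples, small phase perturbations rarely flip the sign of a coefficient, which is why Gabor filtering acts as an error reduction step; the residual bit error rate is what the paper's mutual-information analysis bounds.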