The Hough transform estimator
This article pursues a statistical study of the Hough transform, the
celebrated computer vision algorithm used to detect the presence of lines in a
noisy image. We first study asymptotic properties of the Hough transform
estimator, whose objective is to find the line that "best" fits a set of
planar points. In particular, we establish strong consistency and rates of
convergence, and characterize the limiting distribution of the Hough transform
estimator. While the convergence rates are seen to be slower than those found
in some standard regression methods, the Hough transform estimator is shown to
be more robust as measured by its breakdown point. We next study the Hough
transform in the context of the problem of detecting multiple lines. This is
addressed via the framework of excess mass functionals and modality testing.
Throughout, several numerical examples help illustrate various properties of
the estimator. Relations between the Hough transform and more mainstream
statistical paradigms and methods are discussed as well.

Comment: Published at http://dx.doi.org/10.1214/009053604000000760 in the
Annals of Statistics (http://www.imstat.org/aos/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
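As a concrete illustration of the voting scheme behind the Hough transform estimator, here is a minimal NumPy sketch: each point votes for every (theta, rho) cell of lines passing through it, and the most-voted cell is the estimate. The grid resolutions, the test line, and the outlier are illustrative choices, not taken from the paper.

```python
import numpy as np

def hough_line_estimate(points, n_theta=180, n_rho=100):
    """Estimate the best-fitting line through noisy planar points by voting
    in (theta, rho) space, where a line is x*cos(theta) + y*sin(theta) = rho."""
    points = np.asarray(points, dtype=float)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rho_max = np.abs(points).sum(axis=1).max() + 1e-9
    rho_edges = np.linspace(-rho_max, rho_max, n_rho + 1)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for i, th in enumerate(thetas):
        # Every point votes for each (theta, rho) cell consistent with it.
        rhos = points[:, 0] * np.cos(th) + points[:, 1] * np.sin(th)
        idx = np.clip(np.digitize(rhos, rho_edges) - 1, 0, n_rho - 1)
        np.add.at(acc[i], idx, 1)
    ti, ri = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[ti], 0.5 * (rho_edges[ri] + rho_edges[ri + 1])

# Points scattered around the horizontal line y = 2.03, plus one gross
# outlier: the outlier contributes at most one vote per cell, so the peak
# ignores it, illustrating the estimator's high breakdown point.
rng = np.random.default_rng(0)
xs = np.linspace(-1.0, 1.0, 50)
pts = np.column_stack([xs, 2.03 + 0.01 * rng.standard_normal(50)])
pts = np.vstack([pts, [0.0, 5.0]])  # gross outlier far from the line
theta, rho = hough_line_estimate(pts)
```

A horizontal line corresponds to theta near pi/2 with rho equal to its height, so the estimate recovers the line despite the outlier.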
Adaptive filtering techniques for gravitational wave interferometric data: Removing long-term sinusoidal disturbances and oscillatory transients
It is known from experience gained with gravitational wave detector
prototypes that the interferometric output signal will be corrupted by a
significant amount of non-Gaussian noise, a large part of it essentially
composed of long-term sinusoids with slowly varying envelope (such as violin
resonances in the suspensions, or main power harmonics) and short-term ringdown
noise (which may emanate from servo control systems, electronics in a
non-linear state, etc.). Since non-Gaussian noise components make the detection
and estimation of the gravitational wave signature more difficult, a denoising
algorithm based on adaptive filtering techniques (LMS methods) is proposed to
separate and extract them from the stationary and Gaussian background noise.
The strength of the method is that it does not require a precise model of the
observed data: the signals are distinguished on the basis of their
autocorrelation time. We believe that the robustness and simplicity of this
method make it useful for data preparation and for the understanding of the
first interferometric data. We present the detailed structure of the algorithm
and its application to both simulated data and real data from the LIGO
40-meter prototype.

Comment: 16 pages, 9 figures, submitted to Phys. Rev.
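A minimal sketch of the separation principle described above, assuming an adaptive line enhancer variant of LMS: the filter predicts the signal from a delayed copy of itself, so components whose autocorrelation time exceeds the delay (long-term sinusoids) are predictable and captured by the filter output, while broadband noise survives in the prediction error. Filter order, delay, and step size are illustrative choices, not the paper's settings.

```python
import numpy as np

def lms_line_enhancer(x, order=32, delay=16, mu=0.002):
    """Adaptive line enhancer: an LMS filter predicts x[n] from a delayed
    copy of x. Narrowband components remain correlated across the delay
    and appear in the output y; broadband noise ends up in the error e."""
    w = np.zeros(order)
    y = np.zeros_like(x)   # narrowband (sinusoidal) estimate
    e = np.zeros_like(x)   # broadband residual
    for n in range(delay + order, len(x)):
        u = x[n - delay:n - delay - order:-1]  # delayed tap vector
        y[n] = w @ u
        e[n] = x[n] - y[n]
        w += 2 * mu * e[n] * u                 # LMS weight update
    return y, e

# Long-term sinusoid buried in Gaussian background noise.
rng = np.random.default_rng(1)
t = np.arange(20000)
sine = np.sin(2 * np.pi * 0.05 * t)        # slowly decorrelating component
noise = 0.5 * rng.standard_normal(t.size)  # white Gaussian background
y, e = lms_line_enhancer(sine + noise)
```

After convergence, y tracks the sinusoid while the variance of e drops toward the background noise level, which is the sense in which the disturbance is "removed".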
Distributed Estimation and Control of Algebraic Connectivity over Random Graphs
In this paper we propose a distributed algorithm for the estimation and
control of the connectivity of ad-hoc networks in the presence of a random
topology. First, given a generic random graph, we introduce a novel stochastic
power iteration method that allows each node to estimate and track the
algebraic connectivity of the underlying expected graph. Using results from
stochastic approximation theory, we prove that the proposed method converges
almost surely (a.s.) to the desired connectivity value even under imperfect
communication. The estimation strategy is then used as a
basic tool to adapt the power transmitted by each node of a wireless network,
in order to maximize the network connectivity in the presence of realistic
Medium Access Control (MAC) protocols or simply to drive the connectivity
toward a desired target value. Numerical results corroborate our theoretical
findings, thus illustrating the main features of the algorithm and its
robustness to fluctuations of the network graph due to the presence of random
link failures.

Comment: To appear in IEEE Transactions on Signal Processing
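The quantity being tracked is the second-smallest eigenvalue of the graph Laplacian. Here is a centralized, deterministic sketch of the underlying power-iteration idea (the paper's stochastic, distributed version replaces the exact Laplacian with noisy local estimates); the spectral shift c and the iteration count are illustrative choices.

```python
import numpy as np

def algebraic_connectivity(adj, iters=3000, seed=2):
    """Estimate the algebraic connectivity lambda_2 (Fiedler value) of an
    undirected graph by power iteration on the shifted Laplacian c*I - L,
    deflated by the known all-ones eigenvector (eigenvalue 0 of L)."""
    adj = np.asarray(adj, dtype=float)
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    L = np.diag(deg) - adj                 # graph Laplacian
    c = 2.0 * deg.max()                    # shift so all eigs of c*I - L > 0
    ones = np.ones(n) / np.sqrt(n)
    v = np.random.default_rng(seed).standard_normal(n)
    for _ in range(iters):
        v = c * v - L @ v                  # power step with (c*I - L)
        v -= (v @ ones) * ones             # project out the trivial mode
        v /= np.linalg.norm(v)
    return v @ (L @ v)                     # Rayleigh quotient = lambda_2
```

On the space orthogonal to the all-ones vector, the dominant eigenvalue of c*I - L is c - lambda_2, so the iterate converges to the Fiedler vector and its Rayleigh quotient recovers the algebraic connectivity.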
The Rise of Complex Beliefs Dynamics
We prove that complex beliefs dynamics may emerge in linear stochastic models as the outcome of bounded rationality learning. If agents believe in a misspecified law of motion (which is correctly specified at the Rational Expectations Equilibria of the model) and update their beliefs by observing the evolving economy, their beliefs can converge in the limit to a beliefs cycle which is not a self-fulfilling solution of the model. The stochastic process induced by the learning rule is analyzed by means of an associated ordinary differential equation (ODE). The existence of a uniformly asymptotically stable attractor for the ODE implies the existence of a beliefs attractor to which the learning process converges. We prove almost sure convergence by assuming that agents employ a projection facility, and convergence with positive probability when this assumption is dropped. The rise of a limit cycle, and of even more complex attractors, is established in some monetary economics models under the assumption that agents update their beliefs with the Recursive Ordinary Least Squares and Least Mean Squares algorithms.
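A minimal sketch of the kind of learning rule analyzed, assuming agents fit a perceived AR(1) law of motion by recursive ordinary least squares written in stochastic-approximation form (the form whose mean dynamics give the associated ODE). The data-generating model and parameter values are illustrative, not the paper's monetary economy.

```python
import numpy as np

def rls_learning(x, beta0=0.0):
    """Recursive ordinary least squares: agents re-estimate the slope of a
    perceived law of motion x_t = beta * x_{t-1} + noise after each new
    observation, with a decreasing gain (the setting the ODE method covers)."""
    beta, R = beta0, 1.0        # current belief and second-moment estimate
    path = [beta]
    for t in range(1, len(x)):
        gain = 1.0 / (t + 1)    # decreasing gain -> ODE approximation
        R += gain * (x[t - 1] ** 2 - R)
        beta += gain * x[t - 1] * (x[t] - beta * x[t - 1]) / R
        path.append(beta)
    return np.array(path)

# Data generated by a true AR(1) with slope 0.6: here the beliefs converge
# toward a point attractor of the learning ODE (the self-fulfilling value);
# the paper studies settings where the attractor is a cycle instead.
rng = np.random.default_rng(3)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
beliefs = rls_learning(x)
```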
Convergence Rate Analysis of Distributed Gossip (Linear Parameter) Estimation: Fundamental Limits and Tradeoffs
The paper considers gossip-based distributed estimation of a (static)
distributed random field (a.k.a. a large-scale unknown parameter vector)
observed by sparsely interconnected sensors, each of which observes only a small fraction
of the field. We consider linear distributed estimators whose structure
combines the information flow among sensors (the consensus term resulting
from the local gossip exchange among sensors when they are able to
communicate) and the information gathering measured by the sensors (the
sensing or innovations term). This leads to mixed time-scale
algorithms: one time scale associated with the consensus and the other with
the innovations. The paper establishes a distributed observability condition
(global observability plus mean connectedness) under which the distributed
estimates are consistent and asymptotically normal. We introduce a
distributed notion equivalent to the (centralized) Fisher information rate,
which is a bound on the mean square error reduction rate of any distributed
estimator; we show that under the appropriate modeling and structural network
communication conditions (gossip protocol) the distributed gossip estimator
attains this distributed Fisher information rate, asymptotically achieving the
performance of the optimal centralized estimator. Finally, we study the
behavior of the distributed gossip estimator when the measurements fade (the
noise variance grows) with time; in particular, we characterize the maximum
rate at which the noise variance can grow while the distributed estimator
remains consistent, showing that, as long as the centralized estimator is
consistent, so is the distributed estimator.

Comment: Submitted for publication, 30 pages
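A minimal sketch of a consensus-plus-innovations update of the kind described, for a two-dimensional parameter where each sensor observes only one coordinate: no sensor is locally observable, but the network satisfies global observability over a connected graph. The graph, gains, and noise level are illustrative choices, not the paper's.

```python
import numpy as np

def consensus_innovations(theta, H, adj, T=20000, b=0.3, a0=1.0, seed=4):
    """Mixed time-scale distributed estimator: every sensor combines a
    consensus step (averaging with its neighbours) with an innovations
    step (its own new scalar measurement), with a decaying innovations gain."""
    rng = np.random.default_rng(seed)
    N, d = H.shape
    L = np.diag(adj.sum(axis=1)) - adj     # communication-graph Laplacian
    x = np.zeros((N, d))                   # one parameter estimate per sensor
    for t in range(T):
        y = H @ theta + 0.5 * rng.standard_normal(N)  # noisy local obs
        resid = y - np.einsum('nd,nd->n', H, x)       # local innovations
        x = x - b * (L @ x) + (a0 / (t + 1)) * resid[:, None] * H
    return x

# Sensors 0 and 2 observe the first coordinate, sensors 1 and 3 the second;
# the ring topology spreads each coordinate's information to every sensor.
theta = np.array([1.0, -2.0])
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
ring = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], float)
estimates = consensus_innovations(theta, H, ring)
```

With the consensus weight fixed and the innovations gain decaying, all four local estimates approach the full parameter vector, which is the consistency behavior the paper proves under global observability plus mean connectedness.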