650 research outputs found
A Randomized Greedy Algorithm for Near-Optimal Sensor Scheduling in Large-Scale Sensor Networks
We study the problem of scheduling sensors in a resource-constrained linear
dynamical system, where the objective is to select a small subset of sensors
from a large network to perform the state estimation task. We formulate this
problem as the maximization of a monotone set function under a matroid
constraint. We propose a randomized greedy algorithm that is significantly
faster than state-of-the-art methods. By introducing the notion of curvature,
which quantifies how close a function is to being submodular, we analyze the
performance of the proposed algorithm and find a bound on the expected mean
square error (MSE) of the estimator that uses the selected sensors in terms of
the optimal MSE. Moreover, we derive a probabilistic bound on the curvature for
the scenario where the measurements are i.i.d. random vectors with bounded
norm. Simulation results demonstrate the efficacy of the randomized greedy
algorithm in comparison with greedy and semidefinite programming relaxation
methods.
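The speedup described above comes from evaluating marginal gains only on a random sample of the remaining candidates each round, rather than on all of them. The sketch below illustrates that idea on a toy coverage objective; the `sample_size` parameter, the footprint data, and the coverage function are illustrative assumptions, not the paper's actual estimation-error objective or its analyzed algorithm.

```python
import random

def randomized_greedy(ground_set, f, k, sample_size, rng=random.Random(0)):
    """Select up to k elements to (approximately) maximize a monotone set
    function f. Each round evaluates marginal gains only on a random sample
    of the remaining candidates -- the source of the speedup over plain greedy.
    """
    selected = []
    remaining = list(ground_set)
    for _ in range(k):
        candidates = rng.sample(remaining, min(sample_size, len(remaining)))
        # Pick the sampled candidate with the largest marginal gain.
        best = max(candidates, key=lambda e: f(selected + [e]) - f(selected))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy monotone objective: coverage of (hypothetical) sensor footprints.
footprints = {0: {1, 2}, 1: {2, 3}, 2: {4}, 3: {1, 4, 5}, 4: {5, 6}}
coverage = lambda S: len(set().union(*(footprints[s] for s in S)))
picked = randomized_greedy(footprints.keys(), coverage, k=2, sample_size=3)
```

Because only `sample_size` candidates are scored per round, the number of function evaluations drops from O(nk) to O(k · sample_size), which is where the claimed speedup over plain greedy comes from.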
Real-time futures graph tracking visualization and analysis tool
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2011. Cataloged from PDF version of thesis. Includes bibliographical references (p. 34-35).

A hybrid envisionment is a novel representation of a simulated state graph, specifying all possible states and transitions of the system, characterized by both qualitative and quantitative state variables. The Deep Green project creates a hybrid envisionment, called a futures graph, to depict all possible occurrences and outcomes of a combat engagement between friendly and enemy units on a battlefield. During combat, AI state estimation techniques are used to efficiently track the state of the battle in a futures graph, giving the commander an up-to-date analysis of what is taking place on the battlefield and how the battle could turn out. Because state estimation of highly complex hybrid envisionments is a relatively unexplored and novel process, it is important to ensure that it is handled efficiently and accurately enough for use in the field. This paper explores an approach for discerning the behavior of state estimation through the use of an analysis suite. By accompanying Deep Green state estimation with the analysis suite developed, estimation techniques could be benchmarked and analyzed across implementations through both numerical and graphical metrics. The metrics generated greatly helped to improve the estimation algorithm over the course of its development.

by Joseph M. Fahey. M.Eng.
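Tracking a state in a graph of possible states and transitions, as the abstract describes, is commonly done with a discrete Bayes-filter update: propagate belief along transition edges, then reweight by how well each state explains the latest observation. The sketch below is a generic version of that idea; the state names, transition probabilities, and likelihoods are invented for illustration and are not Deep Green's actual estimator.

```python
def forward_step(belief, transitions, likelihood):
    """One discrete Bayes-filter update over a state graph: propagate the
    belief along transition probabilities, then weight each state by the
    likelihood of the latest observation and renormalize."""
    # Predict: push probability mass along outgoing edges.
    predicted = {s: 0.0 for s in transitions}
    for s, prob in belief.items():
        for nxt, p in transitions[s].items():
            predicted[nxt] += prob * p
    # Update: reweight by observation likelihood and renormalize.
    weighted = {s: predicted[s] * likelihood.get(s, 0.0) for s in predicted}
    z = sum(weighted.values()) or 1.0
    return {s: w / z for s, w in weighted.items()}

# Three-state toy graph (hypothetical): advance, hold, retreat.
T = {"advance": {"advance": 0.7, "hold": 0.3, "retreat": 0.0},
     "hold":    {"advance": 0.2, "hold": 0.5, "retreat": 0.3},
     "retreat": {"advance": 0.0, "hold": 0.2, "retreat": 0.8}}
belief = forward_step({"advance": 1.0, "hold": 0.0, "retreat": 0.0},
                      T, {"advance": 0.6, "hold": 0.3, "retreat": 0.1})
```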
Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions
To minimize the average of a set of log-convex functions, the stochastic
Newton method iteratively updates its estimate using subsampled versions of the
full objective's gradient and Hessian. We contextualize this optimization
problem as sequential Bayesian inference on a latent state-space model with a
discriminatively-specified observation process. Applying Bayesian filtering
then yields a novel optimization algorithm that considers the entire history of
gradients and Hessians when forming an update. We establish matrix-based
conditions under which the effect of older observations diminishes over time,
in a manner analogous to Polyak's heavy ball momentum. We illustrate various
aspects of our approach with an example and review other relevant innovations
for the stochastic Newton method.
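The baseline the abstract builds on, the plain stochastic Newton update from subsampled gradients and Hessians, can be sketched in one dimension as follows. The quadratic test functions, the 1/t damping, and all parameter values are illustrative assumptions; the paper's contribution is a filtered variant that weighs the entire history of gradients and Hessians, which this sketch does not implement.

```python
import random

def stochastic_newton_1d(grads, hessians, x0, steps, batch, rng=random.Random(1)):
    """Minimize the average of smooth convex 1-D functions with stochastic
    Newton: each step uses a subsampled gradient and Hessian of the full
    objective, damped here by a 1/t step size for stability."""
    x = x0
    n = len(grads)
    for t in range(1, steps + 1):
        idx = rng.sample(range(n), batch)
        g = sum(grads[i](x) for i in idx) / batch   # subsampled gradient
        h = sum(hessians[i](x) for i in idx) / batch  # subsampled Hessian
        x -= (1.0 / t) * g / h                      # damped Newton step
    return x

# Example: f_i(x) = (x - c_i)^2, so the average is minimized at mean(c) = 2.0.
c = [0.5, 1.5, 2.0, 4.0]
grads = [lambda x, ci=ci: 2.0 * (x - ci) for ci in c]
hessians = [lambda x: 2.0 for _ in c]
x_star = stochastic_newton_1d(grads, hessians, x0=0.0, steps=400, batch=2)
```

With the 1/t damping, the iterate is a running average of the per-batch Newton solutions, so the noise from subsampling averages out over the run.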
Scribbles to vectors : preparation of scribble drawings for CAD interpretation
This paper describes the work carried out on off-line, paper-based scribbles such that they can be incorporated into a sketch-based interface without forcing designers to change their natural drawing habits. In this work, the scribbled drawings are converted into a vectorial format which can be recognized by a CAD system. This is achieved by using pattern analysis techniques, namely the Gabor filter, to simplify the scribbled drawing. Vector lines are then extracted from the resulting drawing by means of Kalman filtering. Peer-reviewed.
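The Kalman-filtering stage mentioned above can be illustrated with a scalar filter tracking a line's position from one image column to the next. This is a toy stand-in for the paper's line-extraction step; the constant-position model and the noise parameters `q` and `r` are assumptions, not values from the paper.

```python
def kalman_track(observations, q=0.01, r=0.5):
    """Scalar Kalman filter tracking a line's y-position column by column,
    under a constant-position model with process noise q and measurement
    noise r. Noisy scribble measurements are smoothed toward the track."""
    x, p = observations[0], 1.0   # initial state estimate and variance
    track = [x]
    for z in observations[1:]:
        p += q                    # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update toward the new measurement
        p *= (1.0 - k)            # shrink variance after the update
        track.append(x)
    return track

noisy = [2.0, 2.3, 1.8, 2.1, 5.0, 2.0, 1.9]   # 5.0 mimics a scribble outlier
smooth = kalman_track(noisy)
```

A small `q` relative to `r` tells the filter to trust the model over the measurements, which is what damps the spurious jump at the outlier sample.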
Sketch-based skeleton-driven 2D animation and motion capture.
This research is concerned with the development of a set of novel sketch-based skeleton-driven 2D animation techniques, which allow the user to produce realistic 2D character animation efficiently. The technique consists of three parts: sketch-based skeleton-driven 2D animation production, 2D motion capture, and a cartoon animation filter. For 2D animation production, the traditional way is drawing the key-frames manually by experienced animators. It is a laborious and time-consuming process. With the proposed techniques, the user only inputs one image of a character and sketches a skeleton for each subsequent key-frame. The system then deforms the character according to the sketches and produces animation automatically. To perform 2D shape deformation, a variable-length needle model is developed, which divides the deformation into two stages: skeleton-driven deformation and nonlinear deformation in joint areas. This approach preserves the local geometric features and global area during animation. Compared with existing 2D shape deformation algorithms, it reduces the computational complexity while still yielding plausible deformation results. To capture the motion of a character from existing 2D image sequences, a 2D motion capture technique is presented. Since this technique is skeleton-driven, the motion of a 2D character is captured by tracking the joint positions. Using both geometric and visual features, this problem can be solved by optimization, which prevents self-occlusion and feature disappearance. After tracking, the motion data are retargeted to a new character using the deformation algorithm proposed in the first part. This facilitates the reuse of the characteristics of motion contained in existing moving images, making the process of cartoon generation easy for artists and novices alike. Subsequent to the 2D animation production and motion capture, a "Cartoon Animation Filter" is implemented and applied. Following the animation principles, this filter processes two types of cartoon input: a single frame of a cartoon character and motion capture data from an image sequence. It adds anticipation and follow-through to the motion with related squash and stretch effects.
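The anticipation and follow-through effect described above is commonly produced by subtracting a second difference of the motion signal from the signal itself, which makes a trajectory dip before it starts moving and overshoot when it stops. The 1-D sketch below shows that idea; the unsmoothed second difference and the `strength` parameter are simplifying assumptions, not the thesis's exact filter.

```python
def cartoon_filter(signal, strength=1.0):
    """Toy 1-D 'cartoon animation filter': subtract the discrete second
    difference of a motion signal from the signal, exaggerating starts
    (anticipation) and stops (follow-through / overshoot)."""
    n = len(signal)
    out = list(signal)
    for i in range(1, n - 1):
        second = signal[i - 1] - 2 * signal[i] + signal[i + 1]
        out[i] = signal[i] - strength * second
    return out

# A start-and-stop joint trajectory: the filtered version dips before the
# motion begins and overshoots past the stopping value.
motion = [0, 0, 0, 1, 2, 3, 3, 3]
exaggerated = cartoon_filter(motion)
# exaggerated -> [0, 0, -1.0, 1.0, 2.0, 4.0, 3.0, 3]
```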