Stable Nonlinear Identification From Noisy Repeated Experiments via Convex Optimization
This paper introduces new techniques for using convex optimization to fit
input-output data to a class of stable nonlinear dynamical models. We present
an algorithm that guarantees consistent estimates of models in this class when
a small set of repeated experiments with suitably independent measurement noise
is available. Stability of the estimated models is guaranteed without any
assumptions on the input-output data. We first present a convex optimization
scheme for identifying stable state-space models from empirical moments. Next,
we provide a method for using repeated experiments to remove the effect of
noise on these moment and model estimates. The technique is demonstrated on a
simple simulated example.
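The key idea in this abstract — that repeated experiments with suitably independent measurement noise can remove noise bias from empirical moments — can be illustrated with a minimal sketch. The paper's actual estimator and model class are not reproduced here; this only shows why a cross-moment between two independently-noisy repeats is far less biased than a moment computed from a single noisy record (all variable names and the noise level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: an underlying trajectory y observed in two repeated
# experiments, each corrupted by independent zero-mean measurement noise.
n, T = 3, 20000
y = rng.standard_normal((n, T))          # underlying trajectory
v1 = 0.5 * rng.standard_normal((n, T))   # noise, experiment 1
v2 = 0.5 * rng.standard_normal((n, T))   # noise, experiment 2 (independent)
z1, z2 = y + v1, y + v2

M_true = (y @ y.T) / T      # noise-free empirical second moment
M_single = (z1 @ z1.T) / T  # single experiment: noise variance biases the diagonal
M_cross = (z1 @ z2.T) / T   # cross-moment: independent noise terms average out

bias_single = np.linalg.norm(M_single - M_true)
bias_cross = np.linalg.norm(M_cross - M_true)
print(bias_single, bias_cross)  # the cross-moment bias is much smaller
```

The cross terms y·v2ᵀ, v1·yᵀ, and v1·v2ᵀ all have zero mean, so their averages vanish as T grows, while the single-experiment estimate retains the full noise covariance.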
Nonlinear system modeling based on constrained Volterra series estimates
A simple nonlinear system modeling algorithm, designed to work with limited
\emph{a priori} knowledge and short data records, is examined. It creates an
empirical Volterra series-based model of a system using an -constrained
least squares algorithm with . If the system
is a continuous and bounded map with a finite memory no longer than some known
, then (for a parameter model and for a number of measurements )
the difference between the resulting model of the system and the best possible
theoretical one is guaranteed to be of order , even for
. The performance of models obtained for and is tested
on the Wiener-Hammerstein benchmark system. The results suggest that the models
obtained for are better suited to characterize the nature of the system,
while the sparse solutions obtained for yield smaller error values in
terms of input-output behavior.
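The exact constraint norm is elided in this extract, so the following sketch assumes an $\ell_1$-constrained (lasso-type) least-squares fit, chosen because it produces the sparse solutions the abstract mentions. The second-order Volterra regressor construction, the memory length, and the ISTA solver are all illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Build second-order Volterra regressors: lagged inputs u[t-i] and
# products u[t-i]*u[t-j] over a memory of length m (an assumed model form).
def volterra_regressors(u, m):
    rows = []
    for t in range(m, len(u)):
        lags = u[t - m:t][::-1]                       # u[t-1], ..., u[t-m]
        quad = [lags[i] * lags[j] for i in range(m) for j in range(i, m)]
        rows.append(np.concatenate([lags, quad]))
    return np.array(rows)

N, m = 400, 3
u = rng.standard_normal(N)
X = volterra_regressors(u, m)

# Sparse ground truth: one linear kernel tap and one quadratic tap.
w_true = np.zeros(X.shape[1])
w_true[0], w_true[m] = 1.0, 0.5
y = X @ w_true + 0.01 * rng.standard_normal(X.shape[0])

# ISTA for l1-penalised least squares: gradient step on 0.5||y - Xw||^2,
# then soft-thresholding (the proximal operator of the l1 penalty).
lam = 0.05
L = np.linalg.norm(X, 2) ** 2            # Lipschitz constant of the gradient
w = np.zeros(X.shape[1])
for _ in range(500):
    z = w - (X.T @ (X @ w - y)) / L
    w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

print(w[0], w[m])  # should be close to the true taps 1.0 and 0.5
```

With a short memory the regressor count stays small, which is consistent with the abstract's emphasis on short data records; for larger memories the sparse penalty is what keeps the estimate well-behaved.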
Proof of Convergence and Performance Analysis for Sparse Recovery via Zero-point Attracting Projection
A recursive algorithm named Zero-point Attracting Projection (ZAP) was
recently proposed for sparse signal reconstruction. Compared with reference
algorithms, ZAP demonstrates good performance in recovery precision and
robustness. However, no theoretical analysis of the algorithm, not even a
proof of its convergence, has been available. In this work, a rigorous proof
of the convergence of ZAP is provided, and a condition for convergence is put
forward. Based on the theoretical analysis, it is further proved that ZAP is
unbiased and can approach the sparse solution arbitrarily closely with a
proper choice of step size. Furthermore, the case of inaccurate measurements
in the noisy scenario is also discussed. It is proved that the disturbance
power linearly reduces the recovery precision, which is predictable but not
preventable. The reconstruction deviation of -compressible signals is also
provided. Finally, numerical simulations are performed to verify the
theoretical analysis.
Comment: 29 pages, 6 figures
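The abstract does not state the update equations, so the following is a minimal sketch of a ZAP-style iteration under a common reading of the method: a small zero-attracting step driven by sign(x) (a subgradient of the $\ell_1$ norm), kept feasible by restricting the step to the null space of the measurement matrix so that Ax = y holds throughout. The problem sizes and the constant step size kappa are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Underdetermined measurements y = A x of a k-sparse signal.
m, n, k = 25, 60, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

P = np.eye(n) - np.linalg.pinv(A) @ A   # projector onto the null space of A
x0 = np.linalg.pinv(A) @ y              # minimum-l2-norm feasible start
x = x0.copy()
kappa = 2e-3                            # step size (illustrative)
for _ in range(5000):
    # Zero-attracting step, projected so the iterate stays on {x : Ax = y}.
    x = x - kappa * (P @ np.sign(x))

print(np.linalg.norm(x0, 1), np.linalg.norm(x, 1))  # l1 norm shrinks
```

Viewed this way, the iteration is a projected subgradient method for minimising the $\ell_1$ norm subject to the measurement constraint, which is consistent with the abstract's claim that a proper step-size choice drives the iterate arbitrarily close to the sparse solution.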
Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST'14)
The implicit objective of the biennial "international Traveling Workshop on
Interactions between Sparse models and Technology" (iTWIST) is to foster
collaboration between international scientific teams by disseminating ideas
through both specific oral/poster presentations and free discussions. For its
second edition, the iTWIST workshop took place in the medieval and picturesque
town of Namur in Belgium, from Wednesday August 27th till Friday August 29th,
2014. The workshop was conveniently located in "The Arsenal" building within
walking distance of both hotels and town center. iTWIST'14 gathered about
70 international participants and featured 9 invited talks, 10 oral
presentations, and 14 posters on the following themes, all related to the
theory, application and generalization of the "sparsity paradigm":
Sparsity-driven data sensing and processing; Union of low dimensional
subspaces; Beyond linear and convex inverse problem; Matrix/manifold/graph
sensing/processing; Blind inverse problems and dictionary learning; Sparsity
and computational neuroscience; Information theory, geometry and randomness;
Complexity/accuracy tradeoffs in numerical methods; Sparsity? What's next?;
Sparse machine learning and inference.
Comment: 69 pages, 24 extended abstracts, iTWIST'14 website:
http://sites.google.com/site/itwist1