The Statistical Performance of Collaborative Inference
The statistical analysis of massive and complex data sets will require the
development of algorithms that depend on distributed computing and
collaborative inference. Inspired by this, we propose a collaborative framework
that aims to estimate the unknown mean of a random variable. In
the model we present, a number of calculation units, distributed across
a communication network represented by a graph, participate in the estimation
of this mean by sequentially receiving independent observations of the variable while
exchanging messages via a stochastic matrix defined over the graph. We give
precise conditions on this matrix under which the statistical precision of
the individual units is comparable to that of a (gold-standard) virtual
centralized estimate, even though no single unit has access to all of the
data. We show in particular the fundamental role played by both the non-trivial
eigenvalues of the matrix and the Ramanujan class of expander graphs, which provide
remarkable performance at moderate algorithmic cost.
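The mechanism this abstract describes can be illustrated with a toy sketch (not the paper's algorithm): units on a ring graph repeatedly average their local estimates through a doubly stochastic matrix W, and every unit converges to the centralized mean even though no unit sees all the data. The graph, matrix entries, and sizes below are illustrative assumptions.

```python
import numpy as np

# Toy gossip averaging on a ring of n units. W is doubly stochastic over
# the communication graph; its non-trivial eigenvalues govern how fast
# each unit's local estimate approaches the centralized mean.
rng = np.random.default_rng(0)
n = 8                                  # number of calculation units
W = np.zeros((n, n))
for i in range(n):                     # ring graph: mix with both neighbours
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = rng.normal(loc=3.0, scale=1.0, size=n)   # one observation per unit
centralized = x.mean()                       # gold-standard estimate
local = x.copy()
for _ in range(200):                          # message exchanges via W
    local = W @ local

print(np.allclose(local, centralized))       # every unit agrees with the mean
```

The speed of that agreement is controlled by the second-largest eigenvalue modulus of W, which is why expander (Ramanujan) graphs, whose non-trivial eigenvalues are small, give good performance at low communication cost.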
Computational Complexity versus Statistical Performance on Sparse Recovery Problems
We show that several classical quantities controlling compressed sensing
performance directly match classical parameters controlling algorithmic
complexity. We first describe linearly convergent restart schemes on
first-order methods solving a broad range of compressed sensing problems, where
sharpness at the optimum controls convergence speed. We show that for sparse
recovery problems, this sharpness can be written as a condition number, given
by the ratio between true signal sparsity and the largest signal size that can
be recovered by the observation matrix. In a similar vein, Renegar's condition
number is a data-driven complexity measure for convex programs, generalizing
classical condition numbers for linear systems. We show that for a broad class
of compressed sensing problems, the worst case value of this algorithmic
complexity measure taken over all signals matches the restricted singular value
of the observation matrix which controls robust recovery performance. Overall,
this means in both cases that, in compressed sensing problems, a single
parameter directly controls both computational complexity and recovery
performance. Numerical experiments illustrate these points using several
classical algorithms. Comment: Final version, to appear in Information and Inference.
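A minimal sketch of the recovery side of this story: a random Gaussian observation matrix and ISTA (proximal gradient for the LASSO), one classical first-order method of the kind the restart schemes above accelerate. The dimensions, sparsity, and regularisation weight `lam` are illustrative choices, not values from the paper.

```python
import numpy as np

# Sparse recovery with ISTA on a noiseless compressed sensing instance.
rng = np.random.default_rng(1)
m, n, s = 60, 120, 5                   # measurements, ambient dim, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[:s] = rng.normal(size=s) + 3.0  # well-separated nonzero entries
b = A @ x_true

lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the smooth part
x = np.zeros(n)
for _ in range(5000):                   # ISTA: gradient step + soft threshold
    z = x - step * (A.T @ (A @ x - b))
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))  # small relative error
```

In the abstract's terms, how benign this instance is (true sparsity well below what the matrix can recover) shows up both in the recovery error and in how quickly the iteration converges, the single-parameter coupling the paper formalises.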
Statistical performance analysis with dynamic workload using S-NET
Volkmar Wieser, Philip K. F. Hölzenspies, Michael Roßbory, and Raimund Kirner, 'Statistical performance analysis with dynamic workload using S-NET'. Paper presented at the Workshop on Feedback-Directed Compiler Optimization for Multi-Core Architectures, Paris, France, 23-25 January 2012. In this paper the ADVANCE approach for engineering concurrent software systems with well-balanced hardware efficiency is addressed using the stream-processing language S-Net. To obtain cost information about the concurrent system, the metrics throughput, latency, and jitter are evaluated by analysing generated synthetic data, with an industry-related application to follow in future work. As a by-product, an Eclipse plugin for S-Net has been developed to provide support for syntax highlighting, content assistance, hover help, and more, for easier and faster development. The presented results of the current work are on the one hand an indicator of the status quo of the ADVANCE vision, and on the other hand are used to improve the statistical analysis techniques applied within ADVANCE. Like the ADVANCE project, this work is still under development, but further improvements and speedups are expected in the near future.
Statistical performance analysis of a fast super-resolution technique using noisy translations
It is well known that the registration process is a key step for
super-resolution reconstruction. In this work, we propose to use a
piezoelectric system that is easily adaptable on all microscopes and telescopes
for controlling accurately their motion (down to nanometers) and therefore
acquiring multiple images of the same scene at different controlled positions.
Then a fast super-resolution algorithm \cite{eh01} can be used for efficient
super-resolution reconstruction. In this case, acquiring only the optimal number of images
for a given resolution enhancement factor is generally not enough to obtain
satisfying results, due to the random inaccuracy of the positioning system. Thus
we propose to take several images around each reference position. We study the
error produced by the super-resolution algorithm due to spatial uncertainty as
a function of the number of images per position. We obtain a lower bound on the
number of images that is necessary to ensure a given error upper bound with
probability higher than some desired confidence level. Comment: 15 pages, submitted.
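The trade-off this abstract studies can be sketched in a one-dimensional toy model: the positioning system aims at a reference position but lands at position plus random jitter, and averaging several acquisitions per position shrinks the random part of the error roughly like 1/sqrt(k). The signal, jitter level, and image counts below are illustrative assumptions, not the paper's setting.

```python
import numpy as np

# Toy model: acquire k jittered samples of a scene at one reference
# position and average them; repeat many times to estimate the error.
rng = np.random.default_rng(2)
signal = lambda t: np.sin(2 * np.pi * t)      # 1-D scene
pos, sigma = 0.3, 0.02                        # reference position, jitter std

def mean_abs_error(k, trials=4000):
    """Mean absolute error of the k-image average at the reference position."""
    jitter = rng.normal(scale=sigma, size=(trials, k))
    est = signal(pos + jitter).mean(axis=1)
    return np.abs(est - signal(pos)).mean()

errs = [mean_abs_error(k) for k in (1, 4, 16, 64)]
print(errs)  # error shrinks as the number of images per position grows
```

Note the error does not vanish entirely as k grows: averaging removes the variance due to jitter but not the residual bias, which is consistent with the paper asking how many images per position suffice for a given error bound at a given confidence level.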
Statistical Performance Analysis of an Ant-Colony Optimisation Application in S-NET
Kenneth MacKenzie, Philip K. F. Hölzenspies, Kevin Hammond, Raimund Kirner, Vu Thien Nga Nguyen, Iraneus te Boekhorst, Clemens Grelck, Raphael Poss, Merijn Verstraaten, 'Statistical Performance Analysis of an Ant-Colony Optimisation Application in S-NET'. Paper presented at the 2nd Workshop on Feedback-Directed Compiler Optimization for Multi-Core Architectures, Berlin, Germany, 12 January 2013. We consider an ant-colony optimisation problem implemented on a multicore system as a collection of asynchronous stream-processing components under the control of the S-NET coordination language. Statistical analysis and visualisation techniques are used to study the behaviour of the application, and this enables us to discover and correct problems in both the application program and the run-time system underlying S-NET.
Sensitivity of Inequality Measures to Extreme Values
We examine the sensitivity of estimates of inequality indices to extreme values, in the sense both of their robustness properties and of their statistical performance. We establish that these measures are very sensitive to the properties of the income distribution. Estimation and inference can be dramatically affected, especially when the tail of the income distribution is heavy. Keywords: inequality measures, statistical performance, robustness.
Income distribution and inequality measurement: The problem of extreme values
We examine the statistical performance of inequality indices in the presence of extreme values in the data and show that these indices are very sensitive to the properties of the income distribution. Estimation and inference can be dramatically affected, especially when the tail of the income distribution is heavy, even when standard bootstrap methods are employed. However, use of appropriate semiparametric methods for modelling the upper tail can greatly improve the performance of even those inequality indices that are normally considered particularly sensitive to extreme values. Keywords: inequality measures, statistical performance, robustness.
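The sensitivity both of these abstracts describe is easy to demonstrate numerically: appending a single extreme income to a sample moves the Gini index substantially. The lognormal income distribution and magnitudes below are illustrative choices, not taken from either paper.

```python
import numpy as np

# How much does one extreme observation move the Gini index?
rng = np.random.default_rng(3)

def gini(x):
    """Gini index via the sorted-weights formulation:
    G = 2 * sum(i * x_(i)) / (n * sum(x)) - (n + 1) / n, i = 1..n."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    return 2.0 * np.sum(i * x) / (n * x.sum()) - (n + 1) / n

incomes = rng.lognormal(mean=10.0, sigma=0.5, size=1000)
g0 = gini(incomes)
g1 = gini(np.append(incomes, incomes.max() * 100))  # one extreme income added

print(g0, g1)  # the single outlier shifts the index noticeably
```

This is the kind of non-robustness that motivates the semiparametric tail-modelling approach of the second abstract: the index estimate is driven by a handful of upper-tail observations precisely when the tail is heavy.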