Innovation Rate Sampling of Pulse Streams with Application to Ultrasound Imaging
Signals comprised of a stream of short pulses appear in many applications
including bio-imaging and radar. The recent finite rate of innovation
framework has paved the way to low-rate sampling of such pulses by noticing
that only a small number of parameters per unit time are needed to fully
describe these signals. Unfortunately, for high rates of innovation, existing
sampling schemes are numerically unstable. In this paper we propose a general
sampling approach which leads to stable recovery even in the presence of many
pulses. We begin by deriving a condition on the sampling kernel which allows
perfect reconstruction of periodic streams from the minimal number of samples.
We then design a compactly supported class of filters, satisfying this
condition. The periodic solution is extended to finite and infinite streams,
and is shown to be numerically stable even for a large number of pulses. High
noise robustness is also demonstrated when the delays are sufficiently
separated. Finally, we process ultrasound imaging data using our techniques,
and show that substantial rate reduction with respect to traditional ultrasound
sampling schemes can be achieved.
Comment: 14 pages, 13 figures
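The low-rate recovery described above rests on the classical annihilating-filter (Prony) step: 2K+1 Fourier coefficients suffice to pin down K delays and amplitudes. Below is a minimal numerical sketch of that step for a noiseless periodic Dirac stream; the delays, amplitudes, and sample counts are synthetic assumptions, not the paper's kernels or data.

```python
import numpy as np

# Sketch: recover the delays of K Diracs per period from 2K+1 Fourier
# coefficients via the annihilating filter. Synthetic, noiseless setup.
rng = np.random.default_rng(0)
T = 1.0                                   # period of the pulse stream
K = 3                                     # number of pulses per period
t_true = np.sort(rng.uniform(0, T, K))    # unknown delays (assumed)
a_true = rng.uniform(1, 2, K)             # unknown amplitudes (assumed)

# Fourier coefficients X[m] = sum_k a_k exp(-j 2 pi m t_k / T), m = 0..2K
# (the minimal 2K+1 measurements for K pulses).
m = np.arange(2 * K + 1)
X = (a_true[None, :] * np.exp(-2j * np.pi * np.outer(m, t_true) / T)).sum(axis=1)

# The annihilating filter h (length K+1) satisfies
#   sum_i h[i] X[m - i] = 0  for m = K..2K,
# so h spans the null space of this K x (K+1) Toeplitz matrix.
A = np.array([[X[K + row - col] for col in range(K + 1)] for row in range(K)])
_, _, Vh = np.linalg.svd(A)
h = Vh[-1].conj()                         # null-space vector = filter taps

# The filter's roots are exp(-j 2 pi t_k / T); read delays off the phase.
roots = np.roots(h)
t_est = np.sort(np.mod(-np.angle(roots) * T / (2 * np.pi), T))
print(np.max(np.abs(t_est - t_true)))     # tiny (numerical precision)
```

The paper's contribution is a stable sampling kernel producing such coefficients; the recovery step itself is this standard spectral-estimation machinery.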
Zero-One Laws for Sliding Windows and Universal Sketches
Given a stream of data, a typical approach in streaming algorithms is to design a sophisticated small-memory algorithm that computes one specific statistic over the stream; once the stream is gone, computing a different statistic is usually impossible. In this paper, we consider the following fascinating possibility: can we collect some small amount of data during the stream that is "universal," i.e., collected without knowing which statistics we will later want to compute, other than the guarantee that each such statistic could have been computed with small memory had it been known ahead of time? We show that this is indeed possible, with matching upper and lower bounds: universal statistics of polylogarithmic size can be collected, and they allow us, after the fact, to compute any statistic that is computable with a similar amount of memory. This holds both for the standard unbounded streaming model and for the sliding window streaming model.
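The paper's universal sketch is more involved, but the underlying idea of a small one-pass summary that can answer queries after the stream is gone can be illustrated with a Count-Min sketch. This is a standard structure for post-hoc point-frequency queries, not the paper's construction; the width/depth parameters are illustrative assumptions.

```python
import hashlib

# A Count-Min sketch: sublinear memory, built in one pass, queried later.
class CountMin:
    def __init__(self, width=256, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _hash(self, item, row):
        # One independent-looking hash per row, derived from SHA-256.
        digest = hashlib.sha256(f"{row}:{item}".encode()).digest()
        return int.from_bytes(digest[:8], "big") % self.width

    def add(self, item, count=1):
        for row in range(self.depth):
            self.table[row][self._hash(item, row)] += count

    def estimate(self, item):
        # Collisions only inflate counters, so the minimum over rows
        # never underestimates the true frequency.
        return min(self.table[row][self._hash(item, row)]
                   for row in range(self.depth))

cm = CountMin()
for x in ["a"] * 50 + ["b"] * 20 + ["c"] * 5:
    cm.add(x)
est_a = cm.estimate("a")   # >= 50; equals 50 unless hashes collide
```

The contrast with the paper: a Count-Min sketch fixes the query class (point frequencies) in advance, whereas a universal sketch supports any statistic that is streamable with comparable memory.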
Hydrologic drought indices analysis by regionalization methods in southwest of Iran
Drought is a recurrent extreme climate event that poses a tremendous hazard to every sector of the natural environment and human life. Drought analysis usually involves characterizing drought severity, duration, and intensity. Long-term datasets of hydrometric and hydrochemical information are usually needed to begin an evaluation of the processes producing dominant low flows (used here as hydrologic drought indices); however, in many catchments, these data are not available. A major research challenge in ungauged basins is to quickly assess the dominant hydrological processes of watersheds. In this paper, to develop regional models, low-flow analysis has been performed with three regression methods (multivariate regression, the low-flow index method, and regionalization of frequency-formula parameters) and a hybrid low-flow model in the Karkheh basin (southwestern Iran). The estimated errors of the four methods show that, although the hybrid method can also be used for low-flow regionalization analysis, the multivariate regression and low-flow index methods are more suitable for this purpose.
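As a toy illustration of the multivariate-regression approach to low-flow regionalization, one can fit a log-linear model relating a low-flow index to catchment attributes at gauged sites and then apply it at ungauged ones. The attributes, sample sizes, and coefficients below are synthetic assumptions, not values from the Karkheh study.

```python
import numpy as np

# Sketch: regionalize a low-flow index (call it Q95) via a power-law
# regression on catchment attributes, fitted in log space. Synthetic data.
rng = np.random.default_rng(1)
n_gauged = 40
area = rng.uniform(50, 2000, n_gauged)    # km^2 (assumed attribute)
rain = rng.uniform(300, 900, n_gauged)    # mm/yr (assumed attribute)

# Assumed "true" relation Q95 = c * area^0.9 * rain^1.2, with noise.
q95 = 1e-4 * area**0.9 * rain**1.2 * rng.lognormal(0.0, 0.1, n_gauged)

# Taking logs turns the power law into ordinary least squares.
X = np.column_stack([np.ones(n_gauged), np.log(area), np.log(rain)])
coef, *_ = np.linalg.lstsq(X, np.log(q95), rcond=None)

def predict_q95(area_km2, rain_mm):
    """Apply the regional model at an ungauged catchment."""
    return np.exp(coef @ [1.0, np.log(area_km2), np.log(rain_mm)])

print(round(coef[1], 2), round(coef[2], 2))  # near the assumed 0.9 and 1.2
```

The paper compares this against the low-flow index method, regionalized frequency-formula parameters, and a hybrid model; only the regression idea is sketched here.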
Estimation of the Degree of Polarization for Hybrid/Compact and Linear Dual-Pol SAR Intensity Images: Principles and Applications
Analysis and comparison of linear and hybrid/compact dual-polarization (dual-pol) synthetic aperture radar (SAR) imagery have gained a wholly new importance in the last few years, in particular, with the advent of new spaceborne SARs such as the Japanese ALOS PALSAR, the Canadian RADARSAT-2, and the German TerraSAR-X. Compact polarimetry,
hybrid dual-pol, and quad-pol modes are newly promoted in the literature for future SAR missions. In this paper, we investigate and compare different hybrid/compact and linear dual-pol modes in terms of the estimation of the degree of polarization (DoP). The DoP has long been recognized as one of the most important parameters characterizing a partially polarized electromagnetic wave. It can be effectively used to characterize the information content of SAR data. We study and compare the information content of the intensity data provided by different hybrid/compact and linear dual-pol SAR modes. For this purpose, we derive the joint distribution of multilook SAR intensity images. We use this
distribution to derive the maximum likelihood and moment-based estimators of the DoP in hybrid/compact and linear dual-pol modes. We evaluate and compare the performance of these estimators for the different modes on both synthetic and real data, acquired by the RADARSAT-2 spaceborne and NASA/JPL airborne SAR systems over various terrain types such as urban, vegetation, and ocean.
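For a 2x2 coherency matrix C of a dual-pol acquisition, the DoP satisfies P = sqrt(1 - 4 det(C)/tr(C)^2), so a simple moment-based estimator plugs the multilook sample covariance into this formula. The sketch below uses synthetic correlated speckle and the full complex sample covariance; note the paper's estimators are derived from the intensity images alone, which is the harder setting.

```python
import numpy as np

# Sketch: moment-based DoP estimation from simulated dual-pol speckle.
# The channel covariance C_true is an illustrative assumption.
rng = np.random.default_rng(2)
n_looks = 10_000

C_true = np.array([[1.0, 0.5 + 0.2j],
                   [0.5 - 0.2j, 0.8]])      # Hermitian, positive definite
Lc = np.linalg.cholesky(C_true)

# Circular complex Gaussian samples with covariance C_true.
z = (rng.standard_normal((2, n_looks))
     + 1j * rng.standard_normal((2, n_looks))) / np.sqrt(2)
k = Lc @ z                                   # dual-pol scattering vectors

C_hat = (k @ k.conj().T) / n_looks           # sample (moment) estimate

def dop(C):
    """DoP of a 2x2 coherency matrix: sqrt(1 - 4 det C / (tr C)^2)."""
    tr = np.trace(C).real
    return np.sqrt(max(0.0, 1.0 - 4.0 * np.linalg.det(C).real / tr**2))

p_hat, p_true = dop(C_hat), dop(C_true)
```

With many looks the estimate concentrates around the true DoP (about 0.61 for this C_true); the paper quantifies such estimators' performance per acquisition mode.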
Efficient Triangle Counting in Large Graphs via Degree-based Vertex Partitioning
The number of triangles is a computationally expensive graph statistic which
is frequently used in complex network analysis (e.g., transitivity ratio), in
various random graph models (e.g., exponential random graph model) and in
important real world applications such as spam detection, uncovering of the
hidden thematic structure of the Web and link recommendation. Counting
triangles in graphs with millions and billions of edges requires algorithms
which run fast, use small amount of space, provide accurate estimates of the
number of triangles and preferably are parallelizable.
In this paper we present an efficient triangle counting algorithm which can
be adapted to the semistreaming model. The key idea of our algorithm is to
combine the sampling algorithm of Tsourakakis et al. with a partitioning of
the vertex set into a high-degree and a low-degree subset, as in the work of
Alon, Yuster and Zwick, treating each set appropriately. We obtain a running
time of O(m + m^{3/2} Δ log n / (t ε²)) and a (1 ± ε) approximation
(multiplicative error), where n is the number of vertices, m the number of
edges, t the number of triangles, and Δ the maximum number of triangles an
edge is contained in.
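A minimal exact variant of the degree-based idea orders vertices by degree so that each vertex enumerates pairs only among its higher-degree ("forward") neighbors, which gives the classic O(m^{3/2}) bound. The sketch below omits the paper's sampling and semistreaming machinery and is exact, not approximate.

```python
from itertools import combinations

# Sketch: exact triangle counting with a degree-based vertex ordering.
# Each triangle is counted exactly once, at its lowest-ranked vertex.
def count_triangles(edges):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # Rank vertices by (degree, id); "forward" neighbors have higher rank,
    # so every vertex has at most O(sqrt(m)) forward neighbors.
    rank = {v: i for i, (v, _) in
            enumerate(sorted(adj.items(), key=lambda kv: (len(kv[1]), kv[0])))}
    count = 0
    for v in adj:
        fwd = [u for u in adj[v] if rank[u] > rank[v]]
        for u, w in combinations(fwd, 2):
            if w in adj[u]:        # closes the triangle (v, u, w)
                count += 1
    return count

# The complete graph K4 contains exactly 4 triangles.
k4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(count_triangles(k4))  # 4
```

The paper speeds this scheme up by sampling edges and handling the high-degree and low-degree subsets with different strategies.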
Furthermore, we show how this algorithm can be adapted to the semistreaming
model with space usage O(m^{1/2} log n + m^{3/2} Δ log n / (t ε²)) and a
constant number of passes (three) over the graph
stream. We apply our methods in various networks with several millions of edges
and we obtain excellent results. Finally, we propose a random projection based
method for triangle counting and provide a sufficient condition to obtain an
estimate with low variance.
Comment: 12 pages; to appear in the 7th Workshop on Algorithms and Models
for the Web Graph (WAW 2010).
High Probability Frequency Moment Sketches
We consider the problem of sketching the p-th frequency moment of a vector, p>2, with multiplicative error at most 1 +/- epsilon and with high confidence 1-delta. Despite the long sequence of work on this problem, tight bounds on this quantity are only known for constant delta. While one can obtain an upper bound with error probability delta by repeating a sketching algorithm with constant error probability O(log(1/delta)) times in parallel and taking the median of the outputs, we show this is a suboptimal algorithm! Namely, we show optimal upper and lower bounds of Theta(n^{1-2/p} log(1/delta) + n^{1-2/p} log^{2/p}(1/delta) log n) on the sketching dimension, for any constant approximation. Our result should be contrasted with results for estimating frequency moments for 1 <= p <= 2, for which we show the optimal algorithm for general delta is obtained by repeating the optimal algorithm for constant error probability O(log(1/delta)) times and taking the median output. We also obtain a matching lower bound for this problem, up to constant factors.
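The median-of-repetitions baseline that the abstract argues is suboptimal for p > 2 can be illustrated for p = 2 with an AMS-style sign sketch: each copy estimates F_2 with constant error probability, and the median of O(log(1/delta)) independent copies boosts the confidence to 1 - delta. The group sizes below are illustrative assumptions.

```python
import random
import statistics

# Sketch: estimate F_2 = sum_i f_i^2 with a sign sketch, then apply the
# median-of-means trick the abstract refers to. Each counter maintains
# Z = sum_i s(i) * f_i for random signs s(i); E[Z^2] = F_2.
def ams_f2(stream, n_means=64, n_medians=9, seed=0):
    rng = random.Random(seed)
    signs = [[{} for _ in range(n_means)] for _ in range(n_medians)]
    z = [[0.0] * n_means for _ in range(n_medians)]
    for item in stream:
        for j in range(n_medians):
            for k in range(n_means):
                # Fix a +/-1 sign per item the first time it is seen.
                s = signs[j][k].setdefault(item, rng.choice((-1, 1)))
                z[j][k] += s
    # Averaging within a group shrinks the variance; the median across
    # groups turns constant failure probability into high confidence.
    return statistics.median(
        sum(x * x for x in z[j]) / n_means for j in range(n_medians))

stream = ["a"] * 30 + ["b"] * 40 + ["c"] * 50
true_f2 = 30**2 + 40**2 + 50**2          # = 5000
est = ams_f2(stream)
```

The paper's point is that for p > 2 this repeat-and-take-the-median strategy costs more sketching dimension than necessary; the construction above only illustrates the baseline it improves on.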