
    High Dimensional Low Rank plus Sparse Matrix Decomposition

    This paper is concerned with the problem of low-rank plus sparse matrix decomposition for big data. Conventional algorithms for matrix decomposition use the entire data to extract the low-rank and sparse components and are based on optimization problems whose complexity scales with the dimension of the data, which limits their scalability. Furthermore, existing randomized approaches mostly rely on uniform random sampling, which is quite inefficient for many real-world data matrices that exhibit additional structure (e.g. clustering). In this paper, a scalable subspace-pursuit approach that transforms the decomposition problem into a subspace learning problem is proposed. The decomposition is carried out using a small data sketch formed from sampled columns/rows. Even when the data is sampled uniformly at random, it is shown that the sufficient number of sampled columns/rows is roughly O(rμ), where μ is the coherency parameter and r is the rank of the low-rank component. In addition, adaptive sampling algorithms are proposed to address the problem of column/row sampling from structured data. We provide an analysis of the proposed method with adaptive sampling and show that adaptive sampling makes the required number of sampled columns/rows invariant to the distribution of the data. The proposed approach is amenable to online implementation, and an online scheme is presented. (Comment: IEEE Transactions on Signal Processing.)
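    The core idea lends itself to a compact illustration. The Python sketch below is only a rough stand-in for the proposed algorithm: it uses a plain SVD of the column sketch rather than a robust subspace-learning step and an ad-hoc thresholding rule for the sparse part, and all function names, parameters, and the threshold are assumptions. It shows how a low-rank subspace estimated from a small set of sampled columns can be used to split the full matrix.

```python
import numpy as np

def sketch_decompose(D, n_cols, rank, thresh=3.0):
    """Illustrative low-rank + sparse split via uniform column sampling.

    D      : (m, n) data matrix, assumed D = L + S
    n_cols : number of columns sampled for the small sketch
    rank   : target rank r of the low-rank component
    thresh : ad-hoc threshold for declaring residual entries sparse (assumption)
    """
    m, n = D.shape
    cols = np.random.choice(n, size=n_cols, replace=False)
    sketch = D[:, cols]                      # small data sketch

    # Estimate the column subspace of L from the sketch. A plain SVD is used
    # here; the paper replaces this with a robust subspace-learning step.
    U, _, _ = np.linalg.svd(sketch, full_matrices=False)
    U = U[:, :rank]                          # (m, r) orthonormal basis

    # Project the full data onto the learned subspace to form L, and treat
    # unusually large residual entries as the sparse component S.
    L = U @ (U.T @ D)
    R = D - L
    S = np.where(np.abs(R) > thresh * R.std(), R, 0.0)
    return L, S

# Toy usage: rank-5 matrix plus a few large outliers.
rng = np.random.default_rng(0)
L0 = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 1000))
S0 = np.where(rng.random(L0.shape) < 0.01, 10.0, 0.0)
L_hat, S_hat = sketch_decompose(L0 + S0, n_cols=50, rank=5)
```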

    Adaptive processing with signal contaminated training samples

    We consider the adaptive beamforming or adaptive detection problem in the case of signal-contaminated training samples, i.e., when the latter may contain a signal-like component. Since this results in a significant degradation of the signal-to-interference-plus-noise ratio at the output of the adaptive filter, we investigate a scheme that jointly detects the contaminated samples and subsequently takes this information into account when estimating the disturbance covariance matrix. Towards this end, a Bayesian model is proposed, parameterized by binary variables indicating the presence/absence of signal-like components in the training samples. These variables, together with the signal amplitudes and the disturbance covariance matrix, are jointly estimated using a minimum mean-square error (MMSE) approach. Two strategies are proposed to implement the MMSE estimator. First, a stochastic Markov chain Monte Carlo method based on Gibbs sampling is presented. Then, a computationally more efficient scheme based on variational Bayesian analysis is proposed. Numerical simulations attest to the improvement achieved by this method compared to conventional methods such as diagonal loading. A successful application to real radar data is also presented.
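    For orientation, here is a minimal Python sketch of the conventional baseline mentioned above, diagonal loading, next to the simpler idea of screening out suspicious snapshots before averaging. The screening rule below is a crude energy test on the signal steering vector, not the paper's joint Bayesian MMSE scheme, and the loading level, thresholds, and names are assumptions.

```python
import numpy as np

def mvdr_weights(R, a):
    """MVDR (Capon) beamformer weights for covariance R and steering vector a."""
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj() @ Ri_a)

def diagonally_loaded_cov(samples, loading=None):
    """Sample covariance with diagonal loading (the conventional baseline).

    samples : (N, K) array of K training snapshots of dimension N.
    """
    N, K = samples.shape
    scm = samples @ samples.conj().T / K
    if loading is None:
        loading = 10.0 * np.trace(scm).real / N   # ad-hoc loading level (assumption)
    return scm + loading * np.eye(N)

def screened_cov(samples, a, z_thresh=2.5):
    """Crude alternative: drop snapshots whose projection onto the signal
    steering vector a is abnormally large, then average the remaining ones.
    (A stand-in for the paper's joint Bayesian detection/estimation.)"""
    proj = np.abs(a.conj() @ samples)             # (K,) projected snapshot energies
    keep = proj < proj.mean() + z_thresh * proj.std()
    kept = samples[:, keep]
    return kept @ kept.conj().T / kept.shape[1]
```

    Either covariance estimate would then be passed to mvdr_weights together with the steering vector to obtain the adaptive filter.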

    Implementation of elastic prestack reverse-time migration using an efficient finite-difference scheme

    Elastic reverse-time migration (RTM) can capture subsurface elastic information more comprehensively than single-component P-wave migration. One of the most important requirements of elastic RTM is solving wave equations, and the imaging accuracy and efficiency of RTM depend heavily on the algorithms used to solve them. In this paper, we propose an efficient staggered-grid finite-difference (SFD) scheme based on a sampling approximation method with adaptive variable difference-operator lengths to implement elastic prestack RTM. Numerical dispersion analysis and wavefield extrapolation results show that the sampling-approximation SFD scheme has greater accuracy than the conventional Taylor-series expansion SFD scheme. We also test the elastic RTM algorithm on theoretical models and a field data set. The experiments demonstrate that elastic RTM using the proposed SFD scheme generates better images than RTM using the Taylor-series expansion SFD scheme, particularly for PS images. Furthermore, the use of adaptive variable difference-operator lengths effectively improves the computational efficiency of elastic RTM.
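    To make the staggered-grid finite-difference ingredient concrete, the Python sketch below builds a 1-D staggered-grid first-derivative operator whose half-length M can vary, which is the quantity an adaptive-length scheme tunes. It uses the conventional Taylor-series coefficients, not the paper's sampling-approximation coefficients, and the function names are illustrative.

```python
import numpy as np

def staggered_coeffs(M):
    """Taylor-series coefficients of a staggered-grid first-derivative
    operator of half-length M (M = 1 gives 1.0; M = 2 gives 9/8, -1/24)."""
    A = np.array([[(m - 0.5) ** (2 * k - 1) for m in range(1, M + 1)]
                  for k in range(1, M + 1)])
    b = np.zeros(M)
    b[0] = 0.5
    return np.linalg.solve(A, b)

def d_dx_staggered(u, dx, M=4):
    """First derivative of u evaluated on the half-shifted (staggered) grid.

    The operator half-length M is what an adaptive scheme would vary from
    region to region; grid edges are simply trimmed here for brevity.
    """
    c = staggered_coeffs(M)
    n = u.size
    out = np.zeros(n - 2 * M + 1)
    for m, cm in enumerate(c, start=1):
        out += cm * (u[M + m - 1 : n - M + m] - u[M - m : n - M - m + 1])
    return out / dx

# Quick check: the derivative of sin(x) should approximate cos at the half points.
x = np.linspace(0.0, 2.0 * np.pi, 401)
du = d_dx_staggered(np.sin(x), dx=x[1] - x[0], M=4)
```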

    Not a COINcidence: Sub-Quadratic Asynchronous Byzantine Agreement WHP

    King and Saia were the first to break the quadratic word-complexity bound for Byzantine Agreement in synchronous systems against an adaptive adversary, and Algorand broke this bound with near-optimal resilience (first in the synchronous model and then with eventual synchrony). Yet the question of asynchronous sub-quadratic Byzantine Agreement remained open. To the best of our knowledge, we are the first to answer this question in the affirmative. A key component of our solution is a shared-coin algorithm based on a VRF. A second essential ingredient is VRF-based committee sampling, which we formalize and utilize in the asynchronous model for the first time. Our algorithms work against a delayed-adaptive adversary, which cannot perform after-the-fact removals but has full control of Byzantine processes and full information about communication in earlier rounds. Using committee sampling and our shared coin, we solve Byzantine Agreement with high probability, with a word complexity of Õ(n) and O(1) expected time, breaking the O(n^2) bit barrier for asynchronous Byzantine Agreement.
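    The committee-sampling idea can be sketched in a few lines: each process locally evaluates a pseudorandom function on the round identifier and joins the committee when the output, read as a number in [0, 1), falls below expected_size/n. The Python sketch below uses an HMAC as a stand-in for a VRF, so unlike the construction described above the membership claim is not publicly verifiable; the names and parameters are assumptions.

```python
import hashlib
import hmac

def on_committee(secret_key: bytes, round_tag: bytes, n: int, expected_size: int) -> bool:
    """Local self-selection into a committee for a given round.

    A real construction uses a verifiable random function (VRF), so other
    processes can check membership from an accompanying proof; here an HMAC
    merely stands in for the VRF output, so the result is not verifiable.
    """
    digest = hmac.new(secret_key, round_tag, hashlib.sha256).digest()
    # Interpret the 256-bit digest as a uniform number in [0, 1).
    x = int.from_bytes(digest, "big") / float(1 << (8 * len(digest)))
    # Each of the n processes joins independently with probability
    # expected_size / n, so the committee has expected_size members on average.
    return x < expected_size / n

# Example: does process 42 sit on the coin committee of round 7?
member = on_committee(b"process-42-secret", b"round-7|coin", n=1000, expected_size=32)
```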

    Modeling and Analyzing Adaptive User-Centric Systems in Real-Time Maude

    Pervasive user-centric applications are systems that are meant to sense the presence, mood, and intentions of users in order to optimize user comfort and performance. Building such applications requires not only state-of-the-art techniques from artificial intelligence but also sound software-engineering methods for facilitating modular design, runtime adaptation, and verification of critical system requirements. In this paper we focus on high-level design and analysis, and use the algebraic rewriting language Real-Time Maude for specifying applications in a real-time setting. We propose a generic component-based approach for modeling pervasive user-centric systems and show how to analyze and prove crucial properties of the system architecture through model checking and simulation. For proving time-dependent properties we use Metric Temporal Logic (MTL) and present analysis algorithms for model checking two subclasses of MTL formulas: time-bounded response and time-bounded safety MTL formulas. The underlying idea is to extend the Real-Time Maude model with suitable clocks, to transform the MTL formulas into LTL formulas over the extended specification, and then to use the LTL model checker of Maude. It is shown that these analyses are sound and complete for maximal time sampling. The approach is illustrated by a simple adaptive advertising scenario in which an adaptive advertisement display reacts to the actions of users in front of it. (Comment: In Proceedings RTRTS 2010, arXiv:1009.398)
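    The flavor of the time-bounded response class can be illustrated outside Maude with a simple check over a finite timed trace: the MTL formula [](p -> <>_{<=b} q) holds if every observation of p is followed by an observation of q within b time units. The Python sketch below is only a trace-level illustration of that property class, not the clock-based MTL-to-LTL transformation or Maude's model checker, and the trace encoding is an assumption.

```python
def bounded_response(trace, p, q, bound):
    """Check the time-bounded response formula  [] (p -> <>_{<= bound} q)
    over a finite timed trace.

    trace : time-ordered list of (timestamp, set_of_atomic_propositions)
    p, q  : atomic proposition labels
    bound : maximal allowed delay between a p-observation and a matching q
    """
    for i, (t_p, props) in enumerate(trace):
        if p in props:
            # Require q somewhere within `bound` time units after t_p.
            if not any(q in later and t_q - t_p <= bound
                       for t_q, later in trace[i:]):
                return False
    return True

# Example: a user is detected (p) and the advertisement must adapt (q)
# within 2 time units.
trace = [(0, {"user_detected"}), (1, set()), (2, {"ad_adapted"}), (5, set())]
print(bounded_response(trace, "user_detected", "ad_adapted", bound=2))  # True
```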