
    Learning detectors quickly using structured covariance matrices

    Computer vision is increasingly interested in the rapid estimation of object detectors. Canonical hard negative mining strategies are slow because they require multiple passes over a large negative training set. Recent work has demonstrated that if the distribution of negative examples is assumed to be stationary, then Linear Discriminant Analysis (LDA) can learn comparable detectors without ever revisiting the negative set. Even with this insight, however, learning a single object detector can still take tens of seconds on a modern desktop computer. This paper proposes to leverage the resulting structured covariance matrix to obtain detectors with identical performance in orders of magnitude less time and memory. We elucidate an important connection to the correlation filter literature, demonstrating that correlation filters can also be trained without ever revisiting the negative set.
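
    A minimal NumPy sketch of the LDA detector idea the abstract builds on, assuming the negative-class statistics (mean and covariance) have been precomputed once so the negative set is never revisited. It omits the structured-covariance exploitation that is the paper's actual contribution, and all feature arrays below are random placeholders.

```python
import numpy as np

def lda_detector(pos_feats, neg_mean, neg_cov, reg=1e-3):
    """Learn a linear detector w = S^{-1} (mu_pos - mu_neg) using only the
    precomputed negative mean and covariance (the 'stationary negatives' idea)."""
    d = neg_cov.shape[0]
    mu_pos = pos_feats.mean(axis=0)
    # Regularize the shared covariance so the solve is well posed.
    S = neg_cov + reg * np.eye(d)
    return np.linalg.solve(S, mu_pos - neg_mean)

# Toy usage with random placeholder features (not real image statistics).
rng = np.random.default_rng(0)
d = 64
neg_feats = rng.normal(size=(5000, d))          # stand-in for the negative pool
neg_mean, neg_cov = neg_feats.mean(0), np.cov(neg_feats, rowvar=False)
pos_feats = rng.normal(loc=0.5, size=(20, d))   # stand-in for positive windows
w = lda_detector(pos_feats, neg_mean, neg_cov)
scores = pos_feats @ w                          # higher score = more object-like
```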

    Preconditioners for ill-conditioned Toeplitz matrices

    This paper is concerned with the solution of systems of linear equations A_N x = b, where A_N is an ill-conditioned Toeplitz matrix.

    A fast, preconditioned conjugate gradient Toeplitz solver

    A simple factorization is given of an arbitrary Hermitian, positive definite matrix in which the factors are well conditioned, Hermitian, and positive definite. In fact, given knowledge of the extreme eigenvalues of the original matrix A, an optimal improvement can be achieved, making the condition number of each of the two factors equal to the square root of the condition number of A. This technique is then applied to the solution of Hermitian, positive definite Toeplitz systems. Large linear systems with Hermitian, positive definite Toeplitz matrices arise in some signal processing applications. A stable, fast algorithm is given for solving these systems, based on the preconditioned conjugate gradient method. The algorithm exploits the Toeplitz structure to reduce the cost of an iteration to O(n log n) by applying the fast Fourier transform to compute matrix-vector products. The matrix factorization is used as the preconditioner.
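
    The preconditioned CG recipe described above can be sketched in a few lines of NumPy/SciPy, assuming a symmetric positive definite Toeplitz matrix given by its first column: the matrix-vector product is carried out in O(n log n) by circulant embedding and the FFT, while a circulant (Strang-type) preconditioner stands in for the paper's factorization-based preconditioner, which is not reproduced here. The test matrix and all names below are illustrative only.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def toeplitz_matvec(col, x):
    """Compute T @ x in O(n log n): embed the symmetric Toeplitz matrix with
    first column `col` in a 2n-by-2n circulant and multiply with the FFT."""
    n = len(col)
    c = np.concatenate([col, [0.0], col[-1:0:-1]])   # first column of the embedding
    return np.fft.irfft(np.fft.rfft(c) * np.fft.rfft(x, 2 * n), 2 * n)[:n]

def strang_column(col):
    """First column of Strang's circulant preconditioner: keep the central
    diagonals of T and wrap the rest around."""
    n = len(col)
    c = col.astype(float).copy()
    c[n // 2 + 1:] = col[1:(n + 1) // 2][::-1]
    return c

def circulant_solve(c, r):
    """Solve C z = r for the circulant matrix with first column c; the FFT
    diagonalizes C, so the solve is a pointwise division in Fourier space."""
    return np.fft.irfft(np.fft.rfft(r) / np.fft.rfft(c), len(c))

# Example: a Kac-Murdock-Szego system T x = b with t_k = 0.9**k (SPD).
n = 4096
col = 0.9 ** np.arange(n)
b = np.ones(n)

T = LinearOperator((n, n), matvec=lambda v: toeplitz_matvec(col, v), dtype=float)
sc = strang_column(col)
M = LinearOperator((n, n), matvec=lambda r: circulant_solve(sc, r), dtype=float)

x, info = cg(T, b, M=M)   # preconditioned conjugate gradients
print(info, np.linalg.norm(toeplitz_matvec(col, x) - b))
```

    Each CG iteration costs two length-2n FFTs for the Toeplitz product plus one circulant solve, so the whole iteration stays at O(n log n), matching the complexity claimed in the abstract.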

    Preconditioned Lanczos Methods for the Minimum Eigenvalue of a Symmetric Positive Definite Toeplitz Matrix

    In this paper, we apply the preconditioned Lanczos (PL) method to compute the minimum eigenvalue of a symmetric positive definite Toeplitz matrix. The sine transform-based preconditioner is used to speed up the convergence rate of the PL method. The resulting method involves only Toeplitz and sine transform matrix-vector multiplications and hence can be computed efficiently by fast transform algorithms. We show that if the symmetric Toeplitz matrix is generated by a positive 2π-periodic even continuous function, then the PL method will converge sufficiently fast. Numerical results including Toeplitz and non-Toeplitz matrices are reported to illustrate the effectiveness of the method.
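
    As an illustration of the matvec-only setting the PL method operates in, the sketch below (assuming SciPy's Lanczos-based `eigsh`) computes the minimum eigenvalue of an SPD Toeplitz matrix using nothing but FFT-based Toeplitz matrix-vector products. The sine transform-based preconditioner itself is not reproduced, and the test matrix is a stand-in generated by a positive even symbol.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh

def toeplitz_matvec(col, x):
    """Symmetric Toeplitz times vector in O(n log n) via circulant embedding."""
    n = len(col)
    c = np.concatenate([col, [0.0], col[-1:0:-1]])
    return np.fft.irfft(np.fft.rfft(c) * np.fft.rfft(x, 2 * n), 2 * n)[:n]

# Stand-in SPD Toeplitz matrix generated by a positive 2*pi-periodic even
# symbol: the Kac-Murdock-Szego matrix with t_k = 0.5**k.
n = 2048
col = 0.5 ** np.arange(n)
T = LinearOperator((n, n), matvec=lambda v: toeplitz_matvec(col, v), dtype=float)

# Lanczos (ARPACK's eigsh) for the smallest eigenvalue, touching the matrix
# only through matrix-vector products.
lam_min = eigsh(T, k=1, which='SA', return_eigenvectors=False)[0]
print(lam_min)   # approaches (1 - 0.5) / (1 + 0.5) = 1/3 as n grows
```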