
    A weakly stable algorithm for general Toeplitz systems

    We show that a fast algorithm for the QR factorization of a Toeplitz or Hankel matrix A is weakly stable in the sense that R^T R is close to A^T A. Thus, when the algorithm is used to solve the semi-normal equations R^T R x = A^T b, we obtain a weakly stable method for the solution of a nonsingular Toeplitz or Hankel linear system Ax = b. The algorithm also applies to the solution of the full-rank Toeplitz or Hankel least squares problem. Comment: 17 pages; an old technical report with postscript added. For further details, see http://wwwmaths.anu.edu.au/~brent/pub/pub143.htm
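
    As a rough illustration of the semi-normal-equations idea only (not the paper's fast Toeplitz/Hankel factorization), the sketch below solves a nonsingular system Ax = b through R^T R x = A^T b, using an ordinary dense QR factorization in place of the fast algorithm; the function name solve_seminormal is our own.

        import numpy as np
        from scipy.linalg import toeplitz, solve_triangular

        def solve_seminormal(A, b):
            # Solve Ax = b via the semi-normal equations R^T R x = A^T b,
            # where R is the triangular factor of A = QR (Q is never formed).
            R = np.linalg.qr(A, mode='r')
            y = solve_triangular(R.T, A.T @ b, lower=True)   # forward substitution
            return solve_triangular(R, y)                    # back substitution

        rng = np.random.default_rng(0)
        n = 6
        A = toeplitz(rng.standard_normal(n), rng.standard_normal(n))  # random Toeplitz test matrix
        b = rng.standard_normal(n)
        print(np.allclose(A @ solve_seminormal(A, b), b))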

    On recursive least-squares filtering algorithms and implementations

    In many real-time signal processing applications, fast and numerically stable algorithms for solving least-squares problems are necessary and important. In particular, under non-stationary conditions, these algorithms must be able to adapt themselves to reflect changes in the system and make appropriate adjustments to achieve optimum performance. Among existing algorithms, the QR-decomposition (QRD)-based recursive least-squares (RLS) methods have been shown to be useful and effective for adaptive signal processing. In order to increase the speed of processing and achieve a high throughput rate, many algorithms are vectorized and/or pipelined to facilitate high degrees of parallelism. A time-recursive formulation of RLS filtering employing block QRD is considered first. Several methods, including a new non-continuous windowing scheme based on selectively rejecting contaminated data, are investigated for adaptive processing. Based on systolic triarrays, many other forms of systolic arrays are shown to be capable of implementing different algorithms. Various updating and downdating systolic algorithms and architectures for RLS filtering are examined and compared in detail, including the Householder reflector, the Gram-Schmidt procedure, and the Givens rotation. A unified approach encompassing existing square-root-free algorithms is also proposed. For the sinusoidal spectrum estimation problem, a judicious method of separating the noise from the signal is of great interest. Various truncated QR methods are proposed for this purpose and compared to the truncated SVD method. Computer simulations provided for detailed comparisons show the effectiveness of these methods. This thesis deals with fundamental issues of numerical stability, computational efficiency, adaptivity, and VLSI implementation for RLS filtering problems. In all, various new and modified algorithms and architectures are proposed and analyzed; the significance of each new method depends crucially on the specific application.
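
    As a minimal sketch of the QRD-RLS idea underlying this work (leaving aside the systolic architectures, windowing schemes and square-root-free variants it actually studies), the following update folds one new data pair into the triangular factor with Givens rotations under an exponential forgetting factor; the names givens and rls_update are ours, and the plain Python loop stands in for a pipelined implementation.

        import numpy as np

        def givens(a, b):
            # rotation (c, s) such that [[c, s], [-s, c]] @ [a, b] = [r, 0]
            r = np.hypot(a, b)
            return (1.0, 0.0) if r == 0 else (a / r, b / r)

        def rls_update(R, z, x, d, lam=0.99):
            # One exponentially weighted RLS step on the upper-triangular factor R
            # and the rotated right-hand side z; x is the new input vector, d the
            # new desired response, lam the forgetting factor.
            n = R.shape[0]
            R, z = np.sqrt(lam) * R, np.sqrt(lam) * z
            x, d = x.astype(float), float(d)
            for k in range(n):                      # annihilate x entry by entry
                c, s = givens(R[k, k], x[k])
                Rk, xk = R[k, k:].copy(), x[k:].copy()
                R[k, k:] = c * Rk + s * xk
                x[k:] = -s * Rk + c * xk
                z[k], d = c * z[k] + s * d, -s * z[k] + c * d
            return R, z                             # filter weights follow from solving R w = z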

    On the Stability of Sequential Updates and Downdates

    The updating and downdating of QR decompositions has important applications in a number of areas. There is essentially one standard updating algorithm, based on plane rotations, which is backwards stable. Three downdating algorithms have been treated in the literature: the LINPACK algorithm, the method of hyperbolic transformations, and Chambers' algorithm. Although none of these algorithms is backwards stable, the first and third satisfy a relational stability condition. In this paper, it is shown that relational stability extends to a sequence of updates and downdates. In consequence, other things being equal, if the final decomposition in the sequence is well conditioned, it will be accurately computed, even though intermediate decompositions may be almost completely inaccurate. These results are also applied to two-sided orthogonal decompositions, such as the URV decomposition. (Also cross-referenced as UMIACS-TR-94-30.)
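
    The setting analysed here (a sequence of row updates and downdates applied to one QR decomposition) can be reproduced with SciPy's qr_insert and qr_delete helpers, as in the sanity-check sketch below; this only illustrates the setting, not the paper's relational stability argument, and it assumes those SciPy routines behave as documented.

        import numpy as np
        from scipy.linalg import qr, qr_insert, qr_delete

        rng = np.random.default_rng(1)
        A = rng.standard_normal((8, 4))
        Q, R = qr(A)

        new_row = rng.standard_normal(4)
        Q, R = qr_insert(Q, R, new_row, 8, which='row')   # update: append an observation
        Q, R = qr_delete(Q, R, 0, which='row')            # downdate: drop the oldest one

        A_seq = np.vstack([A[1:], new_row])               # the data the factors now represent
        print(np.allclose(Q @ R, A_seq))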

    Estimating large-scale general linear and seemingly unrelated regressions models after deleting observations

    A new numerical method to solve the downdating problem (and variants thereof), namely removing the effect of some observations from the generalized least squares (GLS) estimator of the general linear model (GLM) after it has been estimated, is extensively investigated. It is verified that the solution of the downdated least squares problem can be obtained from the estimation of an equivalent GLM, where the original model is updated with the imaginary deleted observations. This updated GLM has a non-positive-definite dispersion matrix which comprises complex covariance values, and it is proved herein to yield the same normal equations as the downdated model. Additionally, the problem of deleting observations from the seemingly unrelated regressions model is addressed, demonstrating the direct applicability of this method to other multivariate linear models. The algorithms which implement the novel downdating method efficiently utilize the previous computations from the estimation of the original model. As a result, the computational cost is significantly reduced. This shows the great potential of the downdating method in computationally intensive problems. The downdating algorithms have been applied to real and synthetic data to illustrate their efficiency.
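
    The normal-equations equivalence behind the downdating idea can be checked in the ordinary least squares special case (identity dispersion matrix): deleting observations is the same as subtracting the deleted rows' contribution from the original normal equations, which is what re-adding them as imaginary observations amounts to. The sketch below covers only this special case, not the paper's GLM/SUR algorithms, and the data are synthetic.

        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.standard_normal((50, 3))
        y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(50)

        drop = [0, 7, 23]                                  # observations to delete
        keep = np.setdiff1d(np.arange(50), drop)

        beta_direct = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]   # re-estimate from scratch

        # downdated normal equations: subtract the deleted rows' contribution
        Xd, yd = X[drop], y[drop]
        beta_down = np.linalg.solve(X.T @ X - Xd.T @ Xd, X.T @ y - Xd.T @ yd)

        print(np.allclose(beta_direct, beta_down))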

    Subspace-Based Noise Reduction for Speech Signals via Diagonal and Triangular Matrix Decompositions


    A recursive three-stage least squares method for large-scale systems of simultaneous equations

    A new numerical method is proposed that uses the QR decomposition (and its variants) to derive recursively the three-stage least squares (3SLS) estimator of large-scale simultaneous equations models (SEMs). The 3SLS estimator is obtained sequentially, once the underlying model is modified by adding or deleting rows of data. A new theoretical pseudo SEM is developed which has a non-positive-definite dispersion matrix and is proved to yield the 3SLS estimator that would be derived if the modified SEM were estimated afresh. In addition, the computation of the iterative 3SLS estimator of the updated-observations SEM is considered. The new recursive method efficiently utilizes previous computations, exploits sparsity in the pseudo SEM, and uses orthogonal and hyperbolic matrix factorizations as its main computational tools. This allows the estimation of large-scale SEMs which were previously considered computationally infeasible to tackle. Numerical trials have confirmed the effectiveness of the new estimation procedures. The new method is illustrated through a macroeconomic application.
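
    For orientation, the textbook (non-recursive) 3SLS estimator that the recursive method targets can be computed directly from the stacked normal equations, as in the dense brute-force sketch below; it makes no claim to the paper's updating scheme, sparsity exploitation or hyperbolic factorizations, and the function name threestage_ls and the single shared instrument matrix X are our simplifying assumptions.

        import numpy as np
        from scipy.linalg import block_diag

        def threestage_ls(ys, Zs, X):
            # ys: list of (T,) dependent vectors; Zs: list of (T, k_i) regressor
            # matrices; X: (T, m) matrix of all exogenous variables (instruments).
            P = X @ np.linalg.solve(X.T @ X, X.T)            # projection onto span(X)

            # stages 1-2: 2SLS equation by equation; the residuals give Sigma
            resid = []
            for y, Z in zip(ys, Zs):
                d = np.linalg.solve(Z.T @ P @ Z, Z.T @ P @ y)
                resid.append(y - Z @ d)
            Sigma = np.cov(np.vstack(resid), bias=True)

            # stage 3: GLS on the stacked system with weight inv(Sigma) kron P
            Zbar, ybar = block_diag(*Zs), np.concatenate(ys)
            W = np.kron(np.linalg.inv(Sigma), P)
            return np.linalg.solve(Zbar.T @ W @ Zbar, Zbar.T @ W @ ybar)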

    Computing 3SLS Solutions of Simultaneous Equation Models with a Possible Singular Variance-Covariance Matrix

    Algorithms for computing the three-stage least squares (3SLS) estimator usually require the disturbance covariance matrix to be non-singular. However, the solution of a reformulated simultaneous equation model (SEM) renders this condition redundant. Using the QR decomposition as the basic tool, the 3SLS estimator, its dispersion matrix, and methods for estimating the singular disturbance covariance matrix are derived. Expressions revealing the linear combinations of observations which become redundant are also presented. Algorithms for computing the 3SLS estimator after the SEM has been modified by deleting or adding observations or variables are found not to be very efficient, owing to the need to remove the endogeneity of the new data or to re-estimate the disturbance covariance matrix. Three methods are described for solving SEMs subject to separable linear equality constraints. The first method considers the constraints as additional precise observations, while the other two methods reparameterize the constraints to solve reduced unconstrained SEMs. Methods for computing the main matrix factorizations illustrate the basic principles to be adopted for solving SEMs on serial or parallel computers.
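
    One naive way to keep the stacked GLS weight defined when the estimated disturbance covariance Sigma is singular is to replace its inverse with the Moore-Penrose pseudo-inverse, as sketched below; this is only a crude stand-in for comparison, not the paper's QR-based reformulation, which removes the non-singularity requirement altogether.

        import numpy as np

        def gls_weight(Sigma, P):
            # Weight matrix pinv(Sigma) kron P for the stacked 3SLS step; pinv
            # reduces to the ordinary inverse whenever Sigma is non-singular.
            return np.kron(np.linalg.pinv(Sigma), P)

        Sigma = np.array([[2.0, 1.0], [1.0, 0.5]])   # rank-deficient covariance
        P = np.eye(3)
        print(np.linalg.matrix_rank(Sigma), gls_weight(Sigma, P).shape)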

    Downdating a Rank-Revealing URV Decomposition

    The rank-revealing URV decomposition is a useful tool for the subspace tracking problem in digital signal processing. Updating the decomposition is a stable process. However, downdating a rank-revealing URV decomposition can be unstable because the R factor is ill-conditioned. In this paper, we review some existing downdating algorithms for the full-rank URV decomposition in the absence of U and develop a new combined algorithm. We also show that the combined algorithm has relational stability. For the rank-revealing URV decomposition, we review a two-step method that applies full-rank downdating algorithms to the signal and noise parts separately. We compare several combinations of the full-rank algorithms and demonstrate the good performance of our combined algorithm.
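
    As a concrete reference point for full-rank downdating "in the absence of U", here is a classical LINPACK-style downdate of a triangular factor written in NumPy/SciPy: it removes one row's contribution from R^T R using Givens rotations. It is offered only as a baseline sketch, not as the combined algorithm proposed in the paper, and the function name linpack_downdate is ours.

        import numpy as np
        from scipy.linalg import solve_triangular

        def linpack_downdate(R, x):
            # Given upper-triangular R with R^T R = A^T A, return R1 with
            # R1^T R1 = R^T R - x x^T (i.e. row x deleted from A), without Q.
            n = R.shape[0]
            a = solve_triangular(R, x, trans='T')          # solve R^T a = x
            alpha2 = 1.0 - a @ a
            if alpha2 <= 0.0:
                raise ValueError("downdate would make the factor indefinite")
            alpha = np.sqrt(alpha2)
            R1, bottom = R.astype(float).copy(), np.zeros(n)
            for k in range(n - 1, -1, -1):                 # zero a from the last entry up
                r = np.hypot(a[k], alpha)
                c, s = alpha / r, a[k] / r
                alpha = r
                R1[k, :], bottom = c * R1[k, :] - s * bottom, s * R1[k, :] + c * bottom
                # 'bottom' accumulates the deleted row x as a by-product
            return R1

        rng = np.random.default_rng(3)
        A = rng.standard_normal((10, 4))
        R = np.linalg.qr(A, mode='r')
        R1 = linpack_downdate(R, A[0])
        print(np.allclose(R1.T @ R1, A[1:].T @ A[1:]))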

    Efficient strategies for deriving the subset VAR models

    Algorithms for computing subset Vector Autoregressive (VAR) models are proposed. These algorithms can be used to choose a subset of the most statistically significant variables of a VAR model. In such cases, the selection criteria are based on the residual sum of squares or the estimated residual covariance matrix. The VAR model with zero coefficient restrictions is formulated as a Seemingly Unrelated Regressions (SUR) model. Furthermore, the SUR model is transformed into one of smaller size, where the exogenous matrices comprise columns of a triangular matrix. Efficient algorithms which exploit the common columns of the exogenous matrices, the sparse structure of the variance-covariance matrix of the disturbances, and special properties of the SUR models are investigated. The main computational tools of the selection strategies are the generalized QR decomposition and its modifications.
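
    To make the residual-sum-of-squares criterion concrete, the brute-force sketch below scores every column subset of one equation's regressor matrix; the point of the paper's strategies is precisely to avoid this enumeration cost by reusing the generalized QR decomposition, so the helper best_subset_rss and the random stand-in data are our own illustrative assumptions.

        import numpy as np
        from itertools import combinations

        def best_subset_rss(Z, y, size):
            # Exhaustively pick the `size` columns of Z minimising the residual
            # sum of squares for a single equation (illustration only).
            best_rss, best_cols = np.inf, None
            for cols in combinations(range(Z.shape[1]), size):
                Zs = Z[:, cols]
                beta, rss, rank, _ = np.linalg.lstsq(Zs, y, rcond=None)
                if rss.size == 0:                       # rank-deficient case
                    rss = np.array([np.sum((y - Zs @ beta) ** 2)])
                if rss[0] < best_rss:
                    best_rss, best_cols = rss[0], cols
            return best_cols, best_rss

        rng = np.random.default_rng(4)
        Z = rng.standard_normal((200, 6))               # stand-in for lagged VAR regressors
        y = Z[:, [0, 2, 5]] @ np.array([0.8, -0.5, 0.3]) + 0.1 * rng.standard_normal(200)
        print(best_subset_rss(Z, y, 3))                 # should recover columns (0, 2, 5)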