
    Randomized Dimension Reduction on Massive Data

    Scalability of statistical estimators is of increasing importance in modern applications, and dimension reduction is often used to extract relevant information from data. A variety of popular dimension reduction approaches can be framed as symmetric generalized eigendecomposition problems. In this paper we outline how taking into account the low-rank structure assumption implicit in these dimension reduction approaches provides both computational and statistical advantages. We adapt recent randomized low-rank approximation algorithms to provide efficient solutions to three dimension reduction methods: Principal Component Analysis (PCA), Sliced Inverse Regression (SIR), and Localized Sliced Inverse Regression (LSIR). A key observation in this paper is that randomization serves a dual role, improving both computational and statistical performance. This point is highlighted in our experiments on real and simulated data.
    Comment: 31 pages, 6 figures. Key words: dimension reduction, generalized eigendecomposition, low-rank, supervised, inverse regression, random projections, randomized algorithms, Krylov subspace method
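    The abstract does not include code, but a minimal sketch can illustrate the randomized low-rank machinery the paper builds on. The function below follows the standard randomized range-finder recipe (in the style of Halko, Martinsson, and Tropp) for PCA only; the oversampling and power-iteration parameters are illustrative assumptions, and the paper's SIR/LSIR generalized-eigenproblem solvers are not reproduced.

    import numpy as np

    def randomized_pca(X, k, n_oversample=10, n_power_iter=2, seed=0):
        """Approximate the top-k principal components of a centered n x p matrix X."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        # Random test matrix: compress the p feature dimensions to k + oversampling.
        Omega = rng.standard_normal((p, k + n_oversample))
        Y = X @ Omega
        # Power iterations sharpen the spectrum when singular values decay slowly.
        for _ in range(n_power_iter):
            Y = X @ (X.T @ Y)
        # Orthonormal basis for the approximate range of X.
        Q, _ = np.linalg.qr(Y)
        # Solve the small problem exactly in the reduced subspace.
        B = Q.T @ X
        _, s, Vt = np.linalg.svd(B, full_matrices=False)
        return Vt[:k].T, s[:k] ** 2 / (n - 1)  # loadings, explained variances

    In this sketch the data matrix X is touched only through a handful of matrix-matrix products, which is the source of the computational advantage the abstract refers to.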

    Matrix-interpolation-based parametric model order reduction for multiconductor transmission lines with delays

    A novel parametric model order reduction technique based on matrix interpolation for multiconductor transmission lines (MTLs) with delays under design parameter variations is proposed in this brief. Matrix interpolation overcomes the oversize problem caused by input-output system-level interpolation-based parametric macromodels. The reduced state-space matrices are obtained using a higher-order Krylov subspace-based model order reduction technique, which is more efficient than Gramian-based parametric modeling, where the projection matrix is computed using a Cholesky factorization. The design space is divided into cells; the Krylov subspaces computed for each cell are merged and then truncated, with respect to their singular values, by an adaptive truncation algorithm to obtain a compact common projection matrix. The resulting reduced-order state-space matrices and the delays are interpolated using positive interpolation schemes, making the method computationally cheap and accurate for repeated system evaluations under different design parameter settings. The proposed technique is successfully applied to RLC (resistor-inductor-capacitor) and MTL circuits with delays.
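    As a hypothetical illustration of the merge-and-truncate step described above, the sketch below stacks per-cell Krylov bases and compresses them into one common projection matrix via an SVD with a relative singular-value tolerance. The names local_bases and tol, and the congruence-transform reducer, are assumptions rather than the authors' code, and the handling of delays is omitted.

    import numpy as np

    def common_projection_matrix(local_bases, tol=1e-8):
        """Merge per-cell Krylov bases (each n x r_i) and truncate by singular values."""
        V_all = np.hstack(local_bases)      # n x sum(r_i), generally rank-deficient
        U, s, _ = np.linalg.svd(V_all, full_matrices=False)
        rank = int(np.sum(s > tol * s[0]))  # adaptive truncation relative to s_max
        return U[:, :rank]                  # common orthonormal projection basis

    def reduce_system(A, B, C, V):
        """Project state-space matrices onto the common basis (congruence transform)."""
        return V.T @ A @ V, V.T @ B, C @ V

    Because every cell is reduced with the same basis V, the reduced matrices live in a common coordinate system and can be interpolated entry-wise across the design space, which is what makes the positive interpolation step well defined.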

    Structure Preserving Model Reduction of Parametric Hamiltonian Systems

    While reduced-order models (ROMs) have become popular for efficiently solving large systems of differential equations, the stability of reduced models over long-time integration remains a challenge. We present a greedy approach for ROM generation of parametric Hamiltonian systems that captures the symplectic structure of Hamiltonian systems to ensure the stability of the reduced model. At each iteration of the greedy selection, two new basis vectors are added to the linear vector space to increase the accuracy of the reduced basis. We use the error in the Hamiltonian due to model reduction as an error indicator to search the parameter space and identify the next best basis vectors. Under natural assumptions on the set of all solutions of the Hamiltonian system under variation of the parameters, we show that the greedy algorithm converges at an exponential rate. Moreover, we demonstrate that combining the greedy basis with the discrete empirical interpolation method also preserves the symplectic structure, which enables a reduction in the computational cost for nonlinear Hamiltonian systems. The efficiency, accuracy, and stability of this model reduction technique are illustrated through simulations of the parametric wave equation and the parametric Schrödinger equation.
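    The greedy loop can be sketched as follows. The helpers solve_full, ham_error, and symplectify are hypothetical placeholders for the full-order solver, the Hamiltonian error indicator, and a symplectic orthonormalization (e.g. symplectic Gram-Schmidt); the abstract does not specify their form, so this is a skeleton under those assumptions, not the authors' algorithm.

    import numpy as np

    def J2n(n):
        """Canonical symplectic matrix [[0, I], [-I, 0]] of size 2n x 2n."""
        I, Z = np.eye(n), np.zeros((n, n))
        return np.block([[Z, I], [-I, Z]])

    def greedy_symplectic_basis(params, solve_full, ham_error, symplectify,
                                max_iter=20, tol=1e-6):
        """Grow a symplectic basis A (with A.T @ J2n @ A = J2k) two columns at a time."""
        e = solve_full(params[0])                # initial snapshot (length 2n)
        J = J2n(e.shape[0] // 2)
        A = symplectify(np.column_stack([e, J @ e]))
        for _ in range(max_iter):
            # Error indicator: Hamiltonian error of the reduced model per parameter.
            errs = [ham_error(mu, A) for mu in params]
            if max(errs) < tol:
                break
            # Snapshot at the worst parameter; add it and its symplectic pair.
            e = solve_full(params[int(np.argmax(errs))])
            A = symplectify(np.column_stack([A, e, J @ e]))
        return A

    Adding the pair (e, J e) at each step keeps the enlarged basis compatible with the symplectic two-form, which is the structural property the abstract credits for stability over long-time integration.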