
    Random Projections For Large-Scale Regression

    Fitting linear regression models can be computationally very expensive in large-scale data analysis tasks when both the sample size and the number of variables are very large. Random projections are widely used as a dimension-reduction tool in machine learning and statistics. We discuss how random projections can be applied to linear regression problems to reduce computational cost, and give an overview of theoretical guarantees on the generalization error. It can be shown that combining random projections with least-squares regression leads to recovery similar to that of ridge regression and principal component regression. We also discuss possible improvements from averaging over multiple random projections, an approach that lends itself easily to parallel implementation. Comment: 13 pages, 3 figures.
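
    The averaging idea described above translates into a few lines of NumPy. The sketch below is not code from the paper; the data dimensions, the sparsity of the true coefficients, the scaling of the projection matrix, and the number of averaged projections are all illustrative assumptions.

```python
# Minimal sketch: least squares on randomly projected features,
# averaged over several independent projections (assumed setup).
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 500, 2000, 100            # samples, features, projected dimension

X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:20] = 1.0                     # sparse "true" coefficients (assumed)
y = X @ beta + 0.1 * rng.standard_normal(n)

def rp_least_squares(X, y, k, rng):
    """Fit OLS in a k-dimensional random subspace and map back to R^p."""
    S = rng.standard_normal((X.shape[1], k)) / np.sqrt(k)   # random projection
    gamma, *_ = np.linalg.lstsq(X @ S, y, rcond=None)       # low-dimensional fit
    return S @ gamma                                        # coefficients in original space

# Averaging over multiple projections; each fit is independent, so this
# step parallelizes trivially.
betas = [rp_least_squares(X, y, k, rng) for _ in range(10)]
beta_avg = np.mean(betas, axis=0)
print("relative error:", np.linalg.norm(beta_avg - beta) / np.linalg.norm(beta))
```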

    Investigation of the Accuracy of Solving Discrete Ill-Posed Problems by the Random Projection Method

    For solving discrete ill-posed problems (DIP), a method for determining the optimal dimension of the random matrix, based on analytical averaging of the random projection, has been developed; it keeps the solution error of such problems close to the minimum. The aim is to develop a method for determining the optimal size of the random matrix for DIP solving based on a refined evaluation of the input vector. Results and conclusions. A method of DIP solving based on analytical averaging of the random projection has been proposed. For this method, we developed a criterion for determining the number of rows of the random matrix that keeps the error of the DIP solution close to the minimum. We experimentally investigated the accuracy of the DIP solution obtained by a deterministic method based on analytical averaging of the random projection, with the optimal solution selected by the model selection criteria of Mallows, Akaike, Minimum Description Length, and one further criterion. The experiments showed that this last criterion and the Akaike criterion provide an error value close to the minimum.
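
    As a rough illustration of the approach summarized above, the sketch below solves an assumed ill-posed test problem by projecting the system with a k x m Gaussian matrix and selecting k with an AIC-style criterion. The test operator, noise level, and the exact form of the selection criterion are illustrative assumptions, not the paper's formulation.

```python
# Sketch: solve A x = b (discrete ill-posed) via random projection
# x_k = pinv(R A) R b, choosing the number of rows k of R with an
# AIC-style model selection rule (all settings assumed).
import numpy as np

rng = np.random.default_rng(1)
m = n = 100
# Ill-conditioned test operator with rapidly decaying singular values.
U, _ = np.linalg.qr(rng.standard_normal((m, m)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -8, n)
A = U @ np.diag(s) @ V.T
x_true = np.sin(np.linspace(0, 3, n))
b = A @ x_true + 1e-4 * rng.standard_normal(m)   # noisy right-hand side

def project_and_solve(k):
    """Project the system with a k x m Gaussian matrix and solve by pseudoinverse."""
    R = rng.standard_normal((k, m)) / np.sqrt(k)
    return np.linalg.pinv(R @ A) @ (R @ b)

best = None
for k in range(2, 40):
    x_k = project_and_solve(k)
    rss = np.sum((A @ x_k - b) ** 2)
    aic = m * np.log(rss / m) + 2 * k            # AIC-style criterion (assumed form)
    if best is None or aic < best[0]:
        best = (aic, k, x_k)

print("chosen k:", best[1],
      "relative error:", np.linalg.norm(best[2] - x_true) / np.linalg.norm(x_true))
```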

    A Generalized Framework on Beamformer Design and CSI Acquisition for Single-Carrier Massive MIMO Systems in Millimeter Wave Channels

    In this paper, we establish a general framework for reduced-dimensional channel state information (CSI) estimation and pre-beamformer design for frequency-selective massive multiple-input multiple-output (MIMO) systems employing single-carrier (SC) modulation in time division duplex (TDD) mode, by exploiting the joint angle-delay domain channel sparsity at millimeter (mm) wave frequencies. First, based on a generic subspace projection that takes the joint angle-delay power profile and user grouping into account, the reduced-rank minimum mean square error (RR-MMSE) instantaneous CSI estimator is derived for spatially correlated wideband MIMO channels. Second, the statistical pre-beamformer design is considered for frequency-selective SC massive MIMO channels. We examine the dimension-reduction problem and the construction of the subspace (beamspace) on which the RR-MMSE estimation can be realized as accurately as possible. Finally, a spatio-temporal domain correlator-type reduced-rank channel estimator, as an approximation of the RR-MMSE estimate, is obtained by carrying out least-squares (LS) estimation in a proper reduced-dimensional beamspace. It is observed that the proposed techniques show remarkable robustness to pilot interference (or contamination) with a significant reduction in pilot overhead.
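
    The reduced-dimensional estimation idea can be illustrated with a toy beamspace example. The sketch below is not the paper's RR-MMSE derivation or beamformer design; it is a simplified LS-in-beamspace stand-in, where the antenna count, the angular-cluster covariance model, the pilot model, and the noise level are all assumed for illustration.

```python
# Sketch: project pilot observations onto the dominant eigenvectors of an
# assumed channel covariance (a "beamspace"), do LS estimation there, and
# map the estimate back to the full antenna domain.
import numpy as np

rng = np.random.default_rng(2)
M, d, T = 64, 8, 16          # antennas, beamspace dimension, pilot length

# Assumed spatial covariance built from a few angular clusters.
angles = np.deg2rad([10, 25, 40])
A_steer = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(angles)))
R_h = A_steer @ A_steer.conj().T / len(angles) + 1e-3 * np.eye(M)

# Beamspace: top-d eigenvectors of the covariance (statistical projection).
eigval, eigvec = np.linalg.eigh(R_h)
S = eigvec[:, -d:]                               # M x d projection matrix

# One channel realization, pilot symbols, and noisy observations.
z = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
h = np.linalg.cholesky(R_h) @ z
P = rng.standard_normal((T, 1)) + 1j * rng.standard_normal((T, 1))
noise = 0.05 * (rng.standard_normal((T, M)) + 1j * rng.standard_normal((T, M)))
Y = np.outer(P.ravel(), h) + noise               # T x M received pilot block

# LS in the reduced beamspace, then back to the full antenna domain.
Yb = Y @ S.conj()                                # project observations: T x d
g_hat, *_ = np.linalg.lstsq(P, Yb, rcond=None)   # reduced-dimensional LS fit
h_hat = S @ g_hat.ravel()                        # M-dimensional channel estimate

print("relative error:", np.linalg.norm(h_hat - h) / np.linalg.norm(h))
```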