
    Principal Regression Analysis and the index leverage effect

    We revisit the index leverage effect, which can be decomposed into a volatility effect and a correlation effect. We investigate the latter using a matrix regression analysis that we call `Principal Regression Analysis' (PRA), for which we provide some analytical (using Random Matrix Theory) and numerical benchmarks. We find that downward index trends increase the average correlation between stocks (as measured by the most negative eigenvalue of the conditional correlation matrix) and make the market mode more uniform. Upward trends, on the other hand, also increase the average correlation between stocks but rotate the corresponding market mode *away* from uniformity. There are two time scales associated with these effects: a short one on the order of a month (20 trading days), and a longer one on the order of a year. We also find indications of a leverage effect for sectoral correlations as well, which reveals itself in the second and third modes of the PRA. (10 pages, 7 figures)
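The core quantity in this abstract, correlation conditioned on the direction of the index, can be illustrated with a minimal sketch. The data below are synthetic (a one-factor toy market, not the paper's data), and the comparison of average pairwise correlation in down-days versus up-days only mimics the effect the PRA is designed to measure:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical daily returns: 250 days x 5 stocks with a common market factor.
market = rng.normal(size=250)
returns = 0.5 * market[:, None] + rng.normal(size=(250, 5))

# Split days by the sign of the (equal-weighted) index return, then compare
# the average pairwise correlation in each regime.
index = returns.mean(axis=1)
down, up = returns[index < 0], returns[index >= 0]

def avg_corr(r):
    """Mean off-diagonal entry of the sample correlation matrix."""
    c = np.corrcoef(r, rowvar=False)
    return c[~np.eye(c.shape[0], dtype=bool)].mean()

c_down, c_up = avg_corr(down), avg_corr(up)
```

The paper's PRA goes further by regressing the whole conditional correlation matrix on the index return and examining the eigenmodes of the regression matrix; this sketch only shows the scalar "average correlation" ingredient.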

    Fast Ridge Regression with Randomized Principal Component Analysis and Gradient Descent

    We propose a new two-stage algorithm, LING, for large-scale regression problems. LING has the same risk as the well-known Ridge Regression under the fixed design setting and can be computed much faster. Our experiments show that LING performs well in terms of both prediction accuracy and computational efficiency compared with other large-scale regression algorithms such as Gradient Descent, Stochastic Gradient Descent and Principal Component Regression, on both simulated and real datasets.
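The abstract does not spell out LING's internals, but the title names its two ingredients, so the following is a hedged sketch of a generic two-stage scheme in that spirit: first regress on a few randomized principal components, then run gradient descent on the residual in the full predictor space. All data and the stage split are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 500, 50, 5
X = rng.normal(size=(n, d))
beta = rng.normal(size=d)
y = X @ beta + 0.1 * rng.normal(size=n)

# Stage 1: least squares on top-k components found by a randomized range finder.
Q, _ = np.linalg.qr(X @ rng.normal(size=(d, k)))          # sketch of range(X)
Vt = np.linalg.svd(Q.T @ X, full_matrices=False)[2]       # approx right vectors
Z = X @ Vt[:k].T                                          # approx PC scores
w, *_ = np.linalg.lstsq(Z, y, rcond=None)
resid = y - Z @ w

# Stage 2: gradient descent on the residual in the full d-dimensional space.
b = np.zeros(d)
lr = 1.0 / np.linalg.norm(X, 2) ** 2                      # 1 / sigma_max^2 step
for _ in range(200):
    b -= lr * (X.T @ (X @ b - resid))

yhat = Z @ w + X @ b
```

The point of such a split is that stage 1 cheaply removes the high-variance directions, leaving gradient descent a better-conditioned residual problem.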

    Least Squares Regression Principal Component Analysis

    Dimension reduction is an important technique in surrogate modeling and machine learning. In this thesis, we present three existing dimension reduction methods in detail and then propose a novel supervised dimension reduction method, `Least Squares Regression Principal Component Analysis' (LSR-PCA), applicable to both classification and regression dimension reduction tasks. To show the efficacy of this method, we present different examples in visualization, classification and regression problems, comparing it to state-of-the-art dimension reduction methods. Furthermore, we present a kernel version of LSR-PCA for problems where the inputs are correlated non-linearly. The examples demonstrate that LSR-PCA can be a competitive dimension reduction method.
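The abstract does not give the LSR-PCA objective, so the snippet below is only a generic illustration of *supervised* dimension reduction in the same spirit: compute ordinary principal components, then rank them by how well each one predicts the target in a least-squares sense, rather than by variance alone. The data and the ranking criterion are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 8))
y = X[:, 2] + 0.1 * rng.normal(size=200)   # target driven by one coordinate

# Ordinary PCA of the centered predictors.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                          # all principal-component scores

# Supervised step: per-component explained sum of squares for y,
# i.e. (z . y)^2 / ||z||^2 for each score vector z.
yc = y - y.mean()
ess = (scores.T @ yc) ** 2 / (scores ** 2).sum(axis=0)
keep = np.argsort(ess)[::-1][:2]            # keep the k = 2 most predictive
Z = scores[:, keep]
```

Unsupervised PCA could discard the predictive component if it has low variance; a supervised criterion like this avoids that failure mode, which is the general motivation for methods such as LSR-PCA.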

    Principal Component Regression Analysis of CO2 Emission

    A principal component regression (PCR) model is developed in this study for predicting and forecasting the abundance of CO2 emissions, the most important greenhouse gas in the atmosphere contributing to global warming. The model was compared with a supervised principal component regression (SPCR) model and was found to have more predictive power, based on the Akaike information criterion (AIC) and Schwarz information criterion (SIC) values of the models.
    Keywords: Global warming, CO2, Principal component regression (PCR), Supervised principal component regression (SPCR), Akaike information criterion (AIC), Schwarz information criterion (SIC)
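Principal component regression itself is standard: project the predictors onto their top-k principal components, then run ordinary least squares on the scores. A minimal sketch on synthetic factor-structured data (the emissions data of the study are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
f = rng.normal(size=(n, 2))                    # two latent factors
load = rng.normal(size=(2, 10))
X = f @ load + 0.3 * rng.normal(size=(n, 10))  # predictors driven by factors
y = f[:, 0] - f[:, 1] + 0.1 * rng.normal(size=n)

# PCR: top-k principal components of the centered predictors, then OLS.
k = 2
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T                              # component scores
coef, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
yhat = Z @ coef + y.mean()
```

PCR chooses components by predictor variance only; the SPCR variant mentioned in the abstract additionally uses the response when selecting components, which is the distinction the AIC/SIC comparison is assessing.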

    Functional linear regression via canonical analysis

    We study regression models for the situation where both dependent and independent variables are square-integrable stochastic processes. Questions concerning the definition and existence of the corresponding functional linear regression models, and some basic properties, are explored for this situation. We derive a representation of the regression parameter function in terms of the canonical components of the processes involved. This representation establishes a connection between functional regression and functional canonical analysis and suggests alternative approaches for the implementation of functional linear regression analysis. A specific procedure for the estimation of the regression parameter function using canonical expansions is proposed and compared with an established functional principal component regression approach. As an example of an application, we present an analysis of mortality data for cohorts of medflies, obtained in experimental studies of aging and longevity. (Published in Bernoulli at http://dx.doi.org/10.3150/09-BEJ228 by the International Statistical Institute/Bernoulli Society, http://isi.cbs.nl/BS/bshome.htm.)
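The canonical components the abstract builds on are, in the multivariate (discretized) case, the singular pairs of the whitened cross-covariance between the two sets of variables. A small sketch of that computation on synthetic data (the functional, infinite-dimensional version requires regularization that is omitted here):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
X = rng.normal(size=(n, 4))
Y = X @ rng.normal(size=(4, 3)) + 0.5 * rng.normal(size=(n, 3))

# Canonical correlations via SVD of the whitened cross-covariance:
# directions u, v maximizing corr(X u, Y v).
Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
Sxx = Xc.T @ Xc / (n - 1)
Syy = Yc.T @ Yc / (n - 1)
Sxy = Xc.T @ Yc / (n - 1)
Wx = np.linalg.inv(np.linalg.cholesky(Sxx))   # whitening transforms
Wy = np.linalg.inv(np.linalg.cholesky(Syy))
U, rho, Vt = np.linalg.svd(Wx @ Sxy @ Wy.T)   # rho = canonical correlations
```

Expanding the regression parameter in these canonical pairs, rather than in the principal components of X alone, is the alternative implementation route the paper compares against functional principal component regression.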

    Dynamics of Interest Rate Curve by Functional Auto-regression

    The paper applies methods of functional data analysis – functional auto-regression, principal components and canonical correlations – to the study of the dynamics of the interest rate curve. In addition, it introduces a novel statistical tool based on the singular value decomposition of the functional cross-covariance operator. This tool is better suited for prediction purposes than either principal components or canonical correlations. Based on this tool, the paper provides a consistent method for estimating the functional auto-regression of the interest rate curve. The theory is applied to estimating the dynamics of Eurodollar futures rates. The results suggest that future movements of interest rates are predictable only at very short and very long horizons.
    Keywords: Functional auto-regression, term structure dynamics, principal components, canonical correlations, singular value decomposition
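The paper's tool, an SVD of the cross-covariance operator between today's curve and tomorrow's, can be sketched in the discretized case. The autoregressive toy data below are an assumption for illustration, not Eurodollar futures:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical discretized curves: 200 days x 30 maturities, where
# tomorrow's curve depends linearly on today's plus noise.
n, m = 200, 30
today = rng.normal(size=(n, m))
tomorrow = 0.8 * today + 0.1 * rng.normal(size=(n, m))

# SVD of the empirical cross-covariance between consecutive curves;
# the leading singular pairs are the directions most useful for prediction.
Tc = today - today.mean(axis=0)
Nc = tomorrow - tomorrow.mean(axis=0)
C = Tc.T @ Nc / (n - 1)
U, s, Vt = np.linalg.svd(C)
```

Unlike canonical correlations, this decomposition needs no inversion (whitening) of the within-curve covariance, which is what makes it better behaved for prediction in the functional setting.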