
    A blocking and regularization approach to high dimensional realized covariance estimation

    We introduce a regularization and blocking estimator for well-conditioned high-dimensional daily covariances using high-frequency data. Using the Barndorff-Nielsen, Hansen, Lunde, and Shephard (2008a) kernel estimator, we estimate the covariance matrix block-wise and regularize it. A data-driven grouping of assets of similar trading frequency ensures the reduction of data loss due to refresh-time sampling. In an extensive simulation study mimicking the empirical features of the S&P 1500 universe, we show that the 'RnB' estimator yields efficiency gains and outperforms competing kernel estimators for varying liquidity settings, noise-to-signal ratios, and dimensions. An empirical application to forecasting daily covariances of the S&P 500 index confirms the simulation results.
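    The regularization step described above can be illustrated with a minimal sketch. The snippet below uses plain linear shrinkage toward a scaled identity, which is one standard way to make a rank-deficient sample covariance well-conditioned; it is not the paper's RnB estimator, and the function name `regularize_covariance` and the `shrinkage` value are illustrative assumptions.

```python
import numpy as np

def regularize_covariance(cov, shrinkage=0.1):
    """Shrink a sample covariance toward a scaled identity so the
    result is positive definite (well-conditioned).  This is generic
    linear shrinkage, not the paper's RnB estimator."""
    p = cov.shape[0]
    target = (np.trace(cov) / p) * np.eye(p)
    return (1.0 - shrinkage) * cov + shrinkage * target

rng = np.random.default_rng(0)
# Rank-deficient sample covariance: only 5 observations of 20 assets.
X = rng.standard_normal((5, 20))
S = np.cov(X, rowvar=False)
R = regularize_covariance(S, shrinkage=0.1)
# S is singular, but every eigenvalue of R is strictly positive.
print(np.linalg.matrix_rank(S), np.all(np.linalg.eigvalsh(R) > 0))
```

    In the high-dimensional setting of the paper, such a step guarantees invertibility of the estimated covariance, which forecasting and portfolio applications require.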

    Self-Organizing Time Map: An Abstraction of Temporal Multivariate Patterns

    This paper adopts and adapts Kohonen's standard Self-Organizing Map (SOM) for exploratory temporal structure analysis. The Self-Organizing Time Map (SOTM) applies SOM-type learning to one-dimensional arrays for individual time units, preserves orientation with short-term memory, and arranges the arrays in ascending order of time. The two-dimensional representation of the SOTM thus attempts twofold topology preservation: the horizontal direction preserves time topology and the vertical direction preserves data topology. This enables discovering the occurrence and exploring the properties of temporal structural changes in data. To represent the qualities and properties of SOTMs, we adapt measures and visualizations from the standard SOM paradigm and introduce a measure of temporal structural changes. The functioning of the SOTM, its visualizations, and its quality and property measures are illustrated on artificial toy data. The usefulness of the SOTM in a real-world setting is shown on poverty, welfare, and development indicators.
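    The core idea above can be sketched briefly: train a one-dimensional SOM for each time unit, initializing each map from the previous unit's codebook so that orientation carries over (the "short-term memory"). This is a simplified illustration, not the authors' implementation; the function names, unit count, learning rate, and neighborhood radius are all illustrative assumptions.

```python
import numpy as np

def train_1d_som(data, codebook, epochs=20, lr=0.3, radius=1.0):
    """Online training of a one-dimensional SOM (simplified sketch)."""
    n_units = len(codebook)
    for _ in range(epochs):
        for x in data:
            # Best-matching unit, then Gaussian neighborhood update.
            bmu = np.argmin(np.linalg.norm(codebook - x, axis=1))
            for j in range(n_units):
                h = np.exp(-((j - bmu) ** 2) / (2.0 * radius ** 2))
                codebook[j] += lr * h * (x - codebook[j])
    return codebook

def sotm(data_by_time, n_units=5, seed=0):
    """Train one 1-D SOM per time unit, warm-starting each from the
    previous codebook to preserve orientation across time."""
    rng = np.random.default_rng(seed)
    dim = data_by_time[0].shape[1]
    codebook = rng.standard_normal((n_units, dim))
    maps = []
    for data in data_by_time:
        codebook = train_1d_som(data, codebook.copy())
        maps.append(codebook.copy())
    return np.stack(maps)  # shape: (time units, SOM units, dim)

rng = np.random.default_rng(1)
# Three time units with a drifting 2-D data distribution.
data_by_time = [rng.standard_normal((50, 2)) + t for t in range(3)]
maps = sotm(data_by_time, n_units=5)
print(maps.shape)
```

    Stacking the per-time codebooks column by column yields the two-dimensional representation the abstract describes, with time along one axis and data topology along the other.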

    Will the US Economy Recover in 2010? A Minimal Spanning Tree Study

    We calculated the cross correlations between the half-hourly time series of the ten Dow Jones US economic sectors over the period February 2000 to August 2008, the two-year intervals 2002--2003, 2004--2005, and 2008--2009, and also over 11 segments within the present financial crisis, to construct minimal spanning trees (MSTs) of the US economy at the sector level. In all MSTs, a core-fringe structure is found, with consumer goods, consumer services, and the industrials consistently making up the core, and basic materials, oil and gas, healthcare, telecommunications, and utilities residing predominantly on the fringe. More importantly, we find that the MSTs can be classified into two distinct, statistically robust topologies: (i) star-like, with the industrials at the center, associated with low-volatility economic growth; and (ii) chain-like, associated with high-volatility economic crisis. Finally, we present statistical evidence, based on the emergence of a star-like MST in Sep 2009 and the MST staying robustly star-like throughout the Greek Debt Crisis, that the US economy is on track to a recovery.
    Comment: elsarticle class, includes amsmath.sty, graphicx.sty and url.sty. 68 pages, 16 figures, 8 tables. Abridged version of the manuscript presented at the Econophysics Colloquium 2010, incorporating reviewer comments.
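    The correlation-to-MST construction used in this line of work can be sketched as follows. The standard econophysics convention maps correlations to distances via d = sqrt(2(1 - rho)), and the MST is then built on the distance matrix; the synthetic return series here stand in for the sector data, which the abstract does not provide.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(1)
# Synthetic stand-in for ten sector return series (500 observations).
returns = rng.standard_normal((500, 10))

# Cross-correlation matrix, then the usual distance transform
# d_ij = sqrt(2 * (1 - rho_ij)).
rho = np.corrcoef(returns, rowvar=False)
dist = np.sqrt(2.0 * (1.0 - rho))

# MST of the complete weighted graph on the ten nodes.
mst = minimum_spanning_tree(dist)
edges = np.argwhere(mst.toarray() > 0)
print(len(edges))  # an MST on 10 nodes always has 9 edges
```

    Classifying a tree as star-like versus chain-like can then be done from its degree sequence: a star has one high-degree hub, while a chain has maximum degree two.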

    A new method for automatic Multiple Partial Discharge Classification

    A new wavelet-based feature parameter has been developed to represent the characteristics of PD activities, i.e. the wavelet decomposition energy of PD pulses measured from non-conventional ultra-wide-bandwidth PD sensors such as capacitive couplers (CC) or high-frequency current transformers (HFCT). The generated feature vectors can have different dimensions depending on the length of the recorded pulses. These high-dimensional feature vectors are then processed using Principal Component Analysis (PCA) to map the data into a three-dimensional space while preserving the three most significant components of the feature vector. In the three-dimensional mapped space, the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm is then applied to automatically classify the cluster(s) produced by the PCA. As the procedure is undertaken in a three-dimensional space, the obtained clustering results can be easily assessed. The classified PD sub-data sets are then reconstructed in the time domain as phase-resolved patterns to facilitate PD source type identification. The proposed approach has been successfully applied to PD data measured from electrical machines and power cables, where measurements were undertaken in different laboratories.
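    The PCA-then-DBSCAN stage of the pipeline can be sketched directly with scikit-learn. The synthetic feature vectors below stand in for the wavelet energy features (which are not specified in the abstract), and the `eps` and `min_samples` values are illustrative assumptions that would be tuned for real PD data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(42)
# Synthetic stand-in for two PD sources with separable
# high-dimensional (32-D) feature vectors.
source_a = rng.normal(0.0, 0.3, size=(100, 32))
source_b = rng.normal(4.0, 0.3, size=(100, 32))
features = np.vstack([source_a, source_b])

# Project onto the three most significant principal components.
mapped = PCA(n_components=3).fit_transform(features)

# Density-based clustering in the 3-D mapped space;
# DBSCAN labels outliers as -1.
labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(mapped)
n_clusters = len(set(labels) - {-1})
print(n_clusters)
```

    Each recovered cluster label then indexes a PD sub-data set whose pulses can be reassembled into a phase-resolved pattern for source identification, as the abstract describes.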