
    Estimated Correlation Matrices and Portfolio Optimization

    Financial correlations play a central role in financial theory and in many practical applications. From a theoretical point of view, the key interest is in a proper description of the structure and dynamics of correlations. From a practical point of view, the emphasis is on the ability of the developed models to provide adequate input for the numerous portfolio and risk management procedures used in the financial industry. This is crucial, since it has long been argued that correlation matrices determined from financial series contain a relatively large amount of noise and, in addition, most of the portfolio and risk management techniques used in practice can be quite sensitive to their inputs. In this paper we introduce a model (simulation)-based approach that can be used to systematically investigate the effect of different sources of noise in financial correlations in the portfolio and risk management context. To illustrate the usefulness of this framework, we develop several toy models for the structure of correlations and, treating the finiteness of the time series as the only source of noise, compare the performance of several correlation matrix estimators that were introduced in the academic literature and have since gained wide practical use. Based on this experience, we believe that our simulation-based approach can also be useful for the systematic investigation of several other problems of interest in finance.
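    A minimal sketch of the kind of experiment this abstract describes, assuming a hypothetical one-factor correlation structure (all parameters here are invented, not the authors'): simulate return series of finite length from a known "true" correlation matrix and measure how far the sample estimate drifts from it.

```python
import numpy as np

# Toy experiment in the spirit of the abstract (not the authors' code):
# measure how finite-sample noise distorts an estimated correlation matrix.
rng = np.random.default_rng(0)
n_assets = 20

# Hypothetical "true" one-factor correlation structure.
beta = rng.uniform(0.3, 0.9, n_assets)
true_corr = np.outer(beta, beta)
np.fill_diagonal(true_corr, 1.0)

L = np.linalg.cholesky(true_corr)

def frobenius_error(n_days):
    """Frobenius distance of the sample correlation matrix from the truth."""
    returns = rng.standard_normal((n_days, n_assets)) @ L.T
    est = np.corrcoef(returns, rowvar=False)
    return np.linalg.norm(est - true_corr)

for T in (50, 250, 2500):
    print(T, frobenius_error(T))
```

    Longer series shrink the error roughly as one over the square root of the series length, which is exactly the noise source the abstract's toy models isolate.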

    ZAP -- Enhanced PCA Sky Subtraction for Integral Field Spectroscopy

    We introduce the Zurich Atmosphere Purge (ZAP), an approach to sky subtraction based on principal component analysis (PCA) that we have developed for the Multi Unit Spectroscopic Explorer (MUSE) integral field spectrograph. ZAP employs filtering and data segmentation to enhance the inherent capabilities of PCA for sky subtraction. Extensive testing shows that ZAP reduces sky emission residuals while robustly preserving the flux and line shapes of astronomical sources. The method works in a variety of observational situations, from sparse fields with a low density of sources to filled fields in which the target source fills the field of view. Covering both of these situations makes the method generally applicable to many different science cases, and it should also be useful for other instrumentation. ZAP is available for download at http://muse-vlt.eu/science/tools. Comment: 12 pages, 7 figures, 1 table. Accepted to MNRAS.
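    The core PCA step can be sketched generically (this is not the ZAP implementation; the synthetic cube, mode count, and noise level are assumptions): build an eigenbasis of sky variability from spaxels assumed to be source-free, then subtract each spectrum's projection onto that basis.

```python
import numpy as np

# Generic PCA sky-subtraction sketch (not the ZAP implementation).
rng = np.random.default_rng(1)
n_spaxels, n_wave = 200, 300

# Synthetic sky: a few correlated emission modes plus white noise.
modes = rng.standard_normal((3, n_wave))
amps = rng.standard_normal((n_spaxels, 3))
cube = amps @ modes + 0.1 * rng.standard_normal((n_spaxels, n_wave))

# PCA basis of sky variability from the (assumed source-free) spaxels.
mean_sky = cube.mean(axis=0)
_, _, vt = np.linalg.svd(cube - mean_sky, full_matrices=False)
basis = vt[:3]                      # keep the dominant sky eigenspectra

def subtract_sky(spectrum):
    """Remove the mean sky plus its projection onto the PCA basis."""
    resid = spectrum - mean_sky
    return resid - basis.T @ (basis @ resid)

cleaned = np.apply_along_axis(subtract_sky, 1, cube)
print(cube.std(), cleaned.std())    # residuals shrink to the noise level
```

    The filtering and segmentation steps the abstract mentions sit on top of this basic projection; they are what keeps real astronomical flux out of the sky basis.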

    A Generalized Framework on Beamformer Design and CSI Acquisition for Single-Carrier Massive MIMO Systems in Millimeter Wave Channels

    In this paper, we establish a general framework for reduced-dimensional channel state information (CSI) estimation and pre-beamformer design for frequency-selective massive multiple-input multiple-output (MIMO) systems employing single-carrier (SC) modulation in time division duplex (TDD) mode, exploiting the joint angle-delay domain channel sparsity at millimeter (mm) wave frequencies. First, based on a generic subspace projection that takes the joint angle-delay power profile and user grouping into account, the reduced-rank minimum mean square error (RR-MMSE) instantaneous CSI estimator is derived for spatially correlated wideband MIMO channels. Second, the statistical pre-beamformer design is considered for frequency-selective SC massive MIMO channels. We examine the dimension reduction problem and the construction of the subspace (beamspace) on which the RR-MMSE estimate can be realized as accurately as possible. Finally, a spatio-temporal domain correlator-type reduced-rank channel estimator, as an approximation of the RR-MMSE estimate, is obtained by carrying out least squares (LS) estimation in a suitable reduced-dimensional beamspace. The proposed techniques are observed to show remarkable robustness to pilot interference (or contamination) with a significant reduction in pilot overhead.
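    A sketch of the reduced-rank MMSE idea under a deliberately simplified model (a single observation y = h + n with a known low-rank channel covariance; the antenna count, rank, and noise level are invented), not the paper's full SC/TDD setup: project onto the dominant eigen-subspace, then apply MMSE filtering in the reduced dimension.

```python
import numpy as np

# Reduced-rank MMSE sketch under simplified assumptions (y = h + n with
# known channel covariance R); the paper's model is far more detailed.
rng = np.random.default_rng(2)
n_ant, rank = 64, 8

# Hypothetical sparse covariance: energy concentrated in a few "beams".
U = np.linalg.qr(rng.standard_normal((n_ant, n_ant)))[0]
eigvals = np.zeros(n_ant)
eigvals[:rank] = np.linspace(10.0, 1.0, rank)   # dominant low-rank subspace
R = (U * eigvals) @ U.T

noise_var = 0.1
h = U[:, :rank] @ (rng.standard_normal(rank) * np.sqrt(eigvals[:rank]))
y = h + np.sqrt(noise_var) * rng.standard_normal(n_ant)

# Pre-beamformer: project onto the dominant eigen-subspace (beamspace),
# then apply the MMSE filter in the reduced dimension.
S = U[:, :rank]                                  # n_ant x rank beamspace
y_red = S.T @ y
R_red = np.diag(eigvals[:rank])
W = R_red @ np.linalg.inv(R_red + noise_var * np.eye(rank))
h_hat = S @ (W @ y_red)

print(np.linalg.norm(h - h_hat) / np.linalg.norm(h))
```

    The projection discards the noise in the many dimensions where the channel carries no energy, which is why the reduced-rank estimate beats the raw observation.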

    Hyperspectral colon tissue cell classification

    A novel algorithm to discriminate between normal and malignant tissue cells of the human colon is presented. Microscopic-level images of human colon tissue cells were acquired using hyperspectral imaging technology at contiguous wavelength intervals of visible light. While hyperspectral imagery provides a wealth of information, its large size normally implies high computational processing complexity. Several methods exist to avoid the so-called curse of dimensionality and hence reduce the computational complexity. In this study, we experimented with Principal Component Analysis (PCA) and two modifications of Independent Component Analysis (ICA). In the first stage of the algorithm, the extracted components are used to separate the four constituent parts of the colon tissue: nuclei, cytoplasm, lamina propria, and lumen. The segmentation is performed in an unsupervised fashion using the nearest centroid clustering algorithm. The segmented image is then used, in the second stage of the classification algorithm, to exploit the spatial relationship between the labeled constituent parts. Experimental results using supervised Support Vector Machine (SVM) classification based on multiscale morphological features show that normal and malignant tissue cells can be discriminated with a reasonable degree of accuracy.
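    The first stage (dimensionality reduction followed by unsupervised nearest-centroid clustering) can be sketched as follows; the band count, spectral signatures, and noise are synthetic stand-ins, not the paper's data:

```python
import numpy as np

# PCA + nearest-centroid clustering sketch on synthetic "hyperspectral"
# pixels; spectra and class structure are invented purely for illustration.
rng = np.random.default_rng(3)
n_bands, n_pixels = 40, 400

# Two hypothetical spectral signatures plus per-band noise.
sig_a = np.sin(np.linspace(0.0, 3.0, n_bands))
sig_b = np.cos(np.linspace(0.0, 3.0, n_bands))
labels = rng.integers(0, 2, n_pixels)
X = np.where(labels[:, None] == 0, sig_a, sig_b)
X = X + 0.2 * rng.standard_normal((n_pixels, n_bands))

# Dimensionality reduction: project onto the top principal components.
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ vt[:3].T                      # 3 components instead of 40 bands

# Unsupervised nearest-centroid clustering (a tiny k-means loop),
# initialized at the extremes of the first component.
centroids = Z[[Z[:, 0].argmin(), Z[:, 0].argmax()]]
for _ in range(10):
    assign = np.argmin(np.linalg.norm(Z[:, None] - centroids, axis=2), axis=1)
    centroids = np.array([Z[assign == k].mean(axis=0) for k in (0, 1)])

# Clusters are unlabeled, so score against whichever label mapping fits.
accuracy = max((assign == labels).mean(), (assign != labels).mean())
print(accuracy)
```

    In the paper's pipeline this clustering yields the four tissue-part labels; the second-stage SVM then works on spatial features computed from that label map.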

    Unsupervised Visual and Textual Information Fusion in Multimedia Retrieval - A Graph-based Point of View

    Multimedia collections are growing more than ever in size and diversity. Effective multimedia retrieval systems are thus critical for accessing these datasets from the end-user perspective and in a scalable way. We are interested in repositories of image/text multimedia objects, and we study multimodal information fusion techniques in the context of content-based multimedia information retrieval. We focus on graph-based methods, which have proven to provide state-of-the-art performance. We particularly examine two such methods: cross-media similarities and random-walk-based scores. From a theoretical viewpoint, we propose a unifying graph-based framework that encompasses the two aforementioned approaches. Our proposal allows us to highlight the core features one should consider when using a graph-based technique for the combination of visual and textual information. We compare cross-media and random-walk-based results using three different real-world datasets. From a practical standpoint, our extended empirical analysis allows us to provide insights and guidelines about the use of graph-based methods for multimodal information fusion in content-based multimedia information retrieval. Comment: An extended version of the paper "Visual and Textual Information Fusion in Multimedia Retrieval using Semantic Filtering and Graph based Methods", by J. Ah-Pine, G. Csurka and S. Clinchant, submitted to ACM Transactions on Information Systems.
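    A generic random-walk-with-restart scorer on a toy similarity graph illustrates the second family of methods (the graph, weights, and parameters below are illustrative, not from the paper): nodes stand for multimedia objects and edge weights for visual or textual similarities, and the walk's stationary distribution ranks nodes by relevance to a query node.

```python
import numpy as np

# Generic random-walk-with-restart scoring on a small similarity graph
# (an illustration of the method family discussed, not the paper's code).
W = np.array([
    [0.0, 0.8, 0.1, 0.0],
    [0.8, 0.0, 0.5, 0.1],
    [0.1, 0.5, 0.0, 0.9],
    [0.0, 0.1, 0.9, 0.0],
])
P = W / W.sum(axis=1, keepdims=True)     # row-stochastic transition matrix

def random_walk_scores(seed, alpha=0.85, n_iter=100):
    """Stationary scores of a walk restarting at `seed` with prob 1-alpha."""
    restart = np.zeros(len(P))
    restart[seed] = 1.0
    s = restart.copy()
    for _ in range(n_iter):
        s = alpha * P.T @ s + (1 - alpha) * restart
    return s

scores = random_walk_scores(seed=0)
print(scores)   # higher score = more relevant to the query node
```

    In a fusion setting, the edge weights would mix visual and textual similarities; the unifying framework in the paper makes that combination choice explicit.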

    Characterizing Signal Loss in the 21 cm Reionization Power Spectrum: A Revised Study of PAPER-64

    The Epoch of Reionization (EoR) is an uncharted era in our Universe's history during which the birth of the first stars and galaxies led to the ionization of neutral hydrogen in the intergalactic medium. Many experiments investigate the EoR by tracing the 21 cm line of neutral hydrogen. Because this signal is very faint and difficult to isolate, it is crucial to develop analysis techniques that maximize sensitivity and suppress contaminants in data. It is also imperative to understand the trade-offs between different analysis methods and their effects on power spectrum estimates. Specifically, with a statistical power spectrum detection in HERA's foreseeable future, it has become increasingly important to understand how certain analysis choices can lead to the loss of the EoR signal. In this paper, we focus on signal loss associated with power spectrum estimation. We describe the origin of this loss using both toy models and data taken by the 64-element configuration of the Donald C. Backer Precision Array for Probing the Epoch of Reionization (PAPER). In particular, we highlight how detailed investigations of signal loss have led to a revised, higher 21 cm power spectrum upper limit from PAPER-64. Additionally, we summarize errors associated with power spectrum error estimation that were previously unaccounted for. We focus on a subset of PAPER-64 data in this paper; revised power spectrum limits from the PAPER experiment are presented in a forthcoming paper by Kolopanis et al. (in prep.) and supersede results from previously published PAPER analyses. Comment: 25 pages, 18 figures. Accepted by ApJ.
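    The signal-loss mechanism can be illustrated with a toy quadratic estimator (this is not the PAPER pipeline; the mode, dimensions, and amplitudes are invented): when the inverse-covariance weights are estimated from the very data being analyzed, they partially "learn" the signal and suppress it relative to weighting with the known noise covariance.

```python
import numpy as np

# Toy illustration of the signal-loss mechanism discussed above (not the
# PAPER pipeline): inverse-covariance weights estimated from the data
# itself absorb part of the signal and bias the power estimate low.
rng = np.random.default_rng(4)
n_chan, n_samples = 16, 30

# One signal mode with random amplitude per sample, plus white noise.
signal_mode = np.sin(2 * np.pi * np.arange(n_chan) / n_chan)
amps = 5.0 * rng.standard_normal(n_samples)
data = amps[:, None] * signal_mode[None, :] \
       + rng.standard_normal((n_samples, n_chan))

def mode_power(x, C):
    """Mean squared amplitude of `signal_mode` under C^-1 weighting."""
    w = np.linalg.solve(C, signal_mode)
    est = x @ w / (signal_mode @ w)      # per-sample amplitude estimate
    return np.mean(est**2)

C_true = np.eye(n_chan)                  # known noise-only covariance
C_emp = data.T @ data / n_samples        # estimated from the data itself

p_true = mode_power(data, C_true)
p_emp = mode_power(data, C_emp)
print(p_true, p_emp)    # empirical weighting biases the estimate low
```

    For this toy the bias is guaranteed: by the Cauchy-Schwarz inequality the empirically weighted estimate never exceeds the one built from the true noise covariance, which is the kind of loss the revised PAPER-64 analysis quantifies and corrects for.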