379 research outputs found

    Functional dynamic factor models with application to yield curve forecasting

    Accurate forecasting of zero coupon bond yields for a continuum of maturities is paramount to bond portfolio management and derivative security pricing. Yet a universal model for yield curve forecasting has been elusive, and prior attempts often resulted in a trade-off between goodness of fit and consistency with economic theory. To address this, we propose a novel formulation which connects the dynamic factor model (DFM) framework with concepts from functional data analysis: a DFM with functional factor loading curves. The result is a model capable of forecasting functional time series, and in the yield curve context we show that it retains economic interpretation. Model estimation is achieved through an expectation-maximization algorithm in which the time series parameters and factor loading curves are estimated simultaneously in a single step. The implementation is computationally efficient and incorporates a data-driven smoothing parameter. We show that our model forecasts actual yield data well compared with existing approaches, especially under a profit-based assessment in a trading exercise, and we further illustrate its viability in applications outside of yield forecasting. Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org/), http://dx.doi.org/10.1214/12-AOAS551.
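    The abstract gives no code; the following is a rough, hypothetical sketch of the overall structure only, not the paper's single-step EM estimator: each observed yield curve is smoothed over maturity, factors are extracted with an ordinary SVD rather than jointly estimated functional loading curves, an AR(1) is fit to each factor, and the next curve is forecast. All names (forecast_curve, yields, maturities) are made up for illustration.

    import numpy as np
    from scipy.interpolate import UnivariateSpline

    def forecast_curve(yields, maturities, n_factors=3, grid=None):
        """Simplified two-step stand-in for a functional DFM forecast.

        yields: (T, M) array of zero-coupon yields, T dates by M maturities.
        maturities: (M,) maturities in years, increasing.
        Returns a one-step-ahead forecast of the curve evaluated on `grid`.
        """
        if grid is None:
            grid = maturities
        # Smooth each observed curve over maturity and evaluate on a common grid.
        smoothed = np.array([UnivariateSpline(maturities, y, s=1e-4)(grid)
                             for y in yields])
        # Extract discretized loading "curves" and factor scores by SVD
        # (the paper instead estimates smooth functional loadings and the
        # factor dynamics jointly by EM; this decoupling only mimics the idea).
        mean_curve = smoothed.mean(axis=0)
        U, s, Vt = np.linalg.svd(smoothed - mean_curve, full_matrices=False)
        loadings = Vt[:n_factors]
        factors = U[:, :n_factors] * s[:n_factors]
        # Fit an AR(1) to each factor by least squares and forecast one step ahead.
        forecast_factors = np.empty(n_factors)
        for k in range(n_factors):
            x, x_next = factors[:-1, k], factors[1:, k]
            forecast_factors[k] = (x @ x_next) / (x @ x) * factors[-1, k]
        return mean_curve + forecast_factors @ loadings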

    Robust regularized singular value decomposition with application to mortality data

    We develop a robust regularized singular value decomposition (RobRSVD) method for analyzing two-way functional data. The research is motivated by the application of modeling human mortality as a smooth two-way function of age group and year. The RobRSVD is formulated as a penalized loss minimization problem where a robust loss function is used to measure the reconstruction error of a low-rank matrix approximation of the data, and an appropriately defined two-way roughness penalty function is used to ensure smoothness along each of the two functional domains. By viewing the minimization problem as two conditional regularized robust regressions, we develop a fast iterative reweighted least squares algorithm to implement the method. Our implementation naturally incorporates missing values. Furthermore, our formulation allows rigorous derivation of leave-one-row/column-out cross-validation and generalized cross-validation criteria, which enable computationally efficient data-driven penalty parameter selection. The advantages of the new robust method over nonrobust ones are shown via extensive simulation studies and the mortality rate application. Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org/), http://dx.doi.org/10.1214/13-AOAS649.
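    As a rough illustration of the iteratively reweighted least squares idea behind such a robust rank-one fit (an assumption-laden sketch, not the authors' implementation, and one that ignores the missing-value handling the paper provides), one can alternate regularized weighted least-squares updates for the two vectors of the approximation, with Huber-type weights downweighting cells with large residuals. The penalty matrices Omega_u, Omega_v and the tuning constants below are placeholders.

    import numpy as np

    def robust_rank_one(X, Omega_u, Omega_v, lam_u=1.0, lam_v=1.0, c=1.345, n_iter=50):
        """Sketch of a robust, regularized rank-one approximation X ~ u v'.

        Omega_u, Omega_v: roughness penalty matrices for the two domains
        (e.g., squared second-difference operators for age and year).
        c: Huber tuning constant; residuals beyond c * scale are downweighted.
        """
        U0, s0, V0t = np.linalg.svd(X, full_matrices=False)
        u, v = U0[:, 0] * s0[0], V0t[0]                        # ordinary SVD start
        for _ in range(n_iter):
            R = X - np.outer(u, v)                             # current residuals
            scale = np.median(np.abs(R)) / 0.6745 + 1e-12      # robust scale (MAD)
            W = np.minimum(1.0, c * scale / (np.abs(R) + 1e-12))  # Huber weights
            # Update v: a single regularized weighted least-squares solve.
            A = np.diag((W * u[:, None] ** 2).sum(axis=0)) + lam_v * Omega_v
            v = np.linalg.solve(A, (W * X * u[:, None]).sum(axis=0))
            # Update u symmetrically.
            A = np.diag((W * v[None, :] ** 2).sum(axis=1)) + lam_u * Omega_u
            u = np.linalg.solve(A, (W * X * v[None, :]).sum(axis=1))
        return u, v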

    Functional principal components analysis via penalized rank one approximation

    Two existing approaches to functional principal components analysis (FPCA) are due to Rice and Silverman (1991) and Silverman (1996), both based on maximizing variance but introducing penalization in different ways. In this article we propose an alternative approach to FPCA using penalized rank one approximation to the data matrix. Our contributions are four-fold: (1) by considering invariance under scale transformation of the measurements, the new formulation sheds light on how regularization should be performed for FPCA and suggests an efficient power algorithm for computation; (2) it naturally incorporates spline smoothing of discretized functional data; (3) the connection with smoothing splines also facilitates construction of cross-validation or generalized cross-validation criteria for smoothing parameter selection that allow efficient computation; (4) different smoothing parameters are permitted for different FPCs. The methodology is illustrated with a real data example and a simulation. Published in the Electronic Journal of Statistics (http://www.i-journals.org/ejs/) by the Institute of Mathematical Statistics (http://www.imstat.org/), http://dx.doi.org/10.1214/08-EJS218.
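    A minimal sketch of a penalized power iteration in this spirit (assuming centered, discretized curves on a common grid and a placeholder roughness matrix Omega; the paper's exact update differs) alternates a score update with a smoothed update of the component:

    import numpy as np

    def penalized_pc1(Y, Omega, lam=1.0, n_iter=200, tol=1e-8):
        """Sketch: first principal component of Y via a penalized power algorithm.

        Y: (n, p) matrix of centered curves observed on a common grid.
        Omega: (p, p) roughness penalty (e.g., squared second differences).
        Returns scores (n,) and a smooth component (p,).
        """
        S = np.linalg.inv(np.eye(Y.shape[1]) + lam * Omega)  # ridge-type smoother
        v = np.linalg.svd(Y, full_matrices=False)[2][0]      # unpenalized start
        for _ in range(n_iter):
            u = Y @ v
            u /= np.linalg.norm(u)                           # update scores
            v_new = S @ (Y.T @ u)                            # smoothed component update
            v_new /= np.linalg.norm(v_new)
            if np.linalg.norm(v_new - v) < tol:
                v = v_new
                break
            v = v_new
        return Y @ v, v

    In practice the smoothing parameter (lam here) would be chosen by cross-validation or generalized cross-validation criteria such as those the paper derives, and a separate value can be used for each further component extracted from the residual matrix.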

    A two-way regularization method for MEG source reconstruction

    The MEG inverse problem refers to the reconstruction of the neural activity of the brain from magnetoencephalography (MEG) measurements. We propose a two-way regularization (TWR) method to solve the MEG inverse problem under the assumptions that only a small number of locations in space are responsible for the measured signals (focality), and each source time course is smooth in time (smoothness). The focality and smoothness of the reconstructed signals are ensured respectively by imposing a sparsity-inducing penalty and a roughness penalty in the data fitting criterion. A two-stage algorithm is developed for fast computation, where a raw estimate of the source time course is obtained in the first stage and then refined in the second stage by the two-way regularization. The proposed method is shown to be effective on both synthetic and real-world examples. Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org/), http://dx.doi.org/10.1214/11-AOAS531.
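    One plausible way to write such a two-way regularized criterion (the notation and the specific penalty forms are assumptions, not taken from the paper): with sensor measurements $Y$, lead-field matrix $L$, and source time courses collected in the rows $s_i$ of $S$,

    \min_{S} \; \|Y - L S\|_F^2 \;+\; \lambda_1 \sum_i \|s_i\|_2 \;+\; \lambda_2 \sum_i \|D s_i\|_2^2,

    where the group penalty $\lambda_1 \sum_i \|s_i\|_2$ zeroes out entire source time courses (focality) and $D$ is a temporal differencing operator, so $\|D s_i\|_2^2$ penalizes roughness in time (smoothness).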

    Programmable base editing of zebrafish genome using a modified CRISPR-Cas9 system.

    Precise genetic modifications in model animals are essential for biomedical research. Here, we report a programmable "base editing" system to induce precise base conversion with high efficiency in zebrafish. Using cytidine deaminase fused to Cas9 nickase, site-specific single-base mutations are achieved at rates of up to 28% across multiple gene loci. In addition, an engineered Cas9-VQR variant with 5'-NGA PAM specificity is used to induce base conversion in zebrafish, showing that Cas9 variants can be used to expand the utility of this technology. Collectively, this targeted base editing system represents a strategy for precise and effective genome editing in zebrafish. The use of base editing enables precise genetic modifications in model animals; here the authors demonstrate highly efficient single-base editing in zebrafish using modified Cas9 and its VQR variant with an altered PAM specificity.

    Key pathways and genes controlling the development and progression of clear cell renal cell carcinoma (ccRCC) based on gene set enrichment analysis

    BACKGROUND: Clear-cell renal cell carcinoma (ccRCC) is one of the most common types of kidney cancer in adults; however, its causes are not completely understood. This study was designed to identify the key pathways and genes associated with the occurrence and development of ccRCC, to clarify its pathogenesis at the gene and pathway level, and to provide theoretical evidence for targeted therapy of ccRCC. METHODS: Gene set enrichment analysis (GSEA) and meta-analysis were used to screen, at the transcriptional level, the pathways and genes that may affect the occurrence and progression of ccRCC. Corresponding pathways of significant genes were obtained with the online tool DAVID (http://david.abcc.ncifcrf.gov/). RESULTS: Thirty-seven consistent pathways, and key genes within them, were identified as related to ccRCC by the combined GSEA and meta-analysis. These pathways were mainly involved in metabolism, organismal systems, cellular processes and environmental information processing. CONCLUSION: The gene pathways identified here provide insight into the development of ccRCC. Further studies are needed to determine the biological functions of the identified genes.
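    As a small, generic illustration of the enrichment-score idea underlying GSEA (not the exact statistic or software used in this study), the sketch below computes the weighted Kolmogorov-Smirnov-style running sum for one gene set over a ranked gene list; gene identifiers and ranking scores are placeholders.

    import numpy as np

    def enrichment_score(ranked_genes, ranked_scores, gene_set, p=1.0):
        """GSEA-like enrichment score for one gene set.

        ranked_genes: genes sorted by decreasing association with the phenotype.
        ranked_scores: the corresponding ranking metric values.
        gene_set: collection of genes in the pathway of interest.
        """
        in_set = np.array([g in gene_set for g in ranked_genes])
        weights = np.abs(np.asarray(ranked_scores, dtype=float)) ** p
        hit = np.where(in_set, weights, 0.0)
        hit /= hit.sum()                            # step up at genes in the set
        miss = np.where(in_set, 0.0, 1.0)
        miss /= miss.sum()                          # step down at genes outside it
        running = np.cumsum(hit - miss)
        return running[np.argmax(np.abs(running))]  # maximal deviation from zero

    Significance of an observed score would typically be assessed by recomputing it on phenotype-permuted data, which is outside the scope of this sketch.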