5 research outputs found

    A Unified Framework for Fast Large-Scale Portfolio Optimization

    Full text link
    We introduce a unified framework for rapid, large-scale portfolio optimization that incorporates both shrinkage and regularization techniques. This framework addresses multiple objectives, including minimum variance, mean-variance, and the maximum Sharpe ratio, and also adapts to various portfolio weight constraints. For each optimization scenario, we detail the translation into the corresponding quadratic programming (QP) problem and then integrate these solutions into a new open-source Python library. Using 50 years of return data from US mid to large-sized companies, and 33 distinct firm-specific characteristics, we utilize our framework to assess the out-of-sample monthly rebalanced portfolio performance of widely-adopted covariance matrix estimators and factor models, examining both daily and monthly returns. These estimators include the sample covariance matrix, linear and nonlinear shrinkage estimators, and factor portfolios based on Asset Pricing (AP) Trees, Principal Component Analysis (PCA), Risk Premium PCA (RP-PCA), and Instrumented PCA (IPCA). Our findings emphasize that AP-Trees and PCA-based factor models consistently outperform all other approaches in out-of-sample portfolio performance. Finally, we develop new l1 and l2 regularizations of factor portfolio norms which not only elevate the portfolio performance of AP-Trees and PCA-based factor models but also have the potential to reduce the excessive turnover and transaction costs often associated with these models. Comment: 35 pages, 11 figures
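
    A minimal sketch, not the paper's open-source library (which the abstract does not name), of how the minimum-variance objective with a fully-invested, long-only weight constraint translates into a QP. The synthetic return matrix, the cvxpy modelling layer, and scikit-learn's Ledoit-Wolf estimator standing in for the linear shrinkage covariance are all assumptions made for illustration.

    import numpy as np
    import cvxpy as cp
    from sklearn.covariance import LedoitWolf

    rng = np.random.default_rng(0)
    returns = rng.normal(scale=0.01, size=(500, 20))    # T x N synthetic daily returns (illustrative only)

    sigma = LedoitWolf().fit(returns).covariance_       # linear shrinkage covariance estimate

    w = cp.Variable(returns.shape[1])
    objective = cp.Minimize(cp.quad_form(w, sigma))     # portfolio variance w' Sigma w
    constraints = [cp.sum(w) == 1, w >= 0]              # fully invested, long-only weights
    cp.Problem(objective, constraints).solve()

    weights = w.value                                   # minimum-variance weights

    The mean-variance objective fits the same pattern by subtracting a linear expected-return term from the quadratic objective; the maximum Sharpe ratio case requires a standard change of variables before it reduces to a QP.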

    Optimized collaborative filtering algorithm based on item rating prediction

    No full text
    Conference Name: 2012 2nd International Conference on Instrumentation and Measurement, Computer, Communication and Control, IMCCC 2012. Conference Address: Harbin, Heilongjiang, China. Time: December 8, 2012 - December 10, 2012. Collaborative filtering is currently the most widely used personalized recommendation algorithm, but the sparsity of user rating data leaves the recommendation quality of traditional collaborative filtering algorithms far from ideal. To address this problem, the paper first uses a cloud model together with item characteristic attributes to compute item similarity, taking into account both the similarity of rating scores between items and the similarity of their characteristic attributes, and then predicts ratings for unrated items. Finally, the cloud model is used to compute the similarity between users and obtain the target user's nearest neighbors. Experimental results show that the algorithm improves the accuracy of the computed item similarity, effectively alleviates the data sparsity problem, and improves the quality of the recommendations. © 2012 IEEE
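
    A minimal sketch of the blended item-similarity idea: rating-based similarity over co-rated users is combined with attribute similarity, and the blend then drives a weighted prediction for unrated items. Cosine similarity stands in for the paper's cloud-model measure, and the blend weight alpha and the toy rating/attribute matrices are assumptions.

    import numpy as np

    def cosine_sim(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom else 0.0

    def item_similarity(ratings, attributes, i, j, alpha=0.5):
        # ratings: users x items (0 = unrated); attributes: items x features
        both = (ratings[:, i] > 0) & (ratings[:, j] > 0)            # users who rated both items
        rating_sim = cosine_sim(ratings[both, i], ratings[both, j]) if both.any() else 0.0
        attr_sim = cosine_sim(attributes[i], attributes[j])
        return alpha * rating_sim + (1 - alpha) * attr_sim          # blended item similarity

    def predict_rating(ratings, attributes, user, item, k=5):
        rated = np.flatnonzero(ratings[user] > 0)                   # items this user has rated
        sims = np.array([item_similarity(ratings, attributes, item, j) for j in rated])
        order = np.argsort(sims)[-k:]                               # k most similar rated items
        top, top_sims = rated[order], sims[order]
        if top_sims.sum() == 0:
            return float(ratings[ratings > 0].mean())               # fall back to the global mean
        return float(top_sims @ ratings[user, top] / top_sims.sum())

    # toy demo: 4 users x 5 items, 2 binary attribute features per item
    R = np.array([[5, 3, 0, 1, 4],
                  [4, 0, 0, 1, 5],
                  [1, 1, 5, 4, 0],
                  [0, 1, 5, 4, 3]], dtype=float)
    A = np.array([[1, 0], [1, 0], [0, 1], [0, 1], [1, 1]], dtype=float)
    print(predict_rating(R, A, user=0, item=2))                     # predicted rating for unrated item 2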

    Study on discrete particle swarm optimization algorithm

    No full text
    Conference Name: 2nd International Conference on Advanced Design and Manufacturing Engineering, ADME 2012. Conference Address: Taiyuan, China. Time: August 16, 2012 - August 18, 2012. The particle swarm optimization (PSO) algorithm is a relatively new global search method whose research has mostly focused on continuous variables, with little attention to discrete ones; discrete forms and discretization methods have received more attention in recent years. This paper first introduces the basic principles and mechanisms of the PSO algorithm, then outlines its procedure and describes the operation rules of discrete PSO algorithms. Various improvements and applications of discrete PSO algorithms are reviewed, and the mechanisms and characteristics of two different discretization strategies are presented. Some development trends and future research directions for discrete PSO are proposed. © 2012 Trans Tech Publications, Switzerland
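
    A minimal sketch of one widely used discretization strategy that surveys of this kind cover, the sigmoid-based binary PSO of Kennedy and Eberhart: the velocity update stays continuous, and each bit is resampled with probability sigmoid(v). The one-max objective and all parameter values are illustrative assumptions, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(1)
    n_particles, n_bits, iters = 20, 30, 100
    w, c1, c2 = 0.7, 1.5, 1.5                          # inertia and acceleration coefficients (assumed values)

    x = rng.integers(0, 2, (n_particles, n_bits))      # binary positions
    v = np.zeros((n_particles, n_bits))                # continuous velocities
    fitness = lambda pop: pop.sum(axis=1)              # toy one-max objective: count of ones

    pbest, pbest_f = x.copy(), fitness(x)
    gbest = pbest[pbest_f.argmax()].copy()

    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # standard PSO velocity update
        prob = 1.0 / (1.0 + np.exp(-v))                             # sigmoid turns velocity into bit probability
        x = (rng.random(x.shape) < prob).astype(int)                # resample each bit
        f = fitness(x)
        better = f > pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]       # update personal bests
        gbest = pbest[pbest_f.argmax()].copy()                      # update global best

    print("best one-max value found:", pbest_f.max(), "of", n_bits)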