Social Networks, Asset Allocation and Portfolio Diversification
In this thesis we consider the problem of choosing financial assets from the equity markets for portfolio construction purposes. We adapt various measures to model the dependence structure among financial assets, taking both linear and non-linear relationships into consideration. The dependence structure is represented by social networks of assets. We apply the data clustering technique of Frey and Dueck (2007) to these networks and study equity selections based on the different dependence measures. A regime switching model (Perlin, 2014) is considered as well, in order to identify changes in market phases. The performance of the equity selections is evaluated within the mean-variance framework. In addition, we present a diversification analysis of the equity selections using the methodology proposed by Meucci (2009). The numerical tests are applied to three major Chinese equity markets. By varying the market environment, we gain a good understanding of the factors that influence the choice of financial assets.
Affinity-Based Reinforcement Learning : A New Paradigm for Agent Interpretability
The steady increase in complexity of reinforcement learning (RL) algorithms is accompanied by a corresponding increase in opacity that obfuscates insights into their devised strategies. Methods in explainable artificial intelligence seek to mitigate this opacity by either creating transparent algorithms or extracting explanations post hoc. A third category exists that allows the developer to affect what agents learn: constrained RL has been used in safety-critical applications and prohibits agents from visiting certain states; preference-based RL agents have been used in robotics applications and learn state-action preferences instead of traditional reward functions. We propose a new affinity-based RL paradigm in which agents learn strategies that are partially decoupled from reward functions. Unlike entropy regularisation, we regularise the objective function with a distinct action distribution that represents a desired behaviour; we encourage the agent to act according to a prior while learning to maximise rewards. The result is an inherently interpretable agent that solves problems with an intrinsic affinity for certain actions. We demonstrate the utility of our method in a financial application: we learn continuous time-variant compositions of prototypical policies, each interpretable by its action affinities, that are globally interpretable according to customers' financial personalities.
Our method combines advantages from both constrained RL and preference-based RL: it retains the reward function but generalises the policy to match a defined behaviour, thus avoiding problems such as reward shaping and hacking. Unlike Boolean task composition, our method is a fuzzy superposition of different prototypical strategies to arrive at a more complex, yet interpretable, strategy.
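A minimal sketch of the affinity-regularised objective described above, assuming a KL penalty toward a fixed prior over discrete actions; the network shape, the prior, and the coefficient beta are illustrative assumptions rather than the thesis's exact formulation:

    import torch
    import torch.nn.functional as F

    n_actions = 4
    policy_net = torch.nn.Sequential(
        torch.nn.Linear(8, 64), torch.nn.ReLU(), torch.nn.Linear(64, n_actions)
    )

    # Desired behaviour ("affinity"): favour actions 0 and 1. Assumed for illustration.
    prior = torch.tensor([0.4, 0.4, 0.1, 0.1])
    beta = 0.1  # strength of the affinity regulariser (assumed)

    def loss_fn(states, actions, advantages):
        log_probs = F.log_softmax(policy_net(states), dim=-1)
        # Standard policy-gradient term: maximise expected advantage.
        pg = -(log_probs.gather(1, actions.unsqueeze(1)).squeeze(1) * advantages).mean()
        # Affinity term: KL(pi(.|s) || prior) pulls the policy toward the prior,
        # whereas entropy regularisation would pull it toward the uniform distribution.
        kl = (log_probs.exp() * (log_probs - prior.log())).sum(dim=-1).mean()
        return pg + beta * kl

    states = torch.randn(16, 8)
    actions = torch.randint(0, n_actions, (16,))
    advantages = torch.randn(16)
    print(loss_fn(states, actions, advantages))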
Representation learning in finance
Finance studies often employ heterogeneous datasets from different sources with different structures and frequencies. Some data are noisy, sparse, and unbalanced with missing values; some are unstructured, containing text or networks. Traditional techniques often struggle to combine these datasets and extract information from them effectively. This work explores representation learning as a proven machine learning technique for learning informative embeddings from complex, noisy, and dynamic financial data. This dissertation proposes novel factorization algorithms and network modeling techniques to learn local and global representations of data in two specific financial applications: analysts' earnings forecasts and asset pricing.
Financial analysts' earnings forecasts are among the most critical inputs for security valuation and investment decisions. However, it is challenging to fully utilize this type of data because of missing values. This work proposes one matrix-based algorithm, "Coupled Matrix Factorization," and one tensor-based algorithm, the "Nonlinear Tensor Coupling and Completion Framework," to impute missing values in analysts' earnings forecasts and then use the imputed data to predict firms' future earnings. Experimental analysis shows that missing value imputation and representation learning by coupled matrix/tensor factorization from the observed entries improve the accuracy of firm earnings prediction. The results confirm that representing financial time series in their natural third-order tensor form improves the latent representation of the data: it learns high-quality embeddings by avoiding the information loss that comes from flattening the data along spatial or temporal dimensions.
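A minimal sketch of the underlying idea of imputation by factorizing only the observed entries. This simplified single-matrix version with synthetic data is an illustrative stand-in for the coupled matrix/tensor algorithms named above, not a reproduction of them:

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(30, 12))      # firms x quarters (synthetic forecast values)
    mask = rng.random(X.shape) > 0.3   # True where a forecast was actually issued

    rank, lr, lam = 4, 0.01, 0.1
    U = rng.normal(scale=0.1, size=(X.shape[0], rank))
    V = rng.normal(scale=0.1, size=(X.shape[1], rank))

    # Gradient descent on the squared error of the observed entries only.
    for _ in range(2000):
        resid = mask * (U @ V.T - X)
        gU = resid @ V + lam * U
        gV = resid.T @ U + lam * V
        U -= lr * gU
        V -= lr * gV

    # Keep observed values; fill missing entries with the low-rank reconstruction.
    X_imputed = np.where(mask, X, U @ V.T)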
Traditional asset pricing models focus on linear relationships among asset pricing factors and often ignore nonlinear interactions among firms and factors. This dissertation formulates novel methods to identify nonlinear asset pricing factors and develops asset pricing models that capture global and local properties of data. First, this work proposes an artificial neural network ("autoencoder") based model to capture the latent asset pricing factors from the global representation of an equity index. It also shows that the autoencoder effectively identifies communal and non-communal assets in an index to facilitate portfolio optimization. Second, the global representation is augmented by propagating information from local communities, where the network determines the strength of this information propagation. Based on the Laplacian spectrum of the equity market network, a network factor, the "Z-score," is proposed to facilitate pertinent information propagation and capture dynamic changes in network structures. Finally, a "Dynamic Graph Learning Framework for Asset Pricing" is proposed to combine both global and local representations of data into one end-to-end asset pricing model. Using a graph attention mechanism and an information diffusion function, the proposed model learns new connections for implicit networks and refines the connections of explicit networks. Experimental analysis shows that the proposed model incorporates information from negative and positive connections, captures the network evolution of the equity market over time, and outperforms other state-of-the-art asset pricing and predictive machine learning models in stock return prediction.
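A minimal sketch of an autoencoder extracting latent factors from a panel of asset returns, in the spirit of the global-representation model above; the layer sizes, factor count, and synthetic data are illustrative assumptions:

    import torch

    n_assets, n_factors = 100, 5
    encoder = torch.nn.Sequential(
        torch.nn.Linear(n_assets, 32), torch.nn.ReLU(), torch.nn.Linear(32, n_factors)
    )
    decoder = torch.nn.Sequential(
        torch.nn.Linear(n_factors, 32), torch.nn.ReLU(), torch.nn.Linear(32, n_assets)
    )
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

    returns = torch.randn(500, n_assets)  # days x assets (synthetic return panel)
    for _ in range(200):
        factors = encoder(returns)        # low-dimensional "pricing factors" per day
        recon = decoder(factors)          # reconstructed cross-section of returns
        loss = torch.nn.functional.mse_loss(recon, returns)
        opt.zero_grad()
        loss.backward()
        opt.step()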
In a broader context, this is a pioneering work in FinTech, particularly in understanding complex financial market structures and developing explainable artificial intelligence models for finance applications. This work effectively demonstrates the application of machine learning to model financial networks, capture nonlinear interactions in data, and provide investors with powerful data-driven techniques for informed decision-making.
A FRAMEWORK FOR STRATEGIC PROJECT ANALYSIS AND PRIORITIZATION
Projects that support an organization's long-term strategic intent and alignment are considered strategic projects. These projects must therefore be assessed for alignment with the organization's current strategy, with attention to risk, organizational capability, resource availability, political influence, and socio-cultural factors. Quantitative and qualitative methods exist for prioritizing projects; however, they are usually tailored to specific industries. Although prioritization models are used in the private sector, comparable models for the public sector are not widely seen in the literature. This gap stems from the social implications of public projects and from the differing perceptions of value attached to different types of projects across decision-making environments in the public sector.
The thesis proposes a generic framework to develop a priority list from the available basket of projects and decide which projects to undertake next. The focus of the thesis is on public projects. The framework considers critical prioritization factors obtained from the literature and grouped through agglomerative text clustering. In the proposed framework, 13 critical clusters are identified, weighted using the Criteria Importance Through Intercriteria Correlation (CRITIC) method, and used to rank projects with the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). In addition, the proposed framework uses vector weighting to prioritize projects across industries.
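A minimal sketch of the CRITIC weighting and TOPSIS ranking steps named above, on a small synthetic decision matrix; the data and the assumption that all criteria are benefit-type are illustrative, and the thesis's 13 criterion clusters are not reproduced:

    import numpy as np

    X = np.array([[7., 0.3, 120.],   # project A: scores on three criteria
                  [5., 0.6,  80.],   # project B
                  [9., 0.2, 150.]])  # project C

    # --- CRITIC weights: contrast (std) x conflict (1 - correlation) per criterion ---
    N = (X - X.min(0)) / (X.max(0) - X.min(0))
    std = N.std(axis=0, ddof=1)
    corr = np.corrcoef(N.T)
    info = std * (1 - corr).sum(axis=1)
    w = info / info.sum()

    # --- TOPSIS ranking ---
    R = X / np.sqrt((X ** 2).sum(axis=0))    # vector normalisation
    V = R * w
    ideal, anti = V.max(axis=0), V.min(axis=0)   # benefit criteria assumed
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    closeness = d_neg / (d_pos + d_neg)
    print(np.argsort(-closeness))            # project indices in priority order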
The applicability of the framework is demonstrated on Qatar's real estate and transportation projects. The outcomes obtained from the framework are compared with those obtained from experts using the System Usability Scale (SUS). The comparison shows that the framework provides good predictability of which projects to select for implementation.
Nature inspired computational intelligence for financial contagion modelling
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Financial contagion refers to a scenario in which small shocks, which initially affect only a few financial institutions or a particular region of the economy, spread to the rest of the financial sector and to other countries whose economies were previously healthy. This resembles the "transmission" of a medical disease. Financial contagion happens at both the domestic and the international level. At the domestic level, the failure of a domestic bank or financial intermediary typically triggers transmission by defaulting on inter-bank liabilities, selling assets in a fire sale, and undermining confidence in similar banks. An example of this phenomenon is the failure of Lehman Brothers and the subsequent turmoil in the US financial markets. International financial contagion happens in both advanced and developing economies, and is the transmission of financial crises across financial markets. Within the current globalised financial system, with large volumes of cash flow and cross-regional operations of large banks and hedge funds, financial contagion usually happens simultaneously among domestic institutions and across countries. There is no conclusive definition of financial contagion; most research papers study contagion by analyzing the change in the variance-covariance matrix during periods of market turmoil. King and Wadhwani (1990) first test the correlations between the US, UK and Japan during the US stock market crash of 1987. Boyer (1997) finds significant increases in correlation during financial crises, reinforcing a definition of financial contagion as a change in correlation during the crash period. Forbes and Rigobon (2002) give a definition of financial contagion; in their work, the term interdependence is used as the alternative to contagion. They claim that for the period they study there is no contagion but only interdependence, where interdependence leads to common price movements during periods of both stability and turmoil. In the past two decades, many studies (e.g. Kaminsky et al., 1998; Kaminsky, 1999) have developed early warning systems focused on the origins of financial crises rather than on financial contagion. Further authors (e.g. Forbes and Rigobon, 2002; Caporale et al., 2005), on the other hand, have focused on studying contagion or interdependence.
In this thesis, an overall mechanism is proposed that simulates the characteristics of a crisis propagating through contagion. Within that scope, a new co-evolutionary market model is developed in which some of the technical traders change their behaviour during a crisis and transform into herd traders, making their decisions based on market sentiment rather than on underlying strategies or factors. The thesis focuses on the transformation of market interdependence into contagion and on the contagion effects. The author first builds a multi-national platform that allows different types of players to trade, implementing their own rules and considering information from the domestic market and a foreign market. Traders' strategies and the performance of the simulated domestic market are trained using historical prices on both markets, optimizing the artificial market's parameters through an immune particle swarm optimization technique (I-PSO). The author also introduces a mechanism contributing to the transformation of technical traders into herd traders.
A generalized autoregressive conditional heteroscedasticity-copula (GARCH-copula) model is further applied to calculate the tail dependence between the affected market and the origin of the crisis. That parameter is used in the fitness function for selecting the best solutions within the evolving population of possible model parameters, and therefore in the optimization criteria for contagion simulation. The overall model is also applied in predictive mode, where the author optimizes over the pre-crisis period using data from the domestic market and the crisis-origin foreign market, and then predicts the affected domestic market in the crisis period using data from the foreign market.
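A minimal sketch of the GARCH-copula tail dependence step, assuming GARCH(1,1) margins, a Clayton copula calibrated from Kendall's tau, and synthetic return series; the copula family and estimator are assumptions, not necessarily those used in the thesis:

    import numpy as np
    from scipy.stats import kendalltau, rankdata
    from arch import arch_model

    rng = np.random.default_rng(2)
    domestic = rng.normal(0, 1, 1000)                       # placeholder return series
    foreign = 0.6 * domestic + 0.8 * rng.normal(0, 1, 1000)

    def std_resid(returns):
        # GARCH(1,1) margin; standardised residuals = residuals / conditional volatility.
        res = arch_model(returns, vol="Garch", p=1, q=1).fit(disp="off")
        return res.resid / res.conditional_volatility

    # Probability integral transform via empirical ranks.
    u = rankdata(std_resid(domestic)) / (len(domestic) + 1)
    v = rankdata(std_resid(foreign)) / (len(foreign) + 1)

    tau, _ = kendalltau(u, v)
    theta = 2 * tau / (1 - tau)           # Clayton parameter from Kendall's tau
    lower_tail_dep = 2 ** (-1 / theta)    # lambda_L, usable in the fitness function
    print(f"theta={theta:.3f}, lower tail dependence={lower_tail_dep:.3f}")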
Applied Metaheuristic Computing
For decades, Applied Metaheuristic Computing (AMC) has been a prevailing optimization technique for tackling perplexing engineering and business problems, such as scheduling, routing, ordering, bin packing, assignment, and facility layout planning. This is partly because classic exact methods are constrained by prior assumptions, and partly because heuristics are problem-dependent and lack generalization. AMC, by contrast, guides the course of low-level heuristics to search beyond the local optima that limit traditional computational methods. This topic series has collected quality papers proposing cutting-edge methodology and innovative applications that drive the advances of AMC.
An evolutionary theory of systemic risk and its mitigation for the global financial system
This thesis is the outcome of theory development research into an identified gap in knowledge about systemic risk of the global financial system. It takes a systems-theoretic approach, incorporating a simulation-constructivist orientation towards the meaning of theory and theory development, within a realist constructivism epistemology for knowledge generation about complex social phenomena. Its specific purpose is to describe systemic risk of failure and explain how it occurs in the global financial system, in order to diagnose and understand the circumstances in which it arises, and to offer insights into how that risk may be mitigated.
An outline theory is developed, introducing a new operational definition of systemic risk of failure in which notions from evolutionary economics, finance and complexity science are combined with a general interpretation of entropy to explain how catastrophic phenomena arise in that system. When a conceptual model incorporating the Icelandic financial system failure over the years 2003–2008 is constructed from this theory, and the results of simulation experiments using a verified computational representation of the model are validated against empirical data from that event and corroborated by theoretical triangulation, a null hypothesis about the theory is refuted. Furthermore, the results show that the interplay between a lack of diversity in system participation strategies and shared exposure to potential losses may be a key operational mechanism of catastrophic tensions arising in the supply and demand of financial services. These findings suggest new policy guidance: pre-emptive intervention calls for improved operational transparency from system participants, and prompt access to data about their operational behaviour, in order to prevent positive feedback from inducing a failure of the system to operate within required parameters.
The theory is then revised to reflect new insights exposed by simulation, and is finally submitted as a new theory capable of unifying existing knowledge in this problem domain.
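A minimal sketch of one ingredient mentioned above, the link between entropy and diversity of participation strategies: Shannon entropy of the distribution of participants across strategies serves as a diversity indicator, so falling entropy flags the build-up of shared exposure. The strategy counts below are purely illustrative, not data from the thesis:

    import numpy as np

    def strategy_entropy(counts):
        p = np.asarray(counts, dtype=float)
        p = p[p > 0] / p.sum()
        return -(p * np.log(p)).sum()

    diverse = [25, 25, 25, 25]        # participants spread evenly over four strategies
    crowded = [85, 5, 5, 5]           # most participants crowd into one strategy
    print(strategy_entropy(diverse))  # ~1.386, the maximum for four strategies
    print(strategy_entropy(crowded))  # lower entropy: less diversity, more fragility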
High-Performance Modelling and Simulation for Big Data Applications
This open access book was prepared as a Final Publication of the COST Action IC1406 "High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)" project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction rises to give a better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. Seamless interaction between High Performance Computing and Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.
- …