332,274 research outputs found

    Extraction of the underlying structure of systematic risk from non-Gaussian multivariate financial time series using independent component analysis: Evidence from the Mexican stock exchange

    Because the multivariate non-Gaussianity of financial time series makes the extraction of underlying risk factors via Principal Component Analysis or Factor Analysis unreliable, we use Independent Component Analysis (ICA) to estimate the pervasive risk factors that explain the returns on stocks in the Mexican Stock Exchange. The extracted systematic risk factors are considered within a statistical definition of the Arbitrage Pricing Theory (APT), which is tested by means of a two-stage econometric methodology. Using the extracted factors, we find evidence of a suitable estimation via ICA and some results in favor of the APT.
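    The ICA extraction step can be sketched as follows. This is a minimal FastICA in NumPy on synthetic data, not the authors' estimation procedure; the factors, mixing matrix, and sample size are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Two hypothetical non-Gaussian "systematic factors" (illustration only)
s1 = np.sign(rng.normal(size=n)) * rng.exponential(size=n)   # fat-tailed
s2 = rng.uniform(-1.0, 1.0, size=n)                          # sub-Gaussian
S = np.vstack([s1, s2])

A = np.array([[1.0, 0.5],
              [0.4, 1.0]])        # hypothetical mixing: observed returns = A @ factors
X = A @ S

# Whiten the observed "returns"
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Z = (E / np.sqrt(d)) @ E.T @ X

# FastICA by deflation with a tanh nonlinearity
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        g = np.tanh(w @ Z)
        w_new = (Z * g).mean(axis=1) - (1.0 - g**2).mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)    # stay orthogonal to earlier rows
        w_new /= np.linalg.norm(w_new)
        done = abs(w_new @ w) > 1.0 - 1e-10
        w = w_new
        if done:
            break
    W[i] = w

recovered = W @ Z   # estimated independent factors (up to sign and order)
```

    Unlike PCA, which only decorrelates, the fixed-point iteration above maximizes non-Gaussianity, which is what lets it undo the mixing.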

    Formal Availability Analysis using Theorem Proving

    Availability analysis is used to assess the possible failures and the restoration process of a given system. It involves calculating the instantaneous and steady-state availabilities of the individual system components and using this information, along with commonly used availability modeling techniques such as Availability Block Diagrams (ABDs) and Fault Trees (FTs), to determine the system-level availability. Traditionally, availability analyses are conducted using paper-and-pencil methods and simulation tools, but these cannot ascertain absolute correctness because of their inherent inaccuracies. As a complementary approach, we propose using the higher-order-logic theorem prover HOL4 to conduct the availability analysis of safety-critical systems. For this purpose, we present a higher-order-logic formalization of instantaneous and steady-state availability, ABD configurations, and generic unavailability FT gates. For illustration, these formalizations are used to conduct a formal availability analysis of a satellite solar array, the main source of power for the Dong Fang Hong-3 (DFH-3) satellite.
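    The component-level quantities behind this kind of analysis are simple to state. A minimal sketch (in Python rather than HOL4, with hypothetical MTTF/MTTR figures) of steady-state availability and its propagation through series and parallel ABD configurations:

```python
def steady_state_availability(mttf, mttr):
    """Long-run fraction of time a repairable component is up."""
    return mttf / (mttf + mttr)

def series_abd(avails):
    """Series ABD: the system is available only if every block is."""
    p = 1.0
    for a in avails:
        p *= a
    return p

def parallel_abd(avails):
    """Parallel ABD: the system is available if at least one block is."""
    q = 1.0
    for a in avails:
        q *= 1.0 - a
    return 1.0 - q

# Hypothetical component: MTTF of 99 hours, MTTR of 1 hour
a = steady_state_availability(99.0, 1.0)   # 0.99
```

    The paper's contribution is to prove such relations once and for all in higher-order logic, rather than recomputing them approximately per system.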

    Finite-size effect and the components of multifractality in financial volatility

    Many financial variables are found to exhibit multifractal nature, which is usually attributed to the influence of temporal correlations and of fat tails in the probability distribution function (PDF). Based on the partition function approach to multifractal analysis, we show that there is a marked finite-size effect in the detection of multifractality; the effective multifractality is the apparent multifractality remaining after the finite-size effect is removed. We find that the effective multifractality can be further decomposed into two components: the PDF component and the nonlinearity component. Taking the normal distribution as reference, we determine the PDF component by comparing the effective multifractality of the original time series with that of surrogate data that are normally distributed but retain the same linear and nonlinear correlations as the original data. We demonstrate the method on the daily volatility of the Dow Jones Industrial Average from 26 May 1896 to 27 April 2007. Extensive numerical experiments show that a time series exhibits effective multifractality only if it possesses nonlinearity, and that the PDF affects the effective multifractality only when the time series possesses nonlinearity. The method can also be applied to judge the presence of multifractality and determine its components in time series from other complex systems.
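    The partition function approach mentioned above can be sketched numerically. The example below estimates the mass exponent tau(q) for a binomial cascade, a standard synthetic multifractal; it is purely illustrative and not the paper's procedure:

```python
import numpy as np

def binomial_measure(p, levels):
    """Binomial cascade: each box splits its mass into fractions p and 1-p."""
    m = np.array([1.0])
    for _ in range(levels):
        m = np.concatenate([p * m, (1.0 - p) * m])
    return m

def tau_estimate(measure, q, box_sizes):
    """Slope of log Z_q(s) vs log s, where the partition function Z_q(s)
    sums box masses at scale s raised to the power q."""
    n = len(measure)
    xs, ys = [], []
    for s in box_sizes:
        boxes = measure.reshape(n // s, s).sum(axis=1)
        xs.append(np.log(s / n))
        ys.append(np.log(np.sum(boxes ** q)))
    return np.polyfit(xs, ys, 1)[0]

m = binomial_measure(0.3, 12)
t2 = tau_estimate(m, 2.0, [2, 4, 8, 16, 32, 64])
```

    For a monofractal, tau(q) = q - 1, so tau(2) = 1; the binomial cascade gives the analytic value -log2(p^q + (1-p)^q), and the departure from q - 1 is the signature of multifractality that the finite-size and surrogate analysis in the paper then dissects.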

    A new framework for yield curve, output and inflation relationships

    This article develops a theoretically consistent and easy-to-apply framework for interpreting, investigating, and monitoring the relationships between the yield curve, output, and inflation. The framework predicts that steady-state inflation plus steady-state output growth should be cointegrated with the long-maturity level of the yield curve as estimated by an arbitrage-free version of the Nelson and Siegel (1987) model, while the shape of the yield curve from that model should correspond to the profile (that is, the timing and magnitude) of expected future inflation and output growth. These predicted relationships are confirmed empirically using 51 years of United States data. The framework may be used for monitoring the expectations of inflation and output growth implied by the yield curve, and it should also provide a basis for using the yield curve to value and hedge derivatives on macroeconomic data.
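    For reference, the Nelson and Siegel (1987) curve that the framework builds on can be written down directly. Parameterizations of the decay parameter vary across the literature, so the form below is one common convention, not necessarily the exact one used in this article:

```python
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel yield at maturity tau: level (beta0), slope (beta1) and
    curvature (beta2) factors, with decay parameter lam."""
    x = tau / lam
    slope = (1.0 - np.exp(-x)) / x
    curvature = slope - np.exp(-x)
    return beta0 + beta1 * slope + beta2 * curvature
```

    As maturity grows the yield tends to beta0, the long-maturity level that the abstract ties to steady-state inflation plus output growth; as maturity shrinks it tends to beta0 + beta1.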

    Bayesian forecasting and scalable multivariate volatility analysis using simultaneous graphical dynamic models

    The recently introduced class of simultaneous graphical dynamic linear models (SGDLMs) scales on-line Bayesian analysis and forecasting to higher-dimensional time series. This paper advances the methodology of SGDLMs, developing and embedding a novel, adaptive method of simultaneous predictor selection in forward filtering for on-line learning and forecasting. The advances include developments in Bayesian computation for scalability and a case study exploring the resulting potential for improved short-term forecasting of large-scale volatility matrices. The case study concerns financial forecasting and portfolio optimization with a 400-dimensional series of daily stock prices. The analysis shows that the SGDLM forecasts volatilities and co-volatilities well, making it well suited to quantitative investment strategies aimed at improving portfolio returns. We also identify performance metrics linked to the sequential Bayesian filtering analysis that turn out to define a leading indicator of increased financial market stress, comparable to but leading the standard St. Louis Fed Financial Stress Index (STLFSI) measure. Parallel computation using GPU implementations substantially advances the ability to fit and use these models.

    Towards the Formal Reliability Analysis of Oil and Gas Pipelines

    It is customary to assess the reliability of underground oil and gas pipelines in the presence of excessive loading and corrosion effects to ensure leak-free transport of hazardous materials. The main idea behind this reliability analysis is to model the given pipeline system as a Reliability Block Diagram (RBD) of segments, such that the reliability of an individual pipeline segment can be represented by a random variable. Traditionally, computer simulation is used to perform this reliability analysis, but it provides approximate results and requires an enormous amount of CPU time to attain reasonable estimates. Because of this approximate nature, simulation is not well suited to analyzing safety-critical systems like oil and gas pipelines, where even minor analysis flaws may have catastrophic consequences. As an accurate alternative, we propose using a higher-order-logic (HOL) theorem prover for the reliability analysis of pipelines. As a first step towards this idea, this paper provides a higher-order-logic formalization of reliability and of the series RBD using the HOL theorem prover. For illustration, we present the formal analysis of a simple pipeline that can be modeled as a series RBD of segments with exponentially distributed failure times.
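    The series RBD with exponential failure times mentioned above has a simple closed form, which a short simulation-free sketch (Python, not HOL; hypothetical failure rates) makes concrete:

```python
import math

def series_reliability(failure_rates, t):
    """Series RBD with exponentially distributed segment failure times:
    the system works only if every segment works, so
    R_sys(t) = prod_i exp(-lambda_i * t) = exp(-(sum_i lambda_i) * t)."""
    return math.exp(-sum(failure_rates) * t)

def series_mttf(failure_rates):
    """Mean time to failure of the series system: 1 / (sum of rates)."""
    return 1.0 / sum(failure_rates)

# Hypothetical three-segment pipeline with per-year failure rates
rates = [0.1, 0.2, 0.3]
r_two_years = series_reliability(rates, 2.0)
```

    It is exactly this kind of closed-form result that the paper's HOL formalization proves once, for any number of segments, instead of approximating by simulation.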

    Attributing returns and optimising United States swaps portfolios using an intertemporally-consistent and arbitrage-free model of the yield curve

    This paper uses the volatility-adjusted orthonormalised Laguerre polynomial model of the yield curve (the VAO model) from Krippner (2005), an intertemporally-consistent and arbitrage-free version of the popular Nelson and Siegel (1987) model, to develop a multi-dimensional yield-curve-based risk framework for fixed interest portfolios. The VAO model is also used to identify relative value (i.e. potential excess returns) in the universe of securities that define the yield curve. In combination, these risk and return elements provide an intuitive framework for attributing portfolio returns ex-post, and for optimising portfolios ex-ante. The empirical applications use six years of daily United States interest rate swap data. The first application shows that the main sources of fixed interest portfolio risk (i.e. unanticipated variability in ex-post returns) are first-order (‘duration’) effects from stochastic shifts in the level and shape of the yield curve; second-order (‘convexity’) effects and other contributions are immaterial. The second application shows that fixed interest portfolios optimised ex-ante using the VAO model risk/relative-value framework significantly outperform a naive evenly-weighted benchmark over time.

    Principal Component Analysis of Volatility Smiles and Skews

    Several principal component models of volatility smiles and skews have been based on daily changes in implied volatilities, by strike and/or by moneyness. Derman and Kamal (1997) analyze S&P 500 and Nikkei 225 index options, where the daily change in the volatility surface is specified by delta and maturity. Skiadopoulos, Hodges and Clewlow (1998) apply PCA to first differences of implied volatilities for fixed maturity buckets, across both strike and moneyness metrics. Fengler et al. (2000) employ a common PCA that allows options on equities in the DAX with different maturities to be analyzed simultaneously.
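    The common starting point of these studies, PCA on daily implied-volatility changes, can be sketched as follows. The synthetic "smile" data with a single parallel-shift factor is invented for illustration and stands in for the real option panels used in the papers:

```python
import numpy as np

def pca_of_vol_changes(dvols, k):
    """PCA on daily first differences of implied vols (rows: days, cols: strikes).
    Returns the first k loading vectors and the variance fraction each explains."""
    X = dvols - dvols.mean(axis=0)
    cov = X.T @ X / (len(X) - 1)
    vals, vecs = np.linalg.eigh(cov)          # eigh returns ascending eigenvalues
    order = np.argsort(vals)[::-1]
    vals, vecs = vals[order], vecs[:, order]
    return vecs[:, :k], vals[:k] / vals.sum()

# Hypothetical smile dynamics dominated by a parallel shift across 5 strikes
rng = np.random.default_rng(1)
shift = rng.normal(size=(500, 1))
dvols = shift @ np.ones((1, 5)) + 0.01 * rng.normal(size=(500, 5))
loadings, explained = pca_of_vol_changes(dvols, 2)
```

    With such data the first component carries nearly all the variance and its loadings share one sign, the usual "level" factor of the smile; skew and curvature factors show up in the subsequent components of real data.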

    Dynamics and sparsity in latent threshold factor models: A study in multivariate EEG signal processing

    We discuss Bayesian analysis of multivariate time series with dynamic factor models that exploit time-adaptive sparsity in model parametrizations via the latent threshold approach. One central focus is the transfer responses of multiple interrelated series to underlying, dynamic latent factor processes. Structured priors on model hyper-parameters are key to the efficacy of dynamic latent thresholding, and MCMC-based computation enables model fitting and analysis. A detailed case study of electroencephalographic (EEG) data from experimental psychiatry highlights the use of latent threshold extensions of time-varying vector autoregressive and factor models. The study explores a class of dynamic transfer response factor models, extending prior Bayesian modeling of multiple EEG series and highlighting the practical utility of the latent thresholding concept in multivariate, non-stationary time series analysis.
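    The latent threshold mechanism itself is simple to state: a time-varying coefficient contributes to the model only while its latent value exceeds a threshold in magnitude, and is shrunk exactly to zero otherwise. A minimal sketch with illustrative values (not the paper's full MCMC machinery):

```python
import numpy as np

def latent_threshold(beta, d):
    """Time-adaptive sparsity: keep a latent time-varying coefficient only
    where |beta_t| >= d; set it exactly to zero elsewhere."""
    beta = np.asarray(beta, dtype=float)
    return np.where(np.abs(beta) >= d, beta, 0.0)

# A coefficient path that drifts in and out of relevance over four periods
path = [0.05, -0.3, 0.0, 0.6]
effective = latent_threshold(path, 0.2)
```

    Applied across all coefficients of a time-varying VAR or factor loading matrix, this yields the dynamic sparsity patterns the paper exploits for EEG transfer-response analysis.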
