99 research outputs found

    Two-step Nonnegative Matrix Factorization Algorithm for the Approximate Realization of Hidden Markov Models

    We propose a two-step algorithm for the construction of a Hidden Markov Model (HMM) of assigned size, i.e. cardinality of the state space of the underlying Markov chain, whose n-dimensional distribution is closest in divergence to a given distribution. The algorithm is based on the factorization of a pseudo Hankel matrix, defined in terms of the given distribution, into the product of a tall and a wide nonnegative matrix. The implementation relies on a nonnegative matrix factorization (NMF) algorithm. To evaluate the performance of our algorithm, we ran numerical simulations in the context of HMM order reduction.
    Comment: presented at MTNS2010, Budapest, July 2010
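    The factorization step can be illustrated with plain multiplicative-update NMF. The abstract does not specify which NMF variant the authors use, so this is a generic sketch; the matrix H and the rank r=2 below are toy values, not data from the paper.

```python
import numpy as np

def nmf(H, r, n_iter=1000, eps=1e-9, seed=0):
    """Factor a nonnegative matrix H (m x n) into W (m x r) @ V (r x n)
    using Lee-Seung multiplicative updates for the Frobenius objective."""
    rng = np.random.default_rng(seed)
    m, n = H.shape
    W = rng.random((m, r)) + eps
    V = rng.random((r, n)) + eps
    for _ in range(n_iter):
        V *= (W.T @ H) / (W.T @ W @ V + eps)   # update the wide factor
        W *= (H @ V.T) / (W @ V @ V.T + eps)   # update the tall factor
    return W, V

# Toy stand-in for a nonnegative pseudo Hankel matrix (exact rank 2).
H = np.array([[1.0, 2.0, 3.0],
              [2.0, 3.0, 5.0],
              [3.0, 5.0, 8.0]])
W, V = nmf(H, r=2)
print(np.linalg.norm(H - W @ V))   # small residual
```

    In the HMM context, the tall and wide factors would then be normalized into the parameters of the approximate realization; that step depends on the paper's construction and is not reproduced here.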

    On Fisher's information matrix of an ARMA process and Sylvester's resultant matrix


    On the solution of Stein's equation and Fisher information matrix of an ARMAX process

    The main goal of this paper is to express the solution of a Stein equation in terms of the Fisher information matrix (FIM) of a scalar ARMAX process. A condition for expressing the FIM in terms of a solution to a Stein equation is also set forth. Such interconnections can be derived when a companion matrix, with eigenvalues equal to the roots of an appropriate polynomial associated with the ARMAX process, is inserted in the Stein equation. The case of algebraic multiplicity greater than or equal to one is studied. The FIM and the corresponding solution to Stein's equation are presented as solutions to systems of linear equations. The interconnections are obtained by using the common particular solution of these systems. The kernels of the structured coefficient matrices are described, as are some right inverses. This enables us to find a solution to the newly obtained linear system of equations.
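    As a minimal illustration of the Stein-equation ingredient (the paper's companion-matrix construction is not reproduced; A, B, C below are arbitrary toy matrices), a Stein equation X - A X B = C can be solved by Kronecker vectorization, using the identity vec(A X B) = (B' kron A) vec(X) with column-stacked vec:

```python
import numpy as np

def solve_stein(A, B, C):
    """Solve the Stein equation X - A X B = C via vectorization:
    (I - B^T kron A) vec(X) = vec(C), with column-stacking vec."""
    n, m = C.shape
    K = np.eye(n * m) - np.kron(B.T, A)
    x = np.linalg.solve(K, C.flatten(order='F'))
    return x.reshape((n, m), order='F')

A = np.array([[0.5, 0.1], [0.0, 0.3]])
B = np.array([[0.2, 0.0], [0.1, 0.4]])
C = np.array([[1.0, 2.0], [3.0, 4.0]])
X = solve_stein(A, B, C)
print(np.allclose(X - A @ X @ B, C))   # True
```

    The linear system is uniquely solvable whenever no product of an eigenvalue of A and an eigenvalue of B equals one, which holds for the toy matrices above.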

    On the resultant property of the Fisher information matrix of a vector ARMA process

    A matrix is called a multiple resultant matrix associated with two matrix polynomials when it becomes singular if and only if the two matrix polynomials have at least one common eigenvalue. In this paper a new multiple resultant matrix is introduced. It concerns the Fisher information matrix (FIM) of a stationary vector autoregressive moving average (VARMA) time series process. The two matrix polynomials are the autoregressive and moving average matrix polynomials of the VARMA process. In order to show that the FIM is a multiple resultant matrix, two new representations of the FIM are derived. To construct these representations, appropriate matrix differential rules are applied. The newly obtained representations are expressed in terms of the multiple Sylvester matrix and the tensor Sylvester matrix. The representation of the FIM expressed by the tensor Sylvester matrix is used to prove that the FIM becomes singular if and only if the autoregressive and moving average matrix polynomials have at least one common eigenvalue. It then follows that the FIM and the tensor Sylvester matrix have equivalent singularity conditions. A simple numerical example shows, however, that the FIM can fail to detect common eigenvalues due to numerical instability, whereas the tensor Sylvester matrix reveals them clearly, demonstrating the usefulness of the results derived in this paper.
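    The scalar analogue of the resultant property is easy to demonstrate: the classical Sylvester matrix of two scalar polynomials is singular if and only if they share a root. The sketch below covers only this scalar case, not the multiple or tensor Sylvester matrices of the paper; the polynomials are invented for the example.

```python
import numpy as np

def sylvester(a, b):
    """Sylvester matrix of scalar polynomials a (degree p) and b (degree q),
    coefficients highest degree first; singular iff a and b share a root."""
    p, q = len(a) - 1, len(b) - 1
    S = np.zeros((p + q, p + q))
    for i in range(q):                  # q shifted copies of a
        S[i, i:i + p + 1] = a
    for i in range(p):                  # p shifted copies of b
        S[q + i, i:i + q + 1] = b
    return S

a = [1.0, -3.0, 2.0]   # (z - 1)(z - 2)
b = [1.0, 2.0, -3.0]   # (z - 1)(z + 3): shares the root z = 1 with a
c = [1.0, 0.0, -9.0]   # (z - 3)(z + 3): no root in common with a
print(np.linalg.det(sylvester(a, b)))   # ~0, common root detected
print(np.linalg.det(sylvester(a, c)))   # nonzero, no common root
```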

    A dimension reduction approach for loss valuation in credit risk modeling

    This paper addresses the “curse of dimensionality” in the loss valuation of credit risk models. A dimension reduction methodology based on the Bayesian filter and smoother is proposed. This methodology is designed to achieve a fast and accurate loss valuation algorithm in credit risk modeling, but it can also be extended to valuation models of other risk types. The proposed methodology is generic, robust and can easily be implemented. Moreover, the accuracy of the proposed methodology in the estimation of expected loss and value-at-risk (VaR) is illustrated by numerical experiments. The results suggest that, compared to the widely used Principal Component Analysis (PCA) approach, the proposed methodology provides more accurate estimation of the expected loss and VaR of a loss distribution.
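    The abstract does not detail the filter, so purely as an illustration of the Bayesian-filtering ingredient, here is a minimal scalar Kalman filter tracking a single latent factor from noisy observations; the model, parameters and simulated data are invented for the example and are not the paper's credit-loss model.

```python
import numpy as np

def kalman_filter(y, a, q, r, m0=0.0, p0=1.0):
    """Scalar Kalman filter for the latent-factor model
        x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)   (latent systematic factor)
        y_t = x_t + v_t,          v_t ~ N(0, r)   (noisy observation)
    Returns filtered means and variances."""
    means, variances = [], []
    m, p = m0, p0
    for yt in y:
        m, p = a * m, a * a * p + q    # predict
        k = p / (p + r)                # Kalman gain
        m = m + k * (yt - m)           # update mean
        p = (1.0 - k) * p              # update variance
        means.append(m)
        variances.append(p)
    return np.array(means), np.array(variances)

# Simulate a factor path and filter it.
rng = np.random.default_rng(1)
x, ys = 0.0, []
for _ in range(200):
    x = 0.9 * x + rng.normal(scale=0.5)
    ys.append(x + rng.normal(scale=1.0))
m, p = kalman_filter(np.array(ys), a=0.9, q=0.25, r=1.0)
```

    A backward (Rauch-Tung-Striebel) smoothing pass over the same quantities would give the smoothed estimates the abstract also mentions; it is omitted here for brevity.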

    Liquidity-free implied volatilities: an approach using conic finance

    We consider the problem of calculating risk-neutral implied volatilities of European options without relying on option mid prices, using only bid and ask prices. We provide an approach, based on the conic finance paradigm, that makes it possible to uniquely strip risk-neutral implied volatilities from bid and ask quotes, and that does not require restrictive assumptions. Our methodology also allows the implied liquidity of the market to be calculated jointly. The idea outlined in this paper can be applied to calculate other implied parameters from bid and ask security prices, provided their theoretical risk-neutral counterparts are strictly increasing in those parameters.
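    The closing monotonicity condition is what makes such stripping well-posed. As a simple illustration (this is plain Black-Scholes mid-price inversion, not the conic-finance bid/ask procedure of the paper), an implied volatility can be recovered by bisection precisely because the call price is strictly increasing in the volatility:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(s, k, t, r, sigma):
    """Black-Scholes European call price; strictly increasing in sigma."""
    d1 = (log(s / k) + (r + 0.5 * sigma * sigma) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

def implied_vol(price, s, k, t, r, lo=1e-6, hi=5.0, tol=1e-10):
    """Bisection inversion, valid because bs_call is monotone in sigma."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, t, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round-trip check: price at sigma = 0.2, then invert.
target = bs_call(100.0, 100.0, 1.0, 0.01, 0.2)
print(round(implied_vol(target, 100.0, 100.0, 1.0, 0.01), 6))   # 0.2
```

    The conic-finance approach performs an analogous inversion jointly on the bid and the ask, adding a distortion (liquidity) parameter; that joint inversion is not reproduced here.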