    Practical volatility and correlation modeling for financial market risk management

    What do academics have to offer market risk management practitioners in financial institutions? Current industry practice largely follows one of two extremely restrictive approaches: historical simulation or RiskMetrics. In contrast, we favor flexible methods based on recent developments in financial econometrics, which are likely to produce more accurate assessments of market risk. Clearly, the demands of real-world risk management in financial institutions - in particular, real-time risk tracking in very high-dimensional situations - impose strict limits on model complexity. Hence we stress parsimonious models that are easily estimated, and we discuss a variety of practical approaches for high-dimensional covariance matrix modeling, along with what we see as some of the pitfalls and problems in current practice. In so doing we hope to encourage further dialog between the academic and practitioner communities, stimulating the development of improved market risk management technologies that draw on the best of both worlds.
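
    For readers unfamiliar with the industry baseline criticized above: RiskMetrics is essentially an exponentially weighted moving average (EWMA) of return cross-products with a single fixed decay factor (classically lambda = 0.94 for daily data). A minimal Python sketch, assuming a (T, N) array of returns; initialising from the sample covariance is our choice, not part of the abstract:

```python
import numpy as np

def ewma_covariance(returns, lam=0.94):
    """RiskMetrics-style EWMA covariance:
    Sigma_t = lam * Sigma_{t-1} + (1 - lam) * r_t @ r_t.T,
    where `returns` is a (T, N) array of asset returns."""
    sigma = np.cov(returns, rowvar=False)   # initialise with sample covariance
    for r_t in returns:
        r = r_t[:, None]                    # (N, 1) column vector of returns
        sigma = lam * sigma + (1.0 - lam) * (r @ r.T)
    return sigma

# Example: 500 days of simulated returns on 3 assets
rng = np.random.default_rng(0)
returns = rng.normal(scale=0.01, size=(500, 3))
print(ewma_covariance(returns))
```

    The single fixed decay factor, applied identically to every entry of the covariance matrix, is precisely the restrictiveness the authors contrast with richer but still parsimonious econometric volatility models.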

    Estimating continuous-time income models

    While earning processes are commonly unobservable income flows which evolve in continuous time, observable income data are usually discrete, having been aggregated over time. We consider continuous-time earning processes, specifically (non-linearly) transformed Ornstein-Uhlenbeck processes, and the associated integrated, i.e. time-aggregated, process. Both processes are characterised, and we show that time aggregation alters important statistical properties. The parameters of the earning process are estimable by GMM, and the finite sample properties of the estimator are investigated. Our methods are applied to annual earnings data for the US. It is demonstrated that the model replicates important features of the earnings distribution well. Keywords: integrated non-linearly transformed Ornstein-Uhlenbeck process, temporal aggregation.
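
    A toy illustration of the distinction drawn here: simulate a latent Ornstein-Uhlenbeck process with its exact discretisation, transform it, and time-aggregate the flow into annual observations. The exponential transform and all parameter values below are placeholders, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(1)

# Latent earning process: Ornstein-Uhlenbeck dX = theta*(mu - X)dt + sigma dW,
# simulated on a fine grid with its exact discretisation.
theta, mu, sigma = 0.5, 0.0, 0.3          # placeholder parameters
steps_per_year, years = 252, 40
dt = 1.0 / steps_per_year
n = steps_per_year * years

x = np.empty(n)
x[0] = mu
decay = np.exp(-theta * dt)
shock_sd = sigma * np.sqrt((1 - np.exp(-2 * theta * dt)) / (2 * theta))
for t in range(1, n):
    x[t] = mu + (x[t - 1] - mu) * decay + shock_sd * rng.normal()

# Observable data: the (non-linearly) transformed flow exp(X), integrated
# (time-aggregated) over each year -- the discrete series a survey records.
flow = np.exp(x)
annual_income = flow.reshape(years, steps_per_year).sum(axis=1) * dt
print(annual_income[:5])
```

    The point of the paper is that the statistical properties of `annual_income` differ from those of the underlying flow, so estimation must target the integrated process.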

    Multiplier bootstrap of tail copulas with applications

    For the problem of estimating lower tail and upper tail copulas, we propose two bootstrap procedures for approximating the distribution of the corresponding empirical tail copulas. The first method uses a multiplier bootstrap of the empirical tail copula process and requires estimation of the partial derivatives of the tail copula. The second method avoids this estimation problem and uses multipliers in the two-dimensional empirical distribution function and in the estimates of the marginal distributions. For both multiplier bootstrap procedures, we prove consistency. For these investigations, we demonstrate that the common assumption in the literature on tail copula estimation of the existence of continuous partial derivatives is so restrictive that the tail copula corresponding to tail independence is the only tail copula with this property. Moreover, we are able to solve this problem and prove weak convergence of the empirical tail copula process under nonrestrictive smoothness assumptions that are satisfied for many commonly used models. These results are applied in several statistical problems, including minimum distance estimation and goodness-of-fit testing. Comment: Published at http://dx.doi.org/10.3150/12-BEJ425 in the Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
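
    A rough sketch of the two ingredients named above: the empirical (lower) tail copula computed from ranks, and one multiplier replicate in which each observation is reweighted by an i.i.d. mean-one multiplier. This deliberately omits the partial-derivative correction the first procedure requires, so it only illustrates the reweighting idea; `k` is the usual intermediate number of tail observations, and all numbers are illustrative:

```python
import numpy as np
from scipy.stats import rankdata

def empirical_tail_copula(x, y, k, grid):
    """Naive lower tail copula estimate at each (u, v) in `grid`:
    Lambda_hat(u, v) = (1/k) * #{i : R_i <= k*u, S_i <= k*v},
    with R_i, S_i the ranks of x_i, y_i."""
    r, s = rankdata(x), rankdata(y)
    return np.array([np.sum((r <= k * u) & (s <= k * v)) / k for u, v in grid])

def multiplier_replicate(x, y, k, grid, rng):
    """One multiplier replicate: each observation is reweighted by an
    i.i.d. mean-one multiplier (standard exponential, normalised)."""
    r, s = rankdata(x), rankdata(y)
    xi = rng.exponential(size=len(x))
    xi /= xi.mean()
    return np.array([np.sum(xi * ((r <= k * u) & (s <= k * v))) / k
                     for u, v in grid])

rng = np.random.default_rng(2)
n, k = 2000, 80                            # k: tail observations used
x = rng.normal(size=n)
y = 0.6 * x + 0.8 * rng.normal(size=n)     # dependent margins
grid = [(1.0, 1.0), (0.5, 1.0)]
est = empirical_tail_copula(x, y, k, grid)
reps = np.array([multiplier_replicate(x, y, k, grid, rng) for _ in range(500)])
print(est, reps.std(axis=0))               # estimates and bootstrap spread
```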

    Decomposing changes in income risk using consumption data

    This paper concerns the decomposition of income risk into permanent and transitory components, using repeated cross-section data on income and consumption. Our focus is on the detection of changes in the magnitudes of the variances of permanent and transitory risks. A new approximation to the optimal consumption growth rule is developed. Evidence from a dynamic stochastic simulation is used to show that this approximation can provide a robust method for decomposing income risk in a nonstationary environment. We examine robustness to unobserved heterogeneity in consumption growth and to unobserved heterogeneity in income growth. We use this approach to investigate the growth in income inequality in the UK in the 1980s.
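
    The paper's decomposition works through consumption growth, but the permanent/transitory split itself can be illustrated with the standard income-autocovariance identification for y_it = p_it + u_it with a random-walk permanent component. The simulation below is only a benchmark for that split, not the authors' estimator:

```python
import numpy as np

rng = np.random.default_rng(3)
N, T = 50_000, 12
sig_perm, sig_trans = 0.15, 0.25      # true permanent / transitory std devs

# Canonical panel income process: y_it = p_it + u_it,  p_it = p_{i,t-1} + eta_it
eta = rng.normal(scale=sig_perm, size=(N, T))
u = rng.normal(scale=sig_trans, size=(N, T))
y = np.cumsum(eta, axis=1) + u

dy = np.diff(y, axis=1)               # income growth

def cov(a, b):
    return np.mean((a - a.mean()) * (b - b.mean()))

t = 5                                 # an interior period
# var(eta_t) = cov(dy_t, dy_{t-1} + dy_t + dy_{t+1}); var(u_t) = -cov(dy_t, dy_{t+1})
var_perm = cov(dy[:, t], dy[:, t - 1] + dy[:, t] + dy[:, t + 1])
var_trans = -cov(dy[:, t], dy[:, t + 1])
print(var_perm, sig_perm ** 2)        # estimates should be close to the truth
print(var_trans, sig_trans ** 2)
```

    Since dy_t = eta_t + u_t - u_{t-1}, the transitory terms cancel in the first covariance and only var(eta_t) survives, which is what makes the identification work.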

    Probability distribution theory, generalisations and applications of L-moments

    In this thesis, we have studied L-moments and trimmed L-moments (TL-moments), which are both linear functions of order statistics. We have derived expressions for exact variances and covariances of sample L-moments and of sample TL-moments for any sample size n in terms of first and second-order moments of order statistics from small conceptual sample sizes, which do not depend on the actual sample size n. Moreover, we have established a theorem which characterises the normal distribution in terms of these second-order moments, and the characterisation suggests a new test of normality. We have also derived a method of estimation based on TL-moments which gives zero weight to extreme observations. TL-moments have certain advantages over L-moments and the method of moments: they exist whether or not the mean exists (for example, for the Cauchy distribution) and they are more robust to the presence of outliers. We have also investigated four methods for estimating the parameters of a symmetric lambda distribution: the maximum likelihood method in the case of one parameter, and L-moments, LQ-moments and TL-moments in the case of three parameters. The L-moments and TL-moments estimators are in closed form and simple to use, while numerical methods are required for the other two methods, maximum likelihood and LQ-moments. Because of the flexibility and simplicity of the lambda distribution, it is useful for fitting data when, as is often the case, the underlying distribution is unknown. We have also studied the symmetric plotting position for quantile plots assuming a symmetric lambda distribution, and conclude that the choice of the plotting position parameter depends upon the shape of the distribution. Finally, we propose exponentially weighted moving average (EWMA) control charts to monitor the process mean and dispersion using the sample L-mean and sample L-scale, together with charts based on trimmed versions of the same statistics. The proposed control chart limits are less influenced by extreme observations than classical EWMA control chart limits, and lead to tighter limits in the presence of out-of-control observations.
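
    Sample L-moments are linear in the order statistics, which makes them simple to compute. A compact sketch via Hosking's probability-weighted moments b_r; the formulas are standard (Hosking, 1990) and the coefficient table covers the first four L-moments:

```python
import numpy as np
from math import comb

def sample_l_moments(data, nmom=4):
    """First `nmom` sample L-moments via probability-weighted moments:
    b_r = n^-1 * sum_{i=r}^{n-1} [C(i, r) / C(n-1, r)] * x_(i)  (0-indexed),
    then l_1 = b_0, l_2 = 2b_1 - b_0, l_3 = 6b_2 - 6b_1 + b_0,
    l_4 = 20b_3 - 30b_2 + 12b_1 - b_0 (Hosking, 1990)."""
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    b = [sum(comb(i, r) / comb(n - 1, r) * x[i] for i in range(r, n)) / n
         for r in range(nmom)]
    coeffs = [[1], [-1, 2], [1, -6, 6], [-1, 12, -30, 20]]
    return [sum(c * bk for c, bk in zip(coeffs[r], b)) for r in range(nmom)]

# For a standard normal sample, l_1 ~ 0 and l_2 ~ 1/sqrt(pi) ~ 0.564
print(sample_l_moments(np.random.default_rng(4).normal(size=1000)))
```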

    Locking of correlated neural activity to ongoing oscillations

    Population-wide oscillations are ubiquitously observed in mesoscopic signals of cortical activity. In these network states a global oscillatory cycle modulates the propensity of neurons to fire. Synchronous activation of neurons has been hypothesized to be a separate channel of information processing in the brain. A salient question is therefore whether and how oscillations interact with spike synchrony, and to what extent these channels can be considered separate. Experiments indeed showed that correlated spiking co-modulates with the static firing rate and is also tightly locked to the phase of beta-oscillations. While the dependence of correlations on the mean rate is well understood in feed-forward networks, it remains unclear why and by which mechanisms correlations tightly lock to an oscillatory cycle. We here demonstrate that such correlated activation of pairs of neurons is qualitatively explained by periodically-driven random networks. We identify the mechanisms by which covariances depend on a driving periodic stimulus. Mean-field theory combined with linear response theory yields closed-form expressions for the cyclostationary mean activities and pairwise zero-time-lag covariances of binary recurrent random networks. Two distinct mechanisms cause time-dependent covariances: the modulation of the susceptibility of single neurons (via the external input and network feedback) and the time-varying variances of single unit activities. For some parameters, the effectively inhibitory recurrent feedback leads to resonant covariances even if mean activities show non-resonant behavior. Our analytical results open the question of time-modulated synchronous activity to a quantitative analysis. Comment: 57 pages, 12 figures, published version.
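
    A toy simulation of the setting described: a binary recurrent random network under a sinusoidal common drive, with cyclostationary mean activity and a zero-time-lag pairwise covariance estimated across trials. All parameters and the sigmoid gain function are illustrative choices; the paper's results are analytical (mean-field plus linear response), not simulation-based:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy periodically driven binary random network: neurons are 0/1 and switch
# on with probability sigmoid(recurrent input + oscillatory drive + bias).
N, T, trials = 100, 200, 50
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))     # random coupling
np.fill_diagonal(J, 0.0)
drive = 0.8 * np.sin(2 * np.pi * np.arange(T) / 50.0)  # common oscillatory input

def sigmoid(h):
    return 1.0 / (1.0 + np.exp(-h))

acts = np.zeros((trials, T, N))
for k in range(trials):
    s = (rng.random(N) < 0.5).astype(float)
    for t in range(T):
        h = J @ s + drive[t] - 1.0                     # input to every neuron
        s = (rng.random(N) < sigmoid(h)).astype(float)
        acts[k, t] = s

# Cyclostationary statistics across trials: time-resolved population mean
# activity and the zero-time-lag covariance of one example neuron pair.
mean_activity = acts.mean(axis=(0, 2))
i, j = 0, 1
cov_ij = ((acts[:, :, i] - acts[:, :, i].mean(axis=0)) *
          (acts[:, :, j] - acts[:, :, j].mean(axis=0))).mean(axis=0)
print(mean_activity[:5], cov_ij[:5])
```

    Plotting `cov_ij` against the phase of `drive` would show the locking of pairwise covariances to the oscillation that the paper explains analytically.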