
    Financial Fragility and Growth Dynamics of Italian Business Firms

    This work explores a number of properties investigated in the empirical literature on firm size and growth dynamics: (i) the distribution and the autoregressive structure of firm size; (ii) the existence of size-growth scaling relationships; (iii) the distribution and the autoregressive structure of scaling-free growth rates. The major novelty is our use of a credit rating index to condition all the analyses on firms' financial fragility and access to credit. We find that the distributions of both firm size and firm growth rates are fatter tailed among less solvent firms than in the rest of the sample, at both the bottom and the top extremes of the distributions. We therefore conclude that not only small and/or slowly growing firms may face difficulties in raising external financing; big and fast-growing firms may be exposed to financial constraints as well.
    Keywords: Firm size; Firm growth; Financial constraints
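    A standard way to compare tail fatness across subsamples, as the abstract describes for fragile versus solvent firms, is the Hill estimator of the tail index: a smaller estimated index means a fatter tail. The sketch below uses synthetic Pareto draws as stand-ins for the firm-level data (which the paper does not publish); the variable names and tail indices are illustrative assumptions, not figures from the study.

    ```python
    import numpy as np

    def hill_estimator(sample, k):
        """Hill estimator of the right-tail index from the k largest observations.
        Smaller returned values indicate a fatter tail."""
        x = np.sort(np.asarray(sample, dtype=float))[::-1]  # descending order
        logs = np.log(x[:k]) - np.log(x[k])                 # log-spacings over threshold x[k]
        return 1.0 / np.mean(logs)

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for growth-rate magnitudes in the two subsamples:
    # Pareto tails with index 2 ("fragile") and index 4 ("solvent")
    fragile = rng.pareto(2.0, 5000) + 1.0
    solvent = rng.pareto(4.0, 5000) + 1.0

    k = 500  # number of upper order statistics used
    print(hill_estimator(fragile, k))  # close to 2: fatter tail
    print(hill_estimator(solvent, k))  # close to 4: thinner tail
    ```

    The estimator is sensitive to the choice of k; in practice one inspects a Hill plot over a range of k values before reading off an index.
    
    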

    International Diversification: A Copula Approach

    Keywords: Diversification; Copula; Correlation Complexity; Downside Risk; Systemic Risk

    The modelling of operational risk: experience with the analysis of the data collected by the Basel Committee

    The revised Basel Capital Accord requires banks to meet a capital requirement for operational risk as part of an overall risk-based capital framework. Three distinct options for calculating operational risk charges are proposed (Basic Approach, Standardised Approach, Advanced Measurement Approaches), reflecting increasing levels of risk sensitivity. Since 2001, the Risk Management Group of the Basel Committee has been performing specific surveys of banks' operational loss data, with the main purpose of obtaining information on the industry's operational risk experience, to be used for the refinement of the capital framework and for the calibration of the regulatory coefficients. The second loss data collection was launched in the summer of 2002: the 89 banks participating in the exercise provided the Group with more than 47,000 observations, grouped by eight standardised Business Lines and seven Event Types. A summary of the data collected, which focuses on the description of the range of individual gross loss amounts and of the distribution of the banks' losses across the business lines/event types, was returned to the industry in March 2003. The objective of this paper is to move beyond that document by illustrating the methodologies and the outcomes of the inferential analysis carried out on the data collected through 2002. To this end, after pooling the individual banks' losses according to a Business Line criterion, the operational riskiness of each Business Line data set is explored using empirical and statistical tools. The work aims, first of all, to compare the sensitivity of conventional actuarial distributions and of models stemming from Extreme Value Theory in representing the highest percentiles of the data sets: the exercise shows that the extreme value model, in its Peaks Over Threshold representation, explains the behaviour of the operational risk data in the tail area well. Then, measures of the severity and frequency of the large losses are obtained and, by a proper combination of these estimates, a bottom-up operational risk capital figure is computed for each Business Line. Finally, for each Business Line and for the eight Business Lines as a whole, the contribution of the expected losses to the capital figures is evaluated, and the relationships between the capital charges and the corresponding average levels of Gross Income are determined and compared with the current coefficients envisaged in the simplified approaches of the regulatory framework.
    Keywords: operational risk; heavy tails; conventional inference; Extreme Value Theory; Peaks Over Threshold; median shortfall; Point Process of exceedances; capital charge; Business Line; Gross Income; regulatory coefficients
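    The Peaks Over Threshold approach the abstract describes fits a Generalized Pareto Distribution (GPD) to exceedances over a high threshold and reads tail quantiles off the fitted model. A minimal sketch follows, using synthetic heavy-tailed losses as a stand-in (the Basel Committee loss data are confidential); threshold choice, quantile level, and sample sizes are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(42)
    # Synthetic operational losses for one business line: Pareto tail, index 1.5
    losses = (rng.pareto(1.5, size=20000) + 1.0) * 1_000.0

    # Peaks Over Threshold: model only exceedances above a high empirical quantile
    u = np.quantile(losses, 0.95)
    exceedances = losses[losses > u] - u

    # Fit a Generalized Pareto Distribution to the exceedances (location fixed at 0)
    xi, _, beta = genpareto.fit(exceedances, floc=0.0)

    # Tail quantile implied by the GPD fit:
    #   x_p = u + (beta/xi) * (((n/n_u) * (1 - p))**(-xi) - 1)
    n, n_u = len(losses), len(exceedances)
    p = 0.999
    x_p = u + (beta / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)
    print(f"shape xi={xi:.2f}  scale beta={beta:.0f}  99.9% loss quantile={x_p:,.0f}")
    ```

    A positive fitted shape parameter xi signals a heavy tail; the quantile x_p is the kind of figure that, combined with a frequency model, feeds a bottom-up capital charge.
    
    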

    On the performance of the minimum VaR portfolio

    Alexander and Baptista (2002) develop the concept of mean-VaR efficiency for portfolios and demonstrate its very close connection with mean-variance efficiency. In particular, they identify the minimum VaR portfolio as a special type of mean-variance efficient portfolio. Our empirical analysis finds that, for commonly used VaR breach probabilities, minimum VaR portfolios yield ex post returns that conform well with the specified VaR breach probabilities and with return/risk expectations. These results provide a considerable extension of the evidence supporting the empirical validity and tractability of the mean-VaR efficiency concept.
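    Under normality, the portfolio VaR that Alexander and Baptista work with reduces to a penalized volatility, z_alpha * sigma(w) - mu(w), so the minimum VaR portfolio can be found by direct numerical optimization. The sketch below is a minimal long-only illustration with hypothetical expected returns and covariances, not the paper's data or its analytical characterization.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    # Hypothetical annualized inputs for three assets (illustrative only)
    mu = np.array([0.06, 0.09, 0.12])
    cov = np.array([[0.04, 0.01, 0.00],
                    [0.01, 0.09, 0.02],
                    [0.00, 0.02, 0.16]])

    def var_normal(w, alpha=0.01):
        """Parametric VaR under normality: z * portfolio stdev minus expected return."""
        z = norm.ppf(1.0 - alpha)
        return z * np.sqrt(w @ cov @ w) - w @ mu

    n = len(mu)
    res = minimize(var_normal, np.full(n, 1.0 / n),
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
                   bounds=[(0.0, 1.0)] * n, method="SLSQP")
    w_minvar = res.x
    print(np.round(w_minvar, 3), round(var_normal(w_minvar), 4))
    ```

    With a lower breach probability alpha, the z-multiplier on volatility grows, pulling the minimum VaR portfolio toward the minimum-variance end of the efficient frontier.
    
    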

    The Dependence Structure of Macroeconomic Variables in the US

    A central role for economic policy involves reducing the incidence of systemic downturns, when key economic variables experience joint extreme events. In this paper, we empirically analyze such dependence using two approaches, correlations and copulas. We document four findings. First, linear correlations and copulas disagree substantially about the nation’s dependence structure, indicating correlation complexity in the US economy. Second, GDP exhibits linear dependence with interest rates and prices, but no extreme dependence with the latter. This is consistent with the existence of liquidity traps. Third, GDP exhibits asymmetric extreme dependence with employment, consumption and investment, with relatively greater dependence during downturns. Fourth, money is neutral, especially during extreme economic conditions.
    Keywords: Asymmetric dependence; Copula; Correlation Complexity; Extreme Event; Economic Policy; Money Neutrality; Systemic Downturn
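    The distinction the abstract draws between linear and extreme dependence can be made concrete with an empirical lower-tail dependence measure built from ranks (pseudo-observations of the copula): the probability that one series is in its bottom q-quantile given the other is. This is a generic sketch on synthetic data, not the paper's estimator or its macroeconomic series; the series names are placeholders.

    ```python
    import numpy as np

    def lower_tail_dependence(x, y, q=0.05):
        """Empirical P(Y in its bottom q-quantile | X in its bottom q-quantile).
        Roughly q under independence; close to 1 under strong joint-downturn risk."""
        u = np.argsort(np.argsort(x)) / (len(x) - 1)  # ranks scaled to [0, 1]
        v = np.argsort(np.argsort(y)) / (len(y) - 1)
        return np.mean((u <= q) & (v <= q)) / q

    rng = np.random.default_rng(1)
    z = rng.standard_normal(50000)
    # Placeholder "GDP" and "employment": a common factor produces joint downturns
    gdp = z + 0.5 * rng.standard_normal(50000)
    emp = z + 0.5 * rng.standard_normal(50000)
    indep = rng.standard_normal(50000)  # a series with no dependence on gdp

    print(lower_tail_dependence(gdp, emp))    # well above q: joint extremes likely
    print(lower_tail_dependence(gdp, indep))  # near q: no extreme dependence
    ```

    Comparing this statistic in the lower and upper tails is one simple way to detect the asymmetric dependence during downturns that the paper reports.
    
    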

    Recovering the sunk costs of R&D: the moulds industry case

    Sunk costs for R&D are an important determinant of the level of innovation in the economy. In this paper I recover them using a Markov equilibrium framework. The contribution is twofold. First, a model of industry dynamics which accounts for selection into R&D, capital accumulation and entry/exit is proposed. The industry state is summarized by an aggregate state, with the advantage that this avoids the "curse of dimensionality". Second, the estimated sunk costs of R&D for the Portuguese moulds industry are shown to be important (3.4 million Euros). They are particularly relevant since the industry is mostly populated by small firms. Institutional changes in the early 1990s generated an increase in demand from European car makers and created the incentives for firms to pay the costs of investment. Trade-induced innovation reinforced the selection effect by which international trade leads to productivity growth. Finally, using the estimated parameters, simulations evaluate the effects of changes in market size, sunk costs and entry costs.

    Portfolio choice and optimal hedging with general risk functions: a simplex-like algorithm.

    The minimization of general risk functions is becoming more and more important in portfolio choice theory and optimal hedging, for two major reasons. First, heavy tails and the lack of symmetry in the returns of many assets mean that classical optimization of the standard deviation may lead to strategies that are dominated in the sense of second-order stochastic dominance. Second, and no less important, many institutional investors must respect legal capital requirements, which are more easily studied if one deals with a risk measure related to capital losses. This paper proposes a new method to simultaneously minimize several general risk or dispersion measures. Representation theorems for risk functions are applied to transform the general risk minimization problem into a minimax problem, and then into a linear programming problem between infinite-dimensional Banach spaces. New necessary and sufficient optimality conditions are then stated, and a simplex-like algorithm is developed. The algorithm solves the dual problem and provides both optimal portfolios and their sensitivities. The approach is general and does not depend on any particular risk measure, although some of the most important cases are analyzed in detail. A final numerical example with real data illustrates the practical performance of the proposed methodology.
    Keywords: Risk measures; Deviation measure; Portfolio selection; Infinite-dimensional linear programming; Simplex-like method
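    The paper's simplex-like algorithm operates in infinite-dimensional spaces, but the idea that a risk-measure minimization can be recast as a linear program has a well-known finite-scenario instance: the Rockafellar-Uryasev formulation of CVaR minimization. The sketch below shows that standard LP, not the paper's algorithm, on synthetic scenario returns; all data are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(7)
    S, N, alpha = 500, 4, 0.95
    # Scenario return matrix (S scenarios x N assets), synthetic stand-in data
    R = rng.normal(loc=[0.05, 0.06, 0.07, 0.08],
                   scale=[0.05, 0.10, 0.15, 0.20], size=(S, N))

    # Rockafellar-Uryasev LP: minimize t + (1/((1-alpha) S)) * sum_s z_s
    # subject to  z_s >= -R_s . w - t,  z_s >= 0,  sum(w) = 1,  w >= 0
    # decision vector x = [w (N), t (1), z (S)]
    c = np.concatenate([np.zeros(N), [1.0], np.full(S, 1.0 / ((1 - alpha) * S))])
    A_ub = np.hstack([-R, -np.ones((S, 1)), -np.eye(S)])  # -R w - t - z <= 0
    b_ub = np.zeros(S)
    A_eq = np.concatenate([np.ones(N), [0.0], np.zeros(S)]).reshape(1, -1)
    b_eq = [1.0]
    bounds = [(0, None)] * N + [(None, None)] + [(0, None)] * S

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    w = res.x[:N]
    print(np.round(w, 3), round(res.fun, 4))  # optimal weights and minimized CVaR
    ```

    The auxiliary variable t converges to the VaR at level alpha, and the z_s pick up the scenario losses beyond it, which is what makes the nonsmooth CVaR objective linear.
    
    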

    Asymmetric Dependence in US Financial Risk Factors?

    Keywords: Asymmetric Dependence; Copulas; Diversification Failure; Risk Factor; Systemic Risk; Time-Varying Downside Risk

    The virtues and vices of equilibrium and the future of financial economics

    The use of equilibrium models in economics springs from the desire for parsimonious models of economic phenomena that take human reasoning into account. This approach has been the cornerstone of modern economic theory. We explain why this is so, extolling the virtues of equilibrium theory; then we present a critique and describe why this approach is inherently limited, and why economics needs to move in new directions if it is to continue to make progress. We stress that this shouldn't be a question of dogma, but should be resolved empirically. There are situations where equilibrium models provide useful predictions and there are situations where they can never provide useful predictions. There are also many situations where the jury is still out, i.e., where so far they fail to provide a good description of the world, but where proper extensions might change this. Our goal is to convince the skeptics that equilibrium models can be useful, but also to make traditional economists more aware of the limitations of equilibrium models. We sketch some alternative approaches and discuss why they should play an important role in future research in economics.