9,974 research outputs found


    The Foundations of Banks' Risk Regulation: a Review of the Literature

    Get PDF
    The stability of the banking industry around the world has been a recurring concern since the Great Depression. Financial markets have changed dramatically over the last twenty-five years, introducing more competition for and from banks. Banks are the financial institutions responsible for providing liquidity to the economy. This responsibility is, however, the main cause of their fragility. Deposit insurance is the most efficient instrument for protecting depositors and for preventing bank runs. Pricing deposit insurance according to the individual bank's risk seems to be the most appropriate strategy, but it does not appear sufficient on its own: residual information problems seem to remain in the market, although there is no appropriate statistical analysis of this issue. In 1988, the G10 modified banking regulation significantly by setting capital standards for international banks. These standards have now been adopted by more than one hundred countries as part of their national regulation of banks' risk. Current regulation of bank capital adequacy has its critics because it imposes the same rules on all banks. This seems particularly unsuitable when applied to credit risk, which is the major source of a bank's risk (about 70%). Moreover, diversification of a bank's credit-risk portfolio is not taken into account in the computation of capital ratios. These shortcomings seem to have distorted the behaviour of banks, which makes them much more complicated to monitor. In fact, it is not even clear that the higher capital ratios observed since the introduction of this new form of capital regulation necessarily lower risks. Additional reform is expected in 2004, but there is as yet no consensus on the form it will take nor on whether it will suitably regulate banks in individual countries. Consequently, it might be appropriate to continue developing national regulation based on optimal deposit insurance (with individual insurance pricing and continuous auditing of individual risk) and to keep searching for other optimal complementary instruments for use against systemic risk, instruments suitably designed to fit the banking industry's peculiar structure. Other market-discipline instruments (such as subordinated debt) and governance instruments may be more efficient than the current capital requirement scheme for the banks' commitment problem associated with deposit insurance. The central bank should be responsible for aggregate liquidity. Confidence in the financial sector is a public good that must be ensured by the government. Who should be in charge: the central bank or a regulatory agency? The literature reviewed suggests that this role should be taken by a regulatory agency independent from the central bank and from political power.
    Keywords: Bank, liquidity, deposit insurance, capital standard, national regulation, credit risk, capital regulation, subordinated debt, governance, capital requirement, central bank, regulatory agency
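    One standard way from the literature to make the abstract's call for risk-based deposit insurance pricing concrete is Merton's (1977) view of the insurer's liability as a put option on the bank's assets, struck at the insured deposits. The sketch below is not the review's own model; the asset value, deposit face value, asset volatility, and horizon are illustrative assumptions.

```python
from math import exp, log, sqrt
from scipy.stats import norm

def merton_deposit_insurance(V, D, sigma, T, r):
    """Actuarially fair deposit-insurance value as a Black-Scholes put on assets.

    V     : current market value of the bank's assets
    D     : face value of insured deposits due at horizon T
    sigma : volatility of the bank's asset returns
    T     : coverage / audit horizon in years
    r     : risk-free rate
    """
    d1 = (log(V / D) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    # The insurer pays max(D - V_T, 0): a European put on V struck at D.
    return D * exp(-r * T) * norm.cdf(-d2) - V * norm.cdf(-d1)

# Illustrative numbers only: a riskier bank (thinner capital, higher asset
# volatility) receives a higher fair premium per dollar of deposits.
safe = merton_deposit_insurance(V=110, D=100, sigma=0.05, T=1, r=0.03)
risky = merton_deposit_insurance(V=103, D=100, sigma=0.15, T=1, r=0.03)
print(f"fair premium, safe bank : {safe / 100:.4%} of deposits")
print(f"fair premium, risky bank: {risky / 100:.4%} of deposits")
```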

    Structured Finance, Risk Management, and the Recent Financial Crisis

    Get PDF
    Structured finance is often mentioned as the main cause of the latest financial crisis. We argue that structured finance per se did not trigger the last financial crisis. The crisis was propagated around the world because of poor risk management, such as agency problems in the securitization market, poor rating and pricing standards, rating agency incentives, lack of market transparency, the search for higher yields by top decision makers, and the failure of regulators and central banks to understand the implications of the changing environment.
    Keywords: structured finance, risk management, financial crisis, collateralized debt obligation (CDO), asset-backed commercial paper (ABCP), rating, pricing, securitization, regulation of financial markets

    New Evidence on the Determinants of Absenteeism Using Linked Employer-Employee Data

    Get PDF
    In this paper, we provide new evidence on the determinants of absenteeism using the Workplace Employee Survey (WES) 1999-2002 from Statistics Canada. Our paper extends the typical labour-leisure model used to analyze the decision to skip work to include firm-level policy variables relevant to the absenteeism decision and uncertainty about the cost of absenteeism. It also provides a non-linear econometric model that explicitly takes into account the count nature of absenteeism data and unobserved heterogeneity at both the individual and firm level. Controlling for very detailed demographic, job and firm characteristics (including workplace practices), we find that dissatisfaction with contracted hours is a significant determinant of absence.
    Keywords: Absenteeism; Linked Employer-Employee Data; Unobserved Heterogeneity; Count Data Models
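    As a rough illustration of the count-data approach described above (not the authors' full specification, which also handles firm-level heterogeneity), the sketch below fits a negative binomial regression to simulated days-absent counts; the gamma mixing term is one common way to absorb individual unobserved heterogeneity. The variable names and simulated data are assumptions for illustration only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

# Simulated worker data: dissatisfaction with contracted hours and tenure.
dissatisfied = rng.integers(0, 2, n)                       # 1 if unhappy with hours
tenure = rng.uniform(0, 20, n)                             # years with the employer
heterogeneity = rng.gamma(shape=2.0, scale=0.5, size=n)    # unobserved individual taste

# Days absent per year: Poisson rate scaled by the unobserved term, which
# induces the overdispersion a negative binomial model can capture.
rate = np.exp(0.8 + 0.4 * dissatisfied - 0.02 * tenure) * heterogeneity
days_absent = rng.poisson(rate)

X = sm.add_constant(np.column_stack([dissatisfied, tenure]))
result = sm.NegativeBinomial(days_absent, X).fit(disp=False)
print(result.summary())
```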

    On the Necessity of Using Lottery Qualities

    Get PDF
    The aim of this paper is to propose a model of decision-making for lotteries. The key element of the theory is the use of lottery qualities. Qualities allow the derivation of optimal decision-making processes and are taken explicitly into account for lottery evaluation. Our contribution explains the major violations of the expected utility theory for decisions on two-point lotteries and shows the necessity of giving explicit consideration to the lottery qualities.
    Keywords: Lottery choice, common ratio, preference reversal, pricing, lottery test, cognitive process, certainty equivalent, lottery quality
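    For readers unfamiliar with the anomaly the paper addresses, the sketch below reproduces the classic common-ratio pattern with two-point lotteries: under expected utility, scaling both winning probabilities by the same factor cannot reverse a preference, yet observed choices often do reverse. The payoffs, probabilities, and the power utility function are illustrative assumptions, not the paper's model.

```python
def expected_utility(prob_win, prize, utility):
    """Expected utility of a two-point lottery paying `prize` with
    probability `prob_win` and zero otherwise."""
    return prob_win * utility(prize)

# A standard risk-averse power utility, chosen only for illustration.
u = lambda x: x ** 0.5

# Classic common-ratio pair (Allais-style):
#   A: 3000 for sure        vs  B: 4000 with prob 0.80
#   C: 3000 with prob 0.25  vs  D: 4000 with prob 0.20  (probabilities scaled by 1/4)
lotteries = {"A": (1.00, 3000), "B": (0.80, 4000),
             "C": (0.25, 3000), "D": (0.20, 4000)}
for label, (p, z) in lotteries.items():
    print(label, round(expected_utility(p, z, u), 2))

# Under any expected-utility preference, A preferred to B implies C preferred
# to D, because the second pair is the first pair scaled by the common ratio
# 0.25.  Many subjects nevertheless choose A and D, the violation that a
# quality-based evaluation of lotteries is meant to explain.
```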

    Zero-Hopf bifurcation in the Van der Pol oscillator with delayed position and velocity feedback

    Full text link
    In this paper, we consider the traditional Van der Pol oscillator with a forcing term that depends on delayed feedback. The feedback is taken to be a nonlinear function of both delayed position and delayed velocity, which gives rise to many different types of bifurcations. In particular, we study the Zero-Hopf bifurcation that takes place at certain parameter values using methods of centre manifold reduction of DDEs and normal form theory. We present numerical simulations that agree with the phase portraits predicted near the Zero-Hopf bifurcation, confirming these results and providing a physical understanding of the oscillator with delayed feedback.
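    A minimal way to see the delayed feedback at work is to integrate the delay differential equation numerically with a fixed-step Euler scheme and a history buffer for the delayed state. The sketch below assumes a specific form, x'' - eps*(1 - x^2)*x' + x = f(x(t - tau), x'(t - tau)), with a simple linear-plus-cubic feedback; the parameter values and feedback function are illustrative assumptions, not the ones analysed in the paper.

```python
import numpy as np

eps, tau = 0.5, 1.0             # damping parameter and feedback delay (assumed)
a, b = 0.3, -0.1                # assumed feedback gains on delayed x and x'
dt, t_end = 0.001, 200.0
n_delay = int(round(tau / dt))  # number of time steps spanning one delay

def feedback(x_del, v_del):
    # Assumed nonlinear feedback in delayed position and velocity.
    return a * x_del + b * v_del + 0.05 * x_del**3

steps = int(t_end / dt)
x = np.zeros(steps + 1)
v = np.zeros(steps + 1)
x[0], v[0] = 0.1, 0.0           # constant history: x(t) = 0.1, x'(t) = 0 for t <= 0

for k in range(steps):
    # Delayed state: fall back to the initial history while t - tau < 0.
    kd = k - n_delay
    x_del = x[kd] if kd >= 0 else x[0]
    v_del = v[kd] if kd >= 0 else v[0]
    # Van der Pol dynamics with the delayed-feedback forcing term.
    acc = eps * (1.0 - x[k] ** 2) * v[k] - x[k] + feedback(x_del, v_del)
    x[k + 1] = x[k] + dt * v[k]
    v[k + 1] = v[k] + dt * acc

print("final amplitude estimate:", np.abs(x[-n_delay:]).max())
```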

    Estimating the effect of a change in insurance pricing regime on accidents with endogenous mobility.

    Get PDF
    In this paper, we estimate the impact of introducing a bonus-malus system on the probability of having automobile accidents, taking into account contract duration and client mobility between insurers. We show that the new incentive scheme reduces the accident rates of all policyholders when contract duration is taken into account, but does not affect the accident rates of movers, who escape the incentive effects imposed by the new insurance pricing scheme.
    Keywords: Bonus-malus; contract duration; automobile accident; Poisson distribution; right- and left-censoring; exponential distribution
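    The role of contract duration can be illustrated with a Poisson accident-count regression in which each policyholder's duration enters as the exposure, so that shorter observed contracts do not mechanically look safer. This is a simplified sketch on simulated data, not the paper's censored-duration specification; the variable names and parameters are assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000

post_reform = rng.integers(0, 2, n)     # 1 if observed under the bonus-malus scheme
mover = rng.integers(0, 2, n)           # 1 if the client switched insurers
duration = rng.uniform(0.1, 1.0, n)     # contract duration in years (exposure)

# Assumed data-generating process: the reform lowers accident rates only
# for non-movers, mirroring the qualitative finding in the abstract.
rate = np.exp(-1.5 - 0.3 * post_reform * (1 - mover)) * duration
accidents = rng.poisson(rate)

X = sm.add_constant(np.column_stack([post_reform, mover, post_reform * mover]))
model = sm.GLM(accidents, X, family=sm.families.Poisson(), exposure=duration)
print(model.fit().summary())
```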

    Scaling Models for the Severity and Frequency of External Operational Loss Data

    Get PDF
    According to Basel II criteria, the use of external data is absolutely indispensable to the implementation of an advanced method for calculating operational capital. This article investigates how the severity and frequency of external losses can be scaled for integration with internal data. We set up an initial model designed to explain loss severity. This model takes into account firm size, location, and business lines as well as risk types. It also shows how to calculate the internal loss equivalent to an external loss which might occur in a given bank. OLS estimation results show that the above variables have significant power in explaining the loss amount. They are used to develop a normalization formula. A second model based on external data is developed to scale the frequency of losses over a given period. Two regression models are analyzed: the truncated Poisson model and the truncated negative binomial model. Variables estimating the size and geographical distribution of the banks' activities are introduced as explanatory variables. The results show that the negative binomial distribution outperforms the Poisson distribution. The scaling is done by calculating the parameters of the selected distribution from the estimated coefficients and the variables related to a given bank. Frequencies of losses of more than $1 million are then generated over a specific horizon.
    Keywords: Operational risk in banks, scaling, severity distribution, frequency distribution, truncated count data regression models
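    The severity-scaling idea can be sketched with a log-linear regression of external loss amounts on firm size (plus indicator controls), whose estimated size elasticity then translates an external loss into an internal-loss equivalent for a bank of a different size. This is a generic illustration on simulated data; the functional form, variables, and coefficients are assumptions, not the article's estimated normalization formula.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1500

# Simulated external loss database: firm size and a business-line dummy.
log_assets = rng.normal(10.0, 1.5, n)           # log of firm size
retail = rng.integers(0, 2, n)                  # 1 if a retail-banking loss
log_loss = 1.0 + 0.25 * log_assets - 0.4 * retail + rng.normal(0, 1.0, n)

X = sm.add_constant(np.column_stack([log_assets, retail]))
fit = sm.OLS(log_loss, X).fit()
beta_size = fit.params[1]                       # estimated size elasticity

# Normalize one external loss to a smaller internal bank: rescale by the
# size ratio raised to the estimated elasticity (same business line).
external_loss, external_assets, internal_assets = 5.0e6, 8.0e10, 2.0e10
internal_equivalent = external_loss * (internal_assets / external_assets) ** beta_size
print(f"internal-loss equivalent: {internal_equivalent:,.0f}")
```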