
    Calculating Value-at-Risk

    The market risk of a portfolio refers to the possibility of financial loss due to the joint movement of systematic economic variables such as interest and exchange rates. Quantifying market risk is important to regulators in assessing solvency and to risk managers in allocating scarce capital. Moreover, market risk is often the central risk faced by financial institutions. The standard method for measuring market risk places a conservative, one-sided confidence interval on portfolio losses for short forecast horizons. This bound on losses is often called capital-at-risk or value-at-risk (VAR), for obvious reasons. Calculating the VAR or any similar risk metric requires a probability distribution of changes in portfolio value. In most risk management models, this distribution is derived by placing assumptions on (1) how the portfolio function is approximated, and (2) how the state variables are modeled. Using this framework, we first review four methods for measuring market risk. We then develop and illustrate two new market risk measurement models that use a second-order approximation to the portfolio function and a multivariate GARCH(1,1) model for the state variables. We show that when changes in the state variables are modeled as conditional or unconditional multivariate normal, first-order approximations to the portfolio function yield a univariate normal for the change in portfolio value while second-order approximations yield a quadratic normal. Using equity return data and a hypothetical portfolio of options, we then evaluate the performance of all six models by examining how accurately each calculates the VAR on an out-of-sample basis. We find that our most general model is superior to all others in predicting the VAR.
In additional empirical tests focusing on the error contribution of each of the two model components, we find that the superior performance of our most general model is largely attributable to the use of the second-order approximation, and that the first-order approximations favored by practitioners perform quite poorly. Empirical evidence on the modeling of the state variables is mixed but supports the use of a model that reflects non-linearities in state variable return distributions. This paper was presented at the Financial Institutions Center's October 1996 conference on "
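The contrast between the first-order (delta) and second-order (delta-gamma) portfolio approximations described above can be sketched numerically. The following minimal Monte Carlo illustration is not the paper's model: it assumes hypothetical delta, gamma, and covariance values and an unconditional multivariate normal for the state variables rather than the paper's GARCH(1,1) specification.

```python
# Sketch of first- vs second-order VAR, with hypothetical sensitivities.
import numpy as np

rng = np.random.default_rng(0)

delta = np.array([0.8, -0.5])            # first-order sensitivities (hypothetical)
gamma = np.array([[0.20, 0.05],          # second-order sensitivities (hypothetical)
                  [0.05, 0.30]])
cov = np.array([[0.04, 0.01],            # one-day covariance of the state variables
                [0.01, 0.09]])

# Joint changes in the state variables: unconditional multivariate normal.
dS = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=100_000)

# First-order (linear) approximation: change in portfolio value is univariate normal.
dV1 = dS @ delta
# Second-order approximation adds the quadratic term: a "quadratic normal".
dV2 = dV1 + 0.5 * np.einsum("ni,ij,nj->n", dS, gamma, dS)

# 99% VAR: the loss bound exceeded with 1% probability.
var1 = -np.quantile(dV1, 0.01)
var2 = -np.quantile(dV2, 0.01)
print(f"first-order 99% VAR:  {var1:.4f}")
print(f"second-order 99% VAR: {var2:.4f}")
```

With a positive-definite gamma, the quadratic term can only raise the simulated portfolio value, so the second-order VAR here comes out no larger than the first-order one; with real option portfolios the sign and size of the difference depend on the actual sensitivities.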

    Enemy Business Enterprises and the Alien Property Custodian, II.


    Interpreting Presidential Powers

    Justice Holmes famously observed that "[g]reat cases . . . make bad law." The problem may be especially acute in the domain of national security, where presidents frequently interpret their own powers without judicial review and where executive precedents play a large role in subsequent interpretive debates. On the one hand, some of the historical assertions of presidential authority that stretch constitutional and statutory language the furthest seem hard to condemn in light of the practical stakes. On the other hand, to credit the authority of executive precedent risks leaving the president dangerously unbound. To address the conundrum posed by executive precedent, this Article proposes a two-tiered theory for the interpretation of presidential powers. Framed as an analogy to a position in moral philosophy known as threshold deontology, two-tiered interpretive theory treats rules that restrict executive power as normally inviolable, not subject to a case-by-case balancing analysis. Analogously to threshold deontology, however, two-tiered theory also recognizes that when the costs of adherence to ordinary principles grow exorbitantly high, extraordinary interpretive principles should govern instead and should result in the upholding of broad presidential power. For reasons that the Article explains, resort to extraordinary reliance on second-tier justifications for assertions of sweeping executive authority involves a legal analogue to dirty-handed moral conduct and should be labeled accordingly. And executive precedents set in extraordinary, second-tier cases should not apply to more ordinary ones. Through its conjunction of elements, two-tiered interpretive theory furnishes analytical and rhetorical safeguards against executive overreaching, but also allows accommodations for truly extraordinary cases.

    Obtaining Atomic Matrix Elements from Vector Tune-Out Wavelengths using Atom Interferometry

    Accurate values for atomic dipole matrix elements are useful in many areas of physics, and in particular for interpreting experiments such as atomic parity violation. Obtaining accurate matrix element values is a challenge for both experiment and theory. A new technique that can be applied to this problem is tune-out spectroscopy, which is the measurement of light wavelengths where the electric polarizability of an atom has a zero. Using atom interferometry methods, tune-out wavelengths can be measured very accurately. Their values depend on the ratios of various dipole matrix elements and are thus useful for constraining theory and broadening the application of experimental values. Tune-out wavelength measurements to date have focused on zeros of the scalar polarizability, but in general the vector polarizability also contributes. We show here that combined measurements of the vector and scalar polarizabilities can provide more detailed information about the matrix element ratios, and in particular can distinguish small contributions from the atomic core and the valence tail states. These small contributions are the leading error sources in current parity violation calculations for cesium.
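The sum-over-states structure behind a tune-out wavelength can be illustrated with a toy two-transition model: between two resonances, the (negative) contribution of the lower-frequency line and the (positive) contribution of the higher-frequency line cancel at one frequency. The sketch below uses cesium-like D1/D2 resonance wavelengths but an assumed, purely illustrative matrix-element ratio; it is not a calculation of the real cesium tune-out point.

```python
# Toy two-transition scalar polarizability and its zero (tune-out) crossing.
import numpy as np

c = 2.99792458e8                 # speed of light, m/s
w1 = 2 * np.pi * c / 894.6e-9    # D1-like resonance (angular frequency, rad/s)
w2 = 2 * np.pi * c / 852.3e-9    # D2-like resonance
d1_sq, d2_sq = 1.0, 2.0          # assumed squared matrix elements (arbitrary units)

def alpha(w):
    """Scalar polarizability (arbitrary units) in sum-over-states form."""
    return d1_sq * w1 / (w1**2 - w**2) + d2_sq * w2 / (w2**2 - w**2)

# Between the resonances alpha goes from -inf (just above w1) to +inf
# (just below w2), so it crosses zero exactly once there; bisect for it.
lo, hi = w1 * 1.0001, w2 * 0.9999
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if alpha(lo) * alpha(mid) <= 0:
        hi = mid
    else:
        lo = mid

tune_out_nm = 2 * np.pi * c / mid * 1e9
print(f"tune-out wavelength: {tune_out_nm:.2f} nm")
```

Because the zero's position depends on the ratio d2_sq / d1_sq, measuring the tune-out wavelength constrains that ratio, which is the basic idea the abstract extends to the vector polarizability and to core and tail contributions.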

    Evaluation of teenage pregnancy interventions in Wigan

    This report presents the findings from a 12-month study that involved the development of an online questionnaire and analysis of over 50 completed responses. The questionnaire aimed to determine the impact of a variety of services in Wigan that currently engage in strategies to reduce teenage pregnancy rates in the borough. The report begins with the background and specific study aims and objectives, followed by a policy and literature overview. Details of the study design and the processes undertaken to develop the instrument are given, together with data collected from a number of participating sites. These data were analysed, and the findings and recommendations are presented.

    Processing (Post)humanism, Mediating Desire: Technology in the Works of Three Border Playwrights

    New electronic technology, such as personal video cameras, videotape players, and the internet, has increasingly sparked interest from Northern Mexican border authors across genres. In Juan Ríos’s Generación Atari, Francisco J. López’s Cibernauta: cómo vivir atrapado en la red, and Bárbara Colio’s Teoría y práctica de la muerte de una cucaracha (sin dolor) and La habitación, technological innovations play a key role in the development of the central emotional conflicts. The four dramatic works relate new technology to an increased social openness regarding more diverse expressions of sexuality, yet they also portray existing hierarchies, fraught relationships, and tragic events that signal the limits and interruptions involved in the technological mediation of desire. Rather than any wholesale condemnation or celebration of technology, these works pose human-machine relations as an open question to be shared with and pondered by the audience.

    Organic Beef Production - Sire Breed Comparison

    The results to date from this sire breed comparison study indicate that, with the contrasting Aberdeen Angus and Charolais sire breeds, it is possible to achieve animal performance comparable to well-managed conventional suckler calf-to-beef systems (300 kg carcass for heifers in November and 400 kg carcass for steers in March). Similarly, the responses to sire breed type, sex and date of slaughter for the organic beef animals are biologically consistent. Organic beef is produced under organic rules in response to consumer demand for organic product. The organic system contributes to the protection of the environment and animal welfare. "We have not inherited the world from our forefathers; we have borrowed it from our children" (Kashmiri proverb).

    Adjustment and the labor market

    How has the labor market responded to changes in macroeconomic conditions and related government policies? And to what extent has government intervention affected the microeconomic functioning of the labor market? Geographical immobility of workers does not seem to hinder adjustment. Labor is increasingly deployed in nontradables and import-competing sectors, however, and problems of mobility between tradables and nontradables are reported. In addition, shortages of skilled manpower are reported. There is little evidence of wage resistance where wage indexation is not institutionalized. Traditional methods of wage support have become less important in the past two decades. Where effective minimum wage policies exist, they have the expected distortionary effects. Wage differences between the public and private sectors, particularly in sub-Saharan Africa, have continued to widen, and the efficiency of the public sector has declined as a result. Job security regulations may be an obstacle to structural adjustment programs insofar as they hinder the release of labor from contracting sectors.