
    A Conversation with Alan Gelfand

    Alan E. Gelfand was born April 17, 1945, in the Bronx, New York. He attended public grade schools and did his undergraduate work at what was then called City College of New York (CCNY, now CUNY), excelling at mathematics. He then surprised and saddened his mother by going all the way across the country to Stanford for graduate school, where he completed his dissertation in 1969 under the direction of Professor Herbert Solomon, making him an academic grandson of Herman Rubin and Harold Hotelling. Alan then accepted a faculty position at the University of Connecticut (UConn), where he was promoted to tenured associate professor in 1975 and to full professor in 1980. A few years later he became interested in decision theory, then empirical Bayes, which eventually led to the publication of Gelfand and Smith [J. Amer. Statist. Assoc. 85 (1990) 398-409], the paper that introduced the Gibbs sampler to most statisticians and revolutionized Bayesian computing. In the mid-1990s, Alan's interests turned strongly to spatial statistics, leading to fundamental contributions in spatially varying coefficient models, coregionalization, and spatial boundary analysis (wombling). He spent 33 years on the faculty at UConn, retiring in 2002 to become the James B. Duke Professor of Statistics and Decision Sciences at Duke University, serving as chair from 2007 to 2012. At Duke, he has continued his work in spatial methodology while increasing his impact in the environmental sciences. To date, he has published over 260 papers and 6 books; he has also supervised 36 Ph.D. dissertations and 10 postdocs. This interview was done just prior to a conference of his family, academic descendants, and colleagues to celebrate his 70th birthday and his contributions to statistics, which took place on April 19-22, 2015, at Duke University.
    Comment: Published at http://dx.doi.org/10.1214/15-STS521 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
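
    The Gibbs sampler that Gelfand and Smith brought to statisticians draws from a joint distribution by cycling through each variable's full conditional distribution. As a minimal illustration (my own toy example, not one from the 1990 paper), the sampler below targets a standard bivariate normal with correlation rho, whose full conditionals are the two univariate normals drawn in the loop:

import numpy as np

def gibbs_bivariate_normal(rho, n_samples=5000, burn_in=500, seed=0):
    """Toy Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is N(rho * other, 1 - rho**2), so we alternate
    draws from the two conditionals and keep the post-burn-in samples.
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    draws = []
    for i in range(n_samples + burn_in):
        x = rng.normal(rho * y, np.sqrt(1 - rho**2))  # draw x | y
        y = rng.normal(rho * x, np.sqrt(1 - rho**2))  # draw y | x
        if i >= burn_in:
            draws.append((x, y))
    return np.array(draws)

samples = gibbs_bivariate_normal(rho=0.8)
print(samples.mean(axis=0), np.corrcoef(samples.T)[0, 1])  # near (0, 0) and 0.8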

    Experimental Arc Furnace Smelting of Titaniferous Iron Sands From the Northern Oregon Coast

    This investigation is an attempt to experimentally smelt titaniferous iron sands from the Northern Oregon Coast in a laboratory arc furnace.

    The Role of Capital in Financial Institutions

    This paper examines the role of capital in financial institutions. As the introductory article to a conference on the role of capital management in banking and insurance, it describes the authors' views of why capital is important, how market-generated "capital requirements" differ from regulatory requirements, and the form that regulatory requirements should take. It also examines the historical trends in bank capital, problems in measuring capital, and some possible unintended consequences of capital requirements.

    According to the authors, the point of departure for all modern research on capital structure is the Modigliani-Miller (M&M, 1958) proposition that in a frictionless world of full information and complete markets, a firm's capital structure cannot affect its value. The authors suggest, however, that financial institutions lack any plausible rationale in the frictionless world of M&M. Most of the past research on financial institutions has begun with a set of assumed imperfections, such as taxes, costs of financial distress, transactions costs, asymmetric information, and regulation. Miller (1995) argues that these imperfections may not be important enough to overturn the M&M proposition. Most of the other papers presented at this conference on capital take the view that the deviations from M&M's frictionless world are important, so that financial institutions may be able to enhance their market values by taking on an optimal amount of leverage. The authors highlight these positions in this article.

    The authors next examine why markets "require" financial institutions to hold capital. They define this "capital requirement" as the capital ratio that maximizes the value of the bank in the absence of regulatory capital requirements and all the regulatory mechanisms used to enforce them, but in the presence of the rest of the regulatory structure that protects the safety and soundness of banks. While the requirement differs for each bank, it is the ratio toward which each bank would tend to move in the long run in the absence of regulatory capital requirements. The authors then introduce imperfections into the frictionless world of M&M: taxes and the costs of financial distress, transactions costs and asymmetric information problems, and the regulatory safety net. The authors' analysis suggests that departures from the frictionless M&M world may help explain market capital requirements for banks. Tax considerations tend to reduce market capital requirements, the expected costs of financial distress tend to raise these requirements, and transactions costs and asymmetric information problems may either increase or reduce the capital held in equilibrium. The federal safety net shields bank creditors from the full consequences of bank risk taking and thus tends to reduce market capital requirements.

    The paper then summarizes the historical evolution of bank capital ratios in the United States and the reasons regulators require financial institutions to hold capital. The authors suggest that regulatory capital requirements are blunt standards that respond only minimally to perceived differences in risk, rather than the continuous prices and quantity limits set by uninsured creditors in response to changing perceptions of the risk of individual banks. The authors suggest an ideal system for setting capital standards but agree that it would be prohibitively expensive, if not impossible. Regulators lack precise estimates of social costs and benefits to tailor a capital requirement for each bank, and they cannot easily revise the requirements continuously as conditions change.

    The authors continue with suggestions for measuring regulatory capital more effectively. They suggest that a simple risk-based capital ratio is a relatively blunt tool for controlling bank risk-taking. The capital in the numerator may not always control bank moral hazard incentives; it is difficult to measure, and its measured value may be subject to manipulation by "gains trading." The risk exposure in the denominator is also difficult to measure, corresponds only weakly to actual risk, and may be subject to significant manipulation. These imprecisions worsen the social tradeoff between the externalities from bank failures and the quantity of bank intermediation. To keep bank risk to a tolerable level, capital standards must be higher on average than they otherwise would be if the capital ratios could be set more precisely, raising bank costs and reducing the amount of intermediation in the economy in the long run.

    Since actual capital standards are, at best, an approximation to the ideal, the authors argue that it should not be surprising that they may have had some unintended effects. They examine two unintended effects on bank portfolio risk and credit allocation: the explosive growth of securitization and the so-called credit crunch by U.S. banks in the early 1990s. The authors show that capital requirements may give incentives for some banks to increase their risks of failure. Inaccuracies in setting capital requirements distort relative prices and may create allocative inefficiencies that divert financial resources from their most productive uses. During the 1980s, capital requirements may have created artificial incentives for banks to take off-balance-sheet risk, and changes in capital requirements in the 1990s may have contributed to a credit crunch.
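
    The "simple risk-based capital ratio" criticized above divides a capital measure (the numerator) by a risk-exposure measure (the denominator), usually risk-weighted assets. The sketch below illustrates only that arithmetic; the asset categories, amounts, and risk weights are hypothetical placeholders, not any regulator's actual schedule:

def risk_based_capital_ratio(capital, exposures):
    """Capital divided by risk-weighted assets.

    `exposures` maps an asset label to (amount, risk_weight); the weights
    used below are illustrative placeholders, not a real weighting schedule.
    """
    risk_weighted_assets = sum(amount * weight for amount, weight in exposures.values())
    return capital / risk_weighted_assets

# Hypothetical bank balance sheet (amounts in $ millions).
exposures = {
    "government_bonds": (400, 0.0),      # sovereign debt often carries a low weight
    "residential_mortgages": (300, 0.5),
    "commercial_loans": (250, 1.0),
}
print(f"Risk-based capital ratio: {risk_based_capital_ratio(40, exposures):.1%}")
# 40 / (0 + 150 + 250) = 10.0%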

    Is query translation a distinct task from search?

    The University of Sheffield participated in iCLEF 2002 using, as a test-bed, the prototype under development in the Clarity project. Clarity is an EU-funded project aimed at developing a system for cross-language information retrieval for so-called low-density languages, those with few translation resources. Currently translation between English and Finnish is supported; Swedish will be added soon, and Latvian and Lithuanian in the near future. Clarity is being developed in a user-centred way with user involvement from the beginning. The design of the first user interface was based on current best practice; particular attention was paid to empirical evidence for specific design choices. Six paper-based interface mock-ups representing important points in the cross-language search task were generated and presented for user assessment as part of an extensive user study. The study (reported in Petrelli et al. 2002) was conducted to understand users and uses of cross-language information retrieval systems. Many different techniques were applied: contextual enquiry, interviews, questionnaires, informal evaluation of existing cross-language technology, and participatory design sessions with the interface mock-ups mentioned above. As a result, a user class profile was sketched and a long list of user requirements was compiled. As a follow-up, a redesign session took place and the new system was designed for users who:
    • know the language(s) they are searching (polyglots);
    • search for writing (journalists, translators, business analysts);
    • have limited searching skills;
    • know the topic in advance or will learn/read about it while searching;
    • use many languages in the same search session and often swap between them.
    New system features were listed as important and the user interface was redesigned. Considering the results of the study, the new interface allowed the user to change the language setting dynamically from query to query, hid the query translation, and showed the retrieved set primarily as a ranked list. Although this new design was considered to be more effective, a comparison between the first layout, based on the relevant literature, and the new one, based on the user study, was considered an important research question. In particular, the choice of hiding the query translation was considered an important design decision, against the common agreement that the user should be allowed and supported in controlling the system's actions. Thus Sheffield's participation in iCLEF was organized around the question of whether the user should validate the query translation before the search is run, or whether the system should perform the translation and search in a single step without any user supervision.
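
    The design question at stake, whether the user validates the query translation before searching or the system translates and searches in a single step, can be pictured as two variants of one pipeline. The sketch below is a schematic of that comparison, not Clarity code; the translation and search functions are stubs standing in for the project's actual components:

from typing import Callable, List

def translate_query(query: str, target_lang: str) -> List[str]:
    """Stub: return candidate translations (a dictionary-based CLIR system
    would typically produce several per term)."""
    return [f"{query} [{target_lang}]"]

def search(translated_query: str) -> List[str]:
    """Stub: return a ranked list of document identifiers."""
    return [f"doc matching '{translated_query}'"]

def one_step_search(query: str, target_lang: str) -> List[str]:
    """Hidden-translation design: translate and search without user input."""
    best = translate_query(query, target_lang)[0]
    return search(best)

def supervised_search(query: str, target_lang: str,
                      choose: Callable[[List[str]], str]) -> List[str]:
    """Validated-translation design: the user picks or edits the translation
    before the search is run."""
    candidates = translate_query(query, target_lang)
    return search(choose(candidates))

# Hidden translation vs. user-validated translation on the same query.
print(one_step_search("ice hockey", "fi"))
print(supervised_search("ice hockey", "fi", choose=lambda cands: cands[0]))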

    The Net Effect of Exchange Rates on Agricultural Inputs and Outputs

    For more than thirty years, studies about the effect of the exchange rate on exports have been conducted. However, few have considered the combined effect of the exchange rate on imported inputs into the agricultural system and on the exports of the final agricultural products those inputs produce. A current concern is the net effect, as the total value and quantity of inputs imported has increased. This research examines the effect of exchange rate changes on imported inputs into the corn, wheat, and beef cattle production systems. Effects on cost-of-production budgets are calculated to examine the impact on profitability. Vector Autoregression (VAR) and Bayesian Averaging of Classical Estimates (BACE) models were estimated to evaluate those effects. Daily and weekly price data were used for corn, wheat, feeder steers, ethanol, diesel, ammonia, urea, di-ammonium phosphate, and the exchange rate. A VAR model was estimated to model the relationship between the variables. After incongruous test results in determining the lag-length structure, it was decided that a BACE model would be estimated. After estimating the BACE model, the price responses of the commodities to the exchange rate were estimated. The price responses were used to demonstrate the effect of the exchange rate on a producer's profitability. It was determined that, generally, a strengthening exchange rate has a negative impact on prices. It was also found that the exchange rate has a greater impact on prices now than it did 14 years ago, implying that the exchange rate now has a greater effect on profitability. A one percent increase in the value of the dollar led to a decline in profitability ranging from $0.02/bu in wheat to $0.56/cwt in feeder steers. However, agricultural producers should not be overly concerned about a lower-valued dollar from the perspective of their agricultural business.
    Subject categories: Agricultural and Food Policy, Crop Production/Industries, International Relations/Trade, Livestock Production/Industries.
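
    A vector autoregression of the kind estimated here regresses each series on lagged values of all the series, and the price response to an exchange-rate movement can then be read off the impulse responses. The snippet below is a generic illustration with statsmodels on synthetic data; it is not the authors' BACE specification, lag structure, or dataset:

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Synthetic weekly data standing in for the exchange rate and two commodity prices.
rng = np.random.default_rng(1)
n = 300
exchange_rate = np.cumsum(rng.normal(0, 0.2, n)) + 100
corn = 400 - 0.8 * exchange_rate + np.cumsum(rng.normal(0, 0.5, n))
wheat = 500 - 0.5 * exchange_rate + np.cumsum(rng.normal(0, 0.5, n))
data = pd.DataFrame({"exchange_rate": exchange_rate, "corn": corn, "wheat": wheat})

# Fit a VAR on first differences, choosing the lag length by information criterion.
model = VAR(data.diff().dropna())
results = model.fit(maxlags=8, ic="aic")

# Impulse responses: how a one-unit exchange-rate shock feeds into prices over 10 periods.
irf = results.irf(10)
print(results.k_ar)       # chosen lag length
print(irf.irfs[:, 1, 0])  # response of corn to an exchange-rate shock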

    A comprehensive astrodynamic exposition and classification of earth-moon transits

    Three-body problem for the development of a geometrical and topological taxonomy of earth-moon transits.
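
    Taxonomies of Earth-Moon transits are typically built in the circular restricted three-body problem, in which the spacecraft moves in the rotating Earth-Moon frame under the gravity of both primaries. The sketch below integrates those standard planar equations of motion; the initial state is an arbitrary illustrative orbit, not a trajectory from the report:

import numpy as np
from scipy.integrate import solve_ivp

MU = 0.012150585  # Earth-Moon mass parameter m2 / (m1 + m2)

def cr3bp(t, state, mu=MU):
    """Planar circular restricted three-body problem in the rotating frame
    (nondimensional units: distance = Earth-Moon separation, time = 1/mean motion)."""
    x, y, vx, vy = state
    r1 = np.sqrt((x + mu) ** 2 + y ** 2)      # distance to Earth at (-mu, 0)
    r2 = np.sqrt((x - 1 + mu) ** 2 + y ** 2)  # distance to Moon at (1 - mu, 0)
    ax = x + 2 * vy - (1 - mu) * (x + mu) / r1**3 - mu * (x - 1 + mu) / r2**3
    ay = y - 2 * vx - (1 - mu) * y / r1**3 - mu * y / r2**3
    return [vx, vy, ax, ay]

# Illustrative bound orbit about the Earth in the rotating frame, not a designed transit.
state0 = [0.2, 0.0, 0.0, 1.6]
sol = solve_ivp(cr3bp, (0.0, 10.0), state0, rtol=1e-9, atol=1e-12, dense_output=True)
print(sol.y[:2, -1])  # final (x, y) position of the integrated arc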

    Buffet test in the National Transonic Facility

    A buffet test of a commercial transport model was accomplished in the National Transonic Facility at the NASA Langley Research Center. This aeroelastic test was unprecedented for this wind tunnel and posed a high risk for the facility. Presented here are the test results from a structural dynamics and aeroelastic response point of view. The activities required for the safety analysis and risk assessment are described. The test was conducted in the same manner as a flutter test and employed on-board dynamic instrumentation, real-time dynamic data monitoring, and automatic and manual tunnel interlock systems for protecting the model.

    Low temperature shape relaxation of 2-d islands by edge diffusion

    We present a precise microscopic description of the limiting step for low-temperature shape relaxation of two-dimensional islands in which activated diffusion of particles along the boundary is the only mechanism of transport allowed. In particular, we are able to explain why the system is driven irreversibly towards equilibrium. Based on this description, we present a scheme for calculating the duration of the limiting step at each stage of the relaxation process. Finally, we calculate numerically the total relaxation time as predicted by our results and compare it with simulations of the relaxation process.
    Comment: 11 pages, 5 figures, to appear in Phys. Rev.
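
    Because transport here is thermally activated hopping along the island edge, the duration of each limiting step is of the order of the inverse Arrhenius rate for its barrier, and a total relaxation time can be accumulated by summing those durations. The fragment below shows only that bookkeeping, with made-up barriers and a typical attempt frequency; it is not the authors' scheme for identifying the limiting step:

import numpy as np

K_B = 8.617e-5  # Boltzmann constant in eV/K

def hop_rate(barrier_ev, temperature_k, attempt_freq=1e13):
    """Arrhenius rate (1/s) for a thermally activated edge hop."""
    return attempt_freq * np.exp(-barrier_ev / (K_B * temperature_k))

def total_relaxation_time(barriers_ev, temperature_k):
    """Sum the durations of successive limiting steps, each taken as the
    inverse rate of its activation barrier (illustrative bookkeeping only)."""
    return sum(1.0 / hop_rate(b, temperature_k) for b in barriers_ev)

# Hypothetical barriers (eV) for the limiting step at successive relaxation stages.
barriers = [0.55, 0.60, 0.65]
for T in (150, 200, 300):
    print(f"T = {T} K: t_relax ~ {total_relaxation_time(barriers, T):.3e} s")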

    Exchange between deep donors in semiconductors: a quantum defect approach

    Exchange interactions among defects in semiconductors are commonly treated within effective-mass theory using a scaled hydrogenic wave-function. However, such a wave-function is only applicable to shallow impurities; here we present a simple but robust generalization to treat deep donors, in which we treat the long-range part of the wavefunction using the well-established quantum defect theory and include a model central-cell correction to fix the bound-state eigenvalue at the experimentally observed value. This allows us to compute the effect of binding energy on exchange interactions as a function of donor distance; this is a significant quantity given recent proposals to carry out quantum information processing using deep donors. As expected, exchange interactions are suppressed (or increased), compared to the hydrogenic case, by the greater localization (or delocalization) of the wavefunctions of deep donors (or "super-shallow" donors with binding energy less than the hydrogenic value). The calculated results are compared with a simple scaling of the Heitler-London hydrogenic exchange; the scaled hydrogenic results give the correct order of magnitude but fail to reproduce our calculations quantitatively. We calculate the donor exchange in silicon, including inter-valley interference terms, for donor pairs along the {100} direction, and also show the influence of the donor type on the distribution of nearest-neighbour exchange constants at different concentrations. Our methods can be used to compute the exchange interactions between two donor electrons with arbitrary binding energy.
    Comment: 11 pages, 10 figures, RevTeX
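
    The scaling idea in this abstract is that a donor wavefunction decays over an effective radius set by its binding energy, roughly a* = a_B * sqrt(E_hydrogenic / E_binding) in quantum-defect fashion, so exchange between two donors falls off much faster when the donors are deep. The snippet below illustrates only that scaling, using the asymptotic hydrogenic Heitler-London form with a rescaled radius; the silicon-like parameters are approximate placeholders and the prefactor is omitted, so the outputs are relative numbers, not results from the paper:

import numpy as np

# Effective-mass parameters, roughly appropriate for donors in silicon (placeholders).
E_HYDROGENIC = 31.3  # effective hydrogenic binding energy, meV
BOHR_EFF = 1.94      # effective Bohr radius, nm

def effective_radius(binding_mev):
    """Quantum-defect-style rescaling: deeper binding -> more localized state."""
    return BOHR_EFF * np.sqrt(E_HYDROGENIC / binding_mev)

def scaled_exchange(distance_nm, binding_mev):
    """Asymptotic Heitler-London exchange (prefactor omitted) with the
    hydrogenic radius replaced by the binding-energy-scaled radius."""
    r = distance_nm / effective_radius(binding_mev)
    return r**2.5 * np.exp(-2.0 * r)  # relative units only

# Hydrogenic donor (31.3 meV) vs. a deep donor (100 meV) at the same 10 nm separation.
for e_b in (31.3, 100.0):
    print(f"binding {e_b} meV: relative exchange {scaled_exchange(10.0, e_b):.2e}")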