
    Erratum: Next-to-leading order supersymmetric QCD predictions for associated production of gauginos and gluinos [Phys. Rev. D 62, 095014 (2000)]

    Errors in the published version of the paper are corrected, and new figures are provided. Comment: 3 pages, LaTeX, 4 figures

    Two-particle spatial correlations in superfluid nuclei

    We discuss the effect of pairing on two-neutron spatial correlations in deformed nuclei. The spatial correlations are described by the pairing tensor in coordinate space calculated in the HFB approach. The calculations are done using the D1S Gogny force. We show that the pairing tensor has a rather small extension in the relative coordinate, a feature observed earlier in spherical nuclei. It is pointed out that in deformed nuclei the coherence length corresponding to the pairing tensor has a pattern similar to what we have found previously in spherical nuclei, i.e., it is maximal in the interior of the nucleus and then decreases rather fast in the surface region, where it reaches a minimal value of about 2 fm. This minimal value of the coherence length at the surface is essentially determined by the finite-size properties of single-particle states in the vicinity of the chemical potential and has little to do with enhanced pairing correlations in the nuclear surface. It is shown that in nuclei the coherence length is not a good indicator of the intensity of pairing correlations. This feature is contrasted with the situation in infinite matter. Comment: 14 pages, 17 figures, submitted to PR
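For orientation, here is a minimal sketch of the quantities the abstract refers to, using one common convention from the HFB literature (the paper's exact definitions and normalisations may differ):

```latex
% Pairing tensor in coordinate space (spin-singlet part) for the HFB state
\kappa(\mathbf{r}_1,\mathbf{r}_2)
  = \langle \mathrm{HFB} \,|\, \hat{\psi}(\mathbf{r}_2\!\downarrow)\,
      \hat{\psi}(\mathbf{r}_1\!\uparrow) \,|\, \mathrm{HFB} \rangle

% Local coherence length: rms extension of kappa in the relative coordinate r
% at centre-of-mass position R (maximal in the interior, ~2 fm at the surface)
\xi(\mathbf{R})
  = \left[
      \frac{\int d^3 r \; r^2 \,
            \bigl|\kappa(\mathbf{R}+\mathbf{r}/2,\;\mathbf{R}-\mathbf{r}/2)\bigr|^2}
           {\int d^3 r \;
            \bigl|\kappa(\mathbf{R}+\mathbf{r}/2,\;\mathbf{R}-\mathbf{r}/2)\bigr|^2}
    \right]^{1/2}
```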

    Optimisation of patch distribution strategies for AMR applications

    As core counts increase in the world's most powerful supercomputers, applications are becoming limited not only by computational power, but also by data availability. In the race to exascale, efficient and effective communication policies are key to achieving optimal application performance. Applications using adaptive mesh refinement (AMR) trade off communication for computational load balancing, to enable focused computation on specific areas of interest. This class of application is particularly susceptible to the communication performance of the underlying architecture and is inherently difficult to scale efficiently. In this paper we present a study of the effect of patch distribution strategies on the scalability of an AMR code. We demonstrate the significance of patch placement for communication overheads, and by balancing the computation and communication costs of patches, we develop a scheme to optimise the performance of a specific, industry-strength benchmark application.
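To make the compute/communication trade-off concrete, here is a minimal, hypothetical greedy placement sketch. The patch fields, the linear cost model, and `comm_penalty` are illustrative assumptions, not the paper's actual scheme:

```python
# Toy greedy patch placement: balance per-rank compute load while
# penalising patches whose neighbours end up on a different rank.
from dataclasses import dataclass, field

@dataclass
class Patch:
    pid: int
    compute_cost: float                               # e.g. proportional to cell count
    neighbours: list = field(default_factory=list)    # ids of adjacent patches

def place_patches(patches, n_ranks, comm_penalty=0.1):
    """Return {patch_id: rank}, chosen greedily under an illustrative cost model."""
    placement = {}
    load = [0.0] * n_ranks
    # Largest patches first: classic longest-processing-time heuristic.
    for p in sorted(patches, key=lambda q: q.compute_cost, reverse=True):
        best_rank, best_score = None, float("inf")
        for r in range(n_ranks):
            # Communication term: neighbours already placed on other ranks.
            remote = sum(1 for n in p.neighbours
                         if n in placement and placement[n] != r)
            score = load[r] + p.compute_cost + comm_penalty * remote
            if score < best_score:
                best_rank, best_score = r, score
        placement[p.pid] = best_rank
        load[best_rank] += p.compute_cost
    return placement

if __name__ == "__main__":
    demo = [Patch(0, 4.0, [1]), Patch(1, 2.0, [0, 2]),
            Patch(2, 2.0, [1, 3]), Patch(3, 1.0, [2])]
    print(place_patches(demo, n_ranks=2))
```

A real AMR load balancer would also weigh message sizes, refinement levels, and repartitioning cost; the sketch only shows the shape of the compute-versus-communication balance the abstract describes.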

    The 'gated-diode' configuration in MOSFET's, a sensitive tool for characterizing hot-carrier degradation

    This paper describes a new measurement technique, the forward gated-diode current characterized at low drain voltages, to be applied in MOSFET's for investigating hot-carrier stress-induced defects at high spatial resolution. The generation/recombination current in the drain-to-substrate diode as a function of gate voltage, combined with two-dimensional numerical simulation, provides a sensitive tool for detecting the spatial distribution and density of interface defects. In the case of strong accumulation, additional information is obtained from interband tunneling processes occurring via interface defects. The various mechanisms for generating interface defects and fixed charges under variable stress conditions are discussed, showing that information complementary to that available from other methods is obtained.
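For context only, the textbook gated-diode relations (Fitzgerald-Grove style; the paper's low-drain-voltage variant and its 2-D simulation go beyond this) tie the surface recombination component of the forward diode current to the interface-defect density:

```latex
% Surface recombination velocity from interface traps (mid-gap SRH picture);
% sigma = capture cross-section, v_th = thermal velocity, N_it = trap density
s_0 \approx \sigma\, v_{\mathrm{th}}\, N_{\mathrm{it}}

% Surface contribution to the forward gated-diode current when the gate
% depletes the surface over an effective area A_s at forward bias V_F
I_{\mathrm{surf}} \approx \tfrac{1}{2}\, q\, n_i\, s_0\, A_s\,
  \exp\!\left(\frac{q V_F}{2 k T}\right)
```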

    Spin Precession and Avalanches

    In many magnetic materials, spin dynamics at short times are dominated by precessional motion because damping is relatively small. In the limit of no damping and no thermal noise, we show that for a large enough initial instability, an avalanche can transition to an ergodic phase where the state is equivalent to one at finite temperature, often above that for ferromagnetic ordering. This dynamical nucleation phenomenon is analyzed theoretically. For small finite damping, the high-temperature growth front becomes spread out over a large region. The implications for real materials are discussed. Comment: 4 pages, 2 figures
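The precession-dominated, weak-damping regime invoked above is the small-α limit of the standard Landau-Lifshitz-Gilbert dynamics, quoted here for context (the paper's specific model details are not reproduced):

```latex
% Landau-Lifshitz-Gilbert equation for a unit magnetisation m(t);
% the undamped limit alpha -> 0 gives pure precession about H_eff
\frac{d\mathbf{m}}{dt}
  = -\gamma\, \mathbf{m}\times\mathbf{H}_{\mathrm{eff}}
    + \alpha\, \mathbf{m}\times\frac{d\mathbf{m}}{dt}
```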

    Integrated optics for astronomical interferometry. I. Concept and astronomical applications

    We propose a new instrumental concept for long-baseline optical single-mode interferometry using integrated optics, which were developed for telecommunications. Visible and infrared multi-aperture interferometry requires many optical functions (spatial filtering, beam combination, photometric calibration, polarization control) to detect astronomical signals at very high angular resolution. Since the 1980s, integrated optics on planar substrates have become available for telecommunication applications with multiple optical functions like power dividing, coupling, multiplexing, etc. We present the concept of an optical / infrared interferometric instrument based on this new technology. The main advantage is to provide an interferometric combination unit on a single optical chip. Integrated optics are compact and provide stability, low sensitivity to external constraints such as temperature, pressure, or mechanical stresses, no optical alignment except for coupling, simplicity, and intrinsic polarization control. The integrated optics devices are inexpensive compared to devices that have the same functionalities in bulk optics. We think integrated optics will fundamentally change single-mode interferometry. Integrated optics devices are particularly well suited for interferometric combination of numerous beams to achieve aperture synthesis imaging, or for space-based interferometers where stability and a minimum of optical alignments are desired. Comment: 11 pages, 8 figures, accepted by Astronomy and Astrophysics Supplement Series
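As a reminder of the measurement such a combiner performs (generic two-beam case, not specific to the proposed chip), the fringe signal at optical path difference δ, with the photometric outputs P1 and P2 used for calibration, takes the standard form:

```latex
% Two-beam interferogram: P_1, P_2 are single-beam (photometric) fluxes,
% V the complex visibility carrying the astrophysical information
I(\delta) = P_1 + P_2
  + 2\sqrt{P_1 P_2}\; |V| \cos\!\left(\frac{2\pi\delta}{\lambda} + \phi_V\right)
```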

    The Role of Capital in Financial Institutions

    This paper examines the role of capital in financial institutions. As the introductory article to a conference on the role of capital management in banking and insurance, it describes the authors' views of why capital is important, how market-generated 'capital requirements' differ from regulatory requirements, and the form that regulatory requirements should take. It also examines the historical trends in bank capital, problems in measuring capital, and some possible unintended consequences of capital requirements.

    According to the authors, the point of departure for all modern research on capital structure is the Modigliani-Miller (M&M, 1958) proposition that in a frictionless world of full information and complete markets, a firm's capital structure cannot affect its value. The authors suggest, however, that financial institutions lack any plausible rationale in the frictionless world of M&M. Most of the past research on financial institutions has begun with a set of assumed imperfections, such as taxes, costs of financial distress, transactions costs, asymmetric information, and regulation. Miller (1995) argues that these imperfections may not be important enough to overturn the M&M proposition. Most of the other papers presented at this conference on capital take the view that the deviations from M&M's frictionless world are important, so that financial institutions may be able to enhance their market values by taking on an optimal amount of leverage. The authors highlight these positions in this article.

    The authors next examine why markets 'require' financial institutions to hold capital. They define this 'capital requirement' as the capital ratio that maximizes the value of the bank in the absence of regulatory capital requirements and all the regulatory mechanisms used to enforce them, but in the presence of the rest of the regulatory structure that protects the safety and soundness of banks. While the requirement differs for each bank, it is the ratio toward which each bank would tend to move in the long run in the absence of regulatory capital requirements. The authors then introduce imperfections into the frictionless world of M&M: taxes and the costs of financial distress, transactions costs and asymmetric information problems, and the regulatory safety net. The authors' analysis suggests that departures from the frictionless M&M world may help explain market capital requirements for banks. Tax considerations tend to reduce market capital requirements, the expected costs of financial distress tend to raise these requirements, and transactions costs and asymmetric information problems may either increase or reduce the capital held in equilibrium. The federal safety net shields bank creditors from the full consequences of bank risk taking and thus tends to reduce market capital requirements.

    The paper then summarizes the historical evolution of bank capital ratios in the United States and the reasons regulators require financial institutions to hold capital. The authors suggest that regulatory capital requirements are blunt standards that respond only minimally to perceived differences in risk, rather than the continuous prices and quantity limits set by uninsured creditors in response to changing perceptions of the risk of individual banks. The authors suggest an ideal system for setting capital standards but agree that it would be prohibitively expensive, if not impossible: regulators lack precise estimates of the social costs and benefits needed to tailor a capital requirement for each bank, and they cannot easily revise the requirements continuously as conditions change.

    The authors continue with suggestions for measuring regulatory capital more effectively. They suggest that a simple risk-based capital ratio is a relatively blunt tool for controlling bank risk-taking. The capital in the numerator may not always control bank moral hazard incentives; it is difficult to measure, and its measured value may be subject to manipulation by 'gains trading'. The risk exposure in the denominator is also difficult to measure, corresponds only weakly to actual risk, and may be subject to significant manipulation. These imprecisions worsen the social tradeoff between the externalities from bank failures and the quantity of bank intermediation. To keep bank risk to a tolerable level, capital standards must be higher on average than they otherwise would be if the capital ratios could be set more precisely, raising bank costs and reducing the amount of intermediation in the economy in the long run.

    Since actual capital standards are, at best, an approximation to the ideal, the authors argue that it should not be surprising that they may have had some unintended effects. They examine two unintended effects on bank portfolio risk and credit allocation: the explosive growth of securitization and the so-called credit crunch by U.S. banks in the early 1990s. The authors show that capital requirements may give some banks incentives to increase their risk of failure. Inaccuracies in setting capital requirements distort relative prices and may create allocative inefficiencies that divert financial resources from their most productive uses. During the 1980s, capital requirements may have created artificial incentives for banks to take off-balance-sheet risk, and changes in capital requirements in the 1990s may have contributed to a credit crunch.
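For reference, the Modigliani-Miller benchmark invoked above, together with the textbook corporate-tax adjustment that illustrates why taxes tend to lower market capital requirements, can be stated compactly (standard textbook forms, not taken from the paper):

```latex
% M&M Proposition I (frictionless markets): firm value is independent of leverage
V_L = V_U

% With a corporate tax rate t_c and debt D, the levered firm gains a debt tax shield
V_L = V_U + t_c D
```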