
    Why do banks hold capital in excess of regulatory requirements? A functional approach

    This paper provides an explanation for the observation that banks hold on average a capital ratio in excess of regulatory requirements. We use a functional approach to banking based on Diamond and Rajan (2001) to demonstrate that banks can use capital ratios as a strategic tool for renegotiating loans with borrowers. As capital ratios affect the ability of banks to collect loans in a nonmonotonic way, a bank may be forced to exceed capital requirements. Moreover, high capital ratios may also constrain the amount a banker can borrow from investors. Consequently, the size of the banking sector may shrink.
    Keywords: incomplete contracts, minimum capital requirements, bank capital, disintermediation, procyclicality

    Banks' Internationalization Strategies: The Role of Bank Capital Regulation

    This paper studies how capital requirements influence a bank's mode of entry into foreign financial markets. We develop a model of an internationally operating bank that creates and allocates liquidity across countries and argue that the advantage of multinational banking over offering cross-border financial services depends on the benefit and the cost of intimacy with local markets. The benefit is that intimacy allows the bank to create more liquidity. The cost is that it causes inefficiencies in internal capital markets, on which a multinational bank relies to allocate liquidity across countries. Capital requirements affect this trade-off by influencing the degree of inefficiency in internal capital markets.
    Keywords: incomplete financial contracting, cross-border financial services, multinational banking, liquidity allocation, capital regulation

    Behold the 'Behemoth'. The privatization of Japan Post Bank

    This paper analyzes the privatization process of the Japan Post Bank (JPB), the largest bank in the world. We report some evidence in favour of the "political view" of state-owned banks and argue that, before privatization, postal savings banks served as vehicles for politicians to reallocate funds in exchange for private rents. We ask why politicians in Japan decided to privatize the postal savings system, predict how the privatization will proceed, and study the expected results of the privatization process. We argue that there will be no level playing field in bank competition after the start of the privatization process and discuss possible outcomes of JPB privatization for financial stability in Japan.
    Keywords: Public banking, Japan, Privatization, Postal savings banks

    Alternative fuels from Biomass and Power (PBtL): A case study on process options, technical potentials, fuel costs and ecological performance

    Greenhouse gas emissions in the transport sector can be significantly reduced by replacing fossil-based fuels with biomass-based alternatives. Several promising second-generation fuel production paths based on residues and waste wood have been developed in recent years. These fuel concepts typically suffer from the inherently limited technical potential of biomass resources in central Europe; furthermore, fuel costs are currently not competitive on the market. To change this, the German Aerospace Center has refined existing Biomass-to-Liquid (BtL) and Power-to-Liquid (PtL) concepts into the so-called Power&Biomass-to-Liquid (PBtL) concept. The main idea is to utilize the large technical exploitation potential of renewable electricity in modified BtL plants. The case study presents detailed results on promising process configurations of Fischer-Tropsch PBtL concepts based on different gasifier and electrolyzer technologies, the expectable technical fuel potential, fuel production costs and CO2 footprint. As a result, the fuel output of BtL plants can be nearly quadrupled at the same biomass input. Hence, fuel costs can be significantly reduced due to economies of scale. Furthermore, the specific CO2 footprint of the fuel is reduced as well.

    Evaluation of different calibration strategies for large scale continuous hydrological modelling

    For the analysis of climate impact on flood flows and flood frequency in macroscale river basins, hydrological models can be forced by several sets of hourly long-term climate time series. Considering the large number of model units, the small time step and the required recalibrations for different model forcings, an efficient calibration strategy and optimisation algorithm are essential. This study investigates the impact of different calibration strategies and different optimisation algorithms on the performance and robustness of a semi-distributed model. The calibration strategies were (a) Lumped, (b) 1-Factor, (c) Distributed and (d) Regionalisation; the latter uses catchment characteristics and estimates parameter values via transfer functions. These methods were applied in combination with three different optimisation algorithms: PEST, DDS and SCE. In addition to the standard temporal evaluation of the calibration strategies, a spatial evaluation was applied by transferring the parameters from calibrated catchments to uncalibrated ones and validating the model performance on these uncalibrated catchments. The study was carried out for five sub-catchments of the Aller-Leine River Basin in Northern Germany. The best result for temporal evaluation was achieved by combining the DDS optimisation with the Distributed strategy, while the Regionalisation method showed the weakest temporal performance. For spatial evaluation, however, the Regionalisation strategy yielded the most robust models, closely followed by the Lumped method; the 1-Factor and the Distributed strategies showed clear disadvantages regarding spatial parameter transferability. For parameter estimation based on catchment descriptors, as required for ungauged basins, the Regionalisation strategy seems to be a promising tool, particularly for climate impact analysis and for hydrological modelling in general.
    Funding: Ministry for Science and Culture of Lower Saxony
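    The DDS algorithm compared in the study (Dynamically Dimensioned Search) can be sketched as a greedy search that perturbs a dynamically shrinking random subset of parameters. The following is a minimal illustrative sketch; the toy objective, bounds, and tuning values stand in for the hydrological model and its performance criterion, and are not taken from the study.

    ```python
    import numpy as np

    # Hedged sketch of a DDS-style calibration loop. The objective below is
    # a placeholder; in a real calibration it would be a model-performance
    # criterion such as 1 minus the Nash-Sutcliffe efficiency.
    rng = np.random.default_rng(0)

    def objective(params):
        # Toy criterion (lower is better); optimum at params = 0.3.
        return np.sum((params - 0.3) ** 2)

    lo, hi = np.zeros(4), np.ones(4)       # hypothetical parameter bounds
    x_best = rng.uniform(lo, hi)
    f_best = objective(x_best)
    max_iter, r = 500, 0.2                 # r: perturbation size (fraction of range)

    for i in range(1, max_iter + 1):
        # Probability of perturbing each dimension decreases over time,
        # narrowing the search from global to increasingly local.
        p = 1.0 - np.log(i) / np.log(max_iter)
        perturb = rng.random(lo.size) < p
        if not perturb.any():
            perturb[rng.integers(lo.size)] = True
        x_new = x_best.copy()
        x_new[perturb] += rng.normal(0.0, r * (hi - lo)[perturb])
        # Reflect at the bounds, then clip as a safeguard.
        x_new = np.clip(np.where(x_new < lo, 2 * lo - x_new,
                        np.where(x_new > hi, 2 * hi - x_new, x_new)), lo, hi)
        f_new = objective(x_new)
        if f_new < f_best:                 # greedy acceptance
            x_best, f_best = x_new, f_new

    print(f_best)   # close to 0 for this toy objective
    ```

    The single-solution, greedy structure is what makes DDS cheap per iteration, which matters when each objective evaluation is a full continuous model run.
    
    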

    Development of Baltic cod eggs at different levels of temperature and oxygen content

    The influence of ambient temperature (2-7°C) and oxygen level (1.0-8.3 ml·l⁻¹) on the development of Baltic cod eggs was investigated in laboratory experiments. The incubation period, i.e. the time from fertilization to 50% hatching, decreased from 27.5 days at 2°C to 13.0 days at 7°C. Reduced oxygen levels did not significantly affect the time of hatching. Throughout the incubation period, the highest mortality rates were found during gastrulation and immediately prior to hatching at all tested oxygen levels. Egg survival decreased from around 30% at an oxygen level of 8 ml·l⁻¹ to less than 10% at 2 ml·l⁻¹. At oxygen concentrations below 2 ml O₂·l⁻¹, development ceased at a very early stage. Field observations revealed that in past years Baltic cod eggs were most abundant below the halocline, at depths with unfavourable oxygen conditions. Besides the effect on egg survival, low environmental oxygen may also affect the initial viability of larvae and consequently their ability to approach the feeding areas close to the sea surface. Thus, the effective reproduction volume of water for cod in the central Baltic may have been smaller than expected, and it is suggested that oxygen depletion was the limiting factor determining the reproductive success of cod in this area during the last decade.
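    As a worked summary of the reported temperature dependence, the two incubation times imply a Q10 temperature coefficient (a standard rate-scaling measure; the value is derived here from the abstract's numbers, not reported in the abstract itself):

    ```python
    # Q10 implied by the reported incubation periods of Baltic cod eggs:
    # 27.5 days at 2 °C versus 13.0 days at 7 °C. Development rate is
    # proportional to 1/time, so Q10 = (t1/t2) ** (10 / (T2 - T1)).
    t1, t2 = 27.5, 13.0      # incubation periods (days)
    T1, T2 = 2.0, 7.0        # temperatures (degrees C)
    q10 = (t1 / t2) ** (10.0 / (T2 - T1))
    print(round(q10, 2))     # development rate roughly quadruples per 10 °C
    ```
    
    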

    Preliminary investigations on high energy electron beam tomography

    In computed tomography (CT), cross-sectional images of the attenuation distribution within a slice are created by scanning radiographic projections of an object with a rotating X-ray source-detector compound and subsequently reconstructing the images from these projection data on a computer. CT can be made very fast by employing a scanned electron beam instead of a mechanically moving X-ray source. This principle has now been extended towards high-energy electron beam tomography with an electrostatic accelerator. To this end, a dedicated experimental campaign was planned and carried out at the Budker Institute of Nuclear Physics (BINP), Novosibirsk, where we investigated the capabilities of BINP's accelerators as the electron beam generating and scanning unit of a potential high-energy electron beam tomography device. The setup was based on a 1 MeV ELV-6 (BINP) electron accelerator and a single detector. Besides tomographic measurements with different phantoms, further experiments were carried out concerning the focal spot size and repeat accuracy of the electron beam as well as the detector's response time and signal-to-noise ratio.

    Nonlinear Multidimensional Bayesian Estimation with Fourier Densities

    Efficiently implementing nonlinear Bayesian estimators is still an unsolved problem, especially in the multidimensional case, and a trade-off between estimation quality and demand on computational resources has to be found. Using multidimensional Fourier series as a representation for probability density functions, so-called Fourier densities, is proposed. To ensure non-negativity, the approximation is performed indirectly via Psi-densities, whose absolute square represents the Fourier density. It is shown that Psi-densities can be determined using the efficient fast Fourier transform algorithm and that their coefficients have an ordering with respect to the Hellinger metric. Furthermore, the multidimensional Bayesian estimator based on Fourier densities is derived in closed form. This allows an efficient realization of the Bayesian estimator in which the demands on computational resources are adjustable.
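    The indirect approximation via Psi-densities can be illustrated in one dimension. The sketch below assumes a periodic grid, an illustrative density, and an arbitrary truncation order; it only shows the non-negativity mechanism (squaring a truncated Fourier series of the square root), not the full estimator.

    ```python
    import numpy as np

    # Minimal 1-D sketch of the Fourier-density idea: approximate a density
    # p(x) on [0, 2*pi) indirectly via psi(x) = sqrt(p(x)), so that the
    # reconstruction |psi_approx|^2 is non-negative by construction even
    # after truncating the Fourier series. Grid size and truncation order
    # are illustrative choices.
    N = 256
    x = np.linspace(0.0, 2 * np.pi, N, endpoint=False)

    # Example density: a von Mises-like bump, normalized on the grid.
    p = np.exp(3.0 * np.cos(x))
    p /= p.sum() * (2 * np.pi / N)

    psi = np.sqrt(p)                  # Psi-density (square root of p)
    coeffs = np.fft.fft(psi) / N      # Fourier coefficients of psi via FFT

    K = 8                             # keep only low-order coefficients
    mask = np.zeros(N, dtype=bool)
    mask[:K + 1] = True
    mask[-K:] = True
    coeffs_trunc = np.where(mask, coeffs, 0.0)

    psi_approx = np.fft.ifft(coeffs_trunc) * N
    p_approx = np.abs(psi_approx) ** 2    # non-negative by construction

    print(np.all(p_approx >= 0))              # True
    print(p_approx.sum() * (2 * np.pi / N))   # close to 1
    ```

    Truncating psi rather than p is the key step: a truncated Fourier series of p itself could oscillate below zero, while the squared magnitude here cannot.
    
    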

    Closed-Form Prediction of Nonlinear Dynamic Systems by Means of Gaussian Mixture Approximation of the Transition Density

    Recursive prediction of the state of a nonlinear stochastic dynamic system cannot be performed efficiently in general, since the complexity of the probability density function characterizing the system state increases with every prediction step. Representing the density in an exact closed-form manner thus becomes too complex or even impossible, so an appropriate approximation of the density is required. Instead of directly approximating the predicted density, we propose approximating the transition density by means of Gaussian mixtures. We treat the approximation task as an optimization problem that is solved offline via progressive processing to bypass initialization problems and to achieve high-quality approximations. Once the transition density approximation has been calculated offline, prediction can be performed efficiently, resulting in a closed-form density representation with constant complexity.
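    The closed-form prediction step can be sketched for a scalar state, assuming the transition density has already been approximated offline by an axis-aligned Gaussian mixture. All component parameters and weights below are illustrative placeholders, not values from the paper's optimization.

    ```python
    import numpy as np

    # Sketch: if the transition density is approximated as
    #   f(x_next | x) ≈ sum_j w_j * N(x_next; a_j, A_j) * N(x; b_j, B_j)
    # and the prior is a Gaussian mixture p(x) = sum_i v_i * N(x; m_i, S_i),
    # then the marginalization integral factorizes and the predicted density
    # is again a Gaussian mixture, obtained in closed form.

    def gauss(x, mu, var):
        return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

    # Hypothetical transition-density components (as if from offline fitting)
    w = np.array([0.5, 0.5])
    a, A = np.array([1.0, -1.0]), np.array([0.2, 0.2])   # x_next factors
    b, B = np.array([0.8, -0.8]), np.array([0.3, 0.3])   # x factors

    # Prior mixture (single Gaussian here for brevity)
    v = np.array([1.0])
    m, S = np.array([0.5]), np.array([0.4])

    # Closed form: weight of component (i, j) uses the Gaussian product
    # identity  ∫ N(x; b_j, B_j) N(x; m_i, S_i) dx = N(b_j; m_i, B_j + S_i).
    weights, means, vars_ = [], [], []
    for j in range(len(w)):
        for i in range(len(v)):
            weights.append(v[i] * w[j] * gauss(b[j], m[i], B[j] + S[i]))
            means.append(a[j])
            vars_.append(A[j])
    weights = np.array(weights)
    weights /= weights.sum()   # renormalize the predicted mixture

    x = np.linspace(-4.0, 4.0, 801)
    dx = x[1] - x[0]
    p_pred = sum(wk * gauss(x, mu, var)
                 for wk, mu, var in zip(weights, means, vars_))
    print(p_pred.sum() * dx)   # integrates to approximately 1
    ```

    The component count of the predicted mixture is fixed by the offline approximation, which is what keeps the per-step complexity constant.
    
    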

    Approximate Nonlinear Bayesian Estimation Based on Lower and Upper Densities

    Recursive calculation of the probability density function characterizing the state estimate of a nonlinear stochastic dynamic system cannot in general be performed exactly, since the type of the density changes with every processing step and the complexity increases. Hence, an approximation of the true density is required. Instead of using a single complicated approximating density, this paper is concerned with bounding the true density from below and from above by means of two simple densities. This provides a kind of guaranteed estimator with respect to the underlying true density, which requires a mechanism for ordering densities. Here, a partial ordering with respect to the cumulative distributions is employed. Based on this partial ordering, a modified Bayesian filter step is proposed, which recursively propagates lower and upper density bounds. A specific implementation for piecewise linear densities with finite support is used to demonstrate the performance of the new approach in simulations.
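    The CDF-based partial ordering can be illustrated with three simple densities on [0, 1]; these are illustrative choices, not the paper's piecewise-linear implementation.

    ```python
    import numpy as np

    # Sketch of the partial ordering of densities via their cumulative
    # distributions: q bounds p from above if F_q(x) >= F_p(x) for all x.
    x = np.linspace(0.0, 1.0, 1001)

    F_true  = x ** 2    # CDF of the "true" density p(x) = 2x on [0, 1]
    F_upper = x         # uniform density; its CDF dominates F_true
    F_lower = x ** 3    # density p(x) = 3x^2; its CDF lies below F_true

    # The bounding relation: F_lower(x) <= F_true(x) <= F_upper(x) everywhere.
    print(bool(np.all(F_lower <= F_true) and np.all(F_true <= F_upper)))  # True
    ```

    Because the ordering compares cumulative distributions rather than density values pointwise, bounding densities can remain simple (e.g. piecewise linear) even when the true density is complicated.
    
    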