Loan growth and loan quality: some preliminary evidence from Texas banks
Following the failures of depository institutions in the 1980s, many analysts concluded that the rapid growth of lending activity and the deterioration of loan quality were related. Robert T. Clair tests this relationship after separating loan growth by its source: increased lending to new or existing customers, bank mergers, and acquisitions of failed banks. The preliminary evidence suggests that additional lending to new or existing customers, beyond what might be normal at a given stage of the business cycle, lowers loan quality after a three-year lag. This relationship, based on evidence from Texas banks, was especially strong at banks with below-average capitalization.

Not all loan growth, however, will lead to lower loan quality. Loan growth during an economic expansion is to be expected as loan demand increases. Furthermore, well-capitalized banks were able to grow very rapidly and maintain loan quality.

One method of increasing lending while maintaining loan quality was the purchase of failed banks with the assistance of the Federal Deposit Insurance Corporation (FDIC). Of course, these purchases increased lending only for the acquiring banks and did not reflect an increase in total lending for the banking industry. Furthermore, it is possible that FDIC resolution procedures have discouraged the acquisition of weak but still solvent banks by stronger banks and are thereby slowing the rate of needed consolidation in the banking industry.
Bank failures ; Bank loans
The Clearing House Interbank Payments System: a description of its operation and risk management
Clearinghouses (Banking)
Daylight overdrafts: who really bears the risk?
Payment systems ; Federal Reserve banks ; Electronic funds transfers ; Overdrafts
Six causes of the credit crunch
Bank lending typically moves with the business cycle. In Texas from 1987 to 1992, however, bank loans declined while nonagricultural employment rose. Robert T. Clair and Paula Tucker consider this evidence of a constrained supply of bank loans, or credit crunch.

Clair and Tucker find that multiple factors have reduced banks' willingness and ability to supply loans. The resolution of failed banks and thrifts, tightening of bank examination standards, new capital requirements, new regulations and increased enforcement of old regulations, and increased exposure to lawsuits have each had an effect. Many of these regulatory changes were made to address important economic and social goals, but their side effects, often unintended and perhaps unavoidable, have been to reduce bank lending in the short run.
Credit
The Texas banking crisis and the payments system
The Federal Reserve System plays a crucial role in the payments system that is especially important during periods of financial turmoil. In this article, Robert Clair, Joanna Kolson, and Kenneth Robinson explain the process and the risks involved in clearing checks in the private sector. They compare these processes and risks with the essentially risk-free check-clearing service the Federal Reserve System offers. During banking crises, they hypothesize, banks will increase their check-clearing through the Federal Reserve to minimize their risk exposure. A model of Federal Reserve check-clearing volume is constructed and estimated. The empirical results show that during banking crises, Federal Reserve check-processing volume rises as banks seek safer methods of clearing checks. Consequently, Federal Reserve payment services are important tools in minimizing the disruptive effects of banking crises on the economy.
Banks and banking - Texas ; Payment systems
CAA study of airfoil broadband interaction noise using stochastic turbulent vorticity sources
The interaction of the turbulent wakes of the rotor with the outlet guide vanes is one of the main broadband noise sources in turbofan engines at approach conditions; hence its prediction and reduction are priorities for engine manufacturers. The development of numerical methods is required, as analytical approaches are limited to simple geometries and simplified flow configurations. The linearized Euler equations are solved in the time domain to model the response of an isolated airfoil interacting with turbulence that is stochastically synthesized and injected into the computational domain through vorticity sources. This new method of injection has the advantages of being easy to implement and parallelize in an existing solver, while the generated turbulence is frozen. The method is first validated on a 2D free-field configuration. It is then applied, in the framework of the Fan Stage Broadband Noise Benchmarking Programme, to a two-dimensional NACA 65(12)-10 airfoil with no angle of attack, and the results are validated through comparisons with experimental data. Afterwards, the effect of the angle of attack is studied, and the results suggest that a one-component turbulent model is not satisfactory for performing accurate acoustic predictions with an angle of attack, as it overestimates the rate of decay of the acoustic spectra at high frequencies. The study of the influence of the integral length scale of the turbulence confirms that the airfoil leading-edge response is modulated only by the incoming turbulence characteristics. Finally, the acoustic spectra predicted for different velocities show better agreement with a flat-plate analytical model when the velocity is increased.
Algorithms to automatically quantify the geometric similarity of anatomical surfaces
We describe new approaches for distances between pairs of 2-dimensional surfaces (embedded in 3-dimensional space) that use local structures and global information contained in inter-structure geometric relationships. We present algorithms to automatically determine these distances as well as geometric correspondences. This is motivated by the aspiration of students of natural science to understand the continuity of form that unites the diversity of life. At present, scientists using physical traits to study evolutionary relationships among living and extinct animals analyze data extracted from carefully defined anatomical correspondence points (landmarks). Identifying and recording these landmarks is time-consuming and can be done accurately only by trained morphologists. This renders these studies inaccessible to non-morphologists and causes phenomics to lag behind genomics in elucidating evolutionary patterns. Unlike other algorithms presented for morphological correspondences, our approach does not require any preliminary marking of special features or landmarks by the user. It also differs from other seminal work in computational geometry in that our algorithms are polynomial in nature and thus faster, making pairwise comparisons feasible for significantly larger numbers of digitized surfaces. We illustrate our approach using three datasets representing teeth and different bones of primates and humans, and show that it leads to highly accurate results.
Comment: Changes with respect to v1, v2: an Erratum was added, correcting the references for one of the three datasets. Note that the datasets and code for this paper can be obtained from the Data Conservancy (see Download column on v1, v2).