Relationship banking and the credit market in India: An empirical analysis
Relationship banking based on Okun's "customer credit markets" has important implications for monetary policy via the credit transmission channel. Studies of LDC credit markets from this point of view are scant, and this paper attempts to address this lacuna. Relationship banking implies short-term disequilibrium in credit markets, suggesting the VECM (vector error-correction model) as an appropriate framework for analysis. We develop VECM models in the Indian context (for the period April 1991 to December 2004, using monthly data) to analyse salient features of the credit market. An analysis of the ECMs (error-correction mechanisms) reveals that disequilibrium in the Indian credit market is adjusted via demand responses rather than supply responses, which is in accordance with the customer view of credit markets. Further light on the working of the model is obtained through the "generalized" impulse responses and "generalized" error decompositions (both of which are independent of the variable ordering). Our conclusions point towards firms using short-term credit as a liquidity buffer. This fact, together with the gradual adjustment exhibited by the "persistence profiles", provides substantive evidence in favour of "customer credit markets".
Keywords: customer credit markets, monetary policy, co-integration, impulse response, persistence profiles
"RELATIONSHIP BANKING" AND THE CREDIT MARKET IN INDIA : AN EMPIRICAL ANALYSIS
Relationship banking based on Okun's "customer credit markets" has important implications for monetary policy via the credit transmission channel. Studies of LDC credit markets from this point of view seem to be scanty and this paper attempts to address this lacuna. Relationship banking implies short-term disequilibrium in credit markets, suggesting the VECM (vector error-correction model) as an appropriate framework for analysis. We develop VECM models in the Indian context (for the period April 1991- December 2004 using monthly data) to analyse salient features of the credit market. An analysis of the ECMs (error-correction mechanisms) reveals that disequilibrium in the Indian credit market is adjusted via demand responses rather than supply responses, which is in accordance with the customer view of credit markets. Further light on the working of the model is obtained through the "generalized" impulse responses and "generalized" error decompositions (both of which are independent of the variable ordering). Our conclusions point towards firms using short-term credit as a liquidity buffer. This fact, together with the gradual adjustment exhibited by the "persistence profiles" provides substantive evidence in favour of "customer credit markets".
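A minimal sketch of the estimation step, using the VECM implementation in Python's statsmodels on synthetic data; the series names, lag order, and deterministic terms are illustrative assumptions, not the paper's specification. The alpha matrix holds the error-correction (adjustment) coefficients that underpin the demand-side adjustment finding:

# Minimal sketch of the paper's VECM approach on synthetic data.
# The paper's actual series are monthly Indian credit-market aggregates
# (April 1991 - December 2004); names and lag order here are assumptions.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

rng = np.random.default_rng(0)
n = 165  # the number of monthly observations in the sample period

# Two cointegrated series standing in for the credit-market variables:
common_trend = np.cumsum(rng.normal(size=n))
credit = common_trend + rng.normal(scale=0.5, size=n)
output = 0.8 * common_trend + rng.normal(scale=0.5, size=n)
data = pd.DataFrame({"credit": credit, "output": output})

# Choose the cointegration rank with the Johansen trace test.
rank = select_coint_rank(data, det_order=0, k_ar_diff=2, signif=0.05)

# Estimate the VECM; res.alpha holds the error-correction (adjustment)
# coefficients -- in the paper, their pattern shows disequilibrium being
# corrected mainly from the demand side.
model = VECM(data, k_ar_diff=2, coint_rank=max(rank.rank, 1),
             deterministic="co")
res = model.fit()
print("adjustment coefficients (alpha):\n", res.alpha)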
Are object detection assessment criteria ready for maritime computer vision?
Maritime vessels equipped with visible and infrared cameras can complement other conventional sensors for object detection. However, the application of computer vision techniques in the maritime domain has received attention only recently. The maritime environment poses its own unique requirements and challenges. Assessment of the quality of detections is a fundamental need in computer vision; however, the conventional assessment metrics suitable for generic object detection are deficient in the maritime setting. Thus, a large body of related work in computer vision appears inapplicable to the maritime setting at first sight. We discuss the problem of defining assessment metrics suitable for maritime computer vision and consider new bottom edge proximity metrics for this purpose. These metrics indicate that existing computer vision approaches are indeed promising for maritime computer vision and can play a foundational role in this emerging field.
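The abstract does not spell out the metric's exact form, but a minimal sketch of a bottom-edge-proximity-style score, assuming the intuition that the bottom edge of a box (where a vessel meets the waterline) matters more than full-box overlap, might look as follows (the box format and the penalty term are illustrative assumptions):

# Illustrative sketch of a "bottom edge proximity" style metric.
# Boxes are (x1, y1, x2, y2) with y growing downward, so y2 is the
# bottom edge. The exact definition used in the paper may differ.

def bottom_edge_proximity(det, gt):
    """Score in [0, 1]: horizontal overlap of the two bottom edges,
    discounted by their vertical separation (assumed form)."""
    dx1, _, dx2, dy2 = det
    gx1, _, gx2, gy2 = gt
    overlap = max(0.0, min(dx2, gx2) - max(dx1, gx1))
    union = max(dx2, gx2) - min(dx1, gx1)
    if union <= 0:
        return 0.0
    horizontal = overlap / union
    # Penalise vertical offset relative to the ground-truth width.
    vertical = max(0.0, 1.0 - abs(dy2 - gy2) / max(gx2 - gx1, 1e-6))
    return horizontal * vertical

# A detection whose bottom edge sits close to the ground truth scores high:
print(bottom_edge_proximity((12, 30, 58, 81), (10, 28, 60, 80)))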
Polygonal Approximation of Digital Planar Curve Using Novel Significant Measure
This chapter presents an iterative smoothing technique for polygonal approximation of a digital image boundary. The technique starts with the finest initial segmentation points of a curve. The contribution of each initially segmented point toward preserving the original shape of the image boundary is determined by computing a significant measure for every initial segmentation point; this measure is sensitive to sharp turns, which are easily missed when conventional significant measures are used for detecting dominant points. The proposed method distinguishes between the situation in which a curve point lying between two segmentation points projects directly onto the connecting line segment and the situation in which it projects beyond that segment, as shown in the sketch below. It not only identifies these situations but also computes the point's significant contribution differently in each. This situation-specific treatment allows preservation of high-curvature points even as a revised set of dominant points is derived. Moreover, the technique may find application in parallel manipulators for detecting the target boundary of an image at varying scale. The experimental results show that the proposed technique competes well with state-of-the-art techniques.
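A minimal sketch of the projection case split described above, assuming the standard point-to-segment construction (the chapter's full significant measure is more elaborate; names here are illustrative):

# Deviation of a curve point C from the chord AB, split into the two
# cases the chapter distinguishes: perpendicular foot on the segment
# versus beyond an endpoint.
import math

def deviation(a, b, c):
    """Distance from point c to segment ab, with the case labelled."""
    ax, ay = a; bx, by = b; cx, cy = c
    abx, aby = bx - ax, by - ay
    seg_len2 = abx * abx + aby * aby
    if seg_len2 == 0:
        return math.hypot(cx - ax, cy - ay), "degenerate"
    # Parameter t of the perpendicular foot along AB (t in [0,1] means
    # the foot lies on the segment).
    t = ((cx - ax) * abx + (cy - ay) * aby) / seg_len2
    if 0.0 <= t <= 1.0:
        fx, fy = ax + t * abx, ay + t * aby
        return math.hypot(cx - fx, cy - fy), "on-segment"
    # Foot lies beyond an endpoint: distance to the nearer endpoint.
    end = a if t < 0.0 else b
    return math.hypot(cx - end[0], cy - end[1]), "beyond-segment"

print(deviation((0, 0), (10, 0), (4, 3)))   # projects onto the segment
print(deviation((0, 0), (10, 0), (13, 2)))  # projects beyond endpoint B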
Some Inequalities on ‘Useful’ Mean g-deviation with Applications in Information Theory
The objective of this correspondence is to elaborate on some recent inequality results: we give a new improvement of the ‘useful’ Jensen's inequality, together with applications in information theory. An improvement of Jensen's inequality is provided for convex functions defined on a convex subset of a linear space. We provide robust lower bounds for the ‘useful’ mean deviation and ‘useful’ divergences, as well as for the ‘useful’ mean h-absolute deviation, and, lastly, we give applications of the divergence measures. Uniqueness for the ‘useful’ KL-divergence and ‘useful’ Jeffreys divergence is obtained.
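For orientation, an illustrative statement of the utility-weighted (‘useful’) form of Jensen's inequality that results of this kind build on, with probabilities $p_i$, utilities $u_i > 0$, and $f$ convex on a convex set containing the $x_i$; the exact refinement proved in the paper may differ:

% Utility-weighted ("useful") Jensen inequality, assumed form:
\[
  f\!\left(\frac{\sum_i u_i\, p_i\, x_i}{\sum_i u_i\, p_i}\right)
  \;\le\;
  \frac{\sum_i u_i\, p_i\, f(x_i)}{\sum_i u_i\, p_i},
\]
% with the corresponding "useful" mean h-absolute deviation:
\[
  D_h(x; p, u) \;=\;
  \frac{\sum_i u_i\, p_i\, h\!\bigl(\lvert x_i - \bar{x}_u \rvert\bigr)}
       {\sum_i u_i\, p_i},
  \qquad
  \bar{x}_u = \frac{\sum_i u_i\, p_i\, x_i}{\sum_i u_i\, p_i}.
\]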
Data Quality Management: Trade-offs in Data Characteristics to Maintain Data Quality
We are living in an age of information in which organizations are crumbling under the pressure of exponentially growing data. Increased data quality ensures better decision making, thereby enabling companies to stay competitive in the market. To improve data quality, it is imperative to identify all the characteristics that describe data. However, improving one characteristic can come at the cost of another, creating a trade-off. There are many well-established and interesting theories regarding data quality and data characteristics, yet we found a lack of research and literature on how trade-offs are handled between the different types of data stored by an organization. To understand how organizations deal with trade-offs, we chose a framework formulated by Eppler in which various data-characteristic trade-offs are discussed. After a pre-study with experts in this field, we narrowed it down to three main data-characteristic trade-offs, which were further analysed through interviews. Based on the interviews conducted and the literature review, we could prioritize data types under different data characteristics. This research gives insight into how data-characteristic trade-offs should be handled in organizations.
Selection of Combat Aircraft by Using Shannon Entropy and VIKOR Method
The selection of military defense equipment, especially fighter aircraft, has a bearing on the readiness of the Indian Air Force to defend the country's independence. This study analyses a collection of alternative fighter aircraft linked to several choice factors using a multiple-criteria decision-making analysis technique. A variety of such criterion decision analysis techniques can be used to handle these scenarios and make wise design judgements. In this study, we assess fifth-generation fighter aircraft that incorporate significant 21st-century technological advancements. These aircraft represent the state of the art in fleet planning operations to 2022. They are generally equipped with quick-moving airframes, highly integrated computer systems, superior avionics features, networking with other battlefield elements, situational awareness, and command, control, and other communication capabilities. The study proposes a strategy for the selection of the fifth-generation combat aircraft for the National Air Force. Because of these problems, the Army needed an application that could assist with decision-making for combat selection systems. Solving the decision problem of evaluating fifteen military fighter alternatives in terms of nine decision criteria is the main objective of this work. We use the Shannon entropy and VIKOR model for the Air Force's fleet program to evaluate military fighter aircraft suitability. The entropy technique is used to compute the weights of the criteria, and the VIKOR technique is then used to rank the fighter aircraft.
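A compact sketch of the two-step procedure, entropy weighting followed by VIKOR ranking, on a made-up decision matrix (the paper's fifteen aircraft and nine criteria are not reproduced; all criteria are treated as benefit-type here):

# Shannon entropy weights + VIKOR ranking on an illustrative matrix.
import numpy as np

X = np.array([
    [7.5, 8.0, 6.0],   # aircraft A
    [6.0, 9.0, 7.0],   # aircraft B
    [8.0, 6.5, 8.5],   # aircraft C
    [7.0, 7.0, 7.5],   # aircraft D
])
m, n = X.shape

# --- Shannon entropy weights ---
P = X / X.sum(axis=0)                          # column-wise proportions
E = -(P * np.log(P)).sum(axis=0) / np.log(m)   # entropy per criterion
w = (1 - E) / (1 - E).sum()                    # more dispersion -> more weight

# --- VIKOR ranking (compromise coefficient v = 0.5) ---
best, worst = X.max(axis=0), X.min(axis=0)     # benefit criteria
gap = w * (best - X) / (best - worst)          # weighted normalised regret
S, R = gap.sum(axis=1), gap.max(axis=1)        # group utility / max regret
v = 0.5
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))

print("criterion weights:", np.round(w, 3))
print("ranking (best first):", np.argsort(Q))  # lower Q is better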