Banking Relations, Competition and Research Incentives
When banks incur sunk costs to acquire ex-ante information about customers, exclusive banking relations will emerge under intense price competition when monitoring costs are low. When monitoring costs are sufficiently high, only non-monitored finance will be provided, typically by multiple lenders. While multiple lending generally is (second-best) efficient when it emerges, relationship lending typically is not. In our framework, the informational rents in relationships with a single financier (house bank) typically exceed the risk premium required for financing projects from the unscreened pool of applicants. Accordingly, when entrepreneurs can affect repayment probabilities through sunk ex-ante investments prior to the financing stage, investment incentives in a house-bank regime are typically lower than under competitive non-monitored lending.

Keywords: relationship banking, multiple lending, monitoring, research incentives
Stereo Matching in the Presence of Sub-Pixel Calibration Errors
Stereo matching commonly requires rectified images computed from calibrated cameras. Since all underlying parametric camera models are only approximations, calibration and rectification will never be perfect. Additionally, it is very hard to keep the calibration perfectly stable in application scenarios with large temperature changes and vibrations. We show that even small calibration errors of a quarter of a pixel are severely amplified on certain structures. We discuss a robotics example and a driver assistance example in which sub-pixel calibration errors cause severe problems. We propose a filter solution based on signal theory that removes critical structures and makes stereo algorithms less sensitive to calibration errors. Our approach does not aim to correct the decalibration, but rather to avoid amplifications and mismatches. Experiments on ten stereo pairs with ground truth and simulated decalibrations, as well as on images from robotics and driver assistance scenarios, demonstrate the success and the limitations of our solution, which can be combined with any stereo method.
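The amplification effect described above can be illustrated with a minimal NumPy sketch (my own illustration, not the paper's filter): a quarter-pixel vertical decalibration leaves coarse vertical structure nearly unchanged but severely corrupts near-Nyquist structure, which is exactly the kind of critical structure a band-limiting filter would remove.

```python
import numpy as np

def shift_rows(img, dy):
    """Shift an image vertically by a sub-pixel amount via linear interpolation."""
    rows = np.arange(img.shape[0])
    src = np.clip(rows - dy, 0, img.shape[0] - 1)
    lo = np.floor(src).astype(int)
    hi = np.minimum(lo + 1, img.shape[0] - 1)
    w = src - lo
    return (1 - w)[:, None] * img[lo] + w[:, None] * img[hi]

y = np.arange(64)[:, None] + np.zeros((1, 64))
fine = np.sin(2 * np.pi * y / 2.5)      # near-Nyquist vertical frequency
coarse = np.sin(2 * np.pi * y / 16.0)   # low vertical frequency

# A 0.25 px vertical decalibration: per-pixel error on fine vs coarse structure.
for name, img in [("fine", fine), ("coarse", coarse)]:
    err = np.abs(shift_rows(img, 0.25) - img).mean()
    print(name, round(float(err), 3))
```

The error on the fine pattern is several times larger than on the coarse one, which is why suppressing the highest vertical frequencies makes matching tolerant to small decalibrations.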
Stock price informativeness, cross-listings and investment decisions
In this paper, the authors show that a cross-listing allows a firm to make better investment decisions because it enhances stock price informativeness.

Keywords: cross-listings; cross-listings premium; price informativeness; investment decisions; flow-back; ownership
Information Sharing in Banking: A Collusive Device?
We show that information sharing among banks may serve as a collusive device. An information sharing agreement is an a priori commitment to reduce informational asymmetry between banks in future lending. Hence, information sharing agreements tend to increase the intensity of competition in future periods and, thus, reduce the value of informational rents earned in current competition. We contribute to the existing literature by emphasizing that this reduction in informational rents also softens competition in the current period, thereby reducing competitive pressure in current credit markets. We provide a large class of economic environments in which a ban on information sharing is strictly preferred by society.
Two at the Top: Quality Differentiation in Markets with Switching Costs
We explore the effects of switching costs on the subgame-perfect quality decisions of oligopolists engaged in repeated price competition. We establish a strong strategic quality premium. We show that competition for the establishment of customer relationships eliminates low-quality firms in period 1 and that low-quality firms can survive only on poaching profits. The equilibrium configuration is characterized by an agglomeration of two top-quality providers as soon as switching-cost heterogeneity is sufficiently large. We demonstrate a finiteness property according to which the two top-quality firms dominate the market with a joint market share exceeding 50%.

Keywords: quality choice; switching costs; poaching; natural oligopoly
Venture Cycles: Theory and Evidence
We demonstrate how endogenous information acquisition in venture capital markets creates investment cycles when competing financiers undertake their screening decisions in an uncoordinated way, highlighting the role of intertemporal screening externalities induced by competition among venture capitalists as a structural source of instability. We show that the uncoordinated screening behavior of competing financiers is an independent source of fluctuations inducing venture investment cycles. We also empirically document cyclical features in a number of industries, such as biotechnology, electronics, financial services, healthcare, medical services and consumer products.

Keywords: screening, venture capital, investment cycles
Focus Is All You Need: Loss Functions For Event-based Vision
Event cameras are novel vision sensors that output pixel-level brightness changes ("events") instead of traditional video frames. These asynchronous sensors offer several advantages over traditional cameras, such as high temporal resolution, very high dynamic range, and no motion blur. To unlock the potential of such sensors, motion compensation methods have recently been proposed. We present a collection and taxonomy of twenty-two objective functions to analyze event alignment in motion compensation approaches (Fig. 1). We call them Focus Loss Functions since they have strong connections with functions used in traditional shape-from-focus applications. The proposed loss functions allow bringing mature computer vision tools to the realm of event cameras. We compare the accuracy and runtime performance of all loss functions on a publicly available dataset, and conclude that the variance, the gradient magnitude, and the Laplacian magnitude are among the best loss functions. The applicability of the loss functions is shown on multiple tasks: rotational motion, depth, and optical flow estimation. The proposed focus loss functions allow us to unlock the outstanding properties of event cameras.

Comment: 29 pages, 19 figures, 4 tables
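The three losses the abstract singles out can be sketched in a few lines. This is my own minimal NumPy implementation, not the authors' code: each loss scores an image of warped events (IWE), and a well-aligned IWE is sharp (events pile up), so all three scores come out higher than for a misaligned, spread-out one.

```python
import numpy as np

def variance_loss(iwe):
    # Sample variance of the event-count image.
    return iwe.var()

def gradient_magnitude_loss(iwe):
    # Mean squared gradient magnitude via finite differences.
    gy, gx = np.gradient(iwe)
    return np.mean(gx**2 + gy**2)

def laplacian_magnitude_loss(iwe):
    # Mean squared 5-point Laplacian (periodic boundaries for simplicity).
    lap = (np.roll(iwe, 1, 0) + np.roll(iwe, -1, 0)
           + np.roll(iwe, 1, 1) + np.roll(iwe, -1, 1) - 4 * iwe)
    return np.mean(lap**2)

sharp = np.zeros((32, 32)); sharp[16, 16] = 9.0   # aligned: events accumulate
blurred = np.full((32, 32), 9.0 / (32 * 32))      # misaligned: events spread out

for loss in (variance_loss, gradient_magnitude_loss, laplacian_magnitude_loss):
    print(loss.__name__, loss(sharp) > loss(blurred))
```

Maximizing any of these scores over the motion parameters that warp the events is the "focus maximization" idea the taxonomy organizes.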
MicroPoem: experimental investigation of birch pollen emissions
Diseases due to aeroallergens have increased steadily over recent decades and affect more and more people. Adequate protective and pre-emptive measures require both a reliable assessment of the production and release of the various pollen species and the forecasting of their atmospheric dispersion. Pollen forecast models, which may be based either on statistical knowledge or on full physical transport and dispersion modeling, can provide pollen forecasts with full spatial coverage. Such models are currently being developed in many countries. The most important shortcoming of these pollen transport systems is the description of emissions, namely the dependence of the emission rate on physical processes such as turbulent exchange or mean transport and on biological processes such as ripening (temperature) and preparedness for release. Thus the quantification of pollen emissions and the determination of the governing mesoscale and micrometeorological factors are the subject of the present project MicroPoem, which includes experimental field work as well as numerical modeling. The overall goal of the project is to derive an emission parameterization based on meteorological parameters, eventually leading to improved pollen forecasts. In order to have a well-defined source location, an isolated birch stand was chosen for the set-up of a 'natural tracer experiment', which was conducted during the birch pollen season in spring 2009. The site was located in a broad valley where a mountain-plains wind system usually becomes effective during clear-weather periods. This allowed us to assume a rather persistent wind direction and considerable wind speed during both day- and nighttime. Several micrometeorological towers were operated up- and downwind of this reference source, and an array of 26 pollen traps was laid out to observe the spatio-temporal variability of pollen concentrations.

Additionally, the lower boundary layer was probed by means of a sodar and a tethered balloon system (the latter also yielding a pollen concentration profile). In the present contribution a project overview is given and first results are presented. Emphasis is put on the relative performance of different sampling technologies and the corresponding relative calibration in the lab and in the field. The concentration distribution downwind of the birch stand exhibits significant spatial (and temporal) variability. Small-scale numerical dispersion modeling will be used to infer the emission characteristics that best explain the observed concentration patterns.
An evaluation framework for stereo-based driver assistance
This is the post-print version of the article. Copyright @ 2012 Springer Verlag.

The accuracy of stereo algorithms or optical flow methods is commonly assessed by comparing the results against the Middlebury database. However, equivalent data for automotive or robotics applications rarely exist, as they are difficult to obtain. As our main contribution, we introduce an evaluation framework tailored for stereo-based driver assistance that is able to deliver excellent performance measures while circumventing manual labeling effort. Within this framework one can combine several ways of ground-truthing, different comparison metrics, and large image databases. Using our framework we show examples of several ground-truthing techniques: implicit ground truthing (e.g. a sequence recorded without a crash occurring), robotic vehicles with high-precision sensors, and, to a small extent, manual labeling. To show the effectiveness of our evaluation framework we compare three different stereo algorithms at pixel and object level. In more detail, we evaluate an intermediate representation called the Stixel World. Besides evaluating the accuracy of the Stixels, we investigate the completeness (equivalent to the detection rate) of the Stixel World versus the number of phantom Stixels. Among many findings, using this framework enabled us to reduce the number of phantom Stixels by a factor of three compared to the base parametrization, which had already been optimized by test-driving vehicles for distances exceeding 10,000 km.
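The two evaluation quantities mentioned for the Stixel World, completeness (detection rate) and phantom count, can be sketched as follows. Representing Stixels as sets of image columns is my own simplification for illustration, not the paper's data structure.

```python
def completeness(detected, ground_truth):
    """Fraction of ground-truth obstacle columns covered by a detected Stixel."""
    return len(ground_truth & detected) / len(ground_truth)

def phantom_count(detected, ground_truth):
    """Number of detected Stixels without ground-truth support ("phantoms")."""
    return len(detected - ground_truth)

gt = {3, 4, 5, 6, 7}   # columns containing a true obstacle
det = {4, 5, 6, 9}     # detected Stixel columns; column 9 is a phantom

print(completeness(det, gt))   # → 0.6
print(phantom_count(det, gt))  # → 1
```

Tuning a parametrization then means trading these off: raising detection thresholds lowers the phantom count but also the completeness, which is the trade the abstract's factor-of-three result refers to.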
Positiveness and Pauli exclusion principle in raw Bloch equations for quantum boxes
The aim of this paper is to derive a raw Bloch model for the interaction of light with quantum boxes in the framework of a two-electron-species (conduction and valence) description. This requires a good understanding of the one-species case and of the treatment of level degeneracy. In contrast with some of the existing literature, we obtain a Liouville equation which ensures the positiveness and boundedness of solutions, properties that are necessary for future mathematical studies involving higher-order phenomena.
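For readers unfamiliar with the setting, the positivity property at stake can be illustrated on a generic Liouville (von Neumann) equation; this is a standard textbook form, not necessarily the exact model derived in the paper:

```latex
% Generic Liouville (von Neumann) equation for the density matrix \rho:
i\hbar\,\partial_t \rho(t) = [H(t), \rho(t)], \qquad \rho(0) = \rho_0 .
% Its solution is a unitary conjugation, \rho(t) = U(t)\,\rho_0\,U(t)^\dagger,
% so if \rho_0 is positive semidefinite with eigenvalues in [0,1], the same
% holds for all t: occupation numbers stay in [0,1], consistent with the
% Pauli exclusion principle.
```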