
    The Coexistence of Multiple Distributions Systems for Financial Services: The Case of Property-Liability Insurance

    Property-liability insurance is distributed by two different types of firms: those that distribute their product through independent agents, who represent more than one insurer, and direct writing insurers that distribute insurance through exclusive agents, who represent only one insurer. This paper analyzes the reasons for the long-term coexistence of the independent agency and direct writing distribution systems. Two primary hypotheses explain the coexistence of independent and exclusive agents. The market imperfections hypothesis suggests that firms that use independent agents survive while providing essentially the same service as firms using exclusive agents because of market imperfections such as price regulation, slow diffusion of information in insurance markets, or search costs that permit inefficient firms to survive alongside efficient firms. Efficient firms are expected to earn super-normal risk-adjusted profits, while inefficient firms will earn risk-adjusted profits closer to normal levels. The product quality hypothesis suggests that the higher costs of independent agents represent unobserved differences in product quality or service intensity, such as providing additional customer assistance with claims settlement, offering a greater variety of product choices, and reducing policyholder search costs. This hypothesis predicts normal risk-adjusted profits for both independent and exclusive agency firms.

    Because product quality in insurance is essentially unobserved, researchers have been unable to reach consensus on whether the market imperfections hypothesis or the product quality hypothesis is more consistent with the observed cost data. This lack of consensus leaves open the economic question of whether the market works well in minimizing product distribution costs, and leaves unresolved the policy issue of whether marketing costs in property-liability insurance are excessive and perhaps should receive regulatory attention. The authors propose a new methodology for distinguishing between market imperfections and product quality using frontier efficiency methods. They estimate both profit efficiency and cost efficiency for a sample of independent and exclusive agency insurers. Measuring profit efficiency helps to identify unobserved product quality differences because customers should be willing to pay extra for higher quality. This approach allows for the possibility that some firms may incur additional costs providing superior service and be compensated for these costs through higher revenues. Profit efficiency also implicitly incorporates the quality of loss control and risk management services, since insurers that more effectively control losses and manage risk should have higher average risk-adjusted profits but not necessarily lower costs than less effective insurers.

    The empirical results confirm that independent agency firms have higher costs on average than do direct writers. The principal finding of the study is that most of the average differential between the two groups of firms disappears in the profit function analysis. This is a robust result that holds both in the authors' tables of averages and in the regression analysis, and applies to both the standard and non-standard profit functions. Based on averages, the profit efficiency differential is at most one-third as large as the cost efficiency differential. Based on the regression analysis, the profit inefficiency differential is at most one-fourth as large as the cost inefficiency differential, and the profit inefficiency differential is not statistically significant in the more fully specified models that control for size, organizational form, and business mix. The results provide strong support for the product quality hypothesis and do not support the market imperfections hypothesis. The higher costs of independent agents appear to be due almost entirely to the provision of higher quality services, which are compensated for by additional revenues. A significant public policy implication is that regulatory decisions should not be based on costs alone. The authors' findings imply that marketing cost differentials among insurers are mostly attributable to service differentials rather than to inefficiency and therefore do not represent social costs. The profit inefficiency results show that there is room for improvement in both the independent agency and direct writing segments of the industry. However, facilitating competition is likely to be a more effective approach to increasing efficiency than restrictive price regulation.
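
    The abstract does not spell out the estimation details. As a rough illustration of how cost-frontier efficiency scores of the kind discussed above can be computed, the following Python sketch uses corrected OLS (COLS), a simpler stand-in for the stochastic frontier methods typical of this literature; the synthetic data and all variable names are hypothetical.

    import numpy as np

    # Corrected-OLS (COLS) cost-frontier sketch: regress log cost on log
    # output and a log input price, then score each firm's cost efficiency
    # against the best-practice (minimum-residual) firm.
    rng = np.random.default_rng(0)
    n = 200
    log_output = rng.normal(10, 1, n)        # log premium volume (hypothetical)
    log_wage = rng.normal(3, 0.2, n)         # log labor price (hypothetical)
    inefficiency = rng.exponential(0.15, n)  # one-sided cost inefficiency
    noise = rng.normal(0, 0.05, n)
    log_cost = 1.0 + 0.9 * log_output + 0.5 * log_wage + inefficiency + noise

    X = np.column_stack([np.ones(n), log_output, log_wage])
    beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)
    residuals = log_cost - X @ beta

    # Shift the fitted line so the best firm defines the frontier;
    # efficiency lies in (0, 1], with 1 = best practice.
    cost_efficiency = np.exp(residuals.min() - residuals)
    print(f"mean cost efficiency: {cost_efficiency.mean():.3f}")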

    New Methodological Developments for the International Comparison Program

    The paper explains the new methodology that was used in the 2005 International Comparison Program (ICP), which compared relative price levels and GDP levels across 146 countries. In this round of the ICP, the world was divided into 6 regions: OECD, CIS, Africa, South America, Asia Pacific, and West Asia. What is new in this round compared to previous rounds of the ICP is that each region was allowed to develop its own product list and collect prices on this list for countries in the region. The regions were then linked using a separate product list: 18 countries across the 6 regions collected prices for products on this list, and this information was used to link prices and quantities across the regions. An additional complication was that the final linking of prices and volumes across regions had to respect the regional price and volume measures that were (separately) constructed by the regions. The paper also studies the properties of the Iklé-Dikhanov-Balk multilateral system of index numbers, which was used by Africa.
    Keywords: index numbers, multilateral comparison methods, GEKS, EKS, Geary-Khamis, Balk, Dikhanov, Iklé, Country Product Dummy (CPD) method, basic headings, St
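
    The linking procedure itself is considerably more involved than the abstract can convey, but the GEKS method named in the keyword list is compact enough to sketch. The following Python functions (hypothetical names, illustrative data) compute GEKS price levels as geometric means over bilateral Fisher indices, which is the standard construction rather than anything specific to this paper.

    import numpy as np

    def fisher(p_a, p_b, q_a, q_b):
        """Bilateral Fisher price index of country a relative to country b."""
        laspeyres = (p_a @ q_b) / (p_b @ q_b)
        paasche = (p_a @ q_a) / (p_b @ q_a)
        return np.sqrt(laspeyres * paasche)

    def geks(prices, quantities, base=0):
        """GEKS price levels relative to `base` from (countries x products) arrays."""
        n = prices.shape[0]
        F = np.array([[fisher(prices[j], prices[k], quantities[j], quantities[k])
                       for k in range(n)] for j in range(n)])
        # P_GEKS(j, base) = geometric mean over k of F(j, k) * F(k, base)
        levels = np.exp(np.mean(np.log(F) + np.log(F[:, base]), axis=1))
        return levels / levels[base]

    # Example with random data for 5 countries and 12 products:
    rng = np.random.default_rng(0)
    p = rng.uniform(0.5, 2.0, (5, 12))
    q = rng.uniform(1.0, 10.0, (5, 12))
    print(geks(p, q))   # price levels relative to country 0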

    The Distributional Impacts of Indonesia's Financial Crisis on Household Welfare: "A Rapid Response Methodology"

    Analyzing the distributional impacts of economic crises is important and, unfortunately, an ever more pressing need. If policymakers are to intervene to help those most adversely impacted, they need to identify those who have been most harmed and the magnitude of that harm. Furthermore, policy responses to economic crises typically must be timely. In this paper, we develop a simple methodology to fill this need and apply it to analyze the impact of the Indonesian economic crisis on household welfare. Using only pre-crisis household information, we estimate the compensating variation for Indonesian households following the 1997 Asian currency crisis and then explore the results with flexible non-parametric methods. We find that virtually every household was severely impacted, although it was the urban poor that fared the worst. The ability of poor rural households to produce food mitigated the worst consequences of the high inflation. The distributional consequences are the same whether or not we allow households to substitute towards relatively cheaper goods. However, the geographic location of the household mattered, even within urban or rural areas and household income categories. Additionally, households with young children may have suffered disproportionately adverse effects.
    http://deepblue.lib.umich.edu/bitstream/2027.42/39771/3/wp387.pd
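
    The abstract does not give the formula, but rapid-response welfare analyses of this kind typically start from a first-order approximation to the compensating variation, computed from pre-crisis budget shares and observed price changes. A minimal Python sketch follows; the function name and arguments are hypothetical, and allowing substitution toward cheaper goods would add a second-order term involving compensated price elasticities.

    import numpy as np

    def first_order_cv(expenditure, budget_shares, dlog_prices):
        """First-order compensating variation per household.

        expenditure:   (H,) pre-crisis total household expenditure.
        budget_shares: (H, G) pre-crisis budget shares over G goods.
        dlog_prices:   (G,) change in log prices during the crisis.

        Ignoring substitution, the extra money needed to hold pre-crisis
        utility is roughly expenditure times the share-weighted change
        in log prices.
        """
        return expenditure * (budget_shares @ dlog_prices)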

    Solid rocket booster internal flow analysis by highly accurate adaptive computational methods

    The primary objective of this project was to develop an adaptive finite element flow solver for simulating internal flows in the solid rocket booster. Described here is a unique flow simulator code for analyzing highly complex flow phenomena in the solid rocket booster. The new methodologies and features incorporated into this analysis tool are described.

    Application of advanced technologies to derivatives of current small transport aircraft

    Mission requirements of the derivative design were kept the same as those of the baseline so that the benefits of the advanced technologies could be readily identified. The advanced technologies investigated were in the areas of propulsion, structures, and aerodynamics, and a direct operating cost (DOC) benefit analysis was conducted to identify the most promising. Engine improvements appear most promising; combined with propeller, airfoil, surface coating, and composite advanced technologies, they give a 21-25 percent DOC savings. A 17 percent higher acquisition cost is offset by a 34 percent savings in fuel used.

    A sequential sampling strategy for extreme event statistics in nonlinear dynamical systems

    We develop a method for the evaluation of extreme event statistics associated with nonlinear dynamical systems, using a small number of samples. From an initial dataset of design points, we formulate a sequential strategy that provides the 'next-best' data point (set of parameters) that, when evaluated, results in improved estimates of the probability density function (pdf) for a scalar quantity of interest. The approach utilizes Gaussian process regression to perform Bayesian inference on the parameter-to-observation map describing the quantity of interest. We then approximate the desired pdf, along with uncertainty bounds, utilizing the posterior distribution of the inferred map. The 'next-best' design point is sequentially determined through an optimization procedure that selects the point in parameter space that maximally reduces uncertainty between the estimated bounds of the pdf prediction. Since the optimization process utilizes only information from the inferred map, it has minimal computational cost. Moreover, the special form of the metric emphasizes the tails of the pdf. The method is practical for systems where the dimensionality of the parameter space is of moderate size, i.e. of order O(10). We apply the method to estimate the extreme event statistics for a very high-dimensional system with millions of degrees of freedom: an offshore platform subjected to three-dimensional irregular waves. It is demonstrated that the developed approach can accurately determine the extreme event statistics using a limited number of samples.
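
    To make the strategy above concrete, here is a heavily simplified Python sketch using scikit-learn's Gaussian process regression on a one-dimensional toy map. The acquisition rule shown (posterior standard deviation weighted toward extreme predicted values) is a crude stand-in for the paper's tail-emphasizing distance between pdf bounds, and every name in the snippet is hypothetical.

    import numpy as np
    from scipy.stats import gaussian_kde
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def f(x):
        # Hypothetical expensive parameter-to-observation map.
        return np.sin(3 * x[:, 0]) + 0.5 * x[:, 0] ** 2

    rng = np.random.default_rng(1)
    X = rng.uniform(-2, 2, (8, 1))                 # initial design points
    y = f(X)
    candidates = np.linspace(-2, 2, 400).reshape(-1, 1)

    for step in range(10):
        gp = GaussianProcessRegressor(kernel=RBF(0.5), alpha=1e-6,
                                      normalize_y=True).fit(X, y)
        mu, sd = gp.predict(candidates, return_std=True)
        score = sd * (1 + np.abs(mu))              # uncertain AND likely extreme
        x_next = candidates[np.argmax(score)].reshape(1, -1)
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next))

    # Estimate the output pdf from the surrogate's mean over many input samples.
    gp = GaussianProcessRegressor(kernel=RBF(0.5), alpha=1e-6,
                                  normalize_y=True).fit(X, y)
    pdf = gaussian_kde(gp.predict(rng.uniform(-2, 2, (5000, 1))))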

    Why is productivity procyclical? Why do we care?

    Productivity rises in booms and falls in recessions. There are four main explanations for this procyclical productivity: (i) procyclical technology shocks, (ii) widespread imperfect competition and increasing returns, (iii) variable utilization of inputs over the cycle, and (iv) resource reallocations. Recent macroeconomic literature views this stylized fact of procyclical productivity as an essential feature of business cycles, because each explanation has important implications for macroeconomic modeling. In this paper, we discuss empirical methods for assessing the importance of these four explanations. We provide microfoundations for our preferred approach of estimating an explicit first-order approximation to the production function, using a theoretically motivated proxy for utilization. When we implement this approach, we find that variable utilization and resource reallocations are particularly important in explaining procyclical productivity. We also argue that the reallocation effects that we identify are not "biases" -- they reflect changes in an economy’s ability to produce goods and services for final consumption from given primary inputs of capital and labor. Thus, from a normative viewpoint, reallocations are significant for welfare; from a positive viewpoint, they constitute potentially important amplification and propagation mechanisms for macroeconomic modeling.
    Keywords: Productivity; Business cycles
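
    As a schematic of the accounting behind the utilization correction, the following Python function computes a utilization-adjusted residual: the ordinary Solow residual minus an estimated utilization term. The use of hours per worker as the utilization proxy and the elasticity beta_h are illustrative assumptions, not the paper's exact specification.

    def adjusted_residual(dy, dk, dl, dh, s_k, s_l, beta_h):
        """Utilization-adjusted productivity residual (illustrative).

        dy, dk, dl: growth rates of output, capital, and employment.
        dh:         growth of hours per worker, a common proxy for
                    unobserved utilization.
        s_k, s_l:   factor shares; beta_h: estimated utilization elasticity.
        """
        solow = dy - s_k * dk - s_l * dl   # standard Solow residual
        return solow - beta_h * dh         # strip out the utilization effect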

    A Quantum Monte Carlo algorithm for non-local corrections to the Dynamical Mean-Field Approximation

    We present the algorithmic details of the dynamical cluster approximation (DCA), with a quantum Monte Carlo (QMC) method used to solve the effective cluster problem. The DCA is a fully causal approach which systematically restores non-local correlations to the dynamical mean-field approximation (DMFA) while preserving the lattice symmetries. The DCA becomes exact for an infinite cluster size, while reducing to the DMFA for a cluster size of unity. We present a generalization of the Hirsch-Fye QMC algorithm for the solution of the embedded cluster problem. We use the two-dimensional Hubbard model to illustrate the performance of the DCA technique. At half-filling, we show that the DCA drives the spurious finite-temperature antiferromagnetic transition found in the DMFA slowly towards zero temperature as the cluster size increases, in conformity with the Mermin-Wagner theorem. Moreover, we find that there is a finite-temperature metal-to-insulator transition which persists into the weak-coupling regime. This suggests that the magnetism of the model is Heisenberg-like for all non-zero interactions. Away from half-filling, we find that the sign problem that arises in QMC simulations is significantly less severe in the context of the DCA. Hence, we were able to obtain good statistics for small clusters. For these clusters, the DCA results show evidence of non-Fermi-liquid behavior and superconductivity near half-filling.
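
    The heart of the DCA is the coarse-graining of the lattice Green's function over the momentum patch surrounding each cluster momentum K, with the cluster self-energy held fixed across the patch. The following Python sketch shows that single step for a 2D square-lattice dispersion; the patch geometry assumes a hypothetical 2x2 cluster, and a real code would iterate this to self-consistency with the Hirsch-Fye QMC cluster solver.

    import numpy as np

    def coarse_grained_G(K, sigma_K, omega, mu, n_tilde=32, t=1.0):
        """G_bar(K, iw) = mean over ktilde of 1/(iw + mu - eps(K + ktilde) - Sigma(K))."""
        # Patch of width pi around K, appropriate for a 2x2 cluster (hypothetical).
        kt = np.linspace(-np.pi / 2, np.pi / 2, n_tilde, endpoint=False)
        kx, ky = np.meshgrid(K[0] + kt, K[1] + kt)
        eps = -2.0 * t * (np.cos(kx) + np.cos(ky))   # square-lattice dispersion
        return np.mean(1.0 / (1j * omega + mu - eps - sigma_K))

    # Example: K = (pi, 0) at the first Matsubara frequency for T = 0.1.
    print(coarse_grained_G(K=(np.pi, 0.0), sigma_K=0.1 + 0.0j, omega=np.pi * 0.1, mu=0.0))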