978 research outputs found

    Middlemen versus Market Makers: A Theory of Competitive Exchange

    We present a model in which the microstructure of trade in a commodity or asset is endogenously determined. Producers and consumers of a commodity (or buyers and sellers of an asset) who wish to trade can choose between two competing types of intermediaries: 'middlemen' (dealers/brokers) and 'market makers' (specialists). Market makers post publicly observable bid and ask prices, whereas the prices quoted by different middlemen are private information that can only be obtained through a costly search process. We consider an initial equilibrium where there are no market makers but there is free entry of middlemen with heterogeneous transaction costs. We characterize conditions under which entry of a single market maker can be profitable even though it is common knowledge that all surviving middlemen will undercut the market maker's publicly posted bid and ask prices in the post-entry equilibrium. The market maker's entry induces the surviving middlemen to reduce their bid-ask spreads, and as a result, all producers and consumers who choose to participate in the market enjoy a strict increase in their expected gains from trade. We show that strict Pareto improvements occur even in cases where the market maker's entry drives all middlemen out of business, leaving the market maker to monopolize the intermediation of trade in the market.
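
    The entry logic can be illustrated numerically. The sketch below is a toy stand-in, not the paper's model: middlemen draw heterogeneous transaction costs, a market maker posts a public spread, and surviving middlemen shade just below it. All functional forms (uniform cost draws, a fixed search cost, the pre-entry spread benchmark) are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    v = 1.0                # gross gain from trade for a matched buyer-seller pair
    search_cost = 0.05     # cost of obtaining one private middleman quote
    k = np.sort(rng.uniform(0.0, 0.8, size=50))  # heterogeneous middleman costs

    # Pre-entry: with costly search over private quotes, middlemen can sustain
    # spreads close to the full surplus (a Diamond-paradox-style benchmark).
    pre_entry_spread = v - search_cost

    # Post-entry: the market maker posts a public spread covering its cost;
    # only middlemen who can profitably undercut that spread survive.
    posted_spread = 0.35
    survivors = k[k < posted_spread]
    undercut_spread = posted_spread - 0.01 if survivors.size else posted_spread

    gain_pre = v - pre_entry_spread - search_cost
    gain_post = v - min(posted_spread, undercut_spread)  # posted price: no search needed
    print(f"expected trader gain pre-entry:  {gain_pre:.3f}")
    print(f"expected trader gain post-entry: {gain_post:.3f}")
    ```

    Even when traders never transact with the market maker, the public quotes discipline the middlemen's spreads, which is the mechanism behind the strict Pareto improvement.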

    Econometric Methods for Endogenously Sampled Time Series: The Case of Commodity Price Speculation in the Steel Market

    This paper studies the econometric problems associated with estimation of a stochastic process that is endogenously sampled. Our interest is to infer the law of motion of a discrete-time stochastic process {p_t} that is observed only at a subset of times {t_1,...,t_n} determined by a probabilistic sampling rule that depends on the history of the process as well as other observed covariates x_t. We focus on a particular example where p_t denotes the daily wholesale price of a standardized steel product. However, there are no formal exchanges or centralized markets where steel is traded and p_t can be observed. Instead, nearly all steel transaction prices are the result of private bilateral negotiations between buyers and sellers, typically intermediated by middlemen known as steel service centers. Even though there is no central record of daily transaction prices in the steel market, we do observe transaction prices for a particular firm -- a steel service center that purchases large quantities of steel in the wholesale market for subsequent resale in the retail market. The endogenous sampling problem arises from the fact that the firm only records p_t on the days that it purchases steel. We present a parametric analysis of this problem under the assumption that the timing of steel purchases is part of an optimal trading strategy that maximizes the firm's expected discounted trading profits. We derive a parametric partial information maximum likelihood (PIML) estimator that solves the endogenous sampling problem and efficiently estimates the unknown parameters of a Markov transition probability that determines the law of motion for the underlying {p_t} process. The PIML estimator also yields estimates of the structural parameters that determine the optimal trading rule. We also introduce an alternative consistent, less efficient, but computationally simpler simulated minimum distance (SMD) estimator that avoids the high-dimensional numerical integrations required by the PIML estimator. Using the SMD estimator, we provide estimates of a truncated lognormal AR(1) model of the wholesale price processes for particular types of steel plate. We use this to infer the share of the middleman's discounted profits that are due to markups paid by its retail customers, and the share due to price speculation. The latter measures the firm's success in forecasting steel prices and in timing its purchases in order to "buy low and sell high." The more successful the firm is in speculation (i.e., in strategically timing its purchases), the more serious are the potential biases that would result from failing to account for the endogeneity of the sampling process.

    Keywords: Endogenous sampling, Markov processes, Maximum likelihood, Simulation estimation
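
    The SMD idea can be sketched compactly. Below is a minimal toy version, not the paper's estimator: log prices follow a Gaussian AR(1), the firm "buys" only when the price is in the lower part of its range (a crude stand-in for the optimal trading rule), and the parameters are chosen to match moments of the endogenously sampled prices between simulated and observed data. The threshold rule, moment choices, and identity weighting matrix are all simplifying assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(42)
    N = 5000
    sim_shocks = rng.standard_normal(N)   # common random numbers: fixed across
    data_shocks = rng.standard_normal(N)  # evaluations so the objective is smooth

    def simulate_sampled_prices(mu, rho, sigma, eps, buy_quantile=0.3):
        """Simulate log prices, keeping only 'purchase days' (low prices)."""
        logp = np.empty(len(eps))
        logp[0] = mu
        for t in range(1, len(eps)):
            logp[t] = mu + rho * (logp[t - 1] - mu) + sigma * eps[t]
        threshold = np.quantile(logp, buy_quantile)  # toy threshold trading rule
        return np.exp(logp[logp <= threshold])       # endogenously sampled prices

    def moments(p):
        # mean, dispersion, and autocorrelation of consecutive observed purchases
        return np.array([p.mean(), p.std(), np.corrcoef(p[:-1], p[1:])[0, 1]])

    m_data = moments(simulate_sampled_prices(5.0, 0.95, 0.05, data_shocks))

    def smd_objective(theta):
        mu, rho, sigma = theta
        if not (0.0 < rho < 1.0 and sigma > 0.0):
            return 1e10
        diff = moments(simulate_sampled_prices(mu, rho, sigma, sim_shocks)) - m_data
        return diff @ diff   # identity weighting matrix for simplicity

    res = minimize(smd_objective, x0=[4.5, 0.90, 0.10], method="Nelder-Mead")
    print("SMD estimates (mu, rho, sigma):", res.x)
    ```

    The key point the sketch captures: the simulated moments are computed from the *sampled* prices, so the estimator matches the data-generating process including its endogenous observation rule, rather than treating the observed prices as a random sample.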

    An Empirical Model of Inventory Investment by Durable Commodity Intermediaries

    This paper introduces a new detailed data set of high-frequency observations on inventory investment by a U.S. steel wholesaler. Our analysis of these data leads to six main conclusions: orders and sales are made infrequently; orders are more volatile than sales; order sizes vary considerably; there is substantial high-frequency variation in the firm's sales prices; inventory/sales ratios are unstable; and there are occasional stockouts. We model the firm generically as a durable commodity intermediary that engages in commodity price speculation. We demonstrate that the firm's inventory investment behavior at the product level is well approximated by an optimal trading strategy from the solution to a nonlinear dynamic programming problem with two continuous state variables and one continuous control variable that is subject to frequently binding inequality constraints. We show that the optimal trading strategy is a generalized (S,s) rule. That is, whenever the firm's inventory level q falls below the order threshold s(p), the firm places an order of size S(p) - q in order to attain a target inventory level S(p) satisfying S(p) >= s(p), where p is the current spot price at which the firm can purchase unlimited amounts of the commodity after incurring a fixed order cost K. We show that the (S,s) bands are decreasing functions of p, capturing the basic intuition of commodity price speculation, namely, that it is optimal for the firm to hold higher inventories when the spot price is low than when it is high in order to profit from "buying low and selling high." We simulate a calibrated version of this model and show that the simulated data exhibit the key features of inventory investment we observe in the data.

    Keywords: Commodities, inventories, dynamic programming
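
    The generalized (S,s) structure can be made concrete with a small dynamic program. The sketch below is a toy version under assumed parameters (unit per-period demand at a fixed retail price, a coarse price grid with a simple Markov transition), not the paper's calibrated model: value-function iteration on a (price, inventory) grid yields an order policy, from which the bands s(p) and S(p) are read off.

    ```python
    import numpy as np

    beta, K, r, d = 0.95, 5.0, 12.0, 1.0  # discount, fixed order cost, retail price, demand
    prices = np.linspace(6.0, 11.0, 11)   # discretized spot price grid
    P = np.zeros((11, 11))                # simple persistent price transition matrix
    for i in range(11):
        for j, w in ((i - 1, 0.3), (i, 0.4), (i + 1, 0.3)):
            P[i, min(max(j, 0), 10)] += w
    q_grid = np.arange(0.0, 21.0)         # inventory grid
    orders = np.arange(0.0, 21.0)         # feasible order sizes

    V = np.zeros((len(prices), len(q_grid)))
    for _ in range(500):                  # value iteration to convergence
        EV = P @ V                        # expected continuation value by current price
        V_new = np.empty_like(V)
        policy = np.empty_like(V)
        for ip, p in enumerate(prices):
            for iq, q in enumerate(q_grid):
                best, best_x = -np.inf, 0.0
                for x in orders:
                    stock = q + x
                    if stock > q_grid[-1]:
                        break
                    sales = min(stock, d)
                    q_next = stock - sales
                    val = (r * sales - p * x - K * (x > 0)
                           + beta * EV[ip, int(q_next)])
                    if val > best:
                        best, best_x = val, x
                V_new[ip, iq], policy[ip, iq] = best, best_x
        diff = np.max(np.abs(V_new - V))
        V = V_new
        if diff < 1e-6:
            break

    # Read off the bands: s(p) = highest inventory at which the firm still
    # orders; S(p) = the target inventory level it orders up to.
    for ip, p in enumerate(prices):
        ordering = np.where(policy[ip] > 0)[0]
        if ordering.size:
            s_p = q_grid[ordering[-1]]
            S_p = s_p + policy[ip, ordering[-1]]
            print(f"p={p:5.2f}  s(p)={s_p:4.1f}  S(p)={S_p:4.1f}")
    ```

    With these toy parameters, both bands fall as the spot price rises: the firm stocks up when steel is cheap and runs inventories down when it is dear, which is the speculative intuition the abstract describes.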

    How Large are the Classification Errors in the Social Security Disability Award Process?

    This paper presents an 'audit' of the multistage application and appeal process that the U.S. Social Security Administration (SSA) uses to determine eligibility for disability benefits from the Disability Insurance (DI) and Supplemental Security Income (SSI) programs. We study a subset of individuals from the Health and Retirement Study (HRS) who applied for DI or SSI benefits between 1992 and 1996. We compare the SSA's ultimate award decision (i.e., after allowing for appeals) to the applicant's self-reported disability status. We use these data to estimate classification error rates under the hypothesis that applicants' self-reported disability status and the SSA's ultimate award decision are noisy but unbiased indicators of a latent 'true disability status' indicator. We find that approximately 20% of SSI/DI applicants who are ultimately awarded benefits are not disabled, and that 60% of applicants who were denied benefits are disabled. Our analysis also yields insights into the patterns of self-selection induced by varying delays and award probabilities at various levels of the application and appeal process. We construct an optimal statistical screening rule, using a subset of objective health indicators that the SSA uses in making award decisions, that results in significantly lower classification error rates than does SSA's current award process.

    Keywords: Social Security Disability Insurance, Supplemental Security Income, Health and Retirement Study, classification errors
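
    The headline error rates follow from simple Bayes-rule arithmetic once the latent prevalence and the award rule's conditional error rates are in hand. The sketch below uses made-up illustrative numbers, not the paper's estimates, to show how P(not disabled | awarded) and P(disabled | denied) are computed.

    ```python
    # Illustrative inputs (assumptions, not the paper's estimates):
    pi = 0.60                 # share of applicants who are truly disabled
    award_if_disabled = 0.55  # P(award | truly disabled)
    award_if_not = 0.30       # P(award | not disabled)

    p_award = pi * award_if_disabled + (1 - pi) * award_if_not

    # Share of awardees who are NOT disabled: P(not disabled | awarded)
    not_disabled_given_award = (1 - pi) * award_if_not / p_award
    # Share of denied applicants who ARE disabled: P(disabled | denied)
    disabled_given_denied = pi * (1 - award_if_disabled) / (1 - p_award)

    print(f"P(award)                  = {p_award:.2f}")
    print(f"P(not disabled | awarded) = {not_disabled_given_award:.2f}")
    print(f"P(disabled | denied)      = {disabled_given_denied:.2f}")
    ```

    The paper's estimation runs this logic in reverse: treating the self-report and the award decision as two noisy, unbiased indicators of latent disability, it backs out the error rates that best rationalize the observed joint distribution.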

    Communication and re-use of chemical information in bioscience.

    The current methods of publishing chemical information in bioscience articles are analysed. Using three papers as use-cases, it is shown that conventional methods relying on human procedures, including cut-and-paste, are time-consuming and introduce errors. The meaning of chemical terms and the identity of compounds is often ambiguous, and valuable experimental data such as spectra and computational results are almost always omitted. We describe an Open XML architecture at proof-of-concept stage which addresses these concerns. Compounds are identified through explicit connection tables or links to persistent Open resources such as PubChem. It is argued that if publishers adopt these tools and protocols, the quality and quantity of chemical information available to bioscientists will increase, and authors, publishers and readers will find the process cost-effective.

    An article submitted to BioMed Central Bioinformatics, created on request with their Publicon system. The transformed manuscript is archived as PDF. Although it has been through the publisher's system, this is purely automatic and the contents are those of a pre-refereed preprint. The formatting is provided by the system, and tables and figures appear at the end. An accompanying submission, http://www.dspace.cam.ac.uk/handle/1810/34580, describes the rationale and cultural aspects of publishing, abstracting and aggregating chemical information. BMC is an Open Access publisher and we emphasize that all content is re-usable under a Creative Commons License.
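
    A flavor of the kind of machine-readable markup the article advocates: a compound record carrying an explicit, algorithmically derived identifier (an InChI) plus a link to a persistent Open resource such as PubChem. The element names below are illustrative, not the actual schema of the proof-of-concept.

    ```python
    import xml.etree.ElementTree as ET

    compound = ET.Element("compound", id="c1")
    ET.SubElement(compound, "name").text = "caffeine"
    # An InChI gives an unambiguous, algorithmically derived identity,
    # unlike a bare chemical name in running text.
    ET.SubElement(compound, "inchi").text = (
        "InChI=1S/C8H10N4O2/c1-10-4-9-6-5(10)7(13)12(3)8(14)11(6)2/h4H,1-3H3"
    )
    # A link to a persistent Open resource lets both readers and machines
    # resolve the compound to structures, properties, and spectra.
    ET.SubElement(compound, "link",
                  href="https://pubchem.ncbi.nlm.nih.gov/compound/2519")

    print(ET.tostring(compound, encoding="unicode"))
    ```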

    Is the 'Linkage Principle' Valid?: Evidence from the Field

    Keywords: revenue comparison, auction choice, linkage principle, used-car auctions

    A Dynamic Programming Model of Retirement Behavior

    This paper formulates a model of retirement behavior based on the solution to a stochastic dynamic programming problem. The worker's objective is to maximize expected discounted utility over his remaining lifetime. At each time period the worker chooses how much to consume and whether to work full-time, work part-time, or exit the labor force. The model accounts for the sequential nature of the retirement decision problem and the role of expectations about uncertain future variables such as the worker's future lifespan, health status, marital and family status, and employment status, as well as earnings from employment, assets, and Social Security retirement, disability, and Medicare payments. This paper applies a "nested fixed point" algorithm that converts the dynamic programming problem into the problem of repeatedly recomputing the fixed point of a contraction mapping operator as a subroutine of a standard nonlinear maximum likelihood program. The goal of the paper is to demonstrate that a fairly complex and realistic formulation of the retirement problem can be estimated using this algorithm and a current-generation supercomputer, the Cray-2.
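
    The nested fixed point idea can be sketched in a few lines. Below is a toy two-choice model, not the paper's retirement model: an inner loop iterates a log-sum-exp Bellman operator (a contraction under discounting) to its fixed point for each candidate parameter vector, and an outer loop maximizes the implied logit choice likelihood. The state space, utilities, and simulated data are all assumptions for illustration.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    beta = 0.95
    n_s = 5                              # toy state grid (e.g. a health/age index)
    P = np.full((n_s, n_s), 1.0 / n_s)   # toy state transition matrix

    def solve_ev(theta):
        """Inner loop: iterate the log-sum-exp Bellman operator (a contraction
        with modulus beta) to its fixed point. Choice 0 = retire, 1 = work."""
        u_retire = np.full(n_s, theta[0])     # flow utility of retiring
        u_work = theta[1] * np.arange(n_s)    # flow utility of working
        ev = np.zeros(n_s)
        for _ in range(2000):
            v = np.logaddexp(u_retire + beta * P @ ev,
                             u_work + beta * P @ ev)  # EV shocks -> log-sum-exp
            if np.max(np.abs(v - ev)) < 1e-10:
                break
            ev = v
        return u_retire + beta * P @ ev, u_work + beta * P @ ev

    def neg_loglik(theta, states, choices):
        """Outer loop objective: logit choice probabilities implied by the
        solved dynamic program, evaluated at the observed data."""
        v0, v1 = solve_ev(theta)
        p_work = 1.0 / (1.0 + np.exp(v0[states] - v1[states]))
        return -np.sum(np.where(choices == 1,
                                np.log(p_work), np.log(1 - p_work)))

    # Hypothetical data: states and work/retire choices for 500 workers.
    rng = np.random.default_rng(0)
    states = rng.integers(0, n_s, 500)
    choices = (rng.random(500) < 0.5 + 0.08 * states).astype(int)

    res = minimize(neg_loglik, x0=[0.5, 0.2], args=(states, choices),
                   method="Nelder-Mead")
    print("NFXP estimates:", res.x)
    ```

    The nesting is the point: every trial parameter vector the outer optimizer proposes triggers a full inner solution of the dynamic program, which is why the paper emphasizes the computational demands of a realistic retirement model.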

    Chemistry in Bioinformatics

    A preprint of an invited submission to BioMed Central Bioinformatics. This short manuscript is an overview of the current problems and opportunities in publishing chemical information. Full details of the technology are given in the sibling manuscript http://www.dspace.cam.ac.uk/handle/1810/34579. The manuscript is the authors' preprint, although it has been automatically transformed into this archived PDF by the submission system; the authors are not responsible for the formatting.

    Chemical information is now seen as critical for most areas of life sciences. But unlike Bioinformatics, where data is Openly available and freely re-usable, most chemical information is closed and cannot be re-distributed without permission. This has led to a failure to adopt modern informatics and software techniques and therefore a paucity of chemistry in bioinformatics. New technology, however, offers the hope of making chemical data (compounds and properties) Free during the authoring process. We argue that the technology is already available; we require a collective agreement to enhance publication protocols.