238 research outputs found

    Endogenous Beliefs and Institutional Structure in Competitive Equilibrium with Adverse Selection

    Get PDF
    I model ļ¬nancial markets that structure decision-making into discrete points separating contract oļ¬€ers, applications, and acceptance/denial decisions. Endogenous beliefs about applicantsā€™ risk types emerge as the institutional process extracts private information allowing uninformed ļ¬rms to infer risk qualities by comparing applications of many consumers. Endogenous beliefs and low-risk consumer behavior render truthful disclosure of transactions incentive compatible supporting a unique equilibrium robust to cream-skimming and cross-subsidizing deviations, even under Hellwigā€™s ā€œsecretā€ policy assumption. In equilibrium each type demands low-riskā€™s optimal pooling policy and high-risk supplement to full coverage at fair-price. Nonpassive consumersā€™ belief ļ¬rms are sequentially rational necessary for equilibrium; lemon equilibrium with only high-risk insured possible

    Production and Distribution in Agrarian Economies

    Get PDF

    Competitive Screening and Market Segmentation

    Get PDF
    We characterize competitive equilibrium in markets (financial and otherwise) where price-taking Bayesian decision makers screen applicants for acceptance or rejection. Unlike signaling models, equilibrium fails to resolve imperfect information. In the terminology of classical statistics, some qualified applicants are rejected (type I error) and some unqualified applicants are accepted (type II error). We report three new results: (i) optimal firm behavior is deduced to be a Bayesian variant of the Neyman-Pearson theorem; (ii) competitive equilibrium entails screening if and only if (net of screening costs) the cost of type II errors exceeds the cost of type I errors; i.e., contrary to signaling (where buyers identify more qualified applicants who self-screen to differentiate themselves, e.g. Stiglitz 1975), price-taking firms screen to avoid lower-quality sellers; (iii) equilibrium groups the least attractive applicants into a single high-risk assignment pool. Depending on the costs of screening, the unique equilibrium may involve complete pooling (all applicants trade at one price) or partial separation (there are m separate pools, each successive pool supported by a single, rising price, with a subset of agents of different screen levels trading at that price). In a screening equilibrium the mth, secondary market entails no screening, as the most adversely selected agents are assigned to the high-risk pool. Screening induces market segmentation. Invariably, secondary markets contain individuals who, with better or different screening mechanisms, could be accepted in the primary market. What roles traits such as ethnicity, gender, and race might assume in such decision making is left to subsequent research exploring the statistical theory of discrimination.
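
    A minimal sketch, with invented numbers, of the kind of Bayesian accept/reject rule described above (the paper's Bayesian variant of the Neyman-Pearson logic is only paraphrased here): the firm updates its prior over an applicant's quality from a noisy screen outcome and accepts only when the expected cost of rejecting a qualified applicant (type I error) is at least the expected cost of accepting an unqualified one (type II error). The prior, signal accuracies, and error costs below are illustrative assumptions.

```python
# Illustrative Bayesian screening rule (toy numbers, not the paper's model).

def posterior_qualified(prior_q, signal_pass, p_pass_given_q=0.9, p_pass_given_u=0.3):
    """Bayes update of P(qualified) after observing whether the applicant passes the screen."""
    like_q = p_pass_given_q if signal_pass else 1 - p_pass_given_q
    like_u = p_pass_given_u if signal_pass else 1 - p_pass_given_u
    num = like_q * prior_q
    return num / (num + like_u * (1 - prior_q))

def accept(post_q, cost_type1=1.0, cost_type2=3.0):
    """Accept iff expected loss from rejecting (type I) >= expected loss from accepting (type II)."""
    return post_q * cost_type1 >= (1 - post_q) * cost_type2

for signal in (True, False):
    p = posterior_qualified(prior_q=0.5, signal_pass=signal)
    print(f"screen passed={signal}: P(qualified|signal)={p:.2f}, accept={accept(p)}")
```

    Raising the relative cost of type II errors tightens the acceptance rule, which mirrors the abstract's condition for screening to be worthwhile at all.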

    A Behavioral Interpretation of the Origins of African American Family Structure

    Get PDF
    The 1960-to-1980 doubling (21% to 41%) of the share of black children in one-parent families emerged from 1940-to-1970 urbanization, which converged the population toward urbanized blacks' historically stable high rate, not from post-1960 welfare liberalization or deindustrialization. Urban and rural child socializations structured different Jim Crow Era black family formations. Agrarian economic enclaves socialized conformity to Jim Crow and two-parent families; urban enclaves socialized rebellion, male joblessness, and destabilized families. Proxying urban/rural residence at age 16 for socialization location, logistic regressions on 1960s census data confirm the hypothesis. Racialized urban socialization negatively affected two-parent family formation and the poverty status of blacks but not whites.

    Output Supply, Employment, and Intra-Industry Wage Dispersion

    Get PDF

    Estimation of drift and diffusion functions from time series data: A maximum likelihood framework

    Full text link
    Complex systems are characterized by a huge number of degrees of freedom, often interacting in a non-linear manner. In many cases, however, macroscopic states can be characterized by a small number of order parameters that obey stochastic dynamics in time. Recently, techniques for the estimation of the corresponding stochastic differential equations from measured data have been introduced. This contribution develops a framework for the estimation of the drift and diffusion functions and their respective (Bayesian posterior) confidence regions based on likelihood estimators. Approximations are then introduced that significantly improve the efficiency of the estimation procedure. While consistent with standard approaches to the problem, this contribution solves important problems concerning the applicability and the accuracy of the estimated parameters.
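
    As a rough illustration of the likelihood framework (using a short-time Gaussian, i.e. Euler, approximation of the transition density and an assumed parametric Ornstein-Uhlenbeck form for the drift and diffusion, rather than the paper's general estimator and posterior confidence regions), the sketch below simulates a time series and recovers the two parameters by maximizing the approximate log-likelihood.

```python
# Sketch: maximum-likelihood fit of drift/diffusion parameters from a sampled time series,
# using the Euler (short-time Gaussian) approximation of the transition density.
# The parametric OU form D1(x) = -a*x, D2(x) = b is an illustrative assumption.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
dt, n = 0.01, 20000
a_true, b_true = 2.0, 0.5

# Simulate dX = -a*X dt + sqrt(2*b) dW with the Euler-Maruyama scheme
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):
    x[i + 1] = x[i] - a_true * x[i] * dt + np.sqrt(2 * b_true * dt) * rng.standard_normal()

def neg_log_lik(params):
    a, b = params
    if b <= 0:
        return np.inf
    mean = x[:-1] - a * x[:-1] * dt   # Euler drift step
    var = 2 * b * dt                  # Euler diffusion variance
    resid = x[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

fit = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
print("estimated (a, b):", fit.x)    # should be close to (2.0, 0.5)
```

    For small sampling intervals the Euler transition density is a reasonable approximation; the paper's successive approximations concern making this kind of estimation efficient, which the sketch does not attempt.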

    Normalized entropy aggregation for inhomogeneous large-scale data

    Get PDF
    The relationship between information theory, statistics, and maximum entropy was established as early as the 1950s, following the works of Kullback, Leibler, Lindley, and Jaynes. However, the applications were restricted to very specific domains, and it was not until recently that the convergence of information processing, data analysis, and inference demanded the foundation of a new scientific area, commonly referred to as Info-Metrics. As huge amounts of information and large-scale data have become available, the term "big data" has been used to refer to the many kinds of challenges presented in its analysis: many observations, many variables (or both), limited computational resources, different time regimes, or multiple sources. In this work, we consider one particular aspect of big data analysis: the presence of inhomogeneities, which compromises the use of the classical framework in regression modelling. A new approach is proposed, based on introducing the concepts of info-metrics into the analysis of inhomogeneous large-scale data. The framework of information-theoretic estimation methods is presented, along with some information measures. In particular, the normalized entropy is tested in aggregation procedures, and some simulation results are presented.
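
    A small sketch of the normalized entropy measure mentioned above, S(p) = -sum_k p_k ln p_k / ln K, used here as an informativeness weight when aggregating estimates from separate, possibly inhomogeneous data blocks. The block construction, weighting scheme, and numbers are illustrative assumptions, not the paper's procedure.

```python
# Sketch: normalized entropy of probability vectors estimated on separate data blocks,
# then used to down-weight uninformative (near-uniform) blocks when aggregating.
import numpy as np

def normalized_entropy(p):
    """S(p) = -sum p_k ln p_k / ln K, in [0, 1]; 1 = uniform, 0 = degenerate."""
    p = np.asarray(p, dtype=float)
    k = p.size
    nz = p[p > 0]                     # treat 0 * ln 0 as 0
    return float(-np.sum(nz * np.log(nz)) / np.log(k))

# Hypothetical blocks: each carries a point estimate and an estimated probability vector.
blocks = [
    {"theta": 1.8, "p": [0.70, 0.20, 0.10]},   # concentrated -> informative
    {"theta": 2.4, "p": [0.40, 0.35, 0.25]},
    {"theta": 3.1, "p": [0.34, 0.33, 0.33]},   # near uniform -> uninformative
]
w = np.array([1.0 - normalized_entropy(b["p"]) for b in blocks])
w = w / w.sum()
theta = np.array([b["theta"] for b in blocks])
print("normalized entropies:", [round(normalized_entropy(b["p"]), 3) for b in blocks])
print("entropy-weighted aggregate estimate:", round(float(w @ theta), 3))
```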

    The Double Role of Ethnic Heterogeneity in Explaining Welfare-State Generosity

    Get PDF
    Based on theoretical models of budget-balanced social insurance and individual choice, we argue that, in addition to the well-known empathy mechanism whereby ethnic heterogeneity undermines sentiments of solidarity among a citizenry and thereby reduces welfare generosity, population heterogeneity affects the generosity of a polity's social insurance programs through another distinct mechanism: political conflict. Ethnic heterogeneity likely intensifies political conflict and reduces welfare generosity because heterogeneity of unemployment risk makes it more difficult to achieve social consensus concerning tax-benefit programs. Utilizing two separate regression analyses covering highly diverse polities, the 50 U.S. states and the District of Columbia (CPS data) and 13 OECD countries (LIS data), we find strong evidence that empirically distinct empathy and political-conflict effects on unemployment insurance programs characterize contemporary politics. Our findings suggest that existing analyses of the negative relationship between ethnic heterogeneity and the size of the welfare state likely over- or underestimate the empathy effect. For example, perhaps surprisingly, had our analysis of US data omitted a measure of unemployment dispersion, the negative effect of ethnic fractionalization would have been underestimated.

    Nitrogen fertilizer effects on soil carbon balances in Midwestern U.S. agricultural systems

    Get PDF
    A single ecosystem dominates the Midwestern United States, occupying 26 million hectares in five states alone: the corn–soybean agroecosystem [Zea mays L.–Glycine max (L.) Merr.]. Nitrogen (N) fertilization could influence the soil carbon (C) balance in this system because the corn phase is fertilized on 97–100% of farms, at an average rate of 135 kg N·ha⁻¹·yr⁻¹. We evaluated the impacts on two major processes that determine the soil C balance, the rates of organic-carbon (OC) inputs and decay, at four levels of N fertilization (0, 90, 180, and 270 kg/ha) at two long-term experimental sites on Mollisols in Iowa, USA. We compared the corn–soybean system with other experimental cropping systems fertilized with N in the corn phases only: continuous corn for grain; corn–corn–oats (Avena sativa L.)–alfalfa (Medicago sativa L.); corn–oats–alfalfa–alfalfa; and continuous soybean. In all systems, we estimated long-term OC inputs and decay rates over all phases of the rotations, based on long-term yield data, harvest indices (HI), and root:shoot data. For corn, we measured these two ratios in the four N treatments in a single year at each site; for other crops we used published ratios. Total OC inputs were calculated as aboveground plus belowground net primary production (NPP) minus harvested yield. For corn, measured total OC inputs increased with N fertilization (P < 0.05, both sites). Belowground NPP, comprising only 6–22% of total corn NPP, was not significantly influenced by N fertilization. When all phases of the crop rotations were evaluated over the long term, OC decay rates increased concomitantly with OC input rates in several systems. Increases in decay rates with N fertilization apparently offset gains in carbon inputs to the soil, such that soil C sequestration was virtually nil in 78% of the systems studied, despite up to 48 years of N additions. The quantity of belowground OC inputs was the best predictor of long-term soil C storage. This indicates that, in these systems, selecting crops with high belowground NPP is a more effective management practice for increasing soil C sequestration than increasing N-fertilizer additions.
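
    A back-of-the-envelope sketch of the input-side bookkeeping described above: total OC input equals aboveground plus belowground NPP minus harvested yield, with aboveground NPP recovered from yield through the harvest index and belowground NPP through a root:shoot ratio. The crop numbers and the biomass-to-carbon fraction are illustrative assumptions, not the study's measured values.

```python
# Sketch of the OC-input bookkeeping described in the abstract (illustrative numbers only).

def oc_input(grain_yield, harvest_index, root_to_shoot, carbon_fraction=0.45):
    """Organic-C input to soil = (aboveground NPP + belowground NPP - harvested yield) * C fraction.

    grain_yield in Mg dry matter/ha; carbon_fraction converts biomass to carbon (assumed value).
    """
    aboveground_npp = grain_yield / harvest_index        # total shoot biomass
    belowground_npp = aboveground_npp * root_to_shoot    # root biomass from root:shoot ratio
    residue_to_soil = aboveground_npp + belowground_npp - grain_yield
    return residue_to_soil * carbon_fraction

# Hypothetical corn example: 9 Mg/ha grain, HI = 0.5, root:shoot = 0.15
print(f"OC input: {oc_input(9.0, 0.5, 0.15):.2f} Mg C/ha/yr")
```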

    Distinct Quantum States Can Be Compatible with a Single State of Reality

    Get PDF
    Perhaps the quantum state represents information about reality, and not reality directly. Wave-function collapse is then possibly no more mysterious than a Bayesian update of a probability distribution given new data. We consider models for quantum systems in which measurement outcomes are determined by an underlying physical state of the system, but where several quantum states are consistent with a single underlying state; i.e., the probability distributions for distinct quantum states overlap. Significantly, we demonstrate by example that additional assumptions are always necessary to rule out such a model.
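
    As a purely illustrative toy (not the paper's construction, which must also reproduce quantum measurement statistics), the sketch below shows what "overlap" means here: two distinct preparations induce probability distributions over a small set of hypothetical ontic states that share support, so an ontic state drawn from the overlap region is compatible with either preparation.

```python
# Toy illustration of overlapping epistemic states over a discrete set of hypothetical
# ontic states. Labels and probabilities are invented; no quantum statistics are reproduced.
import numpy as np

ontic_states = ["lambda1", "lambda2", "lambda3"]
p_psi = np.array([0.5, 0.5, 0.0])   # distribution induced by preparing "psi"
p_phi = np.array([0.0, 0.5, 0.5])   # distribution induced by preparing "phi"

# Overlap: probability mass on ontic states compatible with both preparations.
overlap = np.minimum(p_psi, p_phi).sum()
print(f"overlap = {overlap:.2f}")   # 0.50: lambda2 can arise from either preparation

rng = np.random.default_rng(1)
lam = rng.choice(ontic_states, p=p_psi)
print(f"sampled ontic state after preparing psi: {lam}")
print("if lambda2 was drawn, the ontic state alone cannot reveal which state was prepared")
```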