
    Two New Proofs of Afriat's Theorem

    We provide two new, simple proofs of Afriat's celebrated theorem stating that a finite set of price-quantity observations is consistent with utility maximization if, and only if, the observations satisfy a variation of the Strong Axiom of Revealed Preference known as the Generalized Axiom of Revealed Preference.
    Keywords: Afriat's theorem, SARP, GARP

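    GARP is mechanical to check. As a minimal illustration (a sketch of our own in Python, not taken from the paper), the following builds the direct revealed-preference relation from the price-quantity data, takes its transitive closure, and searches for a cycle containing a strict preference:

        import numpy as np

        def satisfies_garp(prices, quantities):
            """Check the Generalized Axiom of Revealed Preference.

            prices, quantities: arrays of shape (T, n), one observation
            (price vector, chosen bundle) per row.
            """
            P = np.asarray(prices, dtype=float)
            X = np.asarray(quantities, dtype=float)
            T = len(P)

            expend = P @ X.T                  # expend[t, s] = p_t . x_s
            own = np.diag(expend)             # p_t . x_t
            direct = own[:, None] >= expend   # x_t weakly revealed preferred to x_s
            strict = own[:, None] > expend    # ... strictly

            R = direct.copy()                 # transitive closure (Warshall)
            for k in range(T):
                R |= R[:, [k]] & R[[k], :]

            # GARP: x_t R x_s must never coexist with x_s strictly
            # directly revealed preferred to x_t.
            return not np.any(R & strict.T)

        # Two observations where each bundle is strictly cheaper at the
        # other's prices: a violating cycle, so this prints False.
        print(satisfies_garp([[2.0, 1.0], [1.0, 2.0]],
                             [[2.0, 1.0], [1.0, 2.0]]))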

    Leverage-induced systemic risk under Basle II and other credit risk policies

    We use a simple agent-based model of value investors in financial markets to test three credit regulation policies. The first is the unregulated case, which only imposes limits on maximum leverage. The second is Basle II, and the third is a hypothetical alternative in which banks perfectly hedge all of their leverage-induced risk with options. When compared to the unregulated case, both Basle II and the perfect hedge policy reduce the risk of default when leverage is low but increase it when leverage is high. This is because both regulation policies increase the amount of synchronized buying and selling needed to achieve deleveraging, which can destabilize the market. None of these policies are optimal for everyone: risk-neutral investors prefer the unregulated case with low maximum leverage, banks prefer the perfect hedge policy, and fund managers prefer the unregulated case with high maximum leverage. No one prefers Basle II.
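    The destabilizing feedback described here (deleveraging forces synchronized selling, which moves the price against the seller) can be seen in a toy simulation. The sketch below is our own illustration, not the authors' model; all parameter values and functional forms are made up. A single leveraged value investor buys when the price is below its perceived fundamental value, and its leverage cap forces selling after losses:

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative parameters (not calibrated to the paper).
        T = 2000         # time steps
        lam_max = 15.0   # leverage cap: position value <= lam_max * equity
        beta = 40.0      # aggressiveness of the value investor
        V = 1.0          # perceived fundamental value
        impact = 0.05    # linear price impact per unit of net demand

        price, equity, shares = 1.0, 0.05, 0.0
        prices = []
        for t in range(T):
            old_price, old_shares = price, shares
            # Desired position grows with mispricing; the leverage cap
            # forces selling when equity shrinks after a price drop.
            mispricing = max(V - price, 0.0)
            desired = beta * mispricing * equity / price
            shares = min(desired, lam_max * equity / price)
            # The fund's trade plus a small noise-trader shock moves the price.
            shock = rng.normal(0.0, 0.005)
            price = max(price * (1.0 + impact * (shares - old_shares) + shock), 1e-6)
            equity += shares * (price - old_price)   # mark to market
            if equity <= 0.0:                        # default: re-capitalize
                equity, shares = 0.05, 0.0
            prices.append(price)

        print("min/max price:", min(prices), max(prices))

    In this toy, the size of forced sales scales with the leverage cap, loosely mirroring the leverage dependence the abstract describes.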

    The Combinatorial World (of Auctions) According to GARP

    Revealed preference techniques are used to test whether a data set is compatible with rational behaviour. They are also incorporated as constraints in mechanism design to encourage truthful behaviour in applications such as combinatorial auctions. In the auction setting, we present an efficient combinatorial algorithm to find a virtual valuation function with the optimal (additive) rationality guarantee. Moreover, we show that there exists such a valuation function that both is individually rational and is minimum (that is, it is component-wise dominated by any other individually rational, virtual valuation function that approximately fits the data). Similarly, given upper bound constraints on the valuation function, we show how to fit the maximum virtual valuation function with the optimal additive rationality guarantee. In practice, revealed preference bidding constraints are very demanding. We explain how approximate rationality can be used to create relaxed revealed preference constraints in an auction. We then show how combinatorial methods can be used to implement these relaxed constraints. Worst/best-case welfare guarantees that result from the use of such mechanisms can be quantified via the minimum/maximum virtual valuation function.
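    One concrete way to work with approximate rationality is an additive relaxation of GARP: weaken the revealed-preference comparisons by a slack eps and find the smallest eps at which the data become consistent. The sketch below is our own illustration of that idea; the paper's combinatorial algorithm and its exact notion of the optimal additive guarantee may differ.

        import numpy as np

        def violates_relaxed_garp(P, X, eps):
            """True if the data violate GARP after an additive slack eps.

            x_t is (weakly) revealed preferred to x_s only when x_s costs
            at least eps less than the chosen bundle x_t at prices p_t.
            """
            E = P @ X.T
            own = np.diag(E)
            R = own[:, None] >= E + eps        # relaxed weak relation
            strict = own[:, None] > E + eps    # relaxed strict relation
            for k in range(len(P)):            # boolean transitive closure
                R |= R[:, [k]] & R[[k], :]
            return bool(np.any(R & strict.T))

        def min_additive_slack(P, X, tol=1e-9):
            """Binary-search the smallest eps making the data consistent.

            Valid because increasing eps only removes revealed-preference
            comparisons, so violations vanish monotonically.
            """
            P, X = np.asarray(P, float), np.asarray(X, float)
            if not violates_relaxed_garp(P, X, 0.0):
                return 0.0
            lo, hi = 0.0, float(np.max(P @ X.T))
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if violates_relaxed_garp(P, X, mid) else (lo, mid)
            return hi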

    Sources of variation in baseline gene expression levels from toxicogenomics study control animals across multiple laboratories

    Background: The use of gene expression profiling in both clinical and laboratory settings would be enhanced by better characterization of variance due to individual, environmental, and technical factors. Meta-analysis of microarray data from untreated or vehicle-treated animals within the control arm of toxicogenomics studies could yield useful information on baseline fluctuations in gene expression, although control animal data has not been available on a scale and in a form best served for data-mining.
    Results: A dataset of control animal microarray expression data was assembled by a working group of the Health and Environmental Sciences Institute's Technical Committee on the Application of Genomics in Mechanism Based Risk Assessment in order to provide a public resource for assessments of variability in baseline gene expression. Data from over 500 Affymetrix microarrays from control rat liver and kidney were collected from 16 different institutions. Thirty-five biological and technical factors were obtained for each animal, describing a wide range of study characteristics, and a subset was evaluated in detail for their contribution to total variability using multivariate statistical and graphical techniques.
    Conclusion: The study factors that emerged as key sources of variability included gender, organ section, strain, and fasting state. These and other study factors were identified as key descriptors that should be included in the minimal information about a toxicogenomics study needed for interpretation of results by an independent source. Genes that are the most and least variable, gender-selective, or altered by fasting were also identified and functionally categorized. Better characterization of gene expression variability in control animals will aid in the design of toxicogenomics studies and in the interpretation of their results.
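    A simple way to quantify how much of the total variability a single study factor explains is a one-way variance decomposition (eta squared: between-group over total sum of squares). The sketch below is a minimal illustration with made-up numbers; the working group's actual analysis used more elaborate multivariate and graphical techniques.

        import numpy as np

        def eta_squared(values, labels):
            """Fraction of the variance in `values` explained by the
            categorical factor `labels` (between-group SS / total SS)."""
            values, labels = np.asarray(values, float), np.asarray(labels)
            grand = values.mean()
            ss_total = ((values - grand) ** 2).sum()
            ss_between = sum(
                (labels == lab).sum() * (values[labels == lab].mean() - grand) ** 2
                for lab in np.unique(labels)
            )
            return ss_between / ss_total if ss_total > 0 else 0.0

        # Hypothetical expression of one probe set across six control
        # animals, with the recorded gender of each animal.
        expr = np.array([7.1, 7.3, 6.8, 9.0, 8.7, 9.2])
        gender = np.array(["M", "M", "M", "F", "F", "F"])
        print(eta_squared(expr, gender))   # share of variance due to gender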

    Capital Flows: Issues and Policies

    This paper presents an analytical overview of recent contributions to the literature on the policy implications of capital flows in emerging and developing countries, focusing specifically on capital inflows as well as on the links between inflows and subsequent capital-flow reversals. The objective is to clarify the policy challenges that such inflows pose and to evaluate the policy alternatives available to the recipient countries to cope with those challenges. A large menu of possible policy responses to large capital inflows is considered, and experience with the use of such policies is reviewed. A policy "decision tree", i.e., an algorithm for determining how to deploy policies in response to an exogenous inflow episode, is developed, and strategies to achieve resilience to both inflows and outflows in a world where exogenous events may frequently drive capital flows in both directions are discussed.
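    To make the policy "decision tree" concrete: it is an algorithm that maps the observable characteristics of an inflow episode to a policy action. The sketch below is purely hypothetical; the branch structure and the actions are illustrative of how such a tree is organized, not the specific algorithm developed in the paper.

        from dataclasses import dataclass

        @dataclass
        class InflowEpisode:
            transitory: bool             # is the inflow expected to reverse soon?
            reserves_below_target: bool
            overheating: bool            # is aggregate demand above potential?
            stability_risk: bool         # credit booms, currency mismatches, ...

        def policy_response(ep: InflowEpisode) -> str:
            """Hypothetical decision tree for responding to a capital inflow."""
            if ep.transitory:
                if ep.reserves_below_target:
                    return "accumulate reserves via sterilized intervention"
                return "let the exchange rate absorb the shock"
            if ep.overheating:
                return "allow appreciation and/or tighten fiscal policy"
            if ep.stability_risk:
                return "macroprudential measures, possibly capital controls"
            return "absorb the inflow; no special response needed"

        print(policy_response(InflowEpisode(False, False, True, False)))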

    Discovery and characterization of artifactual mutations in deep coverage targeted capture sequencing data due to oxidative DNA damage during sample preparation

    As researchers begin probing deep coverage sequencing data for increasingly rare mutations and subclonal events, the fidelity of next generation sequencing (NGS) laboratory methods will become increasingly critical. Although error rates for sequencing and polymerase chain reaction (PCR) are well documented, the effects that DNA extraction and other library preparation steps could have on downstream sequence integrity have not been thoroughly evaluated. Here, we describe the discovery of novel C > A/G > T transversion artifacts found at low allelic fractions in targeted capture data. Characteristics such as sequencer read orientation and presence in both tumor and normal samples strongly indicated a non-biological mechanism. We identified the source as oxidation of DNA during acoustic shearing in samples containing reactive contaminants from the extraction process. We show generation of 8-oxoguanine (8-oxoG) lesions during DNA shearing, present analysis tools to detect oxidation in sequencing data and suggest methods to reduce DNA oxidation through the introduction of antioxidants. Further, informatics methods are presented to confidently filter these artifacts from sequencing data sets. Though only seen in a low percentage of reads in affected samples, such artifacts could have profoundly deleterious effects on the ability to confidently call rare mutations, and eliminating other possible sources of artifacts should become a priority for the research community.
    Funding: National Human Genome Research Institute (U.S.) (HG03067-05)
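    A common informatic signal for such artifacts is read-orientation bias: alt alleles created by 8-oxoG damage appear almost exclusively on one read orientation (e.g. F1R2 for C > A), whereas true variants are supported by both. The sketch below is a simplified illustration of that idea with hypothetical counts, not the authors' exact filter:

        def orientation_bias(alt_f1r2: int, alt_f2r1: int) -> float:
            """Fraction of alt-supporting reads on the dominant orientation.

            Values near 1.0 for a low-allelic-fraction C > A / G > T call
            suggest an oxidation artifact; values near 0.5 look biological.
            """
            total = alt_f1r2 + alt_f2r1
            return max(alt_f1r2, alt_f2r1) / total if total else 0.0

        # Hypothetical orientation-stratified counts for one candidate call.
        print(orientation_bias(alt_f1r2=18, alt_f2r1=1))   # ~0.95 -> suspect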

    Modeling biomedical experimental processes with OBI

    BACKGROUND: Experimental descriptions are typically stored as free text without using standardized terminology, creating challenges in comparison, reproduction and analysis. These difficulties impose limitations on data exchange and information retrieval. RESULTS: The Ontology for Biomedical Investigations (OBI), developed as a global, cross-community effort, provides a resource that represents biomedical investigations in an explicit and integrative framework. Here we detail three real-world applications of OBI, provide detailed modeling information and explain how to use OBI. CONCLUSION: We demonstrate how OBI can be applied to different biomedical investigations to both facilitate interpretation of the experimental process and increase the computational processing and integration within the Semantic Web. The logical definitions of the entities involved allow computers to unambiguously understand and integrate different biological experimental processes and their relevant components. AVAILABILITY: OBI is available at http://purl.obolibrary.org/obo/obi/2009-11-02/obi.owl
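    Since OBI is distributed as an OWL file, it can be loaded and queried with standard Semantic Web tooling. A minimal sketch (rdflib is our choice of library; the paper does not prescribe one) that loads the release named above and prints a few class labels:

        from rdflib import Graph
        from rdflib.namespace import OWL, RDF, RDFS

        # Parse the OBI release cited in the abstract (RDF/XML).
        g = Graph()
        g.parse("http://purl.obolibrary.org/obo/obi/2009-11-02/obi.owl",
                format="xml")

        # Print labels for the first few named OWL classes.
        shown = 0
        for cls in g.subjects(RDF.type, OWL.Class):
            for label in g.objects(cls, RDFS.label):
                print(cls, "->", label)
                shown += 1
                break
            if shown >= 10:
                break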