71 research outputs found

    Nash implementation with little communication

    Get PDF
    The paper considers the communication complexity (measured in bits or real numbers) of Nash implementation of social choice rules. A key distinction is whether we restrict to the traditional one-stage mechanisms or allow multi-stage mechanisms. For one-stage mechanisms, the paper shows that for a large and important subclass of monotonic choice rules -- called "intersection monotonic" -- the total message space size needed for one-stage Nash implementation is essentially the same as that needed for "verification" (with honest agents who are privately informed about their preferences). According to Segal (2007), the latter is the size of the space of minimally informative budget equilibria verifying the choice rule. However, multi-stage mechanisms allow a drastic reduction in communication complexity. Namely, for an important subclass of intersection-monotonic choice rules (which includes rules based on coalitional blocking such as exact or approximate Pareto efficiency, stability, and envy-free allocations) we propose a two-stage Nash implementation mechanism in which each agent announces no more than two alternatives plus one bit per agent in any play. Such two-stage mechanisms bring about an exponential reduction in the communication complexity of Nash implementation for discrete communication measured in bits, or a reduction from infinite- to low-dimensional continuous communication.

    Keywords: Monotonic social choice rules, Nash implementation, communication complexity, verification, realization, budget sets, price equilibria

    A simple status quo that ensures participation (with application to efficient bargaining)

    Get PDF
    We consider Bayesian incentive-compatible mechanisms with independent types and either private values or interdependent values that satisfy a form of "congruence." We show that in these settings, interim participation constraints are satisfied when the status quo is the randomized allocation that has the same distribution as the equilibrium allocation in the mechanism. Moreover, when utilities are convex in the allocation, we can instead satisfy participation constraints with the deterministic status quo equal to the expected equilibrium allocation in the mechanism. For quasilinear settings, these observations imply the possibility of efficient bargaining when the status quo specifies the expected efficient decision, provided that the total surplus is convex in the decision.

    Keywords: Efficient property rights, asymmetric information bargaining, transaction costs
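    The convexity step in the abstract can be made explicit with Jensen's inequality (our notation, not necessarily the paper's): if agent $i$'s utility $u_i$ is convex in the allocation and $x^*$ denotes the random equilibrium allocation, then

    ```latex
    % Sketch of the convexity argument (illustrative notation):
    % the deterministic status quo E[x*] yields each agent a weakly
    % lower outside option than the randomized status quo distributed as x*.
    \[
      u_i\bigl(\mathbb{E}[x^*]\bigr) \;\le\; \mathbb{E}\bigl[u_i(x^*)\bigr],
    \]
    ```

    so replacing the randomized status quo with its expectation weakly relaxes each interim participation constraint, which is why the deterministic status quo suffices under convexity.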

    Looking at Cerebellar Malformations through Text-Mined Interactomes of Mice and Humans

    Get PDF
    We have generated and made publicly available two very large networks of molecular interactions: 49,493 mouse-specific and 52,518 human-specific interactions. These networks were generated through automated analysis of 368,331 full-text research articles and 8,039,972 article abstracts from the PubMed database, using the GeneWays system. Our networks cover a wide spectrum of molecular interactions, such as bind, phosphorylate, glycosylate, and activate; 207 of these interaction types occur more than 1,000 times in our unfiltered, multi-species data set. Because mouse and human genes are linked through an orthologous relationship, human and mouse networks are amenable to straightforward, joint computational analysis. Using our newly generated networks and known associations between mouse genes and cerebellar malformation phenotypes, we predicted a number of new associations between genes and five cerebellar phenotypes (small cerebellum, absent cerebellum, cerebellar degeneration, abnormal foliation, and abnormal vermis). Using a battery of statistical tests, we showed that genes that are associated with cerebellar phenotypes tend to form compact network clusters. Further, we observed that cerebellar malformation phenotypes tend to be associated with highly connected genes. This tendency was stronger for developmental phenotypes and weaker for cerebellar degeneration.

    Probabilistic Inference of Transcription Factor Binding from Multiple Data Sources

    Get PDF
    An important problem in molecular biology is to build a complete understanding of transcriptional regulatory processes in the cell. We have developed a flexible, probabilistic framework to predict TF binding from multiple data sources that differs from the standard hypothesis-testing (scanning) methods in several ways. Our probabilistic modeling framework estimates the probability of binding and thus naturally reflects our degree of belief in binding. Probabilistic modeling also allows for easy and systematic integration of our binding predictions into other probabilistic modeling methods, such as expression-based gene network inference. The method answers the question of whether the whole analyzed promoter has a binding site, but can also be extended to estimate the binding probability at each nucleotide position. Further, we introduce an extension to model combinatorial regulation by several TFs. Most importantly, the proposed methods can make principled probabilistic inference from multiple evidence sources, such as multiple statistical models (motifs) of the TFs, evolutionary conservation, regulatory potential, CpG islands, nucleosome positioning, DNase hypersensitive sites, ChIP-chip binding segments, and other (prior) sequence-based biological knowledge. We developed both a likelihood and a Bayesian method, where the latter is implemented with a Markov chain Monte Carlo algorithm. Results on a carefully constructed test set from the mouse genome demonstrate that principled data fusion can significantly improve the performance of TF binding prediction methods. We also applied the probabilistic modeling framework to all promoters in the mouse genome, and the results indicate a sparse connectivity between transcriptional regulators and their target promoters.
To facilitate analysis of other sequences and additional data, we have developed an online web tool, ProbTF, which implements our probabilistic TF binding prediction method using multiple data sources. The test data set, web tool, source code, and supplementary data are available at http://www.probtf.org.
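The style of evidence fusion the abstract describes can be illustrated with a minimal, hypothetical sketch: a naive-Bayes-style combination of independent evidence sources into a posterior binding probability. The function name, prior, and likelihood-ratio values below are our own illustration, not ProbTF's actual model:

```python
def binding_posterior(prior, likelihood_ratios):
    """Combine a prior P(bound) with per-source likelihood ratios
    P(evidence | bound) / P(evidence | not bound), assuming the
    sources are conditionally independent given the binding state."""
    odds = prior / (1.0 - prior)          # prior odds of binding
    for lr in likelihood_ratios:
        odds *= lr                        # Bayes update per evidence source
    return odds / (1.0 + odds)            # back to a probability

# Example (made-up numbers): a weak prior, supported by a strong motif
# hit and conservation, weakened by low regulatory potential.
p = binding_posterior(0.05, [8.0, 3.0, 0.6])
```

With these numbers the evidence lifts the binding probability from 5% to roughly 43%; the point of the sketch is only that heterogeneous sources enter symmetrically as likelihood ratios, which is the sense in which such fusion is "principled."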

    Monopoly and Soft Budget Constraint

    No full text
    A benevolent government may decide to subsidize an unprofitable monopoly whose profits do not capture all the social surplus from its production. Anticipating this, the firm may underinvest in order to become unprofitable and extract state subsidies. The resulting welfare loss may exceed by many times the deadweight cost of monopoly pricing. Committing the firm to a price ceiling may soften its budget constraint and thus reduce welfare. Competition can harden budget constraints in industries in which free entry is socially excessive.