
    English Auctions with toeholds: An experimental study

    We run experiments on English auctions in which the bidders already own a part (toehold) of the good for sale. Theory predicts a very strong effect of even small toeholds; however, we find that the effects are not as strong in the lab. We explain this by analyzing the flatness of the payoff functions, which makes deviations from the equilibrium strategies relatively costless. We find that a levels-of-reasoning model explains the results better than the Nash equilibrium does. Moreover, we find that although large toeholds can be effective, the cost of acquiring them may exceed the strategic benefit they bring. Finally, our results show that, in general, the seller's revenues fall when the playing field is uneven.
    Keywords: experiments, toehold auction, takeover, payoff, flatness, quantal response, level-k, LeeX
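
    To see why flat payoffs can make deviations cheap, the following minimal Monte Carlo sketch (Python, not the authors' code) computes a bidder's expected payoff in a two-bidder toehold auction when she drops out at various multiples of her value, against an opponent who naively drops out at his own value; the uniform value distribution, the 10% toehold, and the naive opponent are illustrative assumptions rather than the paper's experimental design.

        import numpy as np

        rng = np.random.default_rng(0)

        def expected_payoff(scale, toehold, n=200_000):
            """Expected payoff of bidder 1, who owns a fraction `toehold` of the good
            and drops out at `scale * value`, against an opponent who drops out at
            his own value. The good goes to the last remaining bidder at the price
            where the other dropped out (English-auction logic)."""
            v1 = rng.uniform(size=n)   # bidder 1's private value
            v2 = rng.uniform(size=n)   # bidder 2's private value and drop-out price
            b1 = scale * v1            # bidder 1's drop-out price
            win = b1 > v2
            # Winning: pay the opponent's drop-out price, but only for the share not already owned.
            payoff_win = v1 - (1 - toehold) * v2
            # Losing: the toehold share is sold at bidder 1's own drop-out price.
            payoff_lose = toehold * b1
            return np.mean(np.where(win, payoff_win, payoff_lose))

        for scale in (1.0, 1.05, 1.10, 1.15, 1.20):
            print(f"drop out at {scale:.2f} * value -> expected payoff ~ {expected_payoff(scale, 0.10):.4f}")

    Running this shows the expected payoff barely moving across a fairly wide range of overbidding factors, which is the kind of payoff flatness the abstract invokes to explain relatively costless deviations from equilibrium play.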

    The Effects of the Economic Crisis on Greek Heritage: A View from the Private Cultural Sector

    N/A

    Coalition Formation in a Legislative Voting Game

    We experimentally investigate the Jackson-Moselle (2002) model, in which legislators bargain over policy proposals and the allocation of private goods. Key comparative static predictions of the model hold: with private goods, policy proposals shift in the predicted direction and the variance in policy outcomes increases as well. Private goods raise total welfare even after accounting for their cost and help secure legislative compromise. Coalition formation is better characterized by an efficient equal split between coalition partners than by the stationary subgame perfect equilibrium prediction.
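
    The bargaining structure described here can be made concrete with a hypothetical, heavily simplified one-round sketch in Python: a randomly recognized proposer offers a policy point plus a division of a fixed pool of private goods, and the proposal passes under majority rule. The three legislators, the quadratic policy loss, the linear value of private goods, and the status-quo payoff are all illustrative assumptions, not the Jackson-Moselle specification or the paper's experimental design.

        import random

        IDEAL_POINTS = [0.0, 0.5, 1.0]   # assumed ideal policies of legislators 0, 1, 2
        BUDGET = 1.0                     # total private goods to allocate
        STATUS_QUO = -0.3                # assumed payoff each legislator gets if the proposal fails

        def utility(i, policy, transfer):
            """Quadratic loss in policy distance plus linear value of private goods (illustrative)."""
            return -(policy - IDEAL_POINTS[i]) ** 2 + transfer

        def propose(proposer):
            """Proposer offers her ideal policy, gives the cheapest partner just enough
            private goods to match the status quo, and keeps the remainder."""
            policy = IDEAL_POINTS[proposer]
            price = {j: max(0.0, STATUS_QUO - utility(j, policy, 0.0))
                     for j in range(3) if j != proposer}
            partner = min(price, key=price.get)
            transfers = [0.0, 0.0, 0.0]
            transfers[partner] = price[partner]
            transfers[proposer] = BUDGET - price[partner]
            return policy, transfers

        def passes(policy, transfers):
            """Majority rule: at least two legislators weakly prefer the proposal to the status quo."""
            return sum(utility(j, policy, transfers[j]) >= STATUS_QUO for j in range(3)) >= 2

        proposer = random.randrange(3)
        policy, transfers = propose(proposer)
        print(f"proposer {proposer}: policy={policy}, transfers={transfers}, passes={passes(policy, transfers)}")

    Even this toy version shows how private goods let a proposer buy the support of a partner whose policy preferences differ from hers, which is the compromise-securing role the abstract attributes to them.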

    Extreme Scale De Novo Metagenome Assembly

    Metagenome assembly is the process of transforming a set of short, overlapping, and potentially erroneous DNA segments from environmental samples into an accurate representation of the underlying microbiomes' genomes. State-of-the-art tools require big shared-memory machines and cannot handle contemporary metagenome datasets that exceed terabytes in size. In this paper, we introduce the MetaHipMer pipeline, a high-quality and high-performance metagenome assembler that employs an iterative de Bruijn graph approach. MetaHipMer leverages a specialized scaffolding algorithm that produces long scaffolds and accommodates the idiosyncrasies of metagenomes. MetaHipMer is end-to-end parallelized using the Unified Parallel C language and therefore runs seamlessly on shared- and distributed-memory systems. Experimental results show that MetaHipMer matches or outperforms state-of-the-art tools in terms of accuracy. Moreover, MetaHipMer scales efficiently to large concurrencies and is able to assemble previously intractable grand-challenge metagenomes. We demonstrate the unprecedented capability of MetaHipMer by computing the first full assembly of the Twitchell Wetlands dataset, consisting of 7.5 billion reads (2.6 TB in size).
    Comment: Accepted to SC1
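
    As a purely didactic illustration of the de Bruijn graph idea at the heart of such assemblers (and emphatically not MetaHipMer's distributed UPC implementation), the toy Python sketch below builds a graph whose nodes are (k-1)-mers, links them through the k-mers seen in the reads, and greedily walks unambiguous paths into a contig; error filtering, read pairing, iterative k values, and scaffolding are all omitted.

        from collections import defaultdict

        def build_de_bruijn(reads, k):
            """Toy de Bruijn graph: each k-mer in a read adds an edge from its
            prefix (k-1)-mer to its suffix (k-1)-mer."""
            graph = defaultdict(set)
            for read in reads:
                for i in range(len(read) - k + 1):
                    kmer = read[i:i + k]
                    graph[kmer[:-1]].add(kmer[1:])
            return graph

        def walk_contig(graph, start, max_len=10_000):
            """Greedily extend a contig from `start` along unambiguous out-edges
            (a simplification: in-degrees and branching tips are ignored)."""
            contig, node = start, start
            while len(graph[node]) == 1 and len(contig) < max_len:
                node = next(iter(graph[node]))
                contig += node[-1]
            return contig

        reads = ["ATGGCGTGCA", "GGCGTGCAAT", "GTGCAATTCG"]
        graph = build_de_bruijn(reads, k=5)
        print(walk_contig(graph, "ATGG"))   # reconstructs the merged sequence of the three overlapping reads

    Real metagenome assemblers face the same kind of graph at the scale of billions of reads, which is what motivates the distributed-memory parallelization described in the abstract.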