    Systems-compatible Incentives

    Originally, the Internet was a technological playground, a collaborative endeavor among researchers who shared the common goal of achieving communication. Self-interest was not originally a concern, but the motivations of the Internet's participants have broadened. Today, the Internet consists of millions of commercial entities and nearly 2 billion users, who often have conflicting goals. For example, while Facebook gives users the illusion of access control, users do not have the ability to control how the personal data they upload is shared or sold by Facebook. Even in BitTorrent, where all users seemingly have the same motivation of downloading a file as quickly as possible, users can subvert the protocol to download more quickly without giving their fair share. These examples demonstrate that merely technologically proficient protocols are not enough. Successful networked systems must account for potentially competing interests. In this dissertation, I demonstrate how to build systems that give users incentives to follow the systems' protocols. To achieve incentive-compatible systems, I apply mechanisms from game theory and auction theory to protocol design. This approach has been considered in prior literature, but unfortunately has resulted in few real, deployed systems with incentives to cooperate. I identify the primary challenge in applying mechanism design and game theory to large-scale systems: the goals and assumptions of economic mechanisms often do not match those of networked systems. For example, while auction theory may assume a centralized clearing house, there is no analog in a decentralized system seeking to avoid single points of failure or centralized policies. Similarly, game theory often assumes that each player is able to observe everyone else's actions, or at the very least to know how many other players there are, but maintaining perfect system-wide information is impossible in most systems.
In other words, not all incentive mechanisms are systems-compatible. The main contribution of this dissertation is the design, implementation, and evaluation of various systems-compatible incentive mechanisms and their application to a wide range of deployable systems. These systems include BitTorrent, which is used to distribute a large file to a large number of downloaders, PeerWise, which leverages user cooperation to achieve lower latencies in Internet routing, and Hoodnets, a new system I present that allows users to share their cellular data access to obtain greater bandwidth on their mobile devices. Each of these systems represents a different point in the design space of systems-compatible incentives. Taken together, along with their implementations and evaluations, these systems demonstrate that systems-compatibility is crucial in achieving practical incentives in real systems. I present design principles outlining how to achieve systems-compatible incentives, which may serve an even broader range of systems than considered herein. I conclude this dissertation with what I consider to be the most important open problems in aligning the competing interests of the Internet's participants.
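    The abstract's tension between auction theory and decentralized systems is easiest to see against the canonical incentive-compatible mechanism it alludes to: the sealed-bid second-price (Vickrey) auction, where truthful bidding is a dominant strategy but a trusted centralized clearing house is assumed. A minimal sketch (the bidder names and values are illustrative, not from the dissertation):

```python
# Sealed-bid second-price (Vickrey) auction: the highest bidder wins
# but pays only the second-highest bid, so no bidder can gain by
# misreporting their true value. Note the mechanism relies on a
# trusted clearing house that collects all bids -- the centralized
# assumption the abstract says has no decentralized analog.

def vickrey_auction(bids):
    """bids: dict mapping bidder name -> bid amount.
    Returns (winner, price) where price is the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0
    return winner, price

result = vickrey_auction({"alice": 10, "bob": 7, "carol": 4})
# -> ("alice", 7): alice wins but pays bob's bid, so shading her own
# bid below her true value cannot improve her outcome.
```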

    How Social Reputation Networks Interact with Competition in Anonymous Online Trading: An Experimental Study

    Many Internet markets rely on ‘feedback systems’, essentially social networks of reputation, to facilitate trust and trustworthiness in anonymous transactions. Market competition creates incentives that arguably may enhance or curb the effectiveness of these systems. We investigate how different forms of market competition and social reputation networks interact in a series of laboratory online markets, where sellers face a moral hazard. We find that competition in strangers networks (where market encounters are one-shot) most frequently enhances trust and trustworthiness, and always increases total gains-from-trade. One reason is that information about reputation trumps pricing, in the sense that traders usually do not conduct business with someone having a bad reputation, not even for a substantial price discount. We also find that a reliable reputation network can largely reduce the advantage of partners networks (where a buyer and a seller can maintain repeated exchange with each other) in promoting trust and trustworthiness if the market is sufficiently competitive. We conclude that, overall, competitive online markets have more effective social reputation networks.

    Keywords: reputation systems, e-commerce, internet markets, trust

    Social Reciprocity

    We conduct a survey and find that 47% of respondents state they would sanction free riders in a team production scenario even though the respondent was not personally affected and no direct benefits could be expected to follow an intervention. To understand this phenomenon, we define social reciprocity as the act of demonstrating one's disapproval, at some personal cost, for the violation of a widely-held norm (for example, "don't free ride"). Social reciprocity differs from reciprocity because social reciprocators punish all norm violators, regardless of group affiliation or whether or not the punisher bears the costs. Social reciprocity also differs from altruism because, while the latter is an outcome-oriented act benefiting someone else, the former is a triggered response not conditioned on future outcomes. To test the robustness of our survey results, we run a public goods experiment that allows players to punish each other. The experiment confirms the existence of social reciprocity and additionally demonstrates that more socially efficient outcomes arise when reciprocity can be expressed socially. Further, we find that most subjects who punish do so to discipline transgressors, and that helping others is largely a positive externality. Finally, to provide some theoretical foundations for social reciprocity, we show that generalized punishment norms survive in one of the two stable equilibria of an evolutionary public goods game with selection drift.

    Keywords: reciprocity, norm, experiment, public good, learning, evolution
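    The public goods experiment with punishment described above follows a standard structure: each player divides an endowment between a private account and a shared pool, the pool is multiplied and split equally, and players may then pay to deduct points from others. A minimal sketch; the parameter values (endowment 20, multiplier 1.6, punishment costing the punisher 1 point per 3 points deducted) are common values from the experimental literature, assumed here rather than taken from the abstract:

```python
# Linear public goods game with peer punishment (illustrative
# parameters; the abstract does not specify the experimental design).

ENDOWMENT = 20         # points each player receives per round
MULTIPLIER = 1.6       # total contributions are scaled by this factor
COST_PER_PUNISH = 1    # punisher pays 1 point per punishment point...
FINE_PER_PUNISH = 3    # ...to deduct 3 points from the target

def payoffs(contributions, punishments):
    """contributions: list of points each player puts in the pool.
    punishments: matrix p[i][j] = punishment points i assigns to j.
    Returns each player's round payoff after punishment."""
    n = len(contributions)
    share = MULTIPLIER * sum(contributions) / n   # equal split of the pool
    pay = [ENDOWMENT - c + share for c in contributions]
    for i in range(n):
        for j in range(n):
            pay[i] -= COST_PER_PUNISH * punishments[i][j]  # cost to punisher
            pay[j] -= FINE_PER_PUNISH * punishments[i][j]  # fine on target
    return pay

# Player 2 free rides; player 0 sanctions at personal cost even though
# the fine benefits the group rather than the punisher alone -- the
# "social reciprocity" pattern the experiment confirms.
contribs = [20, 20, 0]
punish = [[0, 0, 3], [0, 0, 0], [0, 0, 0]]
pay = payoffs(contribs, punish)
```

Note that punishing is strictly costly to the punisher here, which is why its prevalence in the data supports norm enforcement rather than self-interest.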