6,466 research outputs found

    Playing the wrong game: An experimental analysis of relational complexity and strategic misrepresentation

    Get PDF
    It has been suggested that players often produce simplified and/or misspecified mental representations of interactive decision problems (Kreps, 1990). We submit that the relational structure of players’ preferences in a game induces cognitive complexity, and may be an important driver of such simplifications. We provide a formal classification of order structures in two-person normal form games based on the two properties of monotonicity and projectivity, and present experiments in which subjects must first construct a representation of games of different relational complexity, and subsequently play the games according to their own representation. Experimental results support the hypothesis that relational complexity matters. More complex games are harder to represent, and this difficulty is correlated with measures of short term memory capacity. Furthermore, most erroneous representations are less complex than the correct ones. In addition, subjects who misrepresent the games behave consistently with such representations according to simple but rational decision criteria. This suggests that in many strategic settings individuals may act optimally on the ground of simplified and mistaken premises.
    Keywords: pure motive, mixed motive, preferences, bi-orders, language, cognition, projectivity, monotonicity, short term memory, experiments

    Learning Utilities and Equilibria in Non-Truthful Auctions

    Full text link
    In non-truthful auctions, agents' utility for a strategy depends on the strategies of the opponents and also the prior distribution over their private types; the set of Bayes Nash equilibria generally has an intricate dependence on the prior. Using the First Price Auction as our main demonstrating example, we show that $\tilde O(n / \epsilon^2)$ samples from the prior with $n$ agents suffice for an algorithm to learn the interim utilities for all monotone bidding strategies. As a consequence, this number of samples suffices for learning all approximate equilibria. We give an almost matching (up to polylog factors) lower bound on the sample complexity for learning utilities. We also consider settings where agents must pay a search cost to discover their own types. Drawing on a connection between this setting and the first price auction, discovered recently by Kleinberg et al. (2016), we show that $\tilde O(n / \epsilon^2)$ samples suffice for utilities and equilibria to be estimated in a near welfare-optimal descending auction in this setting. En route, we improve the sample complexity bound, recently obtained by Guo et al. (2019), for the Pandora's Box problem, which is a classical model for sequential consumer search.
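    As a rough illustration of the sample-based estimation described above (a minimal sketch with hypothetical names, not the paper's algorithm), the code below estimates a bidder's interim utility in a first price auction by Monte Carlo, given samples of the opponents' values drawn from the prior and an assumed monotone bidding strategy used by the opponents.

```python
import numpy as np

def estimate_interim_utility(v, b, opponent_value_samples, beta):
    """Monte Carlo estimate of a bidder's interim utility in a first price auction.

    v -- the bidder's own value
    b -- the bid whose utility is being estimated
    opponent_value_samples -- array of shape (m, n-1): m draws of the n-1
        opponents' values from the prior (ties are ignored for simplicity)
    beta -- a (monotone) bidding strategy assumed to be used by every opponent
    """
    opponent_bids = beta(opponent_value_samples)   # each opponent bids beta(value)
    wins = opponent_bids.max(axis=1) < b           # the bidder wins when b beats every rival bid
    return np.mean(wins * (v - b))                 # payoff is (v - b) on a win, 0 otherwise

# Hypothetical example: 3 opponents with Uniform[0, 1] values who bid 3/4 of their value.
rng = np.random.default_rng(0)
samples = rng.uniform(0.0, 1.0, size=(10_000, 3))
u_hat = estimate_interim_utility(v=0.8, b=0.6,
                                 opponent_value_samples=samples,
                                 beta=lambda x: 0.75 * x)
print(f"estimated interim utility: {u_hat:.4f}")
```

    Estimating this quantity over a grid of values and bids, for each monotone strategy of interest, is in spirit what a sample complexity bound of the kind quoted in the abstract controls.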

    An Alternative Approach to the Calculation and Analysis of Connectivity in the World City Network

    Full text link
    Empirical research on world cities often draws on Taylor's (2001) notion of an 'interlocking network model', in which office networks of globalized service firms are assumed to shape the spatialities of urban networks. In spite of its many merits, this approach is limited because the resultant adjacency matrices are not really fit for network-analytic calculations. We therefore propose a fresh analytical approach using a primary linkage algorithm that produces a one-mode directed graph based on Taylor's two-mode city/firm network data. The procedure has the advantage of creating less dense networks when compared to the interlocking network model, while nonetheless retaining the network structure apparent in the initial dataset. We randomize the empirical network with a bootstrapping simulation approach, and compare the simulated parameters of this null-model with our empirical network parameter (i.e. betweenness centrality). We find that our approach produces results that are comparable to those of the standard interlocking network model. However, because our approach is based on an actual graph representation and network analysis, we are able to assess cities' position in the network at large. For instance, we find that cities such as Tokyo, Sydney, Melbourne, Almaty and Karachi hold more strategic and valuable positions than suggested in the interlocking networks as they play a bridging role in connecting cities across regions. In general, we argue that our graph representation allows for further and deeper analysis of the original data, further extending world city network research into a theory-based empirical research approach.
    Comment: 18 pages, 9 figures, 2 tables
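    The sketch below, assuming hypothetical cities and edges rather than the empirical office-network data, illustrates the general workflow described above: compute betweenness centrality on a one-mode directed city graph and compare it with a degree-preserving randomized null model. The configuration-model rewiring used here is a generic stand-in for the authors' bootstrapping simulation.

```python
import random
import networkx as nx

# Hypothetical one-mode directed city network (placeholder edges, not the empirical data).
edges = [("London", "Tokyo"), ("Tokyo", "Sydney"), ("Sydney", "Melbourne"),
         ("London", "New York"), ("New York", "Karachi"), ("Karachi", "Almaty"),
         ("Almaty", "Tokyo"), ("Melbourne", "London")]
G = nx.DiGraph(edges)

# Empirical betweenness centrality of each city.
empirical_bc = nx.betweenness_centrality(G)

def null_betweenness(graph, n_sim=200, seed=0):
    """Betweenness values obtained from degree-preserving random rewirings of the graph."""
    random.seed(seed)
    sims = {node: [] for node in graph}
    for _ in range(n_sim):
        R = nx.directed_configuration_model(
            [d for _, d in graph.in_degree()],
            [d for _, d in graph.out_degree()],
            seed=random.randrange(10**6))
        R = nx.DiGraph(R)                                 # collapse parallel edges
        R.remove_edges_from(list(nx.selfloop_edges(R)))   # drop self-loops
        R = nx.relabel_nodes(R, dict(enumerate(graph.nodes())))
        for node, bc in nx.betweenness_centrality(R).items():
            sims[node].append(bc)
    return sims

sims = null_betweenness(G)
for city, bc in sorted(empirical_bc.items(), key=lambda kv: -kv[1]):
    mean_null = sum(sims[city]) / len(sims[city])
    print(f"{city:10s} empirical={bc:.3f}  null mean={mean_null:.3f}")
```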

    On Imperfect Recall in Multi-Agent Influence Diagrams

    Full text link
    Multi-agent influence diagrams (MAIDs) are a popular game-theoretic model based on Bayesian networks. In some settings, MAIDs offer significant advantages over extensive-form game representations. Previous work on MAIDs has assumed that agents employ behavioural policies, which set independent conditional probability distributions over actions for each of their decisions. In settings with imperfect recall, however, a Nash equilibrium in behavioural policies may not exist. We overcome this by showing how to solve MAIDs with forgetful and absent-minded agents using mixed policies and two types of correlated equilibrium. We also analyse the computational complexity of key decision problems in MAIDs, and explore tractable cases. Finally, we describe applications of MAIDs to Markov games and team situations, where imperfect recall is often unavoidable.
    Comment: In Proceedings TARK 2023, arXiv:2307.0400
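    As a minimal illustration of how behavioural and mixed policies come apart under absent-mindedness (this is the classic single-agent absent-minded driver example, not a MAID or the solution method from the paper), the sketch below compares the best behavioural policy, which randomizes independently at every visit to the decision node, with the best mixed policy, which randomizes once over pure policies.

```python
import numpy as np

# Absent-minded driver: two indistinguishable intersections; exiting at the first
# pays 0, exiting at the second pays 4, continuing past both pays 1.

def behavioural_value(p):
    """Expected payoff when the driver continues with probability p at every intersection."""
    exit_second = p * (1 - p) * 4       # continue once, then exit
    continue_both = p * p * 1           # continue twice
    return exit_second + continue_both  # exiting at the first intersection pays 0

ps = np.linspace(0.0, 1.0, 1001)
values = behavioural_value(ps)
best_p = ps[np.argmax(values)]
print(f"best behavioural policy: continue with p = {best_p:.2f}, value = {values.max():.3f}")

# A mixed policy randomizes once over the pure policies "always exit" (payoff 0) and
# "always continue" (payoff 1), so its value can never exceed 1 -- strictly worse here.
print("best mixed-policy value:", max(0, 1))
```

    In the multi-agent setting the paper considers, the comparison runs the other way: equilibria in behavioural policies may fail to exist under imperfect recall, which is why the authors turn to mixed policies and correlated equilibria.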

    Unitary versus collective models of the household: time to shift the burden of proof?

    Get PDF
    Until recently, most economists viewed the household as a collection of individuals who behave as if in agreement on how best to combine time and goods (purchased or produced at home) to produce commodities that maximize some common welfare index. This model has been extended far beyond standard demand analysis to include the determinants of health, fertility, education, child fostering, migration, labor supply, home production, land tenure, and crop adoption. The appeal of the unitary model is the simplicity of comparative statics generated and the diversity of issues it can address. But, argue the authors, its theoretical foundations are weak and restrictive; its underlying assumptions are of questionable validity; it has not stood up well to empirical testing; and it ignores or obscures important policy issues. They argue that economists should regard households as collective rather than unitary entities. They make a case for accepting the collective model (with cooperative and noncooperative versions) as the industry standard - with caveats. The unitary model should be regarded as a special subset of the collective approach, suitable under certain conditions. The burden of proof should shift to those who claim the unitary model as the rule and collective models as the exception. Implicit in the authors' argument is the view that household economics has not taken Becker seriously enough. "A household is truly a 'small factory,'" wrote Becker (1965). "It combines capital goods, raw materials, and labor to clean, feed, procreate, and otherwise produce useful commodities." The authors, too, perceive the household as a factory, which, like all factories, contains individuals who - motivated at times by altruism, at times by self-interest, and often by both - cajole, cooperate, threaten, help, argue, support, and, indeed, occasionally walk out on each other. Labor economists and industrial organization theorists have long exploited the value of going inside the black box of the factory. It is time to do the same for household economics, say the authors.
    Subjects: Health Economics & Finance, Environmental Economics & Policies, Poverty Lines, Housing & Human Habitats, Educational Technology and Distance Education

    Learning trajectories related to bivariate data in contemporary high school mathematics textbook series in the United States

    Get PDF
    Bivariate relationships play a critical role in school statistics, and textbooks are significant in determining student learning. In recent years, researchers have emphasized the importance of learning trajectories (LTs) in mathematics education. In this study, I examined LTs for bivariate data in relation to the development of covariational reasoning in three high school textbook series: Holt McDougal Larson (HML), The University of Chicago School of Mathematics Project (UCSMP), and Core-Plus Mathematics Project (CPMP). The LTs were generated by coding for the presence of variable combinations, learning goals, and techniques and theories. Task features were analyzed in relation to the GAISE Framework, NAEP mathematical complexity, purpose and utility, and the CCSSM Standards for Mathematical Practice. The LTs varied in the presence, development, and emphases of bivariate content and in alignment with the GAISE Framework and CCSSM. Across the three series, about 80% to 90% of the 582 bivariate instances addressed two numerical variables. The CPMP series followed the GAISE developmental progression for all combinations, whereas UCSMP deviated for two categorical variables. All CCSSM learning expectations were found in HML and CPMP but not in UCSMP. At the same time, several bivariate learning expectations present in the textbooks were not found in CCSSM. As for the task features, few instances were at a high level of mathematical complexity, and they rarely included a Collect Data component. Analyses revealed accordance between the GAISE and mathematical complexity frameworks. Research findings provide implications for curriculum development, content analysis, and teacher education, and challenge the notion of CCSSM-aligned curricula.
    Includes bibliographical references (pages 235-246)

    Violence indicators in Quebrada de Humahuaca, Jujuy, Argentina: The Regional Development Period from a regional perspective

    Get PDF
    Quebrada de Humahuaca (Jujuy, Argentina) has been extensively studied by archaeologists. Studies have focused mainly on the Late Regional Development Period (1250-1430 AD), which has been defined as a time of social conflict. In this paper we present bioarchaeological evidence of interpersonal violence-related trauma found in populations of the region. A sample of 153 skulls from three sites of Quebrada de Humahuaca (Los Amarillos, La Huerta and Yacoraite) was analyzed, differentiating antemortem and perimortem fractures and cut marks, as well as the presence of trophy skulls. The results were subjected to nonparametric statistical tests in order to assess inter-site differences and sex and age distribution. Bioarchaeological analysis determined a high frequency of interpersonal violence-related trauma. Most registered injuries were antemortem, demonstrating that the individuals had survived the events that generated the cranial trauma. Interpersonal violence affected men and women alike, with no differences by sex or age group; however, evidence of trauma varied geographically from site to site. Statistical calculations reveal that the Yacoraite site is where the highest frequency of trauma was found, while La Huerta is where the highest number of trophy skulls was registered.
    Fil: Seldes, Verónica. Universidad de Buenos Aires. Facultad de Filosofía y Letras. Departamento de Ciencias Antropológicas; Argentina. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina
    Fil: Botta, Florencia Natalia. Universidad de Buenos Aires. Facultad de Filosofía y Letras; Argentina
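    A minimal sketch of the kind of nonparametric inter-site comparison mentioned in the abstract, using entirely hypothetical trauma counts (not the study's data) and a chi-square test of independence:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts (NOT the study's data): skulls with and without
# violence-related trauma at each of the three sites.
counts = {
    "Los Amarillos": (12, 38),   # (with trauma, without trauma)
    "La Huerta":     (15, 45),
    "Yacoraite":     (20, 23),
}

table = [list(v) for v in counts.values()]
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
```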