
    A study of memory effects in a chess database

    A series of recent works studying a database of chronologically sorted chess games (1.4 million games played by humans between 1998 and 2007) has shown that the popularity distribution of chess game-lines follows Zipf's law, and that time series inferred from the sequences of those game-lines exhibit long-range memory effects. Zipf's law and long-range memory effects have each been observed in several systems, but until now their simultaneous emergence has only been studied separately. In this work, using a variant of the Yule-Simon preferential growth model introduced by Cattuto et al., we provide an explanation for the simultaneous emergence of Zipf's law and long-range memory effects in a chess database. We find that Cattuto's model (CM) reproduces both Zipf's law and the long-range correlations, including the size-dependent scaling of the Hurst exponent for the corresponding time series. CM explains the simultaneous emergence of these two phenomena via a preferential growth dynamics with a memory kernel acting on the popularity distribution of chess game-lines. This mechanism results in an aging process in the choice of game-lines as the database grows. Moreover, we find burstiness in the activity of subsets of the most active players, although the aggregated activity of the pool of players displays inter-event times without burstiness. We show that CM cannot produce time series with bursty behavior, providing evidence that burstiness is not required to explain the long-range correlation effects in the chess database. Comment: 18 pages, 7 figures
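    The preferential growth dynamics with a memory kernel described above can be sketched in a few lines. This is a minimal illustration of a Cattuto-style Yule-Simon process, not the parameterisation fitted to the chess database: `p_new` and `tau` are illustrative values, and items stand in for chess game-lines.

```python
import random

def cattuto_model(n_steps, p_new=0.1, tau=50.0, seed=0):
    """Yule-Simon process with a hyperbolic memory kernel (Cattuto et
    al. style): with probability p_new a brand-new item is introduced;
    otherwise an earlier item is recopied with probability decaying as
    ~ 1 / (age + tau), so recent items are preferred. This combination
    yields a Zipf-like popularity distribution together with long-range
    memory in the generated sequence. O(n^2), fine for a small demo."""
    rng = random.Random(seed)
    seq = []       # sequence of item ids (stand-ins for game-lines)
    next_id = 0
    for t in range(n_steps):
        if not seq or rng.random() < p_new:
            seq.append(next_id)   # introduce a new item
            next_id += 1
        else:
            # recopy a past occurrence, weighted by the memory kernel
            weights = [1.0 / (t - i + tau) for i in range(len(seq))]
            seq.append(rng.choices(seq, weights=weights)[0])
    return seq
```

Because copies are drawn from past occurrences, popular items accumulate further copies (preferential growth), while the kernel makes old items progressively harder to revive, producing the aging effect mentioned in the abstract.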

    Memory and long-range correlations in chess games

    In this paper we report the existence of long-range memory in the opening moves of a chronologically ordered set of chess games, using an extensive chess database. We used two mapping rules to build discrete time series and analyzed them with two methods for detecting long-range correlations: rescaled range analysis and detrended fluctuation analysis. We found that long-range memory is related to the level of the players. When the database is filtered by player level, we found differences in the persistence of the different subsets: for high-level players, correlations are stronger at long time scales, whereas for intermediate- and low-level players they reach their maximum value at shorter time scales. This can be interpreted as a signature of the different strategies used by players with different levels of expertise. These results are robust against the mapping rules and the method employed in the analysis of the time series. Comment: 12 pages, 5 figures. Published in Physica
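    Detrended fluctuation analysis, one of the two methods named above, can be sketched compactly. This is a generic order-1 DFA implementation, not the paper's code; the window sizes are illustrative choices.

```python
import numpy as np

def dfa(x, scales):
    """Order-1 detrended fluctuation analysis: integrate the series,
    split the profile into windows of size s, remove a linear trend in
    each window, and return the rms fluctuation F(s) per scale."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())
    F = []
    for s in scales:
        n_win = len(profile) // s
        t = np.arange(s)
        sq = []
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            coef = np.polyfit(t, seg, 1)            # local linear trend
            sq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(sq)))
    return np.array(F)

def dfa_exponent(x, scales=(16, 32, 64, 128)):
    """Slope of log F(s) vs log s: ~0.5 for uncorrelated noise,
    >0.5 for persistent, long-range correlated series."""
    F = dfa(x, scales)
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope
```

A scaling exponent above 0.5 on the move-derived time series is what signals the persistence (long-range memory) discussed in the abstract.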

    Popular music and/as event: subjectivity, love and fidelity in the aftermath of rock ’n’ roll

    This article concerns the usefulness of attaching a philosophy of the event to popular music studies. I am attempting to think about the ways that rock ’n’ roll functions as a musical revolution that becomes subjected to a narrative of loss, accompanying the belief that the revolution has floundered or even disappeared completely. In order to think about what this narrative of loss might entail, I have found myself going back to the emergence of rock ’n’ roll, to what we might term its ‘event’, and then working towards the present to take stock of the current situation. The article is divided into three parts. Part One attempts to think of the emergence of rock ’n’ roll and its attendant discourse alongside Alain Badiou’s notion of event, looking at ways in which listening subjects are formed. Part Two continues the discussion of listening subjectivity while shifting the focus to objects associated with phonography. Part Three attends to a number of difficulties encountered in the Badiouian project and asks to what extent rock music might be thought of as a lost cause. All three parts deal with notions of subjectivity, love and fidelity

    Negativity effect and the emergence of ideologies

    "Negativity effect" refers to the psychological phenomenon that people tend to attach greater weight to negative information than to equally extreme and equally likely positive information in a variety of information-processing tasks. Numerous studies of impression formation have found that negative information is weighted more heavily than positive information as impressions of others are formed. There is empirical evidence in political science showing the importance of the negativity effect in the information processing of voters. This effect can explain the observed decrease in a president's popularity the longer he is in office. We construct a dynamic model of political competition, incorporating the negativity effect into the decision rule of the voters and allowing their preferences to change over time according to the past performance of the candidates while in office. Our model may explain the emergence of ideologies out of the competition for votes among myopic candidates freely choosing policy positions. This result gives rise to the formation of political parties, as infinitely-lived agents with a certain ideology. Furthermore, in this model some voters may start out by switching among parties associated with different policies, but find themselves supporting one of the parties from some point on. Thus, the model describes a process by which some voters become identified with a "right" or "left" bloc, while others "swing" between the two parties. Keywords: negativity effect, formation of ideologies
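    The core asymmetry in the voters' decision rule can be illustrated with a toy update. This is not the paper's actual model; `neg_weight` and `lr` are hypothetical parameters chosen only to show why symmetric performance still erodes incumbent support.

```python
def updated_support(support, performance, neg_weight=2.0, lr=0.1):
    """Toy negativity-effect rule: a voter updates support for an
    incumbent, weighting negative performance signals neg_weight
    times more heavily than equally sized positive ones."""
    w = neg_weight if performance < 0 else 1.0
    return support + lr * w * performance

s = 0.5
for perf in [0.1, -0.1] * 20:   # equally extreme ups and downs
    s = updated_support(s, perf)
# support drifts downward even though performance averages to zero
```

Each (+0.1, -0.1) pair moves support by +0.01 then -0.02, a net -0.01, which is the mechanism behind the declining popularity of incumbents noted in the abstract.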

    A Simple Generative Model of Collective Online Behaviour

    Human activities increasingly take place in online environments, providing novel opportunities for relating individual behaviours to population-level outcomes. In this paper, we introduce a simple generative model for the collective behaviour of millions of social networking site users who are deciding between different software applications. Our model incorporates two distinct components: one is associated with recent decisions of users, and the other reflects the cumulative popularity of each application. Importantly, although various combinations of the two mechanisms yield long-time behaviour that is consistent with data, the only models that reproduce the observed temporal dynamics are those that strongly emphasize the recent popularity of applications over their cumulative popularity. This demonstrates, even when using purely observational data without experimental design, that temporal data-driven modelling can effectively distinguish between competing microscopic mechanisms, allowing us to uncover new aspects of collective online behaviour. Comment: Updated, with new figures and Supplementary Information
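    The two-component choice mechanism can be sketched as follows. This is a schematic in the spirit of the model, not the paper's implementation: the mixing weight, window size, and smoothing counts are all illustrative assumptions.

```python
import random

def simulate_adoptions(n_users, n_apps, recency_weight=0.9,
                       window=100, seed=0):
    """Each arriving user picks an app with probability proportional
    to a mix of its recent popularity (choices within the last
    `window` decisions) and its cumulative popularity. A large
    recency_weight emphasizes recent decisions, the regime the paper
    finds necessary to reproduce the observed temporal dynamics."""
    rng = random.Random(seed)
    total = [1] * n_apps        # cumulative counts (add-one smoothed)
    recent = []                 # sliding window of recent choices
    history = []
    for _ in range(n_users):
        rec_counts = [1] * n_apps
        for a in recent:
            rec_counts[a] += 1
        weights = [recency_weight * rec_counts[i]
                   + (1 - recency_weight) * total[i]
                   for i in range(n_apps)]
        choice = rng.choices(range(n_apps), weights=weights)[0]
        total[choice] += 1
        recent.append(choice)
        if len(recent) > window:
            recent.pop(0)
        history.append(choice)
    return history, total
```

Sweeping `recency_weight` between 0 and 1 interpolates between a purely cumulative (rich-get-richer) mechanism and a purely recency-driven one, which is the comparison the model family supports.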

    The debate on warlordism: the importance of military legitimacy

    Despite the careless use of the terms 'warlord' and 'warlordism' by the media, both have become increasingly popular among academics, even if some scholars object to their use. This paper draws on direct field experience as well as the ongoing debate. It aims, on the one hand, to reconcile the different perspectives, which are often not necessarily at odds with each other, and on the other hand, to propose a definition of 'warlordism' for the social sciences that is both closer to that used so far by historians and consistent with emerging evidence from the field.

    Cognitive trait model for persistent and fine-tuned student modelling in adaptive virtual learning environments : a thesis presented in partial fulfilment of the requirements for the degree of Master of Information Science in Information Systems at Massey University

    The increasing need for individualised instruction in both academic and corporate training environments encourages the emergence and popularity of adaptivity in virtual learning environments (VLEs). Adaptivity can be applied in VLEs as adaptive content presentation, which generates the learning content to suit the particular learner's aptitude, and as adaptive navigational control, which dynamically modifies the structure of the virtual learning environment presented to the learner in order to avoid overloading the learner's cognitive capacity. Techniques for both adaptive content presentation and adaptive navigational control need to be integrated in a conceptual framework so that their benefits can be synthesised to obtain a synergistic result. Exploration space control (ESC) theory attempts to adjust the learning space, called the exploration space, to allow learners to reach an adequate amount of information without overloading their cognitive capacity. The multiple representation (MR) approach provides guidelines for the selection of multimedia objects both for the presentation of learning content and as navigational links. ESC is further formalised by including consideration of the individual learner's cognitive traits, the cognitive characteristics and abilities of the learner relevant to the process of learning. The cognitive traits selected in the formalisation include working memory capacity, inductive reasoning skill, associative learning skill, and information processing speed. The formalisation attempts to formulate a guideline on how the learning content and navigational space should be adjusted in order to support a learner with a particular set of cognitive traits. However, in order to support the provision of adaptivity, the learners and their activities in the VLEs need to be profiled; this profiling process is called student modelling. Student models today can be categorised into state models and process models.
    State models record learners' progress as states (e.g. learned, not learned), whereas process models represent learners in terms of both the knowledge they have learned in the domain and the inference procedures they use to complete a process (task). State models and process models are both competence-based, and neither provides the information about an individual's cognitive abilities required by the formalisation of exploration space control. A new approach to student modelling is therefore required; this approach is called the cognitive trait model (CTM). The basis of CTM lies in the field of cognitive science. The process for creating a CTM includes the following subtasks. The cognitive trait under inquiry is studied in order to find its indicative signs (e.g. sign A indicates high working memory capacity). These signs are called the manifests of the cognitive trait. Manifests always come in pairs: if manifest A indicates high working memory capacity, A's inverse, B, indicates low working memory capacity. The manifests are then translated into implementation patterns, which are observable patterns in the records of learner-system interaction; implementation patterns can be regarded as machine-recognisable manifests. The manifests are used to create nodes in a neural-network-like structure called the individualised temperament network (ITN). Every node in the ITN has a weight that conditions, and is conditioned by, the overall result of the execution of the ITN. The output of the ITN's execution is used to update the CTM. A formative evaluation was carried out for a prototype created in this work. The positive results of the evaluation show the educational potential of the CTM approach. The current CTM only caters for working memory capacity; future research will study more cognitive traits and include them in the CTM
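    The manifest-weighting step of the ITN can be illustrated with a toy aggregation. This is a minimal sketch, not the thesis's exact formulation: the manifest names, votes, and weights below are hypothetical, and the real ITN also feeds the result back to condition the node weights.

```python
def itn_estimate(observed, weights):
    """Toy ITN layer for one trait (e.g. working memory capacity).
    Each observed manifest, detected as an implementation pattern in
    the learner-system interaction log, votes +1 (high trait) or -1
    (low trait, its paired inverse) with a weight. The normalised
    signed sum in [-1, 1] is the estimate used to update the CTM."""
    score = sum(weights[m] * vote for m, vote in observed.items())
    norm = sum(abs(weights[m]) for m in observed)
    return score / norm if norm else 0.0

# Hypothetical manifests and weights, for illustration only:
weights = {"revisits_pages": 0.6, "broad_navigation": 0.4}
observed = {"revisits_pages": -1,    # pattern suggesting low WMC
            "broad_navigation": +1}  # pattern suggesting high WMC
estimate = itn_estimate(observed, weights)  # negative leans toward low WMC
```

Because only observed manifests enter the normalisation, the estimate stays meaningful even when the interaction log exhibits just a subset of the known patterns.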