34,997 research outputs found

    Algorithmic complexity for psychology: A user-friendly implementation of the coding theorem method

    Kolmogorov-Chaitin complexity has long been believed to be impossible to approximate for short sequences (e.g., of length 5-50). However, with the newly developed coding theorem method, the complexity of strings of length 2-11 can now be numerically estimated. We present the theoretical basis of algorithmic complexity for short strings (ACSS) and describe an R package providing functions based on ACSS that will cover psychologists' needs and improve upon previous methods in three ways: (1) ACSS is now available not only for binary strings but for strings based on up to 9 different symbols, (2) ACSS no longer requires time-consuming computing, and (3) a new approach based on ACSS gives access to an estimation of the complexity of strings of any length. Finally, three illustrative examples show how these tools can be applied to psychology.
    Comment: to appear in "Behavioral Research Methods", 14 pages in journal format; R package at http://cran.r-project.org/web/packages/acss/index.htm
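The core idea behind the coding theorem method can be sketched with a toy model. The abstract's actual method runs enumerations of small Turing machines; the tiny three-opcode "language" below is an invented stand-in, used only to illustrate the coding-theorem estimate K(s) ≈ -log2 D(s), where D(s) is the fraction of random programs that output s:

```python
import math
import random

# Toy "programming language": a program is a fixed-length sequence of
# opcodes that builds a binary string. This is an invented stand-in for
# the small Turing machines used by the real coding theorem method.
OPS = ("w0", "w1", "dup")  # write '0', write '1', duplicate output so far

def run(program, max_len=8):
    out = ""
    for op in program:
        if op == "w0":
            out += "0"
        elif op == "w1":
            out += "1"
        else:  # dup
            out += out
        if len(out) > max_len:
            return None  # treat overly long outputs as non-halting
    return out

def ctm_estimates(n_programs=100_000, prog_len=6, seed=0):
    rng = random.Random(seed)
    counts = {}
    for _ in range(n_programs):
        prog = [rng.choice(OPS) for _ in range(prog_len)]
        s = run(prog)
        if s:
            counts[s] = counts.get(s, 0) + 1
    total = sum(counts.values())
    # Coding theorem: K(s) is approximated by -log2 of the output
    # frequency D(s) among randomly sampled programs.
    return {s: -math.log2(c / total) for s, c in counts.items()}

K = ctm_estimates()
# More programs produce "0000" than "0110" in this toy language, so the
# repetitive string gets the lower complexity estimate.
```

This captures the method's logic (frequent outputs of random programs are algorithmically simple) but not its scale; the acss package draws on exhaustive enumerations rather than a toy sampler.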

    Analyzing collaborative learning processes automatically

    In this article we describe the emerging area of text classification research focused on the problem of collaborative learning process analysis, both from a broad perspective and more specifically in terms of a publicly available tool set called TagHelper tools. Analyzing the variety of pedagogically valuable facets of learners’ interactions is a time-consuming and effortful process. Improving automated analyses of such highly valued processes of collaborative learning by adapting and applying recent text classification technologies would make it a less arduous task to obtain insights from corpus data. This endeavor also holds the potential for enabling substantially improved online instruction, both by providing teachers and facilitators with reports about the groups they are moderating and by triggering context-sensitive collaborative learning support on an as-needed basis. In this article, we report on an interdisciplinary research project that has been investigating the effectiveness of applying text classification technology to a large CSCL corpus that has been analyzed by human coders using a theory-based multidimensional coding scheme. We report promising results and include an in-depth discussion of important issues such as reliability, validity, and efficiency that should be considered when deciding on the appropriateness of adopting a new technology such as TagHelper tools. One major technical contribution of this work is a demonstration that an important piece of the work towards making text classification technology effective for this purpose is designing and building linguistic pattern detectors, otherwise known as features, that can be extracted reliably from texts and that have high predictive power for the categories of discourse actions that the CSCL community is interested in.
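The basic pipeline described above (map text segments to discourse-category labels learned from human-coded examples) can be sketched with a minimal bag-of-words Naive Bayes classifier. This is not TagHelper itself; the classifier choice, the two labels, and the tiny training set are all invented for illustration:

```python
import math
from collections import Counter, defaultdict

# Invented miniature training corpus: chat segments hand-coded with a
# discourse-action label, standing in for a theory-based coding scheme.
TRAIN = [
    ("i think the answer is gravity because objects fall", "reasoning"),
    ("because the force pulls it down that is why", "reasoning"),
    ("ok", "off_task"),
    ("lol what are we doing", "off_task"),
]

def train(examples, alpha=1.0):
    word_counts = defaultdict(Counter)  # label -> word frequencies
    label_counts = Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(text.split())
    vocab = {w for c in word_counts.values() for w in c}
    return word_counts, label_counts, vocab, alpha

def classify(model, text):
    word_counts, label_counts, vocab, alpha = model
    total = sum(label_counts.values())
    best, best_lp = None, float("-inf")
    for label in label_counts:
        lp = math.log(label_counts[label] / total)  # class prior
        denom = sum(word_counts[label].values()) + alpha * len(vocab)
        for w in text.split():
            if w in vocab:  # ignore words never seen in training
                lp += math.log((word_counts[label][w] + alpha) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

model = train(TRAIN)
label = classify(model, "it falls because gravity pulls it")
```

The article's point about hand-designed linguistic pattern detectors corresponds to replacing the raw `text.split()` features here with richer, more reliable ones.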

    Quantum Algorithmic Integrability: The Metaphor of Polygonal Billiards

    An elementary application of Algorithmic Complexity Theory to the polygonal approximations of curved billiards, integrable and chaotic, unveils the equivalence of this problem to the procedure of quantization of classical systems: the scaling relations for the average complexity of symbolic trajectories are formally the same as those governing the semi-classical limit of quantum systems. Two cases, the circle and the stadium, are examined in detail and presented as paradigms.
    Comment: 11 pages, 5 figures

    Estimating the Algorithmic Complexity of Stock Markets

    Randomness and regularities in Finance are usually treated in probabilistic terms. In this paper, we develop a completely different approach, using a non-probabilistic framework based on the algorithmic information theory initially developed by Kolmogorov (1965). We present some elements of this theory and show why it is particularly relevant to Finance, and potentially to other subfields of Economics as well. We develop a generic method to estimate the Kolmogorov complexity of numeric series. This approach is based on an iterative "regularity erasing procedure" implemented to use lossless compression algorithms on financial data. Examples are provided with both simulated and real-world financial time series. The contributions of this article are twofold. The first is methodological: we show that some structural regularities, invisible to classical statistical tests, can be detected by this algorithmic method. The second consists of illustrations on the daily Dow Jones Index suggesting that, beyond several well-known regularities, hidden structure in this index may remain to be identified.
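The use of a lossless compressor as a complexity proxy can be illustrated in a few lines. This is a hedged sketch only: the discretization into small integer "moves" is invented here, and the paper's iterative regularity erasing procedure is more elaborate than this single comparison:

```python
import random
import zlib

# Toy illustration: ratio of compressed size to raw size as a crude
# Kolmogorov-complexity proxy for a discretized series.

def compressed_ratio(symbols):
    """Compressed size / raw size for a sequence of small integers (0-255)."""
    raw = bytes(symbols)
    return len(zlib.compress(raw, 9)) / len(raw)

# A strongly regular series compresses far better than a patternless one.
periodic = [i % 7 for i in range(4000)]            # periodic "moves"
rng = random.Random(42)
noise = [rng.randrange(7) for _ in range(4000)]    # patternless "moves"

r_periodic = compressed_ratio(periodic)
r_noise = compressed_ratio(noise)
```

In the paper's spirit, a series that still compresses after known regularities (trends, seasonality) have been erased is evidence of residual hidden structure; here the two extremes simply bound the scale.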

    Technology as an observing system : a 2nd order cybernetics approach

    The role of technology in modern society is becoming fundamental to society itself as the boundary between technological utilization and technological interference narrows. Technology penetrates the core of an ever-increasing number of application domains. It exerts considerable influence over institutions, often in subtle ways that cannot be fully understood and whose effects cannot be easily demarcated. Also, the ever-expanding ecosystem of Information and Communication Technologies (ICTs) results in an emergent complexity with unpredictable consequences. Over the past decades this has created a tension that has led to a heated debate concerning the relationship between the technical and the social. Some theorists subsume the technical into the social, others proclaim its domination, others its autonomy, while yet others suggest that it is a derivative of the social. Starting with Luhmann’s remark that technology determines what we observe and what we do not observe, this paper argues that there are multiple benefits in examining how Systems Theory can provide a coherent theoretical platform upon which these interactions can be further explored. It provides a theoretical treatise that examines the conditions through which the systemic nature of technology can be inspected. The paper also raises a series of questions that probe the nature of technological interference in other ‘function-systems’ of society (such as the economy, science, politics, etc.). To achieve this goal, a 2nd order cybernetics approach is employed (mostly influenced by the works of Niklas Luhmann) in order to both investigate and delineate the impact of technology as a system. Toward that end, a variety of influences of Information Systems (IS) are used as examples, opening the door to a complexity that emerges out of the interaction of technology with its socio-economic and political context.
The paper describes technology as an observing system within the context of 2nd order cybernetics and considers the different possibilities for a binary code for that system. Finally, the paper presents a framework that synthesizes relevant systems-theoretical concepts in the context of the systemic character of technology.
