1,091 research outputs found

    Truth and Correctness: Towards A Pluralist Framework for Validating Information Systems Research

    Research in information systems includes a wide range of approaches that make a contribution in terms of knowledge, understanding, or practical developments. However, empirical studies show that discussion of validity in research is often weak. In this paper we examine the nature of truth and the related concept of correctness in order to construct a validation framework that can potentially encompass all the varied forms of research. Current debates within philosophy revolve around the idea of a pluralist view of truth – that there may be different forms of truth depending on context or domain. Related to truth is the wider concept of correctness – propositions may be true (and therefore correct), but correctness can also be applied to actions, performances, or behavior. Based on these two concepts, we develop a framework for research validity and apply it to a range of research forms, including positivist, mathematical, interpretive, design science, critical, and action-oriented.

    A Framework for Validating Information Systems Research Based on a Pluralist Account of Truth and Correctness

    Research in information systems includes a range of approaches which make varied contributions in terms of knowledge, understanding, or practical developments. In these days of “fake news” and spurious internet content, scholarly research needs to be able to demonstrate its validity – are its findings true, or its recommendations correct? We argue that there are fundamental validation criteria that can be applied to all research approaches despite their apparent diversity and conflict. These stem from current views, within philosophy, of the nature of truth and the related but wider concept of correctness. There has been much debate about the nature of truth – is it correspondence-based, coherence-based, consensual, or pragmatic? Current debates revolve around the idea of a pluralist view of truth – that there are different forms of truth depending on the context or domain. Related to truth is the wider concept of correctness – propositions may be true, but correctness can also be applied to actions, performances, or behavior for which truth is not appropriate. We develop a framework for research validity and apply it to a range of research forms including positivist, interpretive, design science, critical, and action-oriented. The benefits are: i) a greater and more explicit focus on validity criteria will produce better research; ii) having a single framework can provide some commonality between what at times seem conflicting approaches to research; iii) having criteria made explicit should encourage debate and further development. The framework is applied to a variety of empirical papers employing varied research approaches.

    Empirical Validation of Agent Based Models: A Critical Survey

    This paper addresses the problem of finding the appropriate method for conducting empirical validation in agent-based (AB) models, which is often regarded as the Achilles’ heel of the AB approach to economic modelling. The paper has two objectives. First, to identify key issues facing AB economists engaged in empirical validation. Second, to critically appraise the extent to which alternative approaches deal with these issues. We identify a first set of issues that are common to both AB and neoclassical modellers and a second set of issues which are specific to AB modellers. This second set of issues is captured in a novel taxonomy, which takes into consideration the nature of the object under study, the goal of the analysis, the nature of the modelling assumptions, and the methodology of the analysis. Having identified the nature and causes of heterogeneity in empirical validation, we examine three important approaches to validation that have been developed in AB economics: indirect calibration, the Werker-Brenner approach, and the history-friendly approach. We also discuss a set of open questions within empirical validation. These include the trade-off between empirical support and tractability of findings, the issue of over-parameterisation, unconditional objects, counterfactuals, and the non-neutrality of data. Keywords: empirical validation, agent-based models, calibration, history-friendly modelling.
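    The indirect calibration approach named above can be illustrated with a minimal sketch: simulate the model over a set of candidate parameter values and retain only those values whose simulated summary statistics fall within an empirically plausible interval. The toy model, parameter values, and empirical interval below are all invented for illustration, not taken from the paper.

    ```python
    import math
    import random

    def simulate_firm_growth(n_firms, n_steps, sigma, seed):
        """Toy agent-based model: firm sizes evolve by idiosyncratic random shocks."""
        rng = random.Random(seed)
        sizes = [1.0] * n_firms
        for _ in range(n_steps):
            sizes = [s * (1.0 + rng.gauss(0.0, sigma)) for s in sizes]
        return sizes

    def growth_dispersion(sizes):
        """Summary statistic: standard deviation of log firm size."""
        logs = [math.log(s) for s in sizes if s > 0]
        mean = sum(logs) / len(logs)
        return (sum((x - mean) ** 2 for x in logs) / len(logs)) ** 0.5

    # Indirect calibration: keep only parameter values whose simulated
    # statistic lies inside the empirically plausible interval.
    empirical_interval = (0.2, 0.5)  # assumed stylised-fact range (illustrative)
    candidates = [0.01, 0.05, 0.10]
    retained = []
    for sigma in candidates:
        stat = growth_dispersion(simulate_firm_growth(200, 50, sigma, seed=42))
        if empirical_interval[0] <= stat <= empirical_interval[1]:
            retained.append(sigma)

    print(retained)
    ```

    The key design point is that calibration here is indirect: parameters are never estimated from micro data; they are disciplined only through the macro-level statistics the model reproduces.
    
    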

    Celebrity, Democracy, and Epistemic Power

    What, if anything, is problematic about the involvement of celebrities in democratic politics? While a number of theorists have criticized celebrity involvement in politics (Meyer 2002; Mills 1957; Postman 1987), none so far have examined this issue using the tools of social epistemology, the study of the effects of social interactions, practices, and institutions on knowledge and belief acquisition. This paper will draw on these resources to investigate the issue of celebrity involvement in politics, specifically as this involvement relates to democratic theory and its implications for democratic practice. We will argue that an important and underexplored form of power, which we will call epistemic power, can explain one important way in which celebrity involvement in politics is problematic. This is because unchecked uses and unwarranted allocations of epistemic power, which celebrities tend to enjoy, threaten the legitimacy of existing democracies and raise important questions regarding core commitments of deliberative, epistemic, and plebiscitary models of democratic theory. We will finish by suggesting directions that democratic theorists could pursue when attempting to address some of these problems.

    How to kill 999 flowers: in defence of logical monism

    How many correct logics are there? For much of logic’s history it was widely assumed that there was exactly one correct logic, a position known as logical monism. However, the monist’s hegemony has recently become increasingly precarious, as she has simultaneously come under attack from two sides. On one side she faces logical pluralists, who contend that there is more than one correct logic, and on the other she faces logical nihilists, who contend that there are no correct logics. This thesis aims to defend monism against the twin threats of pluralism and nihilism.

    Logical Localism in the Context of Combining Logics

    Logical localism is a claim in the philosophy of logic stating that different logics are correct in different domains. There are different ways in which this thesis can be motivated, and I will explore the most important ones. However, localism faces an obvious and major challenge, known as ‘the problem of mixed inferences’. The main goal of this dissertation is to solve this challenge and to extend the solution to the related problem of mixed compounds for alethic pluralism. As far as I am aware, my approach to the solution is one that has not been considered in the literature. I will study different methods for combining logics, concentrating on Joshua Schechter’s method of juxtaposition, and I will try to solve the problem of mixed inferences by making a finer translation of the arguments and using combination mechanisms as the criterion of validity. One of the most intriguing aspects of the dissertation is the synergy created between the philosophical debate and the technical methods, with the problem of mixed inferences at its centre. I hope to show not only that the philosophical debate benefits from the methods for combining logics, but also that these methods can be developed in new and interesting ways motivated by the philosophical problem of mixed inferences. The problem suggests that there are relevant interactions between connectives, justified by the philosophical considerations behind the different logical systems, which the methods for combining logics should allow to emerge. The recognition of this fact is what drives the improvements on the method of juxtaposition that I develop. That is, in order to allow desirable interaction principles to emerge, I will propose alternative ways of combining logical systems (specifically classical and intuitionistic logic) that go beyond the standard for combinations, which is based on minimality conditions so as to avoid the so-called collapse theorems. (A Spanish version of this abstract, stating the same results, accompanies the original record.)

    Embracing trustworthiness and authenticity in the validation of learning analytics systems

    Learning analytics sits in the middle space between learning theory and data analytics. The inherent diversity of learning analytics manifests itself in an epistemology that strikes a balance between positivism and interpretivism, and knowledge that is sourced from theory and practice. In this paper, we argue that validation approaches for learning analytics systems should be cognisant of these diverse foundations. Through a systematic review of learning analytics validation research, we find that there is currently an over-reliance on positivistic validity criteria. Researchers tend to ignore interpretivistic criteria such as trustworthiness and authenticity. In the 38 papers we analysed, researchers covered positivistic validity criteria 221 times, whereas interpretivistic criteria were mentioned 37 times. We motivate that learning analytics can only move forward with holistic validation strategies that incorporate “thick descriptions” of educational experiences. We conclude by outlining a planned validation study using argument-based validation, which we believe will yield meaningful insights by considering a diverse spectrum of validity criteria. Funding: Horizon 2020 (H2020) grant 883588; Algorithms and the Foundations of Software Technology.
