
    What justifies belief? : probability, normalcy, and the functional theory

    ‘What justifies belief?’ This question is arguably one of the most important questions in contemporary epistemology. The first part of this study looks at two very different answers to the above question, but ultimately finds both of them wanting. According to probabilistic accounts of justification, the property that makes a belief justified is some property along the lines of being highly probable. I call this picture of justification the Lockean View. In contrast, according to the most prominent non-probabilistic accounts of justification, the property that justifies belief is some property along the lines of being true in all normal worlds. I call this non-probabilistic picture of justification the Normalcy View. However, as we will see, both families of views turn out to be problematic. While probabilistic accounts are incompatible with an attractive principle called multi-premise closure (MPC), non-probabilistic accounts, I argue, are too demanding and therefore too stingy. This leaves us in a dilemma: neither probabilistic nor non-probabilistic accounts of justification seem to be wholly satisfactory. I call this the MPC-Stinginess Dilemma. The second part of this study is concerned with how we should respond to this dilemma. After considering but rejecting some initial options, I argue that the dilemma can be avoided if we reject the almost universally accepted monist assumption that there is only one way for a belief to be justified, or that there is only one property that can make a belief justified. Subsequently, I develop and defend a novel, pluralist theory of epistemic justification, which I call the Functional Theory of Justification. One upshot of the functional theory is that it makes room for the idea that there is more than one way for a belief to be justified; or, more precisely, that depending on our epistemic environment, justification can be realized by different properties.
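
    To illustrate the tension between probabilistic accounts and multi-premise closure (a standard illustration of the general point, not an example taken from the thesis): suppose justification requires a probability of at least $t = 0.95$, and an agent holds twenty independent beliefs $P_1, \dots, P_{20}$, each with probability $0.99$. Each belief clears the threshold, yet

    $$\Pr(P_1 \wedge \dots \wedge P_{20}) = 0.99^{20} \approx 0.82 < 0.95,$$

    so the conjunction, which multi-premise closure says should also be justified, falls below the Lockean threshold.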

    A bitter pill for closure

    The primary objective of this paper is to introduce a new epistemic paradox that puts pressure on the claim that justification is closed under multi-premise deduction. The first part of the paper will consider two well-known paradoxes—the lottery and the preface paradox—and outline two popular strategies for solving the paradoxes without denying closure. The second part will introduce a new, structurally related paradox that is immune to these closure-preserving solutions. I will call this paradox the Paradox of the Pill. Seeing that the prominent closure-preserving solutions do not apply to the new paradox, I will argue that it presents a much stronger case against the claim that justification is closed under deduction than its two predecessors. Besides presenting a more robust counterexample to closure, the new paradox also reveals that the strategies that were previously thought to get closure out of trouble are not sufficiently general to achieve this task, as they fail to apply to similar closure-threatening paradoxes in the same vicinity.

    Epistemology and the law : why there is no epistemic mileage in legal cases

    The primary aim of this paper is to defend the Lockean View—the view that a belief is epistemically justified iff it is highly probable—against a new family of objections. According to these objections, broadly speaking, the Lockean View ought to be abandoned because it is incompatible with, or difficult to square with, our judgments surrounding certain legal cases. I distinguish and explore three different versions of these objections—The Conviction Argument, the Argument from Assertion and Practical Reasoning, and the Comparative Probabilities Argument—but argue that none of them is successful. I also present some very general reasons for being pessimistic about the overall strategy of using legal considerations to evaluate epistemic theories; as we will see, there are good reasons to think that many of the considerations relevant to legal theorizing are ultimately irrelevant to epistemic theorizing.

    Normalcy, justification, and the easy-defeat problem

    Recent years have seen the rise of a new family of non-probabilistic accounts of epistemic justification. According to these views—we may call them Normalcy Views—a subject S's belief in P is justified only if, given the evidence, there exists no normal world in which S falsely believes that P. This paper aims to raise some trouble for this new approach to justification by arguing that Normalcy Views, while initially attractive, give rise to problematic accounts of epistemic defeat. As we will see, on Normalcy Views seemingly insignificant pieces of evidence turn out to have considerable defeating powers. This problem—I will call it the Easy-Defeat Problem—gives rise to a two-pronged challenge. First, it shows that the Normalcy View has counterintuitive implications and, second, it opens the door to an uncomfortable skeptical threat.

    Should moral intuitionism go social?

    In recent work, Bengson, Cuneo, and Shafer-Landau (2020) develop a new social version of moral intuitionism that promises to explain why our moral intuitions are trustworthy. In this paper, we raise several worries for their account and present some general challenges for the broader class of views we call Social Moral Intuitionism. We close by reflecting on Bengson, Cuneo, and Shafer-Landau's comparison between what they call the “perceptual practice” and the “moral intuition practice”, which we take to raise some difficult normative and meta-normative questions for theorists of all stripes.

    Statically Detecting JavaScript Obfuscation and Minification Techniques in the Wild

    JavaScript is both a popular client-side programming language and an attack vector. While malware developers transform their JavaScript code to hide its malicious intent and impede detection, well-intentioned developers also transform their code to, e.g., optimize website performance. In this paper, we conduct an in-depth study of code transformations in the wild. Specifically, we perform a static analysis of JavaScript files to build their Abstract Syntax Tree (AST), which we extend with control and data flows. Subsequently, we define two classifiers, benefitting from AST-based features, to detect transformed samples along with specific transformation techniques. Besides malicious samples, we find that transforming code is increasingly popular in Node.js libraries and client-side JavaScript, with, e.g., 90% of the Alexa Top 10k websites containing a transformed script. Thus, code transformations are not in themselves an indicator of maliciousness. Finally, we showcase that benign code transformation techniques and their frequency both differ from the prevalent malicious ones.
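
    The pipeline sketched in the abstract (parse JavaScript into an AST, derive features from it, feed those features to a classifier) can be illustrated with a minimal feature extractor. The sketch below is an assumption-laden illustration, not the authors' implementation: the feature set (identifier length, string-literal entropy, eval-like calls) is invented for the example, @babel/parser and @babel/traverse stand in for whatever parsing infrastructure the paper uses, and the control- and data-flow extensions and the two classifiers themselves are omitted.

    // Illustrative TypeScript sketch (not the paper's code): extracts a few
    // AST-based features of the kind a transformation classifier might consume.
    import { parse } from "@babel/parser";
    import traverse from "@babel/traverse";

    interface AstFeatures {
      nodeCount: number;            // overall AST size
      avgIdentifierLength: number;  // very short names are typical of minification
      stringLiteralEntropy: number; // high-entropy strings are typical of obfuscation
      evalLikeCalls: number;        // eval/Function calls enable dynamic code loading
    }

    // Shannon entropy (bits per character) of a string.
    function shannonEntropy(s: string): number {
      if (s.length === 0) return 0;
      const counts = new Map<string, number>();
      for (const ch of s) counts.set(ch, (counts.get(ch) ?? 0) + 1);
      let h = 0;
      for (const c of counts.values()) {
        const p = c / s.length;
        h -= p * Math.log2(p);
      }
      return h;
    }

    export function extractFeatures(source: string): AstFeatures {
      // Parse the sample; "unambiguous" lets Babel decide between script and module.
      const ast = parse(source, { sourceType: "unambiguous" });

      let nodeCount = 0;
      const identifierLengths: number[] = [];
      let concatenatedStrings = "";
      let evalLikeCalls = 0;

      traverse(ast, {
        enter() {
          nodeCount += 1;
        },
        Identifier(path) {
          identifierLengths.push(path.node.name.length);
        },
        StringLiteral(path) {
          concatenatedStrings += path.node.value;
        },
        CallExpression(path) {
          const callee = path.node.callee;
          if (callee.type === "Identifier" && (callee.name === "eval" || callee.name === "Function")) {
            evalLikeCalls += 1;
          }
        },
      });

      const avgIdentifierLength =
        identifierLengths.length === 0
          ? 0
          : identifierLengths.reduce((a, b) => a + b, 0) / identifierLengths.length;

      return {
        nodeCount,
        avgIdentifierLength,
        stringLiteralEntropy: shannonEntropy(concatenatedStrings),
        evalLikeCalls,
      };
    }

    In a full system, features like these would be combined with many others (e.g. node-type frequencies and control- and data-flow properties) and passed to trained classifiers; the sketch only shows the feature-extraction step.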

    Epistemic Justification : Probability, Normalcy, and the Functional Theory

    This paper puts forward a novel pluralist theory of epistemic justification that brings together two competing views in the literature: probabilistic and non-probabilistic accounts of justification. The first part of the paper motivates the new theory by arguing that neither probabilistic nor non-probabilistic accounts alone are wholly satisfactory. The second part puts forward what I call the Functional Theory of Justification. The key merit of the new theory is that it combines the most attractive features of both probabilistic and non-probabilistic accounts of justification while avoiding their most serious shortcomings. The paper also provides a blueprint for future pluralist projects in epistemology.

    Can groups be genuine believers? : The argument from interpretationism

    In ordinary discourse we often attribute beliefs not just to individuals but also to groups. But can groups really have genuine beliefs? This paper considers but ultimately rejects one of the main arguments in support of the claim that groups can be genuine believers, the Argument From Interpretationism, and concludes that we have good reasons to be sceptical about the existence of group beliefs. According to the Argument From Interpretationism, roughly speaking, groups qualify as genuine believers because we can interpret (or predict) their behaviour in much the same way that we can interpret (or predict) the behaviour of individuals. While this argument may seem initially attractive, I argue that it is ultimately unsuccessful. In particular, I argue that the argument is unsuccessful even if one is generally sympathetic towards interpretationism. The reason for this, as we will see, is that a number of problems arise when we try to apply the interpretationist strategy, originally formulated with individual subjects in mind, to plural subjects or groups. In showing why the Argument From Interpretationism fails, the paper also brings into focus some more general constraints on the scope and applicability of interpretationism.