90 research outputs found

    Conversation with Robert Brandom

    In this broad interview, Robert Brandom talks about many themes concerning his work, his career, and his education. Brandom reconstructs the main debts he owes to colleagues and teachers, especially Wilfrid Sellars, Richard Rorty, and David Lewis, and discusses the projects he is currently working on. He also talks about contemporary and classical pragmatism, and about the importance of classical thinkers like Kant and Hegel for contemporary debates. Other themes go deeper into the principal topics of his theoretical work: in particular, his later understanding of expressivism, his take on the debate between representationalists and anti-representationalists in semantics, the main open problems for his wide inferentialist project, and his methodological preference for normative vocabulary in his account of discursive practice. Finally, Brandom touches on the epistemic role of perception and on his views about the importance of the phenomenological aspects of perceptual experience.

    Implicit norms

    Robert Brandom has developed an account of conceptual content as instituted by social practices. Such practices are understood as implicitly normative. Brandom proposed the idea of implicit norms in order to meet two requirements imposed by Wittgenstein’s remarks on rule-following: escaping the regress of rules on the one hand, and avoiding mere regular behavior on the other. Anandi Hattiangadi has criticized this account as failing to meet these requirements. In what follows, I try to show how a correct understanding of sanctions and an expressivist reading of the issue can meet these challenges.

    Counterfactually robust inferences, modally ruled out inferences, and semantic holism

    It is often argued that inferential role semantics (IRS) entails semantic holism so long as theorists fail to answer the question of which inferences, among the many, are meaning-constitutive. Since analyticity, understood as truth in virtue of meaning, is widely dismissed as a criterion for which inferences determine meaning, it seems that holism follows. Semantic holism, in turn, is often taken to face problems with the stability of content and with many standard explanations of communication. Thus, we must choose between giving up IRS, to avoid these holistic entailments, and defending holism against this charge, to rescue IRS. I pursue the second option by analyzing certain patterns of counterfactual reasoning. Wilfrid Sellars and Robert Brandom, in defending IRS, claim that the content-constitutive inferences are the counterfactually robust ones. While it is difficult to assess the merits of such a view, it nonetheless entails that counterfactually non-robust inferences (which I call “modally ruled out inferences”) are not content-constitutive. If this is true, and if we take certain remarks about the grasp of concepts on board, there is a way to restrict the scope of the holism entailed by IRS, enough to reshape the problems with the stability of content.

    Discursive pluralism: Inferentialist expressivism and the integration challenge

    Discursive pluralism, recently fostered by anti-representationalist views, poses an interesting challenge to representationalism by holding that not all assertions conform to a descriptive model of language. Although alethic pluralism has become increasingly popular in recent years as a way out of this issue, the discussion also includes other minority approaches in the anti-representationalist camp. In particular, the late stage of contemporary expressivism offers a few relevant insights, ranging from Price's denunciation of “placement problems” to Brandom's inferentialism. This paper attempts to show how these expressivist ideas combine well, composing a unified and metaphysically sober metaphilosophical framework.

    Does language have a downtown? Wittgenstein, Brandom, and the game of “giving and asking for reasons”

    Wittgenstein’s Investigations proposed an egalitarian view of language games, emphasizing their plurality (“language has no downtown”). Uses of words depend on the game one is playing, and may change when one plays another. Furthermore, there is no privileged game dictating the rules for the others: there are as many games as there are purposes. This view is pluralist and egalitarian, but it says little about the connection between meaning and use, and about how a set of rules governs them in practice. Brandom’s Making It Explicit attempted a straightforward answer to these questions by developing Wittgensteinian insights: the primacy of social practice over meanings, the idea that meaning is use, and the idea of rule-following as a way to understand participation in social practices. Nonetheless, Brandom defended a non-Wittgensteinian conception of discursive practice: language has a “downtown”, the game of “giving and asking for reasons”. This is the idea of a normative structure of language, consisting of advancing claims and drawing inferences. By means of assertions, speakers undertake “commitments” that can be challenged or defended in terms of reasons (those successfully justified gain “entitlement”). This game is not one among many: it is indispensable to the very idea of discursive practice. In this paper, my aim is to explore the main motivations and implications of both perspectives.

    Putnam's Alethic Pluralism and the Fact-Value Dichotomy

    Hilary Putnam spent much of his career criticizing the fact/value dichotomy, as was already apparent during the phase in which he defended internal realism. He later changed his epistemological and metaphysical view by endorsing natural realism, and consequently embraced alethic pluralism, the idea that truth works differently in different domains of discourse. Despite these changes of mind in epistemology and in the theory of truth, Putnam continued to criticize the fact/value dichotomy. However, alethic pluralism entails drawing distinctions among domains of discourse, especially between factual and nonfactual domains, and these distinctions are in tension with the rejection of the fact/value dichotomy, which would in principle undercut treating the factual domains as genuinely factual. This issue raises, prima facie, some doubts about whether these views are genuinely compatible.

    La riduzione sociologica della normatività. Tre osservazioni sull’argomento di Stephen Turner

    Stephen Turner claims that social science can explain away normativity. Exploiting a non-normative view of rationality and a causal view of belief, he claims that normativist views are akin to what he calls Good Bad Theories (GBT): false accounts that play a role in social coordination, like primitive rituals (Taboo and the like). Hence, “norms”, “commitments”, and “obligations” are just like Taboo and can be explained away as GBT. Normativism, as a consequence, is doomed to disappear in a disenchanted world. Turner focuses on the normativist idea that the normative does not reduce to the causal, and claims that social science succeeds in the reduction. This claim is presented as a major challenge to philosophical normativism. In what follows, I discuss some aspects of Turner’s challenge by focusing on certain features of belief and belief-change that prima facie favor a normativist view; this provides the basis for examining some problems concerning the scope of Turner’s argument.

    Giustificazionismo e passato

    The reality of the past is one of the main problems facing the justificationist semantics proposed by Michael Dummett. The antirealism characteristic of this perspective yields a rather counterintuitive conception of the past, according to which the past “ceases to exist” when it leaves no traces or testimony. In Truth and the Past, Dummett returned to the question, abandoning antirealism about the past in order to avoid this conception. This turn represents an unprecedented shift toward realism, limited, however, by his firm refusal to adopt a bivalent notion of truth. My contribution reconstructs and critically analyzes the reasons for Dummett’s turn and probes the solidity and coherence of this reworking of justificationism.

    • …