
    Folk Theory of Mind: Conceptual Foundations of Social Cognition

    The human ability to represent, conceptualize, and reason about mind and behavior is one of the greatest achievements of human evolution and is made possible by a “folk theory of mind” — a sophisticated conceptual framework that relates different mental states to each other and connects them to behavior. This chapter examines the nature and elements of this framework and its central functions for social cognition. As a conceptual framework, the folk theory of mind operates prior to any particular conscious or unconscious cognition and provides the “framing” or interpretation of that cognition. Central to this framing is the concept of intentionality, which distinguishes intentional action (caused by the agent’s intention and decision) from unintentional behavior (caused by internal or external events without the intervention of the agent’s decision). A second important distinction separates publicly observable from publicly unobservable (i.e., mental) events. Together, the two distinctions define the kinds of events in social interaction that people attend to, wonder about, and try to explain. A special focus of this chapter is the powerful tool of behavior explanation, which relies on the folk theory of mind but is also intimately tied to social demands and to the perceiver’s social goals. A full understanding of social cognition must consider the folk theory of mind as the conceptual underpinning of all (conscious and unconscious) perception and thinking about the social world.

    Epistemological Realism and Onto-Relations

    The traditional concept of knowledge is justified true belief. The bulk of contemporary epistemology has focused primarily on the task of justification. Truth seems to be a quite obvious criterion—does the belief in question correspond to reality? My contention is that the aspect of ontology is far too separated from epistemology. This onto-relationship between reality and beliefs requires the epistemic method of epistemological realism. This is not to diminish the task of justification. I will then discuss the role of inference from the onto-relationships of free invention and discovery and whether it is best suited to a foundationalist or coherentist model within a theistic context.

    Identifying the consequences of dynamic treatment strategies: A decision-theoretic overview

    We consider the problem of learning about and comparing the consequences of dynamic treatment strategies on the basis of observational data. We formulate this within a probabilistic decision-theoretic framework. Our approach is compared with related work by Robins and others: in particular, we show how Robins's 'G-computation' algorithm arises naturally from this decision-theoretic perspective. Careful attention is paid to the mathematical and substantive conditions required to justify the use of this formula. These conditions revolve around a property we term stability, which relates the probabilistic behaviours of observational and interventional regimes. We show how an assumption of 'sequential randomization' (or 'no unmeasured confounders'), or an alternative assumption of 'sequential irrelevance', can be used to infer stability. Probabilistic influence diagrams are used to simplify manipulations, and their power and limitations are discussed. We compare our approach with alternative formulations based on causal DAGs or potential response models. We aim to show that formulating the problem of assessing dynamic treatment strategies as a problem of decision analysis brings clarity, simplicity and generality. Comment: 49 pages, 15 figures.
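    The G-computation idea mentioned in the abstract can be illustrated, in its simplest static single time-point form, by the g-formula: under a no-unmeasured-confounders assumption, the interventional mean E[Y | do(A = a)] is obtained by standardizing the conditional outcome mean over the confounder distribution, sum over l of P(L = l) · E[Y | A = a, L = l], rather than by naively conditioning on A. A minimal sketch with made-up probability tables (all numbers are invented for illustration; the paper's sequential, multi-stage version iterates this standardization over time):

    ```python
    # Toy illustration of the static g-formula behind Robins's G-computation.
    # L is a binary confounder, A a binary treatment, Y the outcome.
    # All numbers are invented for illustration only.

    # P(L = l): marginal distribution of the confounder
    p_L = {0: 0.6, 1: 0.4}

    # E[Y | A = a, L = l]: conditional mean outcome
    mean_Y = {
        (0, 0): 0.2, (0, 1): 0.5,
        (1, 0): 0.4, (1, 1): 0.9,
    }

    def g_formula(a):
        """E[Y under 'set A = a'] = sum_l P(L = l) * E[Y | A = a, L = l]."""
        return sum(p_L[l] * mean_Y[(a, l)] for l in p_L)

    effect = g_formula(1) - g_formula(0)
    print(g_formula(1), g_formula(0), effect)  # 0.6, 0.32, 0.28
    ```

    The point of the standardization is that the confounder distribution p_L is taken from the whole population, not from the treated or untreated subgroup, which is what licenses the causal reading when the stability/sequential-randomization conditions discussed in the abstract hold.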

    Presume It Not: True Causes in the Search for the Basis of Heredity

    Kyle Stanford has recently given substance to the problem of unconceived alternatives, which challenges the reliability of inference to the best explanation (IBE) in remote domains of nature. Conjoined with the view that IBE is the central inferential tool at our disposal in investigating these domains, the problem of unconceived alternatives leads to scientific anti-realism. We argue that, at least within the biological community, scientists are now and have long been aware of the dangers of IBE. We re-analyze the nineteenth-century study of inheritance and development (Stanford’s case study) and extend it into the twentieth century, focusing in particular on both classical Mendelian genetics and the studies by Avery et al. on the chemical nature of the hereditary substance. Our extended case studies show the preference of the biological community for a different methodological standard: the vera causa ideal, which requires that purported causes be shown on non-explanatory grounds to exist and be competent to produce their effects. On this basis, we defend a prospective realism about the biological sciences.

    Miracles, Trust, and Ennui in Barnes’ Predictivism

    Eric Barnes’ The Paradox of Predictivism is concerned primarily with two facts: predictivism and pluralism. In the middle part of the book, he peers through these two lenses at the tired realist scarecrow of the no-miracles argument. He attempts to reanimate this weatherworn realist argument, contra suggestions by people like me that it should be abandoned. In this paper, I want to get clear on Barnes’ contribution to the debate. He focuses on what he calls the miraculous endorsement argument, which explains not the success of a specific theory but instead the history of successes for an entire research program. The history of successes is explained by reliable and improving methods, which are the flipside of approximately true background theories. Yet, as Barnes notes, the whole story must begin with methods that are at least minimally reliable. Barnes demands that the realist explain the origin of the minimally reliable take-off point, and he suggests a way that the realist might do so. I contend that his explanation still relies on contingent developments and so fails to completely explain the development of take-off theories. However, this line of argument digs into familiar details of the no-miracles argument and overlooks what’s new in Barnes’ approach. By calling attention to pluralism, he reminds us that we need an account of scientific expertise. This is important, I suggest, because expertise is not indefinite. We do not trust specific experts for everything, but only for things within the bounds of their expertise. Drawing these boundaries relies on our own background theories and is only likely to be reliable if our background theories are approximately true. I argue, then, that pluralism gives us reason to be realists.

    Methods in Psychological Research

    Psychologists collect empirical data with various methods for different reasons. These diverse methods have their strengths as well as weaknesses. Nonetheless, it is possible to rank them in terms of different criteria. For example, the experimental method is used to obtain the least ambiguous conclusion. Hence, it is best suited to corroborate conceptual, explanatory hypotheses. The interview method, on the other hand, gives the research participants a kind of empathic experience that may be important to them. It is for this reason the best method to use in a clinical setting. All non-experimental methods owe their origin to the interview method. Quasi-experiments are suited for answering practical questions when ecological validity is important.