293 research outputs found

    Long-Term Potentiation: One Kind or Many?

    Do neurobiologists aim to discover natural kinds? I address this question in this chapter via a critical analysis of classification practices operative across the 43-year history of research on long-term potentiation (LTP). I argue that this history supports the idea that the structure of scientific practice surrounding LTP research has remained an obstacle to the discovery of natural kinds.

    Behavioral Modernity and the Cultural Transmission of Structured Information: The Semantic Axelrod Model

    Cultural transmission models are coming to the fore in explaining increases in the richness and diversity of the Paleolithic toolkit. During the later Paleolithic, technologies increase not only in diversity but also in complexity and interdependence. As Mesoudi and O'Brien (2008) have shown, selection broadly favors social learning of information that is hierarchical and structured, and multiple studies have demonstrated that teaching within a social learning environment can increase fitness. We believe that teaching also provides the scaffolding for transmission of more complex cultural traits. Here, we introduce an extension of the Axelrod (1997) model of cultural differentiation in which traits have prerequisite relationships, and where social learning depends upon the ordering of those prerequisites. We examine the resulting structure of cultural repertoires as learning environments range from largely unstructured imitation to structured teaching of necessary prerequisites, and we find that, in combination with individual learning and innovation, high probabilities of teaching prerequisites lead to richer cultural repertoires. Our results point to ways in which we can build more comprehensive explanations of the archaeological record of the Paleolithic as well as other cases of technological change.
    Comment: 24 pages, 7 figures. Submitted to "Learning Strategies and Cultural Evolution during the Paleolithic", edited by Kenichi Aoki and Alex Mesoudi, and presented at the 79th Annual Meeting of the Society for American Archaeology, Austin TX. Revised 5/14/1
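    The prerequisite mechanism described in this abstract can be sketched as a toy simulation. This is a minimal illustration under assumptions of my own, not the authors' implementation: the trait tree in PREREQ, the parameter p_teach, and the seeding of a few full-repertoire "innovators" are all invented for the sketch.

```python
import random

# Hypothetical trait tree: trait -> its single prerequisite (None = root).
# This structure and all parameter names are illustrative, not the
# original model's.
PREREQ = {0: None, 1: 0, 2: 0, 3: 1, 4: 1, 5: 2, 6: 2}

def can_learn(repertoire, trait):
    """An agent can adopt a trait only if it already holds the prerequisite."""
    pre = PREREQ[trait]
    return pre is None or pre in repertoire

def transmit(learner, teacher, p_teach, rng):
    """One social-learning event: the teacher offers a random trait.
    With probability p_teach, missing prerequisites are taught first
    (structured teaching); otherwise transmission fails (pure imitation)."""
    trait = rng.choice(sorted(teacher))
    if can_learn(learner, trait):
        learner.add(trait)
    elif rng.random() < p_teach:
        # Teach the missing prerequisite chain, root-most trait first.
        chain = []
        t = trait
        while t is not None and t not in learner:
            chain.append(t)
            t = PREREQ[t]
        for t in reversed(chain):
            learner.add(t)

def run(n_agents=50, steps=5000, p_teach=0.9, seed=1):
    rng = random.Random(seed)
    agents = [{0} for _ in range(n_agents)]   # everyone starts with the root trait
    for i in range(3):                        # a few innovators know everything
        agents[i] = set(PREREQ)
    for _ in range(steps):
        a, b = rng.sample(range(n_agents), 2)
        transmit(agents[a], agents[b], p_teach, rng)
    return sum(len(r) for r in agents) / n_agents  # mean repertoire richness
```

    Sweeping p_teach from 0 (pure imitation, which stalls whenever a prerequisite is missing) toward 1 (reliable teaching of prerequisite chains) lets one compare mean repertoire richness across runs, echoing the abstract's finding that structured teaching supports richer repertoires.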

    Authenticity, Culture and Language Learning

    In philosophy, authenticity has been used with two meanings: one entails the notion of correspondence; the other entails the notion of genesis (Cooper, 1983: 15). As in certain branches of philosophy, language teaching has perhaps clung too long to the first of these notions of authenticity at the expense of the other. This paper reviews four key conceptualisations of authenticity which have emerged in the field of applied linguistics: text authenticity, authenticity of language competence, learner authenticity and classroom authenticity. If any of these types of authenticity is couched exclusively in terms of one usage or the other, it can lead to an impoverishment and objectification of the experience of language learning. Text authenticity can lead to a poverty of language; authenticity of competence can lead to a poverty of performance; learner authenticity can lead to a poverty of interpretation; classroom authenticity can lead to a poverty of communication. This paper proposes that a pedagogy of intercultural communication be informed by a more hybrid view of authenticity as a process of subjectification, derived from the Heideggerian concept of self-concern.

    Robot life: simulation and participation in the study of evolution and social behavior.

    This paper explores the use of robots to simulate evolution, in particular the case of Hamilton's Law. The use of robots raises several questions that this paper seeks to address. The first concerns the role of the robots in biological research: do they simulate something (life, evolution, sociality) or do they participate in something? The second question concerns the physicality of the robots: what difference does embodiment make to the role of the robot in these experiments? Thirdly, how do life, embodiment and social behavior relate in contemporary biology, and why is it possible for robots to illuminate this relation? These questions are provoked by a strange similarity that has not been noted before: between the problem of simulation in philosophy of science, and Deleuze's reading of Plato on the relationship of ideas, copies and simulacra.

    Abstraction in ecology : reductionism and holism as complementary heuristics

    In addition to their core explanatory and predictive assumptions, scientific models include simplifying assumptions, which function as idealizations, approximations, and abstractions. There are methods, such as robustness analyses, to investigate whether simplifying assumptions bias the results of models. However, an equally important issue, and the focus of this paper, has received less attention: the methodological and epistemic strengths and limitations associated with different simplifying assumptions. I concentrate on one type of simplifying assumption, the use of mega parameters as abstractions in ecological models. First, I argue that there are two kinds of mega parameters qua abstractions, sufficient parameters and aggregative parameters, which have gone unnoticed in the literature. The two are associated with different heuristics, holism and reductionism, which many view as incompatible. Second, I provide a different analysis of abstractions and the associated heuristics than previous authors. Reductionism and holism and the accompanying abstractions have different methodological and epistemic functions, strengths, and limitations, and the heuristics should be viewed as providing complementary research perspectives for cognitively limited beings. This is then, third, used as a premise to argue for epistemic and methodological pluralism in theoretical ecology. Finally, the presented taxonomy of abstractions is used to comment on the current debate over whether mechanistic accounts of explanation are compatible with the use of abstractions. This debate has suffered from an abstract discussion of abstractions. With a better taxonomy of abstractions the debate can be resolved.

    The Constraint Interpretation of Physical Emergence

    I develop a variant of the constraint interpretation of the emergence of purely physical (non-biological) entities, focusing on the principle of the non-derivability of actual physical states from possible physical states (physical laws) alone. While this is a necessary condition for any account of emergence, it is not sufficient, for it becomes trivial if not extended to types of constraint that specifically constitute physical entities, namely, those that individuate and differentiate them. Because physical organizations with these features are in fact interdependent sets of such constraints, and because such constraints on physical laws cannot themselves be derived from physical laws, physical organization is emergent. These two complementary types of constraint are components of a complete non-reductive physicalism, comprising a non-reductive materialism and a non-reductive formalism.

    Brain Complexity: Analysis, Models and Limits of Understanding

    Manifold initiatives try to utilize the operational principles of organisms and brains to develop alternative, biologically inspired computing paradigms. This paper reviews key features of the standard method applied to complexity in the cognitive and brain sciences, i.e. decompositional analysis. Projects investigating the nature of computations by cortical columns are discussed that exemplify the application of this standard method. New findings are mentioned indicating that the concept of the basic uniformity of the cortex is untenable. The claim is discussed that non-decomposability is not an intrinsic property of complex, integrated systems but is only in our eyes, due to insufficient mathematical techniques. Using Rosen's modeling relation, the scientific analysis method itself is made a subject of discussion. It is concluded that the fundamental assumption of cognitive science, i.e., that cognitive and other complex systems are decomposable, must be abandoned.

    Less is Different: Emergence and Reduction Reconciled

    Get PDF
    This is a companion to another paper. Together they rebut two widespread philosophical doctrines about emergence. The first, and main, doctrine is that emergence is incompatible with reduction. The second is that emergence is supervenience; or more exactly, supervenience without reduction. In the other paper, I develop these rebuttals in general terms, emphasising the second rebuttal. Here I discuss the situation in physics, emphasising the first rebuttal. I focus on limiting relations between theories and illustrate my claims with four examples, each of them a model or a framework for modelling, from well-established mathematics or physics. I take emergence as behaviour that is novel and robust relative to some comparison class. I take reduction as, essentially, deduction. The main idea of my first rebuttal will be to perform the deduction after taking a limit of some parameter. Thus my first main claim will be that in my four examples (and many others), we can deduce a novel and robust behaviour by taking the limit, as N goes to infinity, of a parameter N. But on the other hand, this does not show that the infinite limit is "physically real", as some authors have alleged. For my second main claim is that in these same examples, there is a weaker, yet still vivid, novel and robust behaviour that occurs before we get to the limit, i.e. for finite N. And it is this weaker behaviour which is physically real. My examples are: the method of arbitrary functions (in probability theory); fractals (in geometry); superselection for infinite systems (in quantum theory); and phase transitions for infinite systems (in statistical mechanics).
    Comment: 75 p
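    The finite-N versus infinite-N contrast in this abstract can be illustrated with a deliberately simple toy that is not one of the paper's four examples: the fraction of heads in N fair coin flips. In the limit as N goes to infinity the fraction is exactly 1/2, a sharp, deterministic behaviour deduced by taking the limit; at finite N there is the weaker but still vivid behaviour of concentration near 1/2. The function name and parameters below are my own illustrative choices.

```python
import random

def head_fraction(n, seed=0):
    """Fraction of heads in n fair coin flips: a toy order parameter."""
    rng = random.Random(seed)
    return sum(rng.random() < 0.5 for _ in range(n)) / n

# At finite n the fraction is random but concentrates near 1/2, with
# typical deviations of order 1/sqrt(n); only in the n -> infinity
# limit does it become exactly 1/2.
```

    Running head_fraction for increasing n shows deviations from 1/2 shrinking roughly as 1/sqrt(n): the finite-N shadow of the limiting behaviour, which is the kind of "weaker, yet still vivid" behaviour the abstract describes as physically real.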