15 research outputs found

    A Defense of Pure Connectionism

    Connectionism is an approach to neural-networks-based cognitive modeling that encompasses the recent deep learning movement in artificial intelligence. It came of age in the 1980s, with its roots in cybernetics and earlier attempts to model the brain as a system of simple parallel processors. Connectionist models center on statistical inference within neural networks with empirically learnable parameters, which can be represented as graphical models. More recent approaches focus on learning and inference within hierarchical generative models. Contra influential and ongoing critiques, I argue in this dissertation that the connectionist approach to cognitive science possesses in principle (and, as is becoming increasingly clear, in practice) the resources to model even the richest and most distinctively human cognitive capacities, such as abstract, conceptual thought and natural language comprehension and production. Consonant with much previous philosophical work on connectionism, I argue that a core principle—that proximal representations in a vector space have similar semantic values—is the key to a successful connectionist account of the systematicity and productivity of thought, language, and other core cognitive phenomena.
    My work here differs from preceding work in philosophy in several respects: (1) I compare a wide variety of connectionist responses to the systematicity challenge and isolate two main strands that are both historically important and reflected in ongoing work today: (a) vector symbolic architectures and (b) (compositional) vector space semantic models; (2) I consider very recent applications of these approaches, including their deployment on large-scale machine learning tasks such as machine translation; (3) I argue, again on the basis mostly of recent developments, for a continuity in representation and processing across natural language, image processing, and other domains; (4) I explicitly link broad, abstract features of connectionist representation to recent proposals in cognitive science similar in spirit, such as hierarchical Bayesian and free energy minimization approaches, and offer a single rebuttal of criticisms of these related paradigms; (5) I critique recent alternative proposals that argue for a hybrid Classical (i.e. serial symbolic)/statistical model of mind; (6) I argue that defending the most plausible form of a connectionist cognitive architecture requires rethinking certain distinctions that have figured prominently in the history of the philosophy of mind and language, such as that between word- and phrase-level semantic content, and between inference and association.
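The core principle the abstract invokes (nearby points in a vector space carry similar semantic values) can be illustrated with a minimal sketch. The toy word vectors below are invented purely for illustration; in practice such embeddings are learned empirically from corpus data:

```python
import math

def cosine(u, v):
    """Cosine similarity: close to 1 for vectors pointing in similar directions."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hypothetical 3-dimensional word vectors, for illustration only.
vec = {
    "cat": [0.9, 0.8, 0.1],
    "dog": [0.8, 0.9, 0.2],
    "carburetor": [0.1, 0.0, 0.95],
}

# Semantically related words sit closer together than unrelated ones,
# so their cosine similarity is higher.
assert cosine(vec["cat"], vec["dog"]) > cosine(vec["cat"], vec["carburetor"])
```

On this picture, systematicity and productivity are to be recovered from geometric structure: composing and comparing meanings becomes operating on positions in the space rather than manipulating discrete symbols.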

    The evolution of language: Proceedings of the Joint Conference on Language Evolution (JCoLE)


    What Should Schools Teach? Disciplines, subjects and the pursuit of truth

    The design of school curriculums involves deep thought about the nature of knowledge and its value to learners and society. It is a serious responsibility that raises a number of questions. What is knowledge for? What knowledge is important for children to learn? How do we decide what knowledge matters in each school subject? And how far should the knowledge we teach in school be related to academic disciplinary knowledge? These and many other questions are taken up in What Should Schools Teach? The blurring of distinctions between pedagogy and curriculum, and between experience and knowledge, has served up a confusing message for teachers about the part that each plays in the education of children. Schools teach through subjects, but there is little consensus about what constitutes a subject and what subjects are for. This book aims to dispel confusion through a robust rationale for what schools should teach that offers key understanding to teachers of the relationship between knowledge (what to teach) and their own pedagogy (how to teach), and how both need to be informed by values of intellectual freedom and autonomy. This second edition includes new chapters on Chemistry, Drama, Music and Religious Education, and an updated chapter on Biology. A revised introduction reflects on emerging discourse around decolonizing the curriculum, and on the relationship between the knowledge that children encounter at school and in their homes.

    What Should Schools Teach?

    The design of school curriculums involves deep thought about the nature of knowledge and its value to learners and society. It is a serious responsibility that raises a number of questions. What is knowledge for? What knowledge is important for children to learn? How do we decide what knowledge matters in each school subject? And how far should the knowledge we teach in school be related to academic disciplinary knowledge? These and many other questions are taken up in What Should Schools Teach? The blurring of distinctions between pedagogy and curriculum, and between experience and knowledge, has served up a confusing message for teachers about the part that each plays in the education of children. Schools teach through subjects, but there is little consensus about what constitutes a subject and what subjects are for. This book aims to dispel confusion through a robust rationale for what schools should teach that offers key understanding to teachers of the relationship between knowledge (what to teach) and their own pedagogy (how to teach), and how both need to be informed by values of intellectual freedom and autonomy. This second edition includes new chapters on Chemistry, Drama, Music and Religious Education, and an updated chapter on Biology. A revised introduction reflects on emerging discourse around decolonizing the curriculum, and on the relationship between the knowledge that children encounter at school and in their homes. Praise for What Should Schools Teach? ‘This book brings profound questions about what children need to know back to the centre of educational enquiry where they belong. The additional chapters in this second edition are excellent. We all need to read it.’ Professor Elizabeth Rata, University of Auckland ‘I am afraid that what we actually teach is so often forgotten in debates about schools. Subjects – the way that most people choose to divide up human knowledge – are too rarely the focus of our interest. Yet the subjects we offer and the syllabus content of each is arguably the most important single element of the school system. This book bucks the trend and should be of great importance to all teachers.’ Barnaby Lenon, University of Buckingham.

    Adjectivization in Russian: Analyzing participles by means of lexical frequency and constraint grammar

    This dissertation explores the factors that restrict and facilitate adjectivization in Russian, an affixless part-of-speech change leading to ambiguity between participles and adjectives. I develop a theoretical framework based on major approaches to adjectivization, and assess the effect of the factors on ambiguity in the empirical data. I build a linguistic model using the Constraint Grammar formalism. The model utilizes the factors of adjectivization and corpus frequencies as formal constraints for differentiating between participles and adjectives in a disambiguation task. The main question that is explored in this dissertation is which linguistic factors allow for the differentiation between adjectivized and unambiguous participles. Another question concerns which factors, syntactic or morphological, predict ambiguity in the corpus data and resolve it in the disambiguation model. In the theoretical framework, the syntactic context signals whether a participle is adjectivized, whereas internal morphosemantic properties (that is, tense, voice, and lexical meaning) cause or prevent adjectivization. The exploratory analysis of these factors in the corpus data reveals diverse results. The syntactic factor, the adverb of measure and degree očenʹ ‘very’, which is normally used with adjectives, also combines with participles, and is strongly associated with semantic classes of their base verbs. Nonetheless, the use of očenʹ with a participle only indicates ambiguity when other syntactic factors of adjectivization are in place. The lexical frequency (including the ranks of base verbs and the ratios of participles to other verbal forms) and several morphological types of participles strongly predict ambiguity. Furthermore, past passive and transitive perfective participles not only have the highest mean ratios among the other morphological types of participles, but are also strong predictors of ambiguity. 
    The linguistic model using weighted syntactic rules shows the highest accuracy in disambiguation compared to the models with weighted morphological rules or the rule based on weights only. All of the syntactic, morphological, and weighted rules combined show the best performance results. Weights are the most effective for removing residual ambiguity (similar to the statistical baseline model), but are outperformed by the models that use factors of adjectivization as constraints.
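The rules-plus-weights setup described above can be sketched in miniature: context-sensitive syntactic rules fire first, and corpus-derived lexical weights resolve whatever ambiguity remains. The rules, lexical items, and weight values below are hypothetical stand-ins for the dissertation's actual Constraint Grammar rules and frequency data:

```python
# Toy disambiguation between participle and adjective readings, loosely in
# the spirit of Constraint Grammar. All rules and weights are invented
# for illustration.

RULES = [
    # (condition on the context, reading that the rule selects)
    (lambda ctx: "ochen" in ctx["left"], "adjective"),    # degree adverb (očenʹ 'very') favors the adjectival reading
    (lambda ctx: ctx["has_agent_phrase"], "participle"),  # an overt agent phrase keeps the reading verbal
]

# Hypothetical corpus weights: share of adjectival uses per lexeme.
WEIGHTS = {"izvestnyj": 0.9, "postroennyj": 0.2}

def disambiguate(word, ctx):
    # Syntactic context rules take priority over lexical statistics.
    for condition, reading in RULES:
        if condition(ctx):
            return reading
    # Residual ambiguity: fall back on the lexical frequency weight.
    return "adjective" if WEIGHTS.get(word, 0.5) > 0.5 else "participle"

print(disambiguate("izvestnyj", {"left": [], "has_agent_phrase": False}))   # adjective
print(disambiguate("postroennyj", {"left": [], "has_agent_phrase": True}))  # participle
```

The ordering mirrors the finding reported above: constraint-style rules grounded in the factors of adjectivization outperform weights alone, while weights remain useful for clearing residual ambiguity.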
