
    Continual Unsupervised Representation Learning

    Continual learning aims to improve the ability of modern learning systems to deal with non-stationary distributions, typically by attempting to learn a series of tasks sequentially. Prior art in the field has largely considered supervised or reinforcement learning tasks, and often assumes full knowledge of task labels and boundaries. In this work, we propose an approach (CURL) to tackle a more general problem that we will refer to as unsupervised continual learning. The focus is on learning representations without any knowledge of task identity, and we explore scenarios in which there are abrupt changes between tasks, smooth transitions from one task to another, or even shuffled data. The proposed approach performs task inference directly within the model, is able to dynamically expand to capture new concepts over its lifetime, and incorporates additional rehearsal-based techniques to deal with catastrophic forgetting. We demonstrate the efficacy of CURL in an unsupervised learning setting with MNIST and Omniglot, where the lack of labels ensures no information is leaked about the task. Further, we demonstrate strong performance compared to prior art in an i.i.d. setting, or when adapting the technique to supervised tasks such as incremental class learning.
    Comment: NeurIPS 201
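    The rehearsal idea mentioned in this abstract can be illustrated with a generic reservoir-sampling replay buffer: store a bounded, uniformly sampled subset of the stream and mix it back into later training batches. This is a minimal sketch of the general technique, not CURL's actual mechanism; all names here are invented for illustration.

    ```python
    import random

    class ReservoirReplayBuffer:
        """Fixed-size rehearsal buffer filled by reservoir sampling,
        so every example seen so far has equal probability of being stored."""

        def __init__(self, capacity, seed=0):
            self.capacity = capacity
            self.buffer = []
            self.seen = 0
            self.rng = random.Random(seed)

        def add(self, example):
            self.seen += 1
            if len(self.buffer) < self.capacity:
                self.buffer.append(example)
            else:
                # Replace a stored item with probability capacity / seen.
                j = self.rng.randrange(self.seen)
                if j < self.capacity:
                    self.buffer[j] = example

        def sample(self, k):
            # Draw old examples to mix into the current training batch.
            return self.rng.sample(self.buffer, min(k, len(self.buffer)))

    # Stream 1000 examples through a buffer of 50, then rehearse 8 of them.
    buf = ReservoirReplayBuffer(capacity=50)
    for x in range(1000):
        buf.add(x)
    rehearsal = buf.sample(8)
    ```

    Interleaving such rehearsal samples with new-task data is the simplest way replay-based methods counter catastrophic forgetting.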

    Learning Representations for New Sound Classes With Continual Self-Supervised Learning

    In this paper, we work on a sound recognition system that continually incorporates new sound classes. Our main goal is to develop a framework in which the model can be updated without relying on labeled data. For this purpose, we propose adopting representation learning, where an encoder is trained using unlabeled data. This learning framework enables the study and implementation of a practically relevant use case where only a small amount of labeled data is available in a continual learning context. We also make the empirical observation that a similarity-based representation learning method within this framework is robust to forgetting even if no explicit mechanism against forgetting is employed. We show that this approach achieves performance similar to that of several distillation-based continual learning methods when applied to self-supervised representation learning.
    Comment: Accepted to IEEE Signal Processing Letter
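    The distillation-based baselines this abstract compares against typically add a penalty that keeps the updated encoder's embeddings close to those of a frozen copy from before the update. A minimal sketch of such a feature-distillation term, using cosine similarity (function name, shapes, and the toy data are illustrative, not from the paper):

    ```python
    import numpy as np

    def cosine_distill_loss(new_feats, old_feats):
        """Distillation-style continual-learning penalty: encourage the
        updated encoder's embeddings to stay directionally close to the
        frozen old encoder's embeddings for the same inputs."""
        a = new_feats / np.linalg.norm(new_feats, axis=1, keepdims=True)
        b = old_feats / np.linalg.norm(old_feats, axis=1, keepdims=True)
        # 1 - mean cosine similarity: 0 when embeddings are unchanged.
        return 1.0 - np.mean(np.sum(a * b, axis=1))

    rng = np.random.default_rng(0)
    old = rng.normal(size=(4, 16))          # frozen encoder's features
    loss_same = cosine_distill_loss(old, old)   # unchanged encoder -> 0
    loss_flip = cosine_distill_loss(-old, old)  # reversed directions -> 2
    ```

    In practice this term is weighted and added to the self-supervised training loss for the new data.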

    Recent Advances of Continual Learning in Computer Vision: An Overview

    In contrast to batch learning, where all training data is available at once, continual learning represents a family of methods that accumulate knowledge and learn continuously from data arriving in sequential order. Similar to the human learning process, with its ability to learn, fuse, and accumulate new knowledge arriving at different time steps, continual learning is considered to have high practical significance. Hence, continual learning has been studied in various artificial intelligence tasks. In this paper, we present a comprehensive review of the recent progress of continual learning in computer vision. In particular, the works are grouped by their representative techniques, including regularization, knowledge distillation, memory, generative replay, parameter isolation, and combinations of the above. For each category of these techniques, both its characteristics and its applications in computer vision are presented. At the end of this overview, several subareas where continuous knowledge accumulation is potentially helpful but continual learning has not yet been well studied are discussed.
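    As a concrete instance of the regularization family surveyed here, an EWC-style quadratic penalty discourages movement of parameters that were important for earlier tasks. A minimal sketch under the usual diagonal-Fisher assumption (variable names and toy values are illustrative):

    ```python
    import numpy as np

    def ewc_penalty(params, old_params, fisher, lam=1.0):
        """Regularization-based continual learning (EWC-style): penalize
        changes to weights in proportion to their estimated importance
        (Fisher information) for previously learned tasks."""
        return 0.5 * lam * np.sum(fisher * (params - old_params) ** 2)

    old = np.array([1.0, -2.0, 0.5])      # weights after the old task
    new = np.array([1.5, -2.0, 1.5])      # candidate weights on the new task
    fisher = np.array([4.0, 1.0, 0.0])    # per-weight importance estimates
    pen = ewc_penalty(new, old, fisher, lam=2.0)
    # 0.5 * 2 * (4*0.25 + 1*0 + 0*1) = 1.0
    ```

    The penalty is added to the new task's loss, so only the unimportant (low-Fisher) weights move freely.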

    A Thick Industrial Design Studio Curriculum

    This presentation was part of the session "Pedagogy: Procedures, Scaffolds, Strategies, Tactics" at the 24th National Conference on the Beginning Design Student. This paper describes an industrial design studio course at a private university in Izmir, Turkey, where second-year industrial design students engage in a studio project for the first time. The design studio course emphasises three distinct areas of competence in designing that are the focus of the curriculum: design process, the intellectual act of solving a design problem; design concept, the imagination and sensibility to conceive of appropriate design ideas; and presentation, the ability to clearly and evocatively communicate design concepts. The studio is 'thick' with materials, tasks and activities that are intentionally sequenced to optimise learning, a process known as educational 'scaffolding.' The idea of a process, a patient journey toward its destination, is implicit in the studio, which is full of opportunities for reflection-in-action. A significant feature is the importance placed on drawing and model making. An exemplary design process should show evidence of 'breadth', meaning a wide search for solutions in which a range of alternatives is explored, followed by an incremental refinement of the chosen solution in which elements of the final design concept are developed thoroughly and in detail, called 'depth.' Learning to design is predicated on engagement in and manipulation of the elements of the design problem. Evidence of that learning will be found by examining the physical materials and results of the design process. The assessment criteria are published with the brief at the outset of the design project, and outcomes are spelt out at the end. Students are reminded throughout the project of the criteria, which is to say they are reminded of the pedagogical aims of the studio. Assessment criteria are detailed, and the advantages of summative assessment are described.