
    Leveraging video annotations in video-based e-learning

    The e-learning community has been producing and using video content for a long time, and in recent years the advent of MOOCs has relied heavily on video recordings of teachers' courses. Video annotations are pieces of information that can be anchored in the temporality of the video so as to sustain various processes, ranging from active reading to rich-media editing. In this position paper we study how video annotations can be used in an e-learning context, especially MOOCs, from the triple point of view of pedagogical processes, the functionality of current technical platforms, and current challenges. Our analysis is that there is still plenty of room for leveraging video annotations in MOOCs beyond simple active reading, namely live annotation, performance annotation and annotation for assignment, and that new developments are needed to accompany this evolution. Comment: 7th International Conference on Computer Supported Education (CSEDU), Barcelona, Spain (2014).
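    An annotation "anchored in the temporality of the video" is essentially a timestamped record attached to a recording. The sketch below is illustrative only, not taken from the paper or any MOOC platform; all names and fields are assumptions about how such time-anchored annotations could be represented and queried.

```python
# Minimal sketch (illustrative assumption, not the paper's data model) of a
# time-anchored video annotation and a lookup by playback time.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class VideoAnnotation:
    """An information piece anchored in the temporality of a video."""
    video_id: str            # identifier of the annotated recording
    begin_s: float           # anchor start, in seconds from the video start
    end_s: Optional[float]   # optional end for interval anchors (None = point anchor)
    body: str                # the annotation content (note, question, marker, ...)
    author: str              # who created it (learner, teacher, ...)
    tags: List[str] = field(default_factory=list)


def annotations_at(annotations: List[VideoAnnotation], t: float) -> List[VideoAnnotation]:
    """Return the annotations whose anchor covers playback time t (in seconds)."""
    return [a for a in annotations
            if a.begin_s <= t <= (a.end_s if a.end_s is not None else a.begin_s)]
```

    In such a model, a point anchor (no end time) would suit live annotation of a single moment, while an interval anchor supports active reading of a whole passage or grading an assignment segment.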

    Collaborative Project for Documenting Minority Languages in Indonesia and Malaysia


    User-centered development of a Virtual Research Environment to support collaborative research events

    This paper discusses the user-centred development process within the Collaborative Research Events on the Web (CREW) project, funded under the JISC Virtual Research Environments (VRE) programme. After presenting the project, its aims and the functionality of the CREW VRE, we focus on the user engagement approach, grounded in the method of co-realisation. We describe the different research settings and requirements of our three embedded user groups and the respective activities conducted so far. Finally, we elaborate on the main challenges of our user engagement approach and end with the project’s next steps.

    Students taking notes and creating summaries together (or not)

    Two collaborative elearning projects using cloud-based productivity tools were undertaken in a large first-year common-core business information systems and technology unit at an Australian university. The first project involved collaborative synchronous and asynchronous note taking, and the second involved collaborative synchronous and asynchronous summarising of unit materials. Enrolment was optional and very low (less than 3 per cent of approximately 600 students), and active participation was even lower, even with considerable support provided. The results suggest that students need strong motivation to participate actively, especially when lurking can yield seemingly similar results. Students who did participate actively suggested that active participation was probably more useful than the collaboration itself, and somewhat resented students who lurked. Collaborative elearning offers many rewards for students, teachers, and organisations, and the technology is available to facilitate this even in very large classes, but it appears significantly harder to achieve than anticipated.

    Processing and Linking Audio Events in Large Multimedia Archives: The EU inEvent Project

    In the inEvent EU project [1], we aim at structuring, retrieving, and sharing large archives of networked, and dynamically changing, multimedia recordings, mainly consisting of meetings, videoconferences, and lectures. More specifically, we are developing an integrated system that performs audiovisual processing of multimedia recordings and labels them in terms of interconnected “hyper-events” (a notion inspired by hyper-texts). Each hyper-event is composed of simpler facets, including audio-video recordings and metadata, which are then easier to search, retrieve and share. In the present paper, we mainly cover the audio processing aspects of the system, including speech recognition, speaker diarization and linking (across recordings), the use of these features for hyper-event indexing and recommendation, and the search portal. We present initial results for feature extraction from lecture recordings using the TED talks. Index Terms: networked multimedia events; audio processing; speech recognition; speaker diarization and linking; multimedia indexing and searching; hyper-events.
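    As a rough illustration of the processing chain the abstract describes (speech recognition plus speaker diarization feeding hyper-event indexing), the sketch below merges the two kinds of output into time-stamped facets. It is a simplified assumption about how such facets might be assembled, not the inEvent system itself; transcribe() and diarize() are hypothetical stand-ins with placeholder output.

```python
# Illustrative sketch only: combine ASR and diarization output into the
# time-stamped "facets" of a hyper-event, ready for indexing and search.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Facet:
    """One facet of a hyper-event: a labelled, time-stamped piece of metadata."""
    kind: str        # e.g. "transcript-segment" or "speaker-turn"
    begin_s: float
    end_s: float
    data: Dict[str, str]


def transcribe(recording_path: str) -> List[Facet]:
    """Stand-in ASR step: a real system would produce transcript segments here."""
    return [Facet("transcript-segment", 0.0, 4.2, {"text": "welcome to the lecture"})]


def diarize(recording_path: str) -> List[Facet]:
    """Stand-in diarization step: a real system would produce speaker turns here."""
    return [Facet("speaker-turn", 0.0, 4.2, {"speaker": "spk_1"})]


def build_hyper_event(recording_path: str) -> List[Facet]:
    """Merge the per-recording facets on a common timeline for indexing."""
    facets = transcribe(recording_path) + diarize(recording_path)
    return sorted(facets, key=lambda f: f.begin_s)


if __name__ == "__main__":
    for facet in build_hyper_event("lecture_001.mp4"):   # hypothetical file name
        print(facet)
```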

    Synote: development of a Web-based tool for synchronized annotations

    This paper discusses the development of a Web-based media annotation application named Synote, which addresses the important issue that, while a whole multimedia resource on the Web can easily be bookmarked, searched, linked to and tagged, it is still difficult to search for, or associate notes and other resources with, a particular part of a resource. Synote supports the creation of synchronized notes, bookmarks, tags, links, images and text captions. It is a freely available application that enables any user to make and search annotations on any fragment of a continuous multimedia resource in the most widely used browsers and operating systems. In the implementation, Synote categorizes the different media resources and synchronizes them via a timeline. The presentation of synchronized resources makes full use of Web 2.0 AJAX technology to enrich interoperability and the user experience. Positive evaluation results regarding the performance, efficiency and effectiveness of Synote were obtained when it was used by students and teachers on a number of undergraduate courses.
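    A minimal sketch of the synchronization idea described above, assuming a Synote-like model in which different resource types share one timeline; the names and the search function below are illustrative assumptions, not Synote's actual API.

```python
# Illustrative sketch: notes, bookmarks, tags and captions share one timeline,
# so annotations attached to a given media fragment can be searched by text.
from dataclasses import dataclass
from typing import List


@dataclass
class SyncedResource:
    kind: str       # "note", "bookmark", "tag", "caption", "image", ...
    start_s: float  # position on the shared timeline, in seconds
    end_s: float
    text: str       # searchable content


def search_fragment(resources: List[SyncedResource],
                    start_s: float, end_s: float, query: str) -> List[SyncedResource]:
    """Return resources overlapping [start_s, end_s] whose text matches the query."""
    q = query.lower()
    return [r for r in resources
            if r.start_s < end_s and r.end_s > start_s and q in r.text.lower()]


# Example: find notes mentioning "recap" in the first five minutes of a lecture.
demo = [SyncedResource("note", 120.0, 150.0, "Recap of last week's lecture"),
        SyncedResource("caption", 400.0, 410.0, "Introduction to AJAX")]
print(search_fragment(demo, 0.0, 300.0, "recap"))
```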