8,033 research outputs found

    Mobile support in CSCW applications and groupware development frameworks

    Computer Supported Cooperative Work (CSCW) is an established subset of the field of Human Computer Interaction that deals with how people use computing technology to enhance group interaction and collaboration. Mobile CSCW has emerged as a result of the progression from personal desktop computing to the mobile device platforms that are ubiquitous today. CSCW aims not only to connect people and facilitate communication through computers, but also to provide conceptual models coupled with technology to manage, mediate, and assist collaborative processes. Mobile CSCW research looks to fulfil these aims through the adoption of mobile technology and consideration for the mobile user. Facilitating collaboration using mobile devices brings new challenges. Some of these challenges are inherent to the nature of the device hardware, while others concern how to engineer software to maximize effectiveness for end-users. This paper reviews seminal and state-of-the-art cooperative software applications and development frameworks, and their support for mobile devices.

    Managing the Adaptive Processing of Distributed Multimedia Information

    The term multimedia conjures up visions of desktop computers reproducing digital movies, high-resolution images and stereo sound. While many current systems support such functionality, none do so elegantly, especially when data is fetched and synchronized from dissimilar sources distributed across resource-limited networks. Our research investigates general approaches for managing the flow of multimedia information in a distributed computing environment, providing adaptive support for time-sensitive retrieval and presentation based on multimedia document specifications. The benefit of our approach is that it provides flexible, content-based utilization of resources without over
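
    The idea of adaptive, deadline-driven retrieval based on a multimedia document specification can be pictured with a small sketch. The MediaSpec fields, the planRetrieval function, and the quality-fallback policy below are illustrative assumptions, not the design described in the paper.

```typescript
// Illustrative only: the specification fields and the fallback policy are
// assumptions, not the paper's actual design.

// One entry in a multimedia document specification: what to fetch and by when
// it must be ready for presentation.
interface MediaSpec {
  url: string;
  deadlineMs: number;       // presentation deadline relative to session start
  sizeBytes: number;        // size of the full-quality resource
  degradedBytes?: number;   // size of an optional lower-quality alternative
}

// For each item, check whether the full-quality resource can arrive before its
// deadline at the currently measured bandwidth; otherwise plan the degraded form.
function planRetrieval(spec: MediaSpec[], bandwidthBytesPerMs: number) {
  return spec.map((item) => {
    const fullFetchMs = item.sizeBytes / bandwidthBytesPerMs;
    const useDegraded =
      fullFetchMs > item.deadlineMs && item.degradedBytes !== undefined;
    return { url: item.url, degraded: useDegraded };
  });
}
```

    The point of such a scheme is that resource decisions are driven by the document specification itself rather than hard-coded per application, which is one way to read the abstract's "content-based utilization of resources".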

    MediaSync: Handbook on Multimedia Synchronization

    This book provides an approachable overview of the most recent advances in the fascinating field of media synchronization (mediasync), gathering contributions from the most representative and influential experts. Understanding the challenges of this field in the current multi-sensory, multi-device, and multi-protocol world is not an easy task. The book revisits the foundations of mediasync, including theoretical frameworks and models, highlights ongoing research efforts, such as hybrid broadband broadcast (HBB) delivery and the modeling of users' perception (i.e., Quality of Experience, or QoE), and paves the way for the future (e.g., towards the deployment of multi-sensory and ultra-realistic experiences). Although many advances around mediasync have been devised and deployed, this research area is receiving renewed attention to overcome the remaining challenges in the next-generation (heterogeneous and ubiquitous) media ecosystem. Given the significant advances in this research area, its current relevance, and the multiple disciplines it involves, a reference book on mediasync has become necessary, and this book fills that gap. In particular, it addresses key aspects and reviews the most relevant contributions within the mediasync research space from different perspectives. MediaSync: Handbook on Multimedia Synchronization is the perfect companion for scholars and practitioners who want to acquire strong knowledge of this research area and to approach the challenges of ensuring the best mediated experiences by providing adequate synchronization between the media elements that constitute them.

    A Web-Based Collaborative Multimedia Presentation Document System

    With the rapidly increasing volume of distributed data and the fast evolution of modern web browsers, the browser has become a viable vehicle for remote interactive multimedia presentation and collaboration, especially for geographically dispersed teams. To our knowledge, although a large number of applications have been developed for these purposes, prior work has several drawbacks, including the lack of interactive control of presentation flows, of general-purpose collaboration support for multimedia, and of efficient and precise replay of presentations. To fill these research gaps, in this dissertation we propose a web-based collaborative multimedia presentation document system, which models a presentation as media resources together with a stream of media events attached to the associated media objects. It represents presentation flows and collaboration actions as events, implements temporal and spatial scheduling of multimedia objects, and supports real-time interactive control of the predefined schedules. Because all events are represented as simple messages with an object-prioritized approach, our platform can also support fine-grained, precise replay of presentations. Hundreds of kilobytes can be enough to store the events of a collaborative presentation session for accurate replay, compared with the hundreds of megabytes required by screen-recording tools with a pixel-based replay mechanism.
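
    To make the event-based model concrete, here is a minimal sketch of what an object-prioritized event message and its replay could look like. The MediaObject and MediaEvent shapes and the replay helper are assumptions for illustration, not the dissertation's actual schema.

```typescript
// Illustrative sketch only: these interfaces are assumptions about what an
// object-prioritized event message could look like, not the system's schema.

// A media object placed in the presentation.
interface MediaObject {
  id: string;
  kind: "image" | "video" | "audio" | "slide";
  src: string;                       // resource URL
}

// A timestamped event describing a presentation-flow or collaboration action,
// attached to the media object it affects.
interface MediaEvent {
  objectId: string;                  // events are keyed to objects, not pixels
  time: number;                      // milliseconds from session start
  action: "show" | "hide" | "move" | "play" | "pause" | "annotate";
  payload?: Record<string, unknown>; // e.g. a new position or annotation text
}

// Replaying a session is re-applying the ordered event log to the objects.
function replay(events: MediaEvent[], apply: (e: MediaEvent) => void): void {
  [...events]
    .sort((a, b) => a.time - b.time)
    .forEach((e) => setTimeout(() => apply(e), e.time));
}
```

    Because each action is a small message keyed to an object rather than a captured frame, an entire session can be stored in a few hundred kilobytes, which is the contrast the abstract draws with pixel-based screen recording.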

    Multiple multimodal mobile devices: Lessons learned from engineering lifelog solutions

    For lifelogging, or the recording of one’s life history through digital means, to be successful, a range of separate multimodal mobile devices must be employed. These include smartphones such as the N95, the Microsoft SenseCam (a wearable passive photo-capture device), and wearable biometric devices. Each collects a facet of the bigger picture, through, for example, personal digital photos, mobile messages and document-access history, but unfortunately they operate independently and unaware of each other. This creates significant challenges for the practical application of these devices, the use and integration of their data, and their operation by a user. In this chapter we discuss the software engineering challenges and their implications for those working on the integration of data from multiple ubiquitous mobile devices, drawing on our experience with such technology over the past several years in developing integrated personal lifelogs. The chapter serves as an engineering guide for those considering work in the domain of lifelogging and, more generally, for those working with multiple multimodal devices and the integration of their data.
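
    As a rough illustration of the integration problem the chapter discusses, the sketch below normalizes timestamped records from several devices into a single timeline. The LifelogItem shape, the device labels, and the mergeStreams helper are hypothetical, not the chapter's actual data model.

```typescript
// Hypothetical record shape: each device exports items in its own format and
// is normalized to this common timestamped form before merging.
interface LifelogItem {
  device: "phone" | "sensecam" | "biometric"; // illustrative device labels
  timestamp: number;    // epoch milliseconds, assumed already clock-aligned
  kind: string;         // e.g. "photo", "sms", "document-access", "heart-rate"
  data: unknown;
}

// Merge per-device streams into a single chronologically ordered lifelog.
// In practice, aligning clocks across independent devices is the hard part;
// this sketch assumes timestamps already share a common reference.
function mergeStreams(...streams: LifelogItem[][]): LifelogItem[] {
  return streams.flat().sort((a, b) => a.timestamp - b.timestamp);
}
```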