MediaSync: Handbook on Multimedia Synchronization
This book provides an approachable overview of the most recent advances in the fascinating field of media synchronization (mediasync), gathering contributions from the most representative and influential experts. Understanding the challenges of this field in the current multi-sensory, multi-device, and multi-protocol world is not an easy task. The book revisits the foundations of mediasync, including theoretical frameworks and models; highlights ongoing research efforts, such as hybrid broadband broadcast (HBB) delivery and the modeling of users' perception (i.e., Quality of Experience, or QoE); and paves the way for the future (e.g., towards the deployment of multi-sensory and ultra-realistic experiences). Although many advances around mediasync have been devised and deployed, this area of research is receiving renewed attention to overcome the remaining challenges in the next-generation (heterogeneous and ubiquitous) media ecosystem. Given the significant advances in this research area, its current relevance, and the multiple disciplines it involves, a reference book on mediasync has become necessary. This book fills that gap. In particular, it addresses key aspects and reviews the most relevant contributions within the mediasync research space from different perspectives. MediaSync: Handbook on Multimedia Synchronization is the perfect companion for scholars and practitioners who want to acquire strong knowledge of this research area, and also to approach the challenges behind ensuring the best mediated experiences by providing adequate synchronization between the media elements that constitute those experiences.
A Deadline Aware Real-time Routing Protocol for Wireless Sensor Networks
Wireless sensor networks (WSNs) find application in real-time event reporting and data gathering. When a sensor detects an event, it reports it to the base station, which then takes appropriate action. The course of action should have finite, bounded delays, defining a hard real-time constraint for time-critical applications. This work proposes a network-layer, deadline-aware real-time routing protocol, which assumes a collision-free, known-delay MAC (Medium Access Control) layer. The protocol works in three phases: the initialization phase, the path establishment phase, and the bandwidth division phase. It ensures bounded delay in the transmission of sensed data to the sink. It establishes a single path from each sensor node to the sink and allocates bandwidth for that path, thereby reducing the time required for the sensed data to reach the sink.
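The key consequence of assuming a collision-free, known-delay MAC layer is that the worst-case end-to-end delay along a node's single path to the sink is simply the sum of the known per-hop delays, so deadline admission becomes a comparison. A minimal sketch of that idea (the function names and delay values are illustrative, not from the paper):

```python
# Sketch under the paper's assumption of a collision-free MAC with known
# per-hop delays: the end-to-end bound along a fixed path to the sink is
# the sum of hop delays, and a flow is admissible iff bound <= deadline.

def path_delay_bound(hop_delays_ms):
    """Worst-case end-to-end delay (ms) along a fixed path to the sink."""
    return sum(hop_delays_ms)

def meets_deadline(hop_delays_ms, deadline_ms):
    """Admit the path only if its delay bound fits within the deadline."""
    return path_delay_bound(hop_delays_ms) <= deadline_ms

# Example: a 3-hop path with known MAC-layer delays of 10, 15 and 12 ms.
print(path_delay_bound([10, 15, 12]))    # 37
print(meets_deadline([10, 15, 12], 40))  # True: 37 ms fits a 40 ms deadline
```

This is why the protocol can guarantee hard real-time behaviour at the network layer: once the path is established and bandwidth is reserved for it, the bound cannot be violated by contention.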
Delayed Visual Feedback of One’s Own Action Promotes Sense of Control for Auditory Events
Sense of control refers to one's feeling of controlling environmental events through one's own actions. A prevailing view is that the sense of control is strong (or is not diminished) when predicted sensory signals, generated by motor control mechanisms, are consistent with afferent sensory signals. Such an intact sense of control often leads to misjudgment of the temporal relation between the timing of one's action and its effect (so-called intentional binding). The present study showed that intentional binding can be enhanced by delayed visual feedback of an agent's action. We asked participants to press a button to produce a tone as an action outcome. In some conditions, they were given delayed visual feedback of their button press. Participants judged whether the onset of the auditory outcome was delayed relative to the timing of their button press. Delay detection thresholds were significantly higher when the visual feedback was delayed by 0.2 or 0.4 s than when no feedback was displayed. The results indicate that action agents misjudge the timing of their action (the button press) in the presence of delayed visual feedback of that action. Interestingly, delay detection thresholds were strongly correlated with the subjective magnitude of the sense of control. Thus, the sense of control is possibly determined by cross-modal processing of action-related and outcome-related sensory signals.
Mental and motor representation for music performance
This research proposes a theory of nonconscious motor representation which precedes mental representation of the outcome of motor actions in music performance. The music performer faces the problem of how to escape sedimented musical paradigms to produce novel configurations of dynamics, timing and tone colour. If the sound were mentally represented as an action goal prior to being produced, it would tend to be assimilated to a known action goal. The proposed theory is intended to account for creativity in music performance, but has implications in other areas for both creativity and motor actions.
The investigation began with an ethnographic study of two 'posthardcore' rock bands in London and Bristol. Posthardcore musicians work with minimal explicit knowledge of music theory and cognitive involvement in performance is actively eschewed. Serendipitous musical felicities in performance are valued. Such felicities depend on adjustment and fine control of dynamics, timing and tone colour within the parameters of the given.
A selective survey of music aesthetics shows that the defining qualities of music are the production of immanent rather than representational meaning; polysemy; and processuality. Taking an analytic philosophy and cognitive science approach, I argue that apprehensions of immanent meaning depend on relationships between proximal percepts within the specious present. A general argument for nonconceptual perceptual content as perception of relations between magnitudes within the specious present is extended to music and argued to account for both the polysemic richness of music and its processuality. Nonconceptual relational perception can account for novel apprehensions by music listeners, but not for the production of novel configurations by the performer. I argue that motor creativity in music performance is achieved through the nonconscious parameterization of inverse models without conscious representation of the goal of the action. Conscious representation for the performer occurs when they hear their own performance.
Scene analysis in the natural environment
The problem of scene analysis has been studied in a number of different fields over the past decades. These studies have led to a number of important insights into problems of scene analysis, but not all of these insights are widely appreciated. Despite this progress, there are also critical shortcomings in current approaches that hinder further progress. Here we take the view that scene analysis is a universal problem solved by all animals, and that we can gain new insight by studying the problems that animals face in complex natural environments. In particular, the jumping spider, songbird, echolocating bat, and electric fish all exhibit behaviors that require robust solutions to scene analysis problems encountered in the natural environment. By examining the behaviors of these seemingly disparate animals, we emerge with a framework for studying scene analysis comprising four essential properties: 1) the ability to solve ill-posed problems, 2) the ability to integrate and store information across time and modality, 3) efficient recovery and representation of 3D scene structure, and 4) the use of optimal motor actions for acquiring information to progress towards behavioral goals.
Closing the gap: human factors in cross-device media synchronization
The continuing growth in the mobile phone arena, particularly in terms of device capabilities and ownership, is having a transformational impact on media consumption. It is now possible to consider orchestrated multi-stream experiences delivered across many devices, rather than the playback of content from a single device. However, there are significant challenges in realising such a vision, particularly around the management of synchronicity between associated media streams. This is compounded by the heterogeneous nature of user devices, the networks upon which they operate, and the perceptions of users. This paper describes IMSync, an open, QoE-aware inter-stream synchronisation framework. IMSync adopts efficient monitoring and control mechanisms, alongside a QoE perception model derived from a series of subjective user experiments. Based on an observation of lag, IMSync uses this impact model to determine an appropriate strategy for catching up with playback whilst minimising the potential detrimental impact on a user's QoE. The impact model adopts a balanced approach: trading off the potential QoE impact of initiating a re-synchronisation process against that of retaining the current level of non-synchronicity, in order to maintain high levels of QoE. A series of experiments demonstrate the potential of the framework as a basis for enabling new, immersive media experiences.
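The core decision the abstract describes is a trade-off: re-synchronising is itself a perceptible disruption, so it is only worthwhile when the observed lag is perceptible and its sustained QoE cost exceeds the one-off cost of a correction. A minimal sketch of that decision rule follows; the function names, thresholds, and cost weights are illustrative assumptions, not IMSync's actual model:

```python
# Hedged sketch (not IMSync's published model): trigger a catch-up only
# when the estimated QoE cost of re-synchronising is lower than the cost
# of leaving the observed inter-stream lag in place.

def resync_cost(lag_ms, skip_penalty=1.0):
    """Cost of a visible catch-up (skip/speed-up); grows with jump size."""
    return skip_penalty * (lag_ms / 100.0)

def lag_cost(lag_ms, annoyance_per_100ms=1.5):
    """Cost of sustained asynchrony; assumed to outweigh a one-off fix
    once the lag is clearly noticeable."""
    return annoyance_per_100ms * (lag_ms / 100.0)

def should_resync(lag_ms, threshold_ms=80):
    """Re-sync only when lag is perceptible and correcting is cheaper."""
    if lag_ms < threshold_ms:  # below the perception threshold: leave it
        return False
    return resync_cost(lag_ms) < lag_cost(lag_ms)

print(should_resync(50))   # False: lag below the perception threshold
print(should_resync(200))  # True: correction costs less than the lag
```

In a real framework the two cost functions would be fitted to subjective-experiment data rather than fixed linear weights, but the structure of the decision is the same: a perception threshold gating a cost comparison.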