
    Design issues for the Generic Stream Encapsulation (GSE) of IP datagrams over DVB-S2

    The DVB-S2 standard has brought an unprecedented degree of novelty and flexibility in the way IP datagrams or other network-level packets can be transmitted over DVB satellite links, with the introduction of an IP-friendly link layer - the continuous Generic Streams - and the adaptive combination of advanced error coding, modulation and spectrum management techniques. Recently approved by the DVB, the Generic Stream Encapsulation (GSE) used for carrying IP datagrams over DVB-S2 implements solutions stemming from a design rationale quite different from the one behind IP encapsulation schemes over its predecessor, DVB-S. This paper highlights GSE's original design choices from the perspective of DVB-S2's innovative features and possibilities.
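    The encapsulation described in this abstract can be illustrated with a minimal sketch of the GSE base header, assuming the layout defined in ETSI TS 102 606 (Start/End indicators, 2-bit Label Type, 12-bit GSE Length); the label value and PDU bytes below are placeholder assumptions, not values from the paper:

```python
import struct

def build_gse_packet(pdu: bytes, protocol_type: int = 0x0800,
                     label: bytes = b"\x00\x01\x02\x03\x04\x05") -> bytes:
    """Encapsulate a complete (unfragmented) PDU in a single GSE packet.

    The first 16 bits carry the Start (S) and End (E) indicators, the
    2-bit Label Type (LT) and the 12-bit GSE Length, which counts every
    byte that follows these first two bytes.
    """
    s, e, lt = 1, 1, 0b00                  # complete PDU, 6-byte label
    body = struct.pack(">H", protocol_type) + label + pdu
    gse_length = len(body)                 # bytes after the 2-byte header word
    first_word = (s << 15) | (e << 14) | (lt << 12) | gse_length
    return struct.pack(">H", first_word) + body

# Toy 20-byte payload standing in for an IPv4 datagram header
pkt = build_gse_packet(b"\x45" + b"\x00" * 19)
```

A real encapsulator would also handle fragmentation (S=1/E=0 start, S=0/E=0 continuation, S=0/E=1 end fragments with a Frag ID and CRC-32), which this sketch deliberately omits.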

    Video Streaming in Evolving Networks under Fuzzy Logic Control


    Understanding Timelines within MPEG Standards

    (c) 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
    Nowadays, media content can be delivered via diverse broadband and broadcast technologies. Although these technologies have somehow become rivals, their coordinated usage and convergence, by leveraging their strengths and complementary characteristics, can bring many benefits to both operators and customers. For example, broadcast TV content can be augmented by on-demand broadband media content to provide enriched and personalized services, such as multi-view TV, audio language selection, and inclusion of real-time web feeds. A piece of evidence is the recent Hybrid Broadcast Broadband TV (HbbTV) standard, which aims at harmonizing the delivery and consumption of (hybrid) broadcast and broadband TV content. A key challenge in these emerging scenarios is the synchronization between the involved media streams, which can originate from the same or different sources and be delivered via the same or different technologies. To enable synchronized (hybrid) media delivery services, mechanisms providing timelines at the source side are necessary to accurately time-align the involved media streams at the receiver side. This paper provides a comprehensive review of how clock references (timing) and timestamps (time) are conveyed and interpreted when using the most widespread delivery technologies, such as DVB, RTP/RTCP and MPEG standards (e.g., MPEG-2, MPEG-4, MPEG-DASH, and MMT). It is particularly focused on the format, resolution, frequency, and position within the bitstream of the fields conveying timing information, as well as on the involved components and packetization aspects. Finally, it provides a survey of proofs of concept making use of these synchronization-related mechanisms. This complete and thorough source of information can be very useful for scholars and practitioners interested in media services with synchronization demands.
    This work has been funded, partially, by the "Fondo Europeo de Desarrollo Regional" (FEDER) and the Spanish Ministry of Economy and Competitiveness, under its R&D&i Support Program, in the project with ref. TEC2013-45492-R.
    Yuste, LB.; Boronat Segui, F.; Montagut Climent, MA.; Melvin, H. (2015). Understanding Timelines within MPEG Standards. IEEE Communications Surveys and Tutorials. 18(1):368-400. https://doi.org/10.1109/COMST.2015.2488483
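    The clock-reference and timestamp mechanisms surveyed in this paper can be illustrated with a minimal sketch of the MPEG-2 Systems conventions (a 90 kHz clock for 33-bit PTS/DTS values, and a 27 MHz Program Clock Reference assembled from a 33-bit base and a 9-bit extension); the helper names are assumptions for illustration, not an API from the paper:

```python
PTS_CLOCK_HZ = 90_000     # PES PTS/DTS run on a 90 kHz clock
PTS_MODULUS = 1 << 33     # PTS/DTS are 33-bit counters that wrap around

def pts_to_seconds(pts: int) -> float:
    """Convert a 90 kHz PTS/DTS tick count to seconds."""
    return pts / PTS_CLOCK_HZ

def pts_delta(later: int, earlier: int) -> int:
    """Difference between two 33-bit PTS values, tolerating one wrap."""
    return (later - earlier) % PTS_MODULUS

def pcr_value(base: int, extension: int) -> int:
    """Reassemble a 27 MHz PCR from its 33-bit base (90 kHz ticks)
    and 9-bit extension (0..299), as in MPEG-2 Systems."""
    return base * 300 + extension
```

The wrap-tolerant delta is what a receiver needs when a long-running stream's 33-bit counter rolls over (roughly every 26.5 hours at 90 kHz).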

    AUDIO VIDEO SYNCHRONIZATION USING BITSTREAM SYNTAX

    This paper proposes to develop audio-video synchronization in the bitstream syntax using the Slice type and Macroblock usage. Media files such as MP4, MOV and M4V container formats consist of both audio and video packed as tracks. The video is encoded in H.264 format while the audio is encoded in AAC format. A common way to achieve synchronization is to look at the type of Slice and the type of Macroblock, which can be done by scanning across all the atoms present within the Media data. Time synchronization can be achieved by taking input from the Sample-to-Time atoms and feeding the data into Auxiliary units. The Slice type plays an important role in synchronization because the decode order and display order of the video differ, and sufficient time is needed to decode the video based on the Profile, Level, and the type and number of Macroblocks for High profile. Thus, synchronization can be achieved for a Program stream or a Digital storage medium playback by synchronizing the formats based on the slice-type information.
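    The Sample-to-Time lookup mentioned in this abstract corresponds to the `stts` (decoding time-to-sample) box of the ISO Base Media File Format used by MP4/MOV/M4V. A minimal sketch of expanding its run-length table into per-sample timestamps (the function name and the example track parameters are illustrative assumptions):

```python
def sample_times(stts_entries, timescale):
    """Expand an MP4 stts table - a list of (sample_count, sample_delta)
    runs in track timescale ticks - into per-sample decode timestamps,
    expressed in seconds."""
    times, t = [], 0
    for sample_count, sample_delta in stts_entries:
        for _ in range(sample_count):
            times.append(t / timescale)
            t += sample_delta
    return times

# e.g. three AAC frames of 1024 ticks each on a 48 kHz audio track
ts = sample_times([(3, 1024)], 48_000)
```

Aligning these decode times across the audio and video tracks is the container-level half of the synchronization; reordering from decode to display order (the slice-type issue the paper discusses) additionally needs the composition offsets carried in the `ctts` box.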