
    Real-time scalable video coding for surveillance applications on embedded architectures


    Scalable video compression with optimized visual performance and random accessibility

    This thesis is concerned with maximizing the coding efficiency, random accessibility and visual performance of scalable compressed video. The unifying theme behind this work is the use of finely embedded localized coding structures, which govern the extent to which these goals may be jointly achieved. The first part focuses on scalable volumetric image compression. We investigate 3D transform and coding techniques which exploit inter-slice statistical redundancies without compromising slice accessibility. Our study shows that the motion-compensated temporal discrete wavelet transform (MC-TDWT) practically achieves an upper bound to the compression efficiency of slice transforms. From a video coding perspective, we find that most of the coding gain is attributed to offsetting the learning penalty in adaptive arithmetic coding through 3D code-block extension, rather than inter-frame context modelling. The second aspect of this thesis examines random accessibility. Accessibility refers to the ease with which a region of interest is accessed (subband samples needed for reconstruction are retrieved) from a compressed video bitstream, subject to spatiotemporal code-block constraints. We investigate the fundamental implications of motion compensation for random access efficiency and the compression performance of scalable interactive video. We demonstrate that inclusion of motion compensation operators within the lifting steps of a temporal subband transform incurs a random access penalty which depends on the characteristics of the motion field. The final aspect of this thesis aims to minimize the perceptual impact of visible distortion in scalable reconstructed video. We present a visual optimization strategy based on distortion scaling which raises the distortion-length slope of perceptually significant samples. 
This alters the codestream embedding order during post-compression rate-distortion optimization, thus allowing visually sensitive sites to be encoded with higher fidelity at a given bit-rate. For visual sensitivity analysis, we propose a contrast perception model that incorporates an adaptive masking slope. This versatile feature provides a context which models perceptual significance. It enables scene structures that otherwise suffer significant degradation to be preserved at lower bit-rates. The novelty in our approach derives from a set of "perceptual mappings" which account for quantization noise shaping effects induced by motion-compensated temporal synthesis. The proposed technique reduces wavelet compression artefacts and improves the perceptual quality of video.
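The temporal subband transforms discussed above are built from lifting steps. As a minimal sketch (not the thesis's implementation), the following shows one level of a temporal Haar wavelet transform in lifting form; in the motion-compensated case described in the abstract, the predict and update steps would warp frames along the motion field rather than use them directly:

```python
def haar_lifting_forward(frames):
    """One level of a temporal Haar wavelet transform via lifting.

    frames: a list (even count assumed) of frames, each a flat list of
    pixel values.  Motion-compensation operators would sit inside the
    predict/update steps; here they are identity maps for illustration.
    """
    lows, highs = [], []
    for i in range(0, len(frames) - 1, 2):
        even, odd = frames[i], frames[i + 1]
        # Predict step: high-pass = odd frame minus prediction from even frame
        h = [o - e for o, e in zip(odd, even)]
        # Update step: low-pass = even frame plus half the detail signal
        l = [e + d / 2 for e, d in zip(even, h)]
        highs.append(h)
        lows.append(l)
    return lows, highs


def haar_lifting_inverse(lows, highs):
    """Perfect reconstruction by undoing the lifting steps in reverse order."""
    frames = []
    for l, h in zip(lows, highs):
        even = [a - d / 2 for a, d in zip(l, h)]   # undo update
        odd = [e + d for e, d in zip(even, h)]     # undo predict
        frames.extend([even, odd])
    return frames
```

Because each lifting step is inverted exactly, the round trip is lossless regardless of what prediction operator is used, which is what makes motion compensation inside the lifting steps attractive.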

    3D exemplar-based image inpainting in electron microscopy

    In electron microscopy (EM), a common problem is the unavailability of data, which causes artefacts in reconstructions. The goal of this thesis is to generate artificial data where it is missing in EM by using exemplar-based inpainting (EBI). We implement an accelerated 3D version tailored to applications in EM, which reduces reconstruction times from days to minutes. We develop intelligent sampling strategies to find optimal data as input for reconstruction methods. Further, we investigate approaches to reduce electron dose and acquisition time; sparse sampling followed by inpainting is the most promising approach. As common evaluation measures may lead to misinterpretation of results in EM and skew subsequent analysis, we propose to use application-driven metrics and demonstrate this in a segmentation task. A further application of our technique is the artificial generation of projections in tilt-based EM. EBI is used to generate missing projections so that the full angular range is covered. Subsequent reconstructions are significantly enhanced in terms of resolution, which facilitates further analysis of samples. In conclusion, EBI proves promising when used as an additional data-generation step to tackle the unavailability of data in EM, as evaluated in selected applications. Enhancing adaptive sampling methods and refining EBI, especially considering their mutual influence, promises higher throughput in EM using less electron dose without loss of quality.
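The exemplar-based idea can be sketched in one dimension: missing samples are filled greedily by searching the known part of the signal for the context that best matches the samples just before the hole, then copying the sample that followed that exemplar. This is only a toy analogue of the 3D patch search the thesis accelerates (function name and parameters are illustrative, not from the thesis):

```python
def exemplar_inpaint_1d(signal, mask, ctx=2):
    """Greedy 1-D exemplar-based inpainting.

    signal: list of sample values; mask[i] is True where the sample is
    missing.  Holes are filled left to right; only originally known
    samples serve as exemplars.  Assumes ctx known/filled samples
    precede each hole.
    """
    sig = list(signal)
    for i in range(len(sig)):
        if not mask[i]:
            continue
        context = sig[i - ctx:i]               # samples just before the hole
        best_val, best_cost = None, float("inf")
        for j in range(ctx, len(sig)):
            # candidate exemplar: ctx known samples followed by a known sample
            if any(mask[k] for k in range(j - ctx, j + 1)):
                continue
            # sum-of-squared-differences match between contexts
            cost = sum((a - b) ** 2 for a, b in zip(sig[j - ctx:j], context))
            if cost < best_cost:
                best_cost, best_val = cost, sig[j]
        sig[i] = best_val
    return sig
```

On a periodic signal with a two-sample gap, the best-matching exemplar context continues the pattern, which is exactly the behaviour that makes EBI plausible for repetitive structures in EM volumes.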

    Recent Advances in Signal Processing

    Signal processing is a critical element of most new technological inventions and of a wide variety of applications across science and engineering. Classical signal processing techniques have largely worked with mathematical models that are linear, local, stationary, and Gaussian, and have always favored closed-form tractability over real-world accuracy; these constraints were imposed by the lack of powerful computing tools. During the last few decades, signal processing theories, developments, and applications have matured rapidly and now include tools from many areas of mathematics, computer science, physics, and engineering. This book is targeted primarily toward students and researchers who want to be exposed to a wide variety of signal processing techniques and algorithms. It includes 27 chapters that can be grouped into five areas by application: image processing, speech processing, communication systems, time-series analysis, and educational packages. The book has the advantage of providing a collection of applications that are completely independent and self-contained; the interested reader can therefore choose any chapter and skip to another without losing continuity

    Energy efficient hardware acceleration of multimedia processing tools

    The world of mobile devices is experiencing an ongoing trend of feature enhancement and general-purpose multimedia platform convergence. This trend poses many grand challenges, the most pressing being limited battery life as a consequence of delivering computationally demanding features. The envisaged mobile application features can be considered to be accelerated by a set of underpinning hardware blocks. Based on the survey that this thesis presents on modern video compression standards and their associated enabling technologies, it is concluded that tight energy and throughput constraints can still be effectively tackled at the algorithmic level in order to design re-usable optimised hardware acceleration cores. To prove these conclusions, the work in this thesis focuses on two of the basic enabling technologies that support mobile video applications, namely the Shape Adaptive Discrete Cosine Transform (SA-DCT) and its inverse, the SA-IDCT. The hardware architectures presented in this work have been designed with energy efficiency in mind. This goal is achieved by employing high-level techniques such as redundant computation elimination, parallelism and low-switching computation structures. Both architectures compare favourably against the relevant prior art in the literature. The SA-DCT/IDCT technologies are instances of a more general computation, namely Constant Matrix Multiplication (CMM). Thus, this thesis also proposes an algorithm for the efficient hardware design of any general CMM-based enabling technology. The proposed algorithm leverages the effective solution-search capability of genetic programming. A bonus feature of the proposed modelling approach is that it is further amenable to hardware acceleration. Another bonus feature is an early-exit mechanism that achieves large search-space reductions. Results show an improvement on state-of-the-art algorithms, with future potential for even greater savings
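The observation that the SA-DCT is an instance of Constant Matrix Multiplication can be illustrated with the ordinary N-point DCT-II, whose matrix entries are fixed constants; because every multiplier is known at design time, hardware can replace general multipliers with shift-and-add networks. A minimal sketch (the plain DCT, not the shape-adaptive variant from the thesis):

```python
import math


def dct_matrix(n):
    """The orthonormal n-point DCT-II as a constant matrix (a CMM instance).

    Entry (k, i) is c_k * cos(pi * (2i + 1) * k / (2n)), with
    c_0 = sqrt(1/n) and c_k = sqrt(2/n) otherwise.
    """
    m = []
    for k in range(n):
        c = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        m.append([c * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                  for i in range(n)])
    return m


def cmm(matrix, vec):
    """Constant matrix multiplication: y = M x with M fixed at design time."""
    return [sum(row[i] * vec[i] for i in range(len(vec))) for row in matrix]
```

Transforming a constant input with `cmm(dct_matrix(4), ...)` yields energy only in the DC coefficient, the expected DCT behaviour; the point here is simply that the whole transform reduces to one fixed matrix, the computation class the thesis's genetic-programming design algorithm targets.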

    30th International Conference on Condition Monitoring and Diagnostic Engineering Management (COMADEM 2017)

    Proceedings of COMADEM 2017

    Joint coding/decoding techniques and diversity techniques for video and HTML transmission over wireless point/multipoint: a survey

    I. Introduction The concomitant development of the Internet, which offers its users ever larger and more sophisticated content (from HTML (HyperText Markup Language) files to multimedia applications), and of the wireless systems and handheld devices that integrate them, has progressively convinced a fair share of the public of the appeal of always being connected. Still, constraints of heterogeneity, reliability, quality and delay on the transmission channels must generally be met to fulfill the requirements of these new needs and their corresponding economic goals. This poses various theoretical and practical challenges for today's digital communications community. This paper presents a survey of the techniques existing in the domain of HTML and video stream transmission over erroneous or lossy channels. In particular, existing techniques for joint source and channel coding and decoding for multimedia or HTML applications are surveyed, as well as the related problems of streaming and downloading files over a mobile IP link. Finally, various diversity techniques that can be considered for such links, from antenna diversity to coding diversity, are presented. The survey first recalls fundamental notions of digital communications, namely source coding, channel coding, and Shannon's theorems together with their principal limitations. The joint coding/decoding techniques presented mainly concern source-coding schemes involving variable-length codes (VLC), notably Huffman codes, arithmetic codes, and universal Lempel-Ziv (LZ) entropy codes. Addressing the problem of transmitting data (HTML and video) over a wireless link, the paper also reviews diversity techniques of varying complexity, up to the emerging systems with multiple transmit and receive antennas
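One reason joint source/channel techniques matter for variable-length codes is that a single bit error can desynchronize the decoder, corrupting every subsequent symbol. A toy prefix code makes this concrete (the symbols and codewords here are hypothetical, not taken from the article):

```python
# A toy Huffman-style prefix code: no codeword is a prefix of another,
# so a bit-exact stream decodes unambiguously without separators.
CODE = {"0": "a", "10": "b", "110": "c", "111": "d"}


def encode(symbols):
    """Concatenate the variable-length codewords for a symbol sequence."""
    inv = {sym: bits for bits, sym in CODE.items()}
    return "".join(inv[s] for s in symbols)


def vlc_decode(bits, code):
    """Decode a prefix code by accumulating bits until a codeword matches."""
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in code:
            out.append(code[buf])
            buf = ""
    return out
```

Encoding "abc" gives the bitstream 010110; flipping just the first bit yields 110110, which decodes to "cc": the decoder loses codeword boundaries and the error propagates, which is the failure mode that error-resilient and joint coding/decoding schemes for VLC streams are designed to contain.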

    Design of large polyphase filters in the Quadratic Residue Number System


    Intelligent Sensor Networks

    In the last decade, wireless and wired sensor networks have attracted much attention. However, most designs target general sensor-network issues, including the protocol stack (routing, MAC, etc.) and security. This book focuses instead on the close integration of sensing, networking, and smart signal processing via machine learning. Based on their world-class research, the authors present the fundamentals of intelligent sensor networks. They cover sensing and sampling, distributed signal processing, and intelligent signal learning. In addition, they present cutting-edge research results from leading experts