
    A generalized, parametric PR-QMF/wavelet transform design approach for multiresolution signal decomposition

    This dissertation emphasizes the interrelations and linkages between the theories of discrete-time filter banks and wavelet transforms. It is shown that the Binomial-QMF banks are identical to the interscale coefficients, or filters, of the compactly supported orthonormal wavelet transform bases proposed by Daubechies. A generalized, parametric, smooth 2-band PR-QMF design approach based on Bernstein polynomial approximation is developed, and it is found that the most regular compactly supported orthonormal wavelet filters, the coiflet filters, are special cases of the proposed filter bank design technique. A new objective performance measure, the Non-aliasing Energy Ratio (NER), is developed, and its merits are demonstrated through comparative performance studies of well-known orthonormal signal decomposition techniques. The dissertation also addresses the optimal 2-band PR-QMF design problem, incorporating variables of practical significance in image processing and coding into the optimization. The upper performance bounds of 2-band PR-QMFs and their corresponding filter coefficients are derived, and it is shown objectively that filter bank solutions exist that are superior to the standard block transform, the DCT. The theoretical contributions of this dissertation are expected to find application particularly in visual signal processing and coding.
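
    The identity the abstract draws between the Binomial-QMF and Daubechies' filters can be checked numerically. The sketch below is illustrative only (it is not code from the dissertation): it builds the standard Daubechies 4-tap lowpass filter and verifies the 2-band PR-QMF orthonormality conditions.

```python
# Sketch: verify that the Daubechies 4-tap lowpass filter (the filters the
# abstract identifies with the Binomial-QMF) satisfies the 2-band PR-QMF
# orthonormality conditions. The coefficients are the standard D4 taps;
# everything else is illustrative, not the dissertation's own code.
import numpy as np

s3 = np.sqrt(3.0)
h = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4.0 * np.sqrt(2.0))

# Highpass filter via the quadrature-mirror relation g[n] = (-1)^n h[N-1-n].
N = len(h)
g = np.array([(-1) ** n * h[N - 1 - n] for n in range(N)])

# Unit energy: sum of h[n]^2 equals 1.
print("energy:", np.sum(h ** 2))              # ~1.0

# Double-shift orthogonality: sum of h[n] h[n-2] equals 0.
print("shift-2 corr:", np.dot(h[2:], h[:-2])) # ~0.0

# Lowpass/highpass orthogonality.
print("cross corr:", np.dot(h, g))            # ~0.0
```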

    An optimally well-localized multi-channel parallel perfect reconstruction filter bank.

    This dissertation defines a measure of uncertainty for finite-length discrete-time signals. Using this uncertainty measure, a relationship analogous to the well-known continuous-time Heisenberg-Weyl inequality is developed. The measure is applied to quantify the joint discrete-time, discrete-frequency localization of the finite impulse response filters used in a quadrature mirror filter bank (QMF). A formulation of a biorthogonal QMF in which the lowpass analysis filter minimizes the newly defined measure of uncertainty is presented. The search algorithm used in the design of the length-N linear-phase lowpass analysis FIR filter is given for N = 6 and N = 8. In each case, the other three filters constituting a perfect reconstruction QMF are determined by adapting a method due to Vetterli and Le Gall. From a set of well-known QMFs comprised of length-six filters, L-channel perfect reconstruction parallel filter banks (PRPFBs) are constructed. The Noble identities are used to show that the L-channel PRPFB is equivalent to an (L - 1)-level discrete wavelet filter bank. Several five-channel PRPFBs are implemented. A separable implementation of a five-channel, one-dimensional filter bank produces a twenty-five-channel, two-dimensional filter bank. Each non-lowpass, two-dimensional filter is then decomposed in a novel, nonseparable way to obtain equivalent channel filters that possess orientation selectivity, resulting in a forty-one-channel, two-dimensional, orientation-selective PRPFB.

    Joint uncertainty for the overall L-channel, one-dimensional, parallel filter bank is quantified by a metric that is a weighted sum of the time and frequency localizations of the individual filters. Evidence is presented showing that a filter bank with lower joint uncertainty under this metric yields a computed multicomponent AM-FM image model with lower reconstruction errors. This strongly supports the theory that there is a direct relationship between joint uncertainty, as quantified by the measures developed here, and the degree of local smoothness or "local coherency" that may be expected in the filter bank channel responses. Thus, as demonstrated by the examples, these new measures may be used to construct new filter banks that offer localization properties on par with those of Gabor filter banks.
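
    The abstract does not reproduce the dissertation's exact uncertainty metric, so the sketch below uses a standard stand-in: the variance of a filter's normalized energy distribution in time and in frequency, the kind of per-filter localization a weighted joint metric would sum over the L channels. The example filter is a hypothetical length-six lowpass, not one of the designed filters.

```python
# Sketch of a discrete joint time-frequency localization measure of the kind
# the abstract describes. The exact metric from the dissertation is not given
# in the abstract; the variance-based quantities below are a common stand-in.
import numpy as np

def localization(h, nfft=1024):
    """Return (time spread, frequency spread) of a real FIR filter h."""
    h = np.asarray(h, dtype=float)
    p_t = h ** 2 / np.sum(h ** 2)             # normalized energy in time
    n = np.arange(len(h))
    t_mean = np.sum(n * p_t)
    t_var = np.sum((n - t_mean) ** 2 * p_t)   # time localization

    H = np.fft.fft(h, nfft)
    w = np.fft.fftfreq(nfft) * 2 * np.pi      # frequencies in [-pi, pi)
    p_w = np.abs(H) ** 2 / np.sum(np.abs(H) ** 2)
    w_var = np.sum(w ** 2 * p_w)              # frequency spread about omega = 0
    return t_var, w_var

# Hypothetical length-six lowpass filter, for illustration only.
h = np.array([1.0, 3.0, 4.0, 4.0, 3.0, 1.0])
t_var, w_var = localization(h)
print(f"time var = {t_var:.4f}, freq var = {w_var:.4f}, "
      f"product = {t_var * w_var:.4f}")
```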

    The Telecommunications and Data Acquisition Report

    This quarterly publication provides archival reports on developments in programs in space communications, radio navigation, radio science, and ground-based radio and radar astronomy. It reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standardization activities at the Jet Propulsion Laboratory for space data and information systems.

    High-performance compression of visual information - A tutorial review - Part I: Still Pictures

    Digital images have become an important source of information in the modern world of communication systems. In their raw form, digital images require a tremendous amount of memory. Many research efforts have been devoted to the problem of image compression in the last two decades. Two different compression categories must be distinguished: lossless and lossy. Lossless compression is achieved if no distortion is introduced in the coded image. Applications requiring this type of compression include medical imaging and satellite photography. For applications such as video telephony or multimedia, some loss of information is usually tolerated in exchange for a high compression ratio. In this two-part paper, the major building blocks of image coding schemes are overviewed. Part I covers still image coding, and Part II covers motion picture sequences. In this first part, still image coding schemes are classified into predictive, block transform, and multiresolution approaches. Predictive methods are suited to lossless and low-compression applications. Transform-based coding schemes achieve higher compression ratios for lossy compression but suffer from blocking artifacts at high compression ratios. Multiresolution approaches are suited for lossy as well as lossless compression. At high compression ratios in lossy coding, the typical artifact visible in the reconstructed images is the ringing effect. New applications in a multimedia environment have driven the need for new functionalities in image coding schemes. For that purpose, second-generation coding techniques segment the image into semantically meaningful parts, and parts of these methods have been adapted to work on arbitrarily shaped regions. In order to add another functionality, such as progressive transmission of the information, specific quantization algorithms must be defined. A final step in the compression scheme is achieved by the codeword assignment. Finally, coding results are presented which compare state-of-the-art techniques for lossy and lossless compression. The different artifacts of each technique are highlighted and discussed. Also, the possibility of progressive transmission is illustrated.
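
    As a concrete illustration of the block-transform category and its characteristic artifact, the sketch below applies an 8x8 DCT with uniform scalar quantization to an image, block by block. The block size and quantization step are illustrative choices, not parameters taken from the paper; coarsening the step is what makes blocking artifacts appear.

```python
# Minimal sketch of a block-transform coding stage: an 8x8 DCT of each image
# block, uniform scalar quantization, then reconstruction. Coarse quantization
# steps produce the blocking artifacts the text describes.
import numpy as np
from scipy.fft import dctn, idctn

def code_block(block, q_step=32.0):
    coeffs = dctn(block, norm="ortho")               # forward 2-D DCT-II
    quantized = np.round(coeffs / q_step)            # uniform scalar quantizer
    return idctn(quantized * q_step, norm="ortho")   # dequantize + inverse DCT

rng = np.random.default_rng(0)
image = rng.uniform(0, 255, size=(64, 64))           # stand-in "image"

recon = np.empty_like(image)
for i in range(0, 64, 8):
    for j in range(0, 64, 8):
        recon[i:i+8, j:j+8] = code_block(image[i:i+8, j:j+8])

mse = np.mean((image - recon) ** 2)
print(f"MSE after coarse block-DCT quantization: {mse:.2f}")
```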

    Wavelets and Subband Coding

    First published in 1995, Wavelets and Subband Coding offered a unified view of the exciting field of wavelets and their discrete-time cousins, filter banks, or subband coding. The book developed the theory in both continuous and discrete time and presented important applications. During the past decade it has filled a useful need, explaining a new view of signal processing based on flexible time-frequency analysis and its applications. Since 2007 the authors have retained the copyright and allow open access to the book.

    Collective analog bioelectronic computation

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Cataloged from the student-submitted PDF version of the thesis. Includes bibliographical references (p. 677-710).

    In this thesis, I present two examples of fast, highly parallel analog computation inspired by architectures in biology. The first example, an RF cochlea, maps the partial differential equations that describe fluid-membrane-hair-cell wave propagation in the biological cochlea to an equivalent inductor-capacitor-transistor integrated circuit. It allows ultra-broadband spectrum analysis of RF signals to be performed in a rapid, low-power fashion, thus enabling applications in universal or software radio. The second example exploits detailed similarities between the equations that describe chemical-reaction dynamics and the equations that describe subthreshold current flow in transistors to create fast, highly parallel integrated-circuit models of protein-protein and gene-protein networks inside a cell. Due to a natural mapping between the Poisson statistics of molecular flows in a chemical reaction and the Poisson statistics of electronic current flow in a transistor, stochastic effects are automatically incorporated into the circuit architecture, allowing highly computationally intensive stochastic simulations of large-scale biochemical reaction networks to be performed rapidly. I show that the exponentially tapered transmission-line architecture of the mammalian cochlea performs constant-fractional-bandwidth spectrum analysis with O(N) expenditure of both analysis time and hardware, where N is the number of analyzed frequency bins. This is the best known performance of any spectrum-analysis architecture, including the constant-resolution Fast Fourier Transform (FFT), which scales as O(N log N), and a constant-fractional-bandwidth filter bank, which scales as O(N^2). The RF cochlea uses this bio-inspired architecture to perform real-time, on-chip spectrum analysis at radio frequencies. I demonstrate two cochlea chips, implemented in standard 0.13 um CMOS technology, that decompose the RF spectrum from 600 MHz to 8 GHz into 50 log-spaced channels, consume less than 300 mW of power, and possess 70 dB of dynamic range. The real-time spectrum-analysis capabilities of these chips make them uniquely suitable for the ultra-broadband universal or software radio receivers of the future. I show that the protein-protein and gene-protein chips I have built are particularly suitable for simulation, parameter discovery, and sensitivity analysis of interaction networks in cell biology, such as signaling, metabolic, and gene-regulation pathways. Importantly, the chips carry out massively parallel computations, resulting in simulation times that are independent of model complexity, i.e., O(1). They also automatically model stochastic effects, which are important in many biological systems but are numerically stiff and simulate slowly on digital computers. Currently, limited only by non-fundamental data-acquisition constraints, my proof-of-concept chips simulate small-scale biochemical reaction networks at least 100 times faster than modern desktop machines. Speedups of 10^3 to 10^6 should be possible for genome-scale and organ-scale intracellular and extracellular biochemical reaction networks with improved versions of these chips. Such chips could be important both as analysis tools in systems biology and as design tools in synthetic biology.

    by Soumyajit Mandal, Ph.D.
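
    The channel plan quoted above (50 log-spaced channels covering 600 MHz to 8 GHz) pins down the constant fractional bandwidth implied by geometric spacing. The short sketch below derives it; the resulting Q is computed from that spacing, not a figure quoted in the thesis.

```python
# Sketch of the channel plan the abstract describes: 50 log-spaced (constant
# fractional bandwidth) analysis channels covering 600 MHz to 8 GHz. Geometric
# spacing is what "log-spaced" implies; the Q printed here follows from it.
import numpy as np

f_lo, f_hi, n_ch = 600e6, 8e9, 50
centers = f_lo * (f_hi / f_lo) ** (np.arange(n_ch) / (n_ch - 1))

ratio = centers[1] / centers[0]        # constant between adjacent channels
frac_bw = ratio - 1                    # fractional spacing per channel
print(f"adjacent-channel ratio: {ratio:.4f}")
print(f"implied fractional bandwidth: {frac_bw * 100:.1f}% "
      f"(Q ~ {1 / frac_bw:.1f})")
print("first/last centers (GHz):", centers[0] / 1e9, centers[-1] / 1e9)
```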

    The Fifth NASA Symposium on VLSI Design

    The fifth annual NASA Symposium on VLSI Design had 13 sessions, including Radiation Effects, Architectures, Mixed Signal, Design Techniques, Fault Testing, Synthesis, Signal Processing, and other featured presentations. The symposium provides insights into developments in VLSI and digital systems that can be used to increase data-systems performance. The presentations share insights into next-generation advances that will serve as a basis for future VLSI design.

    Abstracts on Radio Direction Finding (1899 - 1995)

    The files on this record represent the various databases that originally composed the CD-ROM issue of the "Abstracts on Radio Direction Finding" database, which is now part of the Dudley Knox Library's Abstracts and Selected Full Text Documents on Radio Direction Finding (1899 - 1995) Collection. (See Calhoun record https://calhoun.nps.edu/handle/10945/57364 for further information on this collection and the bibliography.) Because technological obsolescence prevents current and future audiences from accessing the bibliography, DKL exported the various databases contained in the CD-ROM and converted them into the three files on this record. The contents of these files are: 1) RDFA_CompleteBibliography_xls.zip [RDFA_CompleteBibliography.xls: Metadata for the complete bibliography, in Excel 97-2003 Workbook format; RDFA_Glossary.xls: Glossary of terms, in Excel 97-2003 Workbook format; RDFA_Biographies.xls: Biographies of leading figures, in Excel 97-2003 Workbook format]; 2) RDFA_CompleteBibliography_csv.zip [RDFA_CompleteBibliography.TXT: Metadata for the complete bibliography, in CSV format; RDFA_Glossary.TXT: Glossary of terms, in CSV format; RDFA_Biographies.TXT: Biographies of leading figures, in CSV format]; 3) RDFA_CompleteBibliography.pdf: A human-readable display of the bibliographic data, as a means of double-checking any possible deviations due to conversion.

    African Handbook of Climate Change Adaptation

    This open access book discusses current thinking and presents the main issues and challenges associated with climate change in Africa. It introduces evidence from studies and projects which show how climate change adaptation is being, and may continue to be, successfully implemented in African countries. Given its scope and the wide range of themes surrounding climate change, the ambition is that this book will be a leading publication on the topic, one that may be regularly updated to capture further work. Climate change is a major global challenge, but some geographical regions are more severely affected than others. One of these regions is the African continent. Due to a combination of unfavourable socio-economic and meteorological conditions, African countries are particularly vulnerable to climate change and its impacts. The recently released IPCC special report "Global Warming of 1.5 °C" outlines the fact that keeping global warming to 1.5 °C is possible, but also suggests that an increase of 2 °C could lead to crises in crop production (rain-fed agriculture could drop by 50% in some African countries by 2020) and livestock production, could damage water supplies, and could pose an additional threat to coastal areas. The 5th Assessment Report produced by the IPCC predicts that wheat may disappear from Africa by 2080, and that production of maize, a staple, will fall significantly in southern Africa. Also, arid and semi-arid lands are likely to increase by up to 8%, with severe ramifications for livelihoods, poverty eradication, and meeting the SDGs. Pursuing appropriate adaptation strategies is thus vital in order to address the current and future challenges posed by a changing climate. It is against this background that the "African Handbook of Climate Change Adaptation" is being published. It contains papers prepared by scholars, representatives of social movements, practitioners, and members of governmental agencies undertaking research and/or executing climate change projects in Africa and working with communities across the African continent. Encompassing over 100 contributions from across Africa, it is the most comprehensive publication on climate change adaptation in Africa ever produced.