
    Lossless Selection Views under Constraints

    The problem of updating a database through a set of views consists in propagating updates of the views to the base relations over which the view relations are defined, so that the changes to the database reflect exactly those to the views. This is a classical problem in database research, known as the view update problem.

    Implementation of Image Compression Algorithm using Verilog with Area, Power and Timing Constraints

    Image compression is the application of data compression to digital images. A fundamental shift in the image compression approach came after the Discrete Wavelet Transform (DWT) became popular. To overcome the inefficiencies of the JPEG standard and serve emerging areas of mobile and Internet communications, the new JPEG2000 standard has been developed based on the principles of DWT. An image compression algorithm was first developed in Matlab, then modified to perform better when implemented in a hardware description language. Using Verilog HDL, the encoder for image compression employing the DWT was implemented. Detailed analysis of power, timing, and area was carried out for the Booth multiplier, which forms the major building block in implementing the DWT. The encoding technique exploits the zero-tree structure present in the bitplanes to compress the transform coefficients.
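To make the transform step concrete, the following is a minimal sketch of one level of a 2D Haar DWT, which splits an image into low-frequency (LL) and high-frequency (LH, HL, HH) subbands. This is an illustrative stand-in only: the thesis does not specify its wavelet, and JPEG2000 itself uses the 5/3 or 9/7 filter banks rather than plain Haar.

```python
import numpy as np

def haar_dwt2_level(img):
    """One level of a 2D Haar DWT (illustrative; JPEG2000 uses 5/3 or 9/7 filters).

    Returns four half-size subbands: LL (approximation) and LH, HL, HH (detail).
    """
    # Row transform: pairwise averages (low-pass) and differences (high-pass)
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0
    # Column transform applied to both row-transform outputs
    LL = (lo[0::2, :] + lo[1::2, :]) / 2.0
    LH = (lo[0::2, :] - lo[1::2, :]) / 2.0
    HL = (hi[0::2, :] + hi[1::2, :]) / 2.0
    HH = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return LL, LH, HL, HH

img = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "image"
LL, LH, HL, HH = haar_dwt2_level(img)
```

For smooth image regions the detail subbands are mostly near zero, which is exactly the sparsity that zero-tree coding of the bitplanes exploits.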

    Optimum Implementation of Compound Compression of a Computer Screen for Real-Time Transmission in Low Network Bandwidth Environments

    Remote working has become increasingly prevalent in recent times. A large part of remote working involves sharing computer screens between servers and clients. The image content presented when sharing computer screens consists of both natural camera-captured image data and computer-generated graphics and text. The attributes of natural camera-captured image data differ greatly from those of computer-generated image data. An image containing a mixture of both kinds of data is known as a compound image. The research presented in this thesis focuses on the challenge of constructing a compound compression strategy that applies the 'best fit' compression algorithm to the mixed content found in a compound image. The research also involves analysis and classification of the types of data a given compound image may contain. While researching optimal types of compression, consideration is given to the computational overhead of a given algorithm, because the research is being developed for real-time systems such as cloud computing services, where latency has a detrimental impact on the end-user experience. Previous and current state-of-the-art video codecs have been studied, along with many of the most recent publications from academia, in order to design and implement a novel low-complexity compound compression algorithm suitable for real-time transmission. The compound compression algorithm utilises a mixture of lossless and lossy compression algorithms, with parameters that can be used to control its performance. An objective image quality assessment is needed to determine whether the proposed algorithm can produce an acceptable-quality image after processing. Both a traditional metric, Peak Signal-to-Noise Ratio, and a more modern approach well suited to compound images, the Structural Similarity Index, will be used to quantify the quality of the decompressed image. Finally, the compression strategy will be tested on a set of generated compound images. Using open-source software, the same images will be compressed with the previous and current state-of-the-art video codecs to compare three main metrics: compression ratio, computational complexity, and objective image quality.
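The PSNR metric mentioned above is straightforward to compute from the mean squared error between the original and decompressed images. The helper below is an illustrative sketch, not code from the thesis; the `peak` parameter assumes 8-bit images.

```python
import numpy as np

def psnr(original, decompressed, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB; higher means the decompressed
    image is closer to the original. Illustrative helper for 8-bit images."""
    mse = np.mean((original.astype(float) - decompressed.astype(float)) ** 2)
    if mse == 0:
        return float("inf")  # images are identical
    return 10.0 * np.log10(peak ** 2 / mse)

a = np.full((8, 8), 100, dtype=np.uint8)  # flat toy image
b = a.copy()
b[0, 0] = 110  # one pixel off by 10 levels
```

Unlike PSNR, which is a pure pixel-error measure, SSIM compares local luminance, contrast, and structure, which is why it tracks perceived quality better on text-heavy computer-generated regions of a compound image.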

    An Introduction to Database Systems

    This textbook introduces the basic concepts of database systems. These concepts are presented through numerous examples in modeling and design. The material in this book is geared to an introductory course in database systems offered at the junior or senior level of Computer Science. It could also be used in a first-year graduate course in database systems, focusing on a selection of the advanced topics in the later chapters.

    CMB mapping experiments: a designer's guide

    We apply state-of-the-art data analysis methods to a number of fictitious CMB mapping experiments, including 1/f noise, distilling the cosmological information from time-ordered data to maps to power spectrum estimates, and find that in all cases the resulting error bars can be well approximated by simple and intuitive analytic expressions. Using these approximations, we discuss how to maximize the scientific return of CMB mapping experiments given the practical constraints at hand, and our main conclusions are as follows. (1) For a given resolution and sensitivity, it is best to cover a sky area such that the signal-to-noise ratio per resolution element (pixel) is of order unity. (2) It is best to avoid excessively skinny observing regions, narrower than a few degrees. (3) The minimum-variance mapmaking method can reduce the effects of 1/f noise by a substantial factor, but only if the scan pattern is thoroughly interconnected. (4) 1/f noise produces a 1/l contribution to the angular power spectrum for well-connected single-beam scanning, as compared to virtually white noise for a two-beam scan pattern such as that of the MAP satellite.
    Comment: 28 pages, with 13 figures included. Minor revisions to match accepted version. Color figures and links at http://www.sns.ias.edu/~max/strategy.html (faster from the US), from http://www.mpa-garching.mpg.de/~max/strategy.html (faster from Europe) or from [email protected]
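Conclusion (1) reflects a simple trade-off: for a fixed total integration time, spreading the observation over more pixels means less time, and hence more noise, per pixel. A minimal sketch of that scaling is below; the numbers and the `detector_noise` parameterization (sensitivity in signal units times the square root of seconds) are made-up illustrative values, not figures from the paper.

```python
import math

def snr_per_pixel(sky_pixels, total_time_s, detector_noise, signal_rms):
    """Signal-to-noise per map pixel for a fixed total integration time.

    detector_noise: sensitivity in signal units * sqrt(seconds) (illustrative).
    signal_rms:     rms sky signal per pixel in the same signal units.
    """
    time_per_pixel = total_time_s / sky_pixels          # observing time per pixel
    pixel_noise = detector_noise / math.sqrt(time_per_pixel)
    return signal_rms / pixel_noise

# Doubling the surveyed area halves the time per pixel,
# so the per-pixel signal-to-noise drops by a factor sqrt(2).
narrow = snr_per_pixel(10_000, 1e6, 100.0, 30.0)
wide = snr_per_pixel(20_000, 1e6, 100.0, 30.0)
```

The paper's recommendation is to choose the sky area so that this per-pixel signal-to-noise lands near unity, balancing sample variance (too small an area) against noise (too large an area).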

    Combined Industry, Space and Earth Science Data Compression Workshop

    The sixth annual Space and Earth Science Data Compression Workshop and the third annual Data Compression Industry Workshop were held as a single combined workshop. The workshop was held April 4, 1996 in Snowbird, Utah in conjunction with the 1996 IEEE Data Compression Conference, which was held at the same location March 31 - April 3, 1996. The Space and Earth Science Data Compression sessions seek to explore opportunities for data compression to enhance the collection, analysis, and retrieval of space and earth science data. Of particular interest is data compression research that is integrated into, or has the potential to be integrated into, a particular space or earth science data information system. Preference is given to data compression research that takes into account the scientist's data requirements, and the constraints imposed by the data collection, transmission, distribution, and archival systems.