717 research outputs found

    Alpha Channel Fragile Watermarking for Color Image Integrity Protection

    This paper presents a fragile watermarking algorithm for protecting the integrity of color images with an alpha channel. The system is able to identify modified areas with very high probability, even for small color or transparency changes. The main characteristic of the algorithm is that the watermark is embedded by modifying the alpha channel, leaving the color channels untouched and introducing a very small error with respect to the host image; as a consequence, the watermarked images have a very high peak signal-to-noise ratio. The security of the algorithm is based on a secret key defining the embedding space in which the watermark is inserted by means of the Karhunen–Loève transform (KLT) and a genetic algorithm (GA). The algorithm's high sensitivity to modifications is demonstrated, supporting the security of the whole system
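
    As a rough illustration of the alpha-channel idea only (the paper's actual scheme relies on a key-defined KLT embedding space and a genetic algorithm, which is not reproduced here), the sketch below hides a key- and content-dependent bit in the least significant bit of each alpha value: the RGB channels stay untouched, so the error against the host image is tiny, while any local color or transparency change breaks the expected bits in that area. The parity-based mark and the 8×8 block size are assumptions of the sketch.

        import numpy as np

        def _mark_bits(rgb, key):
            # Key stream XORed with a content-dependent parity bit per pixel
            # (an illustrative choice, not the paper's KLT/GA embedding).
            rng = np.random.default_rng(key)
            keystream = rng.integers(0, 2, size=rgb.shape[:2], dtype=np.uint8)
            parity = (rgb.sum(axis=2, dtype=np.uint32) & 1).astype(np.uint8)
            return parity ^ keystream

        def embed(rgba, key):
            # Write the mark into the alpha LSB only; RGB is untouched, so the
            # watermarked image keeps a very high PSNR.
            out = rgba.copy()
            out[..., 3] = (out[..., 3] & 0xFE) | _mark_bits(out[..., :3], key)
            return out

        def tampered_blocks(rgba, key, block=8):
            # Recompute the expected mark from the current RGB and compare it with
            # the alpha LSB; a block is flagged if any of its bits disagree.
            mismatch = (rgba[..., 3] & 1) != _mark_bits(rgba[..., :3], key)
            h, w = mismatch.shape
            hb, wb = h // block, w // block
            tiles = mismatch[:hb * block, :wb * block].reshape(hb, block, wb, block)
            return tiles.any(axis=(1, 3))

    In this toy version a single altered color value flips the expected parity roughly half the time, so a block containing several modified pixels is flagged with high probability, which is the behavior the paper establishes for its far stronger KLT/GA construction.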

    Multistage classification of multispectral Earth observational data: The design approach

    An algorithm is proposed which predicts the optimal features at every node in a binary tree classification procedure. The algorithm estimates the probability of error by approximating the area under the likelihood ratio function for two classes, taking into account the number of training samples used to estimate each class. Some results on feature selection techniques, particularly in the presence of a very limited set of training samples, are presented. Probabilities of error predicted by the algorithm as a function of dimensionality are compared with experimental observations for aircraft and LANDSAT data. Results are obtained for both real and simulated data. Finally, two binary tree examples which use the algorithm are presented to illustrate the usefulness of the procedure
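
    To make the node-level feature selection concrete, the sketch below scores candidate feature subsets for the two classes separated at a tree node. It substitutes the Bhattacharyya bound for Gaussian classes in place of the paper's approximation of the area under the likelihood ratio function, and it omits the finite-training-sample correction, so both the bound and the exhaustive subset search are assumptions of the sketch rather than the paper's procedure.

        import numpy as np
        from itertools import combinations

        def error_bound(x1, x2):
            # Bhattacharyya upper bound on the two-class Bayes error, assuming
            # Gaussian classes with equal priors; x1, x2 are (samples x features).
            m1, m2 = x1.mean(axis=0), x2.mean(axis=0)
            s1 = np.atleast_2d(np.cov(x1, rowvar=False))
            s2 = np.atleast_2d(np.cov(x2, rowvar=False))
            s = 0.5 * (s1 + s2)
            d = m1 - m2
            b = 0.125 * (d @ np.linalg.solve(s, d))
            b += 0.5 * np.log(np.linalg.det(s) /
                              np.sqrt(np.linalg.det(s1) * np.linalg.det(s2)))
            return 0.5 * np.exp(-b)

        def best_features(x1, x2, k):
            # Exhaustively pick the k features with the lowest predicted error.
            n_feat = x1.shape[1]
            return min(combinations(range(n_feat), k),
                       key=lambda idx: error_bound(x1[:, list(idx)], x2[:, list(idx)]))

    A binary tree design would call best_features at every node, using the training samples of the two class groups split there.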

    Abnormal ECG search in long-term electrocardiographic recordings from an animal model of heart failure

    Heart failure is one of the leading causes of death in the United States; five million Americans suffer from it. Advances in portable electrocardiogram (ECG) monitoring systems and large data storage capacity allow the ECG to be recorded continuously for long periods. Long-term monitoring could lead to better diagnosis and treatment if the progression of heart failure could be followed, but the challenge is analyzing the sheer mass of data: manual analysis with classical methods is impossible. This dissertation presents a framework for analyzing long-term ECG recordings and methods for searching for abnormal ECG. The data used in this research were collected from an animal model of heart failure: chronic heart failure was gradually induced in rats by aldosterone infusion and a high-Na, low-Mg diet, and the ECG was recorded continuously by radiotelemetry over the 11-12 week experimental period, with leads placed subcutaneously in a lead-II configuration. In the end there were 80 GB of data from five animals; besides the sheer volume, noise and artifacts also complicated the analysis. The framework includes data preparation, ECG beat detection, EMG noise detection, baseline fluctuation removal, ECG template generation, feature extraction, and abnormal ECG search. The raw data were converted from their original format and stored in a database for retrieval. The beat detection technique was improved from the original algorithm so that it is less sensitive to baseline jumps and more sensitive to variation in beat size. A method for estimating a parameter required for baseline fluctuation removal is proposed and gives good results on test signals. A new algorithm for EMG noise detection was developed using morphological filters and a moving variance; its sensitivity and specificity are 94% and 100%, respectively. A procedure for ECG template generation was proposed to capture gradual change in ECG morphology and to manage the matching process when numerous ECG templates are created. RR intervals and heart rate variability parameters are extracted and plotted to display progressive changes as heart failure develops. In the abnormal ECG search, premature ventricular complexes, elevated ST segments, and split-R-wave ECG are considered; new features are extracted from ECG morphology, and Fisher linear discriminant analysis is used to classify normal and abnormal ECG, giving a classification rate, sensitivity, and specificity of 97.35%, 96.02%, and 98.91%, respectively
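
    The EMG noise detection step lends itself to a short sketch: a morphological open/close with a short structuring element smooths high-frequency activity while roughly preserving the ECG waves, and a moving variance of the residual flags noisy stretches. The window lengths and the median-based threshold below are illustrative assumptions, not the tuned values behind the 94% and 100% figures.

        import numpy as np
        from scipy.ndimage import grey_opening, grey_closing, uniform_filter1d

        def emg_noise_mask(ecg, fs, se_s=0.01, var_s=0.2, thresh=5.0):
            # Boolean mask marking samples dominated by EMG-like noise.
            se = max(3, int(se_s * fs))        # structuring element shorter than a QRS
            smoothed = grey_closing(grey_opening(ecg, size=se), size=se)
            residual = ecg - smoothed          # mostly high-frequency (EMG-like) noise
            w = max(3, int(var_s * fs))
            mean = uniform_filter1d(residual, size=w)
            mean_sq = uniform_filter1d(residual ** 2, size=w)
            mov_var = np.maximum(mean_sq - mean ** 2, 0.0)   # moving variance
            return mov_var > thresh * np.median(mov_var)     # unusually noisy windows

    Segments flagged by such a mask would be excluded before beat detection, template matching, and the abnormal ECG search.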

    Unsupervised Texture Segmentation


    Study of an Image Indexing Technique in the JPEG Compressed Domain

    Most images stored on our computers, including those downloaded from the internet, are already in JPEG compressed format, so it is essential that content-based image indexing and retrieval be conducted directly in the compressed domain. In this paper, a partial decoding algorithm is used to index JPEG compressed images directly in the compressed domain, and the performance of the DCT-domain approach is compared with that of working on the original images in the pixel domain. This technology is valuable in applications where fast image key generation is required. Image and audio indexing techniques are important in multimedia applications, and the paper also includes an analytical review of compressed-domain indexing techniques, covering transform-domain methods such as the Fourier transform, the Karhunen–Loève transform, the cosine transform, and subbands, as well as spatial-domain methods based on vector quantization and fractals. The review leads to the conclusion that, to compress an image, it should be divided into 8×8 pixel blocks that are then converted to the DCT domain; following the same concept, the image can also be divided into 4×4×4 blocks of pixels and compressed by the subsequent steps
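
    To illustrate the kind of compact key such a compressed-domain index can build, the sketch below takes the 8×8 block DCT of a grayscale image and histograms the DC coefficients, which in an actual JPEG file are available after only partial decoding. Computing the DCT with SciPy here, instead of entropy-decoding a real JPEG stream, and the 32-bin histogram are assumptions of the sketch.

        import numpy as np
        from scipy.fftpack import dct

        def dc_histogram_key(gray, bins=32):
            # Index key built from the DC terms of 8x8 block DCTs (sketch only).
            h, w = gray.shape
            g = gray[:h - h % 8, :w - w % 8].astype(np.float64) - 128.0  # JPEG level shift
            blocks = g.reshape(h // 8, 8, w // 8, 8).swapaxes(1, 2)
            # 2-D type-II DCT per block; coefficient [0, 0] is the block's DC term.
            coeffs = dct(dct(blocks, axis=-1, norm='ortho'), axis=-2, norm='ortho')
            dc = coeffs[..., 0, 0].ravel()
            hist, _ = np.histogram(dc, bins=bins, range=(-1024.0, 1024.0))
            return hist / max(hist.sum(), 1)   # normalized, compact image key

    Two images can then be ranked for retrieval by, for example, the L1 distance between their keys.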

    Signal Flow Graph Approach to Efficient DST I-IV Algorithms

    In this paper, fast and efficient discrete sine transformation (DST) algorithms are presented based on the factorization of sparse, scaled orthogonal, rotation, rotation-reflection, and butterfly matrices. These algorithms are completely recursive and solely based on DST I-IV. The presented algorithms have low arithmetic cost compared to the known fast DST algorithms. Furthermore, the language of signal flow graph representation of digital structures is used to describe these efficient and recursive DST algorithms, having an (n−1)-point signal flow graph for DST-I and n-point signal flow graphs for DST II-IV
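
    The recursive sparse factorizations themselves are not reproduced here, but as a reference point the sketch below builds orthonormal DST-I through DST-IV matrices directly from their definitions, i.e. the O(n²) baseline that a fast algorithm's output can be checked against; the normalization that makes each matrix orthogonal is one common convention and not necessarily the paper's scaling.

        import numpy as np

        def dst_matrix(n, dst_type):
            # Orthonormal DST-I..IV matrices written out from their definitions.
            k = np.arange(n)[:, None]   # output (row) index
            j = np.arange(n)[None, :]   # input (column) index
            if dst_type == 1:
                return np.sqrt(2.0 / (n + 1)) * np.sin(np.pi * (k + 1) * (j + 1) / (n + 1))
            if dst_type == 2:
                m = np.sqrt(2.0 / n) * np.sin(np.pi * (k + 1) * (2 * j + 1) / (2 * n))
                m[n - 1, :] /= np.sqrt(2.0)  # rescale last row so the matrix is orthogonal
                return m
            if dst_type == 3:
                return dst_matrix(n, 2).T    # DST-III is the transpose of DST-II
            if dst_type == 4:
                return np.sqrt(2.0 / n) * np.sin(np.pi * (2 * k + 1) * (2 * j + 1) / (4 * n))
            raise ValueError("dst_type must be 1, 2, 3, or 4")

        # Sanity check: every DST matrix times its transpose is the identity.
        for t in (1, 2, 3, 4):
            m = dst_matrix(8, t)
            assert np.allclose(m @ m.T, np.eye(8))

    A fast DST of length n should reproduce dst_matrix(n, t) @ x for any input vector x, which is how the recursive algorithms can be validated numerically.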

    Design of a digital compression technique for shuttle television

    The performance and hardware complexity of data compression algorithms applicable to color television signals were studied to assess the feasibility of digital compression techniques for shuttle communications applications. For return link communications, it is shown that a nonadaptive two-dimensional DPCM technique compresses the bandwidth of field-sequential color TV to about 13 MBPS and requires less than 60 watts of secondary power. For forward link communications, a facsimile coding technique is recommended which provides high-resolution slow-scan television on a 144 KBPS channel. The onboard decoder requires about 19 watts of secondary power
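
    As a toy counterpart to the return-link scheme, the sketch below runs a nonadaptive two-dimensional DPCM loop that predicts each pixel from its reconstructed left and upper neighbors and uniformly quantizes the prediction error; the averaging predictor and the quantizer step are illustrative assumptions, not the coefficients of the shuttle design.

        import numpy as np

        def dpcm_encode(img, step=8):
            # Nonadaptive 2-D DPCM over an 8-bit grayscale frame (sketch only).
            h, w = img.shape
            recon = np.zeros((h, w))                 # decoder-matched reconstruction
            codes = np.zeros((h, w), dtype=np.int32)
            for y in range(h):
                for x in range(w):
                    left = recon[y, x - 1] if x > 0 else 128.0
                    up = recon[y - 1, x] if y > 0 else 128.0
                    pred = 0.5 * (left + up)         # predict from causal neighbors
                    q = int(round((float(img[y, x]) - pred) / step))
                    codes[y, x] = q                  # quantized residual to be entropy coded
                    recon[y, x] = min(max(pred + q * step, 0.0), 255.0)
            return codes, recon

    Only the quantized residuals would be transmitted; the decoder runs the same prediction loop, which keeps the encoder and decoder reconstructions in lockstep and avoids error accumulation.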