
    A method of sieves for multiresolution spectrum estimation and radar imaging


    Quantifying Antarctic icebergs and their melting in the ocean.

    An average of 2000 km3 of icebergs calves every year from the Antarctic Ice Sheet into the Southern Ocean. The meltwater is spread over a large area of the Southern Ocean, but the large temporal variability in iceberg calving and the clustering of the iceberg distribution mean that meltwater injection can be locally very high. This study quantifies iceberg distribution, movement and melting using remote sensing observations and modelling. Icebergs were detected and tracked on Synthetic Aperture Radar (SAR) images using a new computer-based iceberg detection method. The method allows efficient and systematic processing of the large volumes of SAR images necessary to build a climatology of icebergs in the Southern Ocean. Tests were conducted using ground data from a field campaign and against manual image classification. The method was applied to several SAR image collections, notably the RADARSAT RAMP mosaic covering the totality of coastal Antarctica, providing the first picture of iceberg distribution over such a large area. Giant icebergs (icebergs above 100 km2 in area) were shown to carry over half the total mass of the Antarctic iceberg population. Estimates of the spatial distribution of giant iceberg melting over the ocean were made using observed tracks and by modelling the melting and spreading of each iceberg along its path. The modelling of basal melting was tested using ICESat laser altimetry to measure the reduction in the freeboard of three giant icebergs in the Ross Sea. The distribution of meltwater from giant icebergs was combined with an existing simulation of meltwater distribution from smaller icebergs to produce the first map of total iceberg meltwater for the Southern Ocean. The iceberg contribution to the freshwater flux is shown to be relevant to both the Weddell Sea and the Southern Ocean south of the Polar Front.
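
    The basal-melt check against ICESat rests on converting a measured change in freeboard into a change in total ice thickness. The sketch below illustrates that conversion under the usual hydrostatic-equilibrium assumption; the density values, function names and example numbers are illustrative assumptions, not values taken from the study.

        # Minimal sketch (not the study's code): convert a measured freeboard change
        # into an implied thickness change, assuming the iceberg floats in
        # hydrostatic equilibrium. Density values below are illustrative assumptions.

        RHO_ICE = 850.0     # bulk density of iceberg ice and firn, kg/m^3 (assumed)
        RHO_WATER = 1025.0  # sea water density, kg/m^3 (assumed)

        def thickness_from_freeboard(freeboard_m):
            """Total thickness implied by the freeboard: rho_ice*H = rho_water*(H - F)."""
            return freeboard_m * RHO_WATER / (RHO_WATER - RHO_ICE)

        def mean_thinning_rate(freeboard_start_m, freeboard_end_m, days):
            """Average thinning rate (m/day) implied by two freeboard estimates."""
            dH = (thickness_from_freeboard(freeboard_start_m)
                  - thickness_from_freeboard(freeboard_end_m))
            return dH / days

        # Example: a 2 m freeboard drop over one year implies roughly 11.7 m of
        # thinning, i.e. about 0.03 m/day on average.
        print(mean_thinning_rate(45.0, 43.0, 365.0))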

    Image Registration Workshop Proceedings

    Automatic image registration has often been considered a preliminary step for higher-level processing, such as object recognition or data fusion. But with the unprecedented amounts of data which are being, and will continue to be, generated by newly developed sensors, automatic image registration has itself become an important research topic. This workshop presents a collection of very high quality work grouped in four main areas: (1) theoretical aspects of image registration; (2) applications to satellite imagery; (3) applications to medical imagery; and (4) image registration for computer vision research.

    Multi-scale texture segmentation of synthetic aperture radar images

    EThOS - Electronic Theses Online Service, United Kingdom.

    Data Hiding in Digital Video

    With the rapid development of digital multimedia technologies, an old technique called steganography has been sought as a solution for data hiding applications such as digital watermarking and covert communication. Steganography is the art of secret communication using a cover signal, e.g., video, audio or image, whereas the counter-technique of detecting the existence of such a channel through a statistically trained classifier is called steganalysis. State-of-the-art data hiding algorithms utilize features of the cover signal, such as Discrete Cosine Transform (DCT) coefficients, pixel values and motion vectors, to convey the message to the receiver side. The goal of an embedding algorithm is to maximize the number of bits sent to the decoder side (embedding capacity) with maximum robustness against attacks, while keeping the perceptual and statistical distortions (security) low. Data hiding schemes are characterized by these three conflicting requirements: security against steganalysis, robustness against channel-associated and/or intentional distortions, and the capacity in terms of the embedded payload. Depending upon the application, it is the designer's task to find an optimum solution amongst them. The goal of this thesis is to develop a novel data hiding scheme that establishes a covert channel satisfying statistical and perceptual invisibility, with moderate rate capacity and robustness to combat steganalysis-based detection. The idea behind the proposed method is the alteration of Video Object (VO) trajectory coordinates to convey the message to the receiver side by perturbing the centroid coordinates of the VO. Firstly, the VO is selected by the user and tracked through the frames using a simple region-based search strategy and morphological operations. After the trajectory coordinates are obtained, the perturbation of the coordinates is implemented through a non-linear embedding function, such as a polar quantizer in which both the magnitude and phase of the motion are used. However, the perturbations made to the motion magnitude and phase were kept small to preserve the semantic meaning of the object motion trajectory. The proposed method is well suited to video sequences in which VOs have smooth motion trajectories. Examples of this type can be found in sports videos in which the ball is the focus of attention and exhibits various motion types, e.g., rolling on the ground, flying in the air, or being possessed by a player. Different sports video sequences were tested using the proposed method. The experimental results show that the proposed method achieves the goal of both statistical and perceptual invisibility with moderate rate embedding capacity under an AWGN channel with varying noise variances. This achievement is important because the first step for both active and passive steganalysis is the detection of the existence of a covert channel. This work makes multiple contributions to the field of data hiding. Firstly, it is the first example of a data hiding method in which the trajectory of a VO is used. Secondly, it contributes towards improving steganographic security by providing new features: the coordinate location and the semantic meaning of the object.
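
    The embedding step described above, perturbing the polar representation of the VO's per-frame motion, can be illustrated with a quantization-index-modulation (QIM) style sketch. The step sizes, function names and the even/odd-lattice encoding below are assumptions made for illustration, not the thesis design (which may also carry payload in the phase).

        import math

        # Minimal QIM-style sketch (an assumption, not the thesis algorithm): embed one
        # bit per frame by quantizing the polar representation (magnitude, phase) of the
        # motion vector between consecutive video-object centroids.

        MAG_STEP = 0.8            # quantization step for motion magnitude, in pixels (assumed)
        PHASE_STEP = math.pi / 32 # quantization step for motion phase, in radians (assumed)

        def _qim(value, step, bit):
            """Quantize to the lattice selected by `bit` (multiples of step, offset by step/2 for bit 1)."""
            offset = 0.0 if bit == 0 else step / 2.0
            return round((value - offset) / step) * step + offset

        def embed_bit(prev_centroid, centroid, bit):
            """Return a perturbed centroid whose motion from prev_centroid carries `bit` in its magnitude."""
            dx, dy = centroid[0] - prev_centroid[0], centroid[1] - prev_centroid[1]
            mag, phase = math.hypot(dx, dy), math.atan2(dy, dx)
            mag = _qim(mag, MAG_STEP, bit)                    # small perturbation keeps motion plausible
            phase = round(phase / PHASE_STEP) * PHASE_STEP    # phase quantized too, but carries no payload here
            return (prev_centroid[0] + mag * math.cos(phase),
                    prev_centroid[1] + mag * math.sin(phase))

        def extract_bit(prev_centroid, centroid):
            """Decode the bit by checking which lattice the motion magnitude is closer to."""
            dx, dy = centroid[0] - prev_centroid[0], centroid[1] - prev_centroid[1]
            mag = math.hypot(dx, dy)
            d0 = abs(mag - _qim(mag, MAG_STEP, 0))
            d1 = abs(mag - _qim(mag, MAG_STEP, 1))
            return 0 if d0 <= d1 else 1

        # Round trip on a toy trajectory point:
        p0, p1 = (120.0, 80.0), (124.3, 83.1)
        stego = embed_bit(p0, p1, 1)
        assert extract_bit(p0, stego) == 1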

    NASA SBIR abstracts of 1991 phase 1 projects

    The objectives of 301 projects placed under contract by the Small Business Innovation Research (SBIR) program of the National Aeronautics and Space Administration (NASA) are described. These projects were selected competitively from among proposals submitted to NASA in response to the 1991 SBIR Program Solicitation. The basic document consists of edited, non-proprietary abstracts of the winning proposals submitted by small businesses. The abstracts are presented under the 15 technical topics within which Phase 1 proposals were solicited. Each project was assigned a sequential identifying number from 001 to 301, in order of its appearance in the body of the report. Appendixes are included that provide additional information about the SBIR program and permit cross-reference of the 1991 Phase 1 projects by company name, location by state, principal investigator, NASA Field Center responsible for management of each project, and NASA contract number.

    Program and Abstracts of the Annual Meeting of the Georgia Academy of Science, 2011

    The annual meeting of the Georgia Academy of Science took place March 23–24, 2011, at Gainesville State College, Oakwood, Georgia. Presentations were provided by members of the Academy representing the following sections: I. Biological Sciences, II. Chemistry, III. Earth & Atmospheric Sciences, IV. Physics, Mathematics, Computer Science, Engineering & Technology, V. Biomedical Sciences, VI. Philosophy & History of Science, VII. Science Education, and VIII. Anthropology.

    A comparative study of the Rosenblatt-Parzen and Grenander methods for density estimation

    Advisor: Mauro S. de Freitas Marques. Master's dissertation (Mestre em Estatística) - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica. Abstract: Since 1890, different ways of estimating a probability density function have been proposed. One of them, due to Pearson between 1890 and 1900, is obtained as the solution of a differential equation (Johnson, N. & Kotz, S., 1988). Since 1956, non-parametric methods for estimating probability density functions have established themselves as a sophisticated alternative to the traditional treatment of data sets, since they allow the data to be analysed without assuming a specific distributional form. Chapter I of this dissertation deals with the problem of estimating probability density functions; it also briefly describes some of the proposals for obtaining such estimators and defines the statistical properties studied in the different situations considered. Chapter II is devoted to two proposed density estimators. The first is the Rosenblatt-Parzen estimator: its initial ideas are due to Rosenblatt (1956) and were later generalized by Parzen (1962), yielding what is now known as the Rosenblatt-Parzen or "kernel" estimator. Next, a particular case of the estimator proposed by Grenander (1981) is studied; the estimator obtained with this methodology is known as the Grenander or convolution "sieves" estimator. Geman & Hwang (1982) derived the form of the Grenander estimator when the Gaussian density is used as the kernel function in the convolution. In Chapter III we study how to obtain the form of the convolution "sieves" estimator in more general situations, in two different ways: one is a generalization of the ideas of Geman & Hwang (1982), and the other uses the incomplete-data model. These results constitute the main theoretical contribution. Since the Grenander estimators obtained through convolutions can be viewed as finite mixtures of densities, Chapter IV studies the EM algorithm for this particular class of models: we present the general theory of density mixture models and prove that the EM algorithm can be used to estimate densities following Grenander's convolution proposal, illustrating the algorithm when the Gaussian density is used in the convolution. Finally, in Chapter V, we compare the performance of the Grenander estimator with that of the Rosenblatt-Parzen estimator using data simulated from different density functions.
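
    As a complement to the comparison described above, the sketch below contrasts the Rosenblatt-Parzen kernel estimator with a convolution-sieve (Grenander) estimator written in its finite Gaussian-mixture form and fitted by EM, in the spirit of Geman & Hwang (1982). The bandwidth, number of mixture components, fixed component width and simulated data are illustrative assumptions, not the dissertation's choices.

        import numpy as np

        rng = np.random.default_rng(0)

        # Simulated sample from a bimodal density (illustrative choice).
        x = np.concatenate([rng.normal(-2.0, 0.7, 300), rng.normal(1.5, 1.0, 200)])
        grid = np.linspace(-6, 6, 400)

        def gauss(u, mu, sigma):
            return np.exp(-0.5 * ((u - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

        def rosenblatt_parzen(x, grid, h):
            """Kernel (Rosenblatt-Parzen) estimator with a Gaussian kernel and bandwidth h."""
            return gauss(grid[:, None], x[None, :], h).mean(axis=1)

        def grenander_sieve(x, grid, m=10, sigma=0.5, iters=200):
            """Convolution-sieve estimator in finite Gaussian-mixture form, fitted by EM:
            m components with fixed width sigma, free means and mixing weights."""
            mu = np.quantile(x, np.linspace(0.05, 0.95, m))  # initial component means
            w = np.full(m, 1.0 / m)                          # initial mixing weights
            for _ in range(iters):
                resp = w * gauss(x[:, None], mu[None, :], sigma)  # E-step: responsibilities
                resp /= resp.sum(axis=1, keepdims=True)
                w = resp.mean(axis=0)                                     # M-step: weights
                mu = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)   # M-step: means
            return (w * gauss(grid[:, None], mu[None, :], sigma)).sum(axis=1)

        f_kernel = rosenblatt_parzen(x, grid, h=0.4)
        f_sieve = grenander_sieve(x, grid)
        dx = grid[1] - grid[0]
        print(f_kernel.sum() * dx, f_sieve.sum() * dx)  # both estimates integrate to roughly 1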