
    Estimating the granularity coefficient of a Potts-Markov random field within an MCMC algorithm

    This paper addresses the problem of estimating the Potts parameter B jointly with the unknown parameters of a Bayesian model within a Markov chain Monte Carlo (MCMC) algorithm. Standard MCMC methods cannot be applied to this problem because performing inference on B requires computing the intractable normalizing constant of the Potts model. In the proposed MCMC method, the estimation of B is conducted using a likelihood-free Metropolis-Hastings algorithm. Experimental results obtained for synthetic data show that estimating B jointly with the other unknown parameters leads to estimation results that are as good as those obtained with the actual value of B. On the other hand, assuming that the value of B is known can degrade estimation performance significantly if this value is incorrect. To demonstrate the practical value of the method, the proposed algorithm is successfully applied to real two-dimensional SAR and three-dimensional ultrasound images.
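
    A minimal sketch of a likelihood-free (ABC-style) Metropolis-Hastings update for the granularity coefficient B, assuming a 4-neighbour Potts label field; the helper names, step size, and tolerance are illustrative and this is not the authors' exact algorithm.

```python
# Likelihood-free MH update for the Potts parameter B (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

def neighbour_agreement(z):
    """Sufficient statistic of the Potts model: number of equal 4-neighbour pairs."""
    return np.sum(z[:, :-1] == z[:, 1:]) + np.sum(z[:-1, :] == z[1:, :])

def gibbs_sweep(z, B, K):
    """One Gibbs sweep of a K-state Potts field at granularity B."""
    rows, cols = z.shape
    for i in range(rows):
        for j in range(cols):
            counts = np.zeros(K)
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    counts[z[ni, nj]] += 1
            p = np.exp(B * counts)
            z[i, j] = rng.choice(K, p=p / p.sum())
    return z

def abc_mh_step(B, z_obs, K, step=0.05, tol=0.01, n_sweeps=3):
    """Propose B', simulate an auxiliary Potts field at B', and accept the move
    if the auxiliary field reproduces the sufficient statistic of z_obs."""
    B_prop = B + step * rng.standard_normal()
    if B_prop <= 0:
        return B
    z_aux = rng.integers(0, K, size=z_obs.shape)
    for _ in range(n_sweeps):
        z_aux = gibbs_sweep(z_aux, B_prop, K)
    s_obs, s_aux = neighbour_agreement(z_obs), neighbour_agreement(z_aux)
    if abs(s_obs - s_aux) / z_obs.size < tol:
        return B_prop
    return B
```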

    SAR Amplitude Probability Density Function Estimation Based on a Generalized Gaussian Model

    In the context of remotely sensed data analysis, an important problem is the development of accurate models for the statistics of the pixel intensities. Focusing on synthetic aperture radar (SAR) data, this modeling process is a crucial task, for instance for classification or denoising purposes. In this paper, an innovative parametric estimation methodology for SAR amplitude data is proposed that adopts a generalized Gaussian (GG) model for the complex SAR backscattered signal. A closed-form expression for the corresponding amplitude probability density function (PDF) is derived, and a specific parameter estimation algorithm is developed in order to deal with the proposed model. Specifically, the recently proposed “method of log-cumulants” (MoLC) is applied, which stems from the adoption of the Mellin transform (instead of the usual Fourier transform) in the computation of characteristic functions and from the corresponding generalization of the concepts of moment and cumulant. For the developed GG-based amplitude model, the resulting MoLC estimates turn out to be numerically feasible and are also analytically proved to be consistent. The proposed parametric approach was validated using several real ERS-1, XSAR, E-SAR, and NASA/JPL airborne SAR images, and the experimental results show that the method models the amplitude PDF better than several previously proposed parametric models for backscattering phenomena.
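
    For illustration, the MoLC workflow can be sketched on the simpler Gamma intensity case, for which the log-cumulant relations kappa_1 = ln(mu) + psi(L) - ln(L) and kappa_2 = psi'(L) are standard; the GG-specific equations derived in the paper are not reproduced here.

```python
# Method-of-log-cumulants estimation sketched on a Gamma intensity model.
import numpy as np
from scipy.special import digamma, polygamma
from scipy.optimize import brentq

def log_cumulants(samples):
    """First two empirical log-cumulants (mean and variance of log-samples)."""
    logs = np.log(samples)
    return logs.mean(), logs.var()

def fit_gamma_molc(intensity):
    k1, k2 = log_cumulants(intensity)
    # Invert psi'(L) = kappa_2 to get the equivalent number of looks L.
    L = brentq(lambda x: polygamma(1, x) - k2, 1e-3, 1e3)
    # Then kappa_1 = ln(mu) + psi(L) - ln(L) gives the mean intensity mu.
    mu = np.exp(k1 - digamma(L) + np.log(L))
    return L, mu

# Quick check on synthetic data: 4-look Gamma intensity with mean 2.0.
rng = np.random.default_rng(1)
samples = rng.gamma(shape=4.0, scale=2.0 / 4.0, size=100_000)
print(fit_gamma_molc(samples))  # should be close to (4.0, 2.0)
```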

    Statistical Modeling of SAR Images: A Survey

    Statistical modeling is essential to SAR (Synthetic Aperture Radar) image interpretation. It aims to describe SAR images through statistical methods and to reveal the characteristics of these images. Moreover, statistical modeling can provide technical support for a comprehensive understanding of terrain scattering mechanisms, which helps in developing algorithms for effective image interpretation and credible image simulation. Numerous statistical models have been developed to describe SAR image data, and the purpose of this paper is to categorize and evaluate these models. We first summarize the development history and the current state of research in statistical modeling, and then discuss in detail the different SAR image models developed from the product model. Relevant issues are also discussed, and several promising directions for future research are outlined.
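
    As a small illustration of the product model around which the survey is organized, the following sketch multiplies Gamma-distributed texture by unit-mean Gamma-distributed speckle, which yields K-distributed intensity; the parameter values are illustrative.

```python
# Product model: intensity = texture * speckle (illustrative K-distribution example).
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
L = 4.0        # equivalent number of looks (speckle shape)
alpha = 3.0    # texture shape parameter
mu = 1.0       # mean backscattered intensity

texture = rng.gamma(shape=alpha, scale=mu / alpha, size=n)   # mean-mu texture
speckle = rng.gamma(shape=L, scale=1.0 / L, size=n)          # unit-mean speckle
intensity = texture * speckle                                # K-distributed intensity

# For K-distributed intensity, E[I^2]/E[I]^2 = (1 + 1/L) * (1 + 1/alpha);
# compare the empirical normalized second moment with this value.
empirical = np.mean(intensity**2) / np.mean(intensity)**2
theoretical = (1 + 1 / L) * (1 + 1 / alpha)
print(empirical, theoretical)
```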

    A Generalized Gaussian Extension to the Rician Distribution for SAR Image Modeling

    In this paper, we present a novel statistical model, the generalized-Gaussian-Rician (GG-Rician) distribution, for the characterization of synthetic aperture radar (SAR) images. Since accurate statistical models lead to better results in applications such as target tracking, classification, or despeckling, characterizing SAR images of various scenes, including urban, sea surface, or agricultural, is essential. The proposed statistical model is based on the Rician distribution to model the amplitude of a complex SAR signal, the in-phase and quadrature components of which are assumed to be generalized-Gaussian distributed. The proposed amplitude GG-Rician model is further extended to cover intensity SAR signals. In the experimental analysis, the GG-Rician model is investigated for amplitude and intensity SAR images of various frequency bands and scenes, in comparison to state-of-the-art statistical models including the K, Weibull, Gamma, and Lognormal distributions. In order to decide on the most suitable model, statistical significance analysis via the Kullback-Leibler divergence and the Kolmogorov-Smirnov statistic is performed. The results demonstrate the superior performance and flexibility of the proposed model for all frequency bands and scenes, and its applicability to both amplitude and intensity SAR images.
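
    A hedged sketch of the goodness-of-fit comparison described above, ranking a few reference amplitude models from scipy.stats by their Kolmogorov-Smirnov statistic; the GG-Rician density itself is not available in scipy and is omitted here.

```python
# Rank candidate amplitude models by Kolmogorov-Smirnov statistic (smaller = better fit).
import numpy as np
from scipy import stats

def rank_models(amplitude):
    candidates = {
        "weibull": stats.weibull_min,
        "gamma": stats.gamma,
        "lognormal": stats.lognorm,
        "rayleigh": stats.rayleigh,
    }
    scores = {}
    for name, dist in candidates.items():
        params = dist.fit(amplitude, floc=0)                  # fix location at zero
        scores[name] = stats.kstest(amplitude, dist.cdf, args=params).statistic
    return sorted(scores.items(), key=lambda kv: kv[1])

# Example on synthetic Rayleigh-like amplitudes.
rng = np.random.default_rng(3)
amp = rng.rayleigh(scale=1.0, size=50_000)
print(rank_models(amp))
```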

    Modeling the statistics of high resolution SAR images

    In the context of remotely sensed data analysis, a crucial problem is the need to develop accurate models for the statistics of pixel intensities. In this work, we develop a parametric finite mixture model for the statistics of intensities in high resolution Synthetic Aperture Radar (SAR) images. Along with the model, we design an efficient parameter estimation scheme by integrating the Stochastic Expectation Maximization scheme and the Method of Log-Cumulants with an automatic technique to select, for each mixture component, an optimal parametric model taken from a predefined dictionary of parametric probability density functions (pdfs). In particular, the proposed dictionary consists of eight efficient state-of-the-art SAR-specific pdfs: Nakagami, log-normal, generalized Gaussian Rayleigh, heavy-tailed Rayleigh, Weibull, K-root, Fisher, and generalized Gamma. The experimental results with a set of several real SAR (COSMO-SkyMed) images demonstrate the high accuracy of the designed algorithm, both from the viewpoint of a visual comparison of the histograms and from the viewpoint of quantitative measures such as the correlation coefficient (always above 99.5%). We stress, in particular, that the method proves to be effective on all the considered images, remaining accurate for multimodal and highly heterogeneous images.
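
    A simplified sketch of the dictionary-based mixture idea, in a stochastic-EM flavour with per-component model selection by Kolmogorov-Smirnov fit; the reduced dictionary and all parameter choices are illustrative and do not reproduce the paper's full SEM/MoLC scheme.

```python
# Stochastic-EM mixture fitting with a small per-component model dictionary (sketch).
import numpy as np
from scipy import stats

DICTIONARY = {"nakagami": stats.nakagami, "lognormal": stats.lognorm,
              "weibull": stats.weibull_min}

def fit_component(samples):
    """Pick the dictionary member with the smallest Kolmogorov-Smirnov statistic."""
    best = None
    for name, dist in DICTIONARY.items():
        params = dist.fit(samples, floc=0)
        ks = stats.kstest(samples, dist.cdf, args=params).statistic
        if best is None or ks < best[2]:
            best = (name, params, ks)
    return best

def sem_mixture(x, n_comp=2, n_iter=20, seed=4):
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, n_comp, size=x.size)            # random initial labels
    for _ in range(n_iter):
        comps, weights = [], []
        for k in range(n_comp):
            xk = x[labels == k]
            comps.append(fit_component(xk))
            weights.append(xk.size / x.size)
        # Stochastic E-step: resample the labels from the posterior probabilities.
        dens = np.stack([w * DICTIONARY[name].pdf(x, *params)
                         for w, (name, params, _) in zip(weights, comps)])
        post = dens / dens.sum(axis=0)
        labels = np.array([rng.choice(n_comp, p=post[:, i]) for i in range(x.size)])
    return weights, comps
```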

    Electromagnetic models for ultrasound image processing

    Speckle noise appears when coherent illumination is employed, as for example in laser, Synthetic Aperture Radar (SAR), sonar, magnetic resonance, X-ray, and ultrasound imagery. Backscattered echoes from the randomly distributed scatterers in the microscopic structure of the medium are the origin of the speckle phenomenon, which gives coherent images their characteristic granular appearance. It can be shown that speckle noise is multiplicative in nature, strongly correlated and, more importantly, non-Gaussian. These characteristics differ greatly from the traditional assumption of additive white Gaussian noise often made in image segmentation, filtering, and image processing in general, which reduces the effectiveness of such methods for extracting information from the final image; this kind of noise therefore severely impairs both human and machine interpretation of the images. Statistical modeling is of particular relevance when dealing with speckled data in order to obtain efficient image processing algorithms. In addition, clinical ultrasound imaging systems employ nonlinear signal processing to reduce the dynamic range of the input echo signal to match the smaller dynamic range of the display device and to emphasize objects with weak backscatter. This reduction in dynamic range is normally achieved through a logarithmic amplifier (logarithmic compression), which selectively compresses large input signals. This kind of nonlinear compression completely changes the statistics of the input envelope signal, and a closed-form expression for the density function of the log-transformed data is usually hard to derive. This thesis is concerned with the statistical distributions of the log-compressed amplitude signal in coherent imagery, and its main objective is to develop a general statistical model for log-compressed ultrasound B-scan images. The model is adapted, with the pertinent physical analogies, from the multiplicative model used in the Synthetic Aperture Radar (SAR) context. It is shown that the proposed model can successfully describe log-compressed data generated from different models proposed in the specialized ultrasound image processing literature. The model is also successfully applied to in-vivo echocardiographic (ultrasound) B-scan images. The necessary theorems are established to provide a rigorous mathematical proof of the validity and generality of the model. Additionally, a physical interpretation of the parameters is given, and connections are established between the generalized central limit theorem, the multiplicative model, and the compound-representation approaches underlying the models proposed to date. The log-amplifier parameters are included as model parameters, and all model parameters are estimated using method-of-moments and maximum-likelihood estimators. Finally, three applications are developed: speckle noise identification and filtering, segmentation of in-vivo echocardiographic (ultrasound) B-scan images, and a novel approach to heart ejection fraction evaluation.
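
    As a small illustration of the log-compression analysis the thesis builds on, a Rayleigh envelope passed through a logarithmic amplifier y = D*ln(A) + c has Var[y] = D^2 * pi^2 / 24, so the amplifier gain D can be recovered from the variance of the compressed data alone; the values below are illustrative.

```python
# Recovering the log-amplifier gain from log-compressed Rayleigh speckle (sketch).
import numpy as np

rng = np.random.default_rng(5)
sigma, D, c = 1.5, 20.0, 10.0                     # envelope scale, amplifier gain, offset

amplitude = rng.rayleigh(scale=sigma, size=200_000)
compressed = D * np.log(amplitude) + c            # log-compressed B-scan samples

# For a Rayleigh envelope, Var[ln A] = pi^2/24, so Var[y] = D^2 * pi^2/24.
D_hat = np.sqrt(24.0 * compressed.var() / np.pi**2)
print(D_hat)  # should be close to 20.0
```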

    Constant False Alarm Rate Target Detection in Synthetic Aperture Radar Imagery

    Target detection plays a significant role in many synthetic aperture radar (SAR) applications, ranging from surveillance of military tanks and enemy territories to crop monitoring in agriculture. Target detection faces two major problems: first, how to remotely acquire high-resolution images of targets, and second, how to efficiently extract information about the features of clutter-embedded targets. The first problem is addressed by the use of high-penetration radar such as synthetic aperture radar; the second is tackled by efficient algorithms for accurate and fast detection. Many target detection methods are available for SAR imagery, such as CFAR, the generalized likelihood ratio test (GLRT), multiscale autoregressive methods, and wavelet-transform-based methods. The CFAR method has been used extensively because of its attractive features, such as simple computation and fast target detection. The CFAR algorithm incorporates a precise statistical description of the background clutter, which determines how accurately targets are detected. The primary goal of this project is to investigate the statistical distribution of SAR background clutter from homogeneous and heterogeneous ground areas and to analyze the suitability of the statistical distributions used to model SAR clutter. The detection threshold must be computed accurately from the clutter distribution so as to distinguish targets from SAR clutter efficiently. Several distributions, including the lognormal, Weibull, K, KK, G0, and generalized Gamma (GGD) distributions, are considered for clutter amplitude modeling in SAR images. The CFAR detection algorithm based on the appropriate background clutter distribution is applied to Moving and Stationary Target Acquisition and Recognition (MSTAR) images. The experimental results show that the GGD-based CFAR detector outperforms CFAR detectors based on the lognormal, Weibull, K, KK, and G0 distributions in terms of accuracy and computation time.
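
    A hedged sketch of a CFAR test with a fitted clutter model, using scipy's generalized Gamma (stats.gengamma) for the reference-window clutter and the (1 - Pfa) quantile of the fitted distribution as the threshold; the window sizes and Pfa are illustrative.

```python
# CFAR decision for one cell under test, with a fitted generalized-Gamma clutter model.
import numpy as np
from scipy import stats

def cfar_decision(image, row, col, pfa=1e-4, guard=2, train=8):
    half = guard + train
    window = image[max(row - half, 0):row + half + 1,
                   max(col - half, 0):col + half + 1].copy()
    # Blank out the guard region and the cell under test, keep the training cells.
    r0, c0 = row - max(row - half, 0), col - max(col - half, 0)
    window[max(r0 - guard, 0):r0 + guard + 1,
           max(c0 - guard, 0):c0 + guard + 1] = np.nan
    clutter = window[~np.isnan(window)]
    a, c, loc, scale = stats.gengamma.fit(clutter, floc=0)
    threshold = stats.gengamma.ppf(1.0 - pfa, a, c, loc=loc, scale=scale)
    return image[row, col] > threshold, threshold

# Example on synthetic Gamma clutter with one bright pixel at the centre.
rng = np.random.default_rng(6)
img = rng.gamma(shape=3.0, scale=1.0, size=(64, 64))
img[32, 32] = 60.0
print(cfar_decision(img, 32, 32))  # expected: (True, threshold)
```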