
    Electromagnetic models for ultrasound image processing

    Speckle noise appears when coherent illumination is employed, as in Laser, Synthetic Aperture Radar (SAR), Sonar, Magnetic Resonance, X-ray and Ultrasound imagery. Backscattered echoes from the randomly distributed scatterers in the microscopic structure of the medium are the origin of the speckle phenomenon, which gives coherent imaging its granular appearance. It can be shown that speckle noise is multiplicative in nature, strongly correlated and, most importantly, non-Gaussian in its statistics. These characteristics differ greatly from the traditional assumption of additive white Gaussian noise often made in image segmentation, filtering and image processing in general, which reduces the effectiveness of those methods for extracting information from the final image; this kind of noise therefore severely impairs the ability of both humans and machines to interpret images. Statistical modeling is of particular relevance when dealing with speckled data in order to obtain efficient image processing algorithms. In addition, clinical ultrasound imaging systems employ nonlinear signal processing to reduce the dynamic range of the input echo signal to match the smaller dynamic range of the display device and to emphasize objects with weak backscatter. This reduction in dynamic range is normally achieved through a logarithmic amplifier, i.e. logarithmic compression, which selectively compresses large input signals. This nonlinear compression completely changes the statistics of the input envelope signal, and a closed-form expression for the density function of the log-transformed data is usually hard to derive. This thesis is concerned with the statistical distributions of the log-compressed amplitude signal in coherent imagery, and its main objective is to develop a general statistical model for log-compressed ultrasound B-scan images.
The developed model is adapted, by making the pertinent physical analogies, from the multiplicative model used in the Synthetic Aperture Radar (SAR) context. It is shown that the proposed model can successfully describe log-compressed data generated from the different models proposed in the specialized ultrasound image processing literature. The model is also successfully applied to in-vivo echocardiographic (ultrasound) B-scan images. The necessary theorems are established to provide a rigorous mathematical proof of the validity and generality of the model. Additionally, a physical interpretation of the parameters is given, and the connections between the generalized central limit theorems, the multiplicative model and the compound-representation approaches of the models proposed to date are established. It is shown that the log-amplifier parameters are included as model parameters, and all model parameters are estimated using the method of moments and maximum likelihood. Finally, three applications are developed: speckle noise identification and filtering; segmentation of in-vivo echocardiographic (ultrasound) B-scan images; and a novel approach to heart ejection fraction evaluation.
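The statistical effect the thesis targets can be sketched numerically: under the fully developed speckle assumption the envelope is Rayleigh-distributed, and passing it through a logarithmic amplifier changes the shape of the distribution entirely. The gain and offset names below are illustrative stand-ins, not the thesis's notation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fully developed speckle: the envelope A of the complex backscattered
# field (many independent scatterers per resolution cell) is Rayleigh.
n = 200_000
A = np.hypot(rng.normal(size=n), rng.normal(size=n))

# Hypothetical log-amplifier: D = g*ln(A) + offset (illustrative parameters).
g, offset = 20.0, 30.0
D = g * np.log(A) + offset

def skewness(x):
    x = x - x.mean()
    return (x**3).mean() / (x**2).mean() ** 1.5

print(skewness(A))  # positive: the Rayleigh envelope is right-skewed
print(skewness(D))  # negative: log compression flips the skew entirely
```

The log-compressed output is no longer Rayleigh (it is Fisher-Tippett-like), which is exactly why a closed-form density for the compressed data is hard to obtain and a dedicated statistical model is needed.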

    Hypothesis Testing Using Spatially Dependent Heavy-Tailed Multisensor Data

    The detection of spatially dependent heavy-tailed signals is considered in this dissertation. While the central limit theorem, and its implication of asymptotic normality for interacting random processes, is generally useful for the theoretical characterization of a wide variety of natural and man-made signals, sensor data from many applications are in fact characterized by non-Gaussian distributions. A common characteristic of non-Gaussian data is the presence of heavy (or fat) tails: the probability density function (p.d.f.) of extreme values decays at a slower-than-exponential rate, implying that extreme events occur with greater probability. When these events are observed simultaneously by several sensors, the observations are also spatially dependent. In this dissertation, we develop the theory of detection for such data, obtained through heterogeneous sensors. To validate our theoretical results and proposed algorithms, we collect and analyze indoor footstep data using a linear array of seismic sensors. We characterize the inter-sensor dependence using copula theory. Copulas are parametric functions that bind univariate p.d.f.s to generate a valid joint p.d.f. We model the heavy-tailed data using the class of alpha-stable distributions. We consider a two-sided test in the Neyman-Pearson framework and present an asymptotic analysis of the generalized likelihood ratio test (GLRT). Both nested and non-nested models are considered in the analysis. We also use a likelihood-maximization-based copula selection scheme as an integral part of the detection process: since many types of copula functions are available in the literature, selecting the appropriate copula becomes an important component of the detection problem. The performance of the proposed scheme is evaluated numerically on simulated data, as well as on indoor seismic data.
With appropriately selected models, our results demonstrate that a high probability of detection can be achieved for false alarm probabilities of the order of 10^-4. These results, using dependent alpha-stable signals, are presented for a two-sensor case. We identify the computational challenges associated with dependent alpha-stable modeling and propose alternative schemes to extend the detector design to a multisensor (multivariate) setting. We use a hierarchical tree-based approach, called vines, to model the multivariate copulas, i.e., the spatial dependence between multiple sensors. The performance of the proposed detectors under the vine-based scheme is evaluated on the indoor footstep data, and significant improvement is observed compared with the case when only two sensors are deployed. Some open research issues are identified and discussed.
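The core modeling idea, binding heavy-tailed marginals with a copula to induce inter-sensor dependence, can be sketched in a few lines. This is an illustration, not the dissertation's pipeline: a Clayton copula (sampled by conditional inversion) stands in for the selected copula family, and a Pareto marginal stands in for the alpha-stable laws, since like them it has power-law tails but, unlike them, has a closed-form quantile function.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Clayton copula sampler via conditional inversion: u2 is drawn from the
# conditional copula C(u2 | u1), giving dependent uniform marginals.
theta = 2.0
u1 = rng.uniform(size=n)
v = rng.uniform(size=n)
u2 = (u1**-theta * (v ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)

# Heavy-tailed marginals by inverse-CDF transform (standard Pareto,
# tail index alpha, as a tractable stand-in for alpha-stable marginals).
alpha = 1.5
x1 = (1 - u1) ** (-1 / alpha)  # "sensor 1" observations
x2 = (1 - u2) ** (-1 / alpha)  # "sensor 2" observations

print(np.corrcoef(u1, u2)[0, 1])  # copula-induced spatial dependence
print(np.mean(x1 > 10))           # non-negligible extreme-event probability
```

A GLRT-style detector would then evaluate the joint likelihood, copula density times marginal densities, under each hypothesis; for true alpha-stable marginals the density must be computed numerically, which is one of the computational challenges the dissertation addresses.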

    Breaking the waves: asymmetric random periodic features for low-bitrate kernel machines

    Many signal processing and machine learning applications are built from evaluating a kernel on pairs of signals, e.g. to assess the similarity of an incoming query to a database of known signals. This nonlinear evaluation can be simplified to a linear inner product of the random Fourier features of those signals: random projections followed by a periodic map, the complex exponential. It is known that a simple quantization of those features (corresponding to replacing the complex exponential by a different periodic map that takes binary values, which is appealing for transmission and storage) distorts the approximated kernel, which may be undesirable in practice. Our take-home message is that when the features of only one of the two signals are quantized, the original kernel is recovered without distortion; this is of practical interest in several cases where the kernel evaluations are asymmetric by nature, such as a client-server scheme. Concretely, we introduce the general framework of asymmetric random periodic features, where the two signals of interest are observed through random periodic features: random projections followed by a general periodic map, which is allowed to be different for the two signals. We derive the influence of those periodic maps on the approximated kernel, and prove uniform probabilistic error bounds holding for all signal pairs from an infinite low-complexity set. Interestingly, our results allow the periodic maps to be discontinuous, thanks to a new mathematical tool, the mean Lipschitz smoothness. We then apply this generic framework to semi-quantized kernel machines (where only one signal has quantized features and the other has classical random Fourier features), for which we show theoretically that the approximated kernel remains unchanged (with the associated error bound), and confirm the power of the approach with numerical simulations.
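The semi-quantized effect can be checked numerically under the standard real-valued random Fourier feature convention (the paper's exact maps and normalizations may differ). Quantizing one side replaces the cosine by a binary square wave; since only the square wave's fundamental harmonic, with Fourier coefficient 4/π, correlates with the cosine on the other side, rescaling the semi-quantized inner product by π/4 recovers the original kernel in expectation.

```python
import numpy as np

rng = np.random.default_rng(2)
d, D = 5, 40_000  # input dimension, number of random features

x = 0.3 * rng.normal(size=d)
y = 0.3 * rng.normal(size=d)

# Random Fourier features for the Gaussian kernel k(x,y) = exp(-||x-y||^2 / 2):
W = rng.normal(size=(D, d))            # frequencies from the spectral measure
b = rng.uniform(0, 2 * np.pi, size=D)  # random phases

phi = lambda z: np.sqrt(2.0 / D) * np.cos(W @ z + b)          # full-precision map
q = lambda z: np.sqrt(2.0 / D) * np.sign(np.cos(W @ z + b))   # 1-bit (square-wave) map

k_true = np.exp(-0.5 * np.sum((x - y) ** 2))
k_full = phi(x) @ phi(y)                # both sides full-precision
k_semi = (np.pi / 4) * (phi(x) @ q(y))  # one side quantized, rescaled by pi/4

print(k_true, k_full, k_semi)  # all three agree up to Monte-Carlo error
```

Quantizing both sides, by contrast, would mix the higher harmonics of the two square waves and distort the kernel, which is precisely the asymmetry the paper exploits.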

    Contourlet Domain Image Modeling and its Applications in Watermarking and Denoising

    Statistical image modeling in the sparse domain has recently attracted a great deal of research interest. The contourlet transform, a two-dimensional transform with multiscale and multidirectional properties, is known to effectively capture the smooth contours and geometrical structures in images. The objective of this thesis is to study the statistical properties of the contourlet coefficients of images and to develop statistically based image denoising and watermarking schemes. Through an experimental investigation, it is first established that the distributions of the contourlet subband coefficients of natural images are significantly non-Gaussian with heavy tails, and that they are best described by heavy-tailed statistical distributions such as the alpha-stable family. It is shown that the univariate members of this family can accurately fit the marginal distributions of the empirical data, and that the bivariate members can accurately characterize the inter-scale dependencies of the contourlet coefficients of an image. Based on these modeling results, a new method for image denoising in the contourlet domain is proposed. Bayesian maximum a posteriori and minimum mean absolute error estimators are developed to determine the noise-free contourlet coefficients of grayscale and color images. Extensive experiments are conducted using a wide variety of images from a number of databases to evaluate the performance of the proposed image denoising scheme and to compare it with that of other existing schemes. It is shown that the proposed denoising scheme based on the alpha-stable distributions outperforms these other methods in terms of the peak signal-to-noise ratio and mean structural similarity index, as well as in terms of the visual quality of the denoised images. The alpha-stable model is also used to develop new multiplicative watermarking schemes for grayscale and color images.
Closed-form expressions are derived for the log-likelihood-based multiplicative watermark detection algorithm for grayscale images using the univariate and bivariate Cauchy members of the alpha-stable family. A multiplicative multichannel watermark detector is also designed for color images using the multivariate Cauchy distribution. Simulation results demonstrate not only the effectiveness of the proposed image watermarking schemes in terms of the invisibility of the watermark, but also the superiority of the watermark detectors, which provide detection rates higher than those of state-of-the-art schemes even for watermarked images that have undergone various kinds of attacks.
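A toy version of a Cauchy-based multiplicative watermark detector illustrates the idea. This is a simplification, not the thesis's detector: the host coefficients are drawn as i.i.d. univariate Cauchy samples rather than actual contourlet subband coefficients, and the dispersion, embedding strength and watermark length are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 16_384
gamma0 = 1.0  # Cauchy dispersion of the host coefficients (illustrative)

# Cauchy host "coefficients" via inverse-CDF sampling.
x = gamma0 * np.tan(np.pi * (rng.uniform(size=n) - 0.5))
w = rng.choice([-1.0, 1.0], size=n)  # bipolar watermark sequence
alpha = 0.1                          # embedding strength

def log_cauchy(z):
    return np.log(gamma0 / (np.pi * (gamma0**2 + z**2)))

def llr(y):
    # Log-likelihood ratio: H1 (y = x*(1 + alpha*w)) vs H0 (y = x).
    # Under H1 the density of y is f(y/s)/|s| with s = 1 + alpha*w.
    s = 1.0 + alpha * w
    return np.sum(log_cauchy(y / s) - np.log(np.abs(s)) - log_cauchy(y))

y_marked = x * (1 + alpha * w)  # multiplicative embedding
print(llr(y_marked))  # markedly larger for the watermarked data
print(llr(x))         # unmarked data scores much lower
```

Comparing the statistic against a threshold chosen for a target false-alarm rate yields the detector; the thesis derives such rules in closed form for the univariate, bivariate and multivariate Cauchy cases.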

    A whispering gallery mode based biosensor platform for single enzyme analysis

    Enzymes catalyze most of the biochemical reactions in our cells. The functionality of enzymes depends on their dynamics, ranging from small bond vibrations on the femtosecond timescale to large domain motions on the microsecond-to-millisecond timescale. Understanding the precise and rapid positioning of atoms within a catalytic site by an enzyme's molecular movements is crucial for understanding biomolecular processes and, in the longer term, for realizing synthetic biomolecular machines. Hence, sensors capable of studying enzymes over a wide range of amplitudes and timescales, ideally one enzyme at a time, are required. Many capable single-molecule techniques have been established in the past three decades, each with its pros and cons. This thesis presents the development of one such single-molecule sensor. The sensor is based on plasmonically enhanced whispering gallery mode resonators and is capable of studying enzyme kinetics and large-scale dynamics over timescales from nanoseconds to seconds. Unlike fluorescence techniques, which require labeling the enzymes with dyes, the technique presented in this work detects single enzymes immobilized on the surface of plasmonic gold nanoparticles. A fast, low-noise lock-in method is used to extract sensor signals on the microsecond timescale. Using a model enzyme, the ability of the sensor to detect conformational fluctuations of single enzymes is shown. Further, the thermodynamics of the enzyme is studied, and the relevant thermodynamic parameters, including the associated heat capacity changes, are extracted from the single-molecule data. The sensor system presented in this thesis could in the future enable a fast, real-time, high-throughput, lab-on-chip platform for studying single enzymes in both research and clinical use. Engineering and Physical Sciences Research Council (EPSRC)

    Fifty Years of Noise Modeling and Mitigation in Power-Line Communications

    Building on the ubiquity of electric power infrastructure, power line communications (PLC) has been successfully used in diverse application scenarios, including the smart grid, in-home broadband communications, and industrial and home automation. However, the power line channel exhibits deleterious properties, one of which is its hostile noise environment. This article provides a review of noise modeling and mitigation techniques in PLC. Specifically, a comprehensive review of representative noise models developed over the past fifty years is presented, including both empirical models based on measurement campaigns and simplified mathematical models. We then provide an extensive survey of noise mitigation schemes, categorizing them into mitigation at the transmitter and parametric and non-parametric techniques employed at the receiver. Furthermore, since the accuracy of channel estimation in PLC is affected by noise, we review the literature on joint noise mitigation and channel estimation. Finally, a number of directions are outlined for future research on both noise modeling and mitigation in PLC.