
    Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST'14)

    The implicit objective of the biennial "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST) is to foster collaboration between international scientific teams by disseminating ideas through both specific oral/poster presentations and free discussions. For its second edition, the iTWIST workshop took place in the medieval and picturesque town of Namur, Belgium, from Wednesday, August 27th, to Friday, August 29th, 2014. The workshop was conveniently located in "The Arsenal" building, within walking distance of both the hotels and the town centre. iTWIST'14 gathered about 70 international participants and featured 9 invited talks, 10 oral presentations, and 14 posters on the following themes, all related to the theory, application and generalization of the "sparsity paradigm": Sparsity-driven data sensing and processing; Union of low-dimensional subspaces; Beyond linear and convex inverse problems; Matrix/manifold/graph sensing/processing; Blind inverse problems and dictionary learning; Sparsity and computational neuroscience; Information theory, geometry and randomness; Complexity/accuracy trade-offs in numerical methods; Sparsity? What's next?; Sparse machine learning and inference. Comment: 69 pages, 24 extended abstracts, iTWIST'14 website: http://sites.google.com/site/itwist1

    Wavelet Theory

    The wavelet is a powerful mathematical tool that plays an important role in science and technology. This book surveys some of the most creative and popular applications of wavelets, including biomedical signal processing, image processing, communication signal processing, the Internet of Things (IoT), acoustical signal processing, financial market data analysis, energy and power management, and COVID-19 pandemic measurements and calculations. The editor’s personal interest lies in applying the wavelet transform to identify time-domain changes in signals and their corresponding frequency components, and in improving power amplifier behaviour.

    A Panorama on Multiscale Geometric Representations, Intertwining Spatial, Directional and Frequency Selectivity

    The richness of natural images makes the quest for optimal representations in image processing and computer vision challenging. This observation has not prevented the design of image representations that trade off between efficiency and complexity, while achieving accurate rendering of smooth regions as well as reproducing faithful contours and textures. The most recent ones, proposed in the past decade, share a hybrid heritage highlighting the multiscale and oriented nature of edges and patterns in images. This paper presents a panorama of the aforementioned literature on decompositions in multiscale, multi-orientation bases or dictionaries. They typically exhibit redundancy to improve sparsity in the transformed domain and sometimes its invariance with respect to simple geometric deformations (translation, rotation). Oriented multiscale dictionaries extend traditional wavelet processing and may offer rotation invariance. Highly redundant dictionaries require specific algorithms to simplify the search for an efficient (sparse) representation. We also discuss the extension of multiscale geometric decompositions to non-Euclidean domains such as the sphere or arbitrary meshed surfaces. The etymology of panorama suggests an overview, based on a choice of partially overlapping "pictures". We hope that this paper will contribute to the appreciation and apprehension of a stream of current research directions in image understanding. Comment: 65 pages, 33 figures, 303 references

    Medical image enhancement

    Each image acquired from a medical imaging system is often part of a two-dimensional (2-D) image set that together presents a three-dimensional (3-D) object for diagnosis. Unfortunately, these images are sometimes of poor quality, and the resulting distortions give an inadequate presentation of the object of interest, which can lead to inaccurate image analysis. Blurring is considered a particularly serious problem; therefore, "deblurring" an image to obtain better quality is an important issue in medical image processing. In our research, the image is first decomposed. Contrast improvement is achieved by modifying the coefficients obtained from the decomposition: small coefficient values represent subtle details and are amplified to improve the visibility of the corresponding details, whereas the stronger image density variations, which make a major contribution to the overall dynamic range, have large coefficient values that can be reduced without much loss of information.
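    The coefficient-modification idea described above can be sketched in one dimension with a single-level Haar wavelet transform: small detail coefficients (subtle details) are amplified, large ones (strong density variations) are attenuated, and the signal is reconstructed. The gain factors and threshold below are illustrative assumptions, not the paper's actual parameters.

    ```python
    import math

    def haar_forward(x):
        """One-level Haar transform: returns (approximation, detail) coefficients."""
        s = math.sqrt(2.0)
        approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
        detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
        return approx, detail

    def haar_inverse(approx, detail):
        """Invert the one-level Haar transform."""
        s = math.sqrt(2.0)
        x = []
        for a, d in zip(approx, detail):
            x.append((a + d) / s)
            x.append((a - d) / s)
        return x

    def enhance(x, threshold=1.0, boost=2.0, cut=0.5):
        """Amplify small detail coefficients, shrink large ones, reconstruct."""
        approx, detail = haar_forward(x)
        detail = [d * boost if abs(d) < threshold else d * cut for d in detail]
        return haar_inverse(approx, detail)
    ```

    With `boost=1.0` and `cut=1.0` the pipeline reduces to a perfect-reconstruction round trip, which is a convenient sanity check before tuning the gains.
    
    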

    Space-Varying Coefficient Models for Diffusion Tensor Imaging using 3d Wavelets

    In this paper, the space-varying coefficient model (SVCM) on the basis of B-splines (Heim et al., 2006) is adapted to wavelet basis functions and re-examined using artificial and real data. For an introduction to diffusion tensor imaging, refer to Heim et al. (2005, Chap. 2). First, wavelet theory is introduced and explained by means of 1d and 2d examples (Sections 1.1–1.3). Section 1.4 is dedicated to the most common thresholding techniques that serve as regularization concepts for wavelet-based models. Prior to application of the 3d wavelet decomposition to the space-varying coefficient fields, the SVCM needs to be rewritten. The necessary steps are outlined in Section 2 together with the incorporation of the positive definiteness constraint using the log-Cholesky parametrization. Section 3 provides a simulation study as well as a comparison with the results obtained through B-splines and standard kernel application. Finally, a real data example is presented and discussed. The theoretical parts are based on the books of Gençay et al. (2002, Chap. 1, 4-6), Härdle et al. (1998), Ogden (1997) and Jansen (2001) if not stated otherwise.
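    The thresholding techniques mentioned as regularization concepts can be sketched with the two standard operators, hard and soft thresholding, applied elementwise to wavelet coefficients. This is a generic sketch of the textbook rules, not the specific scheme chosen in the paper; the threshold value is an arbitrary example.

    ```python
    def hard_threshold(coeffs, t):
        """Keep coefficients whose magnitude exceeds t; zero out the rest."""
        return [c if abs(c) > t else 0.0 for c in coeffs]

    def soft_threshold(coeffs, t):
        """Shrink surviving coefficients toward zero by t (zeroing those below t)."""
        return [(abs(c) - t) * (1.0 if c > 0 else -1.0) if abs(c) > t else 0.0
                for c in coeffs]
    ```

    Soft thresholding additionally shrinks the retained coefficients, which typically yields smoother reconstructions at the cost of some bias.
    
    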

    Parameterisation of M.R. system performance : towards optimised measures of image quality

    This thesis proposes optimal measures for the inter-system comparison of signal properties when assessing the imaging performance of Magnetic Resonance Imaging (MRI) scanners. MRI has become a popular clinical imaging modality and there are many manufacturers producing systems of varying quality. It is essential, therefore, that the performance of each MRI system can be measured and compared. Five criteria have been identified as being of prime importance, namely, the signal-to-noise ratio (SNR), signal non-uniformity, resolution, system-induced ghost artefacts and patient-induced ghost artefacts. The research concentrated directly on the derivation of performance parameters from test object images. For each criterion a specific algorithm has been developed to obtain optimal parameters. For SNR, a method of evaluation has been derived that utilises the Wiener spectrum to distinguish between random and non-random noise in the MR image. The assessment of signal non-uniformity has been improved by applying statistical parameters. The Modulation Transfer Function has been used in the evaluation and comparison of the resolution of MRI systems. Cross-correlation techniques have enabled the fully automatic location and analysis of ghost artefacts in MR test object images. An autocorrelation technique has been created to compare the degree of respiratory motion artefact present in an MR image. All the techniques, wherever possible, have been optimised for speed and automated to eliminate operator dependency. The strength of this thesis lies in the fact that the data used is not simulated: it is actual data gathered with the full support of each manufacturer in the country of origin. This enables truly applicable comparison parameters to be derived, a notable advantage over studies that rely on mathematically created images or on a single system.
    The success of the five parameterisations is demonstrated by performing an inter-system comparison of ten commercially available scanners.
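    The cross-correlation step used to locate ghost artefacts can be illustrated in one dimension: slide a known object profile over a measured signal and take the lag with the highest correlation score. The profiles below are invented toy data; the thesis works on 2-D test object images.

    ```python
    def cross_correlate(signal, template):
        """Cross-correlation score of the template at every valid lag."""
        n, m = len(signal), len(template)
        return [sum(signal[lag + j] * template[j] for j in range(m))
                for lag in range(n - m + 1)]

    def locate(signal, template):
        """Lag at which the template best matches the signal."""
        scores = cross_correlate(signal, template)
        return max(range(len(scores)), key=lambda i: scores[i])
    ```

    In the 2-D case the same idea applies with a double sum over the template window, and the peak of the correlation surface gives the artefact position.
    
    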

    Radon spectrogram-based approach for automatic IFs separation

    The separation of overlapping components is a well-known and difficult problem in multicomponent signal analysis, shared by applications dealing with radar, biosonar, seismic, and audio signals. In order to estimate the instantaneous frequencies of a multicomponent signal, it is necessary to disentangle the signal modes in a proper domain. Unfortunately, if the supports of the signal modes overlap both in time and frequency, separation is only possible through a parametric approach in which the signal class is fixed a priori. In this work, time-frequency analysis and the Radon transform are jointly used for the unsupervised separation of the modes of a generic frequency-modulated signal in a noisy environment. The proposed method takes advantage of the ability of the Radon transform of a proper time-frequency distribution to separate overlapping modes. It consists of a blind segmentation of the signal components in the Radon domain by means of a near-to-optimal thresholding operation. The inversion of the Radon transform on each detected region allows us to isolate the instantaneous frequency curve of each single mode in the time-frequency domain. Experimental results on constant-amplitude chirp signals confirm the effectiveness of the proposed method, opening the way for its extension to more complex frequency-modulated signals.
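    The blind segmentation step can be sketched as thresholding a magnitude map and grouping the surviving bins into connected regions, one per mode. This toy version operates directly on a 2-D array with 4-connectivity; the actual method segments in the Radon domain of a time-frequency distribution, and the map and threshold here are invented for illustration.

    ```python
    from collections import deque

    def segment(mag_map, threshold):
        """Label 4-connected regions of bins whose magnitude exceeds threshold.

        Returns (number_of_regions, label_map), where label 0 means background.
        """
        rows, cols = len(mag_map), len(mag_map[0])
        labels = [[0] * cols for _ in range(rows)]
        count = 0
        for r in range(rows):
            for c in range(cols):
                if mag_map[r][c] > threshold and labels[r][c] == 0:
                    count += 1                      # start a new region
                    labels[r][c] = count
                    queue = deque([(r, c)])
                    while queue:                    # breadth-first flood fill
                        i, j = queue.popleft()
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ni, nj = i + di, j + dj
                            if (0 <= ni < rows and 0 <= nj < cols
                                    and mag_map[ni][nj] > threshold
                                    and labels[ni][nj] == 0):
                                labels[ni][nj] = count
                                queue.append((ni, nj))
        return count, labels
    ```

    Each labelled region is then processed separately; in the paper, inverting the Radon transform region by region recovers one instantaneous-frequency curve per mode.
    
    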

    Deriving probabilistic short-range forecasts from a deterministic high-resolution model

    In order to take full advantage of short-range forecasts from deterministic high-resolution NWP models, the direct model output must be addressed in a probabilistic framework. A promising approach is mesoscale ensemble prediction, but its operational use is still hampered by conceptual deficiencies and large computational costs. This study tackles two relevant issues: (1) the representation of model-related forecast uncertainty in mesoscale ensemble prediction systems and (2) the development of post-processing procedures that retrieve additional probabilistic information from a single model simulation. Special emphasis is laid on the mesoscale forecast uncertainty of summer precipitation and 2m temperature in Europe. The source of forecast guidance is the deterministic high-resolution model Lokal-Modell (LM) of the German Weather Service. The study provides insight into the effect and usefulness of stochastic parametrisation schemes in the representation of short-range forecast uncertainty. A stochastic parametrisation scheme is implemented in the LM in an attempt to simulate the stochastic effect of sub-grid scale processes. Experimental ensembles show that the scheme has a substantial effect on the forecast of precipitation amount. However, objective verification reveals that the ensemble does not attain better forecast quality than a single LM simulation, and urgent issues for future research are identified. In the context of statistical post-processing, two schemes are designed: the neighbourhood method and wavelet smoothing. Both approaches fall under the framework of estimating a large array of statistical parameters on the basis of a single realisation of each parameter. The neighbourhood method is based on the notion of spatio-temporal ergodicity and includes explicit corrections for enhanced predictability from topographic forcing.
    The neighbourhood method derives estimates of quantiles, exceedance probabilities and expected values at each grid point of the LM. If the post-processed precipitation forecast is formulated in terms of probabilities or quantiles, it is clearly superior to the raw model output. Wavelet smoothing originates from the field of image denoising and includes concepts of multiresolution analysis and non-parametric regression. In this study, the method is used to produce estimates of the expected value, but it may easily be extended to the estimation of exceedance probabilities as well. Wavelet smoothing is not only computationally more efficient than the neighbourhood method, but also automatically adapts the amount of spatial smoothing to local properties of the underlying data. The method apparently detects deterministically predictable temperature patterns on the basis of statistical guidance alone.
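    The core of the neighbourhood idea can be sketched as follows: treat the values in a spatial window around each grid point as a pseudo-sample drawn from the local forecast distribution, and derive an exceedance probability and a quantile from it. The window size, grid values and threshold below are illustrative assumptions; the actual scheme also uses the temporal dimension and corrects for topographic forcing.

    ```python
    def neighbourhood_stats(grid, r, c, radius, threshold):
        """Exceedance probability and median from the window around (r, c).

        The window is clipped at the grid boundary, so edge points use a
        smaller pseudo-sample.
        """
        rows, cols = len(grid), len(grid[0])
        sample = [grid[i][j]
                  for i in range(max(0, r - radius), min(rows, r + radius + 1))
                  for j in range(max(0, c - radius), min(cols, c + radius + 1))]
        sample.sort()
        prob = sum(1 for v in sample if v > threshold) / len(sample)
        median = sample[len(sample) // 2]   # upper median of the pseudo-sample
        return prob, median
    ```

    Applying this at every grid point turns a single deterministic field into maps of exceedance probabilities and quantiles, which is exactly the kind of probabilistic product the abstract reports as superior to the raw model output.
    
    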