
    Non-linear Kalman filters for calibration in radio interferometry

    We present a new calibration scheme based on a non-linear version of the Kalman filter that aims at estimating the physical terms appearing in the Radio Interferometry Measurement Equation (RIME). We enrich the filter's structure with a tunable data representation model, together with an augmented measurement model for regularization. We show using simulations that it can properly estimate the physical effects appearing in the RIME. We find that this approach is particularly useful in the most extreme cases, such as when ionospheric and clock effects are simultaneously present. Combined with the ability to provide prior knowledge on the expected structure of the physical instrumental effects (expected physical state and dynamics), we obtain a fairly cheap algorithm that we believe to be robust, especially in the low signal-to-noise regime. The use of filters and similar methods can potentially improve calibration in radio interferometry, provided the effects corrupting the visibilities are understood and analytically stable. Recursive algorithms are particularly well suited to pre-calibration and sky-model estimation in a streaming fashion. This may be useful for SKA-type instruments, which produce huge volumes of data that must be calibrated before being averaged.
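    As a rough illustration of the recursive-filtering idea, the following Python sketch runs one predict/update cycle of an extended Kalman filter on a toy nonlinear measurement model. The model h(), its Jacobian, the dynamics F and the covariances Q and R are placeholders chosen for the example, not the RIME terms or the augmented measurement model described in the paper.

```python
# Extended Kalman filter, one predict/update cycle (toy stand-in for
# RIME-style gain calibration; h, H_jac, F, Q, R are illustrative only).
import numpy as np

def ekf_step(x, P, z, h, H_jac, F, Q, R):
    """One predict/update cycle of an extended Kalman filter."""
    # Predict: propagate state and covariance through the (linear) dynamics F
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: linearize the measurement model around the prediction
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy example: estimate two "gain" parameters from a nonlinear measurement
h = lambda x: np.array([x[0] * x[1], x[0] ** 2])
H_jac = lambda x: np.array([[x[1], x[0]], [2 * x[0], 0.0]])
x, P = np.array([1.0, 1.0]), np.eye(2)
F, Q, R = np.eye(2), 1e-4 * np.eye(2), 1e-2 * np.eye(2)
z = np.array([2.1, 1.9])                  # one noisy measurement
x, P = ekf_step(x, P, z, h, H_jac, F, Q, R)
```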

    Sparsity driven ultrasound imaging

    An image formation framework for ultrasound imaging from synthetic transducer arrays, based on sparsity-driven regularization functionals and single-frequency Fourier-domain data, is proposed. The framework involves the use of a physics-based forward model of the ultrasound observation process, the formulation of image formation as the solution of an associated optimization problem, and the solution of that problem through efficient numerical algorithms. The sparsity-driven, model-based approach estimates a complex-valued reflectivity field and preserves physical features in the scene while suppressing spurious artifacts. It also provides robust reconstructions in the case of sparse and reduced observation apertures. The effectiveness of the proposed imaging strategy is demonstrated using experimental data.
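    A minimal sketch of the sparsity-driven, optimization-based formulation is given below: an iterative shrinkage-thresholding (ISTA) loop recovering a complex-valued reflectivity vector under an l1 penalty. The random forward operator, regularization weight and iteration count are assumptions made for illustration; the paper's physics-based ultrasound forward model and solver are not reproduced.

```python
# ISTA for min_x 0.5*||A x - y||^2 + lam*||x||_1 with complex-valued x
# (toy forward operator; illustrative only).
import numpy as np

def ista(A, y, lam, n_iter=200):
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1], dtype=complex)
    for _ in range(n_iter):
        grad = A.conj().T @ (A @ x - y)      # gradient of the data-fit term
        z = x - grad / L
        mag = np.abs(z)
        # complex soft-thresholding: shrink magnitudes, keep phases
        x = np.where(mag > lam / L, (1 - lam / (L * mag + 1e-12)) * z, 0)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 128)) + 1j * rng.standard_normal((64, 128))
x_true = np.zeros(128, dtype=complex)
x_true[[5, 40, 90]] = [1 + 1j, -2.0, 0.5j]   # sparse reflectivity field
y = A @ x_true + 0.01 * rng.standard_normal(64)
x_hat = ista(A, y, lam=0.5)
```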

    Bibliometric cartography of information retrieval research by using co-word analysis

    The aim of this study is to map the intellectual structure of the field of Information Retrieval (IR) during the period 1987-1997. Co-word analysis was employed to reveal patterns and trends in the IR field by measuring the association strengths of terms representative of relevant publications or other texts produced in the field. Data were collected from the Science Citation Index (SCI) and the Social Science Citation Index (SSCI) for the period 1987-1997. In addition to the keywords assigned by the SCI and SSCI databases, other important keywords were extracted manually from titles and abstracts. These keywords were further standardized using vocabulary control tools. In order to trace the dynamic changes in the IR field, the whole 11-year span was divided into two consecutive periods: 1987-1991 and 1992-1997. The results show that the IR field has some established research themes while also changing rapidly to embrace new ones.
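    To make the co-word idea concrete, the toy Python snippet below counts keyword co-occurrences over a few records and computes a normalized association strength (the equivalence index e_ij = c_ij^2 / (c_i * c_j) commonly used in co-word studies). The sample records and the choice of index are illustrative; the study's actual keyword set, normalization and clustering are not reproduced here.

```python
# Toy co-word analysis: keyword co-occurrence counts and association strengths.
from collections import Counter
from itertools import combinations

records = [                                   # keyword sets of three fake papers
    {"information retrieval", "relevance feedback", "indexing"},
    {"information retrieval", "query expansion"},
    {"indexing", "information retrieval", "query expansion"},
]
term_freq = Counter(t for rec in records for t in rec)
pair_freq = Counter(frozenset(p) for rec in records
                    for p in combinations(sorted(rec), 2))

# Equivalence index e_ij = c_ij^2 / (c_i * c_j) for every co-occurring pair
equivalence = {
    tuple(sorted(pair)): c ** 2 / (term_freq[min(pair)] * term_freq[max(pair)])
    for pair, c in pair_freq.items()
}
```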

    Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    Automated source extraction and parameterization represents a crucial challenge for next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper we present a new algorithm, dubbed CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parameterization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR as a modular software library, which also includes different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australia Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the ASKAP-EMU survey. The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane, and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.
    Comment: 15 pages, 9 figures.
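    The sketch below illustrates the overall flow on a synthetic map (denoise, suppress the background, group pixels into superpixels, keep bright regions) using scikit-image's SLIC segmentation. It is a rough stand-in for illustration only, not the CAESAR pipeline or its adaptive superpixel clustering, and it assumes scikit-image >= 0.19 for the channel_axis argument.

```python
# Illustrative superpixel-based extraction of an extended source from a noisy map.
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import slic

rng = np.random.default_rng(1)
radio_map = rng.normal(0.0, 1.0, (256, 256))      # toy noise-only map
radio_map[100:120, 100:130] += 5.0                # add a fake extended source

smoothed = gaussian(radio_map, sigma=2)           # pre-filtering / denoising
background = np.median(smoothed)                  # crude background estimate
residual = np.clip(smoothed - background, 0, None)
residual /= residual.max() + 1e-12                # normalize to [0, 1] for SLIC

labels = slic(residual, n_segments=200, compactness=0.1,
              channel_axis=None, start_label=1)   # superpixel segmentation
# Keep superpixels whose mean brightness stands out from the residual map
bright = [l for l in np.unique(labels)
          if residual[labels == l].mean() > 3 * residual.std()]
```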

    Complex Independent Component Analysis of Frequency-Domain Electroencephalographic Data

    Independent component analysis (ICA) has proven useful for modeling brain and electroencephalographic (EEG) data. Here, we present a new, generalized method to better capture the dynamics of brain signals than previous ICA algorithms. We regard EEG sources as eliciting spatio-temporal activity patterns, corresponding to, e.g., trajectories of activation propagating across cortex. This leads to a model of convolutive signal superposition, in contrast with the commonly used instantaneous mixing model. In the frequency domain, convolutive mixing is equivalent to multiplicative mixing of complex signal sources within distinct spectral bands. We decompose the recorded spectral-domain signals into independent components by a complex infomax ICA algorithm. First results from a visual attention EEG experiment exhibit (1) sources of spatio-temporal dynamics in the data, (2) links to subject behavior, (3) sources with a limited spectral extent, and (4) a higher degree of independence compared to sources derived by standard ICA.
    Comment: 21 pages, 11 figures. Added final journal reference, fixed minor typo.
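    The snippet below sketches only the frequency-domain framing: a short-time Fourier transform turns convolutive mixing in time into approximately multiplicative mixing of complex coefficients within each frequency bin, and each bin's channels-by-frames matrix (whitened here as standard preprocessing) is what a complex ICA would then decompose. The toy data and the whitening step are assumptions; the complex infomax algorithm itself is not implemented.

```python
# Frequency-domain setup for bin-wise complex ICA (preprocessing only).
import numpy as np
from scipy.signal import stft

fs, n_ch, n_samp = 250, 8, 5000
rng = np.random.default_rng(2)
eeg = rng.standard_normal((n_ch, n_samp))      # toy multichannel "EEG"

f, t, Z = stft(eeg, fs=fs, nperseg=256)        # Z: (channels, freqs, frames)

def whiten(X):
    """Whiten the complex channels-by-frames matrix of one frequency bin."""
    X = X - X.mean(axis=1, keepdims=True)
    C = (X @ X.conj().T) / X.shape[1]          # channel covariance
    d, E = np.linalg.eigh(C)
    W = E @ np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12))) @ E.conj().T
    return W @ X

# Per-bin data that a complex ICA (e.g., complex infomax) would decompose
per_bin = [whiten(Z[:, k, :]) for k in range(Z.shape[1])]
```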

    Blind deconvolution of medical ultrasound images: parametric inverse filtering approach

    DOI: 10.1109/TIP.2007.910179
    The problem of reconstruction of ultrasound images by means of blind deconvolution has long been recognized as one of the central problems in medical ultrasound imaging. In this paper, this problem is addressed by proposing a blind deconvolution method which is innovative in several ways. In particular, the method is based on parametric inverse filtering, whose parameters are optimized using two-stage processing. At the first stage, some partial information on the point spread function is recovered. Subsequently, this information is used to explicitly constrain the spectral shape of the inverse filter. From this perspective, the proposed methodology can be viewed as a "hybridization" of two standard strategies in blind deconvolution, which are based on either concurrent or successive estimation of the point spread function and the image of interest. Moreover, evidence is provided that the "hybrid" approach can outperform the standard ones in a number of important practical cases. Additionally, the present study introduces a different approach to parameterizing the inverse filter. Specifically, we propose to model the inverse transfer function as a member of a principal shift-invariant subspace. It is shown that such a parameterization results in considerably more stable reconstructions as compared to standard parameterization methods. It is also shown how the inverse filters designed in this way can be used to deconvolve the images in a nonblind manner so as to further improve their quality. The usefulness and practicability of all the introduced innovations are proven in a series of both in silico and in vivo experiments. Finally, it is shown that the proposed deconvolution algorithms are capable of improving the resolution of ultrasound images by factors of 2.24 or 6.52 (as judged by the autocorrelation criterion), depending on the type of regularization method used.
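    For context, the sketch below shows only the final, nonblind step mentioned in the abstract: applying a regularized (Wiener-style) inverse filter in the Fourier domain once a point-spread-function estimate is available. The separable PSF, regularization constant and toy image are assumptions; the paper's two-stage parametric estimation of the inverse filter is not reproduced.

```python
# Regularized inverse (Wiener-style) filtering given a PSF estimate.
import numpy as np

def wiener_deconvolve(image, psf, reg=1e-2):
    """Deconvolve `image` with `psf` using a regularized inverse filter."""
    H = np.fft.fft2(psf, s=image.shape)               # PSF transfer function
    G = np.fft.fft2(image)
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + reg)   # conj(H)/(|H|^2 + reg)
    return np.real(np.fft.ifft2(F_hat))

rng = np.random.default_rng(3)
x = np.zeros((64, 64))
x[20:24, 30:34] = 1.0                                 # toy reflectivity map
psf = np.outer(np.hanning(9), np.hanning(9))
psf /= psf.sum()
blurred = np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(psf, s=x.shape)))
restored = wiener_deconvolve(blurred + 0.01 * rng.standard_normal(x.shape), psf)
```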

    Real-time terahertz imaging with a single-pixel detector

    Terahertz (THz) radiation is poised to have an essential role in many imaging applications, from industrial inspections to medical diagnosis. However, commercialization is prevented by impractical and expensive THz instrumentation. Single-pixel cameras have emerged as alternatives to multi-pixel cameras due to reduced costs and superior durability. Here, by optimizing the modulation geometry and post-processing algorithms, we demonstrate the acquisition of a THz video (32 × 32 pixels at 6 frames per second), shown in real time, using a single-pixel fiber-coupled photoconductive THz detector. A laser diode with a digital micromirror device shining visible light onto silicon acts as the spatial THz modulator. We mathematically account for the temporal response of the system, reduce noise with a lock-in-free carrier-wave modulation, and realize quick, noise-robust image undersampling. Since our modifications do not impose intricate manufacturing, require long post-processing, or sacrifice the time-resolving capabilities of THz spectrometers (their greatest asset), this work has the potential to serve as a foundation for all future single-pixel THz imaging systems.
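    The toy sketch below illustrates the single-pixel measurement model: each detector reading is the scene's inner product with one binary mask pattern, and with fewer masks than pixels the image is recovered by regularized least squares. The random masks, the 2x undersampling factor and the ridge recovery are assumptions made for illustration; the paper's carrier-wave demodulation, temporal-response correction and actual reconstruction algorithm are not modelled.

```python
# Single-pixel imaging toy model: masked measurements + ridge-regularized recovery.
import numpy as np

n = 32                                        # 32 x 32 scene, as in the abstract
rng = np.random.default_rng(4)
scene = np.zeros((n, n))
scene[10:20, 12:22] = 1.0                     # toy object

m = (n * n) // 2                              # 2x undersampling (an assumption)
masks = rng.integers(0, 2, (m, n * n)).astype(float)
y = masks @ scene.ravel() + 0.01 * rng.standard_normal(m)   # detector signal

# Ridge recovery: argmin_x ||masks @ x - y||^2 + lam * ||x||^2
lam = 1.0
A = np.vstack([masks, np.sqrt(lam) * np.eye(n * n)])
b = np.concatenate([y, np.zeros(n * n)])
x_hat = np.linalg.lstsq(A, b, rcond=None)[0].reshape(n, n)
```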

    Sparsity and adaptivity for the blind separation of partially correlated sources

    Blind source separation (BSS) is a very popular technique for analyzing multichannel data. In this context, the data are modeled as a linear combination of sources to be retrieved. For that purpose, standard BSS methods all rely on some discrimination principle, whether it is statistical independence or morphological diversity, to distinguish between the sources. However, dealing with real-world data reveals that such assumptions are rarely valid in practice: the signals of interest are more likely partially correlated, which generally hampers the performance of standard BSS methods. In this article, we introduce a novel sparsity-enforcing BSS method coined Adaptive Morphological Component Analysis (AMCA), which is designed to retrieve sparse and partially correlated sources. More precisely, it exploits an adaptive re-weighting scheme to favor or penalize samples based on their level of correlation. Extensive numerical experiments show that the proposed method is robust to the partial correlation of the sources, while standard BSS techniques fail. The AMCA algorithm is evaluated in the field of astrophysics for the separation of physical components from microwave data.
    Comment: submitted to IEEE Transactions on Signal Processing.
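    The fragment below sketches the adaptive re-weighting idea in spirit: samples where several sources are simultaneously active are down-weighted before the mixing matrix is re-estimated by weighted least squares. The particular weighting rule and the single update step are assumptions made for illustration, not the actual AMCA scheme.

```python
# One re-weighted mixing-matrix update for X ~ A S with sparse sources S.
import numpy as np

def reweighted_mixing_update(X, S, eps=1e-6):
    """X: (channels, samples) data; S: (sources, samples) current source estimate."""
    a = np.abs(S)
    # Crude per-sample "correlation level": ~1 if one source dominates,
    # close to the number of sources if all are equally active.
    activity = a.sum(axis=0) / (a.max(axis=0) + eps)
    w = 1.0 / (activity + eps)                 # penalize correlated samples
    Xw, Sw = X * w, S * w                      # apply weights column-wise
    A = Xw @ Sw.T @ np.linalg.pinv(Sw @ Sw.T)  # weighted least-squares update
    return A / (np.linalg.norm(A, axis=0, keepdims=True) + eps)

rng = np.random.default_rng(5)
S = rng.laplace(size=(3, 1000)) * (rng.random((3, 1000)) < 0.1)  # sparse sources
A_true = rng.standard_normal((8, 3))
X = A_true @ S + 0.01 * rng.standard_normal((8, 1000))
A_est = reweighted_mixing_update(X, S)
```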