
    Using baseline-dependent window functions for data compression and field-of-interest shaping in radio interferometry

    In radio interferometry, observed visibilities are intrinsically sampled at some interval in time and frequency. Modern interferometers are capable of producing data at very high time and frequency resolution; practical limits on storage and computation costs require that some form of data compression be imposed. The traditional form of compression is a simple averaging of the visibilities over coarser time and frequency bins. This has an undesired side effect: the resulting averaged visibilities "decorrelate", and do so differently depending on the baseline length and averaging interval. This translates into a non-trivial signature in the image domain known as "smearing", which manifests itself as an attenuation in amplitude towards off-centre sources. With the increasing fields of view and/or longer baselines employed in modern and future instruments, the trade-off between data rate and smearing becomes increasingly unfavourable. In this work we investigate alternative approaches to low-loss data compression. We show that averaging of the visibility data can be treated as a form of convolution by a boxcar-like window function, and that by employing alternative baseline-dependent window functions a more optimal interferometer smearing response may be induced. In particular, we show improved amplitude response over a chosen field of interest, and better attenuation of sources outside the field of interest. The main cost of this technique is a reduction in nominal sensitivity; we investigate the smearing vs. sensitivity trade-off, and show that in certain regimes a favourable compromise can be achieved. We show the application of this technique to simulated data from the Karl G. Jansky Very Large Array (VLA) and the European Very Long Baseline Interferometry Network (EVN).
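
    The core idea admits a compact illustration. The following is a minimal sketch, not the authors' implementation: block-averaging visibilities is a dot product with a boxcar window, and swapping in a different window (a sinc taper here, chosen purely for illustration; a real scheme would make the window baseline-dependent) changes the smearing response to the rotating fringe of an off-centre source.

        # Toy sketch: visibility compression as a windowed block average.
        import numpy as np

        def compress(vis, factor, window):
            """Average `vis` in blocks of `factor` samples, weighting each
            block by `window` (normalised to unit sum)."""
            w = window / window.sum()
            n = (len(vis) // factor) * factor
            return vis[:n].reshape(-1, factor) @ w

        t = np.arange(4096)                         # fine-grained time samples
        fringe = np.exp(2j * np.pi * 0.01 * t)      # off-centre source: rotating phase

        factor = 64
        boxcar = np.ones(factor)                    # traditional averaging
        sinc_w = np.sinc(np.linspace(-1.5, 1.5, factor))  # illustrative alternative

        # The two windows induce different amplitude (smearing) responses.
        print("boxcar response:", np.abs(compress(fringe, factor, boxcar)).mean())
        print("sinc response:  ", np.abs(compress(fringe, factor, sinc_w)).mean())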

    Coseismic surface deformation from air photos: The Kickapoo step over in the 1992 Landers rupture

    Coseismic deformation of the ground can be measured from aerial views taken before and after an earthquake. We chose the area of the Kickapoo-Landers step over along the 1992 Landers earthquake zone, using air photos (scale 1:40,000) scanned at 0.4 m resolution. Two photos acquired after the earthquake are used to assess the accuracy and to evaluate various sources of noise. Optical distortions, film deformation, scanning errors, or errors in viewing parameters can yield metre-level biases at wavelengths larger than 1 km. The offset field at shorter wavelengths is more reliable and is mainly affected by temporal decorrelation of the images induced by changes in radiometry with time. Temporal decorrelation and the resulting uncertainty on the offsets are estimated locally from the degree of correlation between the images. Relative surface displacements are measured independently about every 15 m, with uncertainty typically below 10 cm (RMS). The offset field reveals most of the surface ruptures mapped in the field. The fault slip is accurate to about 7 cm (RMS) and is measured independently every 200 m from stacked profiles. The slip distribution compares well with field measurements at the kilometre scale but reveals local discrepancies, suggesting that deformation is generally, although not systematically, localized on the major fault zone identified in the field. This type of data can provide useful insight into the fault zone's mechanical properties. Our measurements indicate that elastic coseismic strain near the fault zone can be as large as 0.5 × 10^(−3), while anelastic yielding was attained for strain in excess of about 1–2 × 10^(−3).
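
    The measurement principle lends itself to a short sketch (a toy version, not the authors' processing chain): a sub-sample offset between two signals is estimated by cross-correlation, with a parabolic fit refining the integer peak. The air-photo work applies the same idea in 2-D with careful noise modelling.

        # Toy sketch: sub-sample offset estimation via cross-correlation
        # with parabolic refinement of the correlation peak.
        import numpy as np

        def subpixel_shift_1d(a, b):
            """Estimate the shift of `b` relative to `a` (1-D signals)."""
            a = a - a.mean()
            b = b - b.mean()
            corr = np.correlate(b, a, mode="full")
            k = np.argmax(corr)
            shift = k - (len(a) - 1)
            # Parabolic interpolation around the integer peak -> sub-sample part.
            if 0 < k < len(corr) - 1:
                c0, c1, c2 = corr[k - 1], corr[k], corr[k + 1]
                shift += 0.5 * (c0 - c2) / (c0 - 2 * c1 + c2)
            return shift

        rng = np.random.default_rng(0)
        x = np.linspace(0, 8 * np.pi, 512)
        a = np.sin(x) + 0.05 * rng.normal(size=512)
        b = np.roll(np.sin(x), 3) + 0.05 * rng.normal(size=512)  # true shift: 3
        print("estimated shift:", subpixel_shift_1d(a, b))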

    Localisation of mobile nodes in wireless networks with correlated in time measurement noise

    Wireless sensor networks are an inherent part of decision making, object tracking and location awareness systems. This work focuses on the simultaneous localisation of mobile nodes based on received signal strength indicators (RSSIs) with measurement noise that is correlated in time. Two approaches for dealing with the correlated measurement noise are proposed within the framework of auxiliary particle filtering: the first augments the state vector with the noise, and the second implements noise decorrelation. The performance of the two proposed multi-model auxiliary particle filters (MM AUX-PFs) is validated on simulated and real RSSI data, and high localisation accuracy is demonstrated.
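
    The decorrelation idea can be illustrated in a few lines. Assuming the measurement noise follows a first-order autoregressive model v[k] = ρ·v[k−1] + w[k] with white w and known ρ (assumptions made here for brevity), differencing successive measurements whitens the noise:

        # Toy sketch: AR(1) measurement-noise decorrelation by differencing.
        import numpy as np

        rng = np.random.default_rng(0)
        rho, n = 0.9, 5000
        w = rng.normal(0.0, 1.0, n)

        v = np.zeros(n)                       # AR(1)-correlated measurement noise
        for k in range(1, n):
            v[k] = rho * v[k - 1] + w[k]

        signal = np.zeros(n)                  # constant "true" RSSI for simplicity
        z = signal + v                        # raw measurements
        z_dec = z[1:] - rho * z[:-1]          # decorrelated pseudo-measurements

        def lag1(x):
            x = x - x.mean()
            return (x[:-1] @ x[1:]) / (x @ x)

        print("lag-1 autocorrelation, raw:         ", round(lag1(z), 3))
        print("lag-1 autocorrelation, decorrelated:", round(lag1(z_dec), 3))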

    System approach to robust acoustic echo cancellation through semi-blind source separation based on independent component analysis

    We live in a dynamic world full of noises and interferences. The conventional acoustic echo cancellation (AEC) framework, based on the least mean square (LMS) algorithm by itself, lacks the ability to handle the many secondary signals that interfere with the adaptive filtering process, e.g., local speech and background noise. In this dissertation, we build a foundation for what we refer to as the system approach to signal enhancement as we focus on the AEC problem. We first propose the residual echo enhancement (REE) technique, which utilizes the error recovery nonlinearity (ERN) to "enhance" the filter estimation error prior to the filter adaptation. The single-channel AEC problem can be viewed as a special case of semi-blind source separation (SBSS) where one of the source signals is partially known, i.e., the far-end microphone signal that generates the near-end acoustic echo. SBSS optimized via independent component analysis (ICA) leads to the system combination of the LMS algorithm with the ERN, which allows for continuous and stable adaptation even during double talk. Second, we extend the system perspective to the decorrelation problem for AEC, where we show that the REE procedure can be applied effectively in a multi-channel AEC (MCAEC) setting to indirectly assist the recovery of AEC performance lost to inter-channel correlation, known generally as the "non-uniqueness" problem. We develop a novel, computationally efficient technique of frequency-domain resampling (FDR) that alleviates the non-uniqueness problem directly while introducing minimal distortion to signal quality and statistics. We also apply the system approach to the multi-delay filter (MDF), which suffers from the inter-block correlation problem. Finally, we generalize the MCAEC problem in the SBSS framework and discuss many issues related to the implementation of an SBSS system. We propose a constrained batch-online implementation of SBSS that stabilizes the convergence behavior even in the worst-case scenario of a single far-end talker along with the non-uniqueness condition on the far-end mixing system. The proposed techniques are developed from a pragmatic standpoint, motivated by real-world problems in acoustic and audio signal processing. Generalizing the orthogonality principle to the system level of an AEC problem allows us to relate AEC to source separation that seeks to maximize the independence, and hence implicitly the orthogonality, not only between the error signal and the far-end signal, but among all the signals involved. The system approach, for which the REE paradigm is just one realization, makes it possible to encompass many traditional signal enhancement techniques in an analytically consistent yet practically effective manner for solving the enhancement problem in a very noisy and disruptive acoustic mixing environment.
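
    To make the "enhance the error before adapting" idea concrete, here is a hedged sketch of a normalised-LMS echo canceller in which a nonlinearity is applied to the residual prior to the update. The tanh choice is one common ICA-motivated nonlinearity and merely stands in for the dissertation's ERN, whose exact form may differ.

        # Toy sketch: NLMS echo canceller with an error nonlinearity
        # applied before the filter update.
        import numpy as np

        def nlms_aec(far, mic, taps=128, mu=0.5, eps=1e-6, use_ern=True):
            h = np.zeros(taps)                        # adaptive echo-path estimate
            e = np.zeros(len(mic))
            for n in range(taps, len(mic)):
                x = far[n - taps + 1:n + 1][::-1]     # far-end reference window
                e[n] = mic[n] - h @ x                 # residual after cancellation
                g = np.tanh(e[n]) if use_ern else e[n]  # error "enhancement"
                h += mu * g * x / (x @ x + eps)       # normalised LMS update
            return e

        rng = np.random.default_rng(1)
        far = rng.normal(0, 1, 20000)
        echo_path = rng.normal(0, 1, 128) * np.exp(-np.arange(128) / 20.0)
        mic = np.convolve(far, echo_path)[:20000] + 0.01 * rng.normal(0, 1, 20000)

        res = nlms_aec(far, mic)
        print("echo reduction (dB):",
              10 * np.log10(np.var(mic[-4000:]) / np.var(res[-4000:])))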

    Measuring ground displacements from SAR amplitude images: Application to the Landers Earthquake

    ERS SAR amplitude images are utilized to map ground displacements using a sub-pixel correlation method. It yields a two-dimensional ground displacement field with independent measurements about every 128 m in azimuth and 250 m in range. The accuracy depends on the characteristics of the images. For the Landers test case, the 1-σ uncertainty is 0.8 m in range and 0.4 m in azimuth. We show that this measurement provides a map of major surface fault ruptures accurate to better than 1 km and information on coseismic deformation comparable to the 92 GPS measurements available. Although less accurate, this technique is more robust than SAR interferometry and provides complementary information, since interferograms are only sensitive to the displacement in range.
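
    For readers who want to experiment with the underlying principle, an off-the-shelf routine such as scikit-image's phase_cross_correlation (not the implementation used in this work) recovers sub-pixel 2-D offsets between amplitude images in the same spirit:

        # Toy sketch: sub-pixel 2-D offset recovery with an off-the-shelf routine.
        import numpy as np
        from scipy.ndimage import gaussian_filter, shift as nd_shift
        from skimage.registration import phase_cross_correlation

        rng = np.random.default_rng(2)
        ref = gaussian_filter(rng.normal(size=(256, 256)), 2.0)  # smooth toy "image"
        mov = nd_shift(ref, (1.3, -2.7), order=3)                # known sub-pixel offset

        est, err, _ = phase_cross_correlation(ref, mov, upsample_factor=100)
        # `est` is the shift registering `mov` onto `ref`: roughly (-1.3, 2.7).
        print("estimated offset (rows, cols):", est)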

    Speckle Statistics in Adaptively Corrected Images

    (abridged) Imaging observations are generally affected by a fluctuating background of speckles, a particular problem when detecting faint stellar companions at small angular separations. Knowing the distribution of the speckle intensities at a given location in the image plane is important for understanding the noise limits of companion detection. The speckle noise limit in a long-exposure image is characterized by the intensity variance and the speckle lifetime. In this paper we address the former quantity through the distribution function of speckle intensity. Previous theoretical work has predicted a form for this distribution function at a single location in the image plane. We developed a fast readout mode to take short exposures of stellar images corrected by adaptive optics at the ground-based UCO/Lick Observatory, with integration times of 5 ms and a time between successive frames of 14.5 ms (λ = 2.2 μm). These observations temporally oversample and spatially Nyquist sample the observed speckle patterns. We show that, for various locations in the image plane, the observed distribution of speckle intensities is consistent with the predicted form. Additionally, we demonstrate a method by which I_c and I_s, the coherent and speckle components of the intensity, can be mapped over the image plane. As the quantity I_c is proportional to the PSF of the telescope free of random atmospheric aberrations, this method can be used for PSF calibration and reconstruction.
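
    The predicted form in question is, on the standard theory, the modified Rician distribution, p(I) = exp(−(I + I_c)/I_s) · I_0(2√(I·I_c)/I_s) / I_s, with I_c the coherent and I_s the speckle intensity. A small sketch evaluating it with a numerically stable Bessel term (parameter values are arbitrary):

        # Toy sketch: the modified Rician speckle-intensity density.
        import numpy as np
        from scipy.special import i0e
        from scipy.integrate import trapezoid

        def modified_rician_pdf(I, Ic, Is):
            """Modified Rician density for speckle intensity I >= 0."""
            z = 2.0 * np.sqrt(I * Ic) / Is
            # i0e(z) = exp(-z) * I0(z); fold the exp(z) back into the exponent.
            return np.exp(-(I + Ic) / Is + z) * i0e(z) / Is

        I = np.linspace(0.0, 20.0, 2001)
        pdf = modified_rician_pdf(I, Ic=4.0, Is=2.0)
        print("normalisation (should be ~1):   ", trapezoid(pdf, I))
        print("mean (should be Ic + Is = 6):   ", trapezoid(I * pdf, I))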

    Data compression, field of interest shaping and fast algorithms for direction-dependent deconvolution in radio interferometry

    In radio interferometry, observed visibilities are intrinsically sampled at some interval in time and frequency. Modern interferometers are capable of producing data at very high time and frequency resolution; practical limits on storage and computation costs require that some form of data compression be imposed. The traditional form of compression is simple averaging of the visibilities over coarser time and frequency bins. This has an undesired side effect: the resulting averaged visibilities “decorrelate”, and do so differently depending on the baseline length and averaging interval. This translates into a non-trivial signature in the image domain known as “smearing”, which manifests itself as an attenuation in amplitude towards off-centre sources. With the increasing fields of view and/or longer baselines employed in modern and future instruments, the trade-off between data rate and smearing becomes increasingly unfavourable. Averaging also results in a baseline-length- and position-dependent point spread function (PSF). In this work, we investigate alternative approaches to low-loss data compression. We show that averaging of the visibility data can be understood as a form of convolution by a boxcar-like window function, and that by employing alternative baseline-dependent window functions a more optimal interferometer smearing response may be induced. Specifically, we can improve amplitude response over a chosen field of interest and attenuate sources outside the field of interest. The main cost of this technique is a reduction in nominal sensitivity; we investigate the smearing vs. sensitivity trade-off and show that in certain regimes a favourable compromise can be achieved. We show the application of this technique to simulated data from the Jansky Very Large Array and the European Very Long Baseline Interferometry Network. Furthermore, we show that the position-dependent PSF shape induced by averaging can be approximated using linear algebraic properties to effectively reduce the computational complexity for evaluating the PSF at each sky position. We conclude by implementing a position-dependent PSF deconvolution in an imaging and deconvolution framework. Using the Low-Frequency Array (LOFAR) radio interferometer, we show that deconvolution with position-dependent PSFs results in higher image fidelity compared to a simple CLEAN algorithm and its derivatives.
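
    One way to read "approximated using linear algebraic properties" is as a low-rank factorisation; whether this matches the thesis's exact construction is an assumption made here. The sketch below samples PSFs at many positions and shows that a truncated SVD compresses them to a few basis images plus per-position coefficients:

        # Toy sketch: low-rank compression of position-dependent PSFs via SVD.
        import numpy as np

        rng = np.random.default_rng(3)
        npos, npix = 200, 64 * 64

        # Stand-in for PSFs that vary smoothly with sky position: a fixed
        # core plus small position-dependent perturbations (purely synthetic).
        core = rng.normal(size=npix)
        modes = rng.normal(size=(3, npix))
        coeffs = rng.normal(size=(npos, 3)) * [0.3, 0.1, 0.03]
        psfs = core + coeffs @ modes                  # (npos, npix) matrix of PSFs

        U, s, Vt = np.linalg.svd(psfs, full_matrices=False)
        r = 4                                          # truncation rank
        psfs_r = (U[:, :r] * s[:r]) @ Vt[:r]           # rank-r reconstruction

        rel_err = np.linalg.norm(psfs - psfs_r) / np.linalg.norm(psfs)
        print(f"rank-{r} relative error: {rel_err:.2e}")
        print(f"storage: {npos * npix} values -> {r * (npos + npix)}")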