
    Radar Coincidence Imaging for Off-Grid Target Using Frequency-Hopping Waveforms

    Radar coincidence imaging (RCI) is a high-resolution staring imaging technique that does not depend on relative motion between radar and target. To achieve better imaging performance, sparse reconstruction is commonly used, but its performance rests on the assumption that the scatterers are located at the prediscretized grid-cell centers; otherwise, off-grid error emerges and the performance of RCI degrades significantly. In this paper, RCI using frequency-hopping (FH) waveforms is considered. The off-grid effects are analyzed, and the corresponding constrained Cramér-Rao bound (CCRB) is derived based on the mean square error (MSE) of the “oracle” estimator. For off-grid RCI, the process consists of two stages, grid matching and off-grid error (OGE) calibration, for which two-dimensional (2D) band-excluded locally optimized orthogonal matching pursuit (BLOOMP) and alternating iteration minimization (AIM) algorithms are proposed, respectively. Unlike traditional sparse recovery methods, BLOOMP performs recovery on refined grids, overcoming the shortcomings of a coherent dictionary, and is robust to noise and OGE. The AIM calibration algorithm adaptively adjusts the OGE while seeking the optimal target reconstruction result.
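
    The paper's 2D BLOOMP and AIM algorithms are not spelled out in the abstract, but both build on greedy sparse recovery over the linear RCI model y = Φx. As a point of reference only, here is a minimal sketch of plain orthogonal matching pursuit (OMP), the baseline that BLOOMP refines; the random complex matrix standing in for the FH reference matrix and all dimensions are illustrative assumptions, not the paper's model.

    import numpy as np

    def omp(Phi, y, sparsity):
        """Plain orthogonal matching pursuit: greedily pick the dictionary
        column most correlated with the residual, then refit by least squares."""
        residual = y.copy()
        support = []
        x = np.zeros(Phi.shape[1], dtype=complex)
        for _ in range(sparsity):
            idx = int(np.argmax(np.abs(Phi.conj().T @ residual)))
            support.append(idx)
            coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
            residual = y - Phi[:, support] @ coeffs
        x[support] = coeffs
        return x

    # Toy demo: a 3-sparse scene on a 64-cell grid observed through a
    # random 32x64 complex matrix standing in for the FH reference matrix.
    rng = np.random.default_rng(0)
    Phi = rng.standard_normal((32, 64)) + 1j * rng.standard_normal((32, 64))
    x_true = np.zeros(64, dtype=complex)
    x_true[[5, 20, 41]] = [1.0, 0.7, 0.5]
    x_hat = omp(Phi, Phi @ x_true, sparsity=3)  # recovers the three scatterers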

    Waveform Analysis and Optimization for Radar Coincidence Imaging with Modeling Error

    RCI is a novel superresolution staring imaging technique based on wavefront modulation and a temporal-spatial stochastic radiation field. For RCI, the reference matrix must be known accurately, and the imaging performance depends on the incoherence of the reference matrix. Unfortunately, modeling error, which degrades performance significantly, generally exists. In this paper, RCI using frequency-hopping waveforms (FH-RCI) is considered, and an FH code design method that increases the robustness of RCI to modeling error is proposed. First, we derive an upper bound on the imaging error of RCI with modeling error and conclude that the condition number of the reference matrix determines the imaging performance. Then the objective function for waveform design, which minimizes the condition number of the reference matrix, is formulated, and quantum simulated annealing (QSA) is employed to optimize the FH code. Numerical simulations show that the optimized FH code decreases the condition number of the reference matrix and improves the imaging performance of RCI with modeling error.
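
    The abstract gives neither the QSA optimizer nor the exact FH signal model, so the sketch below substitutes classical simulated annealing and a simplified one-tone-per-sample reference matrix purely to illustrate the stated design criterion: search over FH codes for one whose reference matrix has a small condition number. The reference_matrix model, cooling schedule, and dimensions are all assumptions for illustration.

    import numpy as np

    def reference_matrix(code, n_cells=32, n_samples=64):
        """Toy stand-in for an FH reference matrix: the hopping code sets the
        tone radiated in each sample interval (not the paper's signal model)."""
        t = np.arange(n_samples)[:, None] / n_samples
        delays = np.linspace(0, 1, n_cells)[None, :]
        freqs = code[np.arange(n_samples) % len(code)][:, None]
        return np.exp(2j * np.pi * freqs * (t - delays))

    def anneal_fh_code(n_hops=16, n_freqs=32, iters=500, t0=1.0, seed=0):
        """Classical simulated annealing (stand-in for QSA) minimizing the
        condition number of the reference matrix over FH codes."""
        rng = np.random.default_rng(seed)
        code = rng.integers(0, n_freqs, n_hops)
        cost = np.linalg.cond(reference_matrix(code))
        for k in range(iters):
            cand = code.copy()
            cand[rng.integers(n_hops)] = rng.integers(n_freqs)  # mutate one hop
            c = np.linalg.cond(reference_matrix(cand))
            temp = t0 * (1 - k / iters) + 1e-9  # linear cooling schedule
            if c < cost or rng.random() < np.exp((cost - c) / temp):
                code, cost = cand, c
        return code, cost

    code, kappa = anneal_fh_code()  # optimized code and its condition number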

    High-Dimensional Information Detection based on Correlation Imaging Theory

    Radar is a device that uses electromagnetic (EM) waves to detect targets; it can measure position and motion parameters and extract target-characteristic information by analyzing the signal reflected from the target. From the standpoint of its physical foundations, radar's more than 70 years of development rest on the fluctuation theory of the EM field, and many of its theories have evolved toward one-dimensional signal processing. For example, various threshold-filtering methods have been widely used to resist interference during detection; optimal state estimation describes how the statistical characteristics of a target propagate over time in the probability domain; and compressed sensing greatly improves the efficiency of reconstructing sparse signals. These are all one-dimensional information-processing theories, and the information they obtain is a deterministic description of the EM field. The correlated imaging technique, by contrast, stems from the high-order coherence of the EM field and uses its fluctuation characteristics to realize non-local imaging. Correlated imaging radar, a combination of correlated imaging techniques and modern information theory, provides a novel method of remote sensing detection and imaging. More importantly, correlated imaging radar is a new research field, so a complete theoretical framework and application system urgently needs to be built up and refined. Based on the coherence theory of the EM field, the work in this thesis explores methods of determining the statistical characteristics of the EM field so that high-dimensional target information can be detected, covering theoretical analysis, principle design, imaging modes, target-detection models, image reconstruction algorithms, visibility enhancement, and system design. Simulations and real experiments are set up to prove the theory's validity and the systems' feasibility.
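
    The thesis's own imaging modes and reconstruction algorithms are not detailed in the abstract. As a minimal, hypothetical illustration of the principle it names, second-order (intensity) correlation of a fluctuating field, the sketch below correlates a random speckle reference with a single non-resolving "bucket" measurement over many realizations; the exponential speckle model and all dimensions are assumptions, not the thesis's setup.

    import numpy as np

    def correlation_image(fields, bucket):
        """Ghost-imaging-style second-order correlation: covariance between
        the per-pixel reference intensity and the total reflected intensity."""
        return (fields * bucket[:, None]).mean(0) - fields.mean(0) * bucket.mean()

    # Toy demo: random speckle illuminating a sparse one-dimensional "target".
    rng = np.random.default_rng(1)
    n_real, n_pix = 5000, 100
    target = np.zeros(n_pix)
    target[[30, 31, 60]] = 1.0
    fields = rng.exponential(1.0, (n_real, n_pix))  # speckle intensities
    bucket = fields @ target                        # non-resolving detector sum
    img = correlation_image(fields, bucket)         # peaks at the target pixels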

    Spatial statistics and analysis of earth's ionosphere

    Thesis (Ph.D.)--Boston University

    The ionosphere, a layer of Earth's upper atmosphere characterized by energetic charged particles, serves as a natural plasma laboratory and supplies proxy diagnostics of space weather drivers in the magnetosphere and the solar wind. The ionosphere is a highly dynamic medium, and the spatial structure of observed features (such as auroral light emissions, charge density, and temperature) is rich with information when analyzed in the context of fluid, electromagnetic, and chemical models. Obtaining measurements with higher spatial and temporal resolution is clearly advantageous. For instance, measurements obtained with a new electronically steerable incoherent scatter radar (ISR) present a unique space-time perspective compared to those of a dish-based ISR, but this modality carries unique ambiguities that must be carefully considered. The ISR target is stochastic, and the fidelity of fitted parameters (ionospheric densities and temperatures) requires integrated sampling, creating a tradeoff between measurement uncertainty and spatio-temporal resolution. Spatial statistics formalizes the relationship between spatially dispersed observations and the underlying process(es) they represent: a spatial process is regarded as a random field whose distribution is structured (e.g., through a correlation function) so that data sampled over a spatial domain support inference or prediction of the process. Quantification of uncertainty, an important component of scientific data analysis, is a core value of spatial statistics. This research applies the formalism of spatial statistics to the analysis of Earth's ionosphere using remote sensing diagnostics. In the first part, we consider the problem of volumetric imaging using phased-array ISR based on optimal spatial prediction ("kriging"). In the second part, we develop a technique for reconstructing two-dimensional ion flow fields from line-of-sight projections using Tikhonov regularization. In the third part, we adapt our spatial statistical approach to global ionospheric imaging using total electron content (TEC) measurements derived from navigation satellite signals.
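
    Of the three techniques the abstract names, Tikhonov regularization is the simplest to show in miniature. The sketch below is a generic illustration, not the thesis's code or forward model: it solves the standard regularized least-squares problem min_x ||Ax - b||^2 + lam ||x||^2 in closed form, with an assumed random matrix playing the role of the line-of-sight projection operator.

    import numpy as np

    def tikhonov(A, b, lam):
        """Tikhonov-regularized least squares via the normal equations:
        x = (A^T A + lam I)^{-1} A^T b."""
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

    # Toy demo: recover a smooth profile from noisy, underdetermined
    # line-of-sight projections (40 measurements of a 60-cell field).
    rng = np.random.default_rng(2)
    A = rng.standard_normal((40, 60))           # assumed projection operator
    x_true = np.sin(np.linspace(0, np.pi, 60))  # smooth "flow" profile
    b = A @ x_true + 0.05 * rng.standard_normal(40)
    x_hat = tikhonov(A, b, lam=1.0)             # damped, stable reconstruction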

    27th Annual Computational Neuroscience Meeting (CNS*2018): Part One


    Sensor Signal and Information Processing II

    In the current age of information explosion, newly invented technological sensors and software are tightly integrated with our everyday lives. Many sensor processing algorithms incorporate some form of computational intelligence as part of their core framework for problem solving. These algorithms have the capacity to generalize, discover knowledge for themselves, and learn new information whenever unseen data are captured. The primary aim of sensor processing is to develop techniques to interpret, understand, and act on the information contained in the data. The interest of this book is in developing intelligent signal processing to pave the way for smart sensors. This involves mathematical advancement of nonlinear signal processing theory and its applications, extending far beyond traditional techniques. It bridges the boundary between theory and application, developing novel theoretically inspired methodologies that target both longstanding and emergent signal processing applications. The topics range from phishing detection to the integration of terrestrial laser scanning, and from fault diagnosis to bio-inspired filtering. The book will appeal to established practitioners, along with researchers and students in the emerging field of smart sensor processing.

    Novel Approaches for Nondestructive Testing and Evaluation

    Nondestructive testing and evaluation (NDT&E) is one of the most important techniques for determining the quality and safety of materials, components, devices, and structures. NDT&E technologies include ultrasonic testing (UT), magnetic particle testing (MT), magnetic flux leakage testing (MFLT), eddy current testing (ECT), radiation testing (RT), penetrant testing (PT), and visual testing (VT), and these are widely used throughout modern industry. However, some NDT processes, such as those for cleaning specimens and removing paint, cause environmental pollution and can only be applied in limited environments (constrained by time, space, and sensor selection). Thus, NDT&E is classified as a typical 3D (dirty, dangerous, and difficult) job. In addition, NDT operators judge the presence of damage based on experience and subjective judgment, so in some cases a flaw may go undetected during the test. Therefore, to obtain clearer test results, operators should be provided with a means of identifying flaws more easily. In addition, test results should be organized systematically so that the cause of an abnormality in the test specimen can be identified and the progression of the damage quantified.