89 research outputs found

    A survey of STAP processing for detection in heterogeneous environments

    This article surveys the various space-time adaptive processing (STAP) algorithms developed and/or used for detection in non-homogeneous environments. We first review the main causes that can lead to a heterogeneous environment, and then present the STAP strategies most commonly used in such environments.

    On Detection and Ranking Methods for a Distributed Radio-Frequency Sensor Network: Theory and Algorithmic Implementation

    A theoretical foundation for pre-detection fusion of sensors is needed if the United States Air Force is ever to field a system of distributed and layered sensors that can detect and perform parameter estimation of complex, extended targets in difficult interference environments, without human intervention, in near real-time. This research is relevant to the United States Air Force within its layered sensing and cognitive radar/sensor initiatives. The asymmetric threat of the twenty-first century introduces stressing sensing conditions that may exceed the ability of traditional monostatic sensing systems to perform their required intelligence, surveillance and reconnaissance (ISR) missions. In particular, there is growing interest within the United States Air Force to move beyond single-sensor sensing systems, and instead begin fielding and leveraging distributed sensing systems to overcome the inherent challenges imposed by the modern threat space. This thesis analyzes the impact of integrating target echoes in the angular domain, to determine whether better detection and ranking performance is achieved through the use of a distributed sensor network. Bespoke algorithms are introduced for detection and ranking ISR missions leveraging a distributed network of radio-frequency sensors: the first set is based upon a depth-based nonparametric detection algorithm, which is shown to enhance the recovery of targets at lower signal-to-noise ratios than an equivalent monostatic radar system; the second set is based upon random matrix theory and concentration-of-measure mathematics, and is demonstrated to outperform the depth-based nonparametric approach. This latter approach is shown to be effective across a broad range of signal-to-noise ratios, both positive and negative.
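The claimed gain from integrating echoes across a distributed network can be illustrated with a toy Monte Carlo sketch. This is a generic noncoherent energy-fusion comparison, not the thesis's depth-based or random-matrix algorithms; the sensor count, per-sensor SNR, and constant-amplitude target model are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_sensors, snr = 100_000, 8, 0.5   # per-sensor SNR (linear scale)

def echoes(with_target):
    """Complex returns at each sensor: unit-power noise plus an optional echo."""
    noise = (rng.standard_normal((n_trials, n_sensors))
             + 1j * rng.standard_normal((n_trials, n_sensors))) / np.sqrt(2)
    return noise + (np.sqrt(snr) if with_target else 0.0)

x_h1, x_h0 = echoes(True), echoes(False)

# Monostatic statistic: energy at one sensor. Fused: noncoherent sum over all.
mono_h1, mono_h0 = np.abs(x_h1[:, 0])**2, np.abs(x_h0[:, 0])**2
fused_h1, fused_h0 = (np.abs(x_h1)**2).sum(1), (np.abs(x_h0)**2).sum(1)

# Match the false-alarm rate empirically, then compare detection probabilities.
pfa = 0.01
pd_mono = np.mean(mono_h1 > np.quantile(mono_h0, 1 - pfa))
pd_fused = np.mean(fused_h1 > np.quantile(fused_h0, 1 - pfa))
```

At a fixed false-alarm rate, summing energy across the network detects the weak target far more often than the single sensor, which is the basic motivation for pre-detection fusion.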

    Discrete curvature: theory and applications

    The present volume contains the proceedings of the 2013 Meeting on discrete curvature, held at CIRM, Luminy, France. The aim of this meeting was to bring together researchers from various backgrounds, ranging from mathematics to computer science, with a focus on both theory and applications. With 27 invited talks and 8 posters, the conference attracted 70 researchers from all over the world. The challenge of finding a common ground on the topic of discrete curvature was met with success, and these proceedings are a testimony of this work.

    Estimation and detection in a non-homogeneous environment, with application to space-time adaptive processing

    Space-time adaptive processing (STAP) is required in future airborne radar systems to improve the detection of targets embedded in ground clutter. The performance of detectors based on the assumption of a homogeneous environment can be severely degraded in practical applications, since real-world clutter can vary significantly in both angle and range. Several strategies have been proposed to overcome the deleterious effects of heterogeneity, and this dissertation studies two of them in depth. More precisely, a new data model is introduced in a Bayesian framework; it incorporates both an original heterogeneity relation and a priori knowledge. New estimators of the noise covariance matrix and new detectors are derived from this model, and their performance is studied both theoretically and through numerical simulations. Results show that the proposed model and algorithms incorporate a priori information into the detection scheme in an appropriate way.
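As context for the covariance estimators and detectors this abstract describes, here is a minimal sketch of the classical sample-covariance adaptive matched filter that Bayesian STAP detectors extend. The toy dimensions, the AR(1)-shaped clutter covariance, and the steering frequency are illustrative assumptions, not the thesis's model:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 8, 64          # space-time dimension and number of training snapshots (toy)

# Hypothetical clutter-plus-noise covariance with an AR(1) shape.
rho = 0.9
R = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
L = np.linalg.cholesky(R)

def snapshots(m):
    """m complex Gaussian snapshots with covariance R."""
    return L @ ((rng.standard_normal((n, m))
                 + 1j * rng.standard_normal((n, m))) / np.sqrt(2))

# Sample covariance estimated from target-free secondary (training) data.
Z = snapshots(k)
R_hat = (Z @ Z.conj().T) / k

# Adaptive weight vector for a space-time steering vector s (unit gain on s).
s = np.exp(2j * np.pi * 0.1 * np.arange(n)) / np.sqrt(n)
Ri_s = np.linalg.solve(R_hat, s)
w = Ri_s / (s.conj() @ Ri_s)

# Test statistic on a cell containing a target versus a clutter-only cell.
stat_target = np.abs(w.conj() @ (3.0 * s + snapshots(1)[:, 0])) ** 2
stat_noise = np.abs(w.conj() @ snapshots(1)[:, 0]) ** 2
```

The weakness this sketch exposes is the one the thesis targets: `R_hat` is only a good estimate when the training snapshots are homogeneous with the cell under test, which heterogeneous clutter violates.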

    Radiometrically-Accurate Hyperspectral Data Sharpening

    Improving the spatial resolution of hyperspectral images (HSI) has long been an important topic in remote sensing. Many approaches have been proposed based on various theories, including component substitution, multiresolution analysis, spectral unmixing, Bayesian probability, and tensor representation. However, these methods share some common disadvantages: they are not robust to different up-scale ratios, and they pay little attention to the per-pixel radiometric accuracy of the sharpened image. Moreover, although many learning-based methods have been proposed through decades of innovation, most require a large set of training pairs, which is impractical for many real problems. To address these problems, we first propose an unsupervised Laplacian Pyramid Fusion Network (LPFNet) to generate a radiometrically-accurate high-resolution HSI. First, from the low-resolution hyperspectral image (LR-HSI) and the high-resolution multispectral image (HR-MSI), a preliminary high-resolution hyperspectral image (HR-HSI) is calculated via linear regression. Next, the high-frequency details of the preliminary HR-HSI are estimated as the difference between it and a CNN-generated blurry version. By injecting these details into the output of a generative CNN that takes the LR-HSI as input, the final HR-HSI is obtained. LPFNet is designed to fuse an LR-HSI and an HR-MSI covering the same visible-near-infrared (VNIR) bands, while the short-wave infrared (SWIR) bands of the HSI are ignored. SWIR bands are as important as VNIR bands, but their spatial details are more challenging to enhance because the HR-MSI, which provides the spatial details in the fusion process, usually has no SWIR coverage or only lower-spatial-resolution SWIR. To this end, we designed an unsupervised cascade fusion network (UCFNet) to sharpen the Vis-NIR-SWIR LR-HSI.
First, a preliminary high-resolution VNIR hyperspectral image (HR-VNIR-HSI) is obtained with a conventional hyperspectral sharpening algorithm. Then, the HR-MSI, the preliminary HR-VNIR-HSI, and the LR-SWIR-HSI are passed to a generative convolutional neural network to produce an HR-HSI. In the training process, a cascade sharpening method is employed to improve stability, and a self-supervising loss based on the cascade strategy is introduced to further improve spectral accuracy. Experiments are conducted on both LPFNet and UCFNet with different datasets and up-scale ratios, and state-of-the-art baseline methods are implemented and compared with the proposed methods using different quantitative metrics. Results demonstrate that the proposed methods outperform the competitors in all cases in terms of both spectral and spatial accuracy.
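The detail-injection idea underlying both networks can be sketched in a few lines of NumPy. This is a generic single-band illustration with a toy box-blur degradation model, not LPFNet or UCFNet; the spectrally-related band and the up-scale ratio are assumptions. It also exhibits the per-pixel radiometric property the abstract emphasises: re-degrading the sharpened band recovers the observed low-resolution band exactly, because box-averaging the injected details gives zero:

```python
import numpy as np

rng = np.random.default_rng(2)
ratio = 4
hr_msi = rng.random((16, 16))                 # hypothetical HR-MSI band (guide)
hs_band = 0.8 * hr_msi + 0.1                  # toy spectrally-related HSI band

def box_down(img, r):
    """Box-average degradation by the up-scale ratio r."""
    h, w = img.shape
    return img.reshape(h // r, r, w // r, r).mean(axis=(1, 3))

lr_hsi = box_down(hs_band, ratio)             # observed LR-HSI band

# Step 1: preliminary high-resolution band by naive upsampling of the LR band.
prelim = np.kron(lr_hsi, np.ones((ratio, ratio)))

# Step 2: high-frequency details = HR guide minus its blurred version.
details = hr_msi - np.kron(box_down(hr_msi, ratio), np.ones((ratio, ratio)))

# Step 3: inject the details to obtain the sharpened band.
sharpened = prelim + details

# Radiometric consistency: re-degrading the result returns the LR observation.
resid = np.abs(box_down(sharpened, ratio) - lr_hsi).max()
```

In the actual networks the blurry version comes from a CNN rather than a fixed box blur, but the inject-details-into-an-upsampled-base structure is the same.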

    An orientation field approach to modelling fibre-generated spatial point processes

    This thesis introduces a new approach to analysing spatial point data clustered along or around a system of curves or fibres with additional background noise. Such data arise in catalogues of galaxy locations, recorded locations of earthquakes, aerial images of minefields, and pore patterns on fingerprints. Finding the underlying curvilinear structure of these point-pattern data sets may not only facilitate a better understanding of how they arise but also aid reconstruction of missing data. We base the space of fibres on the set of integral lines of an orientation field. Using an empirical Bayes approach, we estimate the field of orientations from anisotropic features of the data. The orientation field estimation draws on ideas from tensor field theory (an area recently motivated by the study of magnetic resonance imaging scans), using symmetric positive-definite matrices to estimate local anisotropies in the point pattern through the tensor method. We also propose a new measure of anisotropy, the modified square Fractional Anisotropy, whose statistical properties are estimated for tensors calculated via the tensor method. A continuous-time Markov chain Monte Carlo algorithm is used to draw samples from the posterior distribution of fibres, exploring models with different numbers of clusters and fitting fibres to the clusters as it proceeds. The Bayesian approach permits inference on various properties of the clusters and associated fibres, and the resulting algorithm performs well on a number of very different curvilinear structures.
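The tensor method and anisotropy measure can be illustrated with a minimal sketch. The standard 2D fractional anisotropy is used below, not the thesis's modified square Fractional Anisotropy, whose exact form is not reproduced here; the point patterns are synthetic:

```python
import numpy as np

def orientation_tensor(points, centre, radius=1.0):
    """Second-moment (structure) tensor of displacements within `radius` of `centre`."""
    d = points - centre
    d = d[np.hypot(d[:, 0], d[:, 1]) <= radius]
    return (d.T @ d) / max(len(d), 1)

def fa_2d(tensor):
    """Standard 2D fractional anisotropy: 0 for isotropy, 1 for a pure line."""
    lam = np.linalg.eigvalsh(tensor)
    den = np.linalg.norm(lam)
    return 0.0 if den == 0 else np.sqrt(2.0) * np.linalg.norm(lam - lam.mean()) / den

# Points strung along a fibre yield FA near 1; an isotropic blob yields FA near 0.
fibre = np.column_stack([np.linspace(-1.0, 1.0, 50), np.zeros(50)])
blob = np.random.default_rng(3).standard_normal((500, 2)) * 0.3

fa_fibre = fa_2d(orientation_tensor(fibre, np.zeros(2)))
fa_blob = fa_2d(orientation_tensor(blob, np.zeros(2)))
```

The dominant eigenvector of each local tensor supplies the orientation used to build the field whose integral lines form the fibre space.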

    Signal Processing for Non-Gaussian Statistics: Clutter Distribution Identification and Adaptive Threshold Estimation

    We examine the problem of determining a decision threshold for the binary hypothesis test that naturally arises when a radar system must decide whether a target is present in a range cell under test. Modern radar systems require predictable, low, constant rates of false alarm (a false alarm occurs when unwanted noise or clutter returns are mistaken for a target). Measured clutter returns have often been fitted to heavy-tailed, non-Gaussian distributions, and these heavy tails cause an unacceptable rise in the number of false alarms. We use the class of spherically invariant random vectors (SIRVs) to model clutter returns. SIRVs arise from a phenomenological consideration of the radar sensing problem, and the class includes both the Gaussian distribution and the most commonly reported non-Gaussian clutter distributions (e.g. the K and Weibull distributions). We propose an extension of a prior technique called the Ozturk algorithm, which generates a graphical library of points corresponding to known SIRV distributions. These points are generated from linked vectors whose magnitudes are derived from the order statistics of the SIRV distributions. Measured data are then compared to the library, and the distribution that best approximates the measured data is chosen. Our extension introduces a framework of weighting functions and examines both a distribution classification technique and a method of determining an adaptive threshold in data that may or may not belong to a known distribution. The extensions are then compared to neural network techniques. Special attention is paid to producing a robust, adaptive estimate of the detection threshold. Finally, divergence measures of SIRVs are examined.
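The SIRV construction the abstract relies on is easy to sketch: a complex Gaussian "speckle" component modulated by a random "texture". A gamma-distributed power texture yields K-distributed amplitudes, and the sketch below shows the false-alarm inflation that motivates adaptive thresholding; the sample size and shape parameter are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)
n, nu = 200_000, 0.5          # sample count and K-distribution shape parameter

# SIRV construction: complex Gaussian speckle scaled by the square root of a
# unit-mean gamma power texture gives K-distributed amplitudes.
speckle = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
texture = rng.gamma(nu, 1.0 / nu, n)
k_clutter = np.sqrt(texture) * speckle

# Threshold set for a nominal 1e-3 false-alarm rate under the Gaussian model:
# for unit-power Rayleigh amplitude, P(|x| > t) = exp(-t**2).
thresh = np.sqrt(-np.log(1e-3))
gauss = np.abs((rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2))
pfa_gauss = np.mean(gauss > thresh)
pfa_k = np.mean(np.abs(k_clutter) > thresh)   # heavy tails inflate false alarms
```

A threshold calibrated for Gaussian clutter is exceeded far more often by the K-distributed returns, which is exactly the failure mode an adaptive, distribution-aware threshold is meant to prevent.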

    Forecasting seismic activity induced from hydraulic fracturing operations

    As the world transitions towards a carbon-neutral economy in order to meet the Paris climate accords, many countries are utilising natural gas as a transition fuel while the renewable energy sector continues to develop. As part of this transition in the UK, the intent is that the use of domestic natural gas, including gas from unconventional reservoirs such as shale, is fully realised. The extraction of natural gas from shale is not without environmental risk, and seismic events induced by the hydraulic fracturing process are cited as the reason for the current suspension of hydraulic fracturing operations in the UK. The reactive control approach, of which 'traffic light system' procedures form a part, is the most widely used method to forecast seismic events; it ties the likelihood of a seismic event occurring to a single seismologically derived parameter. This approach has its shortcomings, and newly developed probabilistic forecasting methods capable of predicting the seismic events likely to occur in the future are still in development and have yet to become the primary decision-making system controlling the injection schedule. The primary objective of this thesis was to research forecasting approaches that alleviate the disadvantages of these current methods. A software system was developed that relates a forecasting model to real-time changes in the fracture network arising from the two causes of induced seismicity: an increase in pore pressure re-activating fault lines, and the transfer of stress from other seismic events. The system analyses microseismic records using four geophysical signal-analysis methods which, when combined, produce two maps updated in real time: a fracture map highlighting hydraulic connections and a Coulomb stress change map.
    To verify the software system, the causes of a magnitude 3.9 earthquake on 12 January 2016 at a shale gas production well in Fox Creek, Canada were retrospectively investigated, and the use of the system to forecast seismic events was evaluated. The fracture map generated from the microseismic records indicated a hydraulic connection between stage 23 of the hydraulic fracture process and a legacy fault line. The input of fracture fluid increased the pore pressure on the fault line, ultimately causing slip and the magnitude 3.9 earthquake. There was no evidence that static stress transfer from other seismic events in the area affected the triggering process. The forecast model was assessed by comparing the fracture maps in the time leading up to the 12 January event against the model's predictions. Although numerous events were positioned as part of a transition between the hydraulic tensile fractures and the fault line, it was not possible to analyse these events due to their low signal-to-noise ratio, and it was therefore not viable to fully validate the forecast model with this case study. Further research with case studies in which the acquisition geometry is closer to the events is recommended to fully validate the forecast model before field implementation.
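The pore-pressure mechanism invoked above can be stated with the textbook Coulomb failure stress change; this is the generic formula, not the thesis's software system, and the friction coefficient is an assumed illustrative value. Signs follow the convention that normal stress is positive in compression, so a pore-pressure rise unclamps the fault:

```python
def d_cfs(d_tau, d_sigma_n, d_pore, mu=0.6):
    """Coulomb failure stress change (MPa): d_tau - mu * (d_sigma_n - d_pore).

    d_tau: shear stress change resolved on the receiver fault.
    d_sigma_n: normal stress change, positive in compression (clamping).
    d_pore: pore-pressure change; a pressure rise unclamps the fault.
    """
    return d_tau - mu * (d_sigma_n - d_pore)

# Injection that raises pore pressure by 1 MPa with no static stress transfer
# moves the fault 0.6 MPa closer to failure for a friction coefficient of 0.6.
change = d_cfs(d_tau=0.0, d_sigma_n=0.0, d_pore=1.0)
```

A positive change moves the receiver fault towards failure, which is why the system's two maps separate the hydraulic (pore-pressure) pathway from the static stress-transfer pathway.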