
    Angular Resolution Limit for Vector-Sensor Arrays: Detection and Information Theory Approaches

    The Angular Resolution Limit (ARL) for resolving two closely spaced polarized sources using vector-sensor arrays is considered in this paper. The proposed method is based on information theory. In particular, Stein's lemma asymptotically links the probability of false alarm to the relative entropy between the two hypotheses of a given statistical binary test. We show that the relative entropy can be approximated by a quadratic function of the ARL. This property allows us to derive and analyze a closed-form expression for the ARL. To illustrate the interest of our approach, the ARL in the sense of detection theory is also derived. Finally, we show that the ARL is sensitive only to the norm of the polarization state vector and not to the particular values of the polarization parameters.
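The quadratic-in-ARL behaviour of the relative entropy described above can be illustrated with a minimal sketch. Assuming two Gaussian hypotheses with a common covariance whose means differ by a separation delta along a fixed direction (a simplification of the paper's vector-sensor model, used here only to show the structure), the KL divergence is exactly quadratic in delta, so doubling the separation quadruples the divergence:

```python
import numpy as np

# Hedged sketch: for hypotheses N(mu0, Sigma) vs N(mu1, Sigma) with
# mu1 - mu0 = delta * d, the relative entropy is
#   D(p0 || p1) = 0.5 * delta^2 * d^T Sigma^{-1} d,
# i.e. quadratic in the separation delta. The paper's actual binary test on
# polarized vector-sensor data is richer; dimensions and covariance below
# are illustrative.

def kl_gauss_same_cov(mu0, mu1, sigma):
    """KL divergence between two Gaussians sharing the covariance sigma."""
    diff = mu1 - mu0
    return 0.5 * diff @ np.linalg.solve(sigma, diff)

rng = np.random.default_rng(0)
dim = 6                                # e.g. a small snapshot vector
direction = rng.standard_normal(dim)
A = rng.standard_normal((dim, dim))
sigma = A @ A.T + dim * np.eye(dim)    # well-conditioned covariance

deltas = np.array([0.01, 0.02, 0.04, 0.08])
kls = np.array([kl_gauss_same_cov(np.zeros(dim), d * direction, sigma)
                for d in deltas])

# Doubling delta should quadruple the divergence (quadratic behaviour).
ratios = kls[1:] / kls[:-1]
print(ratios)   # each ratio is ~4
```

Inverting this quadratic relation at a target relative-entropy level is what yields a closed-form resolution limit in the approach the abstract describes.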

    Advancements in seismic tomography with application to tunnel detection and volcano imaging

    Thesis (Ph.D.), University of Alaska Fairbanks, 1998. Practical geotomography is an inverse problem with no unique solution. A priori information must be imposed for a stable solution to exist. Commonly used types of a priori information smooth and attenuate anomalies, resulting in 'blurred' tomographic images. Small or discrete anomalies, such as tunnels, magma conduits, or buried channels, are extremely difficult imaging objectives. Composite distribution inversion (CDI) is introduced as a theory seeking physically simple, rather than distributionally simple, solutions of non-unique problems. Parameters are assumed to be members of a composite population, including both well-known and anomalous components. Discrete and large-amplitude anomalies are allowed, while a well-conditioned inverse is maintained. Tunnel detection is demonstrated using CDI tomography and data collected near the northern border of South Korea. Accurate source and receiver location information is necessary. Borehole deviation corrections are estimated by minimizing the difference between empirical distributions of apparent parameter values as a function of location correction. Improved images result. Traveltime computation and raytracing are the most computationally intensive components of seismic tomography when imaging structurally complex media. Efficient, accurate, and robust raytracing is possible by first recovering approximate raypaths from traveltime fields, and then refining the raypaths to a desired accuracy level. Dynamically binned queuing is introduced. The approach optimizes graph-theoretic traveltime computation costs. Pseudo-bending is modified to efficiently refine raypaths in general media. Hypocentral location density functions and relative phase arrival population analysis are used to investigate the spring 1996 earthquake swarm at Akutan Volcano, Alaska.
The main swarm is postulated to have been associated with a 0.2 km³ intrusion at a depth of less than four kilometers. Decay-sequence seismicity is postulated to be a passive response to the stress transient caused by the intrusion. Tomograms are computed for Mt. Spurr, Augustine, and Redoubt Volcanoes, Alaska. Relatively large-amplitude, shallow anomalies explain most of the traveltime residual. No large-amplitude anomalies are found at depth, and no magma storage areas are imaged. A large-amplitude low-velocity anomaly is coincident with a previously proposed geothermal region on the southeast flank of Mt. Spurr. Mt. St. Augustine is found to have a high-velocity core.
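The graph-theoretic traveltime computation mentioned above is in essence a shortest-path solve over a grid of slowness values. The sketch below shows the standard binary-heap (Dijkstra-style) baseline on a 2-D grid; the thesis's dynamically binned queuing replaces the heap with adaptive buckets for speed, which is not reproduced here. The grid, slowness values, and edge-cost rule are illustrative assumptions:

```python
import heapq

# Hedged sketch: first-arrival traveltimes on a 2-D slowness grid via a
# priority-queue shortest-path solve (4-connected neighbours). Edge cost is
# the grid spacing times the average slowness of the two nodes -- a simple
# assumption, not the thesis's exact discretization.

def first_arrival_times(slowness, src, h=1.0):
    """slowness: 2-D list of slowness values (s per unit length).
    src: (row, col) of the source node.  h: grid spacing."""
    nr, nc = len(slowness), len(slowness[0])
    times = [[float("inf")] * nc for _ in range(nr)]
    times[src[0]][src[1]] = 0.0
    heap = [(0.0, src)]
    while heap:
        t, (r, c) = heapq.heappop(heap)
        if t > times[r][c]:
            continue                      # stale (superseded) heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < nr and 0 <= cc < nc:
                cand = t + h * 0.5 * (slowness[r][c] + slowness[rr][cc])
                if cand < times[rr][cc]:
                    times[rr][cc] = cand
                    heapq.heappush(heap, (cand, (rr, cc)))
    return times

# Uniform slowness of 1 s/unit: traveltime equals grid (Manhattan) distance.
tt = first_arrival_times([[1.0] * 4 for _ in range(4)], (0, 0))
print(tt[0][3], tt[3][3])   # 3.0 6.0
```

Raypaths can then be recovered by gradient descent through the resulting traveltime field and refined by a bending method, as the abstract outlines.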

    Signal Subspace Processing in the Beam Space of a True Time Delay Beamformer Bank

    A number of techniques for Radio Frequency (RF) source location for wide-bandwidth signals have been described that utilize coherent signal subspace processing, but they often suffer from limitations such as the requirement for a preliminary source location estimate, the need to apply the technique iteratively, or computational expense. This dissertation examines a method that performs subspace processing of the data from a bank of true time delay beamformers. The spatial diversity of the beamformer bank alleviates the need for a preliminary estimate while simultaneously reducing the dimensionality of subsequent signal subspace processing, resulting in computational efficiency. The pointing direction of the true time delay beams is independent of frequency, which results in a mapping from element space to beam space that is wide bandwidth in nature. This dissertation reviews previous methods, introduces the present method, presents simulation results that demonstrate the assertions, analyzes performance in relation to the Cramer-Rao Lower Bound (CRLB) at various noise levels, and discusses computational efficiency. One limitation of the method is that in practice it may be appropriate only for systems that can tolerate a limited field of view. Electronic Intelligence is one such application: it calls for high resolution of very wide bandwidth, closely spaced sources and often does not require a wide field of view. In relation to system applications, this dissertation also discusses practical employment of the novel method in terms of antenna elements, arrays, platforms, engagement geometries, and other parameters.
The true time delay beam space method is shown through modeling and simulation to be capable of resolving closely spaced, very wideband sources over a relevant field of view in a single algorithmic pass, requiring no coarse preliminary estimate and exhibiting computational expense lower than that of many previous wideband coherent-integration techniques.
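The frequency-independent pointing that the dissertation exploits comes from steering with pure time shifts rather than phase shifts. The sketch below forms one true time delay (delay-and-sum) beam: each element's signal is delayed by its steering delay and the results are summed, so a broadband wavefront arriving from the look direction adds coherently at every frequency. The array size, integer-sample delays, and test signal are illustrative assumptions; the subspace processing applied to a bank of such beams is not shown:

```python
import numpy as np

# Hedged sketch of a single true time delay beam. Steering delays are whole
# samples for simplicity; a practical system would use fractional-delay
# filters or physical delay lines.

def ttd_beam(x, element_delays):
    """Delay-and-sum: x is (n_elements, n_samples); delays in whole samples."""
    n_el, n_s = x.shape
    out = np.zeros(n_s)
    for e in range(n_el):
        d = element_delays[e]
        out[d:] += x[e, :n_s - d] if d > 0 else x[e]
    return out / n_el

# Simulate a broadband pulse crossing a 4-element array with a 2-sample
# propagation delay per element.
rng = np.random.default_rng(1)
pulse = rng.standard_normal(64)
n_el, delay_per_el = 4, 2
x = np.zeros((n_el, 128))
for e in range(n_el):
    x[e, e * delay_per_el : e * delay_per_el + 64] = pulse

# Steering delays that realign the wavefront: element e is delayed by
# (n_el - 1 - e) * delay_per_el samples so all copies line up coherently.
delays = [(n_el - 1 - e) * delay_per_el for e in range(n_el)]
y = ttd_beam(x, delays)   # y reproduces the pulse at full coherent gain
```

Because the steering delays depend only on geometry, not on frequency, every spectral component of the pulse is realigned identically, which is what makes the element-space-to-beam-space mapping wideband.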

    An information theoretic approach for generating an aircraft avoidance Markov decision process

    Developing a collision avoidance system that can meet the safety standards required of commercial aviation is challenging. A dynamic programming approach to collision avoidance has been developed to optimize and generate logics that are robust to the complex dynamics of the national airspace. The current approach represents the aircraft avoidance problem as a Markov Decision Process and independently optimizes horizontal and vertical maneuver avoidance logics. This split is a consequence of the memory requirements of each logic: simply combining them would result in a significantly larger representation, and the "curse of dimensionality" makes optimizing that larger representation computationally infeasible. Moreover, existing and future collision avoidance systems have mostly defined the decision process by hand. In response, a simulation-based framework was built to better understand how each potential state quantifies the aircraft avoidance problem with regard to safety and operational components. The framework leverages recent advances in signal processing and databases, enabling the highest-fidelity analysis of Monte Carlo aircraft encounter simulations to date. This framework enabled the calculation of how well each state of the decision process quantifies the collision risk and of the associated memory requirements. Using this analysis, a collision avoidance logic that leverages both horizontal and vertical actions was built and optimized with this simulation-based approach.
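The dynamic programming optimization described above can be sketched on a deliberately tiny MDP. Assuming coarse relative-altitude bins as states, climb/level/descend as actions, and a reward that penalizes both proximity and maneuvering (all of which are invented for illustration and far simpler than the state variables, dynamics, and costs in the actual logic), value iteration yields a policy that maneuvers away at co-altitude and stays level otherwise:

```python
import numpy as np

# Hedged sketch: value iteration on a toy vertical-avoidance MDP.
# States: 11 relative-altitude bins, bin 5 = co-altitude (conflict).
# Actions: descend (-1), level (0), climb (+1), one bin per step.
# Rewards, dynamics, and discount are illustrative assumptions.

n_states = 11
actions = (-1, 0, +1)
gamma = 0.95

def reward(s, a):
    collision_penalty = -10.0 if s == 5 else 0.0   # proximity cost
    maneuver_cost = -0.1 * abs(a)                  # operational cost
    return collision_penalty + maneuver_cost

def step(s, a):
    return min(n_states - 1, max(0, s + a))        # deterministic toy dynamics

V = np.zeros(n_states)
for _ in range(200):                               # value iteration
    V = np.array([max(reward(s, a) + gamma * V[step(s, a)] for a in actions)
                  for s in range(n_states)])

policy = [max(actions, key=lambda a, s=s: reward(s, a) + gamma * V[step(s, a)])
          for s in range(n_states)]
print(policy[5])   # at co-altitude the optimal action is to maneuver away
```

The memory issue the abstract raises shows up directly here: jointly optimizing horizontal and vertical actions would square the action set and multiply the state bins, which is why the separate-logic formulation and the state-selection analysis matter.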