    Seismic geometric attribute analysis for fracture characterization: New methodologies and applications

    In 3D subsurface exploration, the detection of faults and fractures from 3D seismic data is vital to robust structural and stratigraphic analysis, and great efforts have been made in the development and application of various seismic attributes (e.g. coherence, semblance, curvature, and flexure). However, the existing algorithms and workflows are not accurate and efficient enough for robust fracture detection, especially in naturally fractured reservoirs with complicated structural geometry and fracture networks. My Ph.D. research proposes the following scope of work to enhance our capability for, and improve the resolution of, fracture characterization and prediction.

    For the discontinuity attribute, previous methods have difficulty highlighting subtle discontinuities in seismic data where the local amplitude variation has a non-zero mean. This study proposes a gray-level transformation combined with the Canny edge detector for improved imaging of discontinuities. Specifically, the new process transforms seismic signals to zero mean, which amplifies subtle discontinuities and leads to an enhanced visualization of structural and stratigraphic details. Applications to various 3D seismic datasets demonstrate that the new algorithm is superior to previous discontinuity-detection methods. Integrating discontinuity magnitude with discontinuity azimuth defines channels, faults and fractures better than the traditional similarity, amplitude-gradient and semblance attributes.

    For the flexure attribute, the existing algorithm is computationally intensive and limited in lateral resolution for steeply dipping formations. This study proposes a new and robust volume-based algorithm that evaluates flexure more accurately and efficiently. The algorithm first fits a cubic surface to the seismic data volumetrically, using a diamond 13-node grid cell, and then computes flexure from the spatial derivatives of the fitted surface. To avoid introducing interpreter bias, this study also introduces a new workflow for automatically building surfaces that best represent the geometry of seismic reflections. A dip-steering approach based on 3D complex seismic trace analysis is implemented to enhance the accuracy of surface construction and to reduce computation time. Applications to two 3D seismic surveys demonstrate the accuracy and efficiency of the new flexure algorithm for characterizing faults and fractures in fractured reservoirs.

    For robust fracture detection, this study presents a new methodology for computing both the magnitude and the direction of the most extreme flexure. The method first computes azimuthal flexure and then applies a discrete azimuth-scanning approach to find the magnitude and azimuth of the most extreme flexure. Specifically, a set of flexure values is estimated and compared by substituting all possible azimuths between 0 degrees (inline) and 180 degrees (crossline) into the newly developed equation for azimuthal flexure. The added value of the new algorithm is demonstrated through applications to a seismic dataset from Teapot Dome, Wyoming. The results indicate that the most extreme flexure and its associated azimuthal direction reveal structural complexities that are not discernible from conventional coherence or geometric attributes.
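    The discrete azimuth-scanning step can be illustrated with a short sketch. Purely as an illustration, the snippet below approximates azimuthal flexure by the directional third derivative of the locally fitted surface (the dissertation's actual azimuthal-flexure equation is more involved), scans candidate azimuths between 0 and 180 degrees, and keeps the most extreme value. The coefficient names and the 1-degree step are assumptions, not the published parameterization.

```python
import numpy as np

def extreme_flexure(zxxx, zxxy, zxyy, zyyy, step_deg=1.0):
    """Discrete azimuth scan for the most extreme flexure (sketch).

    Azimuthal flexure is approximated here by the directional third
    derivative of the fitted surface z(x, y):
        f(psi) = zxxx*c**3 + 3*zxxy*c**2*s + 3*zxyy*c*s**2 + zyyy*s**3,
    with c = cos(psi), s = sin(psi). The third derivatives would come
    from the cubic surface fitted to the 13-node grid cell.
    """
    psi = np.deg2rad(np.arange(0.0, 180.0, step_deg))  # 0 deg = inline
    c, s = np.cos(psi), np.sin(psi)
    f = zxxx * c**3 + 3.0 * zxxy * c**2 * s + 3.0 * zxyy * c * s**2 + zyyy * s**3
    k = np.argmax(np.abs(f))          # most extreme magnitude over the scan
    return f[k], np.rad2deg(psi[k])   # (flexure value, azimuth in degrees)

# Example: a surface bending mostly along the crossline direction.
value, azimuth = extreme_flexure(zxxx=0.02, zxxy=0.0, zxyy=0.01, zyyy=0.15)
print(f"most extreme flexure {value:.3f} at azimuth {azimuth:.0f} deg")
```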
    Given that the azimuth-scanning approach for computing maximum/minimum flexure is time-consuming, this study further proposes fracture detection using the most positive/negative flexures, since for gently dipping structures the most positive flexure approximates the maximum flexure and the most negative flexure approximates the minimum flexure. After setting the first reflection derivatives (the apparent dips) to zero, the local reflection is rotated to horizontal, which greatly simplifies the equation for azimuthal flexure; from this simplified form, a new analytical approach for computing the most positive/negative flexures is derived. Comparisons demonstrate that the positive/negative flexures provide quantitative fracture characterization similar to the most extreme flexure, while the computation is eight times faster than the azimuth-scanning approach.

    Because using the most positive/negative flexures for fracture characterization overestimates flexure, 3D surface rotation is then introduced for flexure extraction in the presence of structural dip, which makes it possible to solve the problem analytically. The improved computational efficiency and accuracy are demonstrated by synthetic tests and by applications to real 3D seismic datasets, in comparison with the existing discrete azimuth-scanning approach.

    Finally, strain analysis is also important for understanding structural deformation, predicting natural fracture systems, and planning well bores. Physically, open fractures are most likely to develop in extensional domains, whereas closed fractures are most likely in compressional ones. The beam model has been proposed to describe the strain distribution within a geological formation of finite thickness; with traditional geometric attributes (discontinuity, dip, and curvature), however, the extensional zone cannot be distinguished from the compressional one. To resolve this problem, this study proposes a new algorithm for strain reconstruction using the apparent dips at each sample location within a seismic cube.
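    The beam-model intuition (extension on the convex side of a bent layer, compression on the concave side) can be sketched numerically. In Euler-Bernoulli beam bending, the longitudinal strain at signed distance d from the neutral surface is roughly eps = -d * kappa, with kappa the bending curvature; the sketch below uses the lateral derivative of apparent dip as a curvature proxy. The array names, spacing and finite-difference scheme are illustrative assumptions, not the dissertation's algorithm.

```python
import numpy as np

def beam_strain(apparent_dip, offset, dx=25.0):
    """Beam-model strain proxy from apparent dips (illustrative sketch).

    apparent_dip : 1D array of apparent dips dz/dx along a horizon
    offset       : signed distance (m) from the neutral (mid-formation)
                   surface; positive = above
    dx           : trace spacing (m); 25 m is an assumed survey value

    Curvature is approximated by the lateral derivative of apparent dip,
    and strain by the Euler-Bernoulli relation eps = -offset * curvature.
    Positive strain marks extensional zones (open fractures likely),
    negative strain compressional ones (closed fractures likely).
    """
    curvature = np.gradient(apparent_dip, dx)   # d(dip)/dx ~ bending curvature
    return -offset * curvature

# Example: an anticline-like dip field changing sign across the crest.
x = np.linspace(0.0, 1000.0, 41)
dip = -0.002 * (x - 500.0) / 500.0              # dip flips sign at the crest
eps_top = beam_strain(dip, offset=+50.0, dx=x[1] - x[0])
print("top of formation is", "extensional" if eps_top.mean() > 0 else "compressional")
```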

    The Chandra Source Catalog

    The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public ACIS imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents <~ 30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1 sigma uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of <~ 1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively. Comment: To appear in The Astrophysical Journal Supplement Series; 53 pages, 27 figures.
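    The significance criterion translates directly into a filter. The sketch below keeps sources whose flux is at least three times its 1-sigma uncertainty in at least one band; the table and column names (flux_b, flux_err_b, ...) are hypothetical placeholders, not actual CSC column names.

```python
import pandas as pd

# Hypothetical source table; real CSC column names differ.
sources = pd.DataFrame({
    "name":       ["src1", "src2", "src3"],
    "flux_b":     [5.0e-14, 1.2e-15, 5.0e-15],   # broad-band flux (erg/cm^2/s)
    "flux_err_b": [1.0e-14, 1.0e-15, 2.0e-15],   # 1-sigma uncertainty
    "flux_h":     [2.0e-14, 4.0e-15, 1.0e-15],   # hard-band flux
    "flux_err_h": [8.0e-15, 1.0e-15, 1.0e-15],
})

bands = ["b", "h"]
# Keep a source if flux >= 3 * (1-sigma error) in at least one energy band.
significant = sources[[f"flux_{b}" for b in bands]].values >= \
              3 * sources[[f"flux_err_{b}" for b in bands]].values
catalog = sources[significant.any(axis=1)]
print(catalog["name"].tolist())   # ['src1', 'src2']
```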

    Synthetic aperture radar/LANDSAT MSS image registration

    Algorithms and procedures necessary to merge aircraft synthetic aperture radar (SAR) and LANDSAT multispectral scanner (MSS) imagery were determined. The design of a SAR/LANDSAT data merging system was developed. Aircraft SAR images were registered to the corresponding LANDSAT MSS scenes and were the subject of experimental investigations. Results indicate that the registration of SAR imagery with LANDSAT MSS imagery is feasible from a technical viewpoint and useful from an information-content viewpoint.
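    The original procedures predate modern feature-based pipelines; purely as an illustration of image-to-image registration, here is a minimal sketch using ORB features and a RANSAC-estimated homography in OpenCV. The file names are placeholders, and robust SAR-to-optical matching in practice typically requires more specialized similarity measures than generic descriptors.

```python
import cv2
import numpy as np

# Placeholder file names; any co-located SAR and MSS scenes would do.
sar = cv2.imread("sar_scene.png", cv2.IMREAD_GRAYSCALE)
mss = cv2.imread("landsat_mss_band.png", cv2.IMREAD_GRAYSCALE)

# Detect and describe keypoints in both images.
orb = cv2.ORB_create(nfeatures=2000)
kp_sar, des_sar = orb.detectAndCompute(sar, None)
kp_mss, des_mss = orb.detectAndCompute(mss, None)

# Match descriptors and estimate a projective transform with RANSAC.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_sar, des_mss), key=lambda m: m.distance)
src = np.float32([kp_sar[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_mss[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Resample the SAR image into the MSS geometry for merging.
registered = cv2.warpPerspective(sar, H, (mss.shape[1], mss.shape[0]))
```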

    Extraction of Key-Frames from an Unstable Video Feed

    The APOLI project deals with automated power line inspection using highly automated unmanned aerial systems. Besides real-time damage assessment through on-board exploitation of high-resolution image data, post-processing of the video data is necessary. This Master's thesis deals with the implementation of an isolator-detector framework and a workflow in the Automotive Data and Time-Triggered Framework (ADTF) that loads a video directly from a camera or from storage and extracts the key frames that contain objects of interest. This is done by implementing an object-detection system in C++ and creating ADTF filters that detect the objects of interest and extract the key frames using a supervised learning platform. The use case is the extraction of frames from video samples that contain images of isolators on power transmission lines.
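    The thesis implements this pipeline in C++ as ADTF filters; purely as an illustration of the key-frame logic (a frame is kept whenever the detector fires), here is a minimal Python sketch. detect_isolators is a hypothetical stand-in for the trained model, and the sampling stride is an assumed parameter.

```python
import cv2

def detect_isolators(frame):
    """Hypothetical stand-in for the trained isolator detector.

    A real implementation would run the supervised model and return
    bounding boxes for detected isolators.
    """
    return []  # replace with real model inference

def extract_key_frames(video_path, stride=10):
    """Scan a video and keep frames in which isolators are detected."""
    cap = cv2.VideoCapture(video_path)
    key_frames, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Only examine every `stride`-th frame to limit detector cost.
        if index % stride == 0 and detect_isolators(frame):
            key_frames.append((index, frame))
        index += 1
    cap.release()
    return key_frames
```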

    Pattern Recognition, Tracking and Vertex Reconstruction in Particle Detectors

    The book describes methods of track and vertex reconstruction in particle detectors. The main topics are pattern recognition and statistical estimation of geometrical and physical properties of charged particles and of interaction and decay vertices.

    Muon g-2: Review of Theory and Experiment

    A review of the experimental and theoretical determinations of the anomalous magnetic moment of the muon is given. The anomaly is defined by a=(g-2)/2, where the Landé g-factor is the proportionality constant that relates the spin to the magnetic moment. For the muon, as well as for the electron and tauon, the anomaly a differs slightly from zero (of order 10^{-3}) because of radiative corrections. In the Standard Model, contributions to the anomaly come from virtual `loops' containing photons and the known massive particles. The relative contribution from heavy particles scales as the square of the lepton mass over the heavy mass, leading to small differences in the anomaly for e, \mu, and \tau. If there are heavy new particles outside the Standard Model which couple to photons and/or leptons, the relative effect on the muon anomaly will be \sim (m_\mu/ m_e)^2 \approx 43\times 10^3 larger compared with the electron anomaly. Because both the theoretical and experimental values of the muon anomaly are determined to high precision, it is an excellent place to search for the effects of new physics, or to constrain speculative extensions to the Standard Model. Details of the current theoretical evaluation, and of the series of experiments that culminates with E821 at the Brookhaven National Laboratory, are given. At present the theoretical and the experimental values are known with a similar relative precision of 0.5 ppm. There is, however, a 3.4 standard deviation difference between the two, strongly suggesting the need for continued experimental and theoretical study. Comment: 103 pages, 57 figures; submitted to Reports on Progress in Physics. Final version as published; several minor clarifications to text, and a number of references were corrected.
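    The quoted enhancement factor is simple arithmetic on the lepton masses. With m_mu roughly 105.66 MeV and m_e roughly 0.511 MeV, (m_mu/m_e)^2 comes out near 4.3 x 10^4, as the sketch below checks.

```python
# Lepton masses in MeV/c^2 (rounded CODATA values).
m_mu, m_e = 105.658, 0.511

ratio = (m_mu / m_e) ** 2   # sensitivity enhancement of a_mu relative to a_e
print(f"(m_mu/m_e)^2 = {ratio:.3g}")   # ~4.28e4, i.e. ~43 x 10^3
```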

    Pattern Recognition, Tracking and Vertex Reconstruction in Particle Detectors

    This open access book is a comprehensive review of the methods and algorithms that are used in the reconstruction of events recorded by past, running and planned experiments at particle accelerators such as the LHC, SuperKEKB and FAIR. The main topics are pattern recognition for track and vertex finding, solving the equations of motion by analytical or numerical methods, treatment of material effects such as multiple Coulomb scattering and energy loss, and the estimation of track and vertex parameters by statistical algorithms. The material covers both established methods and recent developments in these fields and illustrates them by outlining exemplary solutions developed by selected experiments. The clear presentation enables readers to easily implement the material in a high-level programming language. It also highlights software solutions that are in the public domain whenever possible. It is a valuable resource for PhD students and researchers working on online or offline reconstruction for their experiments.
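    As a flavor of the statistical estimation covered here, the Kalman filter is the workhorse of progressive track fitting. The sketch below is a generic linear Kalman fit of a 1D straight-line track (state = intercept and slope), not an excerpt from the book; a real track fit would add multiple scattering as process noise and use the full track parameterization.

```python
import numpy as np

def kalman_track_fit(z_planes, measurements, sigma_meas=0.1):
    """Fit a straight-line track x(z) = x0 + slope*z with a Kalman filter.

    z_planes     : z positions of the detector planes
    measurements : measured x positions on those planes
    sigma_meas   : measurement resolution (same units as x)
    """
    state = np.zeros(2)                   # [x0, slope], vague initial guess
    cov = np.eye(2) * 1e6                 # large covariance = uninformative
    R = sigma_meas ** 2
    for z, m in zip(z_planes, measurements):
        H = np.array([1.0, z])            # projects the state onto a measurement
        S = H @ cov @ H + R               # innovation variance
        K = cov @ H / S                   # Kalman gain
        state = state + K * (m - H @ state)
        cov = cov - np.outer(K, H @ cov)
    return state, cov

state, cov = kalman_track_fit([0, 1, 2, 3], [0.05, 1.02, 1.98, 3.01])
print("x0 = %.3f, slope = %.3f" % tuple(state))
```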

    Automatic Alignment of 3D Multi-Sensor Point Clouds

    Automatic 3D point cloud alignment is a major research topic in photogrammetry, computer vision and computer graphics. In this research, two keypoint feature matching approaches have been developed and proposed for the automatic alignment of 3D point clouds acquired from different sensor platforms and expressed in different 3D conformal coordinate systems. The first proposed approach is based on 3D keypoint feature matching. First, surface curvature information is utilized for scale-invariant 3D keypoint extraction. Adaptive non-maxima suppression (ANMS) is then applied to retain the most distinct and well-distributed set of keypoints. Afterwards, every keypoint is characterized by a scale-, rotation- and translation-invariant 3D surface descriptor, called the radial geodesic distance-slope histogram. Similar keypoint descriptors on the source and target datasets are then matched using bipartite graph matching, followed by a modified RANSAC for outlier removal. The second proposed method is based on 2D keypoint matching performed on height map images of the 3D point clouds. Height map images are generated by projecting the 3D point clouds onto a planimetric plane. Afterwards, a multi-scale wavelet 2D keypoint detector with ANMS is proposed to extract keypoints on the height maps. Then, a scale-, rotation- and translation-invariant 2D descriptor, referred to as the Gabor, Log-Polar-Rapid Transform descriptor, is computed for all keypoints. Finally, source and target height map keypoint correspondences are determined using bi-directional nearest-neighbour matching, together with the modified RANSAC for outlier removal. Each method is assessed on multi-sensor, urban and non-urban 3D point cloud datasets. Results show that, unlike the 3D-based method, the height map-based approach is able to align source and target datasets with differences in point density, point distribution and missing point data. Findings also show that the 3D-based method obtained lower transformation errors and a greater number of correspondences when the source and target have similar point characteristics. The 3D-based approach attained absolute mean alignment differences in the range of 0.23 m to 2.81 m, whereas the height map approach ranged from 0.17 m to 1.21 m. These differences meet the proximity requirements set by the data characteristics and support the further application of fine co-registration approaches.
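    The height-map generation step is straightforward to sketch: the point cloud is projected onto a planimetric (x, y) grid, keeping one elevation per cell. The grid resolution and the max-z rule below are illustrative assumptions, not the thesis's exact rasterization.

```python
import numpy as np

def height_map(points, cell=0.5):
    """Rasterize a 3D point cloud (N x 3 array) into a planimetric height map.

    Each point falls into an (x, y) cell of size `cell`; the highest z
    in a cell becomes its pixel value, and empty cells stay NaN.
    """
    xy_min = points[:, :2].min(axis=0)
    ij = np.floor((points[:, :2] - xy_min) / cell).astype(int)
    shape = ij.max(axis=0) + 1
    hmap = np.full(shape, np.nan)
    # np.fmax ignores NaN, so the first point in a cell always wins.
    np.fmax.at(hmap, (ij[:, 0], ij[:, 1]), points[:, 2])
    return hmap

pts = np.random.rand(10000, 3) * [100.0, 100.0, 10.0]   # synthetic cloud
img = height_map(pts, cell=1.0)
```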