
    Segmentation of Fault Networks Determined from Spatial Clustering of Earthquakes

    We present a new method of data clustering applied to earthquake catalogs, with the goal of reconstructing the seismically active part of fault networks. We first use an original method to separate clustered events from uncorrelated seismicity, based on the distribution of volumes of tetrahedra defined by closest-neighbor events in the original and randomized seismic catalogs. The spatial disorder of the complex geometry of fault networks is then taken into account by defining faults as probabilistic anisotropic kernels, whose structures are motivated by properties of discontinuous tectonic deformation and by previous empirical observations of the geometry of faults and of earthquake clusters at many spatial and temporal scales. Combining this a priori knowledge with information-theoretic arguments, we propose a Gaussian mixture approach implemented in an Expectation-Maximization (EM) procedure. A cross-validation scheme is then used to determine the number of kernels that provides an optimal data clustering of the catalog. This three-step approach is applied to a high-quality relocated catalog of the seismicity following the 1986 Mount Lewis (M_l = 5.7) event in California and reveals that events cluster along planar patches of about 2 km^2, i.e. comparable to the size of the main event. The finite thickness of those clusters (about 290 m) suggests that events do not occur on well-defined Euclidean fault core surfaces, but rather that the damage zone surrounding faults may be seismically active at depth. Finally, we propose a connection between our methodology and multi-scale spatial analysis, based on the derivation of a spatial fractal dimension of about 1.8 for the set of hypocenters in the Mount Lewis area, consistent with recent observations on relocated catalogs.
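    As a rough sketch of the mixture-plus-cross-validation step described above (not the authors' code), the snippet below fits full-covariance Gaussian mixtures to a placeholder array of hypocenter coordinates using scikit-learn's EM implementation and picks the number of kernels by held-out log-likelihood; the `hypocenters` array and the candidate range are illustrative assumptions.

```python
# Sketch: anisotropic Gaussian kernels fitted by EM, number of kernels chosen
# by cross-validated held-out log-likelihood. Hypocenters are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
hypocenters = np.vstack([
    rng.normal([0, 0, 8], [2.0, 0.3, 1.0], size=(250, 3)),  # one planar-ish cluster
    rng.normal([5, 4, 9], [0.3, 2.0, 1.0], size=(250, 3)),  # another, differently oriented
])

def cv_score(points, n_kernels, n_splits=5):
    """Mean held-out log-likelihood for a mixture with n_kernels components."""
    scores = []
    for train_idx, test_idx in KFold(n_splits, shuffle=True, random_state=0).split(points):
        gmm = GaussianMixture(n_components=n_kernels, covariance_type="full",
                              random_state=0).fit(points[train_idx])
        scores.append(gmm.score(points[test_idx]))  # per-sample log-likelihood
    return np.mean(scores)

# Select the number of fault kernels that maximizes the cross-validated score.
candidates = range(1, 11)
best_k = max(candidates, key=lambda k: cv_score(hypocenters, k))
faults = GaussianMixture(n_components=best_k, covariance_type="full",
                         random_state=0).fit(hypocenters)
print(best_k)  # each full covariance acts as an anisotropic fault kernel
```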

    Medical ultrasonic tomographic system

    An electro-mechanical scanning assembly was designed and fabricated for the purpose of generating an ultrasound tomogram. A low-cost modality was demonstrated in which analog instrumentation methods formed a tomogram on photographic film. Successful tomogram reconstructions were obtained on in vitro test objects by using the attenuation of the first-path ultrasound signal as it passed through the test object. The tomographic methods used in X-ray analysis for nearly half a century were verified as being useful for ultrasound imaging.
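    For illustration only, the snippet below mimics the film-summation tomogram digitally: each first-path amplitude is converted to a line-integral attenuation in dB and smeared back along its ray (unfiltered back-projection). The parallel-ray geometry and the amplitude values are assumptions, not the analog instrumentation described above.

```python
# Toy digital analogue of a film-summed attenuation tomogram.
import numpy as np

def backproject(sinogram, angles_deg, size):
    """Unfiltered back-projection of parallel-ray attenuation profiles."""
    ys, xs = np.mgrid[:size, :size] - (size - 1) / 2.0
    image = np.zeros((size, size))
    for profile, theta in zip(sinogram, np.deg2rad(angles_deg)):
        # signed distance of each pixel from the ray through the origin
        t = xs * np.cos(theta) + ys * np.sin(theta)
        idx = np.clip(np.round(t + len(profile) / 2).astype(int), 0, len(profile) - 1)
        image += profile[idx]
    return image / len(angles_deg)

# placeholder data: attenuation (dB) = 20*log10(reference_amplitude / measured_amplitude)
angles = np.arange(0, 180, 10)
measured = np.full((len(angles), 65), 0.9)   # placeholder first-path amplitudes
attenuation_db = 20 * np.log10(1.0 / measured)
print(backproject(attenuation_db, angles, 65).shape)
```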

    Signal Processing

    Contains reports on three research projects. Joint Services Electronics Programs (U.S. Army, U.S. Navy, and U.S. Air Force) under Contract DAAB07-71-C-0300; U.S. Coast Guard (Contract DOT-CG-13446-A); M.I.T. Lincoln Laboratory Purchase Order CC-57.

    Simultaneous Image Registration and Monocular Volumetric Reconstruction of a fluid flow

    We propose to combine image registration and volumetric reconstruction from a monocular video of a draining Hele-Shaw cell filled with water. A Hele-Shaw cell is a tank whose depth is small (e.g. 1 mm) compared to its other dimensions (e.g. 400 x 800 mm^2). We use a technique known as molecular tagging, which consists in marking a pattern in the fluid by photobleaching and then tracking its deformations. The evolution of the pattern is filmed with a camera whose principal axis coincides with the depth of the cell. The velocity of the fluid along this direction is not constant. Consequently, tracking the pattern cannot be achieved with classical methods, because what is observed is the integration of the marked particles over the entire depth of the cell. The proposed approach is built on top of classical direct image registration, into which we incorporate a volumetric image formation model. It allows us to accurately measure the motion and the velocity profiles for the entire volume (including the depth of the cell), which is usually hard to achieve. The results we obtain are consistent with the theoretical hydrodynamic behaviour of this flow, known as laminar Poiseuille flow.
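    A toy sketch of the depth-integrated image formation idea mentioned above: each depth layer of a synthetic tagged pattern is advected by a parabolic (Poiseuille) velocity profile, and the camera observes the average over depth. The pattern, the displacement scale, and the nearest-neighbour shifting are illustrative assumptions, not the authors' registration pipeline.

```python
# Depth-integrated observation of a tagged pattern advected by a Poiseuille profile.
import numpy as np

H, W, LAYERS = 64, 64, 21                  # image size and number of depth samples
x = np.linspace(0, 2 * np.pi, W)
pattern = (np.sin(4 * x)[None, :] > 0).astype(float) * np.ones((H, 1))  # tagged stripes

z = np.linspace(-1.0, 1.0, LAYERS)         # normalized depth across the cell gap
u_max = 3.0                                # peak in-plane displacement (pixels), assumed
u = u_max * (1.0 - z ** 2)                 # laminar Poiseuille profile: zero at the walls

observed = np.zeros((H, W))
for displacement in u:
    # shift each depth layer horizontally by its depth-dependent displacement
    cols = (np.arange(W) - displacement) % W
    observed += pattern[:, np.floor(cols).astype(int)]
observed /= LAYERS                         # the camera sees the average over the depth

print(observed.shape, observed.min(), observed.max())
```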

    Predicting climate change using response theory: global averages and spatial patterns

    The provision of accurate methods for predicting the climate response to anthropogenic and natural forcings is a key contemporary scientific challenge. Using a simplified and efficient open-source general circulation model of the atmosphere featuring O(10^5) degrees of freedom, we show how it is possible to approach such a problem using nonequilibrium statistical mechanics. Response theory allows one to practically compute the time-dependent measure supported on the pullback attractor of the climate system, whose dynamics is non-autonomous as a result of time-dependent forcings. We propose a simple yet efficient method for predicting, at any lead time and in an ensemble sense, the change in climate properties resulting from an increase in the concentration of CO2, using test perturbation model runs. We assess the strengths and limitations of response theory in predicting the changes in the globally averaged values of surface temperature and of the yearly total precipitation, as well as in their spatial patterns. The quality of the predictions obtained for the surface temperature fields is rather good, while in the case of precipitation a good skill is observed only for the global average. We also show how it is possible to accurately define concepts like the inertia of the climate system, and to predict when climate change becomes detectable given a forcing scenario. Our analysis can be extended to deal with more complex portfolios of forcings and can be adapted to treat, in principle, any climate observable. Our conclusion is that climate change is indeed a problem that can be effectively seen through a statistical mechanical lens, and that there is great potential for optimizing the current coordinated modelling exercises run for the preparation of the subsequent reports of the Intergovernmental Panel on Climate Change.
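    A minimal sketch of the prediction step implied by linear response theory, assuming a synthetic observable rather than GCM output: the Green's function is estimated as the time derivative of the ensemble-mean response to a step forcing (the "test perturbation" runs), and the response to another forcing scenario is then obtained by convolution.

```python
# Linear response prediction: Green's function from a step-forcing experiment,
# then convolution with an arbitrary forcing scenario. All data are synthetic.
import numpy as np

dt = 1.0                                    # years
t = np.arange(0, 200, dt)

# Ensemble-mean response of an observable (e.g. global-mean temperature anomaly)
# to a unit step forcing, here mimicked by an exponential relaxation.
step_response = 1.0 - np.exp(-t / 30.0)

# Green's function = time derivative of the step response.
green = np.gradient(step_response, dt)

def predict(forcing):
    """Predicted anomaly: convolution of the Green's function with the forcing."""
    return np.convolve(green, forcing)[: len(forcing)] * dt

ramp_forcing = np.minimum(t / 70.0, 1.0)    # ramp-then-stabilize scenario (assumed)
anomaly = predict(ramp_forcing)
print(anomaly[-1])                          # long-time anomaly approaches the equilibrium response
```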

    Novel Approaches in Structured Light Illumination

    Among the various approaches to 3-D imaging, structured light illumination (SLI) is widely used. SLI employs a digital projector paired with a digital camera, such that correspondences can be found based on the projection and capture of a set of designed light patterns. As an active sensing method, SLI is known for its robustness and high accuracy. In this dissertation, I study the phase shifting method (PSM), which is one of the most widely employed strategies in SLI, and propose three novel approaches to PSM. First, by regarding the design of patterns as placing points in an N-dimensional space, I take phase measuring profilometry (PMP) as an example and propose an edge-pattern strategy that achieves the maximum signal-to-noise ratio (SNR) for the projected patterns. Second, I develop a novel period-information-embedded pattern strategy for fast, reliable 3-D data acquisition and reconstruction. The proposed period-coded phase shifting strategy removes the depth ambiguity associated with traditional phase shifting patterns without reducing phase accuracy or increasing the number of projected patterns, so it can be employed in high-accuracy real-time 3-D systems. Third, I propose a hybrid approach for high-quality 3-D reconstruction with only a small number of illumination patterns, by maximizing the use of correspondence information from the phase, texture, and modulation data derived from multi-view, PMP-based SLI images, without rigorously synchronizing the cameras and projectors or calibrating the device gammas. Experimental results demonstrate the advantages of the proposed strategies for 3-D SLI systems.
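    The baseline that these strategies build on is standard N-step phase shifting. The sketch below simulates the captured images rather than using a projector/camera pair, and recovers the wrapped phase and the modulation with the usual least-squares estimator; pattern size, amplitudes, and N are assumptions for illustration.

```python
# Standard N-step phase-shifting recovery on simulated captures.
import numpy as np

N = 4                                          # number of phase-shifted patterns
H, W = 48, 64
x = np.linspace(0, 4 * np.pi, W)
true_phase = np.tile(x, (H, 1))                # phase field to be recovered

A, B = 0.5, 0.4                                # ambient intensity + modulation amplitude
shifts = 2 * np.pi * np.arange(N) / N
captured = np.stack([A + B * np.cos(true_phase - s) for s in shifts])

# Least-squares phase estimate from the N captured images.
num = np.sum(captured * np.sin(shifts)[:, None, None], axis=0)
den = np.sum(captured * np.cos(shifts)[:, None, None], axis=0)
wrapped = np.arctan2(num, den)                 # wrapped to (-pi, pi]; unwrapping removes the ambiguity
modulation = 2.0 / N * np.sqrt(num ** 2 + den ** 2)   # data-quality measure (recovers B)

print(np.allclose(np.cos(wrapped), np.cos(true_phase), atol=1e-6))
```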

    Spatial Resolution Analysis of a Variable Resolution X-ray Cone-beam Computed Tomography System

    A new cone-beam computed tomography (CBCT) system is designed and implemented that can adaptively provide high-resolution CT images for objects of different sizes. The new system, called Variable Resolution X-ray Cone-beam CT (VRX-CBCT), uses a CsI-based amorphous silicon flat panel detector (FPD) that can tilt about its horizontal (u) axis and vertical (v) axis independently. The detector angulation improves the spatial resolution of the CT images by changing the effective size of each detector cell. Two components of the spatial resolution of the system, namely the transverse and axial modulation transfer functions (MTF), are analyzed in three different situations: (1) when the FPD is tilted only about its vertical axis (v), (2) when the FPD is tilted only about its horizontal axis (u), and (3) when the FPD is tilted isotropically about both its vertical and horizontal axes. Custom calibration and MTF phantoms were designed and used to calibrate the system and to measure its spatial resolution for each case described above. A new 3D reconstruction algorithm was also developed and tested for the VRX-CBCT system, which improved the overall resolution of the system compared to an FDK-based algorithm.
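    One common way to turn a measured edge profile into an MTF curve (edge spread function, then line spread function, then normalized FFT magnitude) is sketched below; the edge profile and the effective pixel pitch are synthetic stand-ins for the phantom measurements, not data from the system described above.

```python
# MTF from an edge spread function: ESF -> LSF (derivative) -> normalized |FFT|.
import numpy as np

pixel_pitch_mm = 0.2                       # effective detector cell size (assumed)
x = np.arange(-64, 64) * pixel_pitch_mm
esf = 0.5 * (1 + np.tanh(x / 0.3))         # synthetic blurred edge profile

lsf = np.gradient(esf, pixel_pitch_mm)     # line spread function
lsf /= lsf.sum() * pixel_pitch_mm          # normalize to unit area

mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                              # normalize so that MTF(0) = 1
freqs = np.fft.rfftfreq(len(lsf), d=pixel_pitch_mm)   # spatial frequency in cycles/mm

# frequency at which the modulation drops to 10% (a common resolution figure)
f10 = freqs[np.argmax(mtf < 0.1)] if np.any(mtf < 0.1) else None
print(f10)
```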