
    Global optimization of data quality checks on 2‐D and 3‐D networks of GPR cross‐well tomographic data for automatic correction of unknown well deviations

    Significant errors related to poor time-zero estimation, well deviation, or mislocation of the transmitter (TX) and receiver (RX) stations can render even the most sophisticated modeling and inversion routine useless. Previous examples of methods for the analysis and correction of data errors in geophysical tomography include the works of Maurer and Green (1997), Squires et al. (1992) and Peterson (2001). Here we follow the analysis and techniques of Peterson (2001) for data quality control and error correction. Through our data acquisition and quality control procedures we have very accurate control on the surface locations of wells, the travel distance of both the transmitter and receiver within the boreholes, and the change in apparent zero time. However, we often have poor control on well deviations, either because economic constraints or the nature of the borehole itself prevented the acquisition of well deviation logs; deviation logs themselves can also contain significant errors. Problems with borehole deviations can be diagnosed prior to inversion of travel-time tomography data sets by plotting the apparent velocity of a straight ray connecting a TX to an RX against the take-off angle of the ray. Issues with the time-zero pick or the distance between wells appear as symmetric smiles or frowns in these QC plots, whereas well deviation, or strong anisotropy with a dipping axis, produces an asymmetric correlation between apparent velocity and take-off angle (Figure 1-B). In addition, when a network of interconnected GPR tomography data is available, one has the further quality constraint of ensuring that velocity is continuous between immediately adjacent tomograms; a sudden shift in the mean velocity indicates that either position deviations are present or there is a shift in the pick times. Small errors in well geometry may be treated effectively during inversion by including weighting, or relaxation, parameters in the inversion (e.g. Bautu et al., 2006). In the technique of algebraic reconstruction tomography (ART), used herein for the travel-time inversion (Peterson et al., 1985), a small relaxation parameter will smooth imaging artifacts caused by data errors at the expense of resolution and contrast (Figure 2). However, large data errors such as unaccounted-for well deviations cannot be adequately suppressed through inversion weighting schemes. Mislocation of the TX and RX stations can lead to serious imaging artifacts if not accounted for prior to inversion, and such problems were previously treated manually; in large data sets and/or networks of data sets, trial-and-error changes to well geometries become increasingly difficult and ineffective. Our approach is to use cross-well data quality checks and a simplified model of borehole deviation with particle swarm optimization (PSO) to automatically correct source and receiver locations prior to tomographic inversion. We present a simple model of well deviation, which is designed to minimize potential corruption of actual data trends.
    We also provide quantitative quality-control measures based on minimizing the correlation between take-off angle and apparent velocity, together with a check on the continuity of velocity between adjacent wells. This methodology is shown to be accurate and robust for simple 2-D synthetic test cases, and we demonstrate the method on actual field data, where it is compared to deviation logs. This study shows the promise of automatic correction of well deviations in GPR tomographic data. Analysis of synthetic data shows that very precise estimates of well deviation can be made for small deviations, even in the presence of static data errors. However, the analysis of the synthetic data and the application of the method to a large network of field data show that the technique is sensitive to data errors that vary between neighboring tomograms.
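
    A minimal sketch of the straight-ray QC measure described above (our illustration, not the authors' code; the function and array names are assumptions): for each TX-RX pair, compute the apparent velocity of a straight ray and its take-off angle, then score a candidate well geometry by the correlation between the two.

    import numpy as np

    def qc_correlation(tx_xyz, rx_xyz, travel_times):
        # tx_xyz, rx_xyz: (n, 3) TX and RX station coordinates for one tomogram;
        # travel_times: (n,) first-arrival picks. Units only need to be consistent.
        d = rx_xyz - tx_xyz                      # straight-ray vectors
        ray_len = np.linalg.norm(d, axis=1)      # straight-ray path lengths
        v_app = ray_len / travel_times           # apparent straight-ray velocities
        # Take-off angle measured from horizontal: 0 for a level ray.
        takeoff = np.degrees(np.arctan2(d[:, 2], np.linalg.norm(d[:, :2], axis=1)))
        # Near-zero correlation is consistent with good geometry; a pronounced
        # asymmetric trend flags well deviation or dipping anisotropy.
        return np.corrcoef(takeoff, v_app)[0, 1]

    In the workflow described above, the magnitude of such a correlation, together with a penalty on mean-velocity jumps between adjacent tomograms, could plausibly serve as the objective that PSO minimizes while perturbing a simple parametric model of well deviation.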

    Global water cycle

    This research is the MSFC component of a joint MSFC/Pennsylvania State University Eos Interdisciplinary Investigation on the global water cycle and its extension across the Earth sciences. The primary long-term objective of this investigation is to determine the scope of the global water cycle and its interactions with all components of the Earth system, and to understand how it stimulates and regulates change on both global and regional scales. Significant accomplishments in the past year are presented and include the following: (1) water vapor variability; (2) multi-phase water analysis; (3) global modeling; and (4) optimal precipitation and streamflow analysis and hydrologic processes.

    Scaling laws governing stochastic growth and division of single bacterial cells

    Uncovering the quantitative laws that govern the growth and division of single cells remains a major challenge. Using a unique combination of technologies that yields unprecedented statistical precision, we find that the sizes of individual Caulobacter crescentus cells increase exponentially in time. We also establish that they divide upon reaching a critical multiple (≈1.8) of their initial sizes, rather than an absolute size. We show that when the temperature is varied, the growth and division timescales scale proportionally with each other over the physiological temperature range. Strikingly, the cell-size and division-time distributions can both be rescaled by their mean values such that the condition-specific distributions collapse onto universal curves. We account for these observations with a minimal stochastic model based on an autocatalytic cycle. It predicts the scalings, as well as specific functional forms for the universal curves. Our experimental and theoretical analysis reveals a simple physical principle governing these complex biological processes: a single temperature-dependent scale of cellular time governs the stochastic dynamics of growth and division in balanced growth conditions.
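
    A toy simulation consistent with the reported rule (our illustration under stated assumptions, not the paper's model): a cell grows exponentially, s(t) = s0·exp(kt), and divides on reaching a critical multiple r ≈ 1.8 of its birth size, so its division time is t_div = ln(r)/k. The growth-rate noise level below is an arbitrary choice for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_division_times(n_cells=10_000, r=1.8, k_mean=1.0, k_cv=0.1):
        # Per-cell exponential growth rates with modest cell-to-cell noise.
        k = rng.normal(k_mean, k_cv * k_mean, n_cells)
        # s(t) = s0 * exp(k t) reaches r * s0 at t_div = ln(r) / k.
        return np.log(r) / k

    t_div = simulate_division_times()
    # Mean-rescaling: per the abstract, condition-specific distributions of
    # t_div / mean(t_div) should collapse onto a universal curve.
    t_rescaled = t_div / t_div.mean()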

    Verification of model simulated mass balance, flow fields and tabular calving events of the Antarctic ice sheet against remotely sensed observations

    The Antarctic ice sheet (AIS) has the greatest potential for global sea level rise. This study simulates AIS ice creeping, sliding, and tabular calving, and estimates the total mass balance, using a recently developed, advanced ice dynamics model known as SEGMENT-Ice. SEGMENT-Ice is written in a spherical Earth coordinate system. Because the AIS contains the South Pole, a projection transfer is performed to displace the pole outside of the simulation domain. The AIS also has complex ice-water-granular material-bedrock configurations, requiring sophisticated lateral and basal boundary conditions. Because of the prevalence of ice shelves, a ‘girder yield’ type calving scheme is activated. The simulations of present surface ice flow velocities compare favorably with InSAR measurements for various ice-water-bedrock configurations. The estimated ice mass loss rate during 2003–2009 agrees with GRACE measurements and provides more spatial detail than the latter. The model-estimated calving frequencies of the peripheral ice shelves from 1996 (roughly when the 5-km digital elevation and thickness data for the shelves were collected) to 2009 compare well with archived scatterometer images. SEGMENT-Ice's unique, non-local systematic calving scheme is found to be relevant for tabular calving. However, the exact timing of calving and the sizes of icebergs cannot be simulated accurately at present. A projection of the future mass change of the AIS is made, with SEGMENT-Ice forced by atmospheric conditions from three different coupled general circulation models. The entire AIS is estimated to be losing mass steadily at a rate of ~120 km³/a at present, and this rate may possibly double by year 2100.

    Seismogenic zone structure of the southern Middle America Trench, Costa Rica

    The shallow seismogenic portion of subduction zones generates damaging large and great earthquakes. This study provides structural constraints on the seismogenic zone of the Middle America Trench offshore central Costa Rica and insights into the physical and mechanical characteristics controlling seismogenesis. We have located ~300 events that occurred following the Mw 6.9, 20 August 1999, Quepos, Costa Rica, underthrusting earthquake, using a three-dimensional velocity model and arrival time data recorded by a temporary local network of land and ocean bottom seismometers. We use the aftershock locations to define the geometry and characteristics of the seismogenic zone in this region. These events define a plane dipping at 19° that marks the interface between the Cocos Plate and the Panama Block. The majority of aftershocks occur below 10 km and above 30 km depth below sea level, corresponding to 30–35 km and 95 km from the trench axis, respectively. Relative event relocation produces a seismicity pattern similar to that obtained using absolute locations, increasing confidence in the geometry of the seismogenic zone. The aftershock locations spatially correlate with the downdip extension of the oceanic Quepos Plateau and reflect the structure of the main shock rupture asperity. This strengthens an earlier argument that the 1999 Quepos earthquake ruptured specific bathymetric highs on the downgoing plate. We believe that subduction of this highly disrupted seafloor has established a set of conditions that presently limits the seismogenic zone to between 10 and 35 km below sea level.
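
    As a hedged sketch of how a dip such as the reported 19° can be recovered from hypocenters (our illustration, not the authors' processing; the coordinate layout and events array are assumptions), one can least-squares fit a plane z = ax + by + c to the event locations and take the dip from the fitted gradient:

    import numpy as np

    def plane_dip_deg(events_xyz):
        # events_xyz: (n, 3) hypocenters as (x, y, depth), e.g. in km with
        # depth positive downward. Fit z = a*x + b*y + c by least squares.
        x, y, z = events_xyz.T
        A = np.column_stack([x, y, np.ones_like(x)])
        (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
        # Dip is the angle between the fitted plane and the horizontal.
        return np.degrees(np.arctan(np.hypot(a, b)))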

    Impact of electronic medical record on physician practice in office settings: a systematic review

    Background: Increased investments are being made in electronic medical records (EMRs) in Canada. There is a need to learn from earlier EMR studies about their impact on physician practice in office settings. To address this need, we conducted a systematic review to examine the impact of EMRs in the physician office, the factors that influenced their success, and the lessons learned.

    Results: For this review we included publications cited in Medline and CINAHL between 2000 and 2009 on physician office EMRs. Studies were included if they evaluated the impact of EMRs on physician practice in office settings. The Clinical Adoption Framework provided a conceptual scheme to make sense of the findings and to allow future comparison and alignment with other Canadian eHealth initiatives. In the final selection, we included 27 controlled and 16 descriptive studies. We examined six areas: prescribing support, disease management, clinical documentation, work practice, preventive care, and patient-physician interaction. Overall, 22/43 studies (51.2%) and 50/109 individual measures (45.9%) showed positive impacts, 18.6% of studies and 18.3% of measures showed negative impacts, and the remainder showed no effect. Forty-eight distinct factors were identified that influenced EMR success. Several lessons learned were repeated across studies: (a) having robust EMR features that support clinical use; (b) redesigning EMR-supported work practices for optimal fit; (c) demonstrating value for money; (d) having realistic expectations of implementation; and (e) engaging patients in the process.

    Conclusions: Currently there is limited positive EMR impact in the physician office. To improve EMR success, one needs to draw on the lessons from previous studies, such as those in this review.

    Data publication with the structural biology data grid supports live analysis

    Access to experimental X-ray diffraction image data is fundamental for the validation and reproduction of macromolecular models and indispensable for the development of structural biology processing methods. Here, we established a diffraction data publication and dissemination system, the Structural Biology Data Grid (SBDG; data.sbgrid.org), to preserve primary experimental data sets that support scientific publications. Data sets are accessible to researchers through a community-driven data grid, which facilitates global data access. Our analysis of a pilot collection of crystallographic data sets demonstrates that the information archived by SBDG is sufficient to reprocess data to statistics that meet or exceed the quality of the original published structures. SBDG has extended its services to the entire community and is being used to develop support for other types of biomedical data sets. It is anticipated that access to the experimental data sets will enhance the paradigm shift in the community towards a much more dynamic body of continuously improving data analysis.
