
    Comparison of Time Series and Random-Vibration Theory Site-Response Methods

    The random-vibration theory (RVT) approach to equivalent-linear site-response analysis is often used to simulate site amplification, particularly when large numbers of simulations are required for incorporation into probabilistic seismic-hazard analysis. The fact that RVT site-response analysis does not require the specification of input time series makes it an attractive alternative to other site-response methods. However, some studies have indicated that the site amplification predicted by RVT site-response analysis systematically differs from that predicted by time-series approaches. This study confirms that RVT site-response analysis predicts site amplification at the natural site frequencies as much as 20%-50% larger than time-series analysis, with the largest overprediction occurring for sites with smaller natural frequencies and sites underlain by hard rock. The overprediction is caused by an increase in duration generated by the site response, which is not taken into account in the RVT calculation. Correcting for this change in duration brings the RVT results within 20% of the time-series results. A similar duration effect is observed for the RVT shear-strain calculation used to estimate the equivalent-linear strain-compatible soil properties. An alternative to applying a duration correction to improve the agreement between RVT and time-series analysis is the modeling of shear-wave velocity variability. It is shown that introducing shear-wave velocity variability through Monte Carlo simulation brings the RVT results consistently within ±20% of the time-series results.
    Nuclear Regulatory Commission NRC-04-07-122. Civil, Architectural, and Environmental Engineering.
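    A minimal sketch of the RVT peak-response calculation may help make the duration effect concrete. It assumes a surface Fourier amplitude spectrum and an RMS duration are already in hand, and it freezes the peak factor at a representative value instead of deriving it from spectral moments; the spectrum and durations below are placeholders, not values from the study.

```python
import numpy as np

def rvt_peak(freqs, fas, d_rms, peak_factor=2.5):
    """Peak response from a Fourier amplitude spectrum (FAS) via RVT.

    Parseval's theorem converts the FAS and an RMS duration into the RMS of
    the motion; a peak factor then converts RMS to an expected peak.
    """
    m0 = 2.0 * np.trapz(fas**2, freqs)   # zeroth spectral moment
    x_rms = np.sqrt(m0 / d_rms)          # RMS motion (Parseval's theorem)
    return peak_factor * x_rms

freqs = np.linspace(0.1, 50.0, 1000)     # Hz
fas = np.exp(-freqs / 10.0)              # placeholder spectrum
print(rvt_peak(freqs, fas, d_rms=10.0))  # duration of the input motion only
print(rvt_peak(freqs, fas, d_rms=12.0))  # duration lengthened by the site response
```

    Lengthening the duration lowers the RMS and hence the predicted peak, which is the direction of the correction described above.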

    Momentum-resolved study of the saturation intensity in multiple ionization

    We present a momentum-resolved study of strong-field multiple ionization of ionic targets. Using a deconvolution method, we are able to reconstruct the electron momenta from the ion momentum distributions after multiple ionization for up to four sequential ionization steps. This technique allows an accurate determination of the saturation intensity as well as of the electron release times during the laser pulse. The measured results are discussed in comparison with commonly used models of over-the-barrier ionization and tunnel ionization.
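    As an illustration of the reconstruction idea, the sketch below assumes (our simplification, not necessarily the paper's) that the ion momentum distribution after n sequential steps is the convolution of the (n-1)-step distribution with the n-th electron's momentum distribution, and undoes one step by regularized spectral division:

```python
import numpy as np

def deconvolve_step(p_ion_n, p_ion_nm1, eps=1e-3):
    """Recover the n-th electron momentum distribution by spectral division.

    p_ion_n and p_ion_nm1 are sampled ion momentum distributions (same grid)
    after n and n-1 ionization steps; eps regularizes the division.
    """
    F_n = np.fft.fft(p_ion_n)
    F_nm1 = np.fft.fft(p_ion_nm1)
    F_e = F_n * np.conj(F_nm1) / (np.abs(F_nm1) ** 2 + eps)
    p_e = np.real(np.fft.ifft(F_e))
    return np.clip(p_e, 0.0, None)   # a probability density cannot be negative
```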

    Recent Advances in Predicting Earthquake-Induced Sliding Displacements of Slopes

    This paper summarizes recent research related to predicting earthquake-induced sliding displacements of earth slopes. Recently developed empirical models for the prediction of sliding displacements for shallow (rigid) failure surfaces are discussed, and comparisons of the different models demonstrate that including peak ground velocity, along with peak ground acceleration, reduces the median displacement prediction and the standard deviation of the prediction. Thus, peak velocity provides important information regarding the level of sliding displacement. A framework is developed such that the recently developed empirical displacement models for rigid sliding can be used for deeper, flexible failure surfaces, where the dynamic response of the sliding mass is important. This framework includes predicting the seismic loading for the sliding mass in terms of the maximum seismic coefficient (kmax) and the maximum velocity of the seismic coefficient-time history (k-velmax). The predictive models for kmax and k-velmax are a function of the peak ground acceleration (PGA), peak ground velocity (PGV), the natural period of the sliding mass (Ts), and the mean period of the earthquake motion (Tm). With a slight modification, the empirical predictive models for rigid sliding masses can be used, with PGA replaced by kmax and PGV replaced by k-velmax. The standard deviations for the modified predictive models for flexible sliding masses are slightly smaller than those for rigid sliding masses.
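    The substitution at the heart of the framework can be shown in a few lines. The functional form and coefficients below are deliberately generic placeholders, not the published regression values:

```python
import numpy as np

def ln_disp_rigid(pga, pgv, coeffs=(-1.56, 0.33, 1.45, -0.20)):
    """Hypothetical rigid-block model: ln(displacement) from PGA and PGV.

    The coefficients stand in for any empirical regression of this form.
    """
    a0, a1, a2, a3 = coeffs
    return a0 + a1 * np.log(pga) + a2 * np.log(pgv) + a3 * np.log(pga) * np.log(pgv)

def ln_disp_flexible(k_max, k_velmax):
    """Flexible sliding mass: reuse the rigid model with PGA -> kmax and
    PGV -> k-velmax, as the framework prescribes."""
    return ln_disp_rigid(k_max, k_velmax)
```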

    Application of Single-Station Sigma and Site-Response Characterization in a Probabilistic Seismic-Hazard Analysis for a New Nuclear Site

    Aleatory variability in ground-motion prediction, represented by the standard deviation (sigma) of a ground-motion prediction equation, exerts a very strong influence on the results of probabilistic seismic-hazard analysis (PSHA). This is especially so at the low annual exceedance frequencies considered for nuclear facilities; in these cases, even small reductions in sigma can have a marked effect on the hazard estimates. Proper separation and quantification of aleatory variability and epistemic uncertainty can lead to defensible reductions in sigma. One such approach is the single-station sigma concept, which removes that part of sigma corresponding to repeatable site-specific effects. However, the site-to-site component must then be constrained by site-specific measurements or else modeled as epistemic uncertainty and incorporated into the modeling of site effects. The practical application of the single-station sigma concept, including the characterization of the dynamic properties of the site and the incorporation of site-response effects into the hazard calculations, is illustrated for a PSHA conducted at a rock site under consideration for the potential construction of a nuclear power plant.
    Civil, Architectural, and Environmental Engineering.
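    The arithmetic behind the single-station sigma concept is compact. With tau the between-event and phi the within-event standard deviation, removing the repeatable site-to-site term phi_S2S gives the single-station value (illustrative numbers in natural-log units):

```python
import math

tau, phi, phi_s2s = 0.35, 0.55, 0.35      # illustrative values (ln units)
sigma_total = math.hypot(tau, phi)        # sqrt(tau^2 + phi^2)
phi_ss = math.sqrt(phi**2 - phi_s2s**2)   # within-event term, site-to-site removed
sigma_ss = math.hypot(tau, phi_ss)        # single-station sigma
print(sigma_total, sigma_ss)              # sigma_ss < sigma_total, lowering the hazard
```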

    Coherent control at its most fundamental: CEP-dependent electron localization in photodissociation of a H2+ molecular ion beam target

    Measurements and calculations of the absolute carrier-envelope phase (CEP) effects in the photodissociation of the simplest molecule, H2+, with a 4.5-fs Ti:sapphire laser pulse at intensities up to (4 ± 2)×10^14 W/cm^2 are presented. Localization of the electron with respect to the two nuclei (during the dissociation process) is controlled via the CEP of the ultrashort laser pulses. In contrast to previous CEP-dependent experiments with neutral molecules, the dissociation of the molecular ions is not preceded by a photoionization process, which strongly influences the CEP dependence. Kinematically complete data are obtained by time- and position-resolved coincidence detection. The phase dependence is determined by a single-shot phase measurement correlated with the detection of the dissociation fragments. The experimental results show quantitative agreement with ab initio 3D-TDSE calculations that include nuclear vibration and rotation.
    Comment: the new version includes minor changes and adds supp_material.pdf
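    A standard way to quantify CEP-dependent localization, sketched here on synthetic data rather than the paper's actual analysis pipeline, is to fit the left/right fragment asymmetry versus CEP to a cosine:

```python
import numpy as np
from scipy.optimize import curve_fit

def asymmetry_model(cep, amplitude, phase_offset):
    # Electron localization typically varies as a cosine of the CEP.
    return amplitude * np.cos(cep + phase_offset)

rng = np.random.default_rng(1)
cep = np.linspace(0.0, 2.0 * np.pi, 24)                 # binned CEP values
n_left = 1000 + 150 * np.cos(cep + 0.4) + rng.poisson(30, cep.size)
n_right = 1000 - 150 * np.cos(cep + 0.4) + rng.poisson(30, cep.size)
asym = (n_left - n_right) / (n_left + n_right)          # localization observable

params, _ = curve_fit(asymmetry_model, cep, asym, p0=(0.1, 0.0))
print(params)   # fitted modulation amplitude and CEP offset
```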

    Human-automation collaboration in occluded trajectory smoothing

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2010. Cataloged from PDF version of thesis. Includes bibliographical references (p. 133-134).
    Deciding whether and which objects should be engaged in a Ballistic Missile Defense System (BMDS) scenario involves a number of complex issues. The system is large and the timelines may be on the order of a few minutes, which drives designers to highly automate these systems. On the other hand, the critical nature of BMD engagement decisions suggests exploring a human-in-the-loop (HIL) approach to allow for judgment and knowledge-based decisions, including potential overrides of automated system decisions. This BMDS problem reflects the role-allocation conundrum faced in many supervisory control systems: how to determine which functions should be mutually exclusive and which should be collaborative. Clearly some tasks are too computationally intensive for human assistance, while other tasks may be completed without automation. Between the extremes are a number of cases in which degrees of collaboration between the human and computer are possible. This thesis motivates and outlines two experiments that quantitatively investigate human/automation tradeoffs in the specific domain of tracking problems. Human participants in both experiments were tested on their ability to smooth trajectories in different scenarios. In the first experiment, they clearly demonstrated an ability to assist the algorithm in more difficult, shorter-timeline scenarios. The second experiment combined the strengths of both human and automation to create a human-augmented system. Comparison of the augmented system to the algorithm alone showed that adjusting the criterion for human participation could significantly alter the solution; the appropriate criterion would be specific to each application of the augmented system. Future work should focus on further examination of appropriate criteria.
    by Jason M. Rathje. S.M.
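    For readers unfamiliar with the task, one very simple way to smooth a trajectory across an occluded interval is sketched below; it is a generic stand-in, not the smoothing algorithm studied in the thesis:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 101)
truth = 0.5 * t**2 - 3.0 * t + 2.0                  # true trajectory
observed = truth + rng.normal(0.0, 0.5, t.size)     # noisy observations
visible = (t < 4.0) | (t > 6.0)                     # samples outside the occlusion

coeffs = np.polyfit(t[visible], observed[visible], deg=2)
smoothed = np.polyval(coeffs, t)                    # defined across the gap as well
print(np.max(np.abs(smoothed - truth)))             # worst-case reconstruction error
```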

    Geotechnical Lessons Learned From Earthquakes

    Geotechnical earthquake engineering is an experience-driven discipline. Field observations are particularly important because it is difficult to replicate in the laboratory the characteristics and response of soil deposits built by nature over thousands of years. Further, much of the data generated by a major earthquake is perishable, so it is critical that it be collected soon after the event occurs. Detailed mapping and surveying of damaged and undamaged areas provide the data for the well-documented case histories that drive the development of many of the design procedures used by geotechnical engineers. Thus, documenting the key lessons learned from major earthquakes around the world contributes significantly to advancing research and practice in geotechnical earthquake engineering. This is one of the primary objectives of the Geotechnical Extreme Events Reconnaissance (GEER) Association. Some of GEER's findings from recent earthquakes are described in this paper. In particular, the use of advanced reconnaissance techniques is highlighted, as well as specific technical findings from the 1999 Kocaeli, Turkey earthquake; the 2007 Pisco, Peru earthquake; the 2010 Haiti earthquake; and the 2010 Maule, Chile earthquake.

    CEP-stable Tunable THz-Emission Originating from Laser-Waveform-Controlled Sub-Cycle Plasma-Electron Bursts

    We study THz emission from a plasma driven by an incommensurate-frequency two-colour laser field. A semi-classical transient electron current model is derived from a fully quantum-mechanical description of the emission process in terms of sub-cycle field ionization followed by continuum-continuum electron transitions. For the experiment, a CEP-locked laser and a near-degenerate optical parametric amplifier are used to produce two-colour pulses that consist of the fundamental and its near-half frequency. By choosing two incommensurate frequencies, the frequency of the CEP-stable THz emission can be continuously tuned into the mid-IR range. The measured frequency dependence of the THz emission is found to be consistent with the semi-classical transient electron current model, similar to the Brunel mechanism of harmonic generation.
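    The transient electron current picture lends itself to a compact numerical sketch. Units, the field envelope, and the power-law ionization rate below are stand-ins rather than the paper's parameters; the point is only that the low-frequency peak of the emitted spectrum shifts as the second colour is detuned from the half frequency:

```python
import numpy as np

t = np.linspace(-40.0, 40.0, 8192)
dt = t[1] - t[0]
omega = 1.0

envelope = np.exp(-(t / 15.0) ** 2)
E = envelope * (np.cos(omega * t) + 0.5 * np.cos(0.55 * omega * t))  # incommensurate colours

rate = np.abs(E) ** 8                    # placeholder sub-cycle ionization rate
density = np.cumsum(rate) * dt           # free-electron density n(t)
dJ_dt = density * E                      # dJ/dt ~ n(t) E(t) for electrons born at rest

spectrum = np.abs(np.fft.rfft(dJ_dt)) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt) * 2.0 * np.pi   # angular frequency
low = (freqs > 0.0) & (freqs < 0.5 * omega)           # band below both laser colours
print(freqs[low][np.argmax(spectrum[low])])           # low-frequency emission peak
```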

    Momentum distributions of sequential ionization generated by an intense laser pulse

    The relative yield and momentum distributions of all multiply charged atomic ions generated by a short (30 fs), intense (10^14–5×10^18 W/cm^2) laser pulse are investigated using a Monte Carlo simulation. We predict a substantial shift in the maximum (centroid) of the ion-momentum distribution along the laser polarization as a function of the absolute phase. This effect should be experimentally detectable with currently available laser systems, even for relatively long pulses such as 25-30 fs. In addition to the numerical results, we present a semianalytical scaling for the position of the maximum.
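    A toy version of such a Monte Carlo calculation is sketched below (atomic-style units, a made-up rate, and independent release times for the two electrons, which ignores sequential thresholds). Each electron acquires drift momentum -A(t_ion), and the ion centroid shifts with the absolute phase:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(-60.0, 60.0, 4096)
dt = t[1] - t[0]

def ion_momentum_centroid(cep, n_events=20000, n_electrons=2):
    E = np.exp(-(t / 25.0) ** 2) * np.cos(t + cep)   # field with absolute phase cep
    A = -np.cumsum(E) * dt                           # vector potential, A = -∫E dt'
    rate = np.abs(E) ** 10                           # toy tunnelling-like rate
    births = rng.choice(t.size, size=(n_events, n_electrons), p=rate / rate.sum())
    p_electrons = -A[births]                         # drift momentum of each electron
    return np.mean(-p_electrons.sum(axis=1))         # ion momentum balances the electrons

for cep in (0.0, np.pi / 2):
    print(cep, ion_momentum_centroid(cep))           # centroid moves with the phase
```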

    Virtualizing the Stampede2 Supercomputer with Applications to HPC in the Cloud

    Methods developed at the Texas Advanced Computing Center (TACC) are described and demonstrated for automating the construction of an elastic, virtual cluster emulating the Stampede2 high performance computing (HPC) system. The cluster can be built and/or scaled in a matter of minutes on the Jetstream self-service cloud system and shares many properties of the original Stampede2, including: i) common identity management, ii) access to the same file systems, iii) an equivalent software application stack and module system, and iv) a similar job scheduling interface via Slurm. We measure time-to-solution for a number of common scientific applications on our virtual cluster against equivalent runs on Stampede2 and develop an application profile where performance is similar or otherwise acceptable. For such applications, the virtual cluster provides an effective form of "cloud bursting" with the potential to significantly improve overall turnaround time, particularly when Stampede2 is experiencing long queue wait times. In addition, the virtual cluster can be used for test and debug without directly impacting Stampede2. We conclude with a discussion of how science gateways can leverage the TACC Jobs API web service to incorporate this cloud bursting technique transparently to the end user.
    Comment: 6 pages, 0 figures. PEARC '18: Practice and Experience in Advanced Research Computing, July 22-26, 2018, Pittsburgh, PA, USA
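    Because the virtual cluster exposes the same Slurm interface as Stampede2, a batch job written for one can be submitted unchanged on the other. A minimal sketch (the partition name and application are hypothetical, and sbatch must be on the path):

```python
import subprocess

job_script = """#!/bin/bash
#SBATCH -J demo            # job name
#SBATCH -N 1               # one node
#SBATCH -t 00:10:00        # ten-minute wall-clock limit
#SBATCH -p normal          # partition name (hypothetical)

./my_app input.dat
"""

# sbatch accepts the batch script on standard input.
result = subprocess.run(["sbatch"], input=job_script, text=True,
                        capture_output=True, check=True)
print(result.stdout)   # e.g. "Submitted batch job 12345"
```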