
    Integration of geological and seismological data for the analysis of seismic hazard: A case study of Japan

    Seismic hazard analyses are associated with large uncertainties when historical data are insufficient to define secular rates of seismicity. Such uncertainties may be reduced with geological data in areas where seismicity is shallow and produced by Quaternary faulting. To illustrate, we examine intraplate Japan. Large intraplate earthquakes in Japan characteristically produce surface ruptures along mappable Quaternary faults and show a systematic relation between seismic moment M_0 and rupture length L (log M_0 = 23.5 + 1.94 × log L). Within the bounds placed by geologically assessed slip rates, the mean regional moment release rate Ṁ_0 resulting from slip on mapped Quaternary faults is in accord with estimates of Ṁ_0 determined from the 400-yr record of seismicity. Recent work also shows that when the repeat time T of earthquakes on Quaternary faults in southwest Japan is assumed to equal M_0/Ṁ_0^g (where M_0 is estimated for rupture extending over the entire fault length and Ṁ_0^g is the geologically assessed moment release rate of each fault), the moment-frequency distribution of earthquakes predicted from the geologic record is virtually identical to that seen in the 400-yr record of seismicity. These observations indicate that the geologic record of Quaternary fault offsets contains sufficient information to predict both the spatial and the size distribution of intraplate earthquakes in Japan. A contour map of the average recurrence time of ground shaking of JMA intensity ≥ V is thus computed using an empirical relation between seismic moment and the areal distribution of seismic intensity, and assuming that the repeat time T of earthquakes on each Quaternary fault equals M_0/Ṁ_0^g. The map demonstrates how Quaternary fault data may be used to assess long-term seismic hazard in areas of active faulting where historical records of seismicity are short or absent.
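The moment–length regression and the repeat-time estimate quoted above can be sketched numerically. This is a minimal illustration, not the authors' calculation: the fault geometry, slip rate, and rigidity below are hypothetical, and units follow the dyne·cm convention implied by the 23.5 intercept.

```python
import math

def moment_from_length(L_km):
    """Seismic moment (dyne·cm) from rupture length via the quoted
    regression: log10 M0 = 23.5 + 1.94 * log10 L (L in km)."""
    return 10 ** (23.5 + 1.94 * math.log10(L_km))

def geologic_moment_rate(mu, area_cm2, slip_rate_cm_yr):
    """Geologic moment release rate (dyne·cm/yr): M0_dot = mu * A * s."""
    return mu * area_cm2 * slip_rate_cm_yr

def repeat_time(L_km, mu, area_cm2, slip_rate_cm_yr):
    """Repeat time T = M0 / M0_dot for whole-fault rupture."""
    return moment_from_length(L_km) / geologic_moment_rate(mu, area_cm2, slip_rate_cm_yr)

# Hypothetical fault: 30 km long, 15 km deep, slipping 1 mm/yr,
# rigidity mu = 3e11 dyne/cm^2 (a typical crustal value).
L = 30.0
area = (30e5) * (15e5)                # fault area in cm^2
T = repeat_time(L, 3e11, area, 0.1)   # slip rate in cm/yr
print(f"M0 = {moment_from_length(L):.2e} dyne·cm, T ≈ {T:.0f} yr")
```

For these assumed parameters the whole-fault moment is of order 10^26 dyne·cm and the implied repeat time is on the order of a couple of thousand years, illustrating why short historical catalogs under-sample such faults.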
Another shortcoming of conventional seismic hazard analysis is that hazard is not treated as a function of the time since each fault in a region last ruptured. A simple procedure demonstrates how the time-dependent nature of the earthquake cycle affects the evaluation of seismic hazard. The distribution of seismic shaking characteristic of large interplate earthquakes offshore of Japan is estimated from published isoseismal maps. The observed average repeat times of ruptures along specific segments of the plate boundaries then provide the basis for probabilistic estimates of the next expected time of seismic shaking due to plate-boundary earthquakes. When data are too few to document average repeat times of rupture, the probabilities are calculated from the relative coseismic slip during past earthquakes and the rate of interseismic strain accumulation, interpreted within the framework of the time-predictable model of earthquake occurrence. Results are displayed as maps of instantaneous seismic hazard: the probability that seismic shaking will occur, conditional on where each fault in a region presently resides in its earthquake cycle.
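The "instantaneous hazard" idea can be sketched as a conditional probability given the time elapsed since the last rupture. The lognormal repeat-time distribution and the segment parameters below are assumptions for illustration; the paper's exact distribution may differ.

```python
import math

def lognormal_cdf(t, median, sigma):
    """CDF of a lognormal recurrence-time distribution with the
    given median and log-standard-deviation sigma."""
    return 0.5 * (1.0 + math.erf(math.log(t / median) / (sigma * math.sqrt(2.0))))

def conditional_probability(t_elapsed, dt, median, sigma):
    """P(rupture in [t_elapsed, t_elapsed + dt] | no rupture so far):
    the probability mass in the window, renormalized by the
    probability of having survived to t_elapsed."""
    F = lognormal_cdf
    survive = 1.0 - F(t_elapsed, median, sigma)
    return (F(t_elapsed + dt, median, sigma) - F(t_elapsed, median, sigma)) / survive

# Hypothetical plate-boundary segment: median repeat time 120 yr,
# sigma = 0.3; 100 yr have elapsed; rupture probability in the next 30 yr.
p = conditional_probability(100.0, 30.0, 120.0, 0.3)
print(f"30-yr conditional probability: {p:.2f}")
```

The same exposure window yields a very different probability early versus late in the cycle, which is exactly the time dependence the abstract says conventional (time-independent) hazard analysis ignores.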

    Quantum model for magnetic multivalued recording in coupled multilayers

    In this paper, we discuss the possibility of realizing magnetic multi-valued (MMV) recording in a magnetically coupled multilayer. The hysteresis loop of a double-layer system is studied analytically, and the conditions for achieving MMV recording are derived. The conditions are examined from several perspectives, and phase diagrams for the anisotropy parameters are given. Comment: 8 pages, LaTeX formatted, 7 figures (those interested in the figures please contact the authors). Submitted to Physical Review B. Email: [email protected]
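The multi-valued states can be illustrated with a zero-temperature Ising-like caricature of a coupled bilayer: two layers with distinct coercive fields and moments, each flipping when its local field (applied field plus coupling from the other layer) exceeds its coercivity. This is a sketch only, not the authors' analytic calculation, and the parameters Hc, M, J are hypothetical.

```python
def sweep(H_values, Hc=(1.0, 2.0), M=(1.0, 0.5), J=0.2):
    """Trace total magnetization of two coupled layers over a field
    sweep. Layer i flips when |applied field + J * (other layer)|
    exceeds its coercive field Hc[i]; M[i] is its moment."""
    m = [1, 1]                      # start saturated
    loop = []
    for H in H_values:
        for _ in range(2):          # relax until metastable
            for i in (0, 1):
                h_local = H + J * m[1 - i]
                if h_local > Hc[i]:
                    m[i] = 1
                elif h_local < -Hc[i]:
                    m[i] = -1
        loop.append(M[0] * m[0] + M[1] * m[1])
    return loop

# One full field cycle; distinct coercivities give a staircase loop
# whose plateaus are the discrete multi-valued recording states.
H_down = [3.0 - 0.05 * k for k in range(121)]
levels = sorted(set(sweep(H_down + H_down[::-1])))
print(levels)   # → [-1.5, -0.5, 0.5, 1.5]
```

With unequal moments the four (m1, m2) configurations map to four distinct magnetization levels, which is the essence of storing more than one bit per cell.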

    IC immunity modeling process validation using on-chip measurements

    Developing integrated circuit (IC) immunity models and simulation flows has become a major concern for IC suppliers, who want to predict whether a chip will pass susceptibility tests before fabrication and so avoid redesign costs. This paper presents an IC immunity modeling process, including the standard immunity test applied to a dedicated test chip. An on-chip voltage sensor is used to characterize radio-frequency interference propagation inside the chip and thus validate the immunity modeling process.
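A standard immunity test of this kind sweeps injected RF power at each frequency until the monitored on-chip deviation exceeds a failure criterion. The sketch below illustrates the sweep logic only: the `sensor_deviation` model (a resonant coupling path near 1 GHz) and all thresholds are hypothetical stand-ins for real on-chip measurements.

```python
def sensor_deviation(freq_hz, power_dbm):
    """Hypothetical stand-in for the on-chip voltage sensor reading:
    a single resonant coupling path makes the chip most susceptible
    near 1 GHz. Replace with real measurements in practice."""
    coupling = 1.0 / (1.0 + ((freq_hz - 1e9) / 2e8) ** 2)
    v_rms = 10 ** (power_dbm / 20.0) * 0.01 * coupling
    return v_rms

def immunity_threshold(freq_hz, criterion_v=0.1, p_min=-10, p_max=30):
    """Raise injected power (dBm) in 1 dB steps until the monitored
    deviation exceeds the failure criterion; return the threshold,
    or None if the chip passes at every tested power level."""
    for p in range(p_min, p_max + 1):
        if sensor_deviation(freq_hz, p) > criterion_v:
            return p
    return None

for f in (0.5e9, 1.0e9, 2.0e9):
    print(f"{f/1e9:.1f} GHz -> threshold {immunity_threshold(f)} dBm")
```

Plotting the threshold versus frequency gives the immunity curve that a validated model is expected to reproduce before fabrication.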

    Fluorophotometry as a diagnostic tool for the evaluation of dry eye disease

    BACKGROUND: Dry eye disease is a common, debilitating ocular disease. Current diagnostic tests for dry eye disease are often neither sensitive nor reproducible, making it difficult to diagnose the disease accurately, to define end points for clinical trials, or to evaluate the usefulness of different medications in its treatment. The recently developed fluorophotometer can objectively detect changes in the corneal epithelium by quantitatively measuring its barrier function, or permeability. The purpose of this study is to investigate corneal fluorescein penetration, measured by fluorophotometer, as a diagnostic tool in the evaluation of dry eye patients. METHODS: Dry eye patients (16 eyes) who presented with a chief complaint of ocular irritation consistent with dry eye, a low Schirmer I test (<10 mm after 5 minutes), and a corneal fluorescein staining score greater than two were included in the study. Normal subjects (16 eyes) who came for refractive error evaluation served as controls. Institutional Review Board (IRB)-approved consent was obtained before enrolling subjects, and all questions were answered while explaining the risks, benefits, and alternatives. Fluorophotometry of the central corneal epithelium was performed with the Fluorotron Master™. Each eye had a baseline fluorescein scan, after which 50 µl of 1% sodium fluorescein dye was instilled. Three minutes later, the fluorescein was washed out with 50 ml of normal saline. Fluorescein scans were started immediately after washing and recorded at 10, 20, 40, and 60 minutes thereafter. Peak fluorescein concentrations were recorded within the central cornea in both dry eyes and controls.
RESULTS: Ten minutes after fluorescein instillation, patients with dry eye disease averaged a five-fold increase in corneal tissue fluorescein concentration (mean = 375.26 ± 202.67 ng/ml) compared with normal subjects (mean = 128.19 ± 85.84 ng/ml). Sixty minutes after instillation, patients with dry eye disease still showed higher corneal tissue fluorescein concentrations (mean = 112.87 ± 52.83 ng/ml) than controls (mean = 40.64 ± 7.96 ng/ml), averaging a three-fold increase. CONCLUSION: Patients with dry eye disease demonstrated increased corneal permeability and a slower rate of elimination of topically administered fluorescein when measured by fluorophotometry. This suggests that fluorophotometry may serve as a valuable quantitative and objective tool for diagnosing dry eye disease and for following patients' response to new treatment modalities, and as an objective, non-invasive tool for end-point analysis in clinical trials of new dry eye treatments.

    A unifying framework for mean-field theories of asymmetric kinetic Ising systems

    Kinetic Ising models are powerful tools for studying the non-equilibrium dynamics of complex systems. As their behavior is not tractable for large networks, many mean-field methods have been proposed for their analysis, each based on unique assumptions about the system’s temporal evolution. This disparity of approaches makes it challenging to systematically advance mean-field methods beyond previous contributions. Here, we propose a unifying framework for mean-field theories of asymmetric kinetic Ising systems from an information geometry perspective. The framework is built on Plefka expansions of a system around a simplified model obtained by an orthogonal projection to a sub-manifold of tractable probability distributions. This view not only unifies previous methods but also allows us to develop novel methods that, in contrast with traditional approaches, preserve the system’s correlations. We show that these new methods can outperform previous ones in predicting and assessing network properties near maximally fluctuating regimes.
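The simplest member of the mean-field family described above is the naive mean-field forward equation for parallel Glauber dynamics (a zeroth-order Plefka expansion); the paper's framework generalizes this by adding correlation-preserving corrections. A minimal sketch, with random asymmetric couplings whose parameters are chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))  # asymmetric couplings
h = rng.normal(0.0, 0.1, size=N)                     # local fields

def nmf_trajectory(m0, steps):
    """Naive mean-field forward equations for parallel Glauber
    dynamics: m_i(t+1) = tanh(h_i + sum_j J_ij m_j(t)).
    Correlations between spins are discarded at every step, which
    is exactly the approximation higher-order methods improve on."""
    m = m0.copy()
    traj = [m.copy()]
    for _ in range(steps):
        m = np.tanh(h + J @ m)
        traj.append(m.copy())
    return np.array(traj)

traj = nmf_trajectory(np.ones(N), steps=20)
print(traj[-1][:3])  # approximate magnetizations after 20 updates
```

Because the update factorizes over sites, the cost per step is O(N²) rather than the exponential cost of the exact dynamics, which is why such expansions matter for large networks.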