    A minimum principle for the quasi-static problem in linear viscoelasticity

    A minimum principle is set up for the quasi-static boundary-value problem (QSP) in linear viscoelasticity. A linear, homogeneous and isotropic viscoelastic solid under unidimensional displacements is considered, along with the complete set of thermodynamic restrictions on the relaxation function. It is assumed that boundary conditions are of Dirichlet type and that initial history data are not given. The variational formulation of the QSP is set up through a convex functional based on a "weighted" L^2 inner product as the bilinear form, and is strictly related to the thermodynamic restrictions on the relaxation function. As an aside, the same technique is proved to be applicable to analogous physical problems, such as the quasi-static heat flux equation.
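
    For reference, a "weighted" L^2 inner product of the kind mentioned above has the general form below, taken over the relevant (space or time) domain; the particular weight used in the paper is tied to the thermodynamic restrictions on the relaxation function and is not reproduced here.

        \[
        \langle u, v \rangle_{w} \;=\; \int_{\Omega} w(x)\, u(x)\, v(x)\, \mathrm{d}x ,
        \qquad w(x) > 0 .
        \]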

    New variational principles in quasi-static viscoelasticity

    A "saddle point" (or maximum-minimum) principle is set up for the quasi-static boundary-value problem in linear viscoelasticity. The appropriate class of convolution-type functionals for it is taken in terms of bilinear forms with a weight function involving Fourier transform. The "minimax" property is shown to hold as a direct consequence of the thermodynamic restrictions on the relaxation function. This approach can be extended to further linear evolution problems where initial data are not prescribed

    The Assumption of Poisson Seismic-Rate Variability in CSEP/RELM Experiments

    Evaluating the performance of earthquake forecasting/prediction models is the main rationale behind recent international efforts such as the Regional Earthquake Likelihood Models (RELM) project and the Collaboratory for the Study of Earthquake Predictability (CSEP). Basically, the evaluation process consists of two steps: 1) running all codes simultaneously to forecast future seismicity in well-defined testing regions; 2) comparing the forecasts through a suite of statistical tests. The tests are based on the likelihood score and check both temporal and spatial performance. All these tests rely on some basic assumptions that have never been deeply discussed and analyzed. In particular, models are required to specify a rate in space-time-magnitude bins, and it is assumed that these rates are independent and characterized by Poisson uncertainty. In this work we explore in detail these assumptions and their impact on the CSEP testing procedures when applied to a widely used class of models, i.e., the Epidemic-Type Aftershock Sequence (ETAS) models. Our results show that, if an ETAS model is an accurate representation of seismicity, that same "right" model is rejected by the current CSEP testing procedures significantly more often than expected. We show that this deficiency is due to the fact that ETAS models produce forecasts with a variability significantly higher than that of a Poisson process, invalidating one of the main assumptions behind the CSEP/RELM evaluation process. Certainly, this shortcoming does not negate the paramount importance of the CSEP experiments as a whole, but it does call for a specific revision of the testing procedures to allow a better understanding of the results of such experiments.
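
    The core of the argument is that clustered seismicity produces bin counts that are overdispersed relative to a Poisson distribution with the same mean. The sketch below is illustrative only, not the paper's code; the rate, the dispersion value and the simple negative-binomial stand-in for ETAS variability are assumptions. It shows how a Poisson-based number (N) test then rejects the "true" forecast far more often than the nominal 5% level.

        # Illustrative sketch (not the paper's code): if the "true" seismicity is
        # overdispersed relative to a Poisson process (mimicked here with a negative
        # binomial, a simple stand-in for ETAS-style clustering), a Poisson-based
        # number (N) test rejects the correct forecast far more often than the
        # nominal 5% level. Rate and dispersion values are assumed for illustration.
        import numpy as np
        from scipy.stats import nbinom, poisson

        rate = 20.0        # forecast expected number of events in the test region
        dispersion = 4.0   # variance-to-mean ratio > 1 mimics clustering
        n_catalogs = 10_000

        # Negative-binomial parameters giving mean = rate, variance = dispersion * rate
        p = rate / (dispersion * rate)
        n = rate * p / (1.0 - p)
        observed = nbinom.rvs(n, p, size=n_catalogs, random_state=123)

        # Two-sided Poisson N-test at the 5% level (CSEP-like consistency check)
        lower = poisson.cdf(observed, rate)        # P(N <= n_obs | Poisson forecast)
        upper = poisson.sf(observed - 1, rate)     # P(N >= n_obs | Poisson forecast)
        rejected = np.minimum(lower, upper) < 0.025

        print(f"rejection rate of the 'true' model: {rejected.mean():.3f} (nominal 0.05)")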

    Uniform attractors for a non-autonomous semilinear heat equation with memory

    In this paper we investigate the asymptotic behavior, as time tends to infinity, of the solutions of a non-autonomous integro-partial differential equation describing the heat flow in a rigid heat conductor with memory. Existence and uniqueness of solutions are established. Moreover, under proper assumptions on the heat flux memory kernel and on the magnitude of the nonlinearity, the existence of uniform absorbing sets and of a global uniform attractor is achieved. In the case of quasiperiodic dependence on time of the external heat supply, the above attractor is shown to have finite Hausdorff dimension.
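
    A typical prototype of such an equation (of Coleman-Gurtin type, given here only for orientation; the exact equation and assumptions studied in the paper may differ) is

        \[
        \partial_t u(t) \;-\; \Delta u(t) \;-\; \int_{0}^{\infty} k(s)\, \Delta u(t - s)\, \mathrm{d}s \;+\; g(u(t)) \;=\; f(t) ,
        \]

    where k is the heat flux memory kernel, g the nonlinearity, and f the (non-autonomous) external heat supply.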

    The ETAS model for daily forecasting of Italian seismicity in the CSEP experiment

    This paper investigates the basic properties of the recent shallow seismicity in Italy through stochastic modeling and statistical methods. Assuming that the earthquakes are the realization of a stochastic point process, we model the occurrence rate density in space, time and magnitude by means of an Epidemic-Type Aftershock Sequence (ETAS) model. By applying the maximum likelihood procedure, we estimate the parameters of the model that best fit the Italian instrumental catalog recorded by the Istituto Nazionale di Geofisica e Vulcanologia (INGV) from April 16th 2005 to June 1st 2009. We then apply the estimated model to a second, independent dataset (June 1st 2009 - September 1st 2009). Using proper statistical tests, we find that the model performs well on this second dataset. The model proposed in the present study is suitable for computing earthquake occurrence probabilities in real time and for taking part in international initiatives such as the Collaboratory for the Study of Earthquake Predictability (CSEP). Specifically, we have submitted this model for the daily forecasting of Italian seismicity above Ml 4.0.
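
    For orientation, the space-time conditional intensity of an ETAS model has the standard form below; the specific parameterization and spatial kernel adopted in the paper may differ.

        \[
        \lambda(t, x, y \mid \mathcal{H}_t) \;=\; \mu(x, y) \;+\; \sum_{i:\, t_i < t} \frac{K\, e^{\alpha (m_i - m_0)}}{(t - t_i + c)^{p}}\; f(x - x_i,\, y - y_i;\, m_i) ,
        \]

    where \mu(x, y) is the background rate, the Omori-type factor governs the temporal decay of triggered events, e^{\alpha (m_i - m_0)} sets the productivity of an event of magnitude m_i above the reference magnitude m_0, and f is a normalized spatial kernel.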

    On the Increase of Background Seismicity Rate during the 1997-1998 Umbria-Marche, Central Italy, Sequence: Apparent Variation or Fluid-Driven Triggering?

    We investigate the temporal evolution of the background seismicity rate in the Umbria-Marche sector of the northern Apennines, which was struck by the 1997-98 Colfiorito seismic sequence. Specifically, we apply the ETAS model to separate the background seismicity rate from the coseismic triggered rate of earthquake production. The analyzed data are extracted from the CSI1.1 catalog of Italian seismicity (1981-2002), which contains 12,163 events with ML > 1.5 for the study area. The capability of the ETAS model to match the observed seismicity rate is tested by analyzing the model residuals and by applying two non-parametric statistical tests (the RUNS and the Kolmogorov-Smirnov tests) to verify the fit of the residuals to the Poisson hypothesis. We first apply the ETAS model to the seismicity that occurred in the study area during the whole period covered by the CSI1.1 catalog. Our results show that the ETAS model does not explain the temporal evolution of seismicity in a time interval defined by change points identified from the time evolution of the residuals and encompassing the Colfiorito seismic sequence. We therefore restrict our analysis to this period and analyze only those events belonging to the 1997-1998 seismic sequence. We again find that a stationary ETAS model with a constant background rate is inadequate to reproduce the temporal pattern of the observed seismicity. We verify that the failure of the ETAS model to fit the observed data is caused by an increase of the background seismicity rate associated with the repeated Colfiorito main shocks. We interpret the inferred increase of the background rate as a consequence of the perturbation to the coseismic stress field caused by fluid flow and/or pore pressure relaxation. In particular, we show that the transient perturbation caused by poroelastic relaxation can explain the temporal increase of the background rate, which therefore represents a fluid signal in the seismicity pattern.
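
    The residual analysis mentioned above is typically based on time rescaling: integrating the fitted conditional intensity up to each event time should turn the catalog into a unit-rate Poisson process, so the rescaled inter-event times can be checked against an Exp(1) distribution with a Kolmogorov-Smirnov test. The sketch below is illustrative only; the intensity lam and the event times are made-up placeholders, not the fitted ETAS model or the CSI1.1 data.

        # Illustrative sketch (not the paper's code) of residual analysis by time
        # rescaling: integrating a fitted conditional intensity up to each event
        # time should turn the catalog into a unit-rate Poisson process, so the
        # rescaled inter-event times can be tested against Exp(1) with a KS test.
        # The intensity `lam` and the event times below are made-up placeholders.
        import numpy as np
        from scipy.stats import kstest

        def transformed_times(event_times, intensity, t0=0.0, n_grid=2000):
            """tau_i = integral of intensity from t0 to t_i (trapezoidal rule)."""
            taus = []
            for t in event_times:
                s = np.linspace(t0, t, n_grid)
                vals = intensity(s)
                taus.append(np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(s)))
            return np.array(taus)

        def lam(t):
            # Placeholder rate: constant background plus one Omori-like burst at t = 100
            t = np.asarray(t, dtype=float)
            return 0.5 + 10.0 / (np.maximum(t - 100.0, 0.0) + 1.0) * (t >= 100.0)

        event_times = np.sort(np.random.default_rng(1).uniform(0.0, 400.0, 200))
        tau = transformed_times(event_times, lam)
        rescaled = np.diff(tau)

        # Under a correct model the rescaled inter-event times are Exp(1)
        stat, p_value = kstest(rescaled, "expon")
        print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3f}")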

    VISTO: An open-source device to measure exposure time in psychological experiments

    The study of higher cognitive processes often relies on the manipulation of bottom-up stimulus characteristics such as exposure time. While several software packages exist that can schedule the onset and offset times of a visual stimulus, the actual exposure time depends on several factors that are not easy to control, resulting in undesired variability within and across studies. Here we present VISTO, a simple device built on the Arduino platform that allows one to measure the exact onset and offset of a visual stimulus and to test its synchronization with a trigger signal. The device is used to measure the profile of luminance waveforms in arbitrary analog/digital (AD) units, and the implications of these luminance profiles are discussed based on a model of information accumulation from visual exposure. Moreover, VISTO can be calibrated to match the brightness of each experimental monitor. VISTO allows for control of stimulus presentation timing, both in classical laboratory settings and in more complex settings, as technology allows the use of new display devices or acquisition equipment. In sum, VISTO allows one to:
    • measure the profile of luminance curves;
    • determine the exposure time of a visual stimulus (see the sketch below);
    • measure the synchronization between a trigger signal and a visual stimulus.
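
    As an illustration of the second point, the sketch below (not part of VISTO's firmware; the function name, threshold rule and sampling parameters are assumptions) estimates onset, offset and exposure time from a sampled luminance waveform in arbitrary A/D units using a half-maximum threshold crossing.

        # Illustrative sketch (not part of VISTO's firmware; the function name,
        # threshold rule and sampling parameters are assumptions): estimate onset,
        # offset and exposure time from a sampled luminance waveform in arbitrary
        # A/D units, using a half-maximum threshold crossing.
        import numpy as np

        def exposure_time(samples, sample_rate_hz, baseline=None, threshold_frac=0.5):
            """Return (onset_s, offset_s, duration_s) of the first supra-threshold epoch."""
            samples = np.asarray(samples, dtype=float)
            if baseline is None:
                # Dark level estimated from the first 5% of the recording
                baseline = samples[: max(1, len(samples) // 20)].mean()
            threshold = baseline + threshold_frac * (samples.max() - baseline)
            above = samples >= threshold
            if not above.any():
                raise ValueError("no supra-threshold samples: stimulus not detected")
            onset_idx = int(np.argmax(above))                             # first sample above threshold
            offset_idx = onset_idx + int(np.argmax(~above[onset_idx:]))   # first sample back below
            if offset_idx == onset_idx:                                   # never falls below again
                offset_idx = len(samples)
            return (onset_idx / sample_rate_hz,
                    offset_idx / sample_rate_hz,
                    (offset_idx - onset_idx) / sample_rate_hz)

        # Synthetic example: a 100 ms flash sampled at 10 kHz
        rate = 10_000
        t = np.arange(0, 0.3, 1.0 / rate)
        lum = 50 + 900 * ((t >= 0.10) & (t < 0.20)) + np.random.default_rng(0).normal(0, 5, t.size)
        print(exposure_time(lum, rate))  # approximately (0.10, 0.20, 0.10)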

    Tsunami risk assessments in Messina, Sicily – Italy

    We present a first detailed tsunami risk assessment for the city of Messina, where one of the most destructive tsunami inundations of the last centuries occurred in 1908. In the tsunami hazard evaluation, probabilities are calculated through a new, general, modular Bayesian tool for Probabilistic Tsunami Hazard Assessment. The estimation of losses to persons and buildings takes into account data collected directly or supplied by: (i) the Italian National Institute of Statistics, which provides information on the population, on buildings and on many relevant social aspects; (ii) the Italian National Territory Agency, which provides updated economic values of buildings on the basis of their typology (residential, commercial, industrial) and location (streets); and (iii) the Train and Port Authorities. For human beings, a time exposure factor is introduced and calculated in terms of hours per day spent in different places (private and public) and in terms of seasons, considering that factors such as the number of tourists can vary by an order of magnitude from January to August. Since the tsunami risk is a function of the run-up levels along the coast, a variable tsunami risk zone is defined as the area along the Messina coast where tsunami inundations may occur.
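
    As a toy illustration of the time exposure factor (every number below is hypothetical and not taken from the study), one can weight the hours per day spent in different places by the fraction of each activity that takes place inside the inundation zone, and rescale by the seasonal population.

        # Toy illustration only: every number below is hypothetical and not taken
        # from the study. A daily exposure factor weights hours per day spent in
        # different places by the fraction of that activity inside the inundation
        # zone; the seasonal population accounts for tourists.
        residents = 240_000
        hours_per_day = {"home": 14, "work_or_school": 8, "outdoors": 2}
        fraction_in_zone = {"home": 0.15, "work_or_school": 0.25, "outdoors": 0.30}

        daily_factor = sum(hours_per_day[p] * fraction_in_zone[p] for p in hours_per_day) / 24.0

        seasonal_population = {"January": residents, "August": residents + 200_000}
        for month, population in seasonal_population.items():
            print(f"{month}: exposed person-equivalents = {population * daily_factor:,.0f}")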

    A Brownian Model for Recurrent Volcanic Eruptions: an Application to Miyakejima Volcano (Japan)

    The definition of probabilistic models as mathematical structures to describe the response of a volcanic system is a plausible approach to characterize the temporal behavior of volcanic eruptions, and constitutes a tool for long-term eruption forecasting. This kind of approach is motivated by the fact that volcanoes are complex systems in which a completely deterministic description of the processes preceding eruptions is practically impossible. To describe recurrent eruptive activity we apply a physically-motivated probabilistic model based on the characteristics of the Brownian passage-time (BPT) distribution. The physical process defining this model can be described by the steady rise of a state variable from a ground state to a failure threshold; adding Brownian perturbations to the steady loading produces a stochastic load-state process (a Brownian relaxation oscillator) in which an eruption relaxes the load state to begin a new eruptive cycle. The Brownian relaxation oscillator and the Brownian passage-time distribution connect physical notions of unobservable loading and failure processes of a point process with observable response statistics. The Brownian passage-time model is parameterized by the mean rate of event occurrence, μ, and the aperiodicity about the mean, α. We apply this model to analyze the eruptive history of Miyakejima volcano, Japan, finding a value of 44.2 ± 6.5 years for the μ parameter and 0.51 ± 0.01 for the (dimensionless) α parameter. The comparison with other models often used in the volcanological literature shows that this physically-motivated model may be a good descriptor of volcanic systems that produce eruptions with a characteristic size. The BPT distribution is clearly superior to the exponential distribution, and the fit to the data is comparable to that of other two-parameter models. Nonetheless, being a physically-motivated model, it provides insight into the macro-mechanical processes driving the system.
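
    For reference, the Brownian passage-time (inverse Gaussian) probability density, written in the standard parameterization in terms of the mean inter-event time μ and the aperiodicity (coefficient of variation) α, is

        \[
        f(t; \mu, \alpha) \;=\; \sqrt{\frac{\mu}{2 \pi \alpha^{2} t^{3}}}\; \exp\!\left( - \frac{(t - \mu)^{2}}{2\, \alpha^{2} \mu\, t} \right) , \qquad t > 0 .
        \]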