
    Strong gravitational field light deflection in binary systems containing a collapsed star

    Large light deflection angles are produced in the strong gravitational field regions around neutron stars and black holes. In binary systems, part of the photons emitted from the companion star towards the collapsed object are expected to be deflected in the direction of the Earth. Based on a semi-classical approach, we calculate the characteristic time delays and frequency shifts of these photons as a function of the binary orbital phase. The intensity of the strongly deflected light rays is reduced by many orders of magnitude, making the observation of this phenomenon extremely difficult. Relativistic binary systems containing a radio pulsar and a collapsed object are the best available candidates for the detection of strongly deflected photons. Thanks to the accurate knowledge of their orbital parameters, these systems make it possible to predict accurately the delays of the pulses along the highly deflected path, so that the sensitivity to very weak signals can be substantially improved through coherent summation over long time intervals. We discuss in detail the cases of PSR 1913+16 and PSR 1534+12 and find that the system geometry is far more promising for the latter. The observation of the highly deflected photons would provide a test of general relativity in an unprecedented strong-field regime as well as a tight constraint on the radius of the collapsed object.

    Comment: 7 pages, uuencoded, gzip'ed, postscript file with figures included. Accepted for publication in MNRAS
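As a point of comparison for the strong-field calculation described above, the familiar weak-field deflection angle can be evaluated in a few lines. This is an illustrative sketch, not the paper's semi-classical computation; the neutron-star mass and the 10 km impact parameter are assumed values for a typical system:

```python
# Weak-field gravitational light deflection: alpha = 4GM / (c^2 b)
# for impact parameter b. Near the photon sphere of a neutron star or
# black hole (the regime treated in the paper) this first-order formula
# breaks down; it serves only as the classical baseline.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def weak_field_deflection(mass_kg: float, impact_parameter_m: float) -> float:
    """First-order deflection angle in radians."""
    return 4.0 * G * mass_kg / (C**2 * impact_parameter_m)

# A 1.4 M_sun neutron star, ray grazing at b = 10 km: the result is of
# order one radian, which already signals that the weak-field expansion
# is no longer valid and a strong-field treatment is required.
alpha = weak_field_deflection(1.4 * M_SUN, 1.0e4)
```

For the Sun at its own radius the same formula gives the classic ~1.75 arcsecond deflection, illustrating how extreme the neutron-star case is by comparison.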

    The vulnerability assessment of current buildings by a macroseismic approach derived from the EMS-98 scale

    A hierarchical family of Damage Probability Matrices (DPM) is derived in this paper from the ones implicitly contained in the EMS-98 Macroseismic Scale for its six vulnerability classes. To this end, the linguistic definitions provided by the scale, and the associated fuzzy subsets of the percentage of buildings, have been completed according to reliable hypotheses. A parametric representation of the corresponding cumulative probability distributions is also provided through a single parameter: a vulnerability index ranging from 0 to 1 and independent of the macroseismic intensity. Finally, an innovative macroseismic approach allowing the vulnerability analysis of building typologies is defined within the European Macroseismic Scale (EMS-98) and qualitatively related to the vulnerability classes. Bayes' theorem allows the frequencies to be updated when further data about the built environment or specific properties of the buildings become available, permitting the identification of behaviour that differs from the one generally assumed for the typology. Fuzzy measures of any damage function can be derived using parametric or non-parametric damage probability matrices. For every result of the seismic analysis, the procedure supplies the user with the final uncertainty associated with the aforementioned fuzzy relation between the probability of the damage grade, the macroseismic intensity and the vulnerability classes.
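The single-parameter representation described above can be illustrated with a form often quoted in the macroseismic-method literature, which is an assumption here and not necessarily the exact parameterization of this paper: a mean damage grade depending on intensity I and vulnerability index V, combined with a binomial distribution over damage grades 0 to 5:

```python
import math

def mean_damage_grade(intensity: float, v_index: float) -> float:
    # Parametric form commonly cited in the macroseismic-method
    # literature (assumed here for illustration):
    # mu_D = 2.5 * [1 + tanh((I + 6.25*V - 13.1) / 2.3)], in [0, 5].
    return 2.5 * (1.0 + math.tanh((intensity + 6.25 * v_index - 13.1) / 2.3))

def dpm_row(intensity: float, v_index: float) -> list[float]:
    """One row of a Damage Probability Matrix: probabilities of damage
    grades k = 0..5, under a binomial-distribution assumption."""
    mu = mean_damage_grade(intensity, v_index)
    p = mu / 5.0
    return [math.comb(5, k) * p**k * (1 - p)**(5 - k) for k in range(6)]

# e.g. intensity IX and an illustrative vulnerability index of 0.74:
row = dpm_row(intensity=9.0, v_index=0.74)
```

Because the whole row is generated from the single index V, Bayesian updating of V against observed damage data updates every damage-grade probability at once, which is the practical advantage of the parametric representation.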

    An operational flash-flood forecasting chain applied to the test cases of the EU project HYDROPTIMET

    The application of a flash-flood prediction chain, developed by CIMA, to some test cases for the Tanaro river basin in the framework of the EU project HYDROPTIMET is presented here. The components of the CIMA chain are: forecast rainfall depths, a stochastic downscaling procedure and a hydrological model. Different meteorological Limited Area Models (LAMs) provide the rainfall input to the hydrological component. The flash-flood prediction chain is run in both a deterministic and a probabilistic configuration. The sensitivity of the forecasting chain's performance to the different LAMs providing rainfall forecasts is discussed. The results of the application show how the probabilistic forecasting system can make a valuable contribution, especially in the case of convective events, to addressing the uncertainty at the different spatio-temporal scales involved in flash-flood forecasting for small and medium basins with complex orography.
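A probabilistic chain of this kind can be sketched with a toy model. All the components below, the lognormal rainfall perturbation standing in for stochastic downscaling, the linear-reservoir runoff model and its parameters, are illustrative assumptions, not the CIMA chain itself:

```python
import random

def lumped_runoff(rain_mm: list, k: float = 0.3) -> float:
    """Toy linear-reservoir hydrological model: returns the peak
    discharge (arbitrary units) produced by an hourly rainfall series."""
    storage, peak = 0.0, 0.0
    for r in rain_mm:
        storage += r
        q = k * storage          # outflow proportional to storage
        storage -= q
        peak = max(peak, q)
    return peak

def ensemble_exceedance(forecast_mm, threshold, n_members=1000, seed=0):
    """Probability that the peak discharge exceeds a warning threshold,
    estimated by perturbing the LAM rainfall forecast (a crude stand-in
    for a stochastic downscaling step) and rerunning the model."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_members):
        member = [r * rng.lognormvariate(0.0, 0.4) for r in forecast_mm]
        if lumped_runoff(member) > threshold:
            hits += 1
    return hits / n_members

# A single deterministic run gives one peak; the ensemble turns the
# same forecast into an exceedance probability for the same threshold.
p = ensemble_exceedance([0, 5, 20, 35, 10, 2], threshold=15.0)
```

The deterministic configuration corresponds to a single call of `lumped_runoff` on the raw forecast; the probabilistic configuration replaces that yes/no answer with a probability, which is what makes the approach valuable for convective events whose rainfall placement is uncertain.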

    A probabilistic framework to define the design stress and acceptable defects under combined-cycle fatigue conditions

    Probabilistic routines are necessary to determine the permissible design stress when the operational conditions of traditional gas turbines are further challenged to reach higher performance in terms of flexibility. This paper focuses on the definition of a probabilistic tool aimed at simulating damage development under combined high- and low-cycle fatigue conditions, considering the stochastic nature of some of the variables introduced in the model. The variables considered in the present study are the long-crack threshold ΔK_th,LC, the slope C of the crack growth law in the so-called Paris region, the endurance limit Δσ_w0, and the variability of the applied stress, as it cannot be considered a fixed quantity in the complex loading scenario of gas turbines. A broad experimental campaign was performed to characterize the material properties in terms of short- and long-crack resistance under constant and combined cyclic conditions. The data were used to define the mean and variance of the material parameters feeding Monte Carlo simulations and, at a target probability of failure, to establish the design stress. This work aims to determine safety margins which strictly depend on the natural variability of the variables that govern damage accumulation in the mechanical component.
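The Monte Carlo scheme described above can be sketched as follows. This is a minimal illustration: only the Paris coefficient C is randomized, with assumed crack sizes, geometry factor and units, whereas the paper also randomizes ΔK_th,LC, the endurance limit Δσ_w0 and the applied stress:

```python
import math
import random

def cycles_to_failure(a0_m, ac_m, delta_sigma, c_paris, m=3.0, Y=1.12):
    """Integrate the Paris law da/dN = C * (dK)^m with
    dK = Y * dSigma * sqrt(pi * a); closed form valid for m != 2."""
    f = c_paris * (Y * delta_sigma * math.sqrt(math.pi))**m
    e = 1.0 - m / 2.0
    return (ac_m**e - a0_m**e) / (f * e)

def failure_probability(delta_sigma, n_service, trials=20000, seed=1):
    """Monte Carlo estimate of the failure probability at a given stress
    range: sample a lognormal Paris coefficient (illustrative scatter)
    and count samples whose life falls short of the service life."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(trials):
        c_paris = rng.lognormvariate(math.log(1e-12), 0.3)  # assumed units
        if cycles_to_failure(1e-4, 5e-3, delta_sigma, c_paris) < n_service:
            fails += 1
    return fails / trials
```

To obtain a design stress at a target probability of failure, one would invert this relation, e.g. bisect on `delta_sigma` until `failure_probability` matches the target; an acceptable initial defect size plays the same role through `a0_m`.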

    Relationship between molecular connectivity and carcinogenic activity: a confirmation with a new software program based on graph theory.

    For a database of 826 chemicals tested for carcinogenicity, we fragmented the structural formula of the chemicals into all possible contiguous-atom fragments with size between two and eight (nonhydrogen) atoms. The fragmentation was obtained using a new software program based on graph theory. We used 80% of the chemicals as a training set and 20% as a test set. The two sets were obtained by random sorting. From the training sets, an average (8 computer runs with independently sorted chemicals) of 315 different fragments were significantly (p < 0.125) associated with carcinogenicity or lack thereof. Even using this relatively low level of statistical significance, 23% of the molecules of the test sets lacked significant fragments. For 77% of the molecules of the test sets, we used the presence of significant fragments to predict carcinogenicity. The average level of accuracy of the predictions in the test sets was 67.5%. Chemicals containing only positive fragments were predicted with an accuracy of 78.7%. The level of accuracy was around 60% for chemicals characterized by contradictory fragments or only negative fragments. In a parallel manner, we performed eight paired runs in which carcinogenicity was attributed randomly to the molecules of the training sets. The fragments generated by these pseudo-training sets were devoid of any predictivity in the corresponding test sets. Using an independent software program, we confirmed (for the complex biological endpoint of carcinogenicity) the validity of a structure-activity relationship approach of the type proposed by Klopman and Rosenkranz with their CASE program.
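The fragment-counting scheme can be sketched with toy data. The fragment strings, the binomial significance test and the majority vote below are illustrative simplifications of the described procedure, not the actual graph-theoretic fragmentation software:

```python
import math
from collections import Counter

def binom_tail(k: int, n: int, p: float = 0.5) -> float:
    """P(X >= k) for X ~ Binomial(n, p): a crude one-sided test of
    whether a fragment is enriched in one class."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

def significant_fragments(training, alpha=0.125):
    """training: list of (fragment_set, is_carcinogen) pairs. Returns
    {fragment: +1 or -1} for fragments significantly biased toward
    carcinogens (+1) or non-carcinogens (-1) at level alpha."""
    pos, tot = Counter(), Counter()
    for frags, label in training:
        for f in frags:
            tot[f] += 1
            if label:
                pos[f] += 1
    signs = {}
    for f, n in tot.items():
        k = pos[f]
        if binom_tail(k, n) < alpha:          # enriched in carcinogens
            signs[f] = +1
        elif binom_tail(n - k, n) < alpha:    # enriched in non-carcinogens
            signs[f] = -1
    return signs

def predict(frags, signs):
    """Majority vote of significant fragments; None when the molecule
    carries no significant fragment (the ~23% unclassifiable case)."""
    score = sum(signs.get(f, 0) for f in frags)
    return None if score == 0 else score > 0
```

The randomized control in the abstract corresponds to shuffling the labels in `training`: with random labels, `significant_fragments` still emits a few spurious fragments at p < 0.125, but they carry no predictive power on held-out molecules.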