
    Probability of brittle failure

    A methodology was developed for collecting statistically representative data on crack initiation and arrest from a small number of test specimens. An epoxy (based on bisphenol A diglycidyl ether and polyglycol-extended diglycidyl ether, cured with diethylene triamine) was selected as a model material. A compact tension specimen under displacement-controlled loading is used to observe multiple crack initiations and arrests. The energy release rate at crack initiation is significantly higher than that at crack arrest, as has been observed elsewhere. The difference between these energy release rates is found to depend on specimen size (a scale effect) and is quantitatively related to the fracture surface morphology. The scale effect, similar to that in statistical strength theory, is usually attributed to the statistics of the defects which control the fracture process. Triangular ripples (deltoids) form on the fracture surface during slow subcritical crack growth, prior to the smooth mirror-like surface characteristic of fast cracks. The deltoids are complementary on the two crack faces, which excludes inelastic deformation from consideration. The presence of defects is also suggested by the observed scale effect; however, no defects are detectable at the deltoid apexes down to the 0.1 micron level.
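The scale effect the abstract invokes is the classical Weibull size effect from statistical strength theory: a larger stressed volume samples more defects and so fails at a lower stress. A minimal sketch of that relation (the function name and numbers are illustrative, not from the paper):

```python
def weibull_size_effect(sigma_ref, v_ref, v, m):
    """Strength predicted for stressed volume v, given a reference
    strength sigma_ref at volume v_ref and Weibull modulus m:
    sigma(v) = sigma_ref * (v_ref / v) ** (1/m)."""
    return sigma_ref * (v_ref / v) ** (1.0 / m)

# Doubling the stressed volume with m = 10 lowers strength by about 7%.
strength = weibull_size_effect(100.0, 1.0, 2.0, 10.0)
```

A high Weibull modulus m means a narrow defect-size distribution and a weak size effect; a low m means strong scatter and a pronounced one.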

    Fragility of the Commons under Prospect-Theoretic Risk Attitudes

    We study a common-pool resource game where the resource fails with a probability that grows with the aggregate investment in the resource. To capture decision making under such uncertainty, we model each player's risk preference by the value function from prospect theory. We show the existence and uniqueness of a pure Nash equilibrium when the players have heterogeneous risk preferences, under certain assumptions on the rate of return and the failure probability of the resource. Greater competition, vis-a-vis the number of players, increases the failure probability at the Nash equilibrium; we quantify this effect by bounding the ratio of the failure probability at the Nash equilibrium to the failure probability under investment by a single user. We further show that heterogeneity in attitudes towards loss aversion leads to a higher failure probability of the resource at the equilibrium.
    Comment: Accepted for publication in Games and Economic Behavior, 201
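The prospect-theoretic value function the abstract refers to is the standard Kahneman-Tversky form: concave over gains, convex and steeper over losses, with a loss-aversion coefficient. A minimal sketch with the commonly cited parameter estimates (the specific parameterization used in the paper may differ):

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky value function relative to a reference point 0:
    concave power for gains, convex power scaled by the loss-aversion
    coefficient lam > 1 for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta
```

Because lam > 1, a loss of a given size is felt more strongly than an equal gain; heterogeneity in lam across players is the kind of heterogeneity the abstract links to a higher equilibrium failure probability.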

    Remote Antenna Unit Selection Assisted Seamless Handover for High-Speed Railway Communications with Distributed Antennas

    To attain seamless handover and reduce the handover failure probability for high-speed railway (HSR) communication systems, this paper proposes a remote antenna unit (RAU) selection assisted handover scheme in which two antennas are installed on the high-speed train (HST) and a distributed antenna system (DAS) cell architecture is adopted on the ground. RAU selection provides high-quality received signals for trains moving through DAS cells, and the two HST antennas realize seamless handover. Moreover, to efficiently evaluate system performance, a new metric termed the handover occurrence probability is defined to describe the relation between the handover occurrence position and the handover failure probability. We then analyze the received signal strength, the handover trigger probability, the handover occurrence probability, the handover failure probability and the communication interruption probability. Numerical results compare the proposed scheme with existing ones and show that it achieves better performance in terms of handover failure probability and communication interruption probability.
    Comment: 7 figures, accepted by IEEE VTC-Spring, 201
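The dependence of handover failure probability on the handover occurrence position can be illustrated with a toy Monte Carlo model: the handover fails if its execution time exceeds the time the train has left inside the cell overlap region. All parameters and the exponential execution-time assumption are illustrative, not taken from the paper:

```python
import random

def handover_failure_prob(overlap_len, speed, trigger_pos,
                          exec_mean, trials=100_000, seed=1):
    """Monte Carlo estimate of the probability that a handover triggered
    at trigger_pos (metres into an overlap region of length overlap_len)
    fails, i.e. its exponentially distributed execution time exceeds the
    time left before the train, moving at `speed` m/s, exits the overlap."""
    rng = random.Random(seed)
    time_left = (overlap_len - trigger_pos) / speed
    fails = sum(rng.expovariate(1.0 / exec_mean) > time_left
                for _ in range(trials))
    return fails / trials
```

The later the handover is triggered within the overlap, the less time remains and the higher the failure probability, which is the position dependence the handover occurrence probability metric is designed to capture.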

    A Merton Model Approach to Assessing the Default Risk of UK Public Companies

    This paper shows how a Merton-model approach can be used to develop measures of the probability of failure of quoted UK companies. Probability estimates are constructed for a group of failed companies and their properties as leading indicators of failure are assessed. Probability estimates of failure are also constructed for a control group of surviving companies. These are used in Probit regressions to evaluate the information content of the Merton-based estimates relative to the information available in company accounts. The paper shows that there is much useful information in the Merton-style estimates.
    Keywords: Merton models, corporate failure, implied default probabilities
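In the Merton framework, default occurs when a firm's asset value falls below the face value of its debt at the horizon, with assets following geometric Brownian motion. A minimal sketch of the textbook default-probability formula (the paper's exact calibration of asset value, drift and volatility from equity data is not reproduced here):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def merton_default_prob(V, D, mu, sigma, T):
    """P(assets < debt at horizon T) under the Merton model:
    V  -- current asset value
    D  -- face value of debt due at T
    mu -- asset drift, sigma -- asset volatility (annualized)."""
    d2 = (math.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return norm_cdf(-d2)

# A firm with assets twice its debt and modest volatility is a low
# default risk at a one-year horizon.
pd = merton_default_prob(200.0, 100.0, 0.05, 0.2, 1.0)
```

The quantity d2 is the distance to default in standard deviations; the failure probabilities fed into the paper's Probit regressions are of this type.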

    Reliability of Erasure Coded Storage Systems: A Geometric Approach

    We consider the probability of data loss, or equivalently, the reliability function, for an erasure coded distributed data storage system under worst case conditions. Data loss in an erasure coded system depends on the probability distributions of the disk repair duration and the disk failure duration. In previous works, the data loss probability of such systems has been studied under the assumption of exponentially distributed disk failure and repair durations, using well-known analytic methods from the theory of Markov processes. These methods lead to an estimate of the integral of the reliability function. Here, we address the problem of directly calculating the data loss probability for general repair and failure duration distributions. A closed limiting form is developed for the probability of data loss, and it is shown that the probability of the event that a repair duration exceeds a failure duration is sufficient for characterizing the data loss probability. For the case of constant repair duration, we develop an expression for the conditional data loss probability given the number of failures experienced by each node in a given time window. We do so by developing a geometric approach that relies on computing the volumes of a family of polytopes related to the code. An exact calculation is provided, and an upper bound on the data loss probability is obtained by posing the problem as a set avoidance problem. Theoretical calculations are compared to simulation results.
    Comment: 28 pages, 8 figures. Presented in part at the IEEE International Conference on Big Data 2013, Santa Clara, CA, Oct. 2013, and in part at the 2014 IEEE Information Theory Workshop, Tasmania, Australia, Nov. 2014. New analysis added May 2015. Further update Aug. 201
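For the exponential special case the abstract contrasts with, the characterizing event "a repair duration exceeds a failure duration" has a simple closed form that a Monte Carlo run can confirm. This sketch illustrates only that baseline event probability, not the paper's polytope-volume analysis:

```python
import random

def p_repair_exceeds_failure(repair_rate, failure_rate):
    """Closed form for independent exponential durations
    R ~ Exp(repair_rate), F ~ Exp(failure_rate):
    P(R > F) = failure_rate / (failure_rate + repair_rate)."""
    return failure_rate / (failure_rate + repair_rate)

def mc_estimate(repair_rate, failure_rate, trials=200_000, seed=7):
    """Monte Carlo check of the closed form."""
    rng = random.Random(seed)
    hits = sum(rng.expovariate(repair_rate) > rng.expovariate(failure_rate)
               for _ in range(trials))
    return hits / trials
```

When repairs are fast relative to failures (repair_rate much larger than failure_rate), this probability is small, which is why well-provisioned systems rarely lose data.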

    Some advances in importance sampling of reliability models based on zero variance approximation

    We are interested in estimating, through simulation, the probability of entering a rare failure state before a regeneration state. Since this probability is typically small, we apply importance sampling. The method we use is based on finding the most likely paths to failure. We present an algorithm that is guaranteed to produce an estimator meeting the conditions presented in [10], [9] for vanishing relative error. We furthermore demonstrate how the procedure used to obtain the change of measure can be executed a second time to achieve further variance reduction, using ideas from [5], and we also apply this technique to the method of failure biasing, with which we compare our results.
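The zero-variance construction and failure biasing in the abstract are beyond a short sketch, but the underlying mechanism of importance sampling for a rare event can be illustrated on a toy problem: estimating a small tail probability by sampling from a tilted distribution that hits the rare region often, and reweighting each sample by the likelihood ratio. Everything here is illustrative, not the paper's algorithm:

```python
import math
import random

def is_tail_prob(t, theta=None, trials=100_000, seed=3):
    """Importance-sampling estimate of P(X > t) for X ~ Exp(1).
    Samples are drawn from the proposal Exp(theta) with theta < 1,
    so the tail {x > t} is no longer rare, and each hit is weighted
    by the likelihood ratio f(x)/g(x) = exp(-x) / (theta*exp(-theta*x))."""
    rng = random.Random(seed)
    if theta is None:
        theta = 1.0 / t  # place the proposal's mean at the rare threshold
    total = 0.0
    for _ in range(trials):
        x = rng.expovariate(theta)
        if x > t:
            total += math.exp(-x) / (theta * math.exp(-theta * x))
    return total / trials
```

Naive simulation would need on the order of exp(t) samples to see the event at all; the tilted estimator reaches a few percent relative error with the same budget, which is the kind of variance reduction the zero-variance approximation pushes to its limit.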