Reliability Analysis Model
The RAM program determines the probability of success for one or more given objectives in any complex system. The program includes failure mode and effects, criticality, and reliability analyses, as well as some aspects of operations, safety, flight technology, systems design engineering, and configuration analysis.
Reliability analysis of distribution network
Knowledge of the reliability of distribution networks and systems is an important consideration in system planning and operation for the development and improvement of power distribution systems. To keep customer interruptions to a minimum, utilities must strive to improve reliability while simultaneously reducing cost. It is well established that most customer interruptions are caused by failures in the distribution system. However, valid data are not easy to collect, and reliability performance statistics are not easy to obtain; there is always uncertainty associated with distribution network reliability. Evaluating and analysing reliability requires data on the number and range of the equipment examined, so it is important to maintain a database of failure rates, repair times, and unavailability for each component in the distribution network. This study analyses SESB's distribution substations and network systems using analytical methods to determine the reliability indices and the effect of substation configuration and network topology on those indices. The results obtained are then compared with actual data from SESB to identify the areas of improvement required, for mutual benefit and for future studies.
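The analytical method referred to above computes customer-oriented reliability indices from per-load-point failure rates and repair times. A minimal sketch of that computation, with all customer counts, failure rates, and repair times invented for illustration:

```python
# Hypothetical illustration: computing SAIFI, SAIDI and CAIDI for a feeder
# from per-load-point data (all numbers below are invented, not SESB data).

# Each load point: (customers served N, interruptions per year lambda,
# average outage duration per interruption r, in hours).
load_points = [
    (1200, 0.35, 2.0),
    (800,  0.50, 1.5),
    (400,  0.20, 4.0),
]

total_customers = sum(n for n, _, _ in load_points)

# SAIFI = sum(lambda_i * N_i) / sum(N_i)   [interruptions/customer/year]
saifi = sum(lam * n for n, lam, _ in load_points) / total_customers

# SAIDI = sum(lambda_i * r_i * N_i) / sum(N_i)   [hours/customer/year]
saidi = sum(lam * r * n for n, lam, r in load_points) / total_customers

# CAIDI = SAIDI / SAIFI   [hours per customer interruption]
caidi = saidi / saifi

print(f"SAIFI = {saifi:.3f} int/cust/yr")
print(f"SAIDI = {saidi:.3f} h/cust/yr")
print(f"CAIDI = {caidi:.3f} h/int")
```

Comparing such computed indices against recorded outage statistics is what reveals where the configuration under-performs.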
Creep-rupture reliability analysis
A probabilistic approach to the correlation and extrapolation of creep-rupture data is presented. Time-temperature parameters (TTPs) are used to correlate the data, and an analytical expression for the master curve is developed. The expression provides a simple model for the statistical distribution of strength and fits neatly into a probabilistic design format. The analysis focuses on the Larson-Miller and on the Manson-Haferd parameters, but it can be applied to any of the TTPs. A method is developed for evaluating material-dependent constants for TTPs. It is shown that optimized constants can provide a significant improvement in the correlation of the data, thereby reducing modelling error. Attempts were made to quantify the performance of the proposed method in predicting long-term behavior. Uncertainty in predicting long-term behavior from short-term tests was derived for several sets of data. Examples are presented which illustrate the theory and demonstrate the application of state-of-the-art reliability methods to the design of components under creep.
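A hedged sketch of how a TTP such as the Larson-Miller parameter correlates short-term and long-term rupture data; the constant C and all test values below are invented for illustration, not taken from the paper:

```python
import math

# Larson-Miller correlation: P = T * (C + log10 t), where T is absolute
# temperature, t is rupture time in hours, and C is a material constant
# (C = 20 is a common textbook value; the paper optimizes C per material).
C = 20.0

def lmp(T_kelvin, t_hours):
    """Larson-Miller parameter for a rupture test at temperature T, life t."""
    return T_kelvin * (C + math.log10(t_hours))

def rupture_time(T_kelvin, P):
    """Invert P = T*(C + log10 t) to predict life t at a new temperature."""
    return 10 ** (P / T_kelvin - C)

# Hypothetical short-term test: rupture after 100 h at 900 K.
P = lmp(900.0, 100.0)
# Same stress implies same P, so extrapolate the life at 850 K:
t_long = rupture_time(850.0, P)
print(f"P = {P:.0f}, predicted life at 850 K = {t_long:.0f} h")
```

In the probabilistic setting described above, the scatter of measured P values about the master curve is what supplies the statistical distribution of strength.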
Computation-based reliability analysis
The growing structural and functional complexity of man-made systems is reported, showing that the reliability analysis task is becoming much more complex. Quantitative methods for analyzing system quality are discussed. The reliability equations developed for systems employing modular redundancy and sparing are discussed to illustrate the need for computation-based reliability analysis.
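As a small illustration of the kind of reliability equation that modular redundancy gives rise to, consider triple modular redundancy (TMR) with an assumed perfect voter; the example values are invented:

```python
# TMR with a perfect voter survives if at least 2 of 3 identical modules
# work. With module reliability R, the standard result is:
#   R_TMR = R^3 + 3*R^2*(1 - R) = 3*R^2 - 2*R^3
def r_simplex(r):
    return r

def r_tmr(r):
    return 3 * r**2 - 2 * r**3

for r in (0.99, 0.95, 0.90, 0.40):
    print(f"R={r:.2f}: simplex={r_simplex(r):.4f}, TMR={r_tmr(r):.4f}")
# Note: TMR improves on a simplex module only when R > 0.5; below that
# majority voting makes things worse.
```

Once sparing, imperfect voters, and repair are added, such closed forms quickly become unwieldy, which is the argument for computation-based analysis.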
RowHammer: Reliability Analysis and Security Implications
As process technology scales down to smaller dimensions, DRAM chips become
more vulnerable to disturbance, a phenomenon in which different DRAM cells
interfere with each other's operation. For the first time in academic
literature, our ISCA paper exposes the existence of disturbance errors in
commodity DRAM chips that are sold and used today. We show that repeatedly
reading from the same address could corrupt data in nearby addresses. More
specifically: When a DRAM row is opened (i.e., activated) and closed (i.e.,
precharged) repeatedly (i.e., hammered), it can induce disturbance errors in
adjacent DRAM rows. This failure mode is popularly called RowHammer. We tested
129 DRAM modules manufactured within the past six years (2008-2014) and found
110 of them to exhibit RowHammer disturbance errors, the earliest of which
dates back to 2010. In particular, all modules from the past two years
(2012-2013) were vulnerable, which implies that the errors are a recent
phenomenon affecting more advanced generations of process technology.
Importantly, disturbance errors pose an easily-exploitable security threat
since they are a breach of memory protection, wherein accesses to one page
(mapped to one row) modifies the data stored in another page (mapped to an
adjacent row).
Comment: This is a summary of the paper titled "Flipping Bits in Memory Without Accessing Them: An Experimental Study of DRAM Disturbance Errors", which appeared in ISCA in June 2014.
Quantifying Information Leaks Using Reliability Analysis
acmid: 2632367; keywords: Model Counting, Quantitative Information Flow, Reliability Analysis, Symbolic Execution; location: San Jose, CA, USA; numpages: 4
We report on our work in progress on the use of reliability analysis to quantify information leaks. In recent work we have proposed a software reliability analysis technique that uses symbolic execution and model counting to quantify the probability of reaching designated program states, e.g. assert violations, under uncertainty conditions in the environment. The technique has many applications beyond reliability analysis, ranging from program understanding and debugging to analysis of cyber-physical systems. In this paper we report on a novel application of the technique, namely Quantitative Information Flow analysis (QIF). The goal of QIF is to measure the information leakage of a program by using information-theoretic metrics such as Shannon entropy or Rényi entropy. We exploit the model counting engine of the reliability analyzer over symbolic program paths to compute an upper bound of the maximum leakage over all possible distributions of the confidential data. We have implemented our approach in a prototype tool, called QILURA, and explore its effectiveness on a number of case studies.
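The bound computed over all distributions rests on a standard channel-capacity result: for a deterministic program, leakage is at most the log of the number of distinct observable outputs, and model counting determines how many secrets map to each output. A toy sketch (the program and all values are invented, not QILURA's machinery):

```python
import math
from collections import Counter

def program(secret):
    # Toy deterministic program over a 4-bit secret: the observable output
    # reveals only whether the secret exceeds a threshold.
    return 1 if secret > 9 else 0

# "Model counting" by brute force: how many secrets yield each observable.
inputs = range(16)  # all 4-bit secrets
counts = Counter(program(s) for s in inputs)
print(dict(counts))  # secrets per observable output

# Channel-capacity bound: max leakage over all priors <= log2(#observables).
max_leak_bits = math.log2(len(counts))
print(f"max leakage <= {max_leak_bits:.1f} bits")
```

In the symbolic-execution setting, each path condition plays the role of one observable class and the model counter replaces the brute-force enumeration.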
Reliability Analysis of Pavement Performance Prediction
The prediction of road pavement performance may be facilitated by appropriate models developed by analyzing sets of historical data or data collected from accelerated pavement testing facilities. However, there may be systematic or random errors in these data sets, and the data sets may not be complete or may not represent the full range of conditions likely to occur in the field. As a result, the predictions made by the models may not be fully accurate and may include a degree of uncertainty. Therefore, ideally, the behavioral modeling of long-term pavement performance should include a procedure for taking into account the uncertainty in the data and quantifying it accordingly. This paper presents such a methodology, which first defines the reliability of pavement performance predictions and its associated risk using a probabilistic approach. It then demonstrates how the reliability of pavement performance predictions can be estimated by considering the variability of the parameters (such as pavement strength, cumulative equivalent standard axle load, and initial pavement roughness) that make up the performance model. A framework is presented that uses Monte Carlo simulation to evaluate the effect of the model parameters' variability on the allowable cumulative equivalent standard axle load applications. The analysis demonstrates that data variability has a significant influence on the reliability of pavement performance prediction.
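The Monte Carlo framework described above can be sketched as follows; the performance model, its coefficients, and all distribution parameters are hypothetical stand-ins, not the paper's calibrated model:

```python
import random

random.seed(42)

def allowable_esa(snc, iri0):
    # Hypothetical performance model: allowable cumulative equivalent
    # standard axles (millions) grows with structural number SNC and
    # shrinks with initial roughness IRI0 (coefficients invented).
    return 0.8 * snc ** 3.5 / iri0

design_esa = 20.0  # demanded traffic over the design period (millions)

# Sample the uncertain inputs and count how often capacity falls short.
n, failures = 100_000, 0
for _ in range(n):
    snc  = random.gauss(4.0, 0.4)   # pavement strength (structural number)
    iri0 = random.gauss(2.0, 0.3)   # initial roughness (m/km)
    if allowable_esa(snc, iri0) < design_esa:
        failures += 1

reliability = 1 - failures / n
print(f"estimated reliability = {reliability:.3f}")
```

Re-running with wider input distributions shows directly how data variability erodes the estimated reliability, which is the paper's central point.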
Reliability analysis of a glulam beam
The present case study is an example of the use of reliability analysis to assess the failure probability of a tapered glulam beam. The beam is part of a real structure built for a supermarket in the town of Kokemäki in Finland. The reliability analysis is carried out using the snow load statistics available from the site and material strength information available from previous experiments. Eurocode 5 and the Finnish building code are used as the deterministic methods to which the probabilistic method is compared. The calculations show that the effect of the strength variation is not significant when the coefficient of variation of the strength is around 15%, as usually assumed for glulam. The probability of failure resulting from a deterministic design based on Eurocode 5 is low compared to the target values, and smaller sections are possible if a probabilistic design method is applied. In fire design, if a 60 min resistance is required, this is not the case: according to Eurocode 5 design procedures, a larger section would be required. However, a probabilistic fire analysis yields bounds for the yearly probability of failure that are comparable to the target value and to the values obtained from the normal probabilistic design. (C) 2006 Elsevier Ltd. All rights reserved.
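The kind of limit-state evaluation underlying such a study can be sketched with Monte Carlo sampling; all distribution parameters below are invented for illustration and are not taken from the case study:

```python
import math
import random

random.seed(1)

# Failure occurs when the load effect S exceeds the resistance R,
# i.e. when the limit-state function g = R - S is negative.

def sample_resistance():
    # Lognormal strength with mean 30 (arbitrary units) and CoV 15%,
    # the coefficient of variation commonly assumed for glulam.
    mean, cov = 30.0, 0.15
    sigma_ln = math.sqrt(math.log(1 + cov**2))
    mu_ln = math.log(mean) - 0.5 * sigma_ln**2
    return random.lognormvariate(mu_ln, sigma_ln)

def sample_load():
    # Annual-maximum snow load effect, Gumbel distributed via inverse
    # transform sampling (location and scale invented).
    mu, beta = 12.0, 2.0
    return mu - beta * math.log(-math.log(random.random()))

n = 200_000
fails = sum(1 for _ in range(n) if sample_resistance() - sample_load() < 0)
pf = fails / n
print(f"estimated yearly P_f = {pf:.2e}")
```

Comparing such an estimated yearly failure probability against a code-specified target value is what allows the smaller sections mentioned above to be justified.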
