Percentile estimation using the normal and lognormal probability distribution
Implicitly or explicitly, percentile estimation is an important aspect of the analysis of aerial radiometric survey data. Standard deviation maps are produced for quadrangles surveyed as part of the National Uranium Resource Evaluation. These maps show where variables differ from their mean values by more than one, two, or three standard deviations. Data may or may not be log-transformed prior to analysis. These maps have specific percentile interpretations only when the proper distributional assumptions are met. Monte Carlo results are presented in this paper which show the consequences of estimating percentiles by (1) assuming normality when the data are really from a lognormal distribution, and (2) assuming lognormality when the data are really from a normal distribution.
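As a rough illustration of the comparison described above, the following Monte Carlo sketch (not the paper's exact study design; the sample size, trial count, and distribution parameters are arbitrary choices) estimates an upper percentile from lognormal samples under both assumptions and compares the averages with the true value.

```python
import numpy as np

# Minimal Monte Carlo sketch: draw samples from a lognormal parent, then
# estimate an upper percentile (1) assuming normality and (2) assuming
# lognormality, and compare both with the true percentile of the parent.
rng = np.random.default_rng(0)

mu, sigma = 0.0, 1.0           # parameters of the underlying normal (log scale)
n, trials = 100, 2000          # sample size and number of Monte Carlo trials
z = 2.0                        # "two standard deviations" => ~97.7th percentile
true_pctl = np.exp(mu + z * sigma)   # exact percentile of the lognormal parent

normal_est, lognormal_est = [], []
for _ in range(trials):
    x = rng.lognormal(mean=mu, sigma=sigma, size=n)
    # (1) assume normality: mean + z * sd on the raw scale
    normal_est.append(x.mean() + z * x.std(ddof=1))
    # (2) assume lognormality: mean + z * sd on the log scale, then exponentiate
    lx = np.log(x)
    lognormal_est.append(np.exp(lx.mean() + z * lx.std(ddof=1)))

print(f"true percentile            : {true_pctl:.3f}")
print(f"normal-assumption estimate : {np.mean(normal_est):.3f}")
print(f"lognormal estimate         : {np.mean(lognormal_est):.3f}")
```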
Detonation probabilities of high explosives
The probability of a high-explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs with the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from the two approaches are compared.
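As a loose illustration of combining drop-test evidence, the sketch below pools two hypothetical sets of drop outcomes into a single HEVR probability estimate with an exact binomial upper bound; all counts are placeholders, and this is not necessarily the combination method used in the paper.

```python
from scipy import stats

# Illustrative pooling of two drop-test data sources into one HEVR probability
# estimate with an exact (Clopper-Pearson) upper confidence bound.
# All counts below are hypothetical placeholders, not values from the paper.
he_reactions, he_drops = 1, 60            # hypothetical HE drop-test results
floor_reactions, floor_drops = 0, 90      # hypothetical flooring certification tests

k = he_reactions + floor_reactions
n = he_drops + floor_drops

p_hat = k / n
# Exact one-sided 95% upper bound on the reaction probability.
upper_95 = stats.beta.ppf(0.95, k + 1, n - k)

print(f"pooled point estimate P(HEVR | drop): {p_hat:.4f}")
print(f"95% upper confidence bound          : {upper_95:.4f}")
```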
Characterizing reliability in a product/process design-assurance program
Over the years, many advanced reliability-engineering techniques have emerged in the military sphere, and one of these is Reliability Growth Testing (RGT). Private industry has considered RGT as part of the solution to its reliability concerns, but many practical considerations have slowed its implementation. RGT's objective is to demonstrate the reliability requirement of a new product with a specified confidence. This paper speaks directly to that objective but discusses a somewhat different approach to achieving it. Rather than conducting testing as a continuum and developing statistical confidence bands around the results, this Bayesian updating approach starts with a reliability estimate characterized by large uncertainty and then reduces that uncertainty by folding in fresh information within a Bayesian framework.
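The following sketch illustrates the Bayesian-updating idea in its simplest conjugate form: a diffuse Beta prior on reliability is updated with successive batches of pass/fail results, and the credible interval narrows as information is folded in. The prior and test counts are hypothetical, and the paper's actual model may differ.

```python
from scipy import stats

# Sketch of Bayesian updating of a reliability estimate: start with a diffuse
# prior and fold in batches of pass/fail test results, watching the credible
# interval narrow. Prior parameters and test counts are hypothetical.
a, b = 2.0, 1.0                        # weak prior: reliability likely high, very uncertain
batches = [(18, 2), (29, 1), (48, 2)]  # (successes, failures) per test campaign

for i, (succ, fail) in enumerate(batches, start=1):
    a, b = a + succ, b + fail          # conjugate Beta-binomial update
    post = stats.beta(a, b)
    lo, hi = post.ppf(0.05), post.ppf(0.95)
    print(f"after batch {i}: mean R = {post.mean():.3f}, "
          f"90% interval = ({lo:.3f}, {hi:.3f}), width = {hi - lo:.3f}")
```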
Nondestructive verification of the exposure of heavy-water reactor fuel elements
Relative exposures of 137 irradiated heavy-water reactor fuel elements were determined from measured fission-product activities using high-resolution gamma-ray spectrometry. Exposures ranged from 100 to 1000 MWd/tU. Correlations between various gamma-ray signatures of specific fission products and operator-declared exposure values were calculated. Axial gamma-ray profiles were measured using intrinsic germanium, cadmium telluride, and ionization chamber detectors.
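A minimal sketch of the kind of correlation described above is shown below, fitting a linear calibration between a fission-product gamma-ray signature and declared exposure; the measurement values are hypothetical placeholders, not data from the study.

```python
import numpy as np

# Minimal sketch of correlating a fission-product gamma-ray signature with
# operator-declared exposure. The activities and exposures below are
# hypothetical placeholders, not data from the paper.
declared_exposure = np.array([110, 240, 380, 520, 660, 800, 950])   # MWd/tU
cs137_signal = np.array([0.9, 2.1, 3.2, 4.4, 5.7, 6.6, 8.1])        # relative counts

r = np.corrcoef(cs137_signal, declared_exposure)[0, 1]
slope, intercept = np.polyfit(cs137_signal, declared_exposure, deg=1)

print(f"correlation coefficient: {r:.3f}")
print(f"calibration: exposure ~ {slope:.1f} * signal + {intercept:.1f} MWd/tU")

# Predict exposure for a newly measured element from its gamma signature.
new_signal = 3.8
print(f"predicted exposure for signal {new_signal}: "
      f"{slope * new_signal + intercept:.0f} MWd/tU")
```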
Geostatistics project of the National Uranium Resource Evaluation program. Progress report, October 1979-March 1980
During the period covered by this report, the authors investigated the serial properties of aerial radiometric data. Results were applied to the choice of minimum segment width in the maximum-variance-segments algorithm and to the use of aerial radiometric data in the design of ground sampling experiments. The report also presents the results of a comparison of normal and lognormal percentile estimation techniques. Twenty-two quadrangles are being analyzed in the search for a uranium favorability index. Computer codes developed during this investigation have been provided to the Bendix Field Engineering Corporation in Grand Junction, Colorado.
Geostatistics project of the National Uranium Resource Evaluation program. Progress report, April 1981-September 1981
During the period covered by this report, we proposed a method of comparing aerial and ground data as a means of assessing the quality of aerial data. We also compared two methods of partitioning count rates among several isotopes. Time-series analysis and analysis of variance were considered as tools for using aerial radiometric data to aid in designing ground-based sampling experiments. Several methods of computing covariance matrices were compared for use with very large data sets. A study showed the potential for using aerial radiometric data to rank quadrangles according to the Department of Energy's estimated uranium inventories. A discriminant analysis code was transferred to Grand Junction, and several statistics short courses were presented there. Recommended cluster analysis procedures were developed and applied to aerial radiometric data. Reports and papers were prepared on topics such as outlier detection, percentile estimation, discriminant analysis, and statistical package comparison.
Application of nondestructive gamma-ray and neutron techniques for the safeguarding of irradiated fuel materials
Nondestructive gamma-ray and neutron techniques were used to characterize the exposures of irradiated fuel assemblies. Techniques for the rapid measurement of the axial-activity profiles of fuel assemblies have been developed using ion chambers and Be(γ,n) detectors. Detailed measurements using high-resolution gamma-ray spectrometry and passive neutron techniques were correlated with operator-declared values of cooling times and burnup.
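One step such correlations typically require is decay-correcting a measured activity back to the discharge date using the declared cooling time before relating it to burnup. The sketch below does this for a Cs-137 signature; the measured value and cooling time are hypothetical, and this is standard decay arithmetic rather than necessarily the paper's exact procedure.

```python
import math

# Decay-correct a measured Cs-137 activity back to the discharge date using the
# operator-declared cooling time, before correlating the corrected activity
# with burnup. The measured value and cooling time are hypothetical; the
# 30.17-year Cs-137 half-life is a physical constant.
CS137_HALF_LIFE_YEARS = 30.17

def correct_to_discharge(measured_activity, cooling_time_years):
    """Remove the decay that occurred during cooling: A0 = A * 2**(t / T_half)."""
    return measured_activity * 2.0 ** (cooling_time_years / CS137_HALF_LIFE_YEARS)

measured = 5.4e3       # hypothetical measured Cs-137 count rate (arbitrary units)
cooling = 4.5          # hypothetical declared cooling time in years

print(f"activity corrected to discharge: {correct_to_discharge(measured, cooling):.1f}")
```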
Computing maximum-scoring segments in almost linear time
Given a sequence, the problem studied in this paper is to find a set of k disjoint contiguous subsequences such that the total sum of all elements in the set is maximized. This problem arises naturally in the analysis of DNA sequences. The previous best known algorithm requires Θ(n log n) time in the worst case. For a given sequence of length n, we present an almost linear-time algorithm for this problem. Our algorithm uses a disjoint-set data structure and requires O(nα(n, n)) time in the worst case, where α(n, n) is the inverse Ackermann function.
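The almost-linear disjoint-set algorithm itself is not reproduced here, but the following quadratic dynamic-programming baseline makes the problem statement concrete: it computes the maximum total sum achievable with at most k disjoint contiguous segments in O(nk) time, which is useful as a reference check on small inputs.

```python
def max_k_segments_total(a, k):
    """O(n*k) dynamic-programming baseline: maximum total sum of at most k
    disjoint contiguous segments of a. Not the paper's O(n*alpha(n,n))
    disjoint-set algorithm; intended only as a reference for small inputs."""
    NEG = float("-inf")
    # ending[j] = best total with j segments where the j-th segment ends at the
    #             current element; best[j] = best total with j segments so far.
    ending = [NEG] * (k + 1)
    best = [NEG] * (k + 1)
    best[0] = 0.0
    for x in a:
        for j in range(k, 0, -1):
            # either extend the j-th segment with x, or start the j-th segment
            # at x after the best (j-1)-segment solution seen so far
            ending[j] = max(ending[j], best[j - 1]) + x
            if ending[j] > best[j]:
                best[j] = ending[j]
    return max(best)

# Example: with k = 2 the optimal segments are [4, -1, 5] and [6], total 14.
print(max_k_segments_total([4, -1, 5, -8, 6, -3, 2], 2))
```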