Fracture mechanics evaluation of Ti-6Al-4V pressure vessels
A computer program calculates the maximum potential flaw depth after a specific cyclic pressure history. Kobayashi's solution to the critical stress intensity equation and an empirical relation for the flaw growth rate are used. The program assesses pressure vessels of any material, but only cylindrical or spherical shapes.
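The abstract names the ingredients (a stress intensity solution plus an empirical flaw growth relation applied cycle by cycle) but not the equations. As a hedged illustration only, a calculation of this general shape can be sketched with a Paris-type growth law; the constants, geometry factor, and exponent below are invented round numbers, not values from the report, and Kobayashi's surface-flaw solution is replaced by a simple geometry factor `Y`:

```python
import math

def grow_flaw(a0, cycles, C=1e-11, m=3.0, Y=1.12, dsigma=300.0):
    """Integrate a Paris-type flaw growth law da/dN = C * (dK)^m
    cycle by cycle. Units: flaw depth a in metres, stress range
    dsigma in MPa, stress intensity range dK in MPa*sqrt(m).
    All constants are illustrative only."""
    a = a0
    for _ in range(cycles):
        dK = Y * dsigma * math.sqrt(math.pi * a)  # stress intensity range
        a += C * dK ** m                          # growth increment per cycle
    return a

# Flaw depth after 10,000 pressure cycles, starting from 1 mm:
final_depth = grow_flaw(a0=1e-3, cycles=10_000)
```

A real program of this kind would also compare the final depth against the critical flaw size from the stress intensity solution to decide acceptance.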
An Optimal Skorokhod Embedding for Diffusions
Given a Brownian motion B and a general target law μ (not necessarily
centered or even integrable) we show how to construct an embedding of μ in
B. This embedding is an extension of an embedding due to Perkins, and is
optimal in the sense that it simultaneously minimises the distribution of the
maximum and maximises the distribution of the minimum among all embeddings of
μ. The embedding is then applied to regular diffusions, and used to
characterise the target laws for which such an embedding may be found.
Comment: 22 pages, 4 figures
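The simplest instance of Skorokhod embedding, which constructions like the above generalise, is easy to verify numerically: a centered two-point law on {-a, b} is embedded in Brownian motion by the first exit time of (-a, b), giving value b with probability a/(a+b). A hedged Monte Carlo sketch, with a scaled symmetric random walk standing in for the Brownian path (step size, trial count, and seed are arbitrary choices):

```python
import random

def embed_two_point(a, b, step=0.05, trials=5000, seed=1):
    """Approximate the Skorokhod embedding of the centered two-point law
    on {-a, b}: run a symmetric random walk (a stand-in for Brownian
    motion, on a grid of the given step) until it first exits (-a, b),
    and record which side it exits. By gambler's-ruin, the exit value
    is b with probability a/(a+b)."""
    na, nb = round(a / step), round(b / step)  # barriers in step units
    rng = random.Random(seed)
    hits_b = 0
    for _ in range(trials):
        k = 0
        while -na < k < nb:
            k += 1 if rng.random() < 0.5 else -1
        if k == nb:
            hits_b += 1
    return hits_b / trials  # empirical P(B_tau = b)

p = embed_two_point(a=1.0, b=2.0)  # expect roughly a/(a+b) = 1/3
```

The paper's construction handles general (not just two-point) laws and controls the path's running maximum and minimum, which this toy exit-time embedding does not attempt.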
Frequentist statistics as a theory of inductive inference
After some general remarks about the interrelation between philosophical and
statistical thinking, the discussion centres largely on significance tests.
These are defined as the calculation of p-values rather than as formal
procedures for ``acceptance'' and ``rejection.'' A number of types of null
hypothesis are described and a principle for evidential interpretation set out
governing the implications of p-values in the specific circumstances of each
application, as contrasted with a long-run interpretation. A variety of more
complicated situations are discussed in which modification of the simple
p-value may be essential.
Comment: Published at http://dx.doi.org/10.1214/074921706000000400 in the IMS Lecture Notes--Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org)
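The paper is about the interpretation of p-values rather than their computation, but the object under discussion is concrete. As a hedged illustration, the p-value for one of the simplest null hypotheses (a two-sided one-sample z-test with known standard deviation; the data below are made up) is computed as the null probability of a result at least as extreme as the one observed:

```python
import math

def z_test_p_value(xs, mu0, sigma):
    """Two-sided p-value for H0: population mean = mu0, with known
    standard deviation sigma: P(|Z| >= |z_obs|) under the null,
    computed via the normal tail function erfc."""
    n = len(xs)
    xbar = sum(xs) / n
    z = (xbar - mu0) / (sigma / math.sqrt(n))
    return math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))

p = z_test_p_value([2.1, 1.9, 2.4, 2.6, 2.0], mu0=2.0, sigma=0.5)
```

The paper's point is that what such a number licenses evidentially depends on the type of null hypothesis and the circumstances of the application, not only on the long-run error rate of the procedure.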
Stability and photochemistry of ClO dimers formed at low temperature in the gas phase
The recent observations of elevated concentrations of the ClO radical in the austral spring over Antarctica have implicated catalytic destruction by chlorine in the large depletions seen in the total ozone column. One of the chemical theories consistent with an elevated concentration of ClO is a cycle involving the formation of the ClO dimer through the association reaction ClO + ClO → Cl2O2 and the photolysis of the dimer to give the active Cl species necessary for O3 depletion. Here, researchers report experimental studies designed to characterize the dimer of ClO formed by the association reaction at low temperatures. ClO was produced by static photolysis of several different precursor systems: Cl2 + O3; Cl2O2; OClO + Cl2O. The reaction mixtures were followed by spectroscopy in the U.V. region, which allowed the time dependence of Cl2, Cl2O, ClO, OClO, O3 and other absorbing molecules to be determined.
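The competition the abstract describes, between dimer formation, thermal redissociation, and photolysis, can be made concrete with a toy rate-equation integration. The rate constants, photolysis frequency, initial concentration, and time step below are invented round numbers for illustration only, not measurements from the study, and the photolysis products are not tracked:

```python
def clo_dimer_kinetics(clo0=1e12, kf=1e-14, kd=1e-3, j=1e-3,
                       dt=0.01, steps=100_000):
    """Forward-Euler toy model of ClO + ClO -> Cl2O2 (rate constant kf),
    thermal redissociation Cl2O2 -> 2 ClO (rate kd), and photolysis
    Cl2O2 + hv -> products (rate j). Concentrations in molecules/cm^3;
    kf in cm^3/s, kd and j in 1/s. All values are illustrative."""
    clo, dimer = clo0, 0.0
    for _ in range(steps):
        assoc = kf * clo * clo   # dimer formation rate
        decomp = kd * dimer      # thermal redissociation rate
        photo = j * dimer        # photolytic loss of the dimer
        clo += (-2.0 * assoc + 2.0 * decomp) * dt
        dimer += (assoc - decomp - photo) * dt
    return clo, dimer

clo_final, dimer_final = clo_dimer_kinetics()
```

In the scheme above, ClO plus twice the dimer can only decrease, and only through the photolysis channel, which is the step that releases the active chlorine driving O3 loss in the proposed cycle.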
Standard Transistor Array (STAR). Volume 1, addendum 1: CAPSTAR user's guide
The cell placement techniques developed for use with the standard transistor array were incorporated in the cell arrangement program for STAR (CAPSTAR). Instructions for use of this program are given
Standard Transistor Array (STAR). Volume 1: Placement technique
A large scale integration (LSI) technology, the standard transistor array uses a prefabricated understructure of transistors and a comprehensive library of digital logic cells to allow efficient fabrication of semicustom digital LSI circuits. The cell placement technique for this technology involves formation of a one dimensional cell layout and "folding" of the one dimensional placement onto the chip. It was found that, by use of various folding methods, high quality chip layouts can be achieved. Methods developed to measure the "goodness" of the generated placements include efficient means for estimating channel usage requirements and for via counting. The placement and rating techniques were incorporated into a placement program (CAPSTAR). By means of repetitive use of the folding methods and simple placement improvement strategies, this program provides near optimum placements in a reasonable amount of time. The program was tested on several typical LSI circuits to provide performance comparisons both with respect to input parameters and with respect to the performance of other placement techniques. The results of this testing indicate that near optimum placements can be achieved by use of the procedures without incurring severe time penalties.
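The folding step the abstract describes (form a one-dimensional cell ordering, then fold it onto the chip) can be sketched as follows. A serpentine fold is one plausible reading of "folding" and is assumed here, not taken from the report, and the cell names are invented; its virtue is that cells adjacent in the 1-D ordering stay adjacent on the chip, which keeps connected cells close and channel usage low:

```python
def fold_placement(cells, row_width):
    """Fold a one-dimensional cell ordering onto chip rows in a
    serpentine pattern: row 0 runs left-to-right, row 1 right-to-left,
    and so on, so neighbours in the 1-D order remain adjacent on chip."""
    rows = [cells[i:i + row_width] for i in range(0, len(cells), row_width)]
    for r in range(1, len(rows), 2):
        rows[r] = rows[r][::-1]   # reverse every other row
    return rows

layout = fold_placement(list("ABCDEFGHIJKL"), row_width=4)
# rows: ["A","B","C","D"], ["H","G","F","E"], ["I","J","K","L"]
```

A placement program would then score candidate foldings with the channel-usage and via-count estimates mentioned in the abstract and keep the best.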
Sum of Two Squares - Pair Correlation and Distribution in Short Intervals
In this work we show that based on a conjecture for the pair correlation of
integers representable as sums of two squares, which was first suggested by
Connors and Keating and reformulated here, the second moment of the
distribution of the number of representable integers in short intervals is
consistent with a Poissonian distribution, where "short" means of length
comparable to the mean spacing between sums of two squares. In addition we
present a method for producing such conjectures through calculations in prime
power residue rings and describe how these conjectures, as well as the above
stated result, may be generalized to other binary quadratic forms. While
producing these pair correlation conjectures we arrive at a surprising result
regarding Mertens' formula for primes in arithmetic progressions, and in order
to test the validity of the conjectures, we present numerical computations
which support our approach.
Comment: 3 figures
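The statistic the abstract studies is easy to reproduce numerically on a hedged, small scale: mark the integers representable as a sum of two squares, count them in short windows, and take the second moment of the counts. The range and window length below are arbitrary small choices for illustration, far smaller than what the paper's computations would use:

```python
def representable_up_to(n):
    """Boolean table rep[k]: is k a sum of two squares, 0 <= k <= n."""
    rep = [False] * (n + 1)
    a = 0
    while a * a <= n:
        b = a
        while a * a + b * b <= n:
            rep[a * a + b * b] = True
            b += 1
        a += 1
    return rep

def interval_counts(n, h):
    """Counts of representable integers in the windows [m, m+h),
    for m = 1, ..., n ("short" intervals of length h)."""
    rep = representable_up_to(n + h)
    return [sum(rep[m:m + h]) for m in range(1, n + 1)]

counts = interval_counts(10_000, h=8)
mean = sum(counts) / len(counts)
second_moment = sum(c * c for c in counts) / len(counts)
```

Comparing such empirical second moments against the Poissonian prediction, at window lengths comparable to the mean spacing, is the kind of check the paper's numerical computations carry out at scale.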
Causal Inference When Counterfactuals Depend on the Proportion of All Subjects Exposed
The assumption that no subject's exposure affects another subject's outcome,
known as the no-interference assumption, has long held a foundational position
in the study of causal inference. However, this assumption may be violated in
many settings, and in recent years has been relaxed considerably. Often this
has been achieved with either the aid of a known underlying network, or the
assumption that the population can be partitioned into separate groups, between
which there is no interference, and within which each subject's outcome may be
affected by all the other subjects in the group via the proportion exposed (the
stratified interference assumption). In this paper, we instead consider a
complete interference setting, in which each subject affects every other
subject's outcome. In particular, we make the stratified interference
assumption for a single group consisting of the entire sample. This can occur
when the exposure is a shared resource whose efficacy is modified by the number
of subjects among whom it is shared. We show that a targeted maximum likelihood
estimator for the i.i.d. setting can be used to estimate a class of causal
parameters that includes direct effects and overall effects under certain
interventions. This estimator remains doubly-robust, semiparametric efficient,
and continues to allow for incorporation of machine learning under our model.
We conduct a simulation study, and present results from a data application
where we study the effect of a nurse-based triage system on the outcomes of
patients receiving HIV care in Kenyan health clinics.
Comment: 23 pages main article, 23 pages supplementary materials + references, 4 tables, 1 figure
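The "shared resource" mechanism in the abstract is straightforward to simulate. The data-generating model and plug-in estimator below are invented for illustration (they are not the paper's targeted maximum likelihood estimator); they only show an outcome that depends both on a subject's own exposure and on the proportion of the whole sample exposed, and an overall effect defined by contrasting two interventions on that proportion:

```python
import random

def simulate_outcomes(n, p_treat, seed=0):
    """Each subject's outcome depends on their own exposure A and on the
    proportion of the entire sample exposed (complete interference via a
    shared resource whose per-person benefit dilutes as more share it).
    The coefficients are arbitrary illustrative values."""
    rng = random.Random(seed)
    a = [1 if rng.random() < p_treat else 0 for _ in range(n)]
    prop = sum(a) / n
    y = [0.2 + ai * (0.5 - 0.3 * prop) + rng.gauss(0, 0.05)
         for ai in a]
    return a, y

def mean_outcome_under(p_treat, n=5000, reps=20):
    """Monte Carlo mean outcome under an intervention that sets the
    exposure probability (hence the proportion exposed) to p_treat."""
    total = 0.0
    for r in range(reps):
        _, y = simulate_outcomes(n, p_treat, seed=r)
        total += sum(y) / n
    return total / reps

# Overall effect of exposing 90% of the sample versus 10%:
overall_effect = mean_outcome_under(0.9) - mean_outcome_under(0.1)
```

Because the treatment coefficient shrinks as the proportion exposed grows, the overall effect of such interventions is smaller than the naive per-person effect would suggest, which is exactly the kind of parameter the paper's estimator targets under the stratified interference assumption for a single group.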