Direct and Indirect Searches for Low-Mass Magnetic Monopoles
Recently, there has been renewed interest in the search for low-mass magnetic
monopoles. At the University of Oklahoma we are performing an experiment
(Fermilab E882) using material from the old D0 and CDF detectors to set limits
on the existence of Dirac monopoles of masses of the order of 500 GeV. To set
such limits, estimates must be made of the production rate of such monopoles at
the Tevatron collider, and of the binding strength of any such produced
monopoles to matter. Here we sketch the still primitive theory of such
interactions, and indicate why we believe a credible limit may still be
obtained. On the other hand, there have been proposals that the classic
Euler-Heisenberg Lagrangian together with duality could be employed to set
limits on magnetic monopoles having masses less than 1 TeV, based on virtual,
rather than real processes. The D0 collaboration at Fermilab has used such a
proposal to set mass limits based on the nonobservation of pairs of photons
each with high transverse momentum. We critique the underlying theory by
showing that the cross section violates unitarity at the quoted limits and is
unstable with respect to radiative corrections. We therefore believe that no
significant limit can be obtained from the current experiments, based on
virtual monopole processes.
Comment: 20 pages, 1 ps figure, contributed to Kurt Haller's festschrift
Theoretical and Experimental Status of Magnetic Monopoles
The Tevatron has inspired new interest in the subject of magnetic monopoles.
First there was the 1998 D0 limit on the virtual production of monopoles, based
on the theory of Ginzburg and collaborators. In 2000 the first results from an
experiment (Fermilab E882) searching for real magnetically charged particles
bound to elements from the CDF and D0 detectors were reported. This also
required new developments in theory. The status of the experimental limits on
monopole masses will be discussed, as well as the limitation of the theory of
magnetic charge at present.
Comment: 14 pages, 5 figures, talk given at the 5th Workshop on Quantum Field Theory Under External Conditions
Permutation Methods in Relative Risk Regression Models
In this paper, we develop a weighted permutation (WP) method to construct confidence intervals for regression parameters in relative risk regression models. The WP method is a generalized permutation approach. It constructs a resampled history that mimics the observed history for individuals under study. Inference procedures are based on studentized score statistics that are insensitive to the form of the relative risk function. This makes the WP method appealing in the general framework of the relative risk regression model. First-order accuracy of the WP method is established using the counting process approach with a partial likelihood filtration. A simulation study indicates that the method typically improves accuracy over asymptotic confidence intervals.
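As a rough illustration of the score-based inference described above, the sketch below (Python; the paper itself gives no code) computes a studentized Cox partial-likelihood score for a single binary covariate at beta = 0 and compares it with a reference distribution obtained by permuting covariate labels; a confidence interval would then follow by inverting such tests over a grid of parameter values. This is a generic stand-in under simple exchangeability assumptions, not the paper's weighted-permutation construction of resampled histories, and the data and function names are hypothetical.

```python
import numpy as np

def cox_score_stat(time, status, z):
    """Studentized Cox partial-likelihood score for one covariate at beta = 0
    (a log-rank-type score when the covariate is a group label)."""
    order = np.argsort(time)
    time, status, z = time[order], status[order], z[order]
    num, var = 0.0, 0.0
    for i in range(len(time)):
        if status[i] == 1:             # observed event; risk set = subjects with time >= time[i]
            risk = z[i:]
            num += z[i] - risk.mean()  # score contribution
            var += risk.var()          # information contribution at beta = 0
    return num / np.sqrt(var)

def permutation_pvalue(time, status, z, n_perm=2000, seed=0):
    """Reference distribution for the studentized score under H0: no covariate
    effect, obtained by permuting covariate labels across subjects."""
    rng = np.random.default_rng(seed)
    t_obs = cox_score_stat(time, status, z)
    t_perm = np.array([cox_score_stat(time, status, rng.permutation(z))
                       for _ in range(n_perm)])
    return np.mean(np.abs(t_perm) >= np.abs(t_obs))

# toy data: exponential event times, binary covariate, independent censoring
rng = np.random.default_rng(1)
z = rng.integers(0, 2, size=60).astype(float)
t_event = rng.exponential(1.0 / np.exp(0.8 * z))
t_cens = rng.exponential(2.0, size=60)
time, status = np.minimum(t_event, t_cens), (t_event <= t_cens).astype(int)
print(permutation_pvalue(time, status, z))
```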
Maximum Likelihood Estimation of Ordered Multinomial Parameters
The pool-adjacent-violators algorithm (Ayer et al., 1955) has long been known to give the maximum likelihood estimator of a series of ordered binomial parameters, based on an independent observation from each distribution (see Barlow et al., 1972). This result has immediate application to estimation of a survival distribution based on current survival status at a set of monitoring times. This paper considers an extended problem of maximum likelihood estimation of a series of 'ordered' multinomial parameters p_i = (p_1i, p_2i, ..., p_mi) for 1 <= i <= k, where ordered means that p_j1 <= p_j2 <= ... <= p_jk for each j with 1 <= j <= m-1. The data consist of k independent observations X_1, ..., X_k, where X_i has a multinomial distribution with probability parameter p_i and known index n_i >= 1. By making use of variants of the pool-adjacent-violators algorithm, we obtain a simple algorithm to compute the maximum likelihood estimator of p_1, ..., p_k, and demonstrate its convergence. The results are applied to nonparametric maximum likelihood estimation of the sub-distribution functions associated with a survival time random variable with competing risks when only current status data are available (Jewell et al., 2003).
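For concreteness, here is a minimal sketch of the classical pool-adjacent-violators step that the paper builds on, in the binomial (isotonic-regression) case; the multinomial extension described above applies variants of this same pooling idea. The Python code and the example counts are illustrative, not taken from the paper.

```python
import numpy as np

def pava(y, w):
    """Pool-adjacent-violators algorithm: weighted isotonic (non-decreasing)
    least-squares fit. For y_i = x_i / n_i with weights w_i = n_i this is the
    MLE of ordered binomial parameters (Ayer et al., 1955)."""
    y = np.asarray(y, dtype=float)
    w = np.asarray(w, dtype=float)
    # each block holds (pooled mean, pooled weight, number of original points)
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # pool backwards while the monotonicity constraint is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            blocks.append([(w1 * m1 + w2 * m2) / (w1 + w2), w1 + w2, c1 + c2])
    # expand pooled block means back to the original positions
    return np.concatenate([np.full(c, m) for m, _, c in blocks])

# example: current-status counts x_i out of n_i at k = 5 monitoring times
x = np.array([1, 4, 2, 6, 9])
n = np.array([10, 10, 10, 10, 10])
print(pava(x / n, n))   # non-decreasing MLE of the k binomial parameters
```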
Resampling methods for estimating functions with U-statistic structure
Suppose that inference about parameters of interest is to be based on an unbiased estimating function that is a U-statistic of degree 1 or 2. We define suitable studentized versions of such estimating functions and consider asymptotic approximations as well as an estimating function bootstrap (EFB) method based on resampling the estimated terms in the estimating functions. These methods are justified asymptotically and lead to confidence intervals produced directly from the studentized estimating functions. Particular examples in this class of estimating functions arise in L1 estimation as well as Wilcoxon rank regression and other related estimation problems. The proposed methods are evaluated in examples and simulations and compared with a recent suggestion for inference in such problems which relies on resampling an underlying objective function with U-statistic structure.
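The sketch below illustrates the estimating-function-bootstrap idea in the simplest degree-1 case, L1 (median) estimation: the terms of the estimating function are evaluated at the point estimate, resampled to form a reference distribution for the studentized estimating function, and a confidence interval is obtained by test inversion over a grid. This is a simplified illustration under those assumptions, not the paper's general treatment of degree-2 cases such as Wilcoxon rank regression; the data and names are hypothetical.

```python
import numpy as np

def studentized_ef(x, theta):
    """Studentized degree-1 estimating function for the median:
    g(X; theta) = sign(X - theta), unbiased when theta is the median."""
    g = np.sign(x - theta)
    return g.sum() / np.sqrt((g ** 2).sum())

def efb_confidence_interval(x, level=0.95, n_boot=2000, seed=0):
    """Estimating-function bootstrap: resample the EF terms evaluated at the
    point estimate, use the resampled studentized sums as the reference
    distribution, and invert the test over a grid of theta values."""
    rng = np.random.default_rng(seed)
    theta_hat = np.median(x)                  # root of the estimating function
    g_hat = np.sign(x - theta_hat)
    g_hat = g_hat - g_hat.mean()              # center the estimated terms
    t_star = []
    for _ in range(n_boot):
        gb = rng.choice(g_hat, size=len(x), replace=True)
        t_star.append(gb.sum() / np.sqrt((gb ** 2).sum()))
    lo, hi = np.quantile(t_star, [(1 - level) / 2, (1 + level) / 2])
    grid = np.linspace(x.min(), x.max(), 400)
    keep = [th for th in grid if lo <= studentized_ef(x, th) <= hi]
    return min(keep), max(keep)

x = np.random.default_rng(1).standard_normal(80) + 3.0
print(efb_confidence_interval(x))   # CI for the median of a toy sample
```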
Semiparametric Analysis for Correlated Recurrent and Terminal Events
In clinical and observational studies, recurrent event data (e.g. hospitalization) with a terminal event (e.g. death) are often encountered. In many instances, the terminal event is strongly correlated with the recurrent event process. In this article, we propose a semiparametric method to jointly model the recurrent and terminal event processes. The dependence is modeled by a shared gamma frailty that is included in both the recurrent event rate and the terminal event hazard function. Marginal models are used to estimate the regression effects on the terminal and recurrent event processes, and a Poisson model is used to estimate the dispersion of the frailty variable. A sandwich estimator is used to achieve additional robustness. An analysis of hospitalization data for patients in a peritoneal dialysis study is presented to illustrate the proposed method.
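To make the joint structure concrete, the following sketch simulates data from a model of the kind described: a shared gamma frailty with mean 1 and variance theta multiplies both the recurrent-event rate and the terminal-event hazard, so subjects with a large frailty tend to have both more recurrent events and earlier death. The constant baselines, parameter values, and administrative censoring time are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_subject(z, theta=0.5, r0=2.0, h0=0.3, beta_r=0.7, beta_d=0.5, tau=3.0):
    """Simulate one subject from a joint model in which a shared gamma frailty w
    (mean 1, variance theta) multiplies both the recurrent-event rate and the
    terminal-event hazard; constant baselines are assumed for simplicity."""
    w = rng.gamma(shape=1.0 / theta, scale=theta)       # shared frailty
    death_rate = w * h0 * np.exp(beta_d * z)             # terminal-event hazard
    death_time = rng.exponential(1.0 / death_rate)
    follow_up = min(death_time, tau)                     # death or administrative censoring
    rec_rate = w * r0 * np.exp(beta_r * z)               # recurrent-event intensity
    n_events = rng.poisson(rec_rate * follow_up)         # recurrences stop at the terminal event
    event_times = np.sort(rng.uniform(0.0, follow_up, n_events))
    return {"z": z, "frailty": w, "died": death_time <= tau,
            "follow_up": follow_up, "recurrent_times": event_times}

# larger frailty -> more hospitalizations and earlier death, inducing the correlation
for s in [simulate_subject(z=rng.integers(0, 2)) for _ in range(5)]:
    print(round(s["frailty"], 2), len(s["recurrent_times"]), s["died"])
```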
Warranty Data Analysis: A Review
Warranty claims and supplementary data contain useful information about product quality and reliability. Analysing such data can therefore benefit manufacturers by identifying early warnings of abnormalities in their products, providing useful information about failure modes to aid design modification, estimating product reliability for deciding on warranty policy, and forecasting future warranty claims needed for preparing fiscal plans. In the last two decades, considerable research has been conducted in warranty data analysis (WDA) from several different perspectives. This article attempts to summarise and review the research and developments in WDA, with emphasis on models, methods and applications. It concludes with a brief discussion of current practices and possible future trends in WDA.
Decline and repair, and covariate effects
The failure processes of repairable systems may be affected by operational and environmental stress factors. Such factors can be accommodated by modelling reliability with a multiplicative intensity function: in the proportional intensity model, the failure intensity is the product of a baseline failure intensity function, which quantifies intrinsic factors, and a function of covariates, which quantifies extrinsic factors. The existing literature has extensively studied the failure processes of repairable systems using general repair concepts such as age reduction when no covariate effects are considered. This paper investigates different approaches for modelling the failure and repair processes of repairable systems in the presence of time-dependent covariates. We derive statistical properties of the failure processes for such systems.
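As a concrete example of a multiplicative (proportional) intensity, the sketch below evaluates lambda(t | z) = lambda_0(t) * exp(beta' z(t)) with an assumed power-law (Crow/AMSAA) baseline and simulates failure times under minimal repair by thinning. The baseline form, the covariate process, and all parameter values are illustrative assumptions, not choices made in the paper.

```python
import numpy as np

def intensity(t, z_t, beta, lam=0.5, shape=1.8):
    """Proportional intensity: power-law baseline times a log-linear covariate
    term, lambda(t | z) = lam * shape * t**(shape - 1) * exp(beta' z(t))."""
    baseline = lam * shape * t ** (shape - 1.0)
    return baseline * np.exp(np.dot(beta, z_t(t)))

def simulate_failures(z_t, beta, horizon=5.0, seed=0):
    """Simulate failure times under minimal repair (NHPP) by thinning:
    propose candidates from a dominating constant rate and accept each with
    probability intensity / bound."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(1e-6, horizon, 1000)
    # grid-based bound on the intensity, slightly inflated as a safety margin
    bound = 1.05 * max(intensity(t, z_t, beta) for t in grid)
    t, failures = 0.0, []
    while True:
        t += rng.exponential(1.0 / bound)
        if t > horizon:
            return np.array(failures)
        if rng.uniform() < intensity(t, z_t, beta) / bound:
            failures.append(t)

# time-dependent covariates: a fixed operating-stress indicator and a seasonal load
z_t = lambda t: np.array([1.0, np.sin(2 * np.pi * t / 5.0)])
print(simulate_failures(z_t, beta=np.array([0.4, 0.3])))
```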
A weighted cumulative sum (WCUSUM) to monitor medical outcomes with dependent censoring
