10,587 research outputs found

    The empirical accuracy of uncertain inference models

    Uncertainty is a pervasive feature of the domains in which expert systems are designed to function. Research designed to test uncertain inference methods for accuracy and robustness, in accordance with standard engineering practice, is reviewed. Several studies were conducted to assess how well various methods perform on problems constructed so that correct answers are known, and to find out what underlying features of a problem cause strong or weak performance. For each method studied, situations were identified in which performance deteriorates dramatically. Over a broad range of problems, some well-known methods do only about as well as a simple linear regression model, and often much worse than a simple independence probability model. The results indicate that some commercially available expert system shells should be used with caution, because the uncertain inference models that they implement can yield rather inaccurate results.
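
    The abstract does not include the benchmark protocol itself; the sketch below is a minimal, hypothetical version of the kind of test described, comparing a MYCIN-style certainty-factor combination against the exact posterior of a simple independence probability model on synthetic problems whose correct answers are known. The combination rule, the reading of certainty factors as rescaled probability changes, and the parameter ranges are all assumptions chosen for illustration, not details taken from the paper.

        import random

        def true_posterior(prior, l1, l2):
            """Exact posterior P(H | e1, e2) when the two evidence items are
            conditionally independent given H, with likelihood ratios l1, l2."""
            odds = (prior / (1 - prior)) * l1 * l2
            return odds / (1 + odds)

        def cf_combine(cf1, cf2):
            """MYCIN-style combination rule for two positive certainty factors."""
            return cf1 + cf2 * (1 - cf1)

        def cf_estimate(prior, cf1, cf2):
            """Map a combined certainty factor back to a probability, using the
            simplified reading CF = (P(H|E) - P(H)) / (1 - P(H))."""
            return prior + cf_combine(cf1, cf2) * (1 - prior)

        random.seed(0)
        errors = []
        for _ in range(1000):
            prior = random.uniform(0.05, 0.5)
            l1, l2 = random.uniform(1, 20), random.uniform(1, 20)
            # Certainty factor each evidence item would earn on its own.
            p1 = true_posterior(prior, l1, 1.0)
            p2 = true_posterior(prior, 1.0, l2)
            cf1 = (p1 - prior) / (1 - prior)
            cf2 = (p2 - prior) / (1 - prior)
            exact = true_posterior(prior, l1, l2)   # independence probability model
            approx = cf_estimate(prior, cf1, cf2)   # certainty-factor combination
            errors.append(abs(approx - exact))

        print(f"mean |error| of CF combination vs exact posterior: {sum(errors)/len(errors):.3f}")
        print(f"max  |error|: {max(errors):.3f}")

    On problems like these the combined certainty factor can diverge noticeably from the probabilistically correct answer even when each individual input is calibrated, which illustrates in miniature how performance can deteriorate on constructed test problems.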

    Trispectrum versus Bispectrum in Single-Field Inflation

    In the standard slow-roll inflationary cosmology, quantum fluctuations in a single field, the inflaton, generate approximately Gaussian primordial density perturbations. At present, the bispectrum and trispectrum of the density perturbations have not been observed and the probability distribution for these perturbations is consistent with Gaussianity. However, Planck satellite data will bring a new level of precision to bear on this issue, and it is possible that evidence for non-Gaussian effects in the primordial distribution will be discovered. One possibility is that a trispectrum will be observed without evidence for a non-zero bispectrum. It is not difficult for this to occur in inflationary models where quantum fluctuations in a field other than the inflaton contribute to the density perturbations. A natural question to ask is whether such an observation would rule out the standard scenarios. We explore this issue and find that it is possible to construct single-field models in which inflaton-generated primordial density perturbations have an observable trispectrum, but a bispectrum that is too small to be observed by the Planck satellite. However, an awkward fine tuning seems to be unavoidable. Comment: 15 pages, 3 figures; journal version
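
    For orientation, the standard conventions for the spectra of the curvature perturbation zeta (textbook definitions, not notation taken from this particular paper) are

        \langle \zeta_{\mathbf{k}_1} \zeta_{\mathbf{k}_2} \rangle = (2\pi)^3 \delta^3(\mathbf{k}_1 + \mathbf{k}_2)\, P_\zeta(k_1)
        \langle \zeta_{\mathbf{k}_1} \zeta_{\mathbf{k}_2} \zeta_{\mathbf{k}_3} \rangle = (2\pi)^3 \delta^3\Big(\textstyle\sum_i \mathbf{k}_i\Big)\, B(k_1, k_2, k_3)
        \langle \zeta_{\mathbf{k}_1} \zeta_{\mathbf{k}_2} \zeta_{\mathbf{k}_3} \zeta_{\mathbf{k}_4} \rangle_c = (2\pi)^3 \delta^3\Big(\textstyle\sum_i \mathbf{k}_i\Big)\, T(\mathbf{k}_1, \ldots, \mathbf{k}_4)

    with local-type amplitudes commonly parameterized through zeta = zeta_g + (3/5) f_NL (zeta_g^2 - <zeta_g^2>) + (9/25) g_NL zeta_g^3, so that f_NL controls the bispectrum and g_NL (together with tau_NL) the trispectrum. The scenario described above corresponds to a trispectrum amplitude within Planck's reach while f_NL stays below its detection threshold.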

    The Insignificance of Global Reheating in the Abell 1068 Cluster: X-Ray Analysis

    We report on a Chandra observation of the massive, medium redshift (z=0.1386) cooling flow cluster Abell 1068. We detect a clear temperature gradient in the X-ray emitting gas from kT ~ 5 keV in the outer part of the cluster down to roughly 2 keV in the core, and a striking increase in the metallicity of the gas toward the cluster center. The total spectrum from the cluster can be fit by a cooling flow model with a total mass deposition rate of 150 solar masses/yr. Within the core (r < 30 kpc), the mass deposition rate of 40 solar masses/yr is comparable to estimates for the star formation rate from optical data. We find an apparent correlation between the cD galaxy's optical isophotes and enhanced metallicity isocontours in the central ~100 kpc of the cluster. We show that the approximate doubling of the metallicity associated with the cD can be plausibly explained by supernova explosions associated with the cD's ambient stellar population and the recent starburst. Finally, we calculate the amount of heating due to thermal conduction and show that this process is unlikely to offset cooling in Abell 1068. Comment: Accepted for publication in ApJ, 26 pages, 12 b+w figures, 3 color figures
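
    For context, the standard estimate behind such a conduction calculation (textbook Spitzer expressions, with the coefficient quoted to one significant figure; these are not necessarily the exact values adopted in the paper) balances the conductive heating rate per unit volume against radiative cooling:

        H_{\rm cond} = \nabla \cdot (\kappa_S \nabla T) \approx \frac{1}{r^2}\frac{d}{dr}\Big(r^2 \kappa_S \frac{dT}{dr}\Big), \qquad \kappa_S \approx 5 \times 10^{-7}\, T^{5/2}\ {\rm erg\, s^{-1}\, cm^{-1}\, K^{-1}}
        C_{\rm rad} = n_e n_H \Lambda(T)

    Because kappa_S scales steeply with temperature, conduction weakens in the cool (~2 keV) core relative to the hotter outskirts, in line with the conclusion that it is unlikely to offset cooling here.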

    Production of superconductor/carbon bicomponent fibers

    Certain materials cannot be drawn or spun into fiber form because of unsuitable melting characteristics or brittleness. However, fibrous samples of such materials are often necessary for the fabrication of intricate shapes and composites. In response to this problem, a unique process, referred to as the piggyback process, was developed to prepare fibrous samples of a variety of nonspinnable ceramics. In this technique, specially produced C-shaped carbon fibers serve as micromolds to hold the desired materials prior to sintering. Depending on the sintering atmosphere used, bicomponent or single-component fibers result. While much has been demonstrated worldwide concerning the YBa2Cu3O(7-x) superconductor, fabrication into unique forms has proven quite difficult. However, a variety of intricate shapes are necessary for rapid commercialization of the superconducting materials. The potential for producing fibrous samples of the YBa2Cu3O(7-x) compound by the piggyback process is being investigated. Various organic and acrylic materials were investigated to determine suspending ability, reactivity with the YBa2Cu3O(7-x) compound during long-term storage, and burn-out characteristics. While many questions were answered with respect to the interfacial reactions between YBa2Cu3O(7-x) and carbon, much work is still necessary to improve the quality of the sintered material if the fibers produced are to be incorporated into useful composites or cables. Additional research is necessary to evaluate the quality of the barrier layer during long soaks at the peak temperature; adjust the firing schedule to avoid microcracking and improve densification; and increase the solids loading in the superconductive suspension to decrease porosity.

    Higgs Properties and Fourth Generation Leptons

    It is possible that there are additional vector-like generations where the quarks have mass terms that do not originate from weak symmetry breaking, but the leptons only get mass through weak symmetry breaking. We discuss the impact that the new leptons have on Higgs boson decay branching ratios and on the range of allowed Higgs masses in such a model (with a single new vector-like generation). We find that if the fourth generation leptons are too heavy to be produced in Higgs decay, then the new leptons reduce the branching ratio for h -> gamma gamma to about 30% of its standard-model value. The dependence of this branching ratio on the new charged lepton masses is weak. Furthermore, the expected Higgs production rate at the LHC is very near its standard-model value if the new quarks are much heavier than the weak scale. If the new quarks have masses near the cutoff for the theory, then for cutoffs greater than 10^15 GeV the new lepton masses cannot be much heavier than about 100 GeV and the Higgs mass must have a value around 175 GeV. Comment: 8 pages, 8 figures, published version
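
    A rough check of the quoted suppression, using only the asymptotic limits of the standard h -> gamma gamma loop functions and assuming the vector-like generation adds two heavy charged leptons whose masses come entirely from weak symmetry breaking (so their loop contributions do not decouple); this is an illustrative estimate, not the paper's full calculation:

        \frac{\Gamma(h\to\gamma\gamma)}{\Gamma(h\to\gamma\gamma)_{\rm SM}} \sim \frac{\big|A_1 + N_c Q_t^2 A_{1/2} + 2\,A_{1/2}\big|^2}{\big|A_1 + N_c Q_t^2 A_{1/2}\big|^2} \approx \frac{\big|{-7} + \tfrac{16}{9} + \tfrac{8}{3}\big|^2}{\big|{-7} + \tfrac{16}{9}\big|^2} \approx 0.24 \qquad (A_1 \to -7,\ A_{1/2} \to 4/3)

    i.e. the extra charged leptons partially cancel the dominant W-loop amplitude, cutting the diphoton rate to a few tens of percent of its standard-model value, broadly consistent with the ~30% quoted above (the precise number requires the full mass-dependent loop functions).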

    Ostrogradsky's Hamilton formalism and quantum corrections

    By means of a simple scalar field theory it is demonstrated that the Lagrange formalism and Ostrogradsky's Hamilton formalism in the presence of higher derivatives, in general, do not lead to the same results. While the two approaches are equivalent at the classical level, differences appear due to the quantum corrections. Comment: 10 pages, 1 figure, REVTeX
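
    For reference, Ostrogradsky's construction for a nondegenerate Lagrangian with second derivatives (written here for a single point-particle coordinate rather than the scalar field theory considered in the paper) is

        Q_1 = q, \qquad Q_2 = \dot q, \qquad P_1 = \frac{\partial L}{\partial \dot q} - \frac{d}{dt}\frac{\partial L}{\partial \ddot q}, \qquad P_2 = \frac{\partial L}{\partial \ddot q}
        H = P_1 Q_2 + P_2\, \ddot q(Q_1, Q_2, P_2) - L\big(Q_1, Q_2, \ddot q(Q_1, Q_2, P_2)\big)

    where nondegeneracy (\partial^2 L / \partial \ddot q^2 \neq 0) is what allows \ddot q to be solved for in terms of P_2. The Hamiltonian is linear in P_1 (the origin of the well-known Ostrogradsky instability), and it is at the level of quantum corrections that this construction and the straightforward Lagrangian treatment can begin to differ, which is the comparison the paper carries out.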

    Translational Invariance and the Anisotropy of the Cosmic Microwave Background

    Primordial quantum fluctuations produced by inflation are conventionally assumed to be statistically homogeneous, a consequence of translational invariance. In this paper we quantify the potentially observable effects of a small violation of translational invariance during inflation, as characterized by the presence of a preferred point, line, or plane. We explore the imprint such a violation would leave on the cosmic microwave background anisotropy, and provide explicit formulas for the expected amplitudes of the spherical-harmonic coefficients. Comment: Notation improved
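
    In the standard conventions (not necessarily the paper's own notation), the temperature anisotropy is expanded as

        \frac{\Delta T}{T}(\hat n) = \sum_{\ell m} a_{\ell m}\, Y_{\ell m}(\hat n), \qquad \langle a_{\ell m}\, a^*_{\ell' m'} \rangle = C_\ell\, \delta_{\ell \ell'}\, \delta_{m m'} \quad \text{(statistical homogeneity and isotropy)}

    and a preferred point, line, or plane during inflation generically makes the covariance \langle a_{\ell m}\, a^*_{\ell' m'} \rangle nonzero for \ell \neq \ell' or m \neq m'; the explicit formulas mentioned above predict the expected size of such correlations.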

    The low-temperature energy calibration system for the CUORE bolometer array

    The CUORE experiment will search for neutrinoless double beta decay (0nDBD) of 130Te using an array of 988 TeO_2 bolometers operated at 10 mK in the Laboratori Nazionali del Gran Sasso (Italy). The detector is housed in a large cryogen-free cryostat cooled by pulse tubes and a high-power dilution refrigerator. The TeO_2 bolometers measure the event energies, and a precise and reliable energy calibration is critical for the successful identification of candidate 0nDBD and background events. The detector calibration system under development is based on the insertion of 12 gamma-ray sources that are able to move under their own weight through a set of guide tubes that route them from deployment boxes on the 300 K flange down into position in the detector region inside the cryostat. The CUORE experiment poses stringent requirements on the maximum heat load on the cryostat, material radiopurity, contamination risk, and the ability to fully retract the sources during normal data taking. Together with the integration into a unique cryostat, this requires careful design and unconventional solutions. We present the design, challenges, and expected performance of this low-temperature energy calibration system. Comment: To be published in the proceedings of the 13th International Workshop on Low Temperature Detectors (LTD), Stanford, CA, July 20-24, 2009

    Heavy Quark Fragmentation to Baryons Containing Two Heavy Quarks

    We discuss the fragmentation of a heavy quark to a baryon containing two heavy quarks of mass m_Q ≫ Λ_QCD. In this limit the heavy quarks first combine perturbatively into a compact diquark with a radius small compared to 1/Λ_QCD, which interacts with the light hadronic degrees of freedom exactly as does a heavy antiquark. The subsequent evolution of this QQ diquark to a QQq baryon is identical to the fragmentation of a heavy antiquark to a meson. We apply this analysis to the production of baryons of the form ccq, bbq, and bcq. Comment: 9 pages, 1 figure included, uses harvmac.tex and epsf.tex, UCSD/PTH 93-11, CALT-68-1868, SLAC-PUB-622
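
    The parametric reason the diquark is compact (a standard Coulombic-binding estimate, not a formula quoted in the abstract): the QQ pair binds through perturbative gluon exchange, so its size scales like a Bohr radius,

        r_{QQ} \sim \frac{1}{\alpha_s(m_Q)\, m_Q} \ll \frac{1}{\Lambda_{\rm QCD}} \qquad (m_Q \gg \Lambda_{\rm QCD})

    so the light degrees of freedom cannot resolve the diquark's internal structure and see only a pointlike color-antitriplet source, exactly as they would a heavy antiquark.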