
    A note on the gaps between consecutive zeros of the Riemann zeta-function

    Assuming the Riemann Hypothesis, we show that infinitely often consecutive non-trivial zeros of the Riemann zeta-function differ by at most 0.5155 times the average spacing, and infinitely often they differ by at least 2.69 times the average spacing.
    Comment: 7 pages. Submitted for publication.
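    A sketch of how such results are usually stated, in terms of normalized gaps between the ordinates of the zeros (the normalization below is the standard one for this problem, assumed here rather than quoted from the paper):

    ```latex
    % \gamma_n \le \gamma_{n+1} are consecutive ordinates of non-trivial zeros;
    % the factor \frac{\log \gamma_n}{2\pi} rescales so the average gap is 1.
    \delta_n = (\gamma_{n+1} - \gamma_n)\,\frac{\log \gamma_n}{2\pi},
    \qquad
    \liminf_{n\to\infty} \delta_n \le 0.5155,
    \qquad
    \limsup_{n\to\infty} \delta_n \ge 2.69 .
    ```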

    Spacetime Foam, Holographic Principle, and Black Hole Quantum Computers

    Spacetime foam, also known as quantum foam, has its origin in quantum fluctuations of spacetime. Arguably it is the source of the holographic principle, which severely limits how densely information can be packed in space. Its physics is also intimately linked to that of black holes and computation. In particular, the same underlying physics is shown to govern the computational power of black hole quantum computers.
    Comment: 8 pages, LaTeX; Talk given by Jack Ng, in celebration of Paul Frampton's 60th birthday, at the Coral Gables Conference (in Fort Lauderdale, Florida on December 17, 2003). To appear in the Proceedings of the 2003 Coral Gables Conference.

    Kinetic modeling of Secondary Organic Aerosol formation: effects of particle- and gas-phase reactions of semivolatile products

    The distinguishing mechanism of formation of secondary organic aerosol (SOA) is the partitioning of semivolatile hydrocarbon oxidation products between the gas and aerosol phases. While SOA formation is typically described in terms of partitioning only, the rate of formation and ultimate yield of SOA can also depend on the kinetics of both gas- and aerosol-phase processes. We present a general equilibrium/kinetic model of SOA formation that provides a framework for evaluating the extent to which the controlling mechanisms of SOA formation can be inferred from laboratory chamber data. With this model we examine the effect on SOA formation of gas-phase oxidation of first-generation products to either more or less volatile species, of particle-phase reaction (both first- and second-order kinetics), of the rate of parent hydrocarbon oxidation, and of the extent of reaction of the parent hydrocarbon. The effect of pre-existing organic aerosol mass on SOA yield, an issue of direct relevance to the translation of laboratory data to atmospheric applications, is examined. The importance of direct chemical measurements of gas- and particle-phase species is underscored in identifying SOA formation mechanisms.
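    A minimal sketch of the equilibrium-partitioning picture that this abstract starts from: the classical two-product absorptive-partitioning (Odum-type) yield expression, not the paper's full kinetic model. The parameter values below are purely illustrative assumptions.

    ```python
    # Hedged sketch: equilibrium absorptive partitioning of semivolatile
    # products (two-product Odum model). This is the baseline picture the
    # paper extends with kinetics; alpha_i and K_i values are illustrative.

    def soa_yield(M_o, alphas, Ks):
        """SOA yield Y = M_o * sum_i alpha_i * K_i / (1 + K_i * M_o).

        M_o    : absorbing organic aerosol mass concentration (ug/m^3)
        alphas : mass-based stoichiometric yields of semivolatile products
        Ks     : gas-particle partitioning coefficients (m^3/ug)
        """
        return M_o * sum(a * K / (1.0 + K * M_o) for a, K in zip(alphas, Ks))

    # Pre-existing organic mass raises the yield by shifting partitioning
    # toward the particle phase (the effect examined in the abstract):
    low  = soa_yield(1.0,  [0.2, 0.3], [0.1, 0.004])
    high = soa_yield(50.0, [0.2, 0.3], [0.1, 0.004])
    print(low, high)
    ```

    The monotone increase of yield with absorbing mass M_o is the key qualitative behaviour; the paper's contribution is to ask when kinetics, not just this equilibrium, controls the observed yield.
    
    
    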

    Non-magnetic impurities in two- and three-dimensional Heisenberg antiferromagnets

    In this paper we study, in a large-S expansion, the effects of substituting spins by non-magnetic impurities in two- and three-dimensional Heisenberg antiferromagnets in a weak magnetic field. In particular, we demonstrate a novel mechanism by which magnetic moments are induced around non-magnetic impurities when a magnetic field is present. As a result, Curie-type behaviour in the magnetic susceptibility can be observed well below the Neel temperature, in agreement with what is observed in La_2Cu_{1-x}Zn_xO_4 and Sr(Cu_{1-x}Zn_x)_2O_3 compounds.
    Comment: Latex file

    Evidential-EM Algorithm Applied to Progressively Censored Observations

    The Evidential-EM (E2M) algorithm is an effective approach for computing maximum likelihood estimates under finite mixture models, especially when there is uncertain information about the data. In this paper we present an extension of the E2M method to a particular case of incomplete data, where the loss of information is due both to the mixture structure and to censored observations. The prior uncertain information is expressed by belief functions, while the pseudo-likelihood function is derived from the imprecise observations and the prior knowledge. The E2M method is then invoked to maximize the generalized likelihood function and obtain the optimal parameter estimates. Numerical examples show that the proposed method can effectively integrate the uncertain prior information with the imprecise knowledge conveyed by the observed data.
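    To illustrate the E-step/M-step cycle that E2M generalises, here is a much simpler classical EM for right-censored exponential lifetimes (not the evidential method itself; in E2M the E-step expectation is taken with respect to belief functions rather than an ordinary conditional distribution). The data values are illustrative.

    ```python
    # Hedged sketch: classical EM for right-censored exponential data, shown
    # only to exhibit the E-step/M-step structure that E2M extends.

    def em_exponential(times, censored, n_iter=200):
        """times[i] is a failure time if censored[i] is False, otherwise a
        right-censoring time. Returns the maximum likelihood rate estimate."""
        rate = len(times) / sum(times)  # crude initial guess
        for _ in range(n_iter):
            # E-step: by memorylessness, a unit censored at time c has
            # expected total lifetime c + 1/rate.
            expected = [t if not c else t + 1.0 / rate
                        for t, c in zip(times, censored)]
            # M-step: ordinary exponential MLE on the completed data.
            rate = len(times) / sum(expected)
        return rate

    # Three observed failures, two right-censored units:
    rate = em_exponential([2.0, 3.5, 1.2, 4.0, 5.0],
                          [False, False, False, True, True])
    print(rate)
    ```

    For this model the EM fixed point agrees with the closed-form MLE (number of failures divided by total time on test), which makes the sketch easy to check.
    
    
    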

    Precedence-type Test based on Progressively Censored Samples

    In this paper, we introduce precedence-type tests for testing the hypothesis that two distribution functions are equal, which extend the precedence life-test first proposed by Nelson (1963) to the case where the two samples are progressively Type-II censored. The null distributions of the test statistics are derived. Critical values for some combinations of sample sizes and censoring schemes are presented. We then derive the exact power functions under the Lehmann alternative, and compare the exact power, as well as the simulated power under a location shift, of the proposed precedence test based on nonparametric estimates of the CDF with other precedence-type tests. We further examine the power properties of the proposed test procedures through Monte Carlo simulations. Two examples illustrate all the test procedures discussed here, and we close with some concluding remarks.
    Keywords: Precedence test; Product-limit estimator; Type-II progressive censoring; Life-testing; level of significance; power; Lehmann alternative; Monte Carlo simulations
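    For orientation, the classical (uncensored) precedence statistic that Nelson's test is built on can be sketched as follows; this is not the progressively censored extension developed in the paper, and the data are illustrative.

    ```python
    # Hedged sketch: the classical precedence statistic P_r, the number of
    # X-sample failures that precede the r-th Y-sample failure. Unusually
    # large values suggest X is stochastically smaller than Y.

    def precedence_statistic(x, y, r):
        """Count observations in x strictly below the r-th smallest y."""
        y_r = sorted(y)[r - 1]
        return sum(1 for xi in x if xi < y_r)

    # Illustrative lifetimes: the x sample tends to fail earlier.
    x = [1.1, 1.4, 2.0, 2.3, 3.1]
    y = [2.5, 2.8, 3.6, 4.0, 4.4]
    print(precedence_statistic(x, y, 3))  # x-failures before y's 3rd failure
    ```

    A precedence test rejects equality of the two distributions when this count falls in a critical region; the paper's contribution is deriving such critical values when both samples are progressively Type-II censored.
    
    
    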