
    Bimolecular Recombination Reactions: Low Pressure Rates in Terms of Time-Dependent Survival Probabilities, Total J Phase Space Sampling of Trajectories, and Comparison with RRKM Theory

    We consider the bimolecular formation and redissociation of complexes using classical trajectories and the survival probability distribution function P(E,J,t) of the intermediate complexes at time t as a function of the energy E and total angular momentum quantum number J. The P(E,J,t) and its deviation from single-exponential behavior are a main focus of the present set of studies. Together with weak deactivating collisions, the P(E,J,t) and a cumulative reaction probability at the given E and J can also be used to obtain the recombination rate constant k at low pressures of third bodies. Both classical and quantum expressions are given for k in terms of P(E,J,t). The initial conditions for the classical trajectories are sampled for atom-diatom reactions for various (E,J)'s using action-angle variables. A canonical transformation to a total-J representation reduces the sampling space by permitting analytic integration over several of the variables. A similar remark applies to the calculation of the density of states ρ of the intermediate complex and of the number of states N* of the transition state as a function of E and J. The present approach complements the usual approach based on the rate of the reverse reaction, unimolecular dissociation, and the equilibrium constant, and it provides results not necessarily accessible from the unimolecular studies. The formalism is applied elsewhere to the study of nonstatistical aspects of the recombination and redissociation of the resulting ozone molecules and to comparison with RRKM theory.
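    A quantity that connects P(E,J,t) to the low-pressure rate is the mean complex lifetime, the time integral of the survival probability at fixed (E, J). The sketch below (a biexponential model with invented weights and lifetimes, not values from the paper) estimates that integral numerically and shows how deviation from single-exponential decay enters through the second lifetime.

    ```python
    import math

    def survival(t, w=0.7, tau1=1.0, tau2=5.0):
        # Biexponential model of the survival probability P(E,J,t) at fixed (E, J).
        # w, tau1, tau2 are illustrative values, not taken from the paper.
        return w * math.exp(-t / tau1) + (1.0 - w) * math.exp(-t / tau2)

    def mean_lifetime(P, t_max=100.0, n=200_000):
        # Trapezoidal estimate of the mean lifetime <tau> = integral_0^inf P dt,
        # the quantity that enters the low-pressure recombination rate.
        dt = t_max / n
        total = 0.5 * (P(0.0) + P(t_max))
        for i in range(1, n):
            total += P(i * dt)
        return total * dt

    # Analytic value for this model: w*tau1 + (1-w)*tau2 = 0.7*1 + 0.3*5 = 2.2
    tau_mean = mean_lifetime(survival)
    print(round(tau_mean, 3))
    ```

    A pure single exponential with the short lifetime alone would give 1.0; the slow component more than doubles the mean lifetime, which is why the shape of P(E,J,t), not just its initial decay, matters for k.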

    Scheduling Storms and Streams in the Cloud

    Motivated by emerging big streaming data processing paradigms (e.g., Twitter Storm, Streaming MapReduce), we investigate the problem of scheduling graphs over a large cluster of servers. Each graph is a job, where nodes represent compute tasks and edges indicate data-flows between these compute tasks. Jobs (graphs) arrive randomly over time and, upon completion, leave the system. When a job arrives, the scheduler needs to partition the graph and distribute it over the servers to satisfy load balancing and cost considerations. Specifically, neighboring compute tasks in the graph that are mapped to different servers incur load on the network; thus a mapping of the jobs among the servers incurs a cost that is proportional to the number of "broken edges". We propose a low-complexity randomized scheduling algorithm that, without service preemptions, stabilizes the system with graph arrivals/departures; more importantly, it allows a smooth trade-off between minimizing average partitioning cost and average queue lengths. Interestingly, to avoid service preemptions, our approach does not rely on a Gibbs sampler; instead, we show that the corresponding limiting invariant measure has an interpretation stemming from a loss system.
    Comment: 14 pages
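    The "broken edges" cost is simple to state concretely: given a mapping of a job's tasks to servers, count the edges whose endpoints land on different servers. A minimal sketch (the toy graph and placement below are invented for illustration):

    ```python
    def broken_edges(edges, placement):
        # Partitioning cost from the abstract: number of edges of the job graph
        # whose endpoint tasks are mapped to different servers.
        return sum(1 for u, v in edges if placement[u] != placement[v])

    # Toy job graph: four tasks in a chain, split across two servers.
    edges = [("a", "b"), ("b", "c"), ("c", "d")]
    placement = {"a": 0, "b": 0, "c": 1, "d": 1}
    print(broken_edges(edges, placement))  # only the (b, c) edge crosses servers
    ```

    The scheduler's trade-off is then between placements with few broken edges (low network load) and placements that balance queue lengths across servers.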

    Uncertainty analysis of a test bed for calibrating voltage transformers vs. temperature

    The paper addresses the evaluation of the uncertainty sources of a test bed system for calibrating voltage transformers vs. temperature. In particular, the Monte Carlo method has been applied in order to evaluate the effects of the uncertainty sources under two different conditions: using the nominal accuracy specifications of the elements that compose the setup, or exploiting the results of their metrological characterization. In addition, the influence of random effects on the system accuracy has been quantified and evaluated. From the results, it emerges that the choice of the uncertainty evaluation method affects the overall study: the use of a metrological characterization or of the manufacturers' accuracy specifications yields an accuracy of 0.1 or 0.5, respectively, for the overall measurement setup.
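    As a generic illustration of the Monte Carlo method applied here (the error model and uncertainty figures below are invented, not the paper's), one draws each influence quantity from its assumed distribution, propagates it through the measurement model, and takes the spread of the result as the combined uncertainty:

    ```python
    import random
    import statistics

    random.seed(0)

    def ratio_error_sample():
        # Hypothetical measurement model: the ratio error is the sum of a
        # voltage-divider contribution and a digitizer-gain contribution,
        # each drawn from a normal distribution with an assumed standard
        # uncertainty (values are illustrative only).
        divider = random.gauss(0.0, 0.05)
        digitizer = random.gauss(0.0, 0.02)
        return divider + digitizer

    samples = [ratio_error_sample() for _ in range(100_000)]
    u = statistics.stdev(samples)
    # For independent normal inputs this should approach
    # sqrt(0.05**2 + 0.02**2) ≈ 0.054.
    print(round(u, 3))
    ```

    Swapping the assumed standard uncertainties between "manufacturer specification" and "metrological characterization" values is exactly the comparison the paper performs.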

    Velocity Curve Analysis of the Spectroscopic Binary Stars V373 Cas, V2388 Oph, V401 Cyg, GM Dra, V523 Cas, AB And, and HD 141929 by Artificial Neural Networks

    We used an Artificial Neural Network (ANN) to derive the orbital parameters of spectroscopic binary stars. Using measured radial velocity data of seven double-lined spectroscopic binary systems, V373 Cas, V2388 Oph, V401 Cyg, GM Dra, V523 Cas, AB And, and HD 141929, we found the corresponding orbital and spectroscopic elements. Our numerical results are in good agreement with those obtained by others using more traditional methods.
    Comment: 13 pages, 8 figures, 14 tables
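    For context, the radial velocity curve being fitted has a standard closed form, V(t) = γ + K [cos(ν + ω) + e cos ω], with the true anomaly ν obtained from Kepler's equation. The sketch below evaluates it for an illustrative circular orbit (all parameter values are invented, not taken from the paper):

    ```python
    import math

    def radial_velocity(t, gamma, K, e, omega, T0, P):
        # Standard spectroscopic-binary RV curve:
        # V(t) = gamma + K * (cos(nu + omega) + e * cos(omega)).
        M = 2.0 * math.pi * (t - T0) / P           # mean anomaly
        E = M                                      # solve E - e*sin(E) = M by Newton
        for _ in range(50):
            E -= (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        nu = 2.0 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                              math.sqrt(1 - e) * math.cos(E / 2))
        return gamma + K * (math.cos(nu + omega) + e * math.cos(omega))

    # Circular orbit at periastron passage (t = T0, e = 0, omega = 0):
    # nu = 0, so V = gamma + K = -10 + 80 = 70 km/s.
    v = radial_velocity(t=0.0, gamma=-10.0, K=80.0, e=0.0, omega=0.0, T0=0.0, P=1.2)
    print(v)
    ```

    Fitting an ANN, as the paper does, replaces the nonlinear least-squares search for (γ, K, e, ω, T0, P) with a learned mapping from the sampled velocity curve to the elements.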

    Approximate biflatness and Johnson pseudo-contractibility of some Banach algebras

    Get PDF
    We study the structure of Lipschitz algebras under the notions of approximate biflatness and Johnson pseudo-contractibility. We show that for a compact metric space X, the Lipschitz algebras Lip_α(X) and lip_α(X) are approximately biflat if and only if X is finite, provided that 0 < α < 1. We give a necessary and sufficient condition for a vector-valued Lipschitz algebra to be Johnson pseudo-contractible. We also show that some triangular Banach algebras are not approximately biflat.
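    For reference, the algebras in question are the standard Lipschitz function spaces; their usual definitions (standard background, not specific to this paper) are:

    ```latex
    % For a compact metric space (X, d) and 0 < \alpha \le 1, let
    p_\alpha(f) = \sup_{x \neq y} \frac{|f(x) - f(y)|}{d(x, y)^{\alpha}}.
    % Then
    \mathrm{Lip}_\alpha(X) = \{\, f \colon X \to \mathbb{C} : p_\alpha(f) < \infty \,\},
    \qquad \|f\| = \|f\|_\infty + p_\alpha(f),
    % and \mathrm{lip}_\alpha(X) is the closed subalgebra of those f with
    \frac{|f(x) - f(y)|}{d(x, y)^{\alpha}} \to 0 \quad \text{as } d(x, y) \to 0.
    ```

    Both are commutative Banach algebras under pointwise multiplication, which is what makes biflatness-type questions meaningful for them.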