    Bubble formation at two adjacent submerged orifices in inviscid fluids

    A theoretical model has been developed, as an extension of single-orifice bubble formation, to investigate the growth and detachment of vapor/gas bubbles formed at two adjacent submerged orifices in inviscid fluids. The mathematical model treats the two bubbles as expanding control volumes moving along the line of centers above a wall. The motion of the bubbles is obtained by applying a force balance to each bubble, accounting for surface tension, buoyancy, steam momentum and liquid inertia effects. The liquid inertia effects are determined under inviscid, irrotational flow assumptions, so that potential flow theory can be used to calculate the liquid velocity field, from which the pressure distribution follows. The model is extended with mass and energy equations to describe steam bubble formation in sub-cooled water. The theoretical results are compared with available experimental data for constant-mass-flow steam bubble formation at two submerged, upward-facing orifices in sub-cooled water: growth and detachment at two adjacent 1 mm orifices at system pressures of 2 and 3 bar, flow rates of 1.2-4 g/min and sub-cooling of 3.5-35 °C. The comparisons indicate that the model successfully predicts bubble growth and detachment over the range of conditions studied.
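
    The following sketch illustrates the flavor of such a force balance for a single orifice only; the two-bubble coupling through the potential-flow pressure field is the paper's actual contribution and is omitted here. The property values, the 11/16 virtual-mass coefficient standing in for liquid inertia, and the "centre has risen one radius" detachment criterion are all illustrative assumptions, not the authors' model.

```python
import numpy as np

# Hedged single-orifice sketch of a constant-flow force-balance
# integration; the published two-orifice, potential-flow coupling is
# omitted. All values below are illustrative assumptions.

rho_l, rho_g = 958.0, 1.13   # liquid / steam density [kg/m^3] (assumed, ~2 bar)
sigma = 0.059                # surface tension [N/m] (assumed)
g, r_o = 9.81, 0.5e-3        # gravity; orifice radius for a 1 mm orifice
mdot = 2.0e-3 / 60.0         # 2 g/min, inside the reported 1.2-4 g/min range
Q = mdot / rho_g             # volumetric growth rate [m^3/s]

k_vm = rho_g + 11.0 / 16.0 * rho_l   # virtual (added) mass per unit volume
t, dt = 0.0, 1e-5
V, s, v = 1e-12, 0.0, 0.0            # volume, centre displacement, velocity
R = (3.0 * V / (4.0 * np.pi)) ** (1.0 / 3.0)
while s < R:                          # detach once the centre rises one radius
    V_new = V + Q * dt                # constant-flow growth
    R = (3.0 * V_new / (4.0 * np.pi)) ** (1.0 / 3.0)
    F_buoy = (rho_l - rho_g) * g * V_new          # buoyancy, upward
    F_mom = rho_g * Q**2 / (np.pi * r_o**2)       # steam momentum flux, upward
    F_sigma = 2.0 * np.pi * r_o * sigma           # surface tension at the rim
    # momentum balance with a growing virtual mass: d(m v)/dt = sum of forces
    v = (k_vm * V * v + (F_buoy + F_mom - F_sigma) * dt) / (k_vm * V_new)
    s += v * dt
    V, t = V_new, t + dt

print(f"detachment: t = {t*1e3:.1f} ms, R = {R*1e3:.2f} mm")
```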

    Evidential-EM Algorithm Applied to Progressively Censored Observations

    The Evidential-EM (E2M) algorithm is an effective approach for computing maximum-likelihood estimates under finite mixture models, especially when there is uncertainty about the data. In this paper we present an extension of the E2M method to a particular case of incomplete data, where the loss of information is due both to the mixture model and to censored observations. The prior uncertain information is expressed by belief functions, while the pseudo-likelihood function is derived from the imprecise observations and the prior knowledge. The E2M method is then invoked to maximize the generalized likelihood function and obtain the optimal parameter estimates. Numerical examples show that the proposed method can effectively integrate uncertain prior information with the imprecise knowledge conveyed by the observed data.
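
    As orientation, the sketch below shows the classical censored-data EM machinery that E2M generalizes: a two-component exponential mixture fitted to right-censored observations, where responsibilities use densities for exact points and survival functions for censored ones. The evidential layer (belief functions over partially known labels) is not reproduced, and all data and starting values are made up.

```python
import numpy as np

# Classical EM for a two-component exponential mixture with right-censored
# data; a hedged stand-in for the machinery E2M builds on, not E2M itself.

rng = np.random.default_rng(0)
n = 400
z = rng.random(n) < 0.6                      # latent component labels
x = np.where(z, rng.exponential(1.0, n), rng.exponential(5.0, n))
c = rng.exponential(4.0, n)                  # random censoring times
obs = np.minimum(x, c)                       # observed value
cens = x > c                                 # True where right-censored

pi, lam = np.array([0.5, 0.5]), np.array([0.5, 0.2])  # initial guesses
for _ in range(200):
    # E-step: density for exact points, survival function exp(-lam*t)
    # for censored ones.
    dens = np.where(cens[:, None],
                    np.exp(-lam * obs[:, None]),
                    lam * np.exp(-lam * obs[:, None]))
    r = pi * dens
    r /= r.sum(axis=1, keepdims=True)        # responsibilities
    # E-step: expected lifetime; memorylessness gives
    # E[x | x > t, component k] = t + 1/lam_k for censored points.
    ex = obs[:, None] + cens[:, None] / lam
    # M-step: closed-form updates.
    pi = r.mean(axis=0)
    lam = r.sum(axis=0) / (r * ex).sum(axis=0)

print("weights:", np.round(pi, 3), "rates:", np.round(lam, 3))
```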

    Application of Monte Carlo Algorithms to the Bayesian Analysis of the Cosmic Microwave Background

    Power spectrum estimation and evaluation of the associated errors in the presence of incomplete sky coverage; non-homogeneous, correlated instrumental noise; and foreground emission is a problem of central importance for the extraction of cosmological information from the cosmic microwave background. We develop a Monte Carlo approach for the maximum likelihood estimation of the power spectrum. The method is based on an identity for the Bayesian posterior as a marginalization over unknowns. Maximization of the posterior involves the computation of expectation values as a sample average from maps of the cosmic microwave background and foregrounds, given some current estimate of the power spectrum or cosmological model and some assumed statistical characterization of the foregrounds. Maps of the CMB are sampled by a linear transform of a Gaussian white noise process, implemented numerically with conjugate gradient descent. For time series data with N_{t} samples and N pixels on the sphere, the method has a computational expense $KO[N^{2} + N_{t}\log N_{t}]$, where K is a prefactor determined by the convergence rate of conjugate gradient descent. Preconditioners for conjugate gradient descent are given for scans close to great circle paths, and the method allows partial sky coverage for these cases by numerically marginalizing over the unobserved, or removed, region. Comment: submitted to Ap
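
    A one-dimensional toy version of the map-sampling step might look as follows: a stationary Gaussian signal with a Fourier-diagonal spectrum is observed through inhomogeneous white noise, and a posterior sample is obtained by solving (S^-1 + N^-1) x = N^-1 d + S^-1/2 w1 + N^-1/2 w2 with conjugate gradients. The real method works on the sphere with time-ordered data and tailored preconditioners; the grid size, spectrum and noise model below are assumptions for illustration.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

# 1-D toy of drawing a Gaussian signal map from its posterior as a
# linear transform of white noise, solved with conjugate gradients.

rng = np.random.default_rng(1)
npix = 256
k = np.fft.rfftfreq(npix) + 1.0 / npix       # avoid division by zero at k=0
P = k ** -2.0                                # assumed signal power spectrum
nvar = 0.5 + 0.5 * rng.random(npix)          # inhomogeneous noise variance

def apply_Sinv(x):
    # S^-1 applied via FFT: S is diagonal in Fourier space.
    return np.fft.irfft(np.fft.rfft(x) / P, n=npix)

# Simulate a signal realization and noisy data d = s + n.
signal = np.fft.irfft(np.fft.rfft(rng.standard_normal(npix)) * np.sqrt(P),
                      n=npix)
data = signal + rng.standard_normal(npix) * np.sqrt(nvar)

A = LinearOperator((npix, npix), matvec=lambda x: apply_Sinv(x) + x / nvar,
                   dtype=float)
w1, w2 = rng.standard_normal(npix), rng.standard_normal(npix)
b = (data / nvar
     + np.fft.irfft(np.fft.rfft(w1) / np.sqrt(P), n=npix)   # S^-1/2 w1
     + w2 / np.sqrt(nvar))                                  # N^-1/2 w2
sample, info = cg(A, b)                      # posterior sample of the map
print("CG converged" if info == 0 else f"CG info = {info}")
```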

    Statistical significance of communities in networks

    Nodes in real-world networks are usually organized in local modules. These groups, called communities, are intuitively defined as sub-graphs with a larger density of internal connections than of external links. In this work, we introduce a new measure aimed at quantifying the statistical significance of single communities. Extreme and order statistics are used to predict the statistics associated with individual clusters in random graphs. These distributions allow us to define the significance of a community as the probability that a generic clustering algorithm would find such a group in a random graph. The method is successfully applied to real-world networks to evaluate the significance of their communities. Comment: 9 pages, 8 figures, 2 tables. The software to calculate the C-score can be found at http://filrad.homelinux.org/cscor
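
    The sketch below is a crude stand-in for this idea, not the authors' C-score: it scores a fixed community by the tail probability of its internal edge count under an Erdős–Rényi null model matched to the observed edge density, ignoring the extreme/order-statistics correction for the clustering algorithm's search over many candidate subgraphs. The example numbers are hypothetical.

```python
from scipy.stats import binom

# Crude community-significance stand-in: how unlikely is this many
# internal edges inside a same-sized subset of a G(n, p) random graph?

def community_pvalue(n_nodes, n_edges, comm_size, internal_edges):
    """Tail probability of >= internal_edges among the internal pairs of
    a comm_size subset, under a density-matched Erdos-Renyi null."""
    p = n_edges / (n_nodes * (n_nodes - 1) / 2)   # observed edge density
    pairs = comm_size * (comm_size - 1) // 2      # possible internal edges
    # P(X >= internal_edges) with X ~ Binomial(pairs, p)
    return binom.sf(internal_edges - 1, pairs, p)

# Hypothetical example: a 20-node group with 60 internal edges found in a
# 1000-node, 5000-edge network.
print(f"p-value ~ {community_pvalue(1000, 5000, 20, 60):.2e}")
```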

    Principal Component Analysis with Noisy and/or Missing Data

    We present a method for performing Principal Component Analysis (PCA) on noisy datasets with missing values. Estimates of the measurement error are used to weight the input data such that, compared to classic PCA, the resulting eigenvectors are more sensitive to the true underlying signal variations rather than being pulled by heteroskedastic measurement noise. Missing data are simply the limiting case of weight = 0. The underlying algorithm is a noise-weighted Expectation Maximization (EM) PCA, which has the additional benefits of implementation speed and flexibility for smoothing eigenvectors to reduce the noise contribution. We present applications of this method to simulated data and QSO spectra from the Sloan Digital Sky Survey. Comment: Accepted for publication in PASP; v2 with minor updates, mostly to bibliography
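
    A minimal one-eigenvector version of such a noise-weighted EM PCA might look like the sketch below, where per-measurement inverse-variance weights enter both the coefficient fit and the eigenvector update, and weight = 0 marks missing entries. The data are synthetic, and the updates follow the generic weighted EM PCA recipe rather than the paper's exact implementation.

```python
import numpy as np

# One-eigenvector noise-weighted EM PCA on synthetic heteroskedastic data.

rng = np.random.default_rng(2)
nobs, nvar = 200, 50
true = np.sin(np.linspace(0, np.pi, nvar))           # underlying eigenvector
amps = rng.standard_normal(nobs)                     # per-observation amplitudes
noise_sigma = 0.1 + 0.9 * rng.random((nobs, nvar))   # heteroskedastic noise
X = np.outer(amps, true) + rng.standard_normal((nobs, nvar)) * noise_sigma
w = 1.0 / noise_sigma**2                             # inverse-variance weights
w[rng.random((nobs, nvar)) < 0.1] = 0.0              # 10% missing: weight = 0

phi = rng.standard_normal(nvar)
phi /= np.linalg.norm(phi)
for _ in range(50):
    # E-step: per-observation coefficient, a weighted least-squares fit.
    c = (w * X) @ phi / np.maximum((w * phi**2).sum(axis=1), 1e-12)
    # M-step: per-variable eigenvector update, also weighted.
    phi = (w * X * c[:, None]).sum(axis=0) / np.maximum(
        (w * c[:, None]**2).sum(axis=0), 1e-12)
    phi /= np.linalg.norm(phi)

print("alignment with truth:", abs(phi @ true) / np.linalg.norm(true))
```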

    C1 inhibitor deficiency: 2014 United Kingdom consensus document

    C1 inhibitor deficiency is a rare disorder manifesting as recurrent attacks of disabling and potentially life-threatening angioedema. Here we present an updated 2014 United Kingdom consensus document for the management of C1 inhibitor-deficient patients, a joint venture between the United Kingdom Primary Immunodeficiency Network and Hereditary Angioedema UK. To develop the consensus, we assembled a multi-disciplinary steering group of clinicians, nurses and a patient representative. The steering group first met in 2012 and developed a total of 48 recommendations across 11 themes. The statements were distributed to relevant clinicians and a representative group of patients to be scored for agreement on a Likert scale. All 48 statements achieved a high degree of consensus, indicating strong alignment of opinion. The recommendations have evolved significantly since the 2005 document, with particularly notable developments including an improved evidence base to guide dosing and indications for acute treatment, greater emphasis on home therapy for acute attacks, and a strong focus on service organisation.

    Biased tomography schemes: an objective approach

    We report on an intrinsic relationship between maximum-likelihood quantum-state estimation and the representation of the signal. A quantum analogy of the transfer function determines the space in which the reconstruction should be done, without the need for any ad hoc truncation of the Hilbert space. The method is illustrated on a simple yet practically important tomography of an optical signal registered by realistic binary detectors. Comment: 4 pages, 3 figures, accepted in PR
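
    For context, a generic iterative maximum-likelihood reconstruction (the R-rho-R fixed-point scheme) for a qubit measured in the three Pauli bases is sketched below. It is a stand-in illustration of ML state estimation, not the paper's binary-detector scheme or its transfer-function construction; the true state and the detector counts are simulated.

```python
import numpy as np

# Iterative maximum-likelihood qubit tomography via the R-rho-R scheme.

I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.diag([1.0, -1.0]).astype(complex)
# Three two-outcome Pauli measurements merged into one six-outcome POVM.
povm = [(I2 + sgn * s) / 6 for s in (sx, sy, sz) for sgn in (+1, -1)]

rng = np.random.default_rng(3)
true = 0.5 * (I2 + 0.6 * sx + 0.3 * sz)       # assumed true state
p_true = np.array([np.trace(E @ true).real for E in povm])
counts = rng.multinomial(10000, p_true)       # simulated detector clicks
f = counts / counts.sum()

rho = I2 / 2                                  # start from the maximally mixed state
for _ in range(300):
    p = np.array([np.trace(E @ rho).real for E in povm])
    R = sum(fk / pk * E for fk, pk, E in zip(f, p, povm))
    rho = R @ rho @ R                         # fixed-point update
    rho /= np.trace(rho).real                 # renormalise

print("Bloch vector:", [round(np.trace(s @ rho).real, 3) for s in (sx, sy, sz)])
```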

    Statistical Mechanics of Learning in the Presence of Outliers

    Using methods of statistical mechanics, we analyse the effect of outliers on the supervised learning of a classification problem. The learning strategy aims at selecting informative examples and discarding outliers. We compare two algorithms which perform the selection either in a soft or a hard way. When the fraction of outliers grows large, the estimation errors undergo a first-order phase transition. Comment: 24 pages, 7 figures (minor extensions added)
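
    A toy version of the soft-versus-hard selection idea, not the paper's teacher-student calculation, is sketched below: a linear classifier is fitted to data containing mislabelled outliers, either smoothly down-weighting low-margin examples (soft) or discarding the lowest-margin fraction outright (hard). All model and data choices are assumptions.

```python
import numpy as np

# Soft vs hard example selection on data with label-noise "outliers".

rng = np.random.default_rng(4)
n, d, out_frac = 500, 20, 0.2
teacher = rng.standard_normal(d)
X = rng.standard_normal((n, d))
y = np.sign(X @ teacher)
flip = rng.random(n) < out_frac
y[flip] *= -1                                 # outliers = flipped labels

def fit(X, y, soft, epochs=200, lr=0.1, beta=2.0, keep=0.8):
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        margin = y * (X @ w)
        if soft:
            # smooth weight: low-margin (likely outlier) points fade out
            sel = 1.0 / (1.0 + np.exp(-beta * margin))
        else:
            # hard selection: drop the lowest-margin fraction entirely
            sel = (margin >= np.quantile(margin, 1 - keep)).astype(float)
        w += lr * (sel * y) @ X / len(y)      # weighted Hebbian-style update
        w /= max(np.linalg.norm(w), 1e-12)
    return w

for soft in (True, False):
    w = fit(X, y, soft)
    overlap = w @ teacher / np.linalg.norm(teacher)
    print(("soft" if soft else "hard"), "teacher overlap:", round(overlap, 3))
```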

    Replicators in Fine-grained Environment: Adaptation and Polymorphism

    Selection in a time-periodic environment is modeled via the two-player replicator dynamics. For sufficiently fast environmental changes, this reduces to a multi-player replicator dynamics in a constant environment. The two-player terms correspond to the time-averaged payoffs, while the three- and four-player terms arise from the adaptation of the morphs to their varying environment. Such multi-player (adaptive) terms can induce a stable polymorphism. The establishment of the polymorphism in partnership games [genetic selection] is accompanied by a decreasing mean fitness of the population. Comment: 4 pages, 2 figures
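
    A toy check of the averaging step is sketched below: a two-morph replicator equation with a rapidly oscillating 2x2 payoff matrix is integrated exactly and compared with the dynamics under the time-averaged matrix. The multi-player corrections responsible for fluctuation-induced polymorphism enter at the next order in 1/omega and are not derived here; the payoff values are made up.

```python
import numpy as np

# Two-morph replicator dynamics in a fast time-periodic environment,
# compared against the time-averaged game.

A0 = np.array([[0.0, 2.0], [1.0, 0.0]])    # averaged game: stable mix at x = 2/3
A1 = np.array([[1.0, -1.0], [-1.0, 1.0]])  # zero-mean environmental oscillation
omega = 100.0                              # fast oscillation frequency

def rhs(x, A):
    # Replicator right-hand side for morph-1 frequency x.
    p = np.array([x, 1.0 - x])
    f = A @ p                              # fitness of each morph
    return x * (1.0 - x) * (f[0] - f[1])

dt, T = 1e-4, 40.0
x_fast, x_avg = 0.1, 0.1
for i in range(int(T / dt)):               # simple Euler integration
    t = i * dt
    x_fast += dt * rhs(x_fast, A0 + np.sin(omega * t) * A1)
    x_avg += dt * rhs(x_avg, A0)

print(f"periodic environment: x = {x_fast:.3f}; averaged game: x = {x_avg:.3f}")
```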