
    Evidential-EM Algorithm Applied to Progressively Censored Observations

    The Evidential-EM (E2M) algorithm is an effective approach for computing maximum likelihood estimates under finite mixture models, especially when there is uncertain information about the data. In this paper we present an extension of the E2M method to a particular case of incomplete data, where the loss of information is due to both the mixture model and censored observations. The prior uncertain information is expressed by belief functions, while the pseudo-likelihood function is derived from the imprecise observations and the prior knowledge. The E2M method is then invoked to maximize the generalized likelihood function and obtain the optimal parameter estimates. Numerical examples show that the proposed method can effectively integrate the uncertain prior information with the imprecise knowledge conveyed by the observed data.
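    The paper's generalized likelihood combines belief functions with censoring; as a rough point of comparison, the sketch below shows only the classical ingredient, an EM update for a two-component Gaussian mixture in which some observations are right-censored. It is a minimal illustration under that simplification, not the authors' E2M algorithm, and the function and variable names are made up for the example.

```python
# Minimal sketch: EM for a 2-component Gaussian mixture with right-censored data.
# NOT the paper's E2M; belief-function priors are omitted entirely.
import numpy as np
from scipy.stats import norm

def em_censored_mixture(x, censored, n_iter=200, seed=0):
    """x: observed values or censoring thresholds; censored: bool mask (True = right-censored)."""
    rng = np.random.default_rng(seed)
    K = 2
    pi = np.full(K, 1.0 / K)
    mu = rng.choice(x, K)
    sigma = np.full(K, x.std() + 1e-6)

    for _ in range(n_iter):
        # E-step: responsibilities; censored points contribute via the survival function
        dens = np.where(censored[:, None],
                        norm.sf(x[:, None], mu, sigma),
                        norm.pdf(x[:, None], mu, sigma))
        r = np.clip(pi * dens, 1e-300, None)
        r /= r.sum(axis=1, keepdims=True)

        # Conditional first and second moments of each component given the data
        a = (x[:, None] - mu) / sigma                              # standardized censoring point
        lam = norm.pdf(a) / np.clip(norm.sf(a), 1e-12, None)       # inverse Mills ratio
        ex = np.where(censored[:, None], mu + sigma * lam, x[:, None])
        ex2 = np.where(censored[:, None],
                       mu**2 + sigma**2 + sigma * (x[:, None] + mu) * lam,
                       x[:, None]**2)

        # M-step: weighted moment matching
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * ex).sum(axis=0) / nk
        sigma = np.sqrt(np.clip((r * (ex2 - 2 * mu * ex + mu**2)).sum(axis=0) / nk, 1e-12, None))
    return pi, mu, sigma
```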

    X-ray Lighthouses of the High-Redshift Universe. II. Further Snapshot Observations of the Most Luminous z>4 Quasars with Chandra

    We report on Chandra observations of a sample of 11 optically luminous (M_B < -28.5) quasars at z=3.96-4.55 selected from the Palomar Digital Sky Survey and the Automatic Plate Measuring Facility Survey. These are among the most luminous z>4 quasars known and hence represent ideal witnesses of the end of the "dark age". Nine quasars are detected by Chandra, with ~2-57 counts in the observed 0.5-8 keV band. These detections increase the number of X-ray detected AGN at z>4 to ~90; overall, Chandra has detected ~85% of the high-redshift quasars observed with snapshot (few kilosecond) observations. PSS 1506+5220, one of the two X-ray undetected quasars, displays a number of notable features in its rest-frame ultraviolet spectrum, the most prominent being broad, deep SiIV and CIV absorption lines. The average optical-to-X-ray spectral index for the present sample (-1.88+/-0.05) is steeper than that typically found for z>4 quasars but consistent with the expected value from the known dependence of this spectral index on quasar luminosity. We present joint X-ray spectral fitting for a sample of 48 radio-quiet quasars in the redshift range 3.99-6.28 for which Chandra observations are available. The X-ray spectrum (~870 counts) is well parameterized by a power law with Gamma=1.93+0.10/-0.09 in the rest-frame ~2-40 keV band, and a tight upper limit of N_H~5x10^21 cm^-2 is obtained on any average intrinsic X-ray absorption. There is no indication of any significant evolution in the X-ray properties of quasars between redshifts zero and six, suggesting that the physical processes of accretion onto massive black holes have not changed over the bulk of cosmic time. Comment: 15 pages, 7 figures, accepted for publication in A
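    The abstract quotes an average optical-to-X-ray spectral index of -1.88+/-0.05; for readers unfamiliar with the quantity, the snippet below evaluates its standard definition from monochromatic flux densities at rest-frame 2 keV and 2500 Angstroms. The function name and any flux values passed to it are illustrative placeholders, not measurements from the paper.

```python
# Standard definition of the optical-to-X-ray spectral index alpha_ox.
import numpy as np

def alpha_ox(f_nu_2kev, f_nu_2500A):
    """alpha_ox = log10(f_2keV / f_2500A) / log10(nu_2keV / nu_2500A),
    with monochromatic flux densities f_nu in erg s^-1 cm^-2 Hz^-1."""
    nu_2kev = 2e3 * 1.602e-12 / 6.626e-27   # 2 keV expressed as a frequency (~4.8e17 Hz)
    nu_2500 = 2.998e10 / 2500e-8            # 2500 Angstrom expressed as a frequency (~1.2e15 Hz)
    # the frequency ratio gives the familiar alpha_ox ~ 0.3838 * log10(f_2keV / f_2500A)
    return np.log10(f_nu_2kev / f_nu_2500A) / np.log10(nu_2kev / nu_2500)
```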

    Patient anxiety and IV sedation in Northern Ireland


    The reluctant polymorph: investigation into the effect of self-association on the solvent mediated phase transformation and nucleation of theophylline

    Little is known concerning the pathway of the crystallization of the thermodynamically stable polymorph of theophylline, form IV. Here we study the reasons why the thermodynamically stable theophylline form IV can be obtained only by slow, solvent mediated phase transformation (SMPT) in specific solvents, and whether the presence of prenucleation aggregates affects the polymorphic outcome. Solution concentration, polymorphic composition and morphology were monitored over time during the transformation from form II to form IV in several solvents. NMR and FTIR spectroscopy were used to detect prenucleation molecular aggregates present in the solutions. It was determined that theophylline self-associates in solvents that are good H-bond donors, and that the presence of these aggregates hinders nucleation and phase transformation. SMPT from form II to form IV is a nucleation-growth controlled polymorphic transformation, nucleation is most likely homogeneous, and form IV crystals grow along the (001) plane, forming plate-like crystals.

    Appearance-based localization for mobile robots using digital zoom and visual compass

    Get PDF
    This paper describes a localization system for mobile robots moving in dynamic indoor environments, which uses probabilistic integration of visual appearance and odometry information. The approach is based on a novel image matching algorithm for appearance-based place recognition that integrates digital zooming, to extend the area of application, and a visual compass. Ambiguous information used for recognizing places is resolved with multiple hypothesis tracking and a selection procedure inspired by Markov localization. This enables the system to deal with perceptual aliasing or an absence of reliable sensor data. The system has been implemented on a robot operating in an office scenario, and the robustness of the approach has been demonstrated experimentally.
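    As a rough illustration of the Markov-localization-style hypothesis handling described above, the sketch below implements a generic discrete Bayes filter over candidate places. The `transition` and `appearance_likelihood` inputs are hypothetical stand-ins for the paper's odometry and image-matching models; this is not the authors' code.

```python
# Generic discrete Bayes filter over N candidate places (Markov-localization style).
import numpy as np

def markov_localization_step(belief, transition, appearance_likelihood):
    """belief: current P(place) over N places;
    transition[i, j] = P(place_t = j | place_{t-1} = i), from odometry;
    appearance_likelihood[j] = P(current image | place j), from the image-matching score."""
    predicted = transition.T @ belief               # motion / odometry prediction
    posterior = appearance_likelihood * predicted   # visual appearance update
    return posterior / posterior.sum()              # normalize; ambiguity keeps several
                                                    # hypotheses alive until evidence resolves it
```

    Keeping the full posterior rather than a single best match is what lets such a filter survive perceptual aliasing: identical-looking places simply share probability mass until odometry disambiguates them.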

    Identifying dynamical systems with bifurcations from noisy partial observation

    Dynamical systems are used to model a variety of phenomena in which the bifurcation structure is a fundamental characteristic. Here we propose a statistical machine-learning approach to derive low-dimensional models that automatically integrate information in noisy time-series data from partial observations. The method is tested using artificial data generated from two cell-cycle control system models that exhibit different bifurcations, and the learned systems are shown to robustly inherit the bifurcation structure. Comment: 16 pages, 6 figures
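    The paper's statistical machine-learning method is not spelled out in the abstract; as a much simpler baseline for the same setting (noisy observations of only part of the state), the sketch below fits the parameters of a toy low-dimensional ODE by simulate-and-compare least squares. The FitzHugh-Nagumo model, drive value, and parameter values are illustrative assumptions, not taken from the paper.

```python
# Toy baseline: recover ODE parameters from noisy observations of one state variable.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

I_DRIVE = 0.5  # constant input; varying it moves FitzHugh-Nagumo across a Hopf bifurcation,
               # the kind of qualitative structure a learned model should preserve

def fitzhugh_nagumo(t, y, a, b, tau):
    v, w = y
    return [v - v**3 / 3.0 - w + I_DRIVE, (v + a - b * w) / tau]

def residuals(theta, t_obs, v_obs):
    a, b, tau = theta
    sol = solve_ivp(fitzhugh_nagumo, (t_obs[0], t_obs[-1]), [0.0, 0.0],
                    t_eval=t_obs, args=(a, b, tau), rtol=1e-6, atol=1e-8)
    return sol.y[0] - v_obs                          # only v is observed (partial observation)

# synthetic noisy, partial observation generated from known parameters
t_obs = np.linspace(0.0, 60.0, 600)
true_theta = (0.7, 0.8, 12.5)
v_clean = solve_ivp(fitzhugh_nagumo, (0.0, 60.0), [0.0, 0.0],
                    t_eval=t_obs, args=true_theta, rtol=1e-6, atol=1e-8).y[0]
v_obs = v_clean + 0.05 * np.random.default_rng(0).normal(size=v_clean.size)

fit = least_squares(residuals, x0=(0.5, 0.5, 10.0),
                    bounds=([0.1, 0.1, 1.0], [2.0, 2.0, 50.0]), args=(t_obs, v_obs))
print(fit.x)   # estimates should land near the true (a, b, tau) for this toy setup
```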

    A population-based approach to background discrimination in particle physics

    Background properties in experimental particle physics are typically estimated using control samples corresponding to large numbers of events. This can provide precise knowledge of average background distributions, but typically does not consider the effect of fluctuations in a data set of interest. A novel approach based on mixture model decomposition is presented as a way to estimate the effect of fluctuations on the shapes of probability distributions in a given data set, with a view to improving on the knowledge of background distributions obtained from control samples. Events are treated as heterogeneous populations comprising particles originating from different processes, and individual particles are mapped to a process of interest on a probabilistic basis. The proposed approach makes it possible to extract from the data information about the effect of fluctuations that would otherwise be lost using traditional methods based on high-statistics control samples. A feasibility study on Monte Carlo is presented, together with a comparison with existing techniques. Finally, the prospects for the development of tools for intensive offline analysis of individual events at the Large Hadron Collider are discussed. Comment: Updated according to the version published in J. Phys.: Conf. Ser. Minor changes have been made to the text with respect to the published article with a view to improving readability.
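    As a generic illustration of mixture-model decomposition at the particle level, the sketch below computes, for each particle, the posterior probability of originating from each process given per-process template densities and mixture fractions. The templates and fractions are hypothetical; this is not the analysis described in the paper.

```python
# Per-particle posterior process assignment from a mixture model.
import numpy as np
from scipy.stats import expon, gamma

def particle_responsibilities(x, template_pdfs, fractions):
    """x: array of per-particle observables; template_pdfs: list of callables
    giving the density of x under each process; fractions: prior mixture
    weights summing to 1. Returns P(process k | particle) for every particle."""
    dens = np.stack([f * pdf(x) for pdf, f in zip(template_pdfs, fractions)], axis=1)
    return dens / dens.sum(axis=1, keepdims=True)

# toy usage: a "signal-like" and a "background-like" template over one observable
pdfs = [lambda x: gamma(a=3, scale=2.0).pdf(x),
        lambda x: expon(scale=4.0).pdf(x)]
x = np.array([1.0, 5.0, 12.0])
print(particle_responsibilities(x, pdfs, fractions=[0.3, 0.7]))
```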
