
    Background subtraction with Dirichlet processes

    Background subtraction is an important first step for video analysis, where it is used to discover the objects of interest for further processing. Such an algorithm often consists of a background model and a regularisation scheme. The background model determines a per-pixel measure of whether a pixel belongs to the background or the foreground, whilst the regularisation brings in information from adjacent pixels. A new method is presented that uses a Dirichlet process Gaussian mixture model to estimate a per-pixel background distribution, followed by probabilistic regularisation. Key advantages include inferring the per-pixel mode count, which allows it to accurately model dynamic backgrounds, and updating its model continuously in a principled way.
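
    As a rough illustration of the idea, the sketch below fits a truncated Dirichlet process Gaussian mixture to the colour history of a single pixel using scikit-learn's variational BayesianGaussianMixture, a common finite approximation to a DP-GMM. The function names, diagonal covariance, and truncation level are illustrative assumptions, not the paper's implementation, and the paper's probabilistic spatial regularisation step is omitted.

        # Hypothetical per-pixel background model: a truncated DP Gaussian
        # mixture fitted to the RGB values a pixel has taken over T frames.
        import numpy as np
        from sklearn.mixture import BayesianGaussianMixture

        def fit_pixel_background(history, max_modes=5):
            """history: (T, 3) array of RGB values observed at one pixel."""
            dpgmm = BayesianGaussianMixture(
                n_components=max_modes,  # truncation level of the DP prior
                weight_concentration_prior_type="dirichlet_process",
                covariance_type="diag",
            )
            return dpgmm.fit(history)

        def foreground_score(dpgmm, pixel):
            # Low background log-likelihood suggests foreground.
            return -dpgmm.score_samples(np.asarray(pixel, float).reshape(1, -1))[0]

    In the paper's setting the model is updated continuously, and the per-pixel scores would then be smoothed by regularisation over neighbouring pixels.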

    Bayesian Analysis of Femtosecond Pump-Probe Photoelectron-Photoion Coincidence Spectra with Fluctuating Laser Intensities

    This paper employs Bayesian probability theory to analyze data generated in femtosecond pump-probe photoelectron-photoion coincidence (PEPICO) experiments, which allow investigating ultrafast dynamical processes in photoexcited molecules. Bayesian probability theory is consistently applied to the data analysis problems arising in these experiments, such as background subtraction and false coincidences. We previously demonstrated that the Bayesian formalism has many advantages, amongst which are compensation of false coincidences, no overestimation of pump-only contributions, significantly increased signal-to-noise ratio, and applicability to any experimental situation and noise statistics. Most importantly, by accounting for false coincidences, our approach allows running experiments at higher ionization rates, resulting in an appreciable reduction of data acquisition times. Extending our previous work, we now include fluctuating laser intensities; the straightforward implementation of this extension highlights yet another advantage of the Bayesian formalism. Our method is thoroughly scrutinized on challenging mock data, where we find a minor impact of laser fluctuations on false coincidences, yet a noteworthy influence on background subtraction. We apply our algorithm to experimental data and discuss the impact of laser fluctuations on the data analysis.
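
    The toy simulation below, a minimal sketch under assumed statistics, illustrates why false coincidences grow with ionization rate and how shot-to-shot laser fluctuations enter: per laser shot the number of ionization events is Poisson, with a rate scaled by a fluctuating intensity (modelled here, purely as an assumption, by a Gamma distribution with unit mean). Shots producing more than one event can mis-pair electrons and ions; none of this code is from the paper.

        # Toy coincidence model: Poisson events per shot, rate modulated by
        # an assumed Gamma-distributed shot-to-shot laser intensity.
        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_shots(n_shots, mean_rate, intensity_shape=20.0):
            # Unit-mean intensity fluctuation; larger shape = more stable laser.
            intensity = rng.gamma(intensity_shape, 1.0 / intensity_shape, n_shots)
            events = rng.poisson(mean_rate * intensity)
            true_pairs = np.sum(events == 1)   # unambiguous electron-ion pairs
            risky_shots = np.sum(events > 1)   # shots that can yield false coincidences
            return true_pairs, risky_shots

        for rate in (0.1, 0.5, 1.0):
            t, r = simulate_shots(100_000, rate)
            print(f"mean rate {rate}: single-event shots {t}, multi-event shots {r}")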

    Background Subtraction via Generalized Fused Lasso Foreground Modeling

    Background Subtraction (BS) is one of the key steps in video analysis. Many background models have been proposed and have achieved promising performance on public data sets. However, due to challenges such as illumination changes and dynamic backgrounds, the resulting foreground segmentation often contains holes as well as background noise. In this regard, we consider generalized fused lasso regularization to recover intact, structured foregrounds. Together with certain assumptions about the background, such as the low-rank assumption or the sparse-composition assumption (depending on whether pure background frames are provided), we formulate BS as a matrix decomposition problem using regularization terms for both the foreground and background matrices. Moreover, under the proposed formulation, the two generally distinct background assumptions can be solved in a unified manner. The optimization is carried out by applying the augmented Lagrange multiplier (ALM) method in such a way that a fast parametric-flow algorithm is used for updating the foreground matrix. Experimental results on several popular BS data sets demonstrate the advantage of the proposed model compared to state-of-the-art methods.
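
    The sketch below shows the overall structure of such a decomposition, splitting a data matrix D (frames stacked as columns, say) into a low-rank background B and a foreground F via an inexact ALM loop. Note the simplification: the paper solves a generalized fused lasso subproblem for F with a fast parametric-flow algorithm, whereas this sketch substitutes plain elementwise soft-thresholding, so it is an RPCA-style skeleton rather than the proposed method; the parameter names and defaults are assumptions.

        # Skeleton of low-rank + sparse decomposition D ≈ B + F via inexact ALM.
        # The foreground step here is soft-thresholding; the paper instead solves
        # a generalized fused lasso subproblem with parametric max-flow.
        import numpy as np

        def soft(x, t):
            return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

        def svt(X, t):
            # Singular value thresholding: proximal operator of the nuclear norm.
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            return U @ np.diag(soft(s, t)) @ Vt

        def decompose(D, lam=None, mu=1e-3, rho=1.5, iters=100):
            lam = lam if lam is not None else 1.0 / np.sqrt(max(D.shape))
            B = np.zeros_like(D); F = np.zeros_like(D); Y = np.zeros_like(D)
            for _ in range(iters):
                B = svt(D - F + Y / mu, 1.0 / mu)    # low-rank background update
                F = soft(D - B + Y / mu, lam / mu)   # simplified foreground update
                Y += mu * (D - B - F)                # multiplier update
                mu = min(mu * rho, 1e6)
            return B, F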

    Black Hole Entropy: Off-Shell vs On-Shell

    Different methods of calculating quantum corrections to the thermodynamical characteristics of a black hole are discussed and compared. The relation between on-shell and off-shell approaches is established. The off-shell methods are used to explicitly demonstrate that the thermodynamical entropy $S^{TD}$ of a black hole, defined by the first thermodynamical law, differs from the statistical-mechanical entropy $S^{SM}$, determined as $S^{SM}=-\mathrm{Tr}(\hat{\rho}^{H}\ln\hat{\rho}^{H})$ for the density matrix $\hat{\rho}^{H}$ of a black hole. It is shown that the observable thermodynamical black hole entropy can be presented in the form $S^{TD}=\pi\bar{r}_{+}^{2}+S^{SM}-S^{SM}_{\mathrm{Rindler}}$. Here $\bar{r}_{+}$ is the radius of the horizon, shifted because of the quantum backreaction effect, and $S^{SM}_{\mathrm{Rindler}}$ is the statistical-mechanical entropy calculated in Rindler space.
    Comment: 47 pages, LaTeX; 7 PostScript figures have been included since the first submission of the article.
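
    For readability, the relations quoted above can be collected in display form; the closing remark on finiteness is a standard reading of such formulas rather than a statement taken from the abstract itself.

        % Entropy relations from the abstract, in display form:
        \begin{align}
          S^{SM} &= -\,\mathrm{Tr}\!\left(\hat{\rho}^{H}\ln\hat{\rho}^{H}\right),\\
          S^{TD} &= \pi\,\bar{r}_{+}^{\,2} + S^{SM} - S^{SM}_{\mathrm{Rindler}}.
        \end{align}
        % Since S^{SM} and S^{SM}_{Rindler} share the same divergent
        % near-horizon contribution, their difference is finite, leaving the
        % Bekenstein-Hawking area term plus finite quantum corrections.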

    Horizon divergences of Fields and Strings in Black Hole backgrounds

    General arguments based on curved space-time thermodynamics show that any extensive quantity, like the free energy or the entropy of thermal matter, always has a divergent boundary contribution in the presence of event horizons, and this boundary term takes the Hawking-Bekenstein form. Although the coefficients depend on the particular geometry, we show that intensive quantities, like the free energy density, are universal in the vicinity of the horizon. From the point of view of the matter degrees of freedom this divergence is of infrared type rather than ultraviolet, and we use this remark to speculate about the fate of these pathologies in String Theory. Finally, we interpret them as instabilities of the Canonical Ensemble with respect to gravitational collapse via the Jeans mechanism.
    Comment: 16 pages, PUPT-1448 (some typos corrected and references added).
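
    The divergence structure referred to above can be made concrete with a standard brick-wall-type estimate (a textbook result, not taken from this paper): for thermal matter at the Hawking temperature, cut off at proper distance $\epsilon$ above a horizon of area $A$, the entropy behaves as shown below.

        % Schematic near-horizon divergence (generic brick-wall estimate):
        \begin{equation}
          S_{\mathrm{div}} \sim c\,\frac{A}{\epsilon^{2}}, \qquad \epsilon \to 0,
        \end{equation}
        % with c a model-dependent constant; taking \epsilon of order the
        % Planck length reproduces the Hawking-Bekenstein form S \propto A/\ell_P^2.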

    Probabilistic Clustering of Time-Evolving Distance Data

    We present a novel probabilistic clustering model for objects that are represented via pairwise distances and observed at different time points. The proposed method utilizes the information given by adjacent time points to find the underlying cluster structure and obtain a smooth cluster evolution. This approach allows the number of objects and clusters to differ at every time point, and no identification of the objects across time points is needed. Further, the model does not require the number of clusters to be specified in advance; it is instead determined automatically using a Dirichlet process prior. We validate our model on synthetic data, showing that the proposed method is more accurate than state-of-the-art clustering methods. Finally, we use our dynamic clustering model to analyze and illustrate the evolution of brain cancer patients over time.
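
    A loose analogue of the per-time-point inference, not the paper's model, can be sketched as follows: embed each time point's pairwise distances with classical MDS, then fit a truncated Dirichlet process mixture so that the number of clusters is inferred rather than fixed. The smooth coupling between adjacent time points, a core contribution of the paper, is deliberately omitted; all names and parameters are assumptions.

        # Per-time-point sketch: MDS embedding of distances, then a truncated
        # DP mixture whose effective cluster count is inferred from the data.
        import numpy as np
        from sklearn.manifold import MDS
        from sklearn.mixture import BayesianGaussianMixture

        def cluster_time_point(dist_matrix, max_clusters=10, seed=0):
            # dist_matrix: (n, n) symmetric pairwise distances at one time point.
            emb = MDS(n_components=2, dissimilarity="precomputed",
                      random_state=seed).fit_transform(dist_matrix)
            dp = BayesianGaussianMixture(
                n_components=max_clusters,  # DP truncation level
                weight_concentration_prior_type="dirichlet_process",
                random_state=seed,
            ).fit(emb)
            return dp.predict(emb)

        # labels_per_t = [cluster_time_point(D_t) for D_t in distance_matrices]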