
    A Robust Iterative Unfolding Method for Signal Processing

    There is a well-known series expansion (the Neumann series) in functional analysis for the perturbative inversion of certain operators on Banach spaces. However, operators that appear in signal processing (e.g. folding and convolution of probability density functions) in general do not satisfy the usual convergence condition of that series expansion. This article provides theorems on convergence criteria for a similar series expansion in this more general case, which is not yet covered in the literature. The main result is that such a series expansion provides a robust, unbiased unfolding and deconvolution method. In the case of deconvolution, the series expansion can always be applied, and the method always recovers the maximum possible information about the initial probability density function; the method is therefore optimal in this sense. A significant advantage of the presented method is that one does not have to introduce ad hoc frequency regularizations, as in the usual naive deconvolution methods. For general unfolding problems, we present a computer-testable sufficient condition for the convergence of the series expansion in question. Some test examples and physics applications are also given. The most important physics example, which originally motivated our study of this topic, is the pi^0 --> gamma+gamma particle decay: we show that our series expansion recovers the initial pi^0 momentum density function from the measured single-gamma momentum density function.
    Comment: 23 pages, 9 figures
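
The series-expansion idea above can be illustrated with a minimal 1-D deconvolution sketch. The kernel and signal below are invented for demonstration (they are not from the paper); the iteration f_{n+1} = f_n + (g - K f_n) sums the Neumann-type series f = sum_n (I - K)^n g for a convolution operator K.

```python
import numpy as np

# Hypothetical example: undo a Gaussian blur by the Neumann-series
# (Van Cittert) iteration  f_{n+1} = f_n + (g - K f_n),  f_0 = g.
x = np.linspace(-5, 5, 201)
kernel = np.exp(-x**2 / (2 * 0.3**2))
kernel /= kernel.sum()                          # normalized response function

f_true = np.exp(-(x - 1)**2) + 0.5 * np.exp(-(x + 2)**2 / 0.5)
g = np.convolve(f_true, kernel, mode="same")    # "measured" folded density

f = g.copy()                                    # zeroth-order term of the series
for _ in range(200):
    f = f + (g - np.convolve(f, kernel, mode="same"))

err0 = np.abs(g - f_true).max()                 # error of the raw measurement
err = np.abs(f - f_true).max()                  # error after the expansion
```

For this smooth test density the partial sums sharpen the estimate well below the raw measurement error; no explicit frequency cutoff is introduced anywhere in the loop.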

    Effect of time-correlation of input patterns on the convergence of on-line learning

    We studied the effects of time correlation between subsequent patterns on the convergence of on-line learning by a feedforward neural network trained with the backpropagation algorithm. Using chaotic time series as sequences of correlated patterns, we found that an unexpected scaling of convergence time with the learning parameter emerges when time-correlated patterns accelerate the learning process.
    Comment: 8 pages (RevTeX), 5 figures
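
The setup described above can be sketched as follows. This is not the paper's exact architecture: a tiny feedforward network (invented sizes and learning parameter) is trained on-line with plain backpropagation to predict the next value of a chaotic logistic-map series, with patterns presented in their natural temporal order so that successive inputs are correlated.

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.empty(3000)                  # chaotic sequence x_{t+1} = 4 x_t (1 - x_t)
x[0] = 0.3
for t in range(2999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

W1 = rng.normal(0.0, 2.0, 8)        # input -> hidden weights
b1 = rng.uniform(-2.0, 2.0, 8)      # hidden biases
W2 = rng.normal(0.0, 0.5, 8)        # hidden -> output weights
b2 = 0.0
eta = 0.05                          # learning parameter (step size)

def forward(u):
    h = np.tanh(W1 * u + b1)
    return h, W2 @ h + b2

for _ in range(10):                 # several on-line passes over the sequence
    for t in range(2999):
        u, target = x[t], x[t + 1]
        h, y = forward(u)
        e = y - target              # output error
        dh = e * W2 * (1.0 - h**2)  # backpropagated hidden error
        W2 -= eta * e * h           # update immediately after each pattern
        b2 -= eta * e
        W1 -= eta * dh * u
        b1 -= eta * dh

# mean squared one-step prediction error on the tail of the series
mse = np.mean([(forward(x[t])[1] - x[t + 1]) ** 2 for t in range(2500, 2999)])
```

Varying `eta` and comparing temporally ordered against shuffled presentations of the same patterns is the kind of experiment the abstract refers to.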

    Supermodeling: Improving Predictions with an Ensemble of Interacting Models

    The modeling of weather and climate has been a success story. The skill of forecasts continues to improve and model biases continue to decrease. Combining the output of multiple models has further improved forecast skill and reduced biases. But are we exploiting the full capacity of state-of-the-art models in making forecasts and projections? Supermodeling is a recent step forward in the multimodel ensemble approach. Instead of combining model output after the simulations are completed, in a supermodel the individual models exchange state information as they run, influencing each other's behavior. By learning the optimal parameters that determine how models influence each other based on past observations, model errors are reduced at an early stage, before they propagate into larger scales and affect other regions and variables. The models synchronize on a common solution that, through learning, remains closer to the observed evolution. Effectively, a new dynamical system has been created, a supermodel, that optimally combines the strengths of the constituent models. The supermodel approach has the potential to rapidly improve current state-of-the-art weather forecasts and climate predictions. In this paper we introduce supermodeling, demonstrate its potential in examples of various complexity, and discuss learning strategies. We conclude with a discussion of remaining challenges for a successful application of supermodeling in the context of state-of-the-art models. The supermodeling approach is not limited to the modeling of weather and climate, but can be applied to improve the prediction capabilities of any complex system for which a set of different models exists.
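
The state-exchange idea can be sketched with a toy example. Everything below (the two imperfect Lorenz-63 models, their parameter values, and the fixed nudging strength) is invented for illustration and is not taken from the paper: two models with different rho parameters nudge each other toward one another's state while they run, and their trajectories synchronize on a common solution.

```python
import numpy as np

def lorenz(s, sigma, rho, beta):
    """Right-hand side of the Lorenz-63 system."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

dt, K = 0.005, 20.0                 # Euler time step and coupling strength
s1 = np.array([1.0, 1.0, 20.0])     # state of model 1 (rho too high)
s2 = np.array([-3.0, 2.0, 25.0])    # state of model 2 (rho too low)

for _ in range(20000):
    d1 = lorenz(s1, 10.0, 29.0, 8.0 / 3.0) + K * (s2 - s1)  # nudged toward 2
    d2 = lorenz(s2, 10.0, 27.0, 8.0 / 3.0) + K * (s1 - s2)  # nudged toward 1
    s1, s2 = s1 + dt * d1, s2 + dt * d2                     # forward-Euler step

gap = np.abs(s1 - s2).max()         # small: the two models move together
```

In an actual supermodel the coupling coefficients would not be fixed by hand as here, but learned from past observations, which is the point of the paper.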

    A remark on the dimension of the Bergman space of some Hartogs domains

    Let D be a Hartogs domain of the form D = {(z,w) \in C x C^N : |w| < e^{-u(z)}}, where u is a subharmonic function on C. We prove that the Bergman space of holomorphic and square-integrable functions on D is either trivial or infinite-dimensional.
    Comment: 12 pages

    Optimal control as a graphical model inference problem

    We reformulate a class of non-linear stochastic optimal control problems introduced by Todorov (2007) as a Kullback-Leibler (KL) minimization problem. As a result, the optimal control computation reduces to an inference computation, and approximate inference methods can be applied to efficiently compute approximate optimal controls. We show how this KL control theory contains the path integral control method as a special case. We provide an example of a block-stacking task and a multi-agent cooperative game, where we demonstrate how approximate inference can be successfully applied to instances that are too complex for exact computation. We discuss the relation of the KL control approach to other inference approaches to control.
    Comment: 26 pages, 12 figures; Machine Learning Journal (2012)
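
The reduction of control to inference can be sketched on a minimal linearly solvable problem in the spirit of Todorov's formulation. The 5-state random walk and its costs below are invented for illustration; the point is that the Bellman recursion becomes linear in the desirability function z, so no minimization step is needed.

```python
import numpy as np

n, T = 5, 20
q = np.ones(n)                      # state cost per step
q[4] = 0.0                          # state 4 is the cheap "goal" state

P = np.zeros((n, n))                # uncontrolled (passive) random walk
for s in range(n):
    P[s, max(s - 1, 0)] += 0.5
    P[s, min(s + 1, n - 1)] += 0.5

# Backward recursion for the desirability z_t(s):
#   z_T = exp(-q),   z_t = exp(-q) * (P @ z_{t+1})
z = np.exp(-q)
for _ in range(T):
    z = np.exp(-q) * (P @ z)

# The optimal controlled transition reweights the passive dynamics by z:
#   u*(s'|s)  proportional to  P[s, s'] * z(s')
u = P * z[None, :]
u /= u.sum(axis=1, keepdims=True)   # normalize rows into a policy
```

From state 2, for instance, the passive walk is symmetric, but the optimal policy `u` is biased toward the cheap state 4 because z grows toward it; in larger instances the matrix-vector products would be replaced by the approximate inference the abstract refers to.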