
    Inhomogeneity of the phase space of the damped harmonic oscillator under Lévy noise

    The damped harmonic oscillator under symmetric L\'{e}vy white noise shows an inhomogeneous phase space, in contrast to the homogeneous one of the same oscillator under Gaussian white noise, as shown in a recent paper [I. M. Sokolov, W. Ebeling, and B. Dybiec, Phys. Rev. E \textbf{83}, 041118 (2011)]. The inhomogeneity of the phase space reveals a certain correlation between the coordinate and the velocity of the damped oscillator under symmetric L\'{e}vy white noise. In the present work we further explore the physical origin of these distinctive features and find that they arise from the combination of damping and the heavy tail of the noise. We demonstrate this directly in plots of the reduced coordinate $\tilde{x}$ versus the reduced velocity $\tilde{v}$ and identify the physics behind the anti-association of the coordinate and velocity.
    Comment: 7 pages, 10 figures; a full version of the published paper
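The dynamics described above can be sketched numerically. Below is a minimal Euler-Maruyama integration of a damped harmonic oscillator driven by symmetric alpha-stable (Lévy) white noise, using the Chambers-Mallows-Stuck method to sample the stable increments. This is an illustrative sketch, not the paper's scheme; the function name and all parameter values (`alpha`, `gamma`, `omega`, `dt`) are assumed choices.

```python
import numpy as np

def simulate_levy_oscillator(alpha=1.5, gamma=0.5, omega=1.0,
                             dt=1e-3, n_steps=100_000, seed=0):
    """Integrate dx = v dt, dv = (-gamma*v - omega^2*x) dt + dL_t,
    where dL_t is a symmetric alpha-stable increment (assumed parameters)."""
    rng = np.random.default_rng(seed)
    x, v = 0.0, 0.0
    xs = np.empty(n_steps)
    vs = np.empty(n_steps)
    for i in range(n_steps):
        # Symmetric alpha-stable variate via the Chambers-Mallows-Stuck method
        u = rng.uniform(-np.pi / 2, np.pi / 2)
        w = rng.exponential(1.0)
        s = (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
             * (np.cos((1.0 - alpha) * u) / w) ** ((1.0 - alpha) / alpha))
        dL = dt ** (1.0 / alpha) * s  # increments scale as dt^(1/alpha)
        x += v * dt
        v += (-gamma * v - omega ** 2 * x) * dt + dL
        xs[i], vs[i] = x, v
    return xs, vs
```

Scatter-plotting `vs` against `xs` would then make any coordinate-velocity correlation in the phase space visible; with `alpha = 2` the increments reduce to Gaussian noise for comparison.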

    Distributed Stochastic Optimization over Time-Varying Noisy Network

    This paper is concerned with a distributed stochastic multi-agent optimization problem over a class of time-varying networks with slowly decreasing communication noise effects. The problem is considered in the composite optimization setting, which is more general for noisy network optimization. It is noteworthy that existing methods for noisy network optimization are Euclidean-projection based. We present two related classes of non-Euclidean methods and investigate their convergence behavior. One is a distributed stochastic composite mirror descent type method (DSCMD-N), which provides a more general algorithmic framework than earlier works in this literature. As a counterpart, we also consider a distributed stochastic composite dual averaging type method (DSCDA-N) for noisy network optimization. Main error bounds for DSCMD-N and DSCDA-N are obtained. The trade-off among stepsizes, noise decay rates, and convergence rates of the algorithms is analyzed in detail. To the best of our knowledge, this is the first work to derive convergence rates of optimization algorithms in the noisy network setting. We show that an optimal rate of $O(1/\sqrt{T})$ for nonsmooth convex optimization can be obtained for the proposed methods under an appropriate communication noise condition. Moreover, convergence rates of different orders are comprehensively derived in both the expectation and high-probability senses.
    Comment: 27 pages
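One round of a distributed mirror-descent-style method under communication noise can be sketched as follows. This is an illustrative toy, not the paper's DSCMD-N algorithm: it uses the negative-entropy mirror map on the probability simplex (so the mirror step has the closed-form exponentiated-gradient update), additive Gaussian communication noise, and a doubly stochastic mixing matrix. All names and parameters are assumptions for the sketch.

```python
import numpy as np

def noisy_consensus_mirror_step(X, W, grads, step, noise_std, rng):
    """One round: consensus over noisy links, then an entropic mirror step.

    X      : (n_agents, d) current iterates, each row on the probability simplex
    W      : (n_agents, n_agents) doubly stochastic mixing (gossip) matrix
    grads  : (n_agents, d) stochastic (sub)gradients of each agent's local loss
    step   : stepsize; noise_std models the decaying communication noise level
    """
    # Consensus: mix neighbor states corrupted by additive communication noise
    noisy = X + noise_std * rng.standard_normal(X.shape)
    mixed = W @ noisy
    mixed = np.clip(mixed, 1e-12, None)  # keep strictly inside the simplex
    # Entropic mirror descent (exponentiated gradient) update, row-wise
    Y = mixed * np.exp(-step * grads)
    return Y / Y.sum(axis=1, keepdims=True)
```

Iterating this step with decaying `step` and `noise_std` schedules mimics the stepsize/noise-decay trade-off the abstract refers to; replacing the entropy mirror map with the squared Euclidean norm would recover a noisy projected-subgradient round instead.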