404,984 research outputs found

    Location prediction based on a sector snapshot for location-based services

    Get PDF
    In location-based services (LBSs), the service is provided based on the users' locations through location determination and mobility realization. Most current location prediction research focuses on generalized location models in which the geographic extent is divided into regular-shaped cells. These models are not suitable for certain LBSs whose objective is to compute and present on-road services. Techniques of this kind are the new Markov-based mobility prediction (NMMP) and the prediction location model (PLM), which deal with inner cell structure and different levels of prediction, respectively, but suffer from complex computation, accuracy-rate regression, and insufficient accuracy. In this paper, a novel cell splitting algorithm is proposed, together with a new prediction technique. The cell splitting algorithm is universal, so it can be applied to all types of cells; here it is applied to the Micro cell in parallel with the new prediction technique. The prediction technique is compared with two classic prediction techniques, and the experimental results show the effectiveness and robustness of the new splitting algorithm and prediction technique.
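
    As a rough orientation only, the sketch below implements a generic first-order Markov next-sector predictor over the labels produced by a cell split. It is not the NMMP or PLM technique, nor the paper's splitting algorithm; the sector labels and the movement trace are hypothetical.

        # Generic first-order Markov next-sector predictor (illustrative only;
        # NOT the paper's NMMP/PLM techniques or its cell-splitting algorithm).
        from collections import defaultdict

        def train_transitions(history):
            """Count sector-to-sector transitions from an observed movement history."""
            counts = defaultdict(lambda: defaultdict(int))
            for prev, nxt in zip(history, history[1:]):
                counts[prev][nxt] += 1
            return counts

        def predict_next(counts, current):
            """Return the most frequently observed successor of the current sector."""
            successors = counts.get(current)
            if not successors:
                return None  # no data observed for this sector yet
            return max(successors, key=successors.get)

        # Hypothetical movement trace over sectors of a split Micro cell.
        trace = ["A1", "A2", "B1", "A2", "B1", "B2", "A2", "B1"]
        model = train_transitions(trace)
        print(predict_next(model, "A2"))  # in this trace, "A2" is most often followed by "B1"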

    Splitting methods for low Mach number Euler and Navier-Stokes equations

    Get PDF
    Examined are some splitting techniques for low Mach number Euler flows. Shortcomings of some of the proposed methods are pointed out and an explanation for their inadequacy is suggested. A symmetric splitting for both the Euler and Navier-Stokes equations is then presented which removes the stiffness of these equations when the Mach number is small. The splitting is shown to be stable.
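
    For orientation, the following standard textbook illustration (not the specific splitting constructed in the paper) shows where the stiffness sits. For the 1D linearized Euler equations in symmetrizing variables $w = (u',\, p'/(\rho c))^{\mathsf T}$,

        $$
        w_t + A\,w_x = 0, \qquad
        A = \begin{pmatrix} u & c \\ c & u \end{pmatrix}
          = u\,I \;+\; c\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix},
        $$

    so the symmetric acoustic block carries the $O(c)$ eigenvalues $\pm c$, while the convective part is only $O(u)$. When the Mach number $M = u/c$ is small, treating the acoustic block implicitly and the convective part explicitly removes the stiffness.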

    Asymptotic behavior of splitting schemes involving time-subcycling techniques

    Get PDF
    This paper deals with the numerical integration of well-posed multiscale systems of ODEs or evolutionary PDEs. As these systems appear naturally in engineering problems, time-subcycling techniques are widely used to improve computational efficiency. These methods rely on a decomposition of the vector field into a fast part and a slow part and take advantage of that decomposition. This way, if an unconditionally stable (semi-)implicit scheme cannot easily be implemented, one can integrate the fast equations with a much smaller time step than that of the slow equations, instead of having to integrate the whole system with a very small time step to ensure stability. One can then build a numerical integrator using a standard composition method, such as a Lie or a Strang formula. Such methods are primarily designed to converge in short time to the solution of the original problem. However, their long-time behavior raises interesting questions whose answers are not well known. In particular, when the solutions of the problems converge in time to an asymptotic equilibrium state, the asymptotic accuracy of the numerical long-time limit of the schemes, as well as the rate of convergence, is certainly of interest. In this context, the asymptotic error is defined as the difference between the exact and numerical asymptotic states. The goal of this paper is to apply such numerical methods based on splitting schemes with subcycling to simple examples of evolutionary ODEs and PDEs that have attractive equilibrium states, to address the aforementioned questions of asymptotic accuracy, to perform a rigorous analysis, and to compare them with their counterparts without subcycling. Our analysis is developed on simple linear ODE and PDE toy models and is illustrated with several numerical experiments on these toy models as well as on more complex systems.
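
    The sketch below is a minimal illustration in the spirit of this setting, not one of the paper's schemes: a scalar ODE with a fast attractive part and a slow attractive part, integrated by Lie splitting in which the fast part is subcycled with explicit Euler substeps. All parameter values are illustrative; the printed "asymptotic error" is the distance between the scheme's numerical limit and the exact equilibrium, i.e. the quantity the abstract refers to.

        # Lie splitting with time-subcycling on y' = f(y) + s(y), where f is a
        # fast attractive part and s a slow one (all values are illustrative).
        k_fast, a = 50.0, 1.0          # fast relaxation toward a
        k_slow, b = 1.0, 0.0           # slow relaxation toward b
        f = lambda y: -k_fast * (y - a)
        s = lambda y: -k_slow * (y - b)
        y_star = (k_fast * a + k_slow * b) / (k_fast + k_slow)   # exact equilibrium

        def lie_subcycled_step(y, h, n_sub):
            """One Lie step: explicit Euler on the slow part with step h, then
            n_sub explicit Euler substeps of size h/n_sub on the fast part."""
            y = y + h * s(y)
            for _ in range(n_sub):
                y = y + (h / n_sub) * f(y)
            return y

        def numerical_limit(h, n_sub, n_steps=500, y0=0.5):
            y = y0
            for _ in range(n_steps):
                y = lie_subcycled_step(y, h, n_sub)
            return y

        for n_sub in (1, 5, 20):
            y_inf = numerical_limit(h=0.02, n_sub=n_sub)
            print(f"n_sub={n_sub:3d}  asymptotic error={abs(y_inf - y_star):.2e}")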

    Accelerated Consensus via Min-Sum Splitting

    Full text link
    We apply the Min-Sum message-passing protocol to solve the consensus problem in distributed optimization. We show that while the ordinary Min-Sum algorithm does not converge, a modified version of it known as Splitting yields convergence to the problem solution. We prove that a proper choice of the tuning parameters allows Min-Sum Splitting to yield subdiffusive accelerated convergence rates, matching the rates obtained by shift-register methods. The acceleration scheme embodied by Min-Sum Splitting for the consensus problem bears similarities with lifted Markov chain techniques and with multi-step first-order methods in convex optimization.
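
    The sketch below is not the Min-Sum Splitting protocol; it only contrasts plain first-order average consensus with a two-register ("shift-register" style) accelerated iteration, the kind of baseline whose rates the paper's method is shown to match. The graph, weights, and momentum choice are illustrative assumptions.

        # Plain vs. shift-register-accelerated average consensus on a ring graph
        # (illustrative baseline only; NOT the Min-Sum Splitting protocol).
        import numpy as np

        n = 20
        rng = np.random.default_rng(0)
        x0 = rng.normal(size=n)
        avg = x0.mean()

        # Doubly stochastic weight matrix: each node averages with its two ring neighbours.
        W = np.zeros((n, n))
        for i in range(n):
            W[i, i] = 0.5
            W[i, (i - 1) % n] = 0.25
            W[i, (i + 1) % n] = 0.25

        # Momentum parameter from the second-largest eigenvalue magnitude of W
        # (a standard choice for this two-register acceleration).
        eigs = np.sort(np.abs(np.linalg.eigvalsh(W)))
        rho = eigs[-2]
        omega = 2.0 / (1.0 + np.sqrt(1.0 - rho ** 2))

        x_plain = x0.copy()
        x_prev, x_curr = x0.copy(), x0.copy()
        for _ in range(200):
            x_plain = W @ x_plain                                  # first-order consensus
            x_next = omega * (W @ x_curr) + (1.0 - omega) * x_prev # two-register update
            x_prev, x_curr = x_curr, x_next

        print("plain consensus error      :", np.max(np.abs(x_plain - avg)))
        print("accelerated consensus error:", np.max(np.abs(x_curr - avg)))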

    Local Linear Convergence Analysis of Primal-Dual Splitting Methods

    Full text link
    In this paper, we study the local linear convergence properties of a versatile class of Primal-Dual splitting methods for minimizing composite non-smooth convex optimization problems. Under the assumption that the non-smooth components of the problem are partly smooth relative to smooth manifolds, we present a unified local convergence analysis framework for these methods. More precisely, in our framework we first show that (i) the sequences generated by Primal-Dual splitting methods identify a pair of primal and dual smooth manifolds in a finite number of iterations, and then (ii) enter a local linear convergence regime, which is characterized based on the structure of the underlying active smooth manifolds. We also show how our results for Primal-Dual splitting can be specialized to cover existing ones on Forward-Backward splitting and Douglas-Rachford splitting/ADMM (the alternating direction method of multipliers). Moreover, based on the obtained local convergence analysis, several practical acceleration techniques are discussed. To exemplify the usefulness of the obtained results, we consider several concrete numerical experiments arising from fields including signal/image processing, inverse problems, and machine learning. The demonstration not only verifies the local linear convergence behaviour of Primal-Dual splitting methods but also provides insights into how to accelerate them in practice.
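
    The sketch below runs a standard member of this family, a PDHG / Chambolle-Pock-style primal-dual splitting, on a tiny 1D total-variation denoising problem, only to illustrate the kind of composite problem involved and the finite identification of an active structure (here, the jump locations of the recovered signal). Problem data, step sizes, and the detection threshold are illustrative assumptions, not taken from the paper.

        # PDHG / Chambolle-Pock-style primal-dual splitting for
        #     min_x 0.5*||x - b||^2 + lam*||D x||_1   (1D TV denoising, illustrative)
        import numpy as np

        rng = np.random.default_rng(1)
        n, lam = 50, 0.5
        signal = np.concatenate([np.zeros(25), np.ones(25)])   # piecewise-constant truth
        b = signal + 0.05 * rng.normal(size=n)                  # noisy observation

        D = np.diff(np.eye(n), axis=0)                          # forward differences, ||D||^2 <= 4
        tau = sigma = 0.45                                      # tau * sigma * ||D||^2 <= 1

        x = b.copy()
        x_bar = x.copy()
        y = np.zeros(n - 1)
        for _ in range(2000):
            # dual step: prox of the conjugate of lam*||.||_1 is a clip to [-lam, lam]
            y = np.clip(y + sigma * (D @ x_bar), -lam, lam)
            # primal step: prox of 0.5*||. - b||^2 with step tau
            x_new = (x - tau * (D.T @ y) + tau * b) / (1.0 + tau)
            x_bar = 2.0 * x_new - x                             # extrapolation, theta = 1
            x = x_new

        jumps = np.nonzero(np.abs(D @ x) > 1e-3)[0]
        print("identified jump locations:", jumps)              # ideally only the true jump at index 24

    In exact arithmetic, the set of saturated dual coordinates (and hence the set of jumps) stabilizes after finitely many iterations, which is the manifold-identification behaviour described in the abstract.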