
    A study on machine learning algorithms for fall detection and movement classification

    Falls among the elderly are an important health issue, and fall detection and movement tracking techniques are therefore instrumental in dealing with it. This thesis responds to the challenge of classifying different movement types as part of a system designed to fulfill the need for a wearable device that collects data for fall and near-fall analysis. Four fall activities (forward, backward, left and right), three normal activities (standing, walking and lying down) and near-fall situations are identified and detected. Different machine learning algorithms are compared, and the best-performing one is used for real-time classification. The comparison is made using the Waikato Environment for Knowledge Analysis (WEKA). The system can also adapt to the different gaits of different people. A feature selection algorithm is also introduced to reduce the number of features required for the classification problem.
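    As a rough illustration of the classifier comparison described above, here is a minimal Python/scikit-learn sketch. The synthetic features stand in for the thesis's wearable-sensor data, and the classifier set, class count and feature-selection step are assumptions for illustration, not the exact configuration of the work (which used WEKA):

    # Sketch of a classifier comparison for movement classification.
    # Assumptions: synthetic features replace real sensor data; the classifier
    # list and the 8 activity classes only mirror the abstract's description.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.svm import SVC
    from sklearn.feature_selection import SelectKBest, f_classif

    # 8 classes: 4 fall directions, 3 normal activities, near-fall
    X, y = make_classification(n_samples=2000, n_features=40, n_informative=12,
                               n_classes=8, n_clusters_per_class=1, random_state=0)

    # Simple univariate feature selection (the thesis introduces its own algorithm)
    X_sel = SelectKBest(f_classif, k=15).fit_transform(X, y)

    classifiers = {
        "decision tree": DecisionTreeClassifier(random_state=0),
        "k-NN": KNeighborsClassifier(n_neighbors=5),
        "naive Bayes": GaussianNB(),
        "SVM (RBF)": SVC(),
    }

    for name, clf in classifiers.items():
        scores = cross_val_score(clf, X_sel, y, cv=5)
        print(f"{name}: mean accuracy {scores.mean():.3f}")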

    Generalized Forward-Backward Splitting with Penalization for Monotone Inclusion Problems

    We introduce a generalized forward-backward splitting method with a penalty term for solving monotone inclusion problems involving the sum of a finite number of maximally monotone operators and the normal cone to the nonempty set of zeros of another maximally monotone operator. We show weak ergodic convergence of the generated sequence of iterates to a solution of the considered monotone inclusion problem, provided that a condition formulated in terms of the Fitzpatrick function of the operator describing the set underlying the normal cone is fulfilled. Under strong monotonicity of one of the operators, we show strong convergence of the iterates. Furthermore, we apply the proposed method to a large-scale hierarchical minimization problem concerning the sum of differentiable and nondifferentiable convex functions subject to the set of minima of another differentiable convex function. We illustrate the functionality of the method through numerical experiments addressing constrained elastic net and generalized Heron location problems.
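    The forward-backward-with-penalty idea can be sketched numerically. The Python snippet below takes a proximal gradient step on a smooth term plus a growing penalty on a lower-level least-squares objective, loosely mimicking a constrained elastic net; the objective weights, penalty schedule and step-size rule are illustrative assumptions, not the conditions analyzed in the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((60, 100))
    b = A @ rng.standard_normal(100) * 0.5 + 0.1 * rng.standard_normal(60)

    gamma = 1.0   # weight of the smooth l2 term (assumed value)
    lam1 = 0.1    # weight of the l1 term (assumed value)

    # Hierarchical problem (sketch): minimize  lam1*||x||_1 + (gamma/2)*||x||^2
    # over the set of minimizers of the lower-level objective h(x) = 0.5*||Ax - b||^2.
    grad_f = lambda x: gamma * x
    grad_h = lambda x: A.T @ (A @ x - b)
    prox_g = lambda x, t: np.sign(x) * np.maximum(np.abs(x) - t * lam1, 0.0)  # soft-thresholding

    L_f = gamma
    L_h = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of grad_h

    x = np.zeros(100)
    for k in range(1, 5001):
        beta_k = k ** 0.6                    # growing penalty parameter (assumed schedule)
        step = 1.0 / (L_f + beta_k * L_h)    # step size for the forward (gradient) part
        x = prox_g(x - step * (grad_f(x) + beta_k * grad_h(x)), step)

    print("lower-level residual ||Ax - b||:", np.linalg.norm(A @ x - b))
    print("upper-level objective value:", lam1 * np.abs(x).sum() + 0.5 * gamma * x @ x)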

    A Comparison of Algorithms for Learning Hidden Variables in Normal Graphs

    A Bayesian factor graph reduced to normal form consists of the interconnection of diverter units (or equality constraint units) and Single-Input/Single-Output (SISO) blocks. In this framework, localized adaptation rules are derived explicitly from a constrained maximum likelihood (ML) formulation and from a minimum KL-divergence criterion using KKT conditions. The learning algorithms are compared with two other updating equations, based on a Viterbi-like approximation and on a variational approximation respectively. The performance of the various algorithms is verified on synthetic data sets for various architectures. The objective of this paper is to provide the programmer with explicit algorithms for rapid deployment of Bayesian graphs in applications.
    Comment: Submitted for journal publication
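    For orientation, the building blocks mentioned above (SISO blocks and diverter/equality units exchanging discrete messages) can be sketched in a few lines of Python. This shows only sum-product-style message combination on a toy graph; it does not reproduce the paper's ML or KL-divergence learning rules, and all sizes and matrices are made up for illustration:

    import numpy as np

    # Sketch of message updates in a normal-form (Forney-style) factor graph:
    # a SISO block carries a conditional probability matrix P[x, y]; a diverter
    # (equality constraint) node combines incoming messages by elementwise product.

    def siso_forward(msg_in, P):
        """Propagate a discrete message forward through a SISO block."""
        out = msg_in @ P
        return out / out.sum()

    def siso_backward(msg_in, P):
        """Propagate a message backwards through the same SISO block."""
        out = P @ msg_in
        return out / out.sum()

    def diverter(*msgs):
        """Equality-constraint (diverter) node: normalized elementwise product."""
        out = np.ones_like(msgs[0])
        for m in msgs:
            out = out * m
        return out / out.sum()

    # Toy example: hidden variable with 3 states, two observation branches.
    P1 = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.2, 0.2, 0.6]])
    P2 = np.array([[0.6, 0.4, 0.0], [0.0, 0.5, 0.5], [0.3, 0.3, 0.4]])

    prior = np.array([1 / 3, 1 / 3, 1 / 3])
    evidence1 = np.array([0.0, 1.0, 0.0])   # observed state on branch 1
    evidence2 = np.array([0.0, 0.0, 1.0])   # observed state on branch 2

    # Messages flowing back from the observations toward the hidden variable
    back1 = siso_backward(evidence1, P1)
    back2 = siso_backward(evidence2, P2)
    posterior = diverter(prior, back1, back2)
    print("posterior over hidden states:", posterior)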