25 research outputs found

    Continuous Realtime Gesture Following and Recognition

    Full text link

    Automatic primitive finding for action modeling

    Get PDF

    A Survey on Behavior Analysis in Video Surveillance Applications

    Get PDF

    T-Patterns Revisited: Mining for Temporal Patterns in Sensor Data

    Get PDF
    The trend to use large numbers of simple sensors, as opposed to a few complex sensors, to monitor places and systems creates a need for temporal pattern mining algorithms that work on such data. Existing methods that try to discover re-usable and interpretable patterns in temporal event data have several shortcomings. We contrast several recent approaches to the problem and extend the T-Pattern algorithm, which was previously applied to the detection of sequential patterns in the behavioural sciences. The temporal complexity of the T-Pattern approach is prohibitive in the scenarios we consider. We remedy this with a statistical model to obtain a fast and robust algorithm for finding patterns in temporal data. We test our algorithm on a recent database, collected with passive infrared sensors, containing millions of events.
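    As a rough illustration of the critical-interval idea behind T-patterns, the sketch below checks whether an event B follows an event A within a candidate interval more often than a uniform null model would predict. The function name, the uniform-rate null hypothesis and the interval bounds are illustrative assumptions, not the paper's exact formulation or statistical model.

```python
import math

def critical_interval_significance(a_times, b_times, d1, d2, t_total):
    """Hypothetical T-pattern-style test: does event B follow event A
    within the interval [d1, d2] more often than chance would predict?

    a_times, b_times : sorted lists of event timestamps (seconds)
    d1, d2           : candidate critical-interval bounds (seconds)
    t_total          : total observation time (seconds)
    """
    # Null hypothesis: B events are scattered uniformly in time, so the
    # chance that at least one B falls inside a window of length
    # (d2 - d1) is roughly a per-window hit probability p0.
    rate_b = len(b_times) / t_total
    p0 = min(1.0, rate_b * (d2 - d1))

    # Count how many A occurrences are actually followed by a B
    # inside the candidate interval.
    hits = sum(
        1 for ta in a_times
        if any(ta + d1 <= tb <= ta + d2 for tb in b_times)
    )

    # One-sided binomial tail: probability of observing >= hits
    # successes in len(a_times) trials under the null probability p0.
    n = len(a_times)
    p_value = sum(
        math.comb(n, k) * p0**k * (1 - p0)**(n - k)
        for k in range(hits, n + 1)
    )
    return hits, p_value
```

    A small p-value for a given (d1, d2) would mark the A-then-B pair as a candidate pattern; longer patterns are then built by treating the pair itself as a new event type.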

    A semantic-based probabilistic approach for real-time video event recognition

    Full text link
    This is the author's version of a work that was accepted for publication in the journal Computer Vision and Image Understanding. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Computer Vision and Image Understanding, 116(9), 2012. DOI: 10.1016/j.cviu.2012.04.005

    This paper presents an approach for real-time video event recognition that combines the accuracy and descriptive capabilities of, respectively, probabilistic and semantic approaches. Based on a state-of-the-art knowledge representation, we define a methodology for building recognition strategies from event descriptions that consider the uncertainty of the low-level analysis. Then, we efficiently organize such strategies for performing the recognition according to the temporal characteristics of events. In particular, we use Bayesian Networks and probabilistically-extended Petri Nets for recognizing, respectively, simple and complex events. To demonstrate the proposed approach, a framework has been implemented for recognizing human-object interactions in the video monitoring domain. The experimental results show that our approach improves event recognition performance as compared to the widely used deterministic approach.

    This work has been partially supported by the Spanish Administration agency CDTI (CENIT-VISION 2007-1007), by the Spanish Government (TEC2011-25995 EventVideo), by the Consejería de Educación of the Comunidad de Madrid, and by the European Social Fund.
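    To make the probabilistic side of this concrete, here is a minimal naive-Bayes-style sketch of fusing uncertain low-level detections into a posterior over a simple event. The event names, priors and likelihoods are invented for illustration; the paper's actual Bayesian Networks and probabilistically-extended Petri Nets are considerably richer than this.

```python
# Hypothetical priors over two mutually exclusive simple events.
PRIOR = {"pick_up_object": 0.05, "no_interaction": 0.95}

# P(detector fires | event), one entry per low-level detector output.
LIKELIHOOD = {
    "pick_up_object": {"person_near_object": 0.9, "hand_motion": 0.8},
    "no_interaction": {"person_near_object": 0.2, "hand_motion": 0.3},
}

def posterior(observations):
    """observations: dict detector -> bool,
    e.g. {'person_near_object': True, 'hand_motion': False}"""
    scores = {}
    for event, prior in PRIOR.items():
        p = prior
        for obs, present in observations.items():
            p_obs = LIKELIHOOD[event][obs]
            p *= p_obs if present else (1.0 - p_obs)
        scores[event] = p
    z = sum(scores.values())
    return {event: p / z for event, p in scores.items()}

print(posterior({"person_near_object": True, "hand_motion": True}))
```

    Complex events would then be recognized by chaining such simple-event posteriors through a temporal structure (the role the Petri Nets play in the paper), rather than by a single flat classifier.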

    Trajectory Based Activity Discovery

    Get PDF
    This paper proposes a framework to discover activities in an unsupervised manner and add semantics with minimal supervision. The framework uses basic trajectory information as input and goes up to video interpretation. The work reduces the gap between low-level information and semantic interpretation by building an intermediate layer composed of Primitive Events. The proposed representation for primitive events aims at capturing small meaningful motions over the scene, with the advantage of being learnt in an unsupervised manner. We propose the discovery of an activity using these Primitive Events as the main descriptors. The activity discovery is done using only real tracking data. Semantics are added to the discovered activities, and the recognition of activities (e.g., “Cooking”, “Eating”) can then be done automatically on new datasets. Finally, we validate the descriptors by discovering and recognizing activities in a home care application dataset.
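    One plausible reading of the "Primitive Event" layer, sketched below, is that short trajectory steps, described by where they start and how they move, are clustered without supervision, so that each track becomes a sequence of cluster ids. The feature vector, the cluster count and the use of k-means are assumptions made for illustration, not the authors' exact method.

```python
import numpy as np
from sklearn.cluster import KMeans

def primitive_events(trajectories, n_clusters=8):
    """Cluster short motion segments into 'primitive events'.

    trajectories : list of (T_i, 2) arrays of (x, y) positions
    Returns the fitted clustering model and, per trajectory, the
    sequence of primitive-event labels.
    """
    # Describe each consecutive step by its start position and displacement.
    segments, owners = [], []
    for i, traj in enumerate(trajectories):
        traj = np.asarray(traj, dtype=float)
        for t in range(len(traj) - 1):
            x, y = traj[t]
            dx, dy = traj[t + 1] - traj[t]
            segments.append([x, y, dx, dy])
            owners.append(i)

    segments = np.array(segments)
    model = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    labels = model.fit_predict(segments)

    # Re-group the labels per trajectory: each trajectory becomes a
    # sequence of primitive-event ids, i.e. the intermediate layer
    # between raw tracks and semantic activities.
    owners = np.array(owners)
    sequences = [labels[owners == i].tolist() for i in range(len(trajectories))]
    return model, sequences
```

    Activities such as “Cooking” or “Eating” would then be discovered and labelled over these label sequences rather than over the raw trajectories.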

    Trajectory based Primitive Events for learning and recognizing Activity

    Get PDF
    This paper proposes a framework to recognize and classify loosely constrained activities with minimal supervision. The framework uses basic trajectory information as input and goes up to video interpretation. The work reduces the gap between low-level information and semantic interpretation by building an intermediate layer composed of Primitive Events. The proposed representation for primitive events aims at capturing small meaningful motions over the scene, with the advantage of being learnt in an unsupervised manner. We propose the modelling of an activity using Primitive Events as the main descriptors. The activity model is built in a semi-supervised way using only real tracking data. Finally, we validate the descriptors by recognizing and labelling modelled activities in a home-care application dataset.