3,601 research outputs found

    Traffic monitoring using image processing : a thesis presented in partial fulfillment of the requirements for the degree of Master of Engineering in Information and Telecommunications Engineering at Massey University, Palmerston North, New Zealand

    Get PDF
    Traffic monitoring involves the collection of data describing the characteristics of vehicles and their movements. Such data may be used for automatic tolling, congestion and incident detection, law enforcement, road capacity planning, and other applications. With recent advances in computer vision technology, video can be analysed automatically and the information relevant to a particular application can be extracted. Automatic surveillance using video cameras and image processing techniques is becoming a powerful and useful technology for traffic monitoring. In this research project, a video image processing system with the potential for real-time operation is developed for traffic monitoring, including vehicle tracking, counting, and classification. A heuristic approach is applied in developing the system: it is divided into several parts, and the individual functional components have been built and tested on traffic video sequences. Evaluations show that the system is robust and can be developed towards real-time applications.
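    As a rough illustration of the kind of pipeline such a system involves (not the thesis's actual heuristic design), a minimal background-subtraction vehicle counter could be sketched as follows; the OpenCV calls, the virtual count line, and all thresholds are illustrative assumptions:

```python
# Minimal sketch of a common vehicle-counting approach for fixed-camera video:
# background subtraction, blob extraction, and a virtual count line.
# Illustrative only; not the heuristic pipeline described in the thesis.
import cv2
import numpy as np

def count_vehicles(video_path, line_y=300, min_area=500):
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=50)
    kernel = np.ones((3, 3), np.uint8)
    count, prev_centroids = 0, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)   # drop shadows and noise
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel, iterations=2)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        centroids = []
        for c in contours:
            if cv2.contourArea(c) >= min_area:                       # ignore small blobs
                x, y, w, h = cv2.boundingRect(c)
                centroids.append((x + w // 2, y + h // 2))
        # Count a vehicle when a blob's centroid crosses the virtual line downwards.
        for cx, cy in centroids:
            for px, py in prev_centroids:
                if abs(cx - px) < 30 and py < line_y <= cy:
                    count += 1
        prev_centroids = centroids
    cap.release()
    return count
```

    Vehicle classification could then be attempted at the same stage from blob size or aspect ratio, although the thesis does not necessarily take that route.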

    Tracking of motor vehicles from aerial video imagery using the OT-MACH correlation filter

    Get PDF
    Accurately tracking moving targets in a complex scene involving moving cameras, occlusions, and targets embedded in noise is a very active research area in computer vision. In this paper, an optimal trade-off maximum average correlation height (OT-MACH) filter has been designed and implemented as a robust tracker. The algorithm allows different objects to be selected as the target, based on the operator's requirements, and the user interface allows a different target to be selected for tracking at any time. The filter is updated at a frequency selected by the user, which makes it more resistant to progressive changes in the object's orientation and scale. The tracker has been tested on both visible-band colour and infra-red video sequences acquired from the air by the Sussex County police helicopter. Initial testing has demonstrated the filter's ability to maintain a stable track on vehicles despite changes of scale, orientation, and lighting, and to re-acquire the track after short losses caused by the vehicle passing behind occlusions.
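    For reference, the core of an OT-MACH filter can be sketched in a few lines of NumPy. The alpha/beta/gamma weights, the flat white-noise model, and the assumption that the search window matches the training-chip size are illustrative choices, not the parameters used in the paper:

```python
# Sketch of an OT-MACH correlation filter built from a few training image chips
# and applied to a search window of the same size. Illustrative parameters only.
import numpy as np

def otmach_filter(train_chips, alpha=0.1, beta=0.1, gamma=0.8, eps=1e-8):
    """Build an OT-MACH filter in the frequency domain from same-sized training chips."""
    X = np.stack([np.fft.fft2(c) for c in train_chips])    # spectrum of each chip
    mean_x = X.mean(axis=0)                                 # average spectrum m_x
    D = (np.abs(X) ** 2).mean(axis=0)                       # average power spectral density
    S = (np.abs(X - mean_x) ** 2).mean(axis=0)              # similarity (spread about the mean)
    C = np.ones_like(D)                                     # flat white-noise model
    return mean_x / (alpha * C + beta * D + gamma * S + eps)

def track_peak(window, H):
    """Correlate a search window (same size as the training chips) with the filter."""
    corr = np.real(np.fft.ifft2(np.fft.fft2(window) * np.conj(H)))
    return np.unravel_index(np.argmax(corr), corr.shape)   # (row, col) of the correlation peak
```

    Periodically rebuilding the filter from recent target chips, as the abstract describes, would amount to calling otmach_filter again with an updated training set.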

    Automated Video Analysis of Animal Movements Using Gabor Orientation Filters

    Get PDF
    To quantify locomotory behavior, tools for determining the location and shape of an animal's body are a first requirement. Video recording is a convenient technology for storing raw movement data, but extracting body coordinates from video recordings is a nontrivial task. The algorithm described in this paper solves this task for videos of leeches or other quasi-linear animals in a manner inspired by the mammalian visual processing system: the video frames are fed through a bank of Gabor filters, which locally detect segments of the animal at a particular orientation. The algorithm assumes that the image location with maximal filter output lies on the animal's body and traces the body's shape out in both directions from there. The algorithm successfully extracted location and shape information from video clips of swimming leeches, as well as from still photographs of swimming and crawling snakes. A MATLAB implementation with a graphical user interface is available online and should make this algorithm conveniently usable in many other contexts.
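    A minimal sketch of the first stage, building a Gabor filter bank and locating the maximal response as a seed point on the body, might look like the following; the kernel size and Gabor parameters are illustrative, and the tracing step that follows the body out from the seed is omitted:

```python
# Sketch: Gabor filter bank over several orientations, then take the location
# and orientation of the strongest response as a seed point on the animal.
import cv2
import numpy as np

def gabor_bank(ksize=31, sigma=4.0, lambd=10.0, gamma=0.5, n_orientations=8):
    """Build Gabor kernels spanning orientations from 0 to pi."""
    thetas = np.linspace(0, np.pi, n_orientations, endpoint=False)
    return [(t, cv2.getGaborKernel((ksize, ksize), sigma, t, lambd, gamma)) for t in thetas]

def strongest_response(frame_gray, bank):
    """Return (row, col, theta) of the maximal filter output over the whole bank."""
    best_value, best_seed = -np.inf, None
    img = frame_gray.astype(np.float32)
    for theta, kernel in bank:
        response = cv2.filter2D(img, cv2.CV_32F, kernel)
        idx = np.unravel_index(np.argmax(response), response.shape)
        if response[idx] > best_value:
            best_value, best_seed = response[idx], (idx[0], idx[1], theta)
    return best_seed
```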

    Automated Markerless Extraction of Walking People Using Deformable Contour Models

    No full text
    We develop a new automated markerless motion capture system for the analysis of walking people. We employ global evidence-gathering techniques guided by biomechanical analysis to robustly extract articulated motion. This forms the basis for new deformable contour models that use local image cues to capture shape and motion at a more detailed level. We extend the greedy snake formulation to include temporal constraints and occlusion modelling, increasing the capability of this technique when dealing with cluttered and self-occluding extraction targets. This approach is evaluated on a large database of indoor and outdoor video data, demonstrating fast and autonomous motion capture for walking people.
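    As an illustration of the general greedy-snake idea with an added temporal term (not the paper's exact formulation or its occlusion model), one update step could be sketched as:

```python
# Sketch of one greedy-snake iteration in the style of Williams & Shah, with a
# simple temporal term that penalises moving far from the previous frame's
# contour. Weights, neighbourhood size, and the temporal energy are assumptions.
import numpy as np

def greedy_snake_step(points, prev_points, edge_map, alpha=1.0, beta=1.0,
                      gamma=1.2, delta=0.5, radius=2):
    """points: (N, 2) int array of (row, col) snake points; edge_map: high where edges are."""
    n = len(points)
    spacings = np.linalg.norm(np.diff(points, axis=0, append=points[:1]), axis=1)
    mean_spacing = spacings.mean()
    new_points = points.copy()
    for i in range(n):
        prev_pt, next_pt = new_points[i - 1], points[(i + 1) % n]
        best_e, best_pos = np.inf, points[i]
        for dr in range(-radius, radius + 1):
            for dc in range(-radius, radius + 1):
                cand = points[i] + np.array([dr, dc])
                r, c = int(cand[0]), int(cand[1])
                if not (0 <= r < edge_map.shape[0] and 0 <= c < edge_map.shape[1]):
                    continue
                e_cont = (np.linalg.norm(cand - prev_pt) - mean_spacing) ** 2  # even spacing
                e_curv = np.linalg.norm(prev_pt - 2 * cand + next_pt) ** 2     # smoothness
                e_img = -edge_map[r, c]                                        # attraction to edges
                e_temp = np.linalg.norm(cand - prev_points[i]) ** 2            # temporal coherence
                e = alpha * e_cont + beta * e_curv + gamma * e_img + delta * e_temp
                if e < best_e:
                    best_e, best_pos = e, cand
        new_points[i] = best_pos
    return new_points
```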

    Spotlight-8 Image Analysis Software

    Get PDF
    Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can analyze a single image in an interactive mode or a sequence of images in an automated fashion. Image processing operations can be applied to enhance each image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight runs on Microsoft Windows, Linux, and Apple OS X platforms.
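    The batch workflow the abstract describes (enhance each image, measure areas of interest, write statistics to a text file) can be sketched roughly as follows; this is not Spotlight code, and the file handling, the enhancement step, and the chosen statistics are assumptions:

```python
# Sketch of a batch image-analysis loop: enhance each frame, measure fixed
# regions of interest, and append per-frame statistics to a tab-delimited file.
import csv
import glob
import cv2

def analyze_sequence(pattern, rois, out_path="results.txt"):
    """rois: list of (x, y, w, h) areas of interest measured in every frame."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        writer.writerow(["frame", "roi", "mean", "min", "max"])
        for path in sorted(glob.glob(pattern)):
            img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            if img is None:
                continue
            img = cv2.equalizeHist(img)                    # simple enhancement step
            for i, (x, y, w, h) in enumerate(rois):
                patch = img[y:y + h, x:x + w]
                writer.writerow([path, i, float(patch.mean()),
                                 int(patch.min()), int(patch.max())])
```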

    FlyLimbTracker: An active contour based approach for leg segment tracking in unmarked, freely behaving Drosophila.

    Get PDF
    Understanding the biological underpinnings of movement and action requires the development of tools for quantitative measurements of animal behavior. Drosophila melanogaster provides an ideal model for developing such tools: the fly has unparalleled genetic accessibility and depends on a relatively compact nervous system to generate sophisticated limbed behaviors including walking, reaching, grooming, courtship, and boxing. Here we describe a method that uses active contours to semi-automatically track body and leg segments from video image sequences of unmarked, freely behaving D. melanogaster. We show that this approach yields a more than 6-fold reduction in user intervention compared with fully manual annotation and can be used to annotate videos with low spatial or temporal resolution for a variety of locomotor and grooming behaviors. FlyLimbTracker, the software implementation of this method, is open source, and our approach is generalizable. This opens up the possibility of tracking leg movements in other species by modifying the underlying active contour models.
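    As a simple illustration of the active-contour technique (not FlyLimbTracker's coupled body-and-leg models), a single closed snake could be fitted with scikit-image as follows; the initial circle, the smoothing step, and the snake weights are illustrative assumptions:

```python
# Sketch of active-contour (snake) fitting with scikit-image: place a circular
# initial contour on the body and let it relax onto the image.
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

def fit_body_contour(frame_gray, center, radius, n_points=200):
    """Fit a closed snake around an initial circle placed at (row, col) = center."""
    s = np.linspace(0, 2 * np.pi, n_points)
    init = np.column_stack([center[0] + radius * np.sin(s),    # rows
                            center[1] + radius * np.cos(s)])   # cols
    smoothed = gaussian(frame_gray, sigma=2, preserve_range=True)
    return active_contour(smoothed, init, alpha=0.015, beta=10.0, gamma=0.001)
```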