    Smart environment monitoring through micro unmanned aerial vehicles

    In recent years, improvements in small-scale Unmanned Aerial Vehicles (UAVs) in terms of flight time, automatic control, and remote transmission have promoted the development of a wide range of practical applications. In aerial video surveillance, monitoring broad areas still poses many challenges, because several tasks, including mosaicking, change detection, and object detection, must be accomplished in real time. In this thesis, a small-scale UAV-based vision system for maintaining regular surveillance over target areas is proposed. The system works in two modes. The first mode monitors an area of interest over several flights. During the first flight, it creates an incremental geo-referenced mosaic of the area of interest and classifies all the known elements (e.g., persons) found on the ground with a previously trained, improved Faster R-CNN architecture. In subsequent reconnaissance flights, the system searches for any changes (e.g., the disappearance of persons) that may have occurred in the mosaic using an algorithm based on histogram equalization and RGB Local Binary Patterns (RGB-LBP); if changes are present, the mosaic is updated. The second mode performs real-time classification, again using our improved Faster R-CNN model, which is useful for time-critical operations. Thanks to several design features, the system works in real time and performs mosaicking and change detection at low altitude, thus allowing the classification of even small objects. The proposed system was tested on the whole set of challenging video sequences in the UAV Mosaicking and Change Detection (UMCD) dataset and on other public datasets. Evaluation with well-known performance metrics shows remarkable results in mosaic creation and updating, as well as in change detection and object detection.
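
    As a rough illustration of the change-detection idea described above (histogram equalization plus RGB-LBP texture comparison between a reference mosaic patch and a newly acquired patch), a minimal Python sketch follows; the LBP settings and decision threshold are illustrative assumptions, not the thesis's actual parameters.

```python
# Minimal sketch of histogram-equalization + RGB-LBP change scoring between a
# reference mosaic patch and a new frame patch. Not the thesis's exact pipeline;
# LBP parameters and the decision threshold are illustrative assumptions.
import numpy as np
import cv2
from skimage.feature import local_binary_pattern

def rgb_lbp_histogram(patch_bgr, points=8, radius=1):
    """Equalize each colour channel, compute its LBP, and concatenate the histograms."""
    hists = []
    n_bins = points + 2  # 'uniform' LBP yields points + 2 distinct codes
    for c in range(3):
        channel = cv2.equalizeHist(np.ascontiguousarray(patch_bgr[:, :, c]))
        lbp = local_binary_pattern(channel, points, radius, method="uniform")
        hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
        hists.append(hist)
    return np.concatenate(hists)

def has_changed(ref_patch, new_patch, threshold=0.25):
    """Flag a change when the chi-square distance between LBP histograms is large."""
    h1 = rgb_lbp_histogram(ref_patch)
    h2 = rgb_lbp_histogram(new_patch)
    chi2 = 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + 1e-10))
    return chi2 > threshold
```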

    A Smart Management System For Garbage Classification Using Deep Learning

    Thanks to the development of artificial intelligence (AI), outdated trash systems can now offer better real-time monitoring and enable better waste management. The purpose of this paper is to develop a smart garbage management system using a TensorFlow-based deep learning model that recognizes and categorizes items in real time. Metal, plastic, and paper waste are separated from other sorts of trash into the bin's several compartments. Object detection and garbage classification are carried out using the TensorFlow framework and a trained object recognition model. This trash detection model is trained on garbage photographs in order to create a frozen inference graph that can be used to recognize objects with a camera.
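
    The paper relies on a frozen inference graph; the sketch below shows, assuming the graph was exported with the TensorFlow Object Detection API (whose standard tensor names are used), how such a graph is typically loaded and run on camera frames. The graph path, label map, and confidence threshold are hypothetical.

```python
# Hedged sketch: run a frozen TensorFlow Object Detection API graph on webcam
# frames and print per-item class labels. Paths and the label map are hypothetical.
import cv2
import tensorflow as tf

GRAPH_PATH = "frozen_inference_graph.pb"               # assumed export location
CLASS_NAMES = {1: "metal", 2: "plastic", 3: "paper"}   # illustrative label map

graph = tf.Graph()
with graph.as_default():
    graph_def = tf.compat.v1.GraphDef()
    with tf.io.gfile.GFile(GRAPH_PATH, "rb") as f:
        graph_def.ParseFromString(f.read())
    tf.compat.v1.import_graph_def(graph_def, name="")

cap = cv2.VideoCapture(0)
with tf.compat.v1.Session(graph=graph) as sess:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        boxes, scores, classes = sess.run(
            ["detection_boxes:0", "detection_scores:0", "detection_classes:0"],
            feed_dict={"image_tensor:0": rgb[None, ...]},
        )
        for score, cls in zip(scores[0], classes[0]):
            if score > 0.5:  # illustrative confidence threshold
                label = CLASS_NAMES.get(int(cls), "other")
                print(f"{label}: {score:.2f}")  # route the item to the matching bin here
cap.release()
```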

    Reactive direction control for a mobile robot: A locust-like control of escape direction emerges when a bilateral pair of model locust visual neurons are integrated

    Locusts possess a bilateral pair of uniquely identifiable visual neurons that respond vigorously to the image of an approaching object. These neurons are called the lobula giant movement detectors (LGMDs). The locust LGMDs have been extensively studied, and this has led to the development of an LGMD model for use as an artificial collision detector in robotic applications. To date, robots have been equipped with only a single, central artificial LGMD sensor, which triggers a non-directional stop or rotation when a potentially colliding object is detected. Clearly, for a robot to behave autonomously, it must react differently to stimuli approaching from different directions. In this study, we implement a bilateral pair of LGMD models in Khepera robots equipped with normal and panoramic cameras. We integrate the responses of these LGMD models using methodologies inspired by research on escape direction control in cockroaches. Using ‘randomised winner-take-all’ or ‘steering wheel’ algorithms for LGMD model integration, the Khepera robots could escape an approaching threat in real time and with a distribution of escape directions similar to that of real locusts. We also found that, by optimising these algorithms, we could use them to integrate the left and right DCMD responses of real jumping locusts offline and reproduce the actual escape directions that the locusts took in a particular trial. Our results significantly advance the development of an artificial collision detection and evasion system based on the locust LGMD by giving it reactive control over robot behaviour. The success of this approach may also indicate some important areas to be pursued in future biological research.
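
    A minimal sketch of a ‘randomised winner-take-all’ style integration of a bilateral pair of LGMD responses is given below; the winner probability and turn-angle range are illustrative assumptions rather than the fitted values used in the study.

```python
# Hedged sketch of a 'randomised winner-take-all' scheme: steer mostly away from
# the more strongly stimulated side, with occasional random reversals so the
# escape-direction distribution is not deterministic. Parameters are illustrative.
import random

def escape_direction(lgmd_left, lgmd_right, p_winner=0.8):
    """Return a heading change in degrees (negative = turn left, positive = turn right)."""
    threat_on_left = lgmd_left > lgmd_right
    escape_right = threat_on_left          # usually escape away from the threatened side ...
    if random.random() > p_winner:
        escape_right = not escape_right    # ... but sometimes pick the other side
    angle = random.uniform(30.0, 120.0)    # assumed range of turn amplitudes
    return angle if escape_right else -angle

# Example: a strong looming response on the left camera yields mostly rightward escapes.
print(escape_direction(lgmd_left=0.9, lgmd_right=0.2))
```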

    Development and Validation of on-board systems control laws

    Purpose - The purpose of this paper is to describe the tool and procedure developed to design the control laws of several UAV (Unmanned Aerial Vehicle) sub-systems. The authors designed and developed the logics governing the landing gear, nose wheel steering, wheel braking, and fuel system. Design/methodology/approach - The procedure is based on a general-purpose, object-oriented simulation tool. The development method follows three steps: the main structure of the control laws is defined through flow charts; the logics are then ported to the ANSI-C programming language; finally, the code is implemented inside the status model. The status model is a Matlab-Simulink model that uses an embedded Matlab function to model the FCC (Flight Control Computer). The core block is linked with the components but cannot access their internal models. Interfaces between the FCCs and the system components in the model reflect those of the real system. Findings - The user verifies the systems' reactions in real time through the status model. Using a block-oriented approach, the development of the control laws and the integration of several systems is faster. Practical implications - The tool aims to test and validate the control laws dynamically, helping specialists to find faulty logics or undesired responses during pre-design. Originality/value - The development team can test and verify the control laws in various failure scenarios. The tool allows more reliable and effective logics to be produced, which can be used directly on the system.
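
    For illustration only, the sketch below shows the kind of flow-chart-derived sequencing logic the status model hosts, using a hypothetical landing-gear rule set; the paper implements such logics in ANSI-C inside a Simulink embedded Matlab function, whereas this is a plain Python rendering of the same read-status/issue-command pattern, in which the FCC sees components only through their interfaces.

```python
# Illustration only: a hypothetical landing-gear sequencing rule of the
# flow-chart -> code kind described above. The FCC reads component statuses
# through an interface and issues commands without touching internal models.
from dataclasses import dataclass

@dataclass
class GearStatus:                  # values the FCC can read through its interface
    weight_on_wheels: bool
    gear_down_locked: bool
    gear_up_locked: bool

def gear_command(lever_down: bool, airspeed_kt: float, status: GearStatus) -> str:
    """Return 'EXTEND', 'RETRACT', or 'HOLD' for the landing-gear actuator."""
    MAX_OPERATING_SPEED = 200.0            # illustrative placard speed
    if airspeed_kt > MAX_OPERATING_SPEED:
        return "HOLD"                      # inhibit gear operation at high speed
    if lever_down and not status.gear_down_locked:
        return "EXTEND"
    if not lever_down and not status.gear_up_locked and not status.weight_on_wheels:
        return "RETRACT"                   # never retract with weight on wheels
    return "HOLD"

print(gear_command(lever_down=False, airspeed_kt=150.0,
                   status=GearStatus(False, True, False)))
```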

    Computer vision application proposal for smart inventory systems in convenience store reach-in refrigerators

    Inventory systems in reach-in refrigerators rely on manual or outdated smart-inventory methods; although these can be efficient, newer methods such as computer vision could provide better results in less time and with less human intervention. This work proposes a computer vision system to acquire an inventory of the products placed in reach-in convenience store refrigerators. A comparison of different computer vision object recognition models was performed to select the most appropriate model for the application. Based on the model characteristics and the application requirements, a YOLOv4 object recognition model was selected, along with a two-axis camera-positioning rig that captures a live video feed of the products to be counted for the inventory. Future work could include a full-size prototype and further development into a commercial product.
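
    A hedged sketch of the counting step is shown below: a YOLOv4 network is run through OpenCV's DNN module on a frame from the rig camera and the detections are tallied per class. The configuration file, weights, and class names are hypothetical placeholders, not the authors' trained model.

```python
# Hedged sketch of counting refrigerator products per class with a YOLOv4 model
# through OpenCV's DNN module. File names and the class list are hypothetical.
from collections import Counter
import numpy as np
import cv2

net = cv2.dnn.readNetFromDarknet("yolov4-products.cfg", "yolov4-products.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

CLASS_NAMES = ["soda_can", "water_bottle", "juice_box"]  # illustrative classes

def inventory_from_frame(frame_bgr, conf_threshold=0.5, nms_threshold=0.4):
    """Return a per-class count of detected products in one camera frame."""
    classes, scores, boxes = model.detect(frame_bgr, conf_threshold, nms_threshold)
    counts = Counter()
    for cls_id in np.asarray(classes).reshape(-1):
        counts[CLASS_NAMES[int(cls_id)]] += 1
    return counts

cap = cv2.VideoCapture(0)            # live feed from the positioning-rig camera
ok, frame = cap.read()
if ok:
    print(inventory_from_frame(frame))
cap.release()
```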

    ATOM: an object-based formal method for real-time systems

    An object-based formal method for the development of real-time systems, called ATOM, is presented. The method is an integration of the real-time formal technique TAM (Temporal Agent Model) with an industry-strength structured methodology known as HRT-HOOD. ATOM is a systematic formal approach based on the refinement calculus. Within ATOM, a formal specification (or abstract description statement) contains an Interval Temporal Logic (ITL) description of the timing, functional, and communication behavior of the proposed real-time system. This formal specification can be analyzed and then refined into concrete statements through successive applications of sound refinement laws. Abstract and concrete statements are allowed to intermix freely. The semantics of the concrete statements in ATOM are defined denotationally, in a specification-oriented style, using ITL. Funding was received from the UK Engineering and Physical Sciences Research Council (EPSRC) through Research Grant GR/M/0258.
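
    As a generic, illustrative flavour of such a specification (not an example taken from the paper), an ITL-style abstract statement might bound the interval length and constrain the final state, and then be refined into a concrete assignment:

```latex
% Illustrative ITL-flavoured sketch, not taken from the ATOM paper: an abstract
% statement that must terminate within T time units with out = f(in), refined
% by a concrete assignment meeting that deadline.
\[
  \mathit{SPEC} \;\widehat{=}\; (\mathit{len} \le T) \,\wedge\, \mathit{fin}\bigl(\mathit{out} = f(\mathit{in})\bigr)
\]
\[
  \mathit{SPEC} \;\sqsubseteq\; \bigl(\mathit{out} := f(\mathit{in})\bigr)
\]
```

    Here len denotes the interval length, fin w asserts that w holds in the final state, and the refinement relation states that the concrete assignment implements the abstract statement, provided it completes within T time units.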

    Making intelligent systems team players. A guide to developing intelligent monitoring systems

    This reference guide for developers of intelligent monitoring systems is based on lessons learned by the developers of the DEcision Support SYstem (DESSY), an expert system that monitors Space Shuttle telemetry data in real time. DESSY makes inferences about commands, state transitions, and simple failures. It performs failure detection rather than in-depth failure diagnosis. A listing of rules from DESSY and cue cards from DESSY subsystems are included to give the development community a better understanding of the selected model system. The G-2 programming tool used to develop DESSY provides an object-oriented, rule-based environment, but many of the principles used here can be applied to any type of intelligent monitoring system. The step-by-step instructions and examples given for each stage of development are in G-2 but can be used with other development tools. This guide first defines the authors' concept of real-time monitoring systems, then tells prospective developers how to determine system requirements, how to build the system through a combined design/development process, and how to solve the problems involved in working with real-time data. It explains the relationships among operational prototyping, software evolution, and the user interface. It also explains methods of testing, verification, and validation, and it includes suggestions for preparing reference documentation and training users.
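
    The sketch below suggests, in Python rather than G-2, what DESSY-style monitoring rules look like at their simplest: one rule reports a state transition, another flags a simple failure without attempting deep diagnosis. The telemetry names, limits, and rules are hypothetical.

```python
# Hedged sketch of rule-based, real-time telemetry monitoring in the spirit of a
# DESSY-like system. Telemetry names and rules are hypothetical; DESSY itself was
# built in the G-2 environment, not Python.
def check_valve(telemetry, expected_state):
    """One 'rule': flag a simple failure if a commanded valve never reaches its state."""
    if telemetry["valve_cmd"] == expected_state and telemetry["valve_pos"] != expected_state:
        return f"FAILURE: valve commanded {expected_state} but reports {telemetry['valve_pos']}"
    return None

def detect_transition(prev, curr):
    """Another 'rule': report state transitions rather than diagnosing their cause."""
    if prev["pump_on"] != curr["pump_on"]:
        return f"TRANSITION: pump {'started' if curr['pump_on'] else 'stopped'}"
    return None

prev = {"pump_on": False, "valve_cmd": "OPEN", "valve_pos": "OPEN"}
curr = {"pump_on": True,  "valve_cmd": "OPEN", "valve_pos": "CLOSED"}
for msg in (detect_transition(prev, curr), check_valve(curr, "OPEN")):
    if msg:
        print(msg)
```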

    Real-Time Human Motion Capture Driven by a Wireless Sensor Network

    The motion of a real object model is reconstructed through measurements of the position, direction, and angle of moving objects in 3D space, a process called “motion capture.” With the development of inertial sensing technology, motion capture systems based on inertial sensing have become a research hotspot. However, solving for motion attitude remains a challenge that restricts the rapid development of motion capture systems. In this study, a human motion capture system based on inertial sensors is developed, and real-time control of a human model by a real person's movement is achieved. Based on the features of the human motion capture and reproduction system, a hierarchical modeling approach built on a 3D human body model is proposed. The method collects articular movement data through a miniature sensor network on the basis of rigid-body dynamics, controls the human skeleton model, and reproduces human posture according to the characteristics of human articular movement. Finally, the feasibility of the system is validated by testing its properties through the capture of continuous dynamic movement. Experimental results show that the scheme uses a real-time, sensor-network-driven human skeleton model to reproduce the human motion state accurately. The system also has good application value.
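
    The sketch below illustrates the basic hierarchical-skeleton idea: per-joint rotations, such as those estimated from the inertial sensor network, are propagated from parent to child links by forward kinematics. The two-bone planar arm, bone lengths, and angles are illustrative assumptions only, not the paper's model or sensor-fusion method.

```python
# Hedged sketch of driving a hierarchical skeleton with per-joint rotations such
# as those estimated from inertial sensors. The two-bone arm is illustrative only.
import numpy as np

def rot_z(theta):
    """Rotation about the z-axis (planar example for brevity)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def forward_kinematics(bone_lengths, joint_angles):
    """Propagate rotations parent-to-child and return each joint's world position."""
    position = np.zeros(3)
    rotation = np.eye(3)
    positions = [position.copy()]
    for length, angle in zip(bone_lengths, joint_angles):
        rotation = rotation @ rot_z(angle)                         # accumulate the parent chain
        position = position + rotation @ np.array([length, 0.0, 0.0])
        positions.append(position.copy())
    return positions

# Shoulder and elbow angles (e.g., from IMU orientation estimates) drive the arm.
print(forward_kinematics([0.30, 0.25], [np.deg2rad(45), np.deg2rad(30)]))
```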