10 research outputs found

    Drone-Based Cattle Detection Using Deep Neural Networks

    Cattle are an important part of farming in many countries. In the literature, several attempts have been made to detect farm animals for different applications and purposes. However, these approaches detect animals in images captured from ground level, and most rely on traditional machine learning techniques for automated detection. Drones now make it possible to capture images in challenging environments and to scan large areas in minimal time, enabling many new applications. Because drones are typically flown at high altitude to cover large areas quickly, the captured objects tend to be small, which significantly limits the applicability of traditional machine learning algorithms for object detection. This research proposes a novel methodology to detect cattle in farms established in desert areas using deep neural networks. We propose to detect animals based on a 'group-of-animals' concept and associated features, in which different group sizes and animal density distributions are used. Two state-of-the-art Convolutional Neural Network (CNN) architectures, SSD-500 and YOLO V-3, are configured, trained and used for this purpose, and their performance is compared. The results demonstrate the capability of the two CNN models to detect groups-of-animals; the best performance was obtained with SSD-500, with an F-score of 0.93, accuracy of 0.89 and an mAP of 84.7.
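
    The abstract does not include code; the following is a minimal sketch, assuming the 'group-of-animals' labels are formed by spatially clustering individual animal bounding boxes before training a detector such as SSD-500 or YOLO V-3. The clustering parameters (eps, min_samples) and the helper group_animal_boxes are illustrative assumptions, not taken from the paper.

        # Hypothetical sketch: derive "group-of-animals" boxes from individual
        # animal boxes by spatial clustering, so a CNN detector can be trained
        # on group-level labels instead of very small single-animal objects.
        import numpy as np
        from sklearn.cluster import DBSCAN

        def group_animal_boxes(boxes, eps=80.0, min_samples=2):
            """boxes: (N, 4) array of [x_min, y_min, x_max, y_max] in pixels.
            Returns one enclosing box per spatial cluster of animals."""
            boxes = np.asarray(boxes, dtype=float)
            centres = np.stack([(boxes[:, 0] + boxes[:, 2]) / 2,
                                (boxes[:, 1] + boxes[:, 3]) / 2], axis=1)
            labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(centres)
            groups = []
            for lbl in set(labels):
                if lbl == -1:  # isolated animals keep their own boxes
                    groups.extend(boxes[labels == -1].tolist())
                    continue
                member = boxes[labels == lbl]
                groups.append([member[:, 0].min(), member[:, 1].min(),
                               member[:, 2].max(), member[:, 3].max()])
            return groups

        # Example: three animals close together and one stray animal.
        print(group_animal_boxes([[10, 10, 30, 30], [40, 12, 60, 32],
                                  [25, 50, 45, 70], [400, 400, 420, 420]]))

    The group-level boxes produced this way are larger than single animals, which is the motivation the abstract gives for detecting groups rather than individual animals in high-altitude imagery.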

    A modified genetic algorithm for forecasting fuzzy time series

    Fuzzy time series approaches are used when observations of a time series contain uncertainty; moreover, they do not require the assumptions needed for traditional time series approaches. Generally, fuzzy time series methods consist of three stages: fuzzification, determination of fuzzy relations, and defuzzification. Artificial intelligence algorithms are frequently used in these stages, with genetic algorithms (GAs) being the most popular owing to their rich operators and good performance. However, the mutation operator of a GA may introduce poor solutions into the solution set. Thus, we propose a modified genetic algorithm that finds optimal interval lengths while controlling the effects of the mutation operator. Applying the new approach to real datasets shows superior forecasting performance compared with other techniques.
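
    The paper's algorithm is not reproduced here; the following is a minimal sketch, assuming a Chen-style first-order fuzzy time series with the three stages named above (fuzzification, fuzzy relations, defuzzification), written as the error function a genetic algorithm could minimise while searching for interval boundaries. The boundary values, sample data and the names fts_forecast and fitness are illustrative assumptions, not the authors' implementation.

        # Hypothetical sketch of the three fuzzy time series stages, arranged as
        # the fitness function a GA over interval lengths could call repeatedly.
        import numpy as np

        def fts_forecast(series, boundaries):
            """First-order fuzzy time series forecast.
            series: 1-D observations; boundaries: sorted interior cut points
            splitting the universe of discourse into fuzzy intervals A1..Ak."""
            series = np.asarray(series, dtype=float)
            edges = np.concatenate(([series.min() - 1e-9], np.sort(boundaries),
                                    [series.max() + 1e-9]))
            mids = (edges[:-1] + edges[1:]) / 2                        # interval midpoints
            states = np.searchsorted(edges, series, side="right") - 1  # fuzzification

            # Fuzzy logical relationships: state i -> set of observed successors.
            relations = {}
            for cur, nxt in zip(states[:-1], states[1:]):
                relations.setdefault(cur, set()).add(nxt)

            # Defuzzification: forecast is the mean midpoint of the successor group.
            forecasts = [np.mean([mids[j] for j in relations.get(s, {s})])
                         for s in states[:-1]]
            return np.array(forecasts)          # forecasts for series[1:]

        def fitness(series, boundaries):
            """Mean squared forecast error; a GA would minimise this over boundaries."""
            series = np.asarray(series, dtype=float)
            return float(np.mean((fts_forecast(series, boundaries) - series[1:]) ** 2))

        # Toy usage with made-up enrolment-like data and three interior cut points.
        data = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807, 16919]
        print(fitness(data, boundaries=[14000, 15000, 16000]))

    In this framing, each GA chromosome encodes a candidate set of interval boundaries, and the modified mutation operator described in the abstract would be applied when perturbing those boundaries between generations.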