
    Bags of Affine Subspaces for Robust Object Tracking

    We propose an adaptive tracking algorithm where the object is modelled as a continuously updated bag of affine subspaces, with each subspace constructed from the object's appearance over several consecutive frames. In contrast to linear subspaces, affine subspaces explicitly model the origin of subspaces. Furthermore, instead of using a brittle point-to-subspace distance during the search for the object in a new frame, we propose to use a subspace-to-subspace distance by representing candidate image areas also as affine subspaces. Distances between subspaces are then obtained by exploiting the non-Euclidean geometry of Grassmann manifolds. Experiments on challenging videos (containing object occlusions, deformations, as well as variations in pose and illumination) indicate that the proposed method achieves higher tracking accuracy than several recent discriminative trackers.
    Comment: in International Conference on Digital Image Computing: Techniques and Applications, 201
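The abstract does not spell out the Grassmann distance it uses; a common construction for distances between linear subspaces (a sketch assuming orthonormal bases, not necessarily the authors' exact affine formulation) is the geodesic distance built from principal angles:

```python
import numpy as np

def grassmann_distance(A, B):
    """Geodesic distance between two equal-dimension subspaces on the
    Grassmann manifold.

    A, B: matrices whose columns are orthonormal bases of the subspaces
    (e.g. the Q factor from np.linalg.qr). The principal angles are the
    arccosines of the singular values of A^T B; the geodesic distance is
    the Euclidean norm of the angle vector.
    """
    s = np.linalg.svd(A.T @ B, compute_uv=False)
    thetas = np.arccos(np.clip(s, -1.0, 1.0))  # principal angles
    return np.linalg.norm(thetas)

# Example: a 2-D subspace of R^3 compared with itself -> distance ~0
A = np.linalg.qr(np.random.randn(3, 2))[0]
print(grassmann_distance(A, A))
```

Comparing candidate image regions then reduces to evaluating this distance between the bag's subspaces and a subspace fitted to the candidate.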

    Describing Change in Visitors and Visits to the “Bob”

    Understanding wilderness use and users is essential to wilderness management. However, there have only been a limited number of studies specifically designed to detect changes in use and user characteristics across time. Recreation use of the U.S. National Wilderness Preservation System (NWPS) has increased since its creation in 1964, along with many other changes in influences on society’s relationship with wilderness. This article describes a series of visitor trend studies at the Bob Marshall Wilderness Complex in Montana, and identifies some of the challenges encountered in estimating long-term use and user trends.

    Improved vision-based weed classification for robotic weeding – a method for increasing speed while retaining accuracy

    In this paper, we demonstrate how a deep convolutional neural network (DCNN) can be deployed in resource-limited environments, such as robots, to reduce the inference time by more than an order of magnitude while retaining high classification accuracy and robustness to novel conditions. This is achieved by training a lightweight DCNN, or compressed model, via model distillation. We show that models trained using this approach outperform a similar model trained from scratch on the same data for weed classification. Using model distillation we are able to improve the accuracy from 97.1% to 97.9% for conditions similar to the training data, and from 86.4% to 89.8% for conditions different from the training data. This is in comparison to a traditional approach using robust local binary pattern features, which achieves 87.7% for classifying in similar conditions and 83.9% for classifying in different conditions. Finally, we compare this compressed model to a complex fine-tuned model which achieves higher accuracy of 99.6% for the same conditions and 95.8% for different conditions, but has 100.0 times more parameters (larger model size) and is 40.6 times slower at computing the inference.
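Model distillation, as used above, trains the compressed model to match the teacher's temperature-softened output distribution. A minimal sketch of the core loss term (a generic Hinton-style formulation, not necessarily the paper's exact training objective):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; T > 1 softens the distribution."""
    z = np.asarray(z, dtype=float) / T
    z -= z.max()                      # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between the softened teacher and student
    distributions -- the core term of model distillation. Softening
    exposes the relative probabilities the teacher assigns to the
    wrong classes, which the lightweight student then learns to mimic.
    """
    p = softmax(teacher_logits, T)    # soft targets from the teacher
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))))
```

In practice this term is typically combined with a standard cross-entropy loss on the hard labels, weighted against each other.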

    Wildland Fire Effects on Visits and Visitors to the Bob Marshall Wilderness Complex

    Wildland fire can affect wilderness visits and scientific efforts to understand visitor relationships with wilderness places. Large-scale and long-lasting fires occurred in the Bob Marshall Wilderness Complex, Montana, in 2003. A study of visitors that year to monitor long-term trends in visit and visitor characteristics was repeated in 2004 to fully understand how the 2003 fires affected trend analysis. This article considers the question of how wildland fire changes the relationship people have with wilderness, particularly related to their visits and visitor attitudes toward fire management.

    Lessons learnt from field trials of a robotic sweet pepper harvester for protected cropping systems

    In this paper, we present the lessons learnt during the development of a new robotic harvester (Harvey) that can autonomously harvest sweet pepper (capsicum) in protected cropping environments. Robotic harvesting offers an attractive potential solution to reducing labour costs while enabling more regular and selective harvesting, optimising crop quality, scheduling and therefore profit. Our approach combines effective vision algorithms with a novel end-effector design to enable successful harvesting of sweet peppers. We demonstrate a simple and effective vision-based algorithm for crop detection, a grasp selection method, and a novel end-effector design for harvesting. To reduce the complexity of motion planning and to minimise occlusions, we focus on picking sweet peppers in a protected cropping environment where plants are grown on planar trellis structures. Initial field trials in protected cropping environments, with two cultivars, demonstrate the efficacy of this approach. The results show that the robot harvester can successfully detect, grasp, and detach crop from the plant within a real protected cropping system. The novel contributions of this work have resulted in significant and encouraging improvements in sweet pepper picking success rates compared with the state-of-the-art. Future work will look at detecting sweet pepper peduncles and improving the total harvesting cycle time for each sweet pepper. The methods presented in this paper provide steps towards the goal of fully autonomous and reliable crop picking systems that will revolutionise the horticulture industry by reducing labour costs, maximising the quality of produce, and ultimately improving the sustainability of farming enterprises.

    Protected Area Planning Principles and Strategies

    In this chapter, the challenges of protected area planning are explored by addressing the latter question. The chapter focuses on maintaining protected area values in the face of increasing recreational pressure, although these general concepts and principles can be applied to other threats as well (Machlis and Tichnell 1985). First, the social and political contexts within which such planning occurs are outlined. It is to these complex contexts that an interactive, collaborative-learning based planning process would seem most appropriate. Next, an overview of eleven principles of visitor management is presented. These principles must be acknowledged and incorporated in any protected area planning system. Following this section, the conditions needed to implement a carrying capacity approach are reviewed; these requisite conditions lead us to conclude that, despite a resurgence of interest, the carrying capacity model does not adequately address the needs of protected area management. The final section briefly outlines the Limits of Acceptable Change planning system, an example of an approach that can incorporate the eleven previously described principles and has a demonstrated capacity to respond to the needs of protected area managers. The ideas in this chapter have been variously presented in Malaysia, Venezuela, Canada, and Puerto Rico (McCool 1996, McCool and Stankey 1992, Stankey and McCool 1993) and have benefited from the positive interactions and feedback received from protected area managers in those countries.

    Robotic weeding – from concept to trials

    This paper reports on the use of robotic selective mechanical cultivation as an alternative method to herbicide control for managing weed species in zero-till cropping systems. Existing best-practice technology in weed spot spraying utilises infrared technology to detect and selectively spray weeds using herbicide at quantities significantly less than those used in normal blanket spray applications. This reduction in herbicide decreases operational costs and can be beneficial for the environment; however, the capital investment in the technology is substantial for farmers who wish to own and operate their equipment. While effective in reducing overall herbicide usage, the technology has done little to tackle the rapid evolution of herbicide-resistant weed species. As a potential solution to this issue, our research over the past three years has focused on the development of non-chemical methods of weed management utilising robot-enabled selective mechanical weeding. Used in conjunction with a robotic vehicle platform, a mechanical weeding array is capable of working throughout the day and night. The weeding tools have been designed to be removable and interchangeable, allowing the use of tools especially designed for different weed species, weed densities, and soil types. The system developed consists of a one-degree-of-freedom array of weeding tines, actuated into the ground in time to remove individual weeds. Sensing of the weeds is enabled by a vision-based plant detection and classification system, while the timing for the implement actuation to hit the weed is determined as a function of the robot speed. The field trials reported in this paper demonstrate the potential of this robotic system for individualised weed treatment and multi-mode weed management methods.
    In particular, a trial of the mechanical weeding array in a fallow field over six weeks kept weed coverage in robot-treated sections to 1.5%, compared to 37% in the control areas not treated by the robot: a reduction in excess of 90% in weed coverage.
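The abstract states only that tine actuation timing is determined as a function of robot speed; a minimal sketch of such a timing calculation (the function interface and the latency figure are hypothetical illustrations, not values from the paper):

```python
def actuation_delay(weed_offset_m, robot_speed_mps, tine_latency_s=0.05):
    """Time to wait before firing a weeding tine so that it strikes a
    weed detected `weed_offset_m` ahead of the tine at the current
    forward speed. `tine_latency_s` (a hypothetical value) models the
    mechanical delay between the fire command and ground contact.
    """
    if robot_speed_mps <= 0:
        raise ValueError("robot must be moving forward")
    delay = weed_offset_m / robot_speed_mps - tine_latency_s
    return max(delay, 0.0)  # never wait a negative amount of time

# A weed 0.5 m ahead at 1 m/s: fire after 0.45 s
print(actuation_delay(0.5, 1.0))
```

Faster travel shortens the available window, which is why detection must run well ahead of the tine array.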

    Extracardiac 18F-florbetapir imaging in patients with systemic amyloidosis: more than hearts and minds

    PURPOSE: 18F-Florbetapir has been reported to show cardiac uptake in patients with systemic light-chain amyloidosis (AL). This study systematically assessed uptake of 18F-florbetapir in patients with proven systemic amyloidosis at sites outside the heart. METHODS: Seventeen patients with proven cardiac amyloidosis underwent 18F-florbetapir PET/CT imaging, 15 with AL and 2 with transthyretin amyloidosis (ATTR). Three patients had repeat scans. All patients had protocolized assessment at the UK National Amyloidosis Centre including imaging with 123I-serum amyloid P component (SAP). 18F-Florbetapir images were assessed for areas of increased tracer accumulation, and time-uptake curves in terms of standardized uptake values (SUVmean) were produced. RESULTS: All 17 patients showed 18F-florbetapir uptake at one or more extracardiac sites. Uptake was seen in the spleen in 6 patients (35%; 6 of 9, 67%, with splenic involvement on 123I-SAP scintigraphy), in the fat in 11 (65%), in the tongue in 8 (47%), in the parotids in 8 (47%), in the masticatory muscles in 7 (41%), in the lungs in 3 (18%), and in the kidneys in 2 (12%) on the late half-body images. The 18F-florbetapir spleen retention index (SRI) was calculated. An SRI >0.045 had 100% sensitivity and specificity (in relation to 123I-SAP splenic uptake, the current standard) in detecting splenic amyloid on dynamic imaging, and a sensitivity of 66.7% and a specificity of 100% on the late half-body images. Intense lung uptake was seen in three patients, one of whom had lung interstitial infiltration suggestive of amyloid deposition on previous high-resolution CT. Repeat imaging showed a stable appearance in all three patients, suggesting no early impact of treatment response. CONCLUSION: 18F-Florbetapir PET/CT is a promising tool for the detection of extracardiac sites of amyloid deposition.
    The combination of uptake in the heart and uptake in the spleen on 18F-florbetapir PET/CT, a hallmark of AL, suggests that this tracer holds promise as a screening tool for AL.
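For context, the SUVmean values reported above are averages of the body-weight-normalised standardized uptake value, a standard PET quantity; a minimal sketch of the generic formula (not the study's specific analysis pipeline):

```python
def suv(activity_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Body-weight-normalised standardized uptake value (SUV).

    SUV = tissue activity concentration / (injected dose / body weight).
    With concentration in kBq/mL, dose in MBq, and weight in kg, the
    ratio is dimensionless because MBq/kg equals kBq/g and tissue
    density is assumed to be ~1 g/mL. SUV ~ 1 means uptake equal to a
    uniform whole-body distribution of the tracer.
    """
    return activity_kbq_per_ml / (injected_dose_mbq / body_weight_kg)

# 5 kBq/mL after a 370 MBq dose in a 74 kg patient -> SUV = 1.0
print(suv(5.0, 370.0, 74.0))
```

SUVmean is then this quantity averaged over a region of interest, and the time-uptake curves above track it across dynamic frames.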

    A well-separated pairs decomposition algorithm for k-d trees implemented on multi-core architectures

    Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
    Variations of k-d trees represent a fundamental data structure used in computational geometry, with numerous applications in science: for example, particle track fitting in the software of the LHC experiments, and simulations of N-body systems in the study of the dynamics of interacting galaxies, particle beam physics, and molecular dynamics in biochemistry. The many-body tree methods devised by Barnes and Hut in the 1980s and the Fast Multipole Method introduced in 1987 by Greengard and Rokhlin use variants of k-d trees to reduce the computation time upper bounds from O(n²) to O(n log n) and even O(n). We present an algorithm that uses the principle of well-separated pairs decomposition to always produce compressed trees in O(n log n) work. We present and evaluate parallel implementations of the algorithm that can take advantage of multi-core architectures. Funding: The Science and Technology Facilities Council, UK.
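The well-separated pairs decomposition hinges on a separation predicate between point sets. A minimal sketch of that predicate (using centroid-centred bounding balls as a simplification of the usual minimum-enclosing-ball definition, not the paper's algorithm):

```python
import numpy as np

def well_separated(P, Q, s=2.0):
    """Check the separation condition behind the well-separated pairs
    decomposition: point sets P and Q are s-well-separated if each fits
    in a ball of radius r and the gap between the two balls is at least
    s * r, where r is the larger of the two radii. Here the balls are
    centred at the centroids, a simple over-approximation.
    """
    P, Q = np.asarray(P, dtype=float), np.asarray(Q, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    rp = np.max(np.linalg.norm(P - cp, axis=1))
    rq = np.max(np.linalg.norm(Q - cq, axis=1))
    r = max(rp, rq)
    gap = np.linalg.norm(cp - cq) - rp - rq  # distance between ball surfaces
    return bool(gap >= s * r)
```

A WSPD partitions all point-pair interactions into O(n) such well-separated set pairs, which is what lets tree codes approximate pairwise interactions in subquadratic time.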