
    The algebra of observables in Gaußian normal spacetime coordinates

    We discuss the canonical structure of a spacetime version of the radial gauge, i.e. Gaußian normal spacetime coordinates. While it was found for the spatial version of the radial gauge that a "local" algebra of observables can be constructed, it turns out that this is not possible for the spacetime version. The technical reason for this observation is that the new gauge condition needed to upgrade the spatial to a spacetime radial gauge does not Poisson-commute with the previous gauge conditions. It follows that the resulting Dirac bracket is inherently non-local, in the sense that no complete set of observables can be found that is constructed locally and at the same time has local Dirac brackets. A locally constructed observable is here defined as a finite polynomial of the canonical variables at a given physical point specified by the Gaußian normal spacetime coordinates. Comment: 16 pages; discussion of the cosmological constant added; matches published version.
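
    For orientation, the Dirac bracket the abstract refers to has the standard textbook form for a set of second-class constraints χ_i (this is the general definition, not the paper's specific calculation):

    ```latex
    % Dirac bracket built from the Poisson bracket and the
    % constraint matrix C_{ij} = \{\chi_i, \chi_j\}:
    \{A, B\}_D = \{A, B\} - \{A, \chi_i\}\,(C^{-1})^{ij}\,\{\chi_j, B\}
    ```

    When the gauge conditions fail to Poisson-commute with one another, C acquires off-diagonal entries and its inverse generically becomes non-local, which is the mechanism behind the non-locality described above.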

    Real‐time alerts from AI‐enabled camera traps using the Iridium satellite network: A case‐study in Gabon, Central Africa

    Efforts to preserve, protect and restore ecosystems are hindered by long delays between data collection and analysis. Threats to ecosystems can go undetected for years or decades as a result. Real-time data can help solve this issue but significant technical barriers exist. For example, automated camera traps are widely used for ecosystem monitoring but it is challenging to transmit images for real-time analysis where there is no reliable cellular or WiFi connectivity. We modified an off-the-shelf camera trap (Bushnell™) and customised existing open-source hardware to create a ‘smart’ camera trap system. Images captured by the camera trap are instantly labelled by an artificial intelligence model and an ‘alert’ containing the image label and other metadata is then delivered to the end-user within minutes over the Iridium satellite network. We present results from testing in the Netherlands, Europe, and from a pilot test in a closed-canopy forest in Gabon, Central Africa. All reference materials required to build the system are provided in open-source repositories. Results show the system can operate for a minimum of 3 months without intervention when capturing a median of 17.23 images per day. The median time-difference between image capture and receiving an alert was 7.35 min, though some outliers showed delays of 5 days or more when the system was incorrectly positioned and unable to connect to the Iridium network. We anticipate significant developments in this field and hope that the solutions presented here, and the lessons learned, can be used to inform future advances. New artificial intelligence models and the addition of other sensors such as microphones will expand the system's potential for other, real-time use cases, including real-time biodiversity monitoring, wild resource management and detecting illegal human activities in protected areas.
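
    The alert step described above can be sketched as follows. The field names and payload format are illustrative assumptions, not taken from the authors' repositories; the sketch only shows why a compact label-plus-metadata message, rather than the image itself, is what travels over the satellite link (Iridium Short Burst Data messages carry only a few hundred bytes):

    ```python
    import json
    import time

    def make_alert(label: str, confidence: float, camera_id: str,
                   captured_at: float) -> bytes:
        """Pack an alert as a compact JSON payload (hypothetical format).

        Iridium SBD mobile-originated messages are limited to a few hundred
        bytes, so only the AI label and minimal metadata are sent, not the
        image itself.
        """
        alert = {
            "cam": camera_id,             # camera identifier
            "t": int(captured_at),        # Unix timestamp of image capture
            "label": label,               # top-1 class from the on-device model
            "conf": round(confidence, 2)  # model confidence, 2 d.p.
        }
        payload = json.dumps(alert, separators=(",", ":")).encode("utf-8")
        assert len(payload) <= 340, "exceeds typical SBD payload limit"
        return payload

    msg = make_alert("elephant", 0.97, "cam-07", time.time())
    ```

    The payload would then be handed to the satellite modem for transmission and unpacked on the end-user's side.
    
    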

    Robust ecological analysis of camera trap data labelled by a machine learning model

    1. Ecological data are collected over vast geographic areas using digital sensors such as camera traps and bioacoustic recorders. Camera traps have become the standard method for surveying many terrestrial mammals and birds, but camera trap arrays often generate millions of images that are time-consuming to label. This causes significant latency between data collection and subsequent inference, which impedes conservation at a time of ecological crisis. Machine learning algorithms have been developed to improve the speed of labelling camera trap data, but it is uncertain how the outputs of these models can be used in ecological analyses without secondary validation by a human.
    2. Here, we present our approach to developing, testing and applying a machine learning model to camera trap data for the purpose of achieving fully automated ecological analyses. As a case-study, we built a model to classify 26 Central African forest mammal and bird species (or groups). The model generalizes to new spatially and temporally independent data (n = 227 camera stations, n = 23,868 images), and outperforms humans in several respects (e.g. detecting ‘invisible’ animals). We demonstrate how ecologists can evaluate a machine learning model's precision and accuracy in an ecological context by comparing species richness, activity patterns (n = 4 species tested) and occupancy (n = 4 species tested) derived from machine learning labels with the same estimates derived from expert labels.
    3. Results show that fully automated species labels can be equivalent to expert labels when calculating species richness, calculating activity patterns (n = 4 species tested) and estimating occupancy (n = 3 of 4 species tested) in a large, completely out-of-sample test dataset. Simple thresholding using the Softmax values (i.e. excluding ‘uncertain’ labels) improved the model's performance when calculating activity patterns and estimating occupancy but did not improve estimates of species richness.
    4. We conclude that, with adequate testing and evaluation in an ecological context, a machine learning model can generate labels for direct use in ecological analyses without the need for manual validation. We provide the user-community with a multi-platform, multi-language graphical user interface that can be used to run our model offline. Additional co-authors: Cisquet Kiebou Opepa, Ross T. Pitman, Hugh S. Robinson.
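
    The "simple thresholding using the Softmax values" mentioned above can be sketched as follows. This is an illustrative reimplementation of the idea, not the authors' code, and the cut-off value is a hypothetical parameter:

    ```python
    import numpy as np

    def threshold_labels(softmax_scores, class_names, cutoff=0.9):
        """Return one label per image, or None where the model is 'uncertain'.

        Images whose top softmax score falls below `cutoff` are excluded
        (labelled None) rather than passed to downstream ecological analyses.
        """
        labels = []
        for scores in softmax_scores:
            top = int(np.argmax(scores))             # index of top-1 class
            confident = scores[top] >= cutoff        # keep only confident labels
            labels.append(class_names[top] if confident else None)
        return labels

    scores = np.array([[0.95, 0.05],   # confident: 'elephant'
                       [0.55, 0.45]])  # uncertain: excluded
    print(threshold_labels(scores, ["elephant", "duiker"]))
    # → ['elephant', None]
    ```

    Discarding the uncertain fraction trades sample size for label quality, which is consistent with the result above that thresholding helped activity and occupancy estimates but not species richness.
    
    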

    Real-time alerts from AI-enabled camera traps using the Iridium satellite network: a case-study in Gabon, Central Africa

    Efforts to preserve, protect, and restore ecosystems are hindered by long delays between data collection and analysis. Threats to ecosystems can go undetected for years or decades as a result. Real-time data can help solve this issue but significant technical barriers exist. For example, automated camera traps are widely used for ecosystem monitoring but it is challenging to transmit images for real-time analysis where there is no reliable cellular or WiFi connectivity. Here, we present our design for a camera trap with integrated artificial intelligence that can send real-time information from anywhere in the world to end-users. We modified an off-the-shelf camera trap (Bushnell) and customised existing open-source hardware to rapidly create a 'smart' camera trap system. Images captured by the camera trap are instantly labelled by an artificial intelligence model and an 'alert' containing the image label and other metadata is then delivered to the end-user within minutes over the Iridium satellite network. We present results from testing in the Netherlands, Europe, and from a pilot test in a closed-canopy forest in Gabon, Central Africa. Results show the system can operate for a minimum of three months without intervention when capturing a median of 17.23 images per day. The median time-difference between image capture and receiving an alert was 7.35 minutes. We show that simple approaches such as excluding 'uncertain' labels and labelling consecutive series of images with the most frequent class (vote counting) can be used to improve accuracy and interpretation of alerts. We anticipate significant developments in this field over the next five years and hope that the solutions presented here, and the lessons learned, can be used to inform future advances. New artificial intelligence models and the addition of other sensors such as microphones will expand the system's potential for other, real-time use cases. 
    Potential applications include, but are not limited to, wildlife tourism, real-time biodiversity monitoring, wild resource management and detecting illegal human activities in protected areas.
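
    The "vote counting" approach described above — labelling a consecutive series of images with the most frequent per-image class — can be sketched in a few lines. This is an illustrative reconstruction of the idea, not the authors' implementation:

    ```python
    from collections import Counter

    def vote_count(series_labels):
        """Label an image series with its most common per-image label.

        A camera trap typically fires a burst of images per trigger event;
        taking the majority label across the burst smooths out occasional
        per-image misclassifications.
        """
        return Counter(series_labels).most_common(1)[0][0]

    burst = ["elephant", "elephant", "red_river_hog", "elephant"]
    print(vote_count(burst))  # → elephant
    ```

    Because a single alert then summarises the whole trigger event, this also makes each satellite message easier for the end-user to interpret.
    
    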