
    The crowd as a cameraman : on-stage display of crowdsourced mobile video at large-scale events

    Recording videos with smartphones at large-scale events such as concerts and festivals is very common nowadays. These videos register the atmosphere of the event as it is experienced by the crowd and offer a perspective that is hard to capture by the professional cameras installed throughout the venue. In this article, we present a framework to collect videos from smartphones in the audience and blend these into a mosaic that can be readily mixed with professional camera footage and shown on displays during the event. The video upload is prioritized by matching requests of the event director with video metadata, while taking into account the available wireless network capacity. The proposed framework's main novelty is its scalability, supporting the real-time transmission, processing and display of videos recorded by hundreds of simultaneous users in ultra-dense Wi-Fi environments, as well as its proven integration in commercial production environments. The framework has been extensively validated in a controlled lab setting with up to 1,000 clients as well as in a field trial where 1,183 videos were collected from 135 participants recruited from an audience of 8,050 people; 90% of those videos were uploaded within 6.8 minutes.
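The abstract does not detail the prioritization algorithm; as an illustration only, matching-based upload prioritization under a wireless capacity budget can be sketched as a greedy knapsack-style selection. All names and the score/size model below are assumptions, not taken from the paper:

```python
from dataclasses import dataclass


@dataclass
class Clip:
    clip_id: str
    size_mb: float  # upload cost on the shared wireless link
    score: float    # how well the clip's metadata matches the director's request


def select_uploads(clips, capacity_mb):
    """Greedily pick the best-matching clips that fit the available
    capacity, taking the highest score-per-megabyte clips first."""
    ranked = sorted(clips, key=lambda c: c.score / c.size_mb, reverse=True)
    chosen, used = [], 0.0
    for c in ranked:
        if used + c.size_mb <= capacity_mb:
            chosen.append(c.clip_id)
            used += c.size_mb
    return chosen
```

A real deployment would recompute this as capacity estimates and director requests change, but the sketch shows the core trade-off the abstract describes.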

    Keeping track of worm trackers

    C. elegans is used extensively as a model system in the neurosciences due to its well-defined nervous system. However, the seeming simplicity of this nervous system in anatomical structure and neuronal connectivity, at least compared to higher animals, belies a rich diversity of behaviors. The usefulness of the worm in genome-wide mutagenesis or RNAi screens, where thousands of strains are assessed for phenotype, emphasizes the need for computational methods for automated parameterization of the generated behaviors. In addition, behaviors can be modulated by external cues such as temperature, O2 and CO2 concentrations, and mechanosensory and chemosensory inputs. Different machine vision tools have been developed to aid researchers in their efforts to inventory and characterize defined behavioral “outputs”. Here we aim to provide an overview of different worm-tracking packages and video analysis tools designed to quantify different aspects of locomotion, such as the occurrence of directional changes (turns, omega bends), curvature of the sinusoidal shape (amplitude, body bend angles) and velocity (speed, backward or forward movement).
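As a minimal illustration of the locomotion parameters such trackers quantify, centroid speed and body-bend angle can be computed from tracked points. This is a sketch, not any specific package's API; the function names and point-based representation are assumptions:

```python
import math


def speed(p0, p1, dt):
    """Instantaneous centroid speed between two frames (distance units per second)."""
    return math.dist(p0, p1) / dt


def bend_angle(a, b, c):
    """Body-bend angle at skeleton point b, flanked by points a and c,
    in degrees: 180 means a straight body, smaller values mean more bend."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))
```

Applied along the whole skeleton per frame, angles like these yield the amplitude and curvature measures the review surveys.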

    Unmanned Aerial Vehicles (UAVs) in environmental biology: A Review

    Acquiring information about the environment is a key step in any study in the field of environmental biology, at every level from an individual species to community and biome. However, obtaining this information is frequently difficult because of, for example, phenological timing, the spatial distribution of a species or the limited accessibility of a particular area for a field survey. Moreover, remote sensing technology, which enables observation of the Earth’s surface and is currently very common in environmental research, has many limitations, such as insufficient spatial, spectral and temporal resolution and a high cost of data acquisition. Since the 1990s, researchers have been exploring the potential of different types of unmanned aerial vehicles (UAVs) for monitoring the Earth’s surface. The present study reviews recent scientific literature dealing with the use of UAVs in environmental biology. Amongst numerous papers, short communications and conference abstracts, we selected 110 original studies of how UAVs can be used in environmental biology and which organisms can be studied in this manner. Most of these studies concerned the use of UAVs to measure vegetation parameters such as crown height, volume and number of individuals (14 studies) and to quantify the spatio-temporal dynamics of vegetation changes (12 studies). UAVs were also frequently applied to count birds and mammals, especially those living in the water. Generally, the analytical part of the present study was divided into the following sections: (1) detecting, assessing and predicting threats to vegetation, (2) measuring the biophysical parameters of vegetation, (3) quantifying the dynamics of changes in plants and habitats and (4) population and behaviour studies of animals. Finally, we synthesised all the information, showing, amongst other things, the advances in environmental biology made possible by UAV application. Considering that 33% of the studies found and included in this review were published in 2017 and 2018, the number and variety of applications of UAVs in environmental biology are expected to increase in the future.

    Preserving privacy in surgical video analysis using a deep learning classifier to identify out-of-body scenes in endoscopic videos

    Surgical video analysis facilitates education and research. However, video recordings of endoscopic surgeries can contain privacy-sensitive information, especially if the endoscopic camera is moved out of the patient's body and out-of-body scenes are recorded. Identifying out-of-body scenes in endoscopic videos is therefore of major importance to preserve the privacy of patients and operating room staff. This study developed and validated a deep learning model for the identification of out-of-body images in endoscopic videos. The model was trained and evaluated on an internal dataset covering 12 different types of laparoscopic and robotic surgeries and was externally validated on two independent multicentric test datasets of laparoscopic gastric bypass and cholecystectomy surgeries. Model performance was evaluated against human ground-truth annotations using the area under the receiver operating characteristic curve (ROC AUC). The internal dataset, consisting of 356,267 images from 48 videos, and the two multicentric test datasets, consisting of 54,385 and 58,349 images from 10 and 20 videos, respectively, were annotated. The model identified out-of-body images with 99.97% ROC AUC on the internal test dataset. The mean ± standard deviation ROC AUC was 99.94 ± 0.07% on the multicentric gastric bypass dataset and 99.71 ± 0.40% on the multicentric cholecystectomy dataset. The model can reliably identify out-of-body images in endoscopic videos and is publicly shared, facilitating privacy preservation in surgical video analysis.
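The ROC AUC metric used here has a simple rank-based interpretation: the probability that a randomly chosen positive (out-of-body) frame scores higher than a randomly chosen negative (in-body) frame. A minimal sketch of that computation (not the authors' evaluation code, and O(P·N), so illustrative rather than production-grade):

```python
def roc_auc(labels, scores):
    """ROC AUC via the Mann-Whitney U statistic: fraction of
    positive/negative pairs where the positive outscores the negative,
    counting ties as 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```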

    Small Animal Video Tracking for Activity and Path Analysis Using a Novel Open-Source Multi-Platform Application (AnimApp)

    Experimental outcomes in biological model systems, such as altered movement capability or behaviour, are difficult to quantify manually. Existing automatic movement-tracking devices can be expensive and imposing upon the animal model's typical environment. We have developed a novel multi-platform, free-to-use open-source application based on OpenCV, called AnimApp. Our results show that AnimApp can reliably and reproducibly track the movement of small animals such as rodents or insects, and quantify parameters of action, including distance and speed, in order to detect activity changes arising from handling, environment enrichment or temperature alteration. This system offers an accurate and reproducible experimental approach with potential for simple, fast and flexible analysis of movement and behaviour in a wide range of model systems.
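Once a tracker has produced per-frame centroid positions, the distance and speed parameters mentioned above follow directly; a minimal sketch (the function and its inputs are illustrative assumptions, not AnimApp's actual API):

```python
import math


def path_metrics(track, fps):
    """Total path length and mean speed from a list of per-frame (x, y)
    centroid positions, e.g. as exported by a video tracker.
    Returns (distance, mean_speed) in pixels and pixels/second."""
    dist = sum(math.dist(track[i], track[i + 1]) for i in range(len(track) - 1))
    duration = (len(track) - 1) / fps
    return dist, dist / duration
```

A pixels-to-millimetres calibration factor would normally be applied before comparing across recording setups.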

    D1.3 List of available solutions

    This report has been submitted by Tempesta Media SL as deliverable D1.3 within the framework of the H2020 project "SO-CLOSE: Enhancing Social Cohesion through Sharing the Cultural Heritage of Forced Migrations", Grant No. 870939. The report researches the specific topics and needs of the SO-CLOSE project, addressing the available solutions through a state-of-the-art analysis of digital tools applied in the cultural heritage and migration fields. More specifically, the report's scope is: to define proper tools and procedures for the interview needs (performing, recording, transcription and translation); to analyse potential content-gathering tools for the co-creation workshops; and to conduct a state-of-the-art analysis of sharing tools applied in the cultural heritage and migration fields, and propose a critically adjusted and innovative digital approach.