18 research outputs found

    Grazing under irrigation affects N2O-emissions substantially in South Africa

    Get PDF
    CITATION: Smit, H. P. J. et al. 2020. Grazing under irrigation affects N2O-emissions substantially in South Africa. Atmosphere, 11(9):925, doi:10.3390/atmos11090925. The original publication is available at https://www.mdpi.com
    Fertilized agricultural soils are a primary source of anthropogenic N2O emissions. In South Africa, there is a paucity of data on N2O emissions from fertilized, irrigated dairy-pasture systems and on the emission factors (EF) associated with the amount of N applied. A first study aiming to quantify direct N2O emissions and the associated EFs of intensive pasture-based dairy systems in sub-Saharan Africa was conducted in South Africa. Field trials evaluated the effect of fertilizer rates (0, 220, 440, 660, and 880 kg N ha−1 year−1) on N2O emissions from irrigated kikuyu–perennial ryegrass (Pennisetum clandestinum–Lolium perenne) pastures. The static chamber method was used to collect weekly N2O samples for one year. The highest daily N2O fluxes occurred in spring (0.99 kg ha−1 day−1) and summer (1.52 kg ha−1 day−1). Accumulated N2O emissions ranged between 2.45 and 15.5 kg N2O-N ha−1 year−1, and the EFs for the mineral fertilizers applied averaged 0.9%. Nitrogen yield in harvested herbage varied between 582 and 900 kg N ha−1. Adding N at high rates had no positive effect on herbage growth. The relationship between N balance and annual N2O emissions was exponential, indicating that excessive N fertilization adds directly to N2O emissions from these pastures. Results from this study could help update South Africa's greenhouse gas inventory and facilitate more accurate Tier 3 estimates.
    https://www.mdpi.com/2073-4433/11/9/925 Publisher's version
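    As a hedged illustration of how such fertilizer-induced emission factors are commonly defined (fertilizer-induced N2O-N emission as a percentage of N applied), a minimal sketch follows; the function name and the flux values are hypothetical placeholders, chosen only to land near the reported average EF of about 0.9%, and are not values from the paper.

        def emission_factor(n2o_n_fertilized_kg_ha, n2o_n_control_kg_ha, n_applied_kg_ha):
            """Fertilizer-induced emission factor (EF, %): excess N2O-N over the
            unfertilized control, expressed as a share of the N applied."""
            return 100.0 * (n2o_n_fertilized_kg_ha - n2o_n_control_kg_ha) / n_applied_kg_ha

        # Hypothetical example values (not from the study):
        print(emission_factor(6.4, 2.45, 440))  # ~0.9 %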

    Weight, temperature and humidity sensor data of honey bee colonies in Germany, 2019–2022

    No full text
    Humans have kept honey bees as livestock for thousands of years to harvest honey, wax and other products, and continue to do so today. However, beekeepers in many parts of the world now report unprecedentedly high numbers of colony losses. Sensor data from honey bee colonies can contribute new insights into the development and health factors of honey bee colonies, and can be incorporated into smart decision-support systems and warning tools for beekeepers. In this paper, we present sensor data from 78 honey bee colonies in Germany collected as part of a citizen science project. Each honey bee hive was equipped with five temperature sensors within the hive, one temperature sensor for outside measurements, a combined sensor for temperature, ambient air pressure and humidity, and a scale to measure the weight. During the data acquisition period, beekeepers used a web app to report their observations and beekeeping activities. We provide the raw data with a measurement interval of up to 5 s as well as aggregated data with per-minute, hourly or daily average values. Furthermore, we performed several preprocessing steps: removing outliers with a threshold-based approach, excluding changes in weight that were induced by beekeeping activities, and combining the sensor data with the most important metadata from the beekeepers' observations. The data is organised in directories based on the year of recording. Alternatively, we provide subsets of the data structured by the occurrence or non-occurrence of a swarming event or the death of a colony. The data can be analysed using methods from time series analysis, time series classification or other data science approaches to form a better understanding of specifics in the development of honey bee colonies.
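    As a hedged sketch of how the aggregated data could be explored, the snippet below loads a daily-aggregate file for one hive and resamples the weight signal; it assumes pandas is available, and the file name and column names are hypothetical, not taken from the published dataset description.

        import pandas as pd

        # Hypothetical file and column names; adjust to the actual dataset layout.
        df = pd.read_csv("2021/hive_42_daily.csv", parse_dates=["timestamp"])
        df = df.set_index("timestamp").sort_index()

        # Weekly mean hive weight, e.g. to look for sudden drops associated
        # with swarming events or harvesting.
        weekly_weight = df["weight_kg"].resample("W").mean()
        print(weekly_weight.head())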

    Multi-Sensory-Motor Research: Investigating Auditory, Visual, and Motor Interaction in Virtual Reality Environments

    No full text
    Perception in natural environments is inseparably linked to motor action. In fact, we consider action an essential component of perceptual representation. But these representations are inherently difficult to investigate: traditional experimental setups are limited by the lack of flexibility in manipulating spatial features. To overcome these problems, virtual reality (VR) experiments seem to be a feasible alternative, but such setups typically lack ecological realism due to the use of "unnatural" interface devices (e.g., a joystick). We therefore propose an experimental apparatus that combines multisensory perception and action in an ecologically realistic way. Its basis is a 10-foot hollow sphere (VirtuSphere) placed on a platform that allows free rotation. A subject inside can walk in any direction, for any distance, while immersed in a virtual environment. Both the rotation of the sphere and the movement of the subject's head are tracked to render the subject's view of the VR environment on a head-mounted display. Moreover, auditory features are processed dynamically, with particular care taken to align sound sources exactly with visual objects, using ambisonic-encoded audio processed by an HRTF filter bank. We present empirical data that confirm the ecological realism of this setup and discuss its suitability for multi-sensory-motor research.
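    As a rough illustration of the binaural rendering step described above (spatializing a source signal through head-related impulse responses), a minimal sketch follows; it is not the authors' implementation, and the impulse responses are crude stand-ins for filters that would normally come from a measured HRTF database selected per source direction.

        import numpy as np
        from scipy.signal import fftconvolve

        fs = 44100
        mono = np.random.randn(fs)                 # 1 s of stand-in source signal

        # Stand-in head-related impulse responses for one direction;
        # a real setup would select these from a measured HRTF set.
        hrir_left = np.zeros(256);  hrir_left[0] = 1.0
        hrir_right = np.zeros(256); hrir_right[30] = 0.7   # crude interaural delay + attenuation

        left = fftconvolve(mono, hrir_left, mode="full")
        right = fftconvolve(mono, hrir_right, mode="full")
        binaural = np.stack([left, right], axis=1)          # 2-channel output for headphones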

    Information-Driven Active Audio-Visual Source Localization

    No full text
    The raw data recorded during the experiments is available in the folder "data", which includes the subfolders 'robotData-IG', 'robotData-Random', 'simulationData-IG' and 'simulationData-Random' for the robot and the simulation experiments, respectively. In each step, all data needed for performance evaluations (that is, the action u selected by the system and the corresponding information gain, the sensory measurements z, the state of the system, the state estimate and the particle set itself) were serialized with Python's cPickle module. Furthermore, the subfolder 'data\extracted_data_csv' contains all the data used in our figures in condensed form, saved to CSV files: all relevant data (and only relevant data) were extracted from the raw data, so it is no longer necessary to load and process the binary data recorded during the experiments; everything you need is in a human-readable, text-based file.

    The Python module "InformationDriven_AV_SourceTracking_EVALUATION.py" shows how to access the data and includes all the code necessary to read and evaluate the data recorded during the experiments.

    How to build and run:
    In addition to a standard Python 2.7 distribution, some Python libraries are necessary to run the code:
    - numpy (http://www.numpy.org/)
    - matplotlib (http://matplotlib.org/)
    - config (https://pypi.python.org/pypi/config/0.3.7)
    optional (see below):
    - evaluation/csvData/error
    - OpenCV (cv2) for Python

    [OPTIONAL: If you want to analyze the raw data (not the data saved in the CSV files), you have to build a few custom modules manually. Some of the modules used in our implementation were written in Cython (http://www.cython.org/) to speed up computations, so they need to be compiled for your system:
    >> cd src/particleFiltering
    >> python setup.py build_ext --inplace
    Next, manually uncomment the line "# from particleFiltering.belief_Cy import Belief" at the beginning of the file "InformationDriven_AV_SourceTracking_EVALUATION.py" in order to use the functions working on raw data.
    END OPTIONAL]

    After installing the necessary libraries (and optionally compiling the Cython modules), you can start the evaluation script with:
    >> cd src
    >> python InformationDriven_AV_SourceTracking_EVALUATION.py
    in order to generate all figures shown in the "results" section of the manuscript and save them to the "src" directory. By default, they are saved as PDF files, but you can change the file format by changing the variable 'plotFileType' at the beginning of the evaluation script to '.jpg', '.png', '.tiff' or any other file format supported by matplotlib.

    If you want to analyze the data yourself, all steps needed to access and evaluate the recorded data are exemplified in the module "InformationDriven_AV_SourceTracking_EVALUATION.py" and should be fairly self-explanatory.
    While the figures in our manuscript were generated from the extracted data in the CSV files (see the function 'generatePlots' in "InformationDriven_AV_SourceTracking_EVALUATION.py"), we also included functions that work with the raw data: 'evaluateSimulationExperiments_IG_error_raw', 'evaluateSimulationExperiments_random_error_raw', 'evaluateSimulationExperiments_IG_entropy_raw', 'evaluateSimulationExperiments_random_entropy_raw', 'evaluateRobotExperiments_IG_error_raw', 'evaluateRobotExperiments_IG_entropy_raw', 'evaluateRobotExperiments_random_error_raw' and 'evaluateRobotExperiments_random_entropy_raw'. These show how to access the raw data and how to generate the same curves as the ones shown in the results section, so that it is transparent how the data stored in the CSV files can be extracted from the raw data recorded in the experiments.
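    As a hedged illustration of what loading one of the serialized step records might look like, a minimal Python 2.7 sketch follows; the file name inside 'data/robotData-IG' is hypothetical, and the structure of the unpickled object is defined by the recording code, so the evaluation functions listed above remain the authoritative way to read the raw data.

        import cPickle

        # Hypothetical file name inside one of the raw-data subfolders;
        # the real names are defined by the recording code.
        with open("data/robotData-IG/run_01_step_0001.pkl", "rb") as f:
            step_record = cPickle.load(f)

        # The record bundles the selected action u, its information gain,
        # the measurements z, the state, the state estimate and the particle set.
        print(type(step_record))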

    Initial State And Updates.

    No full text
    These figures show the initial particle distribution in a robot-centric coordinate system whose origin corresponds to the center of the robot (indicated by a green dot, with a green line showing that the robot's head is rotated toward its front), while the x- and y-axes correspond to the left/right and front/back of the robot, respectively. Particles are depicted as blue dots, and the source's actual position is indicated by a green star. The mean of the particle distribution is shown as a red dot; we include it in these figures because we use it as the state estimate when calculating distances to the actual position of the source. A) After initialization, all particles are distributed uniformly over the state space, reflecting that the system has no information yet about the location of the source, which is located in front of the robot. B) The initial auditory correction update eliminates all particles located to the left and right of the robot because they are not compatible with the measurement. The width of the cone around the "true position" corresponds directly to the standard deviation parameter of the Gaussians in the auditory sensor model. This correction update is also a good example of the front-back confusion in audition: based on the ITD measurement, the system cannot distinguish whether the source is in front of or behind it, and therefore treats corresponding particles behind and in front of it in the same way. C) The initial visual correction update shows the characteristic properties of the visual modality within our system: visual measurements have high spatial precision, but particles corresponding to positions outside the field of view of the camera are not updated, because the measurement contains no information about these positions. D) The combination of both sensory updates shows the interaction of the modalities: the visual modality increases the precision of the estimate based on the auditory measurement but provides no information about locations outside the field of view.
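    As a rough sketch of the kind of particle-filter correction update this caption describes, the snippet below reweights a uniformly initialized particle set with a Gaussian likelihood on a measured source bearing; it is only an illustration, not the authors' sensor model (in particular it ignores front-back ambiguity), and the bearing, standard deviation and particle counts are made-up values.

        import numpy as np

        rng = np.random.default_rng(0)
        particles = rng.uniform(-3.0, 3.0, size=(5000, 2))   # uniform init over a 6 m x 6 m state space

        measured_azimuth = 0.0      # made-up bearing measurement (radians), 0 = straight ahead
        sigma = 0.3                 # made-up standard deviation of the Gaussian sensor model

        # Bearing of each particle relative to the robot's front (y axis = front/back).
        bearings = np.arctan2(particles[:, 0], particles[:, 1])
        weights = np.exp(-0.5 * ((bearings - measured_azimuth) / sigma) ** 2)
        weights /= weights.sum()

        # State estimate used for the distance-to-source evaluation: the weighted mean.
        estimate = weights @ particles
        print(estimate)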

    Visual Processing: Logistic Regression.

    No full text
    A) Result of the template matching procedure. B) Effect of applying the logistic regression model.
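    As a hedged sketch of what such a visual processing step could look like in code (a generic reconstruction, not the authors' pipeline), the snippet below runs OpenCV template matching and then maps the best match score to a detection probability with a scikit-learn logistic regression model; the image/template file names and the training scores are invented for illustration.

        import cv2
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical image and template files.
        frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
        template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)

        # A) Template matching: normalized cross-correlation map over the frame.
        scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        best_score = float(scores.max())

        # B) Logistic regression: map raw match scores to a detection probability.
        # Training scores/labels below are invented placeholders.
        train_scores = np.array([[0.1], [0.2], [0.35], [0.6], [0.75], [0.9]])
        train_labels = np.array([0, 0, 0, 1, 1, 1])
        model = LogisticRegression().fit(train_scores, train_labels)
        p_detection = model.predict_proba([[best_score]])[0, 1]
        print(p_detection)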

    Localization performance.

    No full text
    The results show that the information-driven approach requires a significantly lower number of actions to reach the criterion RMS error of < 30 cm than random action selection, in both A) the simulation and B) the robot experiments. Error bars show 95% confidence intervals.
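    As a small, hedged illustration of the evaluation criterion in this caption, the snippet below computes the localization error per action step and counts how many actions are needed before the error first drops below 30 cm; the estimate and ground-truth values are placeholder data, not values from the experiments.

        import numpy as np

        # Placeholder trajectory: state estimates per action step and the true source position (metres).
        estimates = np.array([[1.2, 0.8], [0.9, 0.5], [0.6, 0.3], [0.45, 0.25], [0.42, 0.21]])
        true_pos = np.array([0.4, 0.2])

        errors = np.linalg.norm(estimates - true_pos, axis=1)   # per-step Euclidean localization error
        below = np.flatnonzero(errors < 0.30)                   # criterion: error < 30 cm
        actions_to_criterion = below[0] + 1 if below.size else None
        print(errors, actions_to_criterion)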

    Entropy of the PDF estimate (robot experiments).

    No full text
    The information-driven action selection process leads to a (roughly) exponential decrease of entropy. In comparison, random movements still lead to a mostly monotonic decrease of entropy, but at a slower rate and with a higher minimum entropy than in the information-driven approach.
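    As a hedged illustration of how the entropy of a particle-based PDF estimate can be computed (the exact discretization used in the experiments is not specified here), the snippet below bins weighted particles into a 2-D grid and evaluates the Shannon entropy of the resulting histogram; the grid size and particle data are placeholders.

        import numpy as np

        def particle_entropy(particles, weights, bins=40, extent=(-3.0, 3.0)):
            """Shannon entropy (nats) of a 2-D histogram approximation of the particle PDF."""
            hist, _, _ = np.histogram2d(
                particles[:, 0], particles[:, 1],
                bins=bins, range=[extent, extent], weights=weights,
            )
            p = hist / hist.sum()
            p = p[p > 0]                     # ignore empty bins (0 * log 0 := 0)
            return float(-(p * np.log(p)).sum())

        rng = np.random.default_rng(1)
        pts = rng.normal(0.0, 0.5, size=(5000, 2))   # placeholder particle set
        w = np.full(5000, 1.0 / 5000)                # uniform weights
        print(particle_entropy(pts, w))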