Capturing natural-colour 3D models of insects for species discovery
Collections of biological specimens are fundamental to scientific
understanding and characterization of natural diversity. This paper presents a
system for liberating useful information from physical collections by bringing
specimens into the digital domain so they can be more readily shared, analyzed,
annotated and compared. It focuses on insects and is strongly motivated by the
desire to accelerate and augment current practices in insect taxonomy which
predominantly use text, 2D diagrams and images to describe and characterize
species. While these traditional kinds of descriptions are informative and
useful, they cannot cover insect specimens "from all angles" and precious
specimens are still exchanged between researchers and collections for this
reason. Furthermore, insects can be complex in structure and pose many
challenges to computer vision systems. We present a new prototype for a
practical, cost-effective system of off-the-shelf components to acquire
natural-colour 3D models of insects from around 3mm to 30mm in length. Colour
images are captured from different angles and focal depths using a digital
single lens reflex (DSLR) camera rig and two-axis turntable. These 2D images
are processed into 3D reconstructions using software based on a visual hull
algorithm. The resulting models are compact (around 10 megabytes), afford
excellent optical resolution, and can be readily embedded into documents and
web pages, as well as viewed on mobile devices. The system is portable, safe,
relatively affordable, and complements the sort of volumetric data that can be
acquired by computed tomography. This system provides a new way to augment the
description and documentation of insect species holotypes, reducing the need to
handle or ship specimens. It opens up new opportunities to collect data for
research, education, art, entertainment, biodiversity assessment and
biosecurity control.
Comment: 24 pages, 17 figures, PLOS ONE journal
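The visual hull idea behind the reconstruction step can be sketched in a few lines: each silhouette constrains the volume, and the hull is the intersection of the back-projected silhouettes. Below is a minimal orthographic toy version (the actual system uses calibrated perspective cameras; the function name and the two-view setup are illustrative only):

```python
import numpy as np

def carve(views, grid_shape):
    """Minimal visual-hull (space carving) sketch.

    views: list of (2D boolean silhouette mask, projection axis).
    Keeps only voxels whose orthographic projection lies inside
    every silhouette.
    """
    volume = np.ones(grid_shape, dtype=bool)
    for mask, axis in views:
        # Broadcast the 2D silhouette back along its projection
        # axis and intersect it with the current volume estimate.
        volume &= np.expand_dims(mask, axis=axis)
    return volume

# Toy example: an 8x8x8 voxel grid (x, y, z), a square silhouette
# seen from the front (projecting along z) and a narrower rectangle
# seen from the side (projecting along x).
front = np.zeros((8, 8), dtype=bool)   # (x, y) plane
front[2:6, 2:6] = True
side = np.zeros((8, 8), dtype=bool)    # (y, z) plane
side[2:6, 3:5] = True
hull = carve([(front, 2), (side, 0)], (8, 8, 8))
```

Each additional view can only remove voxels, so more camera angles tighten the hull around the true surface; concavities, however, can never be carved out, which is why such models capture silhouette-consistent shape rather than exact geometry.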
Towards high-throughput 3D insect capture for species discovery and diagnostics
Digitisation of natural history collections not only preserves precious
information about biological diversity, it also enables us to share, analyse,
annotate and compare specimens to gain new insights. High-resolution,
full-colour 3D capture of biological specimens yields colour and geometry
information complementary to other techniques (e.g., 2D capture, scanning
electron microscopy and micro computed tomography). However, 3D colour capture of small
specimens is slow for reasons including specimen handling, the narrow depth of
field of high magnification optics, and the large number of images required to
resolve complex shapes of specimens. In this paper, we outline techniques to
accelerate 3D image capture, including using a desktop robotic arm to automate
the insect handling process; using a calibrated pan-tilt rig to avoid attaching
calibration targets to specimens; using light field cameras to capture images
at an extended depth of field in one shot; and using 3D Web and mixed reality
tools to facilitate the annotation, distribution and visualisation of 3D
digital models.
Comment: 2 pages, 1 figure, for BigDig workshop at 2017 eScience conference
An automated device for the digitization and 3D modelling of insects, combining extended-depth-of-field and all-side multi-view imaging
Digitization of natural history collections is a major challenge in archiving biodiversity. In recent years,
several approaches have emerged, allowing either automated digitization, extended depth of field (EDOF)
or multi-view imaging of insects. Here, we present DISC3D: a new digitization device for pinned insects
and other small objects that combines all these aspects. A PC and a microcontroller board control the
device. It features a sample holder on a motorized two-axis gimbal, allowing the specimens to be imaged
from virtually any view. Ambient, largely reflection-free illumination is provided by two LED strips
installed circularly in two hemispherical white-coated domes (front light and back light). The device is
equipped with an industrial camera and a compact macro lens, mounted on a motorized macro rail.
EDOF images are calculated from an image stack using a novel calibrated scaling algorithm that meets the
requirements of the pinhole camera model (a unique central perspective). The images can be used to generate
a calibrated, real-colour textured 3D model by "structure from motion" with visibility-consistent
mesh generation. Such models are ideal for obtaining morphometric measurement data in 1D, 2D and
3D, thereby opening new opportunities for trait-based research in taxonomy, phylogeny, eco-physiology,
and functional ecology.
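The extended-depth-of-field step can be illustrated independently of the calibrated scaling algorithm the authors describe: for each pixel, keep the frame in the focal stack where a local sharpness measure peaks. A minimal sketch, using a 4-neighbour Laplacian as the sharpness proxy (edge wrap-around from `np.roll` is ignored for brevity; real pipelines also align and scale the frames first):

```python
import numpy as np

def focus_stack(stack):
    """Fuse a focal stack (n, H, W) into one extended-depth-of-field
    image by picking, per pixel, the frame with the strongest local
    Laplacian response (a cheap proxy for being in focus)."""
    lap = np.abs(
        np.roll(stack, 1, axis=1) + np.roll(stack, -1, axis=1)
        + np.roll(stack, 1, axis=2) + np.roll(stack, -1, axis=2)
        - 4 * stack
    )                              # (n, H, W) sharpness per frame
    best = lap.argmax(axis=0)      # (H, W) index of sharpest frame
    # Gather the winning pixel from each frame.
    return np.take_along_axis(stack, best[None], axis=0)[0]

# Tiny synthetic stack: frame 1 contains a sharp bright dot that
# frame 0 lacks, so the fusion should take that pixel from frame 1.
stack = np.zeros((2, 4, 4))
stack[1, 2, 2] = 10.0
edof = focus_stack(stack)
```

The paper's calibrated scaling addresses the fact that magnification changes as the focal plane moves, so frames must be rescaled to a common pinhole model before this per-pixel selection is meaningful.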
Informative and misinformative interactions in a school of fish
It is generally accepted that, when moving in groups, animals process
information to coordinate their motion. Recent studies have begun to apply
rigorous methods based on Information Theory to quantify such distributed
computation. Following this perspective, we use transfer entropy to quantify
dynamic information flows locally in space and time across a school of fish
during directional changes around a circular tank, i.e. U-turns. This analysis
reveals peaks in information flows during collective U-turns and identifies two
different flows: an informative flow (positive transfer entropy) based on fish
that have already turned about fish that are turning, and a misinformative flow
(negative transfer entropy) based on fish that have not turned yet about fish
that are turning. We also reveal that the information flows are related to
relative position and alignment between fish, and identify spatial patterns of
information and misinformation cascades. This study offers several
methodological contributions and we expect further application of these
methodologies to reveal intricacies of self-organisation in other animal groups
and active matter in general.
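Transfer entropy from a source series x to a target y measures how much x's past reduces uncertainty about y's next state beyond what y's own past already tells us. A minimal plug-in estimator for binary series with history length 1 is sketched below; note the study uses local (per-sample) transfer entropy, whose individual log-ratio terms can be negative (the "misinformative" flows), whereas the average computed here is non-negative:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Average transfer entropy x -> y in bits, history length 1,
    via plug-in (frequency-count) probability estimates."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_now, x_now)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_yx[(y0, x0)]              # p(y1 | y0, x0)
        p_self = pairs_yy[(y1, y0)] / singles[y0]    # p(y1 | y0)
        # Local term: positive when x's past is informative about y's
        # next state, negative when it is misleading.
        te += p_joint * np.log2(p_full / p_self)
    return te

# Sanity check: if y is x delayed by one step, x's past fully
# determines y's next value, so TE(x -> y) is close to 1 bit,
# while TE(y -> x) is close to 0 for an i.i.d. source.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.concatenate(([0], x[:-1]))
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
```

In the fish-school analysis this quantity is evaluated locally in space and time between neighbouring fish, rather than averaged over a whole recording as here.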
Characterising the neck motor system of the blowfly
Flying insects use visual, mechanosensory, and proprioceptive information to control their
movements, both when on the ground and when airborne. Exploiting visual information for
motor control is significantly simplified if the eyes remain aligned with the external horizon.
In fast-flying insects, head rotations relative to the body enable gaze stabilisation during
high-speed manoeuvres or externally caused attitude changes due to turbulent air.
Previous behavioural studies of gaze stabilisation were confounded by the dynamic properties
of the supplying sensory systems being convolved with those of the neck motor system.
Specifically, stabilisation of the head in Dipteran flies responding to induced thorax roll
involves feed forward information from the mechanosensory halteres, as well as feedback
information from the visual systems. To fully understand the functional design of the blowfly
gaze stabilisation system as a whole, the neck motor system needs to be investigated
independently.
Through X-ray micro-computed tomography (µCT), high-resolution 3D data has become
available, and using staining techniques developed in collaboration with the Natural History
Museum London, detailed anatomical data can be extracted. This resulted in a full
three-dimensional anatomical representation of the 21 neck muscle pairs and neighbouring cuticular
structures which comprise the blowfly neck motor system.
In the work presented in my PhD thesis, µCT data are currently being used to infer
function from structure by creating a biomechanical model of the neck motor system. This
effort aims to determine the specific function of each muscle individually, and is likely to
inform the design of artificial gaze stabilisation systems. Any such design would incorporate
both sensory and motor systems as well as the control architecture converting sensor signals
into motor commands under the given physical constraints of the system as a whole.
Sustainable control of infestations using image processing and modelling
A sustainable pest control system integrates automated pest detection and recognition to evaluate the pest density using image samples taken from habitats. Novel predator/prey modelling algorithms assess control requirements for the UAV system, which is designed to deliver measured quantities of naturally beneficial predators to combat pest infestations within economically acceptable timeframes. The integrated system will reduce the damaging effect of pests in an infested habitat to an economically acceptable level without the use of chemical pesticides.
Plant pest recognition and detection is vital for food security, quality of life and a stable agricultural economy. The research combines the k-means clustering algorithm with a correspondence filter to achieve pest detection and recognition. Detection partitions the colour data space into Voronoi cells, which tends to find clusters of comparable spatial extent, thereby separating the objects (pests) from the background (pest habitat). Recognition then extracts the distinctive attributes that differentiate the pest from its habitat (leaf, stem) and applies the correspondence filter, which identifies plant pests via correlation peak values across the different datasets. The correspondence filter achieves rotationally invariant recognition of pests through a full 360 degrees, demonstrating the effectiveness of the algorithm, and provides a count of the number of pests in the image.
A series of models has been produced that will permit an assessment of common pest infestation problems and estimate the number of predators that are required to control the problem within a time schedule. A UAV predator deployment system has been designed.
The system is offered as a replacement for chemical pesticides, improving people's health and the quality of food products.
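The clustering step described above (partitioning colour space into Voronoi cells to separate pest from background) can be sketched with plain Lloyd's-algorithm k-means. This is a generic illustration, not the authors' implementation; the farthest-point initialisation is an assumption chosen here for determinism:

```python
import numpy as np

def kmeans_segment(pixels, k=2, iters=20):
    """Cluster (N, 3) colour vectors with Lloyd's algorithm.
    Returns per-pixel cluster labels and the k cluster centres."""
    # Deterministic farthest-point initialisation (illustrative choice):
    # start from the first pixel, then repeatedly add the pixel
    # farthest from all centres chosen so far.
    centres = [pixels[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(pixels - c, axis=1) for c in centres],
                   axis=0)
        centres.append(pixels[d.argmax()])
    centres = np.array(centres, dtype=float)

    for _ in range(iters):
        # Assign each pixel to its nearest centre (its Voronoi cell)...
        dists = np.linalg.norm(pixels[:, None, :] - centres[None, :, :],
                               axis=2)
        labels = dists.argmin(axis=1)
        # ...then move each centre to the mean of its cell.
        for j in range(k):
            if np.any(labels == j):
                centres[j] = pixels[labels == j].mean(axis=0)
    return labels, centres

# Two well-separated synthetic colour populations: "background" near
# the origin and "pest" pixels around (10, 10, 10).
rng = np.random.default_rng(1)
bg = rng.normal(0.0, 1.0, (50, 3))
pest = rng.normal(10.0, 1.0, (50, 3))
labels, centres = kmeans_segment(np.vstack([bg, pest]))
```

In an image pipeline the mask of pest-labelled pixels would then feed the correspondence filter for recognition and counting; real leaf/pest colours are far less separable than this toy data, which is why the filter stage is needed at all.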