
    Capturing natural-colour 3D models of insects for species discovery

    Collections of biological specimens are fundamental to scientific understanding and characterization of natural diversity. This paper presents a system for liberating useful information from physical collections by bringing specimens into the digital domain so they can be more readily shared, analyzed, annotated and compared. It focuses on insects and is strongly motivated by the desire to accelerate and augment current practices in insect taxonomy, which predominantly use text, 2D diagrams and images to describe and characterize species. While these traditional kinds of descriptions are informative and useful, they cannot cover insect specimens "from all angles", and precious specimens are still exchanged between researchers and collections for this reason. Furthermore, insects can be complex in structure and pose many challenges to computer vision systems. We present a new prototype for a practical, cost-effective system of off-the-shelf components to acquire natural-colour 3D models of insects from around 3mm to 30mm in length. Colour images are captured from different angles and focal depths using a digital single lens reflex (DSLR) camera rig and two-axis turntable. These 2D images are processed into 3D reconstructions using software based on a visual hull algorithm. The resulting models are compact (around 10 megabytes), afford excellent optical resolution, and can be readily embedded into documents and web pages, as well as viewed on mobile devices. The system is portable, safe, relatively affordable, and complements the sort of volumetric data that can be acquired by computed tomography. This system provides a new way to augment the description and documentation of insect species holotypes, reducing the need to handle or ship specimens. It opens up new opportunities to collect data for research, education, art, entertainment, biodiversity assessment and biosecurity control. Comment: 24 pages, 17 figures, PLOS ONE journal
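The abstract's core reconstruction step, a visual hull algorithm, carves away any voxel that falls outside the object's silhouette in at least one calibrated view. The following is a minimal voxel-carving sketch under assumed inputs (binary silhouette masks, a `project` callback mapping a voxel to pixel coordinates per view, and a precomputed voxel grid), not the paper's actual implementation:

```python
import numpy as np

def visual_hull(silhouettes, project, grid):
    """Carve a voxel grid: a voxel survives only if it projects inside
    every binary silhouette (1 = object, 0 = background)."""
    keep = np.ones(len(grid), dtype=bool)
    for view, sil in enumerate(silhouettes):
        h, w = sil.shape
        for i, voxel in enumerate(grid):
            if not keep[i]:
                continue  # already carved away by an earlier view
            u, v = project(view, voxel)
            # Discard voxels that project outside the image or the silhouette
            if not (0 <= u < w and 0 <= v < h) or sil[v, u] == 0:
                keep[i] = False
    return grid[keep]

# Toy demo: two orthographic views of a 2x2x2 block inside a 4^3 grid.
grid = np.array([[x, y, z] for x in range(4) for y in range(4) for z in range(4)])
sil_front = np.zeros((4, 4), dtype=int); sil_front[1:3, 1:3] = 1  # sees (x, y)
sil_side  = np.zeros((4, 4), dtype=int); sil_side[1:3, 1:3] = 1   # sees (z, y)
project = lambda view, p: (p[0], p[1]) if view == 0 else (p[2], p[1])
hull = visual_hull([sil_front, sil_side], project, grid)  # 8 voxels survive
```

A production system would use perspective projection from calibrated cameras and many views around the turntable; the intersection logic, however, is exactly this per-voxel test.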

    No small feat: microRNA responses during vocal communication in songbirds

    Simply hearing the song produced by another bird of the same species triggers the regulation of microRNAs (miRNAs) in high-order auditory parts of the zebra finch brain. Some of the identified miRNAs appear to be unique to birds, possibly to songbirds. These findings, reported in BMC Genomics, highlight the complexities of gene regulation associated with vocal communication and point to possible key regulators of song-triggered gene networks.

    Egg-laying substrate selection for optimal camouflage by quail

    Camouflage is conferred by background matching and disruption, which are both affected by microhabitat [1]. However, microhabitat selection that enhances camouflage has only been demonstrated in species with discrete phenotypic morphs [2 and 3]. For most animals, phenotypic variation is continuous [4 and 5]; here we explore whether such individuals can select microhabitats to best exploit camouflage. We use substrate selection in a ground-nesting bird (Japanese quail, Coturnix japonica). For such species, threat from visual predators is high [6] and egg appearance shows strong between-female variation [7]. In quail, variation in appearance is particularly obvious in the amount of dark maculation on the light-colored shell [8]. When given a choice, birds consistently selected laying substrates that made visual detection of their egg outline most challenging. However, the strategy for maximizing camouflage varied with the degree of egg maculation. Females laying heavily maculated eggs selected the substrate that more closely matched egg maculation color properties, leading to camouflage through disruptive coloration. For lightly maculated eggs, females chose a substrate that best matched their egg background coloration, suggesting background matching. Our results show that quail “know” their individual egg patterning and seek out a nest position that provides most effective camouflage for their individual phenotype.
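The decision rule the abstract describes (match maculation colour when heavily maculated, shell background colour when lightly maculated) can be sketched as a simple colour-matching choice. Everything here is illustrative: the field names, the 0.3 maculation threshold, and the Euclidean RGB distance are assumptions, not the study's methodology:

```python
def choose_substrate(egg, substrates, maculation_threshold=0.3):
    """Return the substrate whose colour best matches the visually
    dominant egg feature: maculation colour for heavily maculated eggs
    (disruptive coloration), shell background colour for lightly
    maculated ones (background matching). Colours are (R, G, B) tuples."""
    def colour_distance(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

    heavily_maculated = egg["maculation_fraction"] > maculation_threshold
    target = (egg["maculation_colour"] if heavily_maculated
              else egg["background_colour"])
    return min(substrates, key=lambda s: colour_distance(s["colour"], target))

# Hypothetical egg and substrates
dark_egg = {"maculation_fraction": 0.5,
            "maculation_colour": (40, 30, 20),
            "background_colour": (200, 180, 150)}
substrates = [{"name": "peat", "colour": (50, 40, 30)},
              {"name": "sand", "colour": (190, 185, 160)}]
best = choose_substrate(dark_egg, substrates)  # best["name"] == "peat"
```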

    Incorporating Environmental Impacts in the Measurement of Agricultural Productivity Growth

    Agricultural production is known to have environmental impacts, both adverse and beneficial, and it is desirable to incorporate at least some of these impacts in an environmentally sensitive productivity index. In this paper, we construct indicators of water contamination from the use of agricultural chemicals. These environmental indicators are merged with data on marketed outputs and purchased inputs to form a state-by-year panel of relative levels of outputs and inputs, including environmental impacts. We do not have prices for these undesirable by-products, since they are not marketed. Consequently, we calculate a series of Malmquist productivity indexes, which do not require price information. Our benchmark scenario is a conventional Malmquist productivity index based on marketed outputs and purchased inputs only. Our comparison scenarios consist of environmentally sensitive Malmquist productivity indexes that include indicators of risk to human health and to aquatic life from chronic exposure to pesticides. In addition, we derive a set of virtual prices of the undesirable by-products that can be used to calculate an environmentally sensitive Fisher index of productivity change. Keywords: environmental impacts, productivity growth, Environmental Economics and Policy
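The Malmquist index the abstract relies on is a geometric mean of two productivity-change ratios, each measured against one period's best-practice frontier via output distance functions. In applications like this paper's, the distance functions are computed with data envelopment analysis over many outputs and inputs; the sketch below collapses to the single-input, single-output case (the data are invented), where the distance function is just observed productivity divided by the period's best ratio:

```python
import math

def malmquist(obs_t, obs_t1, data_t, data_t1):
    """Geometric-mean Malmquist productivity index for one input and
    one output. obs_* = (input, output) for the unit of interest;
    data_* = all units observed in that period (they define the frontier)."""
    dist = lambda x, y, frontier: (y / x) / frontier  # output distance function
    f_t  = max(y / x for x, y in data_t)    # period-t best-practice ratio
    f_t1 = max(y / x for x, y in data_t1)   # period-t+1 best-practice ratio
    m_t  = dist(*obs_t1, f_t)  / dist(*obs_t, f_t)    # index under t technology
    m_t1 = dist(*obs_t1, f_t1) / dist(*obs_t, f_t1)   # index under t+1 technology
    return math.sqrt(m_t * m_t1)

# A state producing 6 output units from 10 inputs, up from 5 the year before
growth = malmquist((10, 5), (10, 6),
                   [(10, 5), (8, 6)], [(10, 6), (8, 7)])  # 1.2
```

In this one-dimensional case the frontiers cancel and the index reduces to the total-factor-productivity ratio (here 1.2, i.e. 20% growth); with multiple outputs and inputs the two frontier evaluations genuinely differ, which is why no price data are needed.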

    3D Scanning System for Automatic High-Resolution Plant Phenotyping

    Thin leaves, fine stems, self-occlusion, non-rigid and slowly changing structures make plants difficult for three-dimensional (3D) scanning and reconstruction -- two critical steps in automated visual phenotyping. Many current solutions such as laser scanning, structured light, and multiview stereo can struggle to acquire usable 3D models because of limitations in scanning resolution and calibration accuracy. In response, we have developed a fast, low-cost, 3D scanning platform to image plants on a rotating stage with two tilting DSLR cameras centred on the plant. This uses new methods of camera calibration and background removal to achieve high-accuracy 3D reconstruction. We assessed the system's accuracy using a 3D visual hull reconstruction algorithm applied on 2 plastic models of dicotyledonous plants, 2 sorghum plants and 2 wheat plants across different sets of tilt angles. Scan times ranged from 3 minutes (to capture 72 images using 2 tilt angles), to 30 minutes (to capture 360 images using 10 tilt angles). The leaf lengths, widths, areas and perimeters of the plastic models were measured manually and compared to measurements from the scanning system: results were within 3-4% of each other. The 3D reconstructions obtained with the scanning system show excellent geometric agreement with all six plant specimens, even plants with thin leaves and fine stems. Comment: 8 pages, DICTA 201
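The 3-4% agreement claim amounts to a per-trait relative-error check of scanner measurements against manual ground truth. A small sketch of that validation, with invented trait names and values chosen only for illustration:

```python
def within_tolerance(manual, scanned, tol_pct=4.0):
    """For each trait, compute the relative error (%) of the scanned
    value against the manual ground truth and flag whether it falls
    inside the tolerance the system reports (3-4%)."""
    report = {}
    for trait, true_val in manual.items():
        err = abs(scanned[trait] - true_val) / true_val * 100.0
        report[trait] = (round(err, 2), err <= tol_pct)
    return report

# Hypothetical leaf measurements for one plastic plant model
manual  = {"leaf_length_mm": 120.0, "leaf_width_mm": 45.0, "leaf_area_mm2": 4200.0}
scanned = {"leaf_length_mm": 123.5, "leaf_width_mm": 44.1, "leaf_area_mm2": 4330.0}
report = within_tolerance(manual, scanned)  # all errors under 4%
```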