
    The use of opportunistic data for IUCN Red List assessments

    IUCN Red Lists are recognized worldwide as powerful instruments for the conservation of species. Quantitative criteria to standardize approaches for estimating population trends, geographic ranges and population sizes have been developed at global and sub-global levels. Little attention has been given to the data needed to estimate species trends and range sizes for IUCN Red List assessments. Few regions collect monitoring data in a structured way, and usually only for a limited number of taxa. Therefore, opportunistic data are increasingly used for estimating trends and geographic range sizes. Trend calculations use a range of proxies: (i) monitoring sentinel populations, (ii) estimating changes in available habitat, or (iii) statistical models of change based on opportunistic records. Geographic ranges have been determined using: (i) marginal occurrences, (ii) habitat distributions, (iii) range-wide occurrences, (iv) species distribution modelling (including site-occupancy models), and (v) process-based modelling. Red List assessments differ strongly among regions (Europe, Britain and Flanders, north Belgium). Across different taxonomic groups, IUCN criteria B and D resulted in the highest level of threat in European Red Lists. In Britain, this was the case for criteria D and A, while in Flanders criteria B and A resulted in the highest threat level. Among taxonomic groups, however, large differences in the use of IUCN criteria were revealed. We give examples from European, British and Flemish Red List assessments using opportunistic data and give recommendations for a more uniform use of IUCN criteria among regions and among taxonomic groups.
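    As a toy illustration of how an occupancy-based decline estimate from opportunistic records could be mapped onto IUCN criterion A2 thresholds (population reduction of ≄30% for VU, ≄50% for EN, ≄80% for CR), consider the sketch below. The grid-cell counts are hypothetical and the classification is simplified (criterion A also fixes the decline window at 10 years or three generations):

```python
# Sketch: classify a decline under IUCN criterion A2 from two-period
# grid-cell occupancy estimates (hypothetical counts, simplified rules).

def criterion_a2(decline_pct: float) -> str:
    """Map a percentage decline to an IUCN criterion A2 category."""
    if decline_pct >= 80:
        return "CR"
    if decline_pct >= 50:
        return "EN"
    if decline_pct >= 30:
        return "VU"
    return "LC/NT"

# Hypothetical: occupied 1-km grid cells in two assessment periods
cells_t0, cells_t1 = 420, 240
decline = 100 * (cells_t0 - cells_t1) / cells_t0
print(f"{decline:.1f}% decline -> {criterion_a2(decline)}")
```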

    Dephasing via stochastic absorption: A case study in Aharonov-Bohm oscillations

    The Aharonov-Bohm ring has been a mainstay of mesoscopic physics research since its inception. In this paper we have dwelt on the problem of dephasing of AB oscillations using a phenomenological model based on stochastic absorption. To calculate the conductance in the presence of inelastic scattering we have used the method due to Brouwer and Beenakker. We have shown that the conductance is symmetric under flux reversal and that the visibility of AB oscillations decays to zero as a function of the incoherence parameter, thus signalling dephasing in the system. Some comments are made on the relative merits of stochastic absorption with respect to the optical potential model, which has been used to mimic dephasing.
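    The visibility measure referred to above can be sketched numerically. The toy damped-cosine conductance below stands in for the actual Brouwer-Beenakker calculation; the functional form and parameter values are assumptions for illustration only:

```python
import math

# Toy model: AB conductance oscillation damped by an incoherence parameter;
# visibility = (G_max - G_min) / (G_max + G_min) decays toward zero.

def visibility(g_max: float, g_min: float) -> float:
    return (g_max - g_min) / (g_max + g_min)

def toy_conductance(flux: float, incoherence: float) -> float:
    """Coherent cosine oscillation with exponentially damped amplitude."""
    return 1.0 + math.exp(-incoherence) * math.cos(2 * math.pi * flux)

for p in (0.0, 1.0, 3.0):
    gs = [toy_conductance(f / 100, p) for f in range(100)]
    print(p, round(visibility(max(gs), min(gs)), 3))
```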

    Evolving the Face of a Criminal: How to Search a Face Space More Effectively

    Witnesses and victims of serious crime are often required to construct a facial composite, a visual likeness of a suspect’s face. The traditional method is for them to select individual facial features to build a face, but often these images are of poor quality. We have developed a new method whereby witnesses repeatedly select instances from an array of complete faces and a composite is evolved over time by searching a face model built using PCA. While past research suggests that the new approach is superior, performance is far from ideal. In the current research, face models are built which match a witness’s description of a target. It is found that such ‘tailored’ models promote better quality composites, presumably due to a more effective search, and also that smaller models may be even better. The work has implications for researchers who are using statistical modelling techniques for recognising faces.
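    The PCA face model underlying this kind of search can be sketched in a few lines. The random matrix below stands in for real face images, and the model size k is an arbitrary choice, not the value used in the study; the evolutionary search would then operate on the coefficient vector:

```python
import numpy as np

# Minimal PCA face-model sketch (random stand-in data, not real faces).
rng = np.random.default_rng(0)
faces = rng.normal(size=(100, 64 * 64))      # 100 training "faces", 64x64 px
mean = faces.mean(axis=0)
U, S, Vt = np.linalg.svd(faces - mean, full_matrices=False)

k = 20                                       # model dimensionality (arbitrary)
components = Vt[:k]                          # top-k "eigenfaces"
coeffs = (faces[0] - mean) @ components.T    # encode one face as k numbers
recon = mean + coeffs @ components           # a search evolves coeffs, not pixels
print(coeffs.shape, recon.shape)
```

    A smaller, description-tailored model shrinks the search space the witness has to navigate, which is one plausible reading of why the tailored models performed better.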

    A versatile cancer cell trapping and 1D migration assay in a microfluidic device

    Highly migratory cancer cells often lead to metastasis and recurrence and are responsible for the high mortality rates in many cancers despite aggressive treatment. Recently, the migratory behavior of patient-derived glioblastoma multiforme cells on microtracks has shown potential in predicting the likelihood of recurrence, while at the same time, antimetastasis drugs have been developed which require simple yet relevant high-throughput screening systems. However, robust in vitro platforms which can reliably seed single cells and measure their migration while mimicking the physiological tumor microenvironment have not been demonstrated. In this study, we demonstrate a microfluidic device which hydrodynamically seeds single cancer cells onto stamped or femtosecond laser ablated polystyrene microtracks, promoting 1D migratory behavior due to the cells' tendency to follow topographical cues. Using time-lapse microscopy, we found that single U87 glioblastoma multiforme cells migrated more slowly on laser ablated microtracks compared to stamped microtracks of equal width and spacing (p < 0.05) and exhibited greater directional persistence on both 1D patterns compared to flat polystyrene (p < 0.05). Single-cell morphologies also differed significantly between flat and 1D patterns, with cells on 1D substrates exhibiting higher aspect ratios and less circularity (p < 0.05). This microfluidic platform could lead to automated quantification of single-cell migratory behavior due to the high predictability of hydrodynamic seeding and guided 1D migration, an important step to realizing the potential of microfluidic migration assays for drug screening and individualized medicine. Published under license by AIP Publishing.
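    Two of the single-cell metrics reported above, circularity and directional persistence, are straightforward to compute from segmented outlines and tracked centroids. This sketch uses made-up track coordinates and the standard definitions (circularity = 4πA/PÂČ; persistence = net displacement / path length), not data or code from the study:

```python
import math

# Sketch: shape and motility metrics of the kind quantified in the study.

def circularity(area: float, perimeter: float) -> float:
    """4*pi*A / P^2 -- equals 1.0 for a perfect circle, lower if elongated."""
    return 4 * math.pi * area / perimeter ** 2

def persistence(path) -> float:
    """Net displacement divided by total path length for an (x, y) track."""
    net = math.dist(path[0], path[-1])
    total = sum(math.dist(a, b) for a, b in zip(path, path[1:]))
    return net / total if total else 0.0

track_1d = [(0, 0), (10, 0), (20, 0), (30, 0)]   # guided, straight track
track_2d = [(0, 0), (10, 0), (10, 10), (0, 10)]  # meandering track
print(persistence(track_1d), persistence(track_2d))
```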

    Nanoparticle tracking analysis of gold nanoparticles in aqueous media through an inter-laboratory comparison

    In the field of nanotechnology, analytical characterization plays a vital role in understanding the behavior and toxicity of nanomaterials (NMs). Characterization needs to be thorough and the technique chosen should be well-suited to the property to be determined, the material being analyzed and the medium in which it is present. Furthermore, the instrument operation and methodology need to be well-developed and clearly understood by the user to avoid data collection errors. Any discrepancies in the applied method or procedure can lead to differences and poor reproducibility of obtained data. This paper aims to clarify the method to measure the hydrodynamic diameter of gold nanoparticles by means of Nanoparticle Tracking Analysis (NTA). This study was carried out as an inter-laboratory comparison (ILC) amongst seven different laboratories to validate the standard operating procedure’s performance and reproducibility. The results obtained from this ILC study reveal the importance and benefits of detailed standard operating procedures (SOPs), best practice updates, user knowledge, and measurement automation.
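    NTA converts each tracked particle's Brownian diffusion coefficient into a hydrodynamic diameter via the Stokes-Einstein relation. A sketch, with an assumed diffusion coefficient and water at 25 °C standing in for the actual measurement conditions:

```python
import math

# Sketch: Stokes-Einstein conversion used by NTA, d_h = k_B*T / (3*pi*eta*D).
K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter(diff_coeff, temp_k=298.15, viscosity=0.00089):
    """diff_coeff in m^2/s; defaults assume water at 25 C; returns metres."""
    return K_B * temp_k / (3 * math.pi * viscosity * diff_coeff)

d_measured = 8.2e-12  # hypothetical NTA-tracked diffusion coefficient, m^2/s
print(f"{hydrodynamic_diameter(d_measured) * 1e9:.1f} nm")
```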

    Cliques in high-dimensional random geometric graphs

    Random geometric graphs have now become a popular object of research. Defined rather simply, these graphs describe real networks much better than classical ErdƑs–RĂ©nyi graphs due to their ability to produce tightly connected communities. The n vertices of a random geometric graph are points in d-dimensional Euclidean space, and two vertices are adjacent if they are close to each other. Many properties of these graphs have been revealed in the case when d is fixed. However, the case of growing dimension d is practically unexplored. This regime corresponds to a real-life situation when one has a data set of n observations with a significant number of features, a quite common case in data science today. In this paper, we study the clique structure of random geometric graphs when n → ∞ and d → ∞, and the average vertex degree grows significantly slower than n. We show that under these conditions, random geometric graphs do not contain cliques of size 4 a.s., provided d ≫ log^{1+Δ}(n). As for cliques of size 3, we present new bounds on the expected number of triangles in the case log^2(n) ≪ d ≪ log^3(n) that improve previously known results. In addition, we provide new numerical results showing that the underlying geometry can be detected using the number of triangles even for small n.
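    The kind of experiment described, counting triangles in a high-dimensional random geometric graph, can be sketched as follows. The values of n and d are small and arbitrary, the points are placed on the sphere S^{d-1}, and the connection radius is tuned to a 5% edge density; none of these choices are taken from the paper:

```python
import numpy as np

# Sketch: triangles in a random geometric graph with growing dimension.
rng = np.random.default_rng(1)
n, d = 200, 50
pts = rng.normal(size=(n, d))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)   # points on sphere S^{d-1}

dists = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
r = np.quantile(dists[np.triu_indices(n, 1)], 0.05)  # ~5% of pairs connected
A = ((dists < r) & ~np.eye(n, dtype=bool)).astype(int)

triangles = np.trace(A @ A @ A) // 6   # each triangle counted 6 times
print(int(A.sum()) // 2, "edges,", int(triangles), "triangles")
```

    Comparing this count against the ErdƑs–RĂ©nyi expectation at the same edge density is one simple way to "detect the geometry", as the abstract puts it.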

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg^2 field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg^2 with ÎŽ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320–1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg^2 region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey.
    The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.
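    The quoted single-visit (r ~ 24.5) and coadded (r ~ 27.5) depths are roughly consistent with idealized √N stacking of visits. A sketch of that scaling, where the ~180 r-band visits are an assumed rough per-band share of the ~800 total, not a number from the abstract:

```python
import math

# Sketch: idealized background-limited coadd depth, m_N = m_1 + 2.5*log10(sqrt(N)).
# Real survey depth gains deviate somewhat from pure sqrt(N) scaling.

def coadd_depth(single_visit_mag: float, n_visits: int) -> float:
    return single_visit_mag + 2.5 * math.log10(math.sqrt(n_visits))

# ~180 r-band visits (assumed share of ~800 total) on a 24.5 mag single visit
print(f"{coadd_depth(24.5, 180):.2f}")
```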

    Cross-ancestry genome-wide association analysis of corneal thickness strengthens link between complex and Mendelian eye diseases

    Central corneal thickness (CCT) is a highly heritable trait associated with complex eye diseases such as keratoconus and glaucoma. We perform a genome-wide association meta-analysis of CCT and identify 19 novel regions. In addition to adding support for known connective tissue-related pathways, pathway analyses uncover previously unreported gene sets. Remarkably, >20% of the CCT loci are near or within Mendelian disorder genes. These include FBN1, ADAMTS2 and TGFB2, which are associated with connective tissue disorders (Marfan, Ehlers-Danlos and Loeys-Dietz syndromes), and the LUM-DCN-KERA gene complex involved in myopia, corneal dystrophies and cornea plana. Using index CCT-increasing variants, we find a significant inverse correlation in effect sizes between CCT and keratoconus (r = -0.62, P = 5.30 × 10^-5) but not between CCT and primary open-angle glaucoma (r = -0.17, P = 0.2). Our findings provide evidence for shared genetic influences between CCT and keratoconus, and implicate candidate genes acting in collagen and extracellular matrix regulation.
    • 

    corecore