Ambiguity in guideline definitions introduces assessor bias and influences consistency in IUCN Red List status assessments
The IUCN Red List is the most widely used tool to measure extinction risk and report biodiversity trends. Accurate and standardized conservation status assessments for the IUCN Red List are limited by a lack of adequate information and require consistent, unbiased interpretation of that information. Variable interpretation stems from a lack of quantified thresholds in certain areas of the Red List guidelines. Thus, even in situations with sufficient information to make a Red List assessment, inconsistency can occur when experts, especially from different regions, interpret the guidelines differently, thereby undermining the goals and credibility of the process. Assessors make assumptions depending on their level of Red List experience (subconscious bias) and their personal values or agendas (conscious bias). We highlight two major issues where such bias influences assessments: fenced subpopulations that require intensive management, and the definition of benchmark geographic distributions and thus the inclusion or exclusion of introduced subpopulations. We suggest assessor bias can be reduced by refining the Red List guidelines to include quantified thresholds for when to include fenced/intensively managed subpopulations or subpopulations outside the benchmark distribution; publishing case studies of difficult assessments to enhance cohesion between Specialist Groups; developing an online accreditation course on applying Red List criteria as a prerequisite for assessors; and ensuring that assessments of species subject to trade and utilization represent all dissenting views (for example, both utilitarian and preservationist) and are reviewed by relevant Specialist Groups. We believe these interventions would ensure consistent, reliable assessments of threatened species between regions and across assessors with divergent views, and would thus improve comparisons between taxa and counteract the use of Red List assessments as a tool to leverage applied agendas.
University of Bangor, University of Pretoria, CIB, the Scientific Authority of the South African National Biodiversity Institute
Resolving uncertainties in foraminifera-based relative sea-level reconstruction: a case study from southern New Zealand
Since the pioneering work of David Scott and others in the 1970s and 1980s, foraminifera have been used to develop precise sea-level reconstructions from salt marshes around the world. In New Zealand, reconstructions feature rapid rates of sea-level rise during the early to mid-20th century. Here, we test whether infaunality, taphonomy, and sediment compaction influence these reconstructions. We find that surface (0–1 cm) and subsurface (3–4 cm) foraminiferal assemblages show a high degree of similarity. A landward shift in assemblage zones is consistent with recent sea-level rise and transgression. Changes associated with infaunality and taphonomy do not affect transfer function-based sea-level reconstructions. Applying a geotechnical modelling approach to the core from which sea-level changes were reconstructed, we demonstrate that compaction is also negligible, resulting in a maximum post-depositional lowering of 2.5 mm. We conclude that salt-marsh foraminifera are indeed highly accurate and precise indicators of past sea levels.
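To make the transfer-function step above concrete: a minimal weighted-averaging sketch in Python, assuming a hypothetical modern training set of foraminiferal relative abundances with known elevations. The study's actual method may differ (weighted-averaging partial least squares and Bayesian transfer functions are common alternatives), so treat this as an illustration of the principle only.

import numpy as np

def wa_transfer_function(modern_counts, modern_elev, fossil_counts):
    """Classic weighted-averaging (WA) transfer function.
    modern_counts: (n, taxa) relative abundances from surface samples
    modern_elev:   (n,) elevations of those samples above a tidal datum
    fossil_counts: (m, taxa) relative abundances from the core
    Returns (m,) reconstructed palaeo-marsh elevations."""
    # Species optima: abundance-weighted mean elevation of each taxon.
    optima = (modern_counts.T @ modern_elev) / modern_counts.sum(axis=0)
    # Initial estimates: abundance-weighted mean of the taxon optima.
    est_modern = (modern_counts @ optima) / modern_counts.sum(axis=1)
    est_fossil = (fossil_counts @ optima) / fossil_counts.sum(axis=1)
    # Deshrinking: averaging twice pulls estimates toward the mean,
    # so rescale with a linear fit over the modern training set.
    slope, intercept = np.polyfit(est_modern, modern_elev, 1)
    return slope * est_fossil + intercept

# Hypothetical example: 30 modern samples and 10 core samples, 5 taxa.
rng = np.random.default_rng(1)
modern = rng.dirichlet(np.ones(5), size=30)
elev = rng.uniform(0.0, 2.0, size=30)
core = rng.dirichlet(np.ones(5), size=10)
print(wa_transfer_function(modern, elev, core))

Subtracting the reconstructed elevation from the dated sample depth (with a compaction correction such as the geotechnical modelling described above) then yields relative sea level.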
Voltage-controlled superparamagnetic ensembles for low-power reservoir computing
We propose thermally driven, voltage-controlled superparamagnetic ensembles as low-energy platforms for hardware-based reservoir computing. In the proposed devices, thermal noise is used to drive the ensembles' magnetization dynamics, while control of their net magnetization states is provided by strain-mediated voltage inputs. Using an ensemble of CoFeB nanodots as an example, we use analytical models and micromagnetic simulations to demonstrate how such a device can function as a reservoir and perform two benchmark machine learning tasks (spoken digit recognition and chaotic time series prediction) with competitive performance. Our results indicate robust performance on timescales from microseconds to milliseconds, potentially allowing such a reservoir to be tuned to perform a wide range of real-time tasks, from decision making in driverless cars (fast) to speech recognition (slow). The low energy consumption expected for such a device makes it an ideal candidate for use in edge computing applications that require low latency and power.
The authors thank the Engineering and Physical Sciences Research Council (Grant Nos. EP/S009647/1 and EP/V006339/1) for financial support. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation programme under Grant Agreement No. 861618 (SpinENGINE).
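For readers unfamiliar with the reservoir-computing paradigm the device above implements: a minimal software sketch of a generic echo-state reservoir with a trained ridge-regression readout. It illustrates the train-only-the-readout principle that makes physical reservoirs attractive; the random tanh network below is a stand-in, not the authors' superparamagnetic ensemble or micromagnetic model.

import numpy as np

rng = np.random.default_rng(0)

# Generic echo-state reservoir: fixed random dynamics, trained linear readout.
N = 200                                    # reservoir nodes
W_in = rng.uniform(-0.5, 0.5, N)           # input weights (fixed, random)
W = rng.normal(0.0, 1.0, (N, N))           # recurrent weights (fixed, random)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with scalar series u; return the state history."""
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + W_in * u_t)
        states[t] = x
    return states

# Toy one-step-ahead prediction task (a stand-in for the chaotic
# time-series benchmark mentioned in the abstract).
t = np.linspace(0.0, 60.0, 3000)
u = np.sin(t) * np.sin(0.51 * t)
X, y = run_reservoir(u[:-1]), u[1:]

# Ridge regression on reservoir states: the only trained component.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
print("train NMSE:", np.mean((X @ W_out - y) ** 2) / np.var(y))

In the proposed hardware, the role of the tanh network is played by the thermally fluctuating magnetization of the nanodot ensemble, with strain-mediated voltage pulses providing the input drive.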
Animal welfare considerations for using large carnivores and guardian dogs as vertebrate biocontrol tools against other animals
Introducing consumptive and non-consumptive effects into food webs can have profound effects on individuals, populations and communities. This knowledge has led to the deliberate use of predation and/or fear of predation as an emerging technique for controlling wildlife. Many now advocate for the intentional use of large carnivores and livestock guardian dogs as more desirable alternatives to traditional wildlife control approaches like fencing, shooting, trapping, or poisoning. However, there has been very little consideration of the animal welfare implications of deliberately using predation as a wildlife management tool. We assess the animal welfare impacts of using dingoes, leopards and guardian dogs as biocontrol tools against wildlife in Australia and South Africa following the "Five Domains" model commonly used to assess other wildlife management tools. Application of this model indicates that large carnivores and guardian dogs cause considerable lethal and non-lethal animal welfare impacts to the individual animals they are intended to control. These impacts are likely similar across different predator-prey systems, but are dependent on specific predator-prey combinations; combinations that result in short chases and quick kills will be rated as less harmful than those that result in long chases and protracted kills. Moreover, these impacts are typically rated greater than those caused by traditional wildlife control techniques. The intentional lethal and non-lethal harms caused by large carnivores and guardian dogs should not be ignored or dismissively assumed to be negligible. A greater understanding of the impacts they impose would benefit from empirical studies of the animal welfare outcomes arising from their use in different contexts.
Two-dimensional Quantum-Corrected Eternal Black Hole
The one-loop quantum corrections to the geometry and thermodynamics of a black hole are studied for the two-dimensional RST model. We choose boundary conditions corresponding to an eternal black hole in thermal equilibrium with its Hawking radiation. The equations of motion are integrated exactly. One of the solutions obtained is a constant-curvature space-time with a constant dilaton; such a solution is absent in the classical theory. On the other hand, we derive the quantum-corrected metric, written in Schwarzschild-like form, which is a deformation of the classical black hole solution. The space-time singularity turns out to be milder than in the classical case, and the solution admits two asymptotically flat black hole space-times lying on "different sides" of the singularity. The thermodynamics of the classical black hole and of its quantum counterpart is formulated. The thermodynamical quantities (energy, temperature, entropy) are calculated and turn out to be the same for both the classical and the quantum-corrected black holes, so no quantum corrections to the thermodynamics are observed. The possible relevance of these results to the four-dimensional case is discussed.
Comment: LaTeX, 28 pages; minor corrections in text and abstract made and new references added
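For context, the RST model named above is the CGHS dilaton-gravity action supplemented by the one-loop Polyakov term and a local counterterm that restores exact solvability. In one common normalization (coefficient conventions vary between papers, e.g. \kappa = (N-24)/12 also appears; matter kinetic terms are omitted):

S_{\rm RST} = \frac{1}{2\pi}\int d^2x\,\sqrt{-g}\,e^{-2\phi}\left[R + 4(\nabla\phi)^2 + 4\lambda^2\right]
            - \frac{\kappa}{8\pi}\int d^2x\,\sqrt{-g}\,R\,\frac{1}{\Box}\,R
            - \frac{\kappa}{4\pi}\int d^2x\,\sqrt{-g}\,\phi R,
  \qquad \kappa = \frac{N}{12},

where \phi is the dilaton, \lambda sets the cosmological scale, and N counts the matter fields whose one-loop backreaction the nonlocal Polyakov term encodes.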
An action for the exact string black hole
A local action is constructed describing the exact string black hole discovered by Dijkgraaf, Verlinde and Verlinde in 1992. It turns out to be a special 2D Maxwell-dilaton gravity theory, linear in curvature and field strength. Two constants of motion exist: the mass M > 1, determined by the level k, and the U(1) charge Q > 0, determined by the value of the dilaton at the origin. The ADM mass, Hawking temperature T_H \propto \sqrt{1-1/M} and Bekenstein-Hawking entropy are derived and studied in detail. Winding/momentum mode duality implies the existence of a similar action, arising from a branch ambiguity, which describes the exact string naked singularity. In the strong coupling limit the solution dual to AdS_2 is found to be the 5D Schwarzschild black hole. Some applications to black hole thermodynamics and 2D string theory are discussed, and generalizations (supersymmetric extension, coupling to matter and critical collapse, quantization) are pointed out.
Comment: 41 pages, 2 eps figures, dedicated to Wolfgang Kummer on the occasion of his retirement (Emeritierung); v2: added ref; v3: extended discussion in sections 3.2, 3.3 and at the end of 5.3 by adding 2 pages of clarifying text; updated refs; corrected typo
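The quoted temperature law already fixes the qualitative behaviour; with the overall constant left undetermined (the abstract gives only the proportionality):

T_H \;\propto\; \sqrt{1 - \frac{1}{M}} \;\;\Longrightarrow\;\; T_H \to 0 \;\;(M \to 1^{+}), \qquad T_H \to {\rm const} \;\;(M \to \infty),

so the temperature vanishes at the minimal mass M = 1 and saturates for heavy black holes, in contrast to the 4D Schwarzschild scaling T_H \propto 1/M.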
Should science educators deal with the science/religion issue?
I begin by examining the natures of science and religion before looking at the ways in which they relate to one another. I then look at a number of case studies that centre on the relationships between science and religion, including attempts to find mechanisms for divine action in quantum theory and chaos theory, creationism, genetic engineering, and the writings of Richard Dawkins. Finally, I consider some of the pedagogical issues that would need to be considered if the science/religion issue is to be addressed in the classroom. I conclude that there are increasingly strong arguments in favour of science educators teaching about the science/religion issue, the principal reason being to help students learn science better. However, such teaching makes greater demands on science educators than has generally been the case. Certain of these demands are identified, and some specific suggestions are made as to how a science educator might deal with the science/religion issue.
Search for direct production of charginos and neutralinos in events with three leptons and missing transverse momentum in √s = 7 TeV pp collisions with the ATLAS detector
A search for the direct production of charginos and neutralinos in final states with three electrons or muons and missing transverse momentum is presented. The analysis is based on 4.7 fb^-1 of proton-proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with Standard Model expectations in three signal regions that are either depleted or enriched in Z-boson decays. Upper limits at 95% confidence level are set in R-parity conserving phenomenological minimal supersymmetric models and in simplified models, significantly extending previous results.
Jet size dependence of single jet suppression in lead-lead collisions at sqrt(s(NN)) = 2.76 TeV with the ATLAS detector at the LHC
Measurements of inclusive jet suppression in heavy ion collisions at the LHC provide direct sensitivity to the physics of jet quenching. In a sample of lead-lead collisions at sqrt(s_NN) = 2.76 TeV corresponding to an integrated luminosity of approximately 7 inverse microbarns, ATLAS has measured jets with a calorimeter over the pseudorapidity interval |eta| < 2.1 and over the transverse momentum range 38 < pT < 210 GeV. Jets were reconstructed using the anti-kt algorithm with values of the distance parameter, which determines the nominal jet radius, of R = 0.2, 0.3, 0.4 and 0.5. The centrality dependence of the jet yield is characterized by the jet "central-to-peripheral ratio", Rcp. Jet production is found to be suppressed by approximately a factor of two in the 10% most central collisions relative to peripheral collisions. Rcp varies smoothly with centrality, as characterized by the number of participating nucleons. The observed suppression is only weakly dependent on jet radius and transverse momentum. These results provide the first direct measurement of inclusive jet suppression in heavy ion collisions and complement previous measurements of dijet transverse energy imbalance at the LHC.
Comment: 15 pages plus author list (30 pages total), 8 figures, 2 tables, submitted to Physics Letters B. All figures including auxiliary figures are available at http://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/HION-2011-02
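To make the "central-to-peripheral ratio" above concrete: a minimal sketch, assuming hypothetical per-centrality jet yields and Glauber-model N_coll values. The normalization follows the standard Rcp definition; all numbers are placeholders, not ATLAS data.

import numpy as np

def r_cp(yield_cent, ncoll_cent, yield_peri, ncoll_peri):
    """Central-to-peripheral ratio of per-binary-collision jet yields.
    Rcp = 1 means no suppression; Rcp ~ 0.5 is a factor-two suppression."""
    return (yield_cent / ncoll_cent) / (yield_peri / ncoll_peri)

# Placeholder jet yields per pT bin (arbitrary units) for a 0-10% central
# and a 60-80% peripheral class, with hypothetical Glauber N_coll values.
yield_0_10 = np.array([5.1e3, 1.2e3, 2.4e2])
yield_60_80 = np.array([6.0e2, 1.4e2, 2.7e1])
print(r_cp(yield_0_10, 1500.0, yield_60_80, 90.0))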
Measurement of the polarisation of W bosons produced with large transverse momentum in pp collisions at sqrt(s) = 7 TeV with the ATLAS experiment
This paper describes an analysis of the angular distribution of W->enu and W->munu decays, using data from pp collisions at sqrt(s) = 7 TeV recorded with the ATLAS detector at the LHC in 2010, corresponding to an integrated luminosity of about 35 pb^-1. Using the decay lepton transverse momentum and the missing transverse energy, the W decay angular distribution projected onto the transverse plane is obtained and analysed in terms of the helicity fractions f0, fL and fR over two ranges of W transverse momentum (ptw): 35 < ptw < 50 GeV and ptw > 50 GeV. Good agreement is found with theoretical predictions. For ptw > 50 GeV, the values of f0 and fL-fR, averaged over charge and lepton flavour, are measured to be f0 = 0.127 +/- 0.030 +/- 0.108 and fL-fR = 0.252 +/- 0.017 +/- 0.030, where the first uncertainties are statistical and the second include all systematic effects.
Comment: 19 pages plus author list (34 pages total), 9 figures, 11 tables, revised author list, matches European Physical Journal C version
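For reference, the helicity fractions quoted above are defined through the standard W decay angular distribution in the W rest frame (the analysis uses its projection onto the transverse plane; upper signs for W+, lower for W-):

\frac{1}{\sigma}\frac{d\sigma}{d\cos\theta^{*}} = \frac{3}{8}f_{L}(1 \mp \cos\theta^{*})^{2} + \frac{3}{8}f_{R}(1 \pm \cos\theta^{*})^{2} + \frac{3}{4}f_{0}\sin^{2}\theta^{*}, \qquad f_{L} + f_{R} + f_{0} = 1,

where \theta^{*} is the angle between the charged lepton and the W flight direction.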
- …