Understanding citizen science and environmental monitoring: final report on behalf of UK Environmental Observation Framework
Citizen science can broadly be defined as the involvement of volunteers in science. Over the past decade there has
been a rapid increase in the number of citizen science initiatives. The breadth of environmental-based citizen
science is immense. Citizen scientists have surveyed for and monitored a broad range of taxa, and have also
contributed data on weather and habitats, reflecting an increase in engagement with a diverse range of observational science.
Citizen science has taken many varied approaches from citizen-led (co-created) projects with local community
groups to, more commonly, scientist-led mass participation initiatives that are open to all sectors of society. Citizen
science provides an indispensable means of combining environmental research with environmental education and
wildlife recording.
Here we provide a synthesis of extant citizen science projects, using a novel cross-cutting approach to objectively
assess understanding of citizen science and environmental monitoring, including:
1. A brief overview of knowledge on the motivations of volunteers.
2. A semi-systematic review of environmental citizen science projects in order to understand the variety of extant citizen science projects.
3. Collation of detailed case studies on a selection of projects to complement the semi-systematic review.
4. Structured interviews with users of citizen science and environmental monitoring data, focussing on policy, in order to more fully understand how citizen science can fit into policy needs.
5. A review of technology in citizen science and an exploration of future opportunities.
Botulinum Neurotoxin Serotypes Detected by Electrochemical Impedance Spectroscopy
Botulinum neurotoxin is one of the deadliest biological toxins known and causes the debilitating disease botulism. Rapid detection of the different serotypes of botulinum neurotoxin is essential both for diagnosing botulism and for identifying the presence of the toxin in potential cases of terrorism and food contamination. The modes of action of botulinum neurotoxins are well established in the literature and differ for each serotype. The toxins specifically cleave portions of the SNARE proteins SNAP-25 or VAMP, an interaction that can be monitored by electrochemical impedance spectroscopy. This study presents SNAP-25 and VAMP biosensors for detecting the activity of five botulinum neurotoxin serotypes (A-E) using electrochemical impedance spectroscopy. The biosensors can detect toxin concentrations as low as 25 fg/mL, in a short time frame compared with the current standard methods of detection. Both biosensors show greater specificity for their compatible serotypes than for incompatible serotypes and denatured toxins.
Reddening law and interstellar dust properties along Magellanic sight-lines
This study establishes that SMC, LMC and Milky Way extinction curves obey the
same extinction law, which depends on the 2200 Å bump size and one parameter, and
generalizes the Cardelli, Clayton and Mathis (1989) relationship. This suggests
that extinction in all three galaxies is of the same nature. The role of linear
reddening laws over the whole visible/UV wavelength range, particularly important
in the SMC but also present in the LMC and in the Milky Way, is also
highlighted and discussed.
Comment: accepted for publication in Astrophysics and Space Science. 16 pages, 12 figures. Some figures are colour plots.
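The Cardelli, Clayton and Mathis (1989) relationship that the study generalizes can be made concrete: extinction is parameterized as A(λ)/A(V) = a(x) + b(x)/R_V, with x = 1/λ in inverse microns. The sketch below implements only the CCM89 optical/near-UV polynomial fit, as a reference point, and is not the paper's generalized Magellanic law:

```python
def ccm89_optical(x, r_v=3.1):
    """CCM89 extinction A(lambda)/A(V) in the optical/NIR regime,
    valid for x = 1/lambda in [1.1, 3.3] inverse microns.
    A(lambda)/A(V) = a(x) + b(x)/R_V, with a, b seventh-order
    polynomials in y = x - 1.82."""
    if not 1.1 <= x <= 3.3:
        raise ValueError("optical/NIR fit valid only for 1.1 <= x <= 3.3")
    y = x - 1.82
    a = (1.0 + 0.17699*y - 0.50447*y**2 - 0.02427*y**3 + 0.72085*y**4
         + 0.01979*y**5 - 0.77530*y**6 + 0.32999*y**7)
    b = (1.41338*y + 2.28305*y**2 + 1.07233*y**3 - 5.38434*y**4
         - 0.62251*y**5 + 5.30260*y**6 - 2.09002*y**7)
    return a + b / r_v

# At the V band (lambda = 0.55 um, so x ~ 1.82) the ratio is ~1 by construction,
# and extinction rises toward shorter wavelengths.
print(ccm89_optical(1 / 0.55))  # close to 1
print(ccm89_optical(2.3) > ccm89_optical(1.5))
```

The single free parameter R_V is what makes the family one-parameter; the paper's point is that adding the 2200 Å bump size as a second parameter covers the Magellanic sight-lines as well.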
Self-stabilizing algorithms for Connected Vertex Cover and Clique decomposition problems
In many wireless networks, there is neither a fixed physical backbone nor
centralized network management. The nodes of such a network have to
self-organize in order to maintain a virtual backbone used to route messages.
Moreover, any node of the network can a priori be the origin of a malicious
attack. Thus, on the one hand the backbone must be fault-tolerant, and on the
other hand it can be useful to monitor all network communications to identify
an attack as soon as possible. We are interested in the minimum
\emph{Connected Vertex Cover} problem, a generalization of the classical
minimum Vertex Cover problem, which yields a connected backbone. Recently,
Delbot et al.~\cite{DelbotLP13} proposed a new centralized algorithm with a
constant approximation ratio for this problem. In this paper, we propose a
distributed and self-stabilizing version of their algorithm with the same
approximation guarantee. To the best of the authors' knowledge, it is the first
distributed and fault-tolerant algorithm for this problem. The approach
followed to solve the considered problem is based on the construction of a
connected minimal clique partition. Therefore, we also design the first
distributed self-stabilizing algorithm for this problem, which is of
independent interest.
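As background for the problem the abstract addresses, the classical (unconnected) Vertex Cover problem admits a simple centralized 2-approximation via maximal matching. The sketch below shows that baseline only; it is not the distributed, self-stabilizing algorithm of the paper, nor Delbot et al.'s connected variant:

```python
def matching_vertex_cover(edges):
    """Classical centralized 2-approximation for Vertex Cover:
    greedily build a maximal matching and take both endpoints of
    every matched edge. Every edge has at least one matched endpoint,
    so this is a cover; any cover must contain one endpoint per
    matched edge, so the result is at most twice optimal.
    Illustrative baseline only -- NOT the paper's algorithm."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))  # edge is unmatched: take both endpoints
    return cover

# Path graph 1-2-3-4: the optimum cover is {2, 3} (size 2);
# the matching heuristic returns a cover of size at most 4.
edges = [(1, 2), (2, 3), (3, 4)]
cover = matching_vertex_cover(edges)
print(all(u in cover or v in cover for (u, v) in edges))  # True
```

The connected variant adds the requirement that the cover induce a connected subgraph (the routing backbone), which is what the clique-partition construction in the paper is used for.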
Towards Machine Wald
The past century has seen a steady increase in the need to estimate and
predict complex systems and to make (possibly critical) decisions with
limited information. Although computers have made possible the numerical
evaluation of sophisticated statistical models, these models are still designed
\emph{by humans} because there is currently no known recipe or algorithm for
dividing the design of a statistical model into a sequence of arithmetic
operations. Indeed, enabling computers to \emph{think} as \emph{humans} do
when faced with uncertainty is challenging in several major ways:
(1) finding optimal statistical models remains to be formulated as a well-posed
problem when information on the system of interest is incomplete and comes in
the form of a complex combination of sample data, partial knowledge of
constitutive relations and a limited description of the distribution of input
random variables; (2) the space of admissible scenarios, along with the space of
relevant information, assumptions, and/or beliefs, tends to be infinite-dimensional,
whereas calculus on a computer is necessarily discrete and finite.
To this end, this paper explores the foundations of a rigorous framework
for the scientific computation of optimal statistical estimators/models and
reviews their connections with Decision Theory, Machine Learning, Bayesian
Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty
Quantification and Information Based Complexity.
Comment: 37 pages
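The Wald-style decision theory the title alludes to ranks estimators by their worst-case risk. A textbook illustration (not the paper's framework): for estimating a Bernoulli parameter p from n trials under squared-error loss, the sample mean has worst-case risk 1/(4n), while the estimator (S + sqrt(n)/2)/(n + sqrt(n)) has constant risk 1/(4(1 + sqrt(n))^2) for every p, and is minimax:

```python
import math
import random

def minimax_est(s, n):
    # Textbook minimax estimator of a Bernoulli p under squared-error loss,
    # given s successes in n trials. Its risk is constant in p.
    return (s + math.sqrt(n) / 2) / (n + math.sqrt(n))

def empirical_risk(estimator, p, n, trials=200_000, seed=0):
    # Monte Carlo estimate of E[(estimator(S, n) - p)^2] with S ~ Binomial(n, p).
    rng = random.Random(seed)
    err = 0.0
    for _ in range(trials):
        s = sum(rng.random() < p for _ in range(n))
        err += (estimator(s, n) - p) ** 2
    return err / trials

n = 4
print(1 / (4 * n))                          # worst-case risk of the sample mean: 0.0625
print(1 / (4 * (1 + math.sqrt(n)) ** 2))    # constant risk of the minimax estimator: ~0.0278
print(empirical_risk(minimax_est, 0.5, n))  # Monte Carlo check of the constant risk
```

Automating the selection of such optimal estimators when the model class and information structure are themselves uncertain is, roughly, the "Machine Wald" program the paper outlines.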
A First Search for coincident Gravitational Waves and High Energy Neutrinos using LIGO, Virgo and ANTARES data from 2007
We present the results of the first search for gravitational wave bursts
associated with high energy neutrinos. Together, these messengers could reveal
new, hidden sources that are not observed by conventional photon astronomy,
particularly at high energy. Our search uses neutrinos detected by the
underwater neutrino telescope ANTARES in its 5-line configuration during the
period January - September 2007, which coincided with the fifth and first
science runs of LIGO and Virgo, respectively. The LIGO-Virgo data were analysed
for candidate gravitational-wave signals coincident in time and direction with
the neutrino events. No significant coincident events were observed. We place
limits on the density of joint high energy neutrino - gravitational wave
emission events in the local universe, and compare them with densities of
merger and core-collapse events.
Comment: 19 pages, 8 figures. Science summary page at
http://www.ligo.org/science/Publication-S5LV_ANTARES/index.php. Public access
area to figures and tables at
https://dcc.ligo.org/cgi-bin/DocDB/ShowDocument?docid=p120000
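The core operation described above, matching candidate gravitational-wave signals to neutrino events coincident in time, can be sketched in miniature. The actual LIGO-Virgo/ANTARES analysis also required directional overlap and used a ranked detection statistic; the 500 s window and the event times below are hypothetical, for illustration only:

```python
import bisect

def count_time_coincidences(neutrino_times, gw_candidate_times, window=500.0):
    """Count neutrino events having at least one GW candidate within
    +/- `window` seconds. Minimal illustrative sketch of a time-coincidence
    search; the real analysis adds sky-direction consistency cuts."""
    gw = sorted(gw_candidate_times)
    n_coincident = 0
    for t in neutrino_times:
        # First GW candidate at or after the start of the window around t.
        i = bisect.bisect_left(gw, t - window)
        if i < len(gw) and gw[i] <= t + window:
            n_coincident += 1
    return n_coincident

# Hypothetical event times (seconds): only the first pair falls inside the window.
print(count_time_coincidences([1000.0, 50_000.0], [1200.0, 90_000.0]))  # 1
```

With no significant coincidences observed, as the abstract reports, such counts feed an upper limit on the local density of joint emission events.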
and in the Two Higgs Doublet Model with Flavor Changing Neutral Currents
A study of and is presented in the context of a Two Higgs Doublet
Model (2HDM) with flavor changing scalar currents (FCSC). Implications of the
model for the -parameter and for are also considered. The
experimental data on places stringent constraints on the model
parameters. The configuration of the model needed to account for is found
to be irreconcilable with constraints from and . In
particular, if $R_b^{\rm exp} > R_b^{\rm SM}$ persists, then this version of the
2HDM will be ruled out or require significant modifications. Noting that
aspects of the experimental analysis for and may be of some
concern, we also disregard and and give
predictions for these using constraints from and
parameter only. We emphasize the theoretical and experimental advantages of the
observable $R_{b+c} \equiv \Gamma(Z\to b\bar b \mbox{ or } c\bar c)/\Gamma(Z\to \mbox{hadrons})$.
We also stress the role of $R_\ell \equiv \Gamma(Z\to \mbox{hadrons})/\Gamma(Z\to \ell^+\ell^-)$
in testing the Standard Model (SM) despite its dependence on QCD corrections.
Noting that in models
with FCNC the amplitude for receives a contribution which grows
with , the importance and uniqueness of precision
measurements for constraining flavor changing currents is
underscored.
Comment: 35 pages, 5 Postscript figures, 10 Postscript files used in the TeX file, uses epsf.st