Technical Guidance Sheet (TGS) on normal levels of contaminants in English soils : copper (Cu) : technical guidance sheet supplementary information TGS03s, July 2012
Both the NSI (XRFS) and G-BASE data sets are derived from a soil sample that has been aggregated (composited) from a number of subsamples collected over the area of a site, rather than a single point sample. In the case of NSI this is 25 cores (subsamples) from a 20-m square (McGrath and Loveland 1992) whereas G-BASE is 5 cores, also from a 20-m square (Johnson et al. 2005; Fordyce et al. 2005). If a sample is collected as a single core, and the result is compared to the NBC, it is important to be aware that short-range variation (which can be substantial) for the single core sample will be potentially much greater than for the samples from which the NBC values are derived (Lark, 2012).
Soil samples used to calculate the Cu NBCs have been collected from the top 15 cm of the mineral soil profile (hence they are referred to as topsoils). When the sample is collected from a site covered with vegetation, the surface organic layers (leaf litter) do not form part of the sample collected. Any recently deposited airborne particulates that have not yet migrated into the soil profile will therefore not be sampled, and surface organic material, which has the capacity to fix some contaminants from atmospheric deposition, is not included as part of the sample. In urban areas the top 15 cm can be expected to have been modified by historical urban land uses and, in rural agricultural areas, where relevant, will lie within the ploughed horizon. Surveys targeting recent airborne pollution added to the soil will generally collect only from the top 2 cm of the profile in order to bias the soil results toward the airborne pollutant inputs. Such data have not been used in the NBC calculation.
Let Your CyberAlter Ego Share Information and Manage Spam
Almost all of us have multiple cyberspace identities, and these cyber-alter egos are networked together to form a vast cyberspace social network. This network is distinct from the world-wide web (WWW), which is queried and mined to the tune of billions of dollars every day, and until recently it has gone largely unexplored. Empirically, cyberspace social networks have been found to possess many of the same complex features that characterize their real-world counterparts, including scale-free degree distributions, low diameter, and extensive connectivity. We show that these topological features make the latent networks particularly suitable for exploration and management via local-only messaging protocols. Cyber-alter egos can communicate via their direct links (i.e., using only their own address books) and set up a highly decentralized and scalable message-passing network that can allow large-scale sharing of information and data. As one particular example of such collaborative systems, we provide a design for a spam-filtering system, and our large-scale simulations show that the system achieves a spam detection rate close to 100%, while the false-positive rate is kept around zero. This system has several advantages over other recent proposals: (i) it uses an already existing network, created by the same social dynamics that govern our daily lives, so no dedicated peer-to-peer (P2P) systems or centralized server-based systems need be constructed; (ii) it utilizes a percolation search algorithm that makes the query-generated traffic scalable; (iii) the network has a built-in trust system (just as in social networks) that can be used to thwart malicious attacks; (iv) it can be implemented right now as a plugin to popular email programs, such as MS Outlook, Eudora, and Sendmail.
Comment: 13 pages, 10 figures
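The core mechanism the abstract describes can be sketched in a few lines: a query starts at one node and is forwarded along each address-book link independently with some probability q, a bond-percolation broadcast. On a scale-free network, a modest q above the percolation threshold reaches most of the network while keeping traffic far below a full flood. The function and the toy address-book graph below are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import random

def percolation_search(graph, start, is_match, q, seed=0):
    """Flood a query from `start`, forwarding it along each edge
    independently with probability q (a bond-percolation broadcast).
    Returns (nodes reached, nodes where is_match was true)."""
    rng = random.Random(seed)
    seen = {start}
    frontier = [start]
    hits = set()
    while frontier:
        nxt = []
        for node in frontier:
            if is_match(node):
                hits.add(node)
            for nbr in graph[node]:
                # each untried edge transmits the query with probability q
                if nbr not in seen and rng.random() < q:
                    seen.add(nbr)
                    nxt.append(nbr)
        frontier = nxt
    return seen, hits

# Hypothetical address-book network: "alice" is a high-degree hub,
# as in the scale-free topologies the abstract describes.
book = {
    "alice": ["bob", "carol", "dave"],
    "bob":   ["alice", "carol"],
    "carol": ["alice", "bob"],
    "dave":  ["alice"],
}
# q = 1.0 is a full broadcast; in practice q would sit just above
# the network's percolation threshold to keep traffic scalable.
seen, hits = percolation_search(book, "bob", lambda n: n == "dave", q=1.0)
```

For a spam-filter query, `is_match` would instead test whether a peer's filter has already seen a given message digest, so a few reachable positives suffice to classify the message.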
Technical Guidance Sheet (TGS) on normal levels of contaminants in English soils : supplementary information : cadmium (Cd) : technical guidance sheet supplementary information TGS06s, July 2012
Both the NSI (XRFS) and G-BASE data sets are derived from a soil sample that has been aggregated (composited) from a number of subsamples collected over the area of a site, rather than a single point sample. In the case of NSI this is 25 cores (subsamples) from a 20-m square (McGrath and Loveland 1992), whereas G-BASE is 5 cores, also from a 20-m square (Johnson et al. 2005; Fordyce et al. 2005). If a sample is collected as a single core, and the result is compared to the NBC, it is important to be aware that short-range variation (which can be substantial) for the single core sample will be potentially much greater than for the samples from which the NBC values are derived (Lark, 2012).
The Deconstructed (or Distributed) Journal - an emerging model?
Reviews the development of the Deconstructed Journal academic publishing model. The model was first proposed in something like its present form in 1997 and further developed in 1999. Although not actively promoted, elements of the model appear to be emerging spontaneously from the general developments in online academic publishing.