1,736 research outputs found

    Anthropogenic alteration of nutrient supply increases the global freshwater carbon sink

    Lakes have a disproportionate effect on the global carbon (C) cycle relative to their area, mediating C transfer from land to atmosphere and burying organic C in their sediments. The magnitude and temporal variability of C burial are, however, poorly constrained, and the degree to which humans have influenced lake C cycling through landscape alteration has not been systematically assessed. Here, we report global and biome-specific trajectories of lake C sequestration based on 516 lakes and show that some lake C burial rates (i.e., those in tropical forest and grassland biomes) have quadrupled over the last 100 years. Global lake C sequestration (~0.12 Pg year⁻¹) has increased by ~72 Tg year⁻¹ since 1900, offsetting ~20% of annual freshwater CO2 emissions (rising to ~30% if reservoirs are included) and contributing to the residual continental C sink. Nutrient availability explains ~70% of the observed increase, while rising temperatures have a minimal effect.
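The offset figures above can be sanity-checked with a little unit arithmetic. A hedged back-of-envelope sketch, assuming the total burial flux (~0.12 Pg year⁻¹) is the quantity offsetting ~20% of freshwater CO2 emissions (the abstract does not state which figure does the offsetting):

```python
# Back-of-envelope check, not from the paper: if a burial flux B offsets a
# fraction f of annual freshwater CO2 emissions E, then E = B / f.
# Attributing the 20% offset to the total burial flux is an assumption here.

PG_PER_TG = 1e-3                      # 1 Tg = 0.001 Pg

burial_total = 0.12                   # Pg C / yr, global lake C sequestration
burial_increase = 72 * PG_PER_TG      # ~72 Tg/yr increase since 1900, in Pg/yr

implied_emissions = burial_total / 0.20   # ~0.6 Pg C / yr implied by a 20% offset
```

Under that assumption the implied annual freshwater CO2 emission flux is roughly 0.6 Pg C per year, five times the burial flux.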

    Investigating uptake of N2O in agricultural soils using a high-precision dynamic chamber method

    Uptake (or negative flux) of nitrous oxide (N2O) in agricultural soils is a controversial issue which has proved difficult to investigate in the past due to constraints such as instrumental precision and methodological uncertainties. Using a recently developed high-precision quantum cascade laser gas analyser combined with a closed dynamic chamber, a well-defined detection limit of 4 μg N2O-N m⁻² h⁻¹ could be achieved for individual soil flux measurements. 1220 measurements of N2O flux were made from a variety of UK soils using this method, of which 115 indicated uptake by the soil (i.e. a negative flux in the micrometeorological sign convention). Only four of these apparently negative fluxes were greater in magnitude than the detection limit of the method, which suggests that the vast majority of reported negative fluxes from such measurements are actually due to instrument noise. As such, we suggest that the bulk of negative N2O fluxes reported for agricultural fields are most likely due to the detection limits of the particular flux measurement methodology and not a result of microbiological activity consuming atmospheric N2O.
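The filtering step described above, counting how many negative fluxes actually exceed the detection limit in magnitude, can be sketched as follows (the flux values are illustrative, not data from the study):

```python
# Sketch, not from the paper: classify chamber N2O flux measurements against
# a method detection limit. Negative flux = apparent uptake; only negative
# fluxes whose magnitude exceeds the limit are distinguishable from noise.

def classify_fluxes(fluxes, detection_limit=4.0):
    """Return (number of negative fluxes, number of negative fluxes whose
    magnitude exceeds the detection limit). Units are arbitrary but must
    match between fluxes and detection_limit."""
    negative = [f for f in fluxes if f < 0]
    significant_negative = [f for f in negative if abs(f) > detection_limit]
    return len(negative), len(significant_negative)

# Illustrative measurements (made-up numbers):
fluxes = [12.3, -1.2, 0.5, -5.6, 3.1, -0.8, 7.9, -4.5]
n_neg, n_sig = classify_fluxes(fluxes)   # 4 apparent uptake events, 2 significant
```

Applied to the study's 1220 measurements, this logic yields the reported 115 apparent uptake events, of which only four clear the detection limit.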

    Naturally Rehearsing Passwords

    We introduce quantitative usability and security models to guide the design of password management schemes: systematic strategies to help users create and remember multiple passwords. In the same way that security proofs in cryptography are based on complexity-theoretic assumptions (e.g., hardness of factoring and discrete logarithm), we quantify usability by introducing usability assumptions. In particular, password management relies on assumptions about human memory, e.g., that a user who follows a particular rehearsal schedule will successfully maintain the corresponding memory. These assumptions are informed by research in cognitive science and validated through empirical studies. Given rehearsal requirements and a user's visitation schedule for each account, we use the total number of extra rehearsals that the user would have to do to remember all of their passwords as a measure of the usability of the password scheme. Our usability model leads us to a key observation: password reuse benefits users not only by reducing the number of passwords that the user has to memorize, but more importantly by increasing the natural rehearsal rate for each password. We also present a security model which accounts for the complexity of password management with multiple accounts and associated threats, including online, offline, and plaintext password leak attacks. Observing that current password management schemes are either insecure or unusable, we present Shared Cues, a new scheme in which the underlying secret is strategically shared across accounts to ensure that most rehearsal requirements are satisfied naturally while simultaneously providing strong security. The construction uses the Chinese Remainder Theorem to achieve these competing goals.
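The abstract names the Chinese Remainder Theorem as the tool behind Shared Cues. A minimal sketch of plain CRT reconstruction, which illustrates the theorem only and is not the actual Shared Cues construction:

```python
# Illustrative sketch of Chinese Remainder Theorem reconstruction, the
# number-theoretic tool the abstract names; NOT the actual Shared Cues
# scheme, whose secret-sharing layout is more involved.
from math import prod

def crt_reconstruct(residues, moduli):
    """Recover x mod prod(moduli) from the residues x mod m_i,
    for pairwise-coprime moduli m_i."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        # pow(Mi, -1, m) is the modular inverse of Mi mod m (Python 3.8+)
        x += r * Mi * pow(Mi, -1, m)
    return x % M

secret = 123                      # must be below prod(moduli) = 1001
moduli = [7, 11, 13]              # pairwise-coprime "per-account" moduli
shares = [secret % m for m in moduli]
assert crt_reconstruct(shares, moduli) == secret
```

In a secret-sharing reading, each account would hold one residue, and any subset of accounts whose moduli multiply to more than the secret suffices to reconstruct it.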

    Spectroscopic biomedical imaging with the Medipix2 detector

    This study confirms that the Medipix2 x-ray detector enables spectroscopic biomedical plain radiography. We show that the detector has the potential to provide new, useful information beyond the limited spectroscopic information of modern dual-energy computed tomography (CT) scanners. Full spectroscopic 3D imaging is likely to be the next major technological advance in computed tomography, moving the modality towards molecular imaging applications. This paper focuses on the enabling technology which allows spectroscopic data collection and on why this information is useful. In this preliminary study we acquired the first spectroscopic images of human tissue and other biological samples obtained using the Medipix2 detector. The images presented here include the clear resolution of the 1.4 mm long distal phalanx of a 20-week-old miscarried foetus, showing clear energy-dependent variations. The opportunities for further research using the forthcoming Medipix3 detector are discussed, and a prototype spectroscopic CT scanner (MARS, Medipix All Resolution System) is briefly described.
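The spectroscopic data collection referred to above rests on photon counting above adjustable energy thresholds. A minimal sketch of that thresholding logic (illustrative only; this is not Medipix firmware and ignores real detector effects such as charge sharing):

```python
# Sketch of threshold-based spectroscopic counting: a photon-counting pixel
# reports counts above each energy threshold, and differences between
# adjacent thresholds give counts per energy bin. Illustrative only.

def bin_counts(photon_energies_kev, thresholds_kev):
    """Counts in [t_i, t_{i+1}) for each threshold pair, plus an
    open-ended top bin above the last threshold."""
    above = [sum(1 for e in photon_energies_kev if e >= t)
             for t in thresholds_kev]
    return [a - b for a, b in zip(above, above[1:])] + [above[-1]]

# Seven made-up photon energies (keV) and three thresholds:
photons = [18, 25, 33, 41, 52, 47, 29]
print(bin_counts(photons, [15, 30, 45]))   # -> [3, 2, 2]
```

Material-dependent attenuation differs between such energy bins, which is what makes the energy-resolved counts diagnostically useful.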

    Gravitation, electromagnetism and cosmological constant in purely affine gravity

    The Ferraris-Kijowski purely affine Lagrangian for the electromagnetic field, which has the form of the Maxwell Lagrangian with the metric tensor replaced by the symmetrized Ricci tensor, is dynamically equivalent to the metric Einstein-Maxwell Lagrangian, except in the zero-field limit, for which the metric tensor is not well-defined. This feature indicates that, for the Ferraris-Kijowski model to be physical, there must exist a background field that depends on the Ricci tensor. The simplest possibility, supported by recent astronomical observations, is the cosmological constant, generated in the purely affine formulation of gravity by the Eddington Lagrangian. In this paper we combine the electromagnetic field and the cosmological constant in the purely affine formulation. We show that the sum of the two affine (Eddington and Ferraris-Kijowski) Lagrangians is dynamically inequivalent to the sum of the analogous (ΛCDM and Einstein-Maxwell) Lagrangians in the metric-affine/metric formulation. We also show that such a construction is valid, like the affine Einstein-Born-Infeld formulation, only for weak electromagnetic fields, on the order of the magnetic field in the outer Solar System. Therefore the purely affine formulation that combines gravity, electromagnetism and the cosmological constant cannot be a simple sum of affine terms corresponding separately to these fields. The rather complicated form of the affine equivalent of the metric Einstein-Maxwell-Λ Lagrangian suggests that Nature may be described by a simpler affine Lagrangian, leading to modifications of the Einstein-Maxwell-ΛCDM theory for electromagnetic fields that contribute to the spacetime curvature on the same order as the cosmological constant. Comment: 17 pages, extended and combined with gr-qc/0612193; published version.
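The substitution the abstract describes, the Maxwell Lagrangian with the metric tensor replaced by the symmetrized Ricci tensor, can be written schematically as follows (the overall normalization is an assumption, not taken from the paper):

```latex
% Schematic only; the normalization convention is assumed, not quoted.
% Metric Maxwell Lagrangian density:
\mathcal{L}_{\mathrm{EM}}
  = -\tfrac{1}{4}\,\sqrt{-\det g_{\mu\nu}}\;
    g^{\mu\alpha} g^{\nu\beta} F_{\mu\nu} F_{\alpha\beta}
% Ferraris-Kijowski purely affine analogue: replace the metric by the
% symmetrized Ricci tensor R_{(\mu\nu)}(\Gamma) of the affine connection,
% with R^{(\mu\alpha)} denoting the inverse of R_{(\mu\nu)}:
\mathcal{L}_{\mathrm{FK}}
  = -\tfrac{1}{4}\,\sqrt{-\det R_{(\mu\nu)}}\;
    R^{(\mu\alpha)} R^{(\nu\beta)} F_{\mu\nu} F_{\alpha\beta}
```

The zero-field problem mentioned in the abstract is visible here: with no background curvature, R_{(μν)} has no invertible value to play the role of the metric.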

    Data Analysis Challenges for the Einstein Telescope

    The Einstein Telescope is a proposed third-generation gravitational wave detector that will operate in the region of 1 Hz to a few kHz. As well as the inspiral of compact binaries composed of neutron stars or black holes, the lower frequency cut-off of the detector will open the window to a number of new sources. These will include the end stage of inspirals, plus merger and ringdown of intermediate mass black holes, where the masses of the component bodies are on the order of a few hundred solar masses. There is also the possibility of observing intermediate mass ratio inspirals, where a stellar mass compact object inspirals into a black hole which is a few hundred to a few thousand times more massive. In this article, we investigate some of the data analysis challenges for the Einstein Telescope, such as the effects of the increased number of sources, the need for more accurate waveform models, and some of the computational issues that a data analysis strategy might face. Comment: 18 pages, invited review for the Einstein Telescope special edition of GR

    Statistical Model of Superconductivity in a 2D Binary Boson-Fermion Mixture

    A two-dimensional (2D) assembly of noninteracting, temperature-dependent, composite-boson Cooper pairs (CPs) in chemical and thermal equilibrium with unpaired fermions is examined in a binary boson-fermion statistical model as the superconducting singularity temperature is approached from above. The model is derived from first principles for the BCS model interfermion interaction from three extrema of the system Helmholtz free energy (subject to constant pairable-fermion number) with respect to: a) the pairable-fermion distribution function; b) the number of excited (bosonic) CPs, i.e., those with nonzero total momenta, usually ignored in BCS theory, and with the appropriate (linear, as opposed to quadratic) dispersion relation that arises from the Fermi sea; and c) the number of CPs with zero total momenta. Compared with the BCS theory condensate, higher singularity temperatures for the Bose-Einstein condensate are obtained in the binary boson-fermion mixture model, in rough agreement with empirical critical temperatures for quasi-2D superconductors. Comment: 16 pages and 4 figures. This is an improved version.
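The linear-versus-quadratic contrast mentioned in b) can be written schematically (the coefficient of the linear branch is left as an unspecified constant, since the abstract does not give it):

```latex
% Quadratic center-of-mass dispersion of a composite pair of mass 2m,
% the form usually assumed for a free boson:
\varepsilon_K^{\mathrm{quad}} = \frac{\hbar^2 K^2}{2(2m)}
% Linear dispersion arising from the Fermi sea, with v_F the Fermi
% velocity and c_1 a dimensionless constant not specified here:
\varepsilon_K^{\mathrm{lin}} = c_1\, \hbar v_F K
```

A linear dispersion changes the density of pair states at low K, which is why it matters for the Bose-Einstein condensation temperature in 2D.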

    Weakly-Interacting Bosons in a Trap within Approximate Second Quantization Approach

    The theory of Bogoliubov is generalized to the case of a weakly interacting Bose gas in a harmonic trap. A set of nonlinear matrix equations is obtained to make the diagonalization of the Hamiltonian possible. Its perturbative solution is used to calculate the energy and the condensate fraction of the model system and to show the applicability of the method. Comment: 6 pages, two figures. Presented at the International Symposium on Quantum Fluids and Solids QFS2006 (Kyoto, Japan).

    Experience of using the interactive "lecture" element in the e-learning course «Основы САПР» (Fundamentals of CAD)

    Many arctic landscapes are rich in lakes that store large quantities of organic carbon in their sediments. While there are indications of highly efficient carbon burial in high-latitude lakes, the magnitude and efficiency of carbon burial in arctic lake sediments, and thus their potential as carbon sinks, have not been studied systematically. We therefore investigated the burial efficiency of organic carbon (OC), defined as the ratio between OC burial and OC deposition onto the sediment, in seven contrasting lakes in western Greenland representing different arctic lake types. We found that the OC burial efficiency was generally low in spite of the differences between lake types (mean 22%, range 11–32%), and comparable to that of lakes in other climates with similar organic matter source and oxygen exposure time. Accordingly, post-depositional degradation of sediment organic matter was evident in the organic matter C:N ratio and the δ13C and δ15N values during the initial ~50 years after deposition, and proceeds simultaneously with long-term changes in, e.g., productivity and climate. Pore water profiles of dissolved methane suggest that post-depositional degradation may continue for several centuries in these lakes, at very low rates. Our results demonstrate that the regulation of the sediment OC burial efficiency is no different in arctic lakes than in other lakes, implying that the efficiency of the carbon sink in lake sediments depends similarly on environmental conditions irrespective of latitude.
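The burial-efficiency definition used above is simply a flux ratio; a minimal sketch with illustrative numbers (not measurements from the study):

```python
# Sketch of the OC burial-efficiency definition from the abstract: the ratio
# between OC burial and OC deposition onto the sediment. The flux values
# below are illustrative, not data from the Greenland lakes.

def burial_efficiency(oc_burial, oc_deposition):
    """Fraction of organic carbon deposited on the sediment that ends up
    permanently buried (both fluxes in the same units, e.g. g C m-2 yr-1)."""
    return oc_burial / oc_deposition

# e.g. 2.2 g C m-2 yr-1 buried out of 10 g C m-2 yr-1 deposited -> ~22%,
# of the same order as the reported mean of 22% (range 11-32%)
eff = burial_efficiency(2.2, 10.0)
```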

    Site investigation for the effects of vegetation on ground stability

    The procedure for geotechnical site investigation is well established, but little attention is currently given to investigating the potential of vegetation to assist with ground stability. This paper describes how routine investigation procedures may be adapted to consider the effects of vegetation. It is recommended that the major part of the vegetation investigation be carried out, at relatively low cost, during the preliminary (desk) study phase of the investigation, when there is maximum flexibility to take account of findings in the proposed design and construction. The techniques available for investigating the effects of vegetation are reviewed and references provided for further consideration. As for general geotechnical investigation work, it is important that a balance of effort is maintained in the vegetation investigation between (a) site characterisation (defining and identifying the existing and proposed vegetation to suit the site and ground conditions), (b) testing (in-situ and laboratory testing of the vegetation and root systems to provide design parameters) and (c) modelling (to analyse the vegetation effects).