Maritime windstorm influence on soil process in a temperate rainforest
Maritime cyclonic windstorms cause widespread disturbance to forested ecosystems in southeast Alaska. The consequences of this disturbance process for the movement, storage, and quality of soil carbon, forest hydrology, and streamwater chemistry were studied along a windthrow disturbance sequence. Soil profiles were described and the thickness of the major organic and mineral horizons was measured every 5 m along transects in 3 catchments with contrasting disturbance histories. A subset of these horizons was randomly selected and sampled to determine the quantity and quality of carbon present. Mineral soil samples were physically fractionated based on particle density. Total C and N, natural abundance δ¹⁵N and δ¹³C isotopes, and solid-state ¹³C NMR were used to compare soil organic carbon pools in catchments with contrasting disturbance histories. An event-based sampling scheme was then used to compare hydrochemical properties of each catchment. Six storms were sampled over 14 months, representing a range of rainfall and soil moisture conditions. Streamflow was measured, and water samples were collected every 4 hours during storm events from each catchment. Evidence for two distinct pathways of mineral soil carbon accumulation was found: 1) mineral and organic particle mixing by windthrow, and 2) soil water transport of mobile organic carbon (MOC) to mineral soil horizons. MOC accumulated in mineral horizons principally through adsorption to mineral particles, and the extent of strong chemical association (adsorption) with mineral particles increased in older, thicker illuvial horizons. Forested catchments which experienced more intense soil mixing from windthrow were depleted in strongly humified soil carbon pools, and an overall shift in soil carbon quality toward a partially decomposed particulate form was observed. Streamflow in more-disturbed catchments peaked 4 to 12 hours later than in less-disturbed catchments.
During summer months, streamwater temperatures in more-disturbed watersheds were cooler than air temperatures and than streamwater in less-disturbed catchments. Streams in more-disturbed catchments had higher pH, alkalinity, and base cation concentrations than streams in less-disturbed catchments. These results suggest that catastrophic windthrow disturbance smooths the hydrograph response to storms and increases the chemical interaction of rainwater with mineral soil horizons by increasing rainwater infiltration and storage in deeper soil profiles. The changes in concentration and characteristics of organic carbon in mineral soil which result from soil mixing disturbances (windthrow, landslides) can strongly influence the hydrology and chemical properties of catchments and the rate of nutrient cycling.
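The reported 4 to 12 hour delay in streamflow peaks can be estimated directly from the 4-hourly storm samples described above. A minimal sketch, using hypothetical hydrographs (the function name and example values are illustrative, not data from the study):

```python
import numpy as np

def peak_lag_hours(q_disturbed, q_reference, dt_hours=4.0):
    """Estimate how much later the disturbed catchment's hydrograph
    peaks relative to the reference, from equally spaced samples."""
    lag_samples = int(np.argmax(q_disturbed)) - int(np.argmax(q_reference))
    return lag_samples * dt_hours

# Hypothetical storm hydrographs sampled every 4 h (discharge units arbitrary)
q_less = np.array([1, 2, 8, 5, 3, 2, 1, 1], dtype=float)  # less-disturbed catchment
q_more = np.array([1, 1, 2, 5, 7, 6, 3, 2], dtype=float)  # more-disturbed catchment

print(peak_lag_hours(q_more, q_less))  # 8.0 (peak arrives 8 h later)
```

With the study's 4-hour sampling interval, lags are resolved in 4 h steps, consistent with the 4-12 h range reported.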
General non-rotating perfect-fluid solution with an abelian spacelike C_3 including only one isometry
The general solution for non-rotating perfect-fluid spacetimes admitting one
Killing vector and two conformal (non-isometric) Killing vectors spanning an
abelian three-dimensional conformal algebra (C_3) acting on spacelike
hypersurfaces is presented. It is of Petrov type D; some properties of the
family such as matter contents are given. This family turns out to be an
extension of a solution recently given in \cite{SeS} using completely different
methods. The family contains Friedman-Lema\^{\i}tre-Robertson-Walker particular
cases and could be useful as a test for the different FLRW perturbation
schemes. There are two very interesting limiting cases, one with a non-abelian
G_2 and another with an abelian G_2 acting non-orthogonally transitively on
spacelike surfaces and with the fluid velocity non-orthogonal to the group
orbits. No examples are known to the authors in these classes.
Comment: Submitted to GRG, Latex file
Uniqueness properties of the Kerr metric
We obtain a geometrical condition on vacuum, stationary, asymptotically flat
spacetimes which is necessary and sufficient for the spacetime to be locally
isometric to Kerr. Namely, we prove a theorem stating that an asymptotically
flat, stationary, vacuum spacetime such that the so-called Killing form is an
eigenvector of the self-dual Weyl tensor must be locally isometric to Kerr.
Asymptotic flatness is a fundamental hypothesis of the theorem, as we
demonstrate by writing down the family of metrics obtained when this
requirement is dropped. This result indicates why the Kerr metric plays such an
important role in general relativity. It may also be of interest in order to
extend the uniqueness theorems of black holes to the non-connected and to the
non-analytic case.
Comment: 30 pages, LaTeX, submitted to Classical and Quantum Gravity
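The geometric condition described above can be sketched in common notation (the symbol choices here are assumptions, not reproduced from the paper): the Killing form of the stationary Killing vector ξ and its self-dual counterpart are

```latex
% Killing form (Papapetrou field) of the Killing vector \xi, and its self-dual part:
F_{\alpha\beta} = \nabla_{\alpha}\xi_{\beta}, \qquad
\mathcal{F}_{\alpha\beta} = F_{\alpha\beta}
  + \tfrac{i}{2}\,\epsilon_{\alpha\beta\gamma\delta}\,F^{\gamma\delta},
% and the characterizing condition is that \mathcal{F} is an eigenvector
% of the self-dual Weyl tensor \mathcal{C}:
\mathcal{C}_{\alpha\beta\gamma\delta}\,\mathcal{F}^{\gamma\delta}
  = \lambda\,\mathcal{F}_{\alpha\beta}.
```

Under the theorem's hypotheses (vacuum, stationarity, asymptotic flatness), this eigenvector condition forces local isometry with Kerr.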
Axially symmetric Einstein-Straus models
The existence of static and axially symmetric regions in a Friedman-Lemaitre
cosmology is investigated under the only assumption that the cosmic time and
the static time match properly on the boundary hypersurface. It turns out that
the most general form for the static region is a two-sphere with arbitrarily
changing radius which moves along the axis of symmetry in a determined way. The
geometry of the interior region is completely determined in terms of background
objects. When any of the most widely used energy-momentum contents for the
interior region is imposed, both the interior geometry and the shape of the
static region must become exactly spherically symmetric. This shows that the
Einstein-Straus model, which is the generally accepted answer for the null
influence of the cosmic expansion on the local physics, is not a robust model
but rather an exceptional and isolated situation. Hence, its suitability
for solving the interplay between cosmic expansion and local physics is
doubtful, and more adequate models should be investigated.
Comment: Latex, no figures
A spacetime characterization of the Kerr metric
We obtain a characterization of the Kerr metric among stationary,
asymptotically flat, vacuum spacetimes, which extends the characterization in
terms of the Simon tensor (defined only in the manifold of trajectories) to the
whole spacetime. More precisely, we define a three index tensor on any
spacetime with a Killing field, which vanishes identically for Kerr and which
coincides in the strictly stationary region with the Simon tensor when
projected down into the manifold of trajectories. We prove that a stationary
asymptotically flat vacuum spacetime with vanishing spacetime Simon tensor is
locally isometric to Kerr. A geometrical interpretation of this
characterization in terms of the Weyl tensor is also given. Namely, a
stationary, asymptotically flat vacuum spacetime such that each principal null
direction of the Killing form is a repeated principal null direction of the
Weyl tensor is locally isometric to Kerr.
Comment: 23 pages, No figures, LaTeX, to appear in Classical and Quantum Gravity
Insufficient neutralization in testing a chlorhexidine-containing ethanol-based hand rub can result in a false positive efficacy assessment
BACKGROUND: Effective neutralization in testing hand hygiene preparations is considered a crucial element for ensuring the validity of test results, especially given the difficulty of neutralizing chlorhexidine gluconate. The aim of the study was to measure the effect of chemical neutralization under practical test conditions according to EN 1500. METHODS: We investigated two ethanol-based hand rubs (product A, based on 61% ethanol and 1% chlorhexidine gluconate; product B, based on 85% ethanol). The efficacy of the products (application of 3 ml for 30 s) was compared to 2-propanol 60% (v/v) (two 3 ml rubs of 30 s each) on hands artificially contaminated with Escherichia coli, using a cross-over design with 15 volunteers. Pre-values were obtained by rubbing fingertips for 1 minute in liquid broth. Post-values were determined by sampling immediately after disinfection in liquid broth with and without neutralizers (0.5% lecithin, 4% polysorbate 20). RESULTS: The neutralizers were found to be effective and non-toxic. Without neutralization in the sampling fluid, the reference disinfection reduced the test bacteria by 3.7 log10, product B by 3.3 log10, and product A by 4.8 log10 (P = 0.001; ANOVA). With neutralization, the reference disinfection reduced the test bacteria by 3.5 log10, product B by 3.3 log10, and product A by 2.7 log10 (P = 0.011; ANOVA). Product B led to a lower mean reduction than the reference disinfection, but the difference was not significant (P > 0.1; Wilcoxon-Wilcox test). Without neutralizing agents in the sampling fluid, product A yielded a significantly higher reduction of test bacteria (4.8; P = 0.02) than with neutralizing agents (2.7; P = 0.033).
CONCLUSION: The crucial step of neutralization lies in the sampling fluid itself, in order to stop any residual bacteriostatic or bactericidal activity immediately after application of the preparation, especially for chlorhexidine gluconate-containing preparations. This is particularly important at short application times such as 30 s.
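The log10 reduction factors reported above follow the standard definition: the difference of log10 bacterial counts before and after treatment. A minimal sketch with hypothetical colony counts (the values shown are illustrative, not from the study):

```python
import math

def log10_reduction(cfu_pre, cfu_post):
    """Log10 reduction factor: difference of log10 counts before
    and after disinfection (as used in EN 1500-style evaluations)."""
    return math.log10(cfu_pre) - math.log10(cfu_post)

# Hypothetical counts: ~5e7 CFU/ml before treatment, ~1e4 CFU/ml after
print(round(log10_reduction(5e7, 1e4), 1))  # 3.7
```

Note how residual disinfectant carried into the sampling fluid continues killing bacteria, inflating the apparent reduction; this is exactly the false-positive effect the neutralizers are meant to prevent.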
PET segmentation of bulky tumors: Strategies and workflows to improve inter-observer variability
Background PET-based tumor delineation is an error-prone and labor-intensive part of image analysis. Segmentation is especially challenging for patients with advanced disease showing bulky tumor FDG load. Reducing the amount of user interaction in the segmentation might help to facilitate segmentation tasks, especially when labeling bulky and complex tumors. Therefore, this study reports on segmentation workflows/strategies with different levels of user interaction that may reduce the inter-observer variability for large tumors with complex shapes. Methods Twenty PET images of bulky tumors were delineated independently by six observers using four strategies: (I) manual; (II) interactive threshold-based; (III) interactive threshold-based segmentation with the additional presentation of the PET gradient image; and (IV) selection of the most reasonable result out of four established semi-automatic segmentation algorithms (Select-the-best approach). The segmentations were compared using Jaccard coefficients (JC) and percentage volume differences. To obtain a reference standard, a majority vote (MV) segmentation was calculated from all segmentations of experienced observers. The performed segmentations and the MV segmentation were compared regarding positive predictive value (PPV), sensitivity (SE), and percentage volume differences. Results The results show that inter-observer variability decreases with decreasing user interaction. JC values and percentage volume differences of Select-the-best and of the workflow including gradient information were significantly better than those of the other segmentation strategies (p-value < 0.01). Interactive threshold-based and manual segmentations also resulted in significantly lower and more variable PPV/SE values when compared with the MV segmentation. Conclusions FDG PET segmentations of bulky tumors using strategies with lower user interaction showed less inter-observer variability.
None of the methods led to good results in all cases, but the gradient-based and Select-the-best workflows outperformed the other strategies tested and may be good candidates for fast and reliable labeling of bulky and heterogeneous tumors.
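The overlap metrics named above (JC, PPV, SE) are simple voxel-count ratios between a segmentation and a reference mask. A minimal sketch on toy binary masks (the function and example arrays are illustrative, not the study's pipeline):

```python
import numpy as np

def overlap_metrics(seg, ref):
    """Jaccard coefficient, positive predictive value, and sensitivity
    of a binary segmentation `seg` against a reference mask `ref`."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    tp = np.logical_and(seg, ref).sum()          # true-positive voxels
    jc = tp / np.logical_or(seg, ref).sum()      # intersection over union
    ppv = tp / seg.sum()                         # fraction of segmented voxels that are correct
    se = tp / ref.sum()                          # fraction of reference voxels recovered
    return jc, ppv, se

# Toy 1-D "masks" standing in for 3-D PET segmentations
seg = np.array([0, 1, 1, 1, 0, 0])
ref = np.array([0, 0, 1, 1, 1, 0])
jc, ppv, se = overlap_metrics(seg, ref)
print(jc, ppv, se)  # 0.5 0.666... 0.666...
```

In the study, `ref` is the majority-vote mask pooled over experienced observers, so PPV penalizes over-segmentation and SE penalizes under-segmentation relative to the group consensus.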
Large Scale Cross-Correlations in Internet Traffic
The Internet is a complex network of interconnected routers and the existence
of collective behavior such as congestion suggests that the correlations
between different connections play a crucial role. It is thus critical to
measure and quantify these correlations. We use methods of random matrix theory
(RMT) to analyze the cross-correlation matrix C of information flow changes of
650 connections between 26 routers of the French scientific network `Renater'.
We find that C has the universal properties of the Gaussian orthogonal ensemble
of random matrices: The distribution of eigenvalues--up to a rescaling which
exhibits a typical correlation time of the order 10 minutes--and the spacing
distribution follow the predictions of RMT. There are some deviations for large
eigenvalues which contain network-specific information and which identify
genuine correlations between connections. The study of the most correlated
connections reveals the existence of `active centers' which are exchanging
information with a large number of routers thereby inducing correlations
between the corresponding connections. These strong correlations could be a
reason for the observed self-similarity in the WWW traffic.
Comment: 7 pages, 6 figures, final version
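The RMT comparison described above amounts to computing the eigenvalue spectrum of the empirical correlation matrix and checking it against the Marchenko-Pastur bounds expected for purely random correlations. A minimal sketch with synthetic data (the matrix sizes here are illustrative, not the paper's 650-connection dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 26, 1000                        # hypothetical: 26 flow series, 1000 time samples
X = rng.standard_normal((N, T))        # uncorrelated "flow change" series
C = np.corrcoef(X)                     # empirical N x N correlation matrix
eigvals = np.linalg.eigvalsh(C)

# Marchenko-Pastur bounds for a purely random correlation matrix with q = N/T
q = N / T
lam_min, lam_max = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
outliers = eigvals[(eigvals < lam_min) | (eigvals > lam_max)]
print(len(outliers))  # few or none for pure noise; real traffic shows large outliers
```

Eigenvalues falling well above `lam_max` carry genuine, network-specific correlations; in the paper these identify the "active centers" exchanging traffic with many routers.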
The validity of Dutch health claims data for identifying patients with chronic kidney disease: a hospital-based study in the Netherlands
Background. Health claims data may be an efficient and easily accessible source to study chronic kidney disease (CKD) prevalence in a nationwide population. Our aim was to study Dutch claims data for their ability to identify CKD patients in different subgroups. Methods. From a laboratory database, we selected 24 895 adults with at least one creatinine measurement in 2014 ordered at an outpatient clinic. Of these, 15 805 had >= 2 creatinine measurements at least 3 months apart and could be assessed for the chronicity criterion. We estimated the validity of a claim-based diagnosis of CKD and advanced CKD. The estimated glomerular filtration rate (eGFR)-based definitions for CKD (eGFR = 75 years. The specificity of CKD and advanced CKD was >= 99%. Positive predictive values ranged from 72% to 99% and negative predictive values ranged from 40% to 100%. Conclusion. When using health claims data for the estimation of CKD prevalence, it is important to take into account the characteristics of the population at hand. The younger the subjects and the more advanced the stage of CKD, the higher the sensitivity of such data. Understanding which patients are selected using health claims data is crucial for a correct interpretation of study results.
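The validity measures reported above (sensitivity, specificity, PPV, NPV) all derive from a 2x2 table of claims-based diagnoses against the eGFR-based reference standard. A minimal sketch with hypothetical counts (these numbers are illustrative, not the study's data):

```python
def diagnostic_validity(tp, fp, fn, tn):
    """Validity of a claims-based CKD diagnosis against an eGFR-based
    reference standard, computed from 2x2 table counts:
    tp/fp = claim present, reference positive/negative;
    fn/tn = claim absent,  reference positive/negative."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts for one subgroup
m = diagnostic_validity(tp=90, fp=10, fn=60, tn=840)
print(m)  # sensitivity 0.6, specificity ~0.99, ppv 0.9, npv ~0.93
```

The example mirrors the study's pattern: high specificity and PPV but modest sensitivity, since many true CKD patients never receive a CKD claim.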
Changes to particulate versus mineral-associated soil carbon after 50 years of litter manipulation in forest and prairie experimental ecosystems
Models of ecosystem carbon (C) balance
generally assume a strong relationship between NPP,
litter inputs, and soil C accumulation, but there is little
direct evidence for such a coupled relationship. Using a
unique 50-year detrital manipulation experiment in a
mixed deciduous forest and in restored prairie grasslands
in Wisconsin, combined with sequential density
fractionation, isotopic analysis, and short-term incubation,
we examined the effects of detrital inputs and
removals on soil C stabilization, destabilization, and
quality. Both forested sites showed greater decline in
bulk soil C content in litter removal plots (55 and 66%)
compared to increases in litter addition plots (27 and
38% increase in surface soils compared to controls). No
accumulation in the mineral fraction C was observed
after 50 years of litter addition of the two forested plots,
thus increases in the light density fraction pool drove
patterns in total C content. Litter removal across both
ecosystem types resulted in a decline in both free light
fraction and mineral C content, with an overall 51%
decline in mineral-associated carbon in the intermediate
(1.85–2.4 g cm⁻³) density pool; isotopic data suggest
that it was preferentially younger C that was lost. In
contrast to results from other, younger litter
manipulation sites, there was no evidence of
priming even in soils collected after 28 years of
treatment. In prairie soils, aboveground litter exclusion
had an effect on C levels similar to that of root
exclusion, thus we did not see evidence that root-derived
C is more critical to soil C sequestration. There
was no clear evidence that soil C quality changed in
litter addition plots in the forested sites; δ¹³C and Δ¹⁴C
values, and incubation estimates of labile C were similar
between control and litter addition soils. C quality
appeared to change in litter removal plots; soils with
litter excluded had Δ¹⁴C values indicative of longer mean residence times, δ¹³C values indicative of loss of
fresh plant-derived C, and decreases in all light fraction
C pools, although incubation estimates of labile C did
not change. In prairie soils, δ¹³C values suggest a loss of
recent C4-derived soil C in litter removal plots along
with significant increases in mean residence time,
especially in plots with removal of roots. Our results
suggest surface mineral soils may be vulnerable to
significant C loss in association with disturbance, land
use change, or perhaps even climate change over
century–decadal timescales, and also highlight the need
for longer-term experimental manipulations to study
soil organic matter dynamics.
Keywords: Carbon sequestration, Carbon stabilization, SOM, Prairie, Detrital manipulation treatments, Soil organic matter, DIRT, Radiocarbon dating, Forest, Density fractionation
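The inference that prairie litter-removal plots lost recent C4-derived carbon rests on a standard two-end-member δ¹³C mixing model. A minimal sketch (the end-member values are typical literature assumptions, not measurements from this study):

```python
def c4_fraction(delta_sample, delta_c3=-27.0, delta_c4=-12.0):
    """Fraction of soil C derived from C4 (prairie) vegetation via a
    two-end-member delta-13C mixing model. End-member values are
    typical literature assumptions, not data from this study."""
    return (delta_sample - delta_c3) / (delta_c4 - delta_c3)

# A hypothetical bulk-soil value of -21 permil implies ~40% C4-derived C
print(round(c4_fraction(-21.0), 2))  # 0.4
```

A shift of a litter-removal soil toward the C3 end member (more negative δ¹³C) would register here as a falling C4 fraction, i.e. preferential loss of the recent prairie-derived carbon.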