
    General non-rotating perfect-fluid solution with an abelian spacelike C_3 including only one isometry

    The general solution for non-rotating perfect-fluid spacetimes admitting one Killing vector and two conformal (non-isometric) Killing vectors spanning an abelian three-dimensional conformal algebra (C_3) acting on spacelike hypersurfaces is presented. It is of Petrov type D; some properties of the family, such as its matter contents, are given. This family turns out to be an extension of a solution recently given in \cite{SeS} using completely different methods. The family contains Friedmann-Lemaître-Robertson-Walker particular cases and could be useful as a test for the different FLRW perturbation schemes. There are two very interesting limiting cases, one with a non-abelian G_2 and another with an abelian G_2 acting non-orthogonally transitively on spacelike surfaces and with the fluid velocity non-orthogonal to the group orbits. No examples are known to the authors in these classes. Comment: Submitted to GRG, LaTeX file
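For orientation, these are the standard definitions (stated here for context, not taken from the paper): a vector field \xi is a conformal Killing vector of the metric g when its flow rescales the metric, and an isometry (proper Killing vector) when the rescaling factor vanishes:

```latex
\mathcal{L}_{\xi} g_{ab} \;=\; \nabla_a \xi_b + \nabla_b \xi_a \;=\; 2\,\phi\, g_{ab},
\qquad
\phi = 0 \;\Leftrightarrow\; \xi \text{ is a Killing vector.}
```

An abelian C_3 then means three such vector fields with pairwise vanishing Lie brackets, only one of which has \phi = 0.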

    Uniqueness properties of the Kerr metric

    We obtain a geometrical condition on vacuum, stationary, asymptotically flat spacetimes which is necessary and sufficient for the spacetime to be locally isometric to Kerr. Namely, we prove a theorem stating that an asymptotically flat, stationary, vacuum spacetime such that the so-called Killing form is an eigenvector of the self-dual Weyl tensor must be locally isometric to Kerr. Asymptotic flatness is a fundamental hypothesis of the theorem, as we demonstrate by writing down the family of metrics obtained when this requirement is dropped. This result indicates why the Kerr metric plays such an important role in general relativity. It may also be of interest for extending the uniqueness theorems of black holes to the non-connected and non-analytic cases. Comment: 30 pages, LaTeX, submitted to Classical and Quantum Gravity
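The "Killing form" referred to is the standard two-form built from the stationary Killing vector \xi; stating the textbook definitions for context (the precise eigenvector condition is the one in the paper, not reproduced here):

```latex
F_{ab} \;=\; \nabla_a \xi_b \quad (\text{antisymmetric by the Killing equation}),
\qquad
\mathcal{F}_{ab} \;=\; F_{ab} + \mathrm{i}\,{}^{\star}F_{ab},
```

where {}^{\star} denotes the Hodge dual, so that \mathcal{F} is a self-dual two-form on which the self-dual Weyl tensor acts as a linear map.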

    Axially symmetric Einstein-Straus models

    The existence of static and axially symmetric regions in a Friedmann-Lemaître cosmology is investigated under the sole assumption that the cosmic time and the static time match properly on the boundary hypersurface. It turns out that the most general form for the static region is a two-sphere with arbitrarily changing radius which moves along the axis of symmetry in a determined way. The geometry of the interior region is completely determined in terms of background objects. When any of the most widely used energy-momentum contents for the interior region is imposed, both the interior geometry and the shape of the static region must become exactly spherically symmetric. This shows that the Einstein-Straus model, the generally accepted explanation for the lack of influence of the cosmic expansion on local physics, is not robust: it is rather an exceptional and isolated situation. Hence, its suitability for describing the interplay between cosmic expansion and local physics is doubtful, and more adequate models should be investigated. Comment: LaTeX, no figures

    A spacetime characterization of the Kerr metric

    We obtain a characterization of the Kerr metric among stationary, asymptotically flat, vacuum spacetimes, which extends the characterization in terms of the Simon tensor (defined only in the manifold of trajectories) to the whole spacetime. More precisely, we define a three index tensor on any spacetime with a Killing field, which vanishes identically for Kerr and which coincides in the strictly stationary region with the Simon tensor when projected down into the manifold of trajectories. We prove that a stationary asymptotically flat vacuum spacetime with vanishing spacetime Simon tensor is locally isometric to Kerr. A geometrical interpretation of this characterization in terms of the Weyl tensor is also given. Namely, a stationary, asymptotically flat vacuum spacetime such that each principal null direction of the Killing form is a repeated principal null direction of the Weyl tensor is locally isometric to Kerr. Comment: 23 pages, no figures, LaTeX, to appear in Classical and Quantum Gravity

    Insufficient neutralization in testing a chlorhexidine-containing ethanol-based hand rub can result in a false positive efficacy assessment

    BACKGROUND: Effective neutralization in testing hand hygiene preparations is considered a crucial element in ensuring the validity of test results, especially given the difficulty of neutralizing chlorhexidine gluconate. The aim of the study was to measure the effect of chemical neutralization under practical test conditions according to EN 1500. METHODS: We investigated two ethanol-based hand rubs (product A, based on 61% ethanol and 1% chlorhexidine gluconate; product B, based on 85% ethanol). The efficacy of the products (application of 3 ml for 30 s) was compared to 2-propanol 60% (v/v) (two 3 ml rubs of 30 s each) on hands artificially contaminated with Escherichia coli, using a cross-over design with 15 volunteers. Pre-values were obtained by rubbing fingertips for 1 minute in liquid broth. Post-values were determined by sampling immediately after disinfection in liquid broth with and without neutralizers (0.5% lecithin, 4% polysorbate 20). RESULTS: The neutralizers were found to be effective and non-toxic. Without neutralization in the sampling fluid, the reference disinfection reduced the test bacteria by 3.7 log10, product B by 3.3 log10 and product A by 4.8 log10 (P = 0.001; ANOVA). With neutralization, the reference disinfection reduced the test bacteria by 3.5 log10, product B by 3.3 log10 and product A by 2.7 log10 (P = 0.011; ANOVA). Product B led to a lower mean reduction than the reference disinfection, but the difference was not significant (P > 0.1; Wilcoxon-Wilcox test). Without neutralizing agents in the sampling fluid, product A yielded a significantly higher reduction of test bacteria (4.8; P = 0.02) than with neutralizing agents (2.7; P = 0.033).
    CONCLUSION: The crucial step of neutralization lies in the sampling fluid itself, in order to stop any residual bacteriostatic or bactericidal activity immediately after application of the preparation, especially with chlorhexidine gluconate-containing preparations. This is particularly important at short application times such as 30 s.
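The reduction factors quoted above are base-10 logarithmic differences between pre- and post-disinfection bacterial counts. A minimal sketch of that arithmetic (not the study's code; the CFU values below are illustrative only):

```python
import math

def log10_reduction(pre_cfu: float, post_cfu: float) -> float:
    """EN 1500-style reduction factor: log10(pre-value) - log10(post-value)."""
    return math.log10(pre_cfu) - math.log10(post_cfu)

# Illustrative counts only: 1e8 CFU/ml before disinfection, 2e4 CFU/ml after
rf = log10_reduction(1e8, 2e4)  # about 3.7 log10 steps
```

A product that leaves residual bactericidal activity in the sampling fluid keeps killing bacteria after sampling, inflating `rf`; that is exactly the false-positive effect the neutralizers are meant to prevent.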

    PET segmentation of bulky tumors: Strategies and workflows to improve inter-observer variability

    Background PET-based tumor delineation is an error-prone and labor-intensive part of image analysis. Especially for patients with advanced disease showing bulky tumor FDG load, segmentations are challenging. Reducing the amount of user-interaction in the segmentation might help to facilitate segmentation tasks, especially when labeling bulky and complex tumors. Therefore, this study reports on segmentation workflows/strategies that may reduce the inter-observer variability for large tumors with complex shapes at different levels of user-interaction. Methods Twenty PET images of bulky tumors were delineated independently by six observers using four strategies: (I) manual, (II) interactive threshold-based, (III) interactive threshold-based segmentation with the additional presentation of the PET-gradient image and (IV) the selection of the most reasonable result out of four established semi-automatic segmentation algorithms (Select-the-best approach). The segmentations were compared using Jaccard coefficients (JC) and percentage volume differences. To obtain a reference standard, a majority vote (MV) segmentation was calculated including all segmentations of experienced observers. Performed and MV segmentations were compared regarding positive predictive value (PPV), sensitivity (SE), and percentage volume differences. Results The results show that with decreasing user-interaction the inter-observer variability decreases. JC values and percentage volume differences of Select-the-best and a workflow including gradient information were significantly better than the measurements of the other segmentation strategies (p-value < 0.01). Interactive threshold-based and manual segmentations also resulted in significantly lower and more variable PPV/SE values when compared with the MV segmentation. Conclusions FDG PET segmentations of bulky tumors using strategies with lower user-interaction showed less inter-observer variability.
    None of the methods led to good results in all cases, but either the gradient or the Select-the-best workflow outperformed the other strategies tested and may be a good candidate for fast and reliable labeling of bulky and heterogeneous tumors.
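The agreement metrics used here are standard; a minimal sketch of the Jaccard coefficient and a majority-vote reference over binary masks (assuming numpy arrays as masks; this is not the study's implementation):

```python
import numpy as np

def jaccard(a: np.ndarray, b: np.ndarray) -> float:
    """Jaccard coefficient |A intersect B| / |A union B| of two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    union = np.logical_or(a, b).sum()
    return float(np.logical_and(a, b).sum() / union) if union else 1.0

def majority_vote(masks) -> np.ndarray:
    """A voxel is foreground when more than half of the observers labeled it."""
    stack = np.stack([np.asarray(m, dtype=bool) for m in masks])
    return stack.sum(axis=0) > len(masks) / 2

a = np.array([[1, 1], [0, 0]])
b = np.array([[1, 0], [1, 0]])
jc = jaccard(a, b)  # intersection 1, union 3 -> 1/3
```

PPV and sensitivity against the majority-vote mask follow the same pattern: counts of overlapping voxels divided by the voxel count of the performed or reference segmentation, respectively.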

    Large Scale Cross-Correlations in Internet Traffic

    The Internet is a complex network of interconnected routers, and the existence of collective behavior such as congestion suggests that the correlations between different connections play a crucial role. It is thus critical to measure and quantify these correlations. We use methods of random matrix theory (RMT) to analyze the cross-correlation matrix C of information flow changes of 650 connections between 26 routers of the French scientific network 'Renater'. We find that C has the universal properties of the Gaussian orthogonal ensemble of random matrices: the distribution of eigenvalues (up to a rescaling, which exhibits a typical correlation time of the order of 10 minutes) and the spacing distribution follow the predictions of RMT. There are some deviations for large eigenvalues which contain network-specific information and which identify genuine correlations between connections. The study of the most correlated connections reveals the existence of 'active centers' which are exchanging information with a large number of routers, thereby inducing correlations between the corresponding connections. These strong correlations could be a reason for the observed self-similarity in WWW traffic. Comment: 7 pages, 6 figures, final version
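The RMT comparison can be sketched on synthetic data (a stand-in for the Renater flow measurements, which are not reproduced here): build the normalized cross-correlation matrix and compare its spectrum with the Marchenko-Pastur upper edge, beyond which eigenvalues carry genuine, non-random correlations.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 650, 2600          # 650 connections, synthetic uncorrelated time samples
X = rng.standard_normal((T, N))

# Normalize each connection to zero mean / unit variance, then C = X^T X / T
X = (X - X.mean(axis=0)) / X.std(axis=0)
C = X.T @ X / T

eigvals = np.linalg.eigvalsh(C)

# Marchenko-Pastur upper edge for purely random data; eigenvalues above it
# would flag network-specific structure such as "active centers".
q = N / T
lam_max = (1 + np.sqrt(q)) ** 2
deviating = eigvals[eigvals > lam_max]
```

On uncorrelated synthetic data essentially nothing exceeds `lam_max`; on real traffic, the large deviating eigenvalues (and their eigenvectors) single out the groups of connections that move together.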

    The validity of Dutch health claims data for identifying patients with chronic kidney disease: a hospital-based study in the Netherlands

    Background. Health claims data may be an efficient and easily accessible source to study chronic kidney disease (CKD) prevalence in a nationwide population. Our aim was to study Dutch claims data for their ability to identify CKD patients in different subgroups. Methods. From a laboratory database, we selected 24 895 adults with at least one creatinine measurement in 2014 ordered at an outpatient clinic. Of these, 15 805 had ≥ 2 creatinine measurements at least 3 months apart and could be assessed for the chronicity criterion. We estimated the validity of a claims-based diagnosis of CKD and advanced CKD. The estimated glomerular filtration rate (eGFR)-based definitions for CKD (eGFR = 75 years. The specificity of CKD and advanced CKD was ≥ 99%. Positive predictive values ranged from 72% to 99% and negative predictive values ranged from 40% to 100%. Conclusion. When using health claims data to estimate CKD prevalence, it is important to take into account the characteristics of the population at hand: the younger the subjects and the more advanced the stage of CKD, the higher the sensitivity of such data. Understanding which patients are selected using health claims data is crucial for a correct interpretation of study results.
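The validity measures reported (sensitivity, specificity, PPV, NPV) all derive from a 2x2 cross-table of the claims-based diagnosis against the laboratory (eGFR-based) reference standard. A minimal sketch with made-up counts (not the study's data):

```python
def diagnostic_validity(tp: int, fp: int, fn: int, tn: int) -> dict:
    """2x2 validity of a claims diagnosis vs. an eGFR-based reference standard."""
    return {
        "sensitivity": tp / (tp + fn),  # claims-flagged among true CKD patients
        "specificity": tn / (tn + fp),  # claims-negative among non-CKD patients
        "ppv": tp / (tp + fp),          # true CKD among the claims-flagged
        "npv": tn / (tn + fn),          # truly non-CKD among the claims-negative
    }

# Made-up illustrative counts only
m = diagnostic_validity(tp=80, fp=10, fn=120, tn=990)
```

Because PPV and NPV depend on prevalence, the same claims algorithm yields different predictive values in young versus elderly subgroups, which is the point of the conclusion above.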