
    Probing Quantum Geometry at LHC

    We present evidence that the volumes of compactified spaces, as well as the areas of black hole horizons, must be quantized in Planck units. This quantization has phenomenological consequences, the most dramatic being for micro black holes in theories with TeV-scale gravity that can be produced at the LHC. We predict that black holes come in the form of a discrete tower with well-defined spacing. Instead of evaporating thermally, they decay through a sequence of spontaneous particle emissions, with each transition reducing the horizon area by a strictly integer number of Planck units. Quantization of the horizons can be a crucial missing link by which the notion of a minimal length in gravity eliminates physical singularities. If the black hole remnants with the minimal possible area and a mass of order a few TeV are stable, they may be good candidates for the cold dark matter in the Universe.
    Comment: 14 pages, LaTeX
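
    For orientation, a back-of-the-envelope sketch (using only the standard Schwarzschild relation, not the paper's detailed spectrum) of what quantizing the horizon area in integer Planck units would imply for the mass tower:

        % Hedged sketch: combining A = 16\pi G^2 M^2 / c^4 with A_N = N \ell_P^2
        \[
          A_N = N\,\ell_P^{2}, \qquad
          M_N = \sqrt{\tfrac{N}{16\pi}}\,M_P, \qquad
          M_N - M_{N-1} \simeq \frac{M_P}{2\sqrt{16\pi N}} ,
        \]
        % so each transition that removes one Planck unit of area releases an
        % energy of order M_P/\sqrt{N}, which is the "well defined spacing"
        % referred to in the abstract.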

    Hypernetwork functional image representation

    Motivated by the human way of memorizing images, we introduce their functional representation, in which an image is represented by a neural network. For this purpose, we construct a hypernetwork that takes an image and returns the weights of a target network, which maps a point in the plane (representing the position of a pixel) to its corresponding color in the image. Since the obtained representation is continuous, one can easily inspect the image at various resolutions and perform arbitrary continuous operations on it. Moreover, by inspecting interpolations we show that such a representation has some properties characteristic of generative models. To evaluate the proposed mechanism experimentally, we apply it to the image super-resolution problem. Despite using a single model for various scaling factors, we obtained results comparable to existing super-resolution methods.
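
    A minimal sketch of the idea in a PyTorch-style setup (the encoder, layer sizes, and sampling grid below are illustrative assumptions, not the authors' architecture): a hypernetwork maps the image to a flat weight vector, which parameterizes a small target MLP from pixel coordinates to colors.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        # (in_features, out_features) of each target-network layer -- illustrative sizes
        TARGET_SIZES = [(2, 32), (32, 32), (32, 3)]

        def num_target_params():
            return sum(i * o + o for i, o in TARGET_SIZES)

        class HyperNetwork(nn.Module):
            """Takes an image, returns a flat vector of target-network weights."""
            def __init__(self):
                super().__init__()
                self.encoder = nn.Sequential(
                    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
                    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(4), nn.Flatten(),
                )
                self.head = nn.Linear(32 * 4 * 4, num_target_params())

            def forward(self, image):                  # image: (1, 3, H, W)
                return self.head(self.encoder(image)).squeeze(0)

        def target_net(coords, theta):
            """Target MLP applied with externally supplied weights theta.
            coords: (P, 2) normalized pixel positions; returns (P, 3) colors."""
            h, offset = coords, 0
            for k, (i, o) in enumerate(TARGET_SIZES):
                w = theta[offset:offset + i * o].view(o, i); offset += i * o
                b = theta[offset:offset + o]; offset += o
                h = F.linear(h, w, b)
                if k < len(TARGET_SIZES) - 1:
                    h = torch.relu(h)
            return torch.sigmoid(h)

        # Because the representation is continuous, the image can be re-sampled
        # at an arbitrary resolution by evaluating the target net on a denser grid.
        image = torch.rand(1, 3, 64, 64)
        theta = HyperNetwork()(image)
        ys, xs = torch.meshgrid(torch.linspace(0, 1, 128),
                                torch.linspace(0, 1, 128), indexing="ij")
        coords = torch.stack([xs.reshape(-1), ys.reshape(-1)], dim=1)
        colors = target_net(coords, theta)             # (128*128, 3) upsampled image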

    A Time-Resolved Line-Focus Acoustic Microscopy Technique for Surface-Breaking Crack Depth Determination

    Time-resolved line-focus acoustic microscopy (TRLFAM) combines the advantages of a conventional pulse-echo system with those of the acoustic microscope. Compared to high-frequency line-focus acoustic microscopy [1], this technique employs a much larger (28 mm aperture) pulsed line-focus immersion transducer at much lower center frequencies. The insonified length of the specimen is an order of magnitude larger than that of the line-focus acoustic microscope operating at 225 MHz. This has the advantage that the amplitudes and arrival times of the directly reflected wave, the leaky surface wave, and other possible echo arrivals can be time-resolved with considerable accuracy as the sample is moved inside the focal region of the transducer. Moreover, since the transducer is line-focused, the leaky surface wave arrivals of an anisotropic material can be time-resolved along different directions. In earlier papers, TRLFAM has been used to determine elastic constants for both isotropic and anisotropic materials [2].
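
    As a hedged illustration of how time-resolved surface-wave arrivals can translate into a crack-depth estimate (a simplified time-of-flight argument, not necessarily the inversion used in this work; the velocity and delay are made-up numbers): if the surface wave that crosses the crack travels down one crack face and up the other, the extra path is roughly twice the crack depth.

        # Simplified time-of-flight estimate (assumed values, for illustration only)
        v_rayleigh = 2900.0      # m/s, a typical Rayleigh-wave velocity in steel
        extra_delay = 0.35e-6    # s, extra delay of the surface wave that crossed the crack

        depth = v_rayleigh * extra_delay / 2.0   # extra path ~ 2 * crack depth
        print(f"estimated crack depth: {depth * 1e3:.2f} mm")   # ~0.51 mm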

    The role of circulating tumour cells and nucleic acids in blood for the detection of bladder cancer: A systematic review

    BACKGROUND: Blood-based biomarkers are a neglected resource in bladder cancer, where the focus has mainly been on urinary biomarkers. However, blood-based biomarkers are gaining popularity in other solid cancers, particularly circulating tumour cells (CTCs) and circulating nucleic acids. In this systematic review, we identify and discuss the diagnostic value of CTC-, cell-free DNA- and RNA-based biomarkers in bladder cancer. METHODS: A MEDLINE/PubMed systematic search was performed using the following keywords: (bladder cancer) AND (blood OR plasma OR serum) AND biomarker AND (DNA OR RNA OR cfDNA OR cell-free DNA OR RNA OR CTC). All studies including blood-based biomarkers based on DNA, RNA and CTCs were reviewed. Of the included studies, those reporting sensitivity, specificity and/or AUC/ROC values were described further. RESULTS: The systematic search yielded 47 eligible studies, of which 21, 19 and 3 reported DNA, RNA and CTC biomarkers, respectively. Fifteen of these studies included sensitivity, specificity and/or AUC/ROC values. Biomarker sensitivity and specificity ranged widely, at 2.4-97.6% and 43.3-100%, respectively. The median number of patients recruited per study was 56 (IQR 41-90). Only 3 studies included an independent validation cohort. The highest sensitivity and specificity pairing achieved in a validation cohort was 80.0% and 89.1%, respectively. CONCLUSIONS: This systematic review provides a comprehensive overview of the blood-based CTC and nucleic acid biomarkers that have been investigated. An overlap of targets of interest between studies suggests that these could be promising biomarkers, but few achieve high sensitivity and specificity, and fewer still have been validated independently.

    A simple and robust method for connecting small-molecule drugs using gene-expression signatures

    Interaction of a drug or chemical with a biological system can result in a gene-expression profile or signature characteristic of the event. Using a suitably robust algorithm, these signatures can potentially be used to connect molecules with similar pharmacological or toxicological properties. The Connectivity Map was a novel concept and innovative tool first introduced by Lamb et al to connect small molecules, genes, and diseases using genomic signatures [Lamb et al (2006), Science 313, 1929-1935]. However, the Connectivity Map had some limitations; in particular, there was no effective safeguard against false connections when the observed connections were considered on an individual basis. Further, when several connections to the same small-molecule compound were viewed as a set, the implicit null hypothesis tested was not the most relevant one for the discovery of real connections. Here we propose a simple and robust method for constructing the reference gene-expression profiles and a new connection scoring scheme, which, importantly, allows the evaluation of the statistical significance of all the connections observed. We tested the new method with the two example gene signatures (HDAC inhibitors and estrogens) used by Lamb et al, as well as a new gene signature of immunosuppressive drugs. Our testing shows that the new method achieves a higher level of specificity and sensitivity than the original method. For example, our method successfully identified raloxifene and tamoxifen as having significant anti-estrogen effects, whereas Lamb et al's Connectivity Map failed to identify these. With these properties, our new method has potential use in drug development for the recognition of pharmacological and toxicological properties in new drug candidates.
    Comment: 8 pages, 2 figures, and 2 tables; supplementary data supplied as a ZIP file
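
    To make the idea of a connection score with an attached significance level concrete, here is a minimal rank-based sketch with a permutation test (an illustration of the general approach only; the normalization and permutation scheme are assumptions, not the authors' exact scoring method):

        import numpy as np

        def connection_score(signature, reference_ranks):
            """signature: dict gene -> +1 (up) or -1 (down) in the query signature.
            reference_ranks: dict gene -> signed rank of that gene in the reference
            profile (most up-regulated gene has the largest positive rank)."""
            s = sum(sign * reference_ranks[g] for g, sign in signature.items())
            top = sorted((abs(r) for r in reference_ranks.values()), reverse=True)
            max_s = sum(top[:len(signature)])   # best possible score for this signature size
            return s / max_s                    # normalized to lie in [-1, 1]

        def permutation_pvalue(signature, reference_ranks, n_perm=10000, seed=0):
            """Empirical p-value: how often a random gene set of the same size
            scores at least as strongly as the observed signature."""
            rng = np.random.default_rng(seed)
            genes = list(reference_ranks)
            signs = list(signature.values())
            observed = connection_score(signature, reference_ranks)
            null = np.empty(n_perm)
            for j in range(n_perm):
                random_genes = rng.choice(genes, size=len(signature), replace=False)
                null[j] = connection_score(dict(zip(random_genes, signs)), reference_ranks)
            return (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)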

    Novel urinary biomarkers for the detection of bladder cancer: A systematic review

    BACKGROUND: Urinary biomarkers for the diagnosis of bladder cancer are an area of considerable research and have been tested both in patients presenting with haematuria and in patients with non-muscle-invasive bladder cancer requiring surveillance cystoscopy. In this systematic review, we identify and appraise the diagnostic sensitivity and specificity of reported novel biomarkers of different 'omic' classes and highlight promising biomarkers investigated to date. METHODS: A MEDLINE/PubMed systematic search was performed between January 2013 and July 2017 using the following keywords: (bladder cancer OR transitional cell carcinoma OR urothelial cell carcinoma) AND (detection OR diagnosis) AND urine AND (biomarker OR assay). All included studies had a minimum of 20 patients in both the bladder cancer and control arms and reported sensitivity and/or specificity and/or a receiver operating characteristic (ROC) curve. The QUADAS-2 tool was used to assess the risk of bias and applicability of studies. The search protocol was registered in the PROSPERO database (CRD42016049918). RESULTS: The systematic search yielded 115 reports, which were included for analysis. Single-target biomarkers had a sensitivity of 2-94%, specificity of 46-100%, positive predictive value (PPV) of 47-100% and negative predictive value (NPV) of 21-94%. Multi-target biomarkers achieved a sensitivity of 24-100%, specificity of 48-100%, PPV of 42-95% and NPV of 32-100%. Fifty studies achieved a sensitivity and specificity of ≥80%. Protein (n = 59) and transcriptomic (n = 21) biomarkers represent the most studied biomarker classes. Multi-target biomarker panels had better diagnostic accuracy than single biomarker targets. Combining urinary cytology with urinary biomarkers improved the diagnostic ability of the biomarker. The sensitivity and specificity of biomarkers were higher for primary diagnosis than in the surveillance setting. Most studies were case-control studies and did not have a predefined threshold to determine a positive test result, indicating a possible risk of bias. CONCLUSION: This comprehensive systematic review provides an update on urinary biomarkers of different 'omic' classes and highlights promising biomarkers. Few biomarkers achieve both a high sensitivity and a high negative predictive value. Such biomarkers will require external validation in a prospective observational setting before adoption in clinical practice.
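
    Both bladder-cancer reviews above summarize studies by sensitivity, specificity, PPV and NPV; for reference, these follow from a 2x2 confusion matrix as in the short sketch below (the counts are hypothetical, not data from any included study):

        def diagnostic_metrics(tp, fp, fn, tn):
            """Standard diagnostic-accuracy metrics from a 2x2 confusion matrix."""
            sensitivity = tp / (tp + fn)   # fraction of cancers that test positive
            specificity = tn / (tn + fp)   # fraction of controls that test negative
            ppv = tp / (tp + fp)           # probability a positive test is a true cancer
            npv = tn / (tn + fn)           # probability a negative test is truly cancer-free
            return sensitivity, specificity, ppv, npv

        # Hypothetical example counts
        print(diagnostic_metrics(tp=45, fp=8, fn=11, tn=92))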

    Towards Accurate Estimation of the Proportion of True Null Hypotheses in Multiple Testing

    BACKGROUND: Biomedical researchers are now often faced with situations where it is necessary to test a large number of hypotheses simultaneously, e.g., in comparative gene-expression studies using high-throughput microarray technology. To properly control false-positive errors, the FDR (false discovery rate) approach has become widely used in multiple testing. Accurate estimation of the FDR requires that the proportion of true null hypotheses be accurately estimated. To date, many methods for estimating this quantity have been proposed. Typically, when a new method is introduced, some simulations are carried out to show its improved accuracy. However, the simulations are often limited, covering only a few points in the parameter space. RESULTS: Here I have carried out extensive in silico experiments to compare some commonly used methods for estimating the proportion of true null hypotheses. The coverage of these simulations over the parameter space is unprecedentedly thorough compared to typical simulation studies in the literature. Thus this work makes it possible to draw global conclusions about the performance of these different methods. It was found that a very simple method gives the most accurate estimation over a dominantly large area of the parameter space. Given its simplicity and overall superior accuracy, I recommend it as the first choice for estimating the proportion of true null hypotheses in multiple testing.
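
    For orientation, one simple and widely used estimator of this proportion (a Storey-type estimator with a fixed lambda) is sketched below; it is offered only as an illustration of the kind of method being compared, not as the specific method recommended in the abstract:

        import numpy as np

        def estimate_pi0(pvalues, lam=0.5):
            """Storey-type estimator: under the null, p-values are uniform on [0, 1],
            so p-values above `lam` come mostly from true nulls and
                pi0_hat = #{p_i > lam} / ((1 - lam) * m)."""
            p = np.asarray(pvalues)
            return min(1.0, np.sum(p > lam) / ((1.0 - lam) * p.size))

        # Toy check: 80% true nulls (uniform p-values) mixed with 20% alternatives.
        rng = np.random.default_rng(1)
        p = np.concatenate([rng.uniform(size=8000), rng.beta(0.2, 5.0, size=2000)])
        print(estimate_pi0(p))   # close to (slightly above) 0.8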

    Anisotropic Structure of the Order Parameter in FeSe0.45Te0.55 Revealed by Angle Resolved Specific Heat

    The symmetry and structure of the superconducting gap in the Fe-based superconductors are the central issue for understanding these novel materials. So far the experimental data and theoretical models have been highly controversial. Some experiments favor two or more constant or nearly constant gaps, others indicate strong anisotropy, and yet others suggest gap zeros ("nodes"). Theoretical models also vary, suggesting that the absence or presence of the nodes depends quantitatively on the model parameters. An opinion that has gained substantial currency is that the gap structure, unlike in all other known superconductors, including the cuprates, may be different in different compounds within the same family. A unique method for addressing this issue, and one of the very few that is both bulk and angle-resolved, is to measure the electronic specific heat in a rotating magnetic field, as a function of the field orientation with respect to the crystallographic axes. In this Communication we present the first such measurement for an Fe-based high-Tc superconductor (FeBSC). We observed a fourfold oscillation of the specific heat as a function of the in-plane magnetic field direction, which allowed us to identify the locations of the gap minima (or nodes) on the Fermi surface. Our results are consistent with the expectations of an extended s-wave model with a significant gap anisotropy on the electron pockets and gap minima along the Γ-M (or Fe-Fe bond) direction.
    Comment: 32 pages, 7 figures
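
    To illustrate what a fourfold oscillation of the specific heat means in practice, a minimal fit of synthetic angle-resolved data to C(phi) = C0 + C4*cos[4(phi - phi0)] is sketched below (the amplitudes and noise are arbitrary; this is not the paper's data or analysis). The minima of the fitted curve mark the field directions associated with gap minima or nodes.

        import numpy as np
        from scipy.optimize import curve_fit

        def fourfold(phi, c0, c4, phi0):
            """Fourfold in-plane field-angle dependence of the specific heat."""
            return c0 + c4 * np.cos(4.0 * (phi - phi0))

        # Synthetic "data": arbitrary amplitudes plus noise, phi in radians.
        phi = np.linspace(0.0, 2.0 * np.pi, 73)
        data = fourfold(phi, 10.0, 0.4, 0.0) + 0.05 * np.random.default_rng(0).normal(size=phi.size)

        popt, _ = curve_fit(fourfold, phi, data, p0=[10.0, 0.3, 0.1])
        c0, c4, phi0 = popt
        # For c4 > 0, C(phi) is minimal where cos(4(phi - phi0)) = -1:
        minima = phi0 + np.pi / 4.0 + np.arange(4) * np.pi / 2.0
        print("oscillation amplitude:", c4, "; gap-minimum directions (rad):", minima)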

    Observation of inhibited electron-ion coupling in strongly heated graphite

    Creating non-equilibrium states of matter with highly unequal electron and lattice temperatures (T_ele ≠ T_ion) allows unsurpassed insight into the dynamic coupling between electrons and ions through time-resolved energy-relaxation measurements. Recent studies on low-temperature laser-heated graphite suggest a complex energy exchange when compared to other materials. To avoid problems related to surface preparation, crystal quality and a poor understanding of the energy deposition and transport mechanisms, we apply a different energy deposition mechanism, via laser-accelerated protons, to isochorically and non-radiatively heat macroscopic graphite samples up to temperatures close to the melting threshold. Using time-resolved x-ray diffraction, we show clear evidence of a very small electron-ion energy transfer, yielding relaxation times approximately three times longer than previously reported. This is indicative of an energy-transfer bottleneck in non-equilibrium warm dense matter.
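
    Such relaxation measurements are commonly interpreted with a two-temperature model, in which an electron-ion coupling constant g sets how quickly T_ele and T_ion converge; a hedged sketch with purely illustrative parameter values (not the graphite values measured here) is given below. A smaller g, as reported above, directly translates into a longer relaxation time.

        from scipy.integrate import solve_ivp

        # Illustrative constants (NOT the measured graphite values)
        Ce, Ci = 2.0e4, 1.5e6    # electron / ion heat capacities [J m^-3 K^-1]
        g = 1.0e16               # electron-ion coupling constant [W m^-3 K^-1]

        def two_temperature(t, y):
            """Standard two-temperature model: energy flows between electrons and
            ions at a rate set by the coupling constant g."""
            Te, Ti = y
            return [-g * (Te - Ti) / Ce, +g * (Te - Ti) / Ci]

        sol = solve_ivp(two_temperature, (0.0, 50e-12), [20000.0, 300.0])
        tau = Ce * Ci / (g * (Ce + Ci))   # e-folding time of Te - Ti for constant Ce, Ci
        print("relaxation time ~ %.2g s" % tau, "; final Te, Ti =", sol.y[:, -1])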

    The Impact of Non-Equipartition on Cosmological Parameter Estimation from Sunyaev-Zel'dovich Surveys

    The collisionless accretion shock at the outer boundary of a galaxy cluster should primarily heat the ions rather than the electrons, since the ions carry most of the kinetic energy of the infalling gas. Near the accretion shock, the density of the intracluster medium is very low and the Coulomb collisional timescale is longer than the accretion timescale, so electrons and ions may not achieve equipartition in these regions. Numerical simulations have shown that the Sunyaev-Zel'dovich observables (e.g., the integrated Comptonization parameter Y) for relaxed clusters can be biased by a few percent. The Y-mass relation can thus be biased if non-equipartition effects are not properly taken into account. Using a set of hydrodynamical simulations, we have calculated three potential systematic biases in the Y-mass relations introduced by non-equipartition effects during the cross-calibration or self-calibration when using the galaxy cluster abundance technique to constrain cosmological parameters. We then use a semi-analytic technique to estimate the non-equipartition effects on the distribution functions of Y (Y functions) determined from the extended Press-Schechter theory. Depending on the calibration method, we find that non-equipartition effects can induce systematic biases in the Y functions, and the values of the cosmological parameters Omega_8, sigma_8, and the dark energy equation of state parameter w can be biased by a few percent. In particular, non-equipartition effects can introduce an apparent evolution in w of a few percent in all of the systematic cases we considered. Techniques are suggested to take the non-equipartition effect into account empirically when using the cluster abundance technique to study precision cosmology. We conclude that systematic uncertainties in the Y-mass relation of even a few percent can introduce a comparable level of bias in cosmological parameter measurements.
    Comment: 10 pages, 3 figures, accepted for publication in the Astrophysical Journal, abstract abridged slightly. Typos corrected in version
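
    For reference, the SZ observables discussed above are defined (in standard notation, not quoted from the paper) as the line-of-sight Compton y parameter and its area integral Y, both of which trace the electron temperature:

        \[
          y = \frac{\sigma_T k_B}{m_e c^2} \int n_e T_e \, dl ,
          \qquad
          Y = \int y \, dA \;\propto\; \int n_e T_e \, dV .
        \]
        % Because Y depends on the electron temperature T_e rather than the mean
        % gas temperature, regions where T_e < T_ion (non-equipartition) lower Y
        % at fixed cluster mass, which is the origin of the bias in the Y-mass
        % relation described above.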