
    Poor qubits make for rich physics: noise-induced quantum Zeno effects and noise-induced Berry phases

    We briefly review three ways that environmental noise can slow down (or speed up) quantum transitions: (i) Lamb shifts, (ii) over-damping and (iii) the orthogonality catastrophe. We compare them with the quantum Zeno effect induced by observing the system. These effects are relevant to poor qubits (those strongly coupled to noise). We discuss Berry phases generated by the orthogonality catastrophe, and argue that noise may make it easier to observe Berry phases.
    Comment: 6 pages - Proceedings of International Conference on Noise and Fluctuations (Pisa, 14-19 June 2009) - Improved with respect to version in Conf. Pro

    Efficient algorithms to discover alterations with complementary functional association in cancer

    Recent large cancer studies have measured somatic alterations in an unprecedented number of tumours. These large datasets allow the identification of cancer-related sets of genetic alterations by identifying relevant combinatorial patterns. Among such patterns, mutual exclusivity has been employed by several recent methods that have shown its effectiveness in characterizing gene sets associated with cancer. Mutual exclusivity arises because of the complementarity, at the functional level, of alterations in genes which are part of a group (e.g., a pathway) performing a given function. The availability of quantitative target profiles, from genetic perturbations or from clinical phenotypes, provides additional information that can be leveraged to improve the identification of cancer-related gene sets by discovering groups with complementary functional associations with such targets. In this work we study the problem of finding groups of mutually exclusive alterations associated with a quantitative (functional) target. We propose a combinatorial formulation for the problem, and prove that the associated computational problem is hard. We design two algorithms to solve the problem and implement them in our tool UNCOVER. We provide analytic evidence of the effectiveness of UNCOVER in finding high-quality solutions and show experimentally that UNCOVER finds sets of alterations significantly associated with functional targets in a variety of scenarios. In addition, our algorithms are much faster than the state-of-the-art, allowing the analysis of large datasets of thousands of target profiles from cancer cell lines. We show that on one such dataset from project Achilles our methods identify several significant gene sets with complementary functional associations with targets.
    Comment: Accepted at RECOMB 201
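    The combinatorial formulation in the abstract above lends itself to a small illustration. The sketch below scores a candidate gene set by rewarding the target weight of samples altered in exactly one gene of the set and penalizing multiply-altered samples; this is a simplified stand-in for the general idea of target-associated mutual exclusivity, not UNCOVER's exact objective, and all names and data are hypothetical.

```python
import numpy as np
from itertools import combinations

def exclusivity_score(alterations, target, gene_set):
    # alterations: binary (n_genes, n_samples) matrix; target: per-sample weights.
    # Reward samples altered in exactly one gene of the set; penalize overlaps.
    # (Illustrative toy score, not the paper's objective function.)
    counts = alterations[list(gene_set)].sum(axis=0)
    return target[counts == 1].sum() - target[counts > 1].sum()

def best_set(alterations, target, k):
    # Brute-force search over all k-subsets; the hardness result mentioned
    # in the abstract is why practical tools need smarter algorithms.
    n_genes = alterations.shape[0]
    return max(combinations(range(n_genes), k),
               key=lambda s: exclusivity_score(alterations, target, s))
```

On a toy binary matrix this picks the pair of genes whose alterations tile the high-weight samples without overlapping each other, which is the mutual-exclusivity signal the abstract describes.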

    Analytical computation of the off-axis Effective Area of grazing incidence X-ray mirrors

    Focusing mirrors for X-ray telescopes in grazing incidence, introduced in the 1970s, are characterized in terms of their performance by their imaging quality and effective area, which in turn determine their sensitivity. Even though the on-axis effective area is assumed in general to characterize the collecting power of an X-ray optic, the telescope capability of imaging extended X-ray sources is also determined by the variation in its effective area with the off-axis angle. [...] The complex task of designing optics for future X-ray telescopes entails detailed computations of both imaging quality and effective area on- and off-axis. Because of their apparent complexity, both aspects have been, so far, treated by using ray-tracing routines aimed at simulating the interaction of X-ray photons with the reflecting surfaces of a given focusing system. Although this approach has been widely exploited and proven to be effective, it would also be attractive to regard the same problem from an analytical viewpoint, to assess an optical design of an X-ray optical module with a simpler calculation than a ray-tracing routine. [...] We have developed useful analytical formulae for the off-axis effective area of a double-reflection mirror in the double cone approximation, requiring only an integration and the standard routines to calculate the X-ray coating reflectivity for a given incidence angle. [...] Algebraic expressions are provided for the mirror geometric area, as a function of the off-axis angle. Finally, the results of the analytical computations presented here are validated by comparison with the corresponding predictions of a ray-tracing code.
    Comment: 12 pages, 11 figures, accepted for publication in "Astronomy & Astrophysics", section "Instruments, observational techniques, and data processing". Updated version after grammatical revision and typos correctio
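    The double cone approximation described above can be sketched numerically: integrate over azimuth the projected strip area, vignetted by the shallower of the two grazing angles, times the product of the two coating reflectivities. The small-angle geometry below is an assumption made for illustration, not the paper's closed-form formulae, and the reflectivity callable is a placeholder for a real coating-reflectivity routine.

```python
import numpy as np

def effective_area(theta, alpha0, R, L, reflectivity):
    # theta: off-axis angle, alpha0: on-axis grazing angle (both in radians),
    # R: mirror radius, L: cone segment length, reflectivity: callable r(alpha).
    phi = np.linspace(0.0, 2.0 * np.pi, 2001)
    # Small-angle assumption: off-axis, the grazing angles on the two cone
    # segments vary with azimuth in opposite directions.
    a1 = alpha0 - theta * np.cos(phi)
    a2 = alpha0 + theta * np.cos(phi)
    # Double reflection requires both angles positive; the collecting strip
    # is limited by the shallower of the two (simplified vignetting).
    strip = R * L * np.minimum(a1, a2) * ((a1 > 0) & (a2 > 0))
    integrand = strip * reflectivity(a1) * reflectivity(a2)
    dphi = phi[1] - phi[0]
    return integrand[:-1].sum() * dphi  # simple Riemann sum over azimuth
```

With perfect reflectivity and theta = 0 this reduces to the on-axis geometric area 2*pi*R*L*alpha0, and the returned area decreases as the source moves off-axis, which is the qualitative behaviour the abstract's analytical formulae capture.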

    Tree Level Unitarity Bounds for the Minimal B-L Model

    We have derived the unitarity bounds in the high energy limit for the minimal B-L extension of the Standard Model by analysing the full class of Higgs and would-be Goldstone boson two-to-two scatterings at tree level. Moreover, we have investigated how these limits could vary at some lower critical value of the energy.
    Comment: 20 pages, 4 figures, 2 tables; 1d figure modified, typos corrected, bibliography augmented; published in PRD after minor adjustmen

    The Z' boson of the minimal B-L model at future Linear Colliders in e+e- --> mu+mu-

    We study the capabilities of future electron-positron Linear Colliders, with centre-of-mass energy at the TeV scale, in accessing the parameter space of a Z' boson within the minimal B-L model. We carry out a detailed comparison between the discovery regions mapped over a two-dimensional configuration space (Z' mass and coupling) at the Large Hadron Collider and possible future Linear Colliders for the case of di-muon production. As known in the literature for other Z' models, we confirm that leptonic machines, as compared to the CERN hadronic accelerator, display an additional potential in discovering a Z' boson as well as in allowing one to study its properties at a level of precision well beyond that of any of the existing colliders.
    Comment: 5 pages, proceeding of LC09 (Perugia), published by the Italian Physical Society in the Nuovo Cimento C (Colloquia

    Current medical treatment of estrogen receptor-positive breast cancer

    Approximately 80% of breast cancers (BC) are estrogen receptor (ER)-positive, and thus endocrine therapy (ET) should be considered complementary to surgery in the majority of patients. The advantages of oophorectomy, adrenalectomy and hypophysectomy in women with advanced BC were demonstrated many years ago, and currently ET consists of (i) ovarian function suppression (OFS), usually obtained using gonadotropin-releasing hormone agonists (GnRHa), (ii) selective estrogen receptor modulators or down-regulators (SERMs or SERDs), (iii) aromatase inhibitors (AIs), or a combination of two or more drugs. For patients aged less than 50 years with ER+ BC, there is no conclusive evidence that the combination of OFS and SERMs (i.e. tamoxifen) or chemotherapy is superior to OFS alone. Tamoxifen users exhibit a reduced risk of BC, both invasive and in situ, especially during the first 5 years of therapy, and extending the treatment to 10 years further reduces the risk of recurrence. SERDs (i.e. fulvestrant) are especially useful in the neoadjuvant treatment of advanced BC, alone or in combination with either cytotoxic agents or AIs. There are two types of AIs: type I are permanent steroidal inhibitors of aromatase, while type II are reversible nonsteroidal inhibitors. Several studies demonstrated the superiority of the third-generation AIs (i.e. anastrozole and letrozole) compared with tamoxifen, and adjuvant therapy with AIs reduces the recurrence risk, especially in patients with advanced BC. Unfortunately, some cancers are or become ET-resistant, and thus other drugs have been suggested in combination with SERMs or AIs, including cyclin-dependent kinase 4/6 inhibitors (palbociclib) and mammalian target of rapamycin (mTOR) inhibitors, such as everolimus. Further studies are required to confirm their real usefulness.

    Financial dollarization: the role of banks and interest rates

    This paper develops a model to explain the determinants of financial dollarization. Expanding on the existing literature, our framework allows interest rate differentials to play a role in explaining financial dollarization. It also accounts for the increasing presence of foreign banks in the local financial sector. Using a newly compiled data set on transition economies, we find that increasing access to foreign funds leads to higher credit dollarization, while it decreases deposit dollarization. Interest rate differentials matter for the dollarization of both loans and deposits. Overall, the empirical results lend support to the predictions of our theoretical model.
    Keywords: Financial dollarization, foreign banks, interest rate differentials, transition economies

    Investment Cost Channel and Monetary Transmission

    We show that a standard DSGE model with investment cost channels has important model stability and policy implications. Our analysis suggests that in economies characterized by supply-side as well as demand-side channels of monetary transmission, policymakers may have to take a much more aggressive stance against inflation to obtain a locally unique equilibrium. In such an environment, targeting the output gap may cause model instability. We also show that it is difficult to distinguish between the New Keynesian model and the labor-cost-channel-only case, while with the investment cost channel the differences are more significant. This result is important as it suggests that if one does not take into account the investment cost channel, one underestimates the importance of supply-side effects.
    Keywords: Cost channel, investment finance, Taylor rule, indeterminacy

    Photon- and Pomeron-induced production of Dijets in pp, pA and AA collisions

    In this paper we present a detailed comparison of dijet production by photon-photon, photon-pomeron and pomeron-pomeron interactions in pp, pA and AA collisions at the LHC energy. The transverse momentum, pseudorapidity and angular dependencies of the cross sections are calculated at the LHC energy using the Forward Physics Monte Carlo (FPMC), which allows us to obtain realistic predictions for dijet production with two leading intact hadrons. We find that the photon-pomeron channel is dominant at forward rapidities in pp collisions and in the full kinematical range in the nuclear collisions of heavy nuclei. Our results indicate that the analysis of dijet production at the LHC can be useful to test the Resolved Pomeron model as well as to constrain the magnitude of the absorption effects.
    Comment: 11 pages, 6 figures, 1 table. Improved and enlarged version published in European Physical Journal