38 research outputs found

    Handbook for Efficiently Quantifying Robustness of Magic

    Nonstabilizerness, or magic, is an essential quantum resource for performing universal quantum computation. Robustness of magic (RoM) in particular characterizes the degree of usefulness of a given quantum state for non-Clifford operations. While the mathematical formalism of RoM can be stated concisely, determining the RoM in practice is extremely challenging, since it involves superexponentially many pure stabilizer states. In this work, we present novel, efficient algorithms to compute the RoM. The crucial technique is a subroutine with two remarkable features for calculating overlaps between pure stabilizer states: (i) the time complexity per stabilizer is reduced exponentially, and (ii) the space complexity is reduced superexponentially. Based on this subroutine, we present algorithms to compute the RoM for arbitrary states of up to n=7 qubits on a laptop, whereas brute-force methods require 86 TiB of memory. As a byproduct, the proposed subroutine allows us to compute the stabilizer fidelity for up to n=8 qubits, for which naive methods would require 86 PiB of memory, beyond the reach of any state-of-the-art classical computer. We further propose algorithms that exploit prior knowledge of the structure of the target quantum state, such as permutation symmetry or partial disentanglement, and we numerically demonstrate state-of-the-art results for copies of magic states and partially disentangled quantum states. This series of algorithms constitutes a comprehensive "handbook" for scaling up the computation of the RoM, and we envision that the proposed techniques apply to the computation of other quantum resource measures as well.
    Comment: 16+12 pages, 8+1 figures
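    Concretely, the RoM of a state ρ is the minimal L1 norm over decompositions into pure stabilizer states, R(ρ) = min { Σ_i |x_i| : ρ = Σ_i x_i σ_i, each σ_i a pure stabilizer state }, which is a linear program over the stabilizer set. As a minimal sketch of this definition (not the paper's optimized algorithm, and feasible only at tiny scale), the following computes the single-qubit RoM by brute force over the six pure stabilizer states; the Bloch-vector encoding and the scipy linprog formulation are illustrative choices, and reaching n≈7 requires the overlap subroutine described above.

        # Brute-force RoM for ONE qubit: minimize sum_i |x_i| subject to
        # rho = sum_i x_i * sigma_i over the 6 pure stabilizer states.
        # Illustrative sketch only; not the paper's scalable algorithm.
        import numpy as np
        from scipy.optimize import linprog

        # Bloch vectors of the six single-qubit pure stabilizer states:
        # |0>, |1>, |+>, |->, |+i>, |-i>.
        STAB = np.array([[0, 0, 1], [0, 0, -1], [1, 0, 0],
                         [-1, 0, 0], [0, 1, 0], [0, -1, 0]], dtype=float)

        def rom(bloch):
            # Equality constraints: sum_i x_i = 1 (unit trace) and
            # sum_i x_i * bloch_i = target Bloch vector.
            A = np.vstack([np.ones(6), STAB.T])        # 4 x 6
            b = np.concatenate([[1.0], bloch])
            # Split x = p - m with p, m >= 0 so the L1 objective is linear.
            res = linprog(c=np.ones(12), A_eq=np.hstack([A, -A]),
                          b_eq=b, bounds=(0, None))
            return res.fun

        # T-type magic state (|0> + e^{i*pi/4}|1>)/sqrt(2): known RoM = sqrt(2).
        print(rom(np.array([1.0, 1.0, 0.0]) / np.sqrt(2)))   # ~1.41421

    For n qubits the number of pure stabilizer states grows superexponentially, which is exactly why the overlap subroutine above is needed to scale this same linear program to n=7.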

    In vivo inhibition of angiogenesis by interleukin-13 gene therapy in a rat model of rheumatoid arthritis

    Objective: Interleukin-13 (IL-13) is a pleiotropic cytokine that can affect vessel formation, an important component of the rheumatoid arthritis (RA) synovial tissue pannus. The purpose of this study was to use a gene therapy approach to investigate the role of IL-13 in angiogenesis in vivo, using a rat adjuvant-induced arthritis model of RA. Methods: Ankle joints of female rats were injected preventatively with an adenovirus vector containing human IL-13 (AxCAIL-13), a control vector with no insert (AxCANI), or phosphate buffered saline (PBS). Joints were harvested at the peak of arthritis, and histologic and biochemical features were evaluated. Results: AxCAIL-13-treated joint homogenates had lower hemoglobin levels, suggesting reduced joint vascularity, and both endothelial cell migration and tube formation were significantly inhibited (P < 0.05). Similarly, AxCAIL-13 inhibited capillary sprouting in the rat aortic ring assay and vessel growth in the Matrigel plug in vivo assay. IL-13 gene delivery resulted in up-regulation and association of phosphorylated ERK-1/2 and protein kinase Cα/βII, suggesting a novel pathway in IL-13-mediated angiostasis. The angiostatic effect of AxCAIL-13 was associated with down-regulation of proangiogenic cytokines (IL-18, cytokine-induced neutrophil chemoattractant 1/CXCL1, lipopolysaccharide-induced CXC chemokine/CXCL5) and up-regulation of the angiogenesis inhibitor endostatin. The expression and activity of matrix metalloproteinases 2 and 9, which participate in angiogenesis, were impaired in response to IL-13 as compared with AxCANI and PBS treatment. Conclusion: Our findings support a role for IL-13 as an in vivo antiangiogenic factor and provide a rationale for its use in RA to control pathologic neovascularization.
    Peer Reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/56107/1/22823_ftp.pd

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome, including studies of gene number, birth, and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.

    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process and those that measure flux through the autophagy pathway (i.e., the complete process, including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and that evaluating its competence is a crucial part of evaluating autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects when blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as specific markers for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.

    Radiation hardness study for the COMET Phase-I electronics

    Radiation damage to front-end readout and trigger electronics is an important issue in the COMET Phase-I experiment at J-PARC, which plans to search for the neutrinoless transition of a muon to an electron. To produce an intense muon beam, a high-power proton beam impinges on a graphite target, resulting in a high-radiation environment. We require radiation tolerance to a total dose of 1.0 kGy and a 1 MeV equivalent neutron fluence of 1.0×10¹² n_eq cm⁻², including a safety factor of 5 over the duration of the physics measurement. In such an environment, the use of commercially available electronic components with high radiation tolerance, if such components can be secured, is desirable. The radiation hardness of commercial electronic components has been evaluated in gamma-ray and neutron irradiation tests. As a result of these tests, voltage regulators, ADCs, DACs, and several other components were found to have sufficient tolerance to both gamma-ray and neutron irradiation at the level we require.
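    As a worked example of how the quoted figures fold in the safety factor (illustrative arithmetic only: the expected-exposure values below are simply the stated requirements divided by the factor of 5, not numbers quoted in the paper):

        # Requirement = expected exposure x safety factor, so the expected
        # exposure implied by the stated requirements is requirement / 5.
        SAFETY_FACTOR = 5
        required_dose_kGy = 1.0      # total ionizing dose requirement
        required_fluence = 1.0e12    # 1 MeV equivalent neutron fluence, n_eq / cm^2

        expected_dose_kGy = required_dose_kGy / SAFETY_FACTOR    # 0.2 kGy
        expected_fluence = required_fluence / SAFETY_FACTOR      # 2.0e11 n_eq / cm^2
        print(expected_dose_kGy, expected_fluence)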