
    Showcasing a Barren Plateau Theory Beyond the Dynamical Lie Algebra

    Barren plateaus have emerged as a pivotal challenge for variational quantum computing. Our understanding of this phenomenon underwent a transformative shift with the recent introduction of a Lie algebraic theory capable of explaining most sources of barren plateaus. However, this theory requires either initial states or observables that lie in the circuit's Lie algebra. Focusing on parametrized matchgate circuits, in this work we go beyond this assumption and provide an exact formula for the loss function variance that is valid for arbitrary input states and measurements. Our results reveal that new phenomena emerge when the Lie algebra constraint is relaxed. For instance, we find that the variance does not necessarily vanish inversely with the Lie algebra's dimension. Instead, this measure of expressiveness is replaced by a generalized expressiveness quantity: the dimension of the Lie group modules. By characterizing the operators in these modules as products of Majorana operators, we can introduce a precise notion of generalized globality and show that measuring generalized-global operators leads to barren plateaus. Our work also provides operational meaning to the generalized entanglement, as we connect it with known fermionic entanglement measures and show that it satisfies a monogamy relation. Finally, while parametrized matchgate circuits are not efficiently simulable in general, our results suggest that the structure allowing for trainability may also lead to classical simulability. Comment: 5+26 pages, 2+1 figures
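    The scaling shift this abstract describes can be summarized schematically. The notation below (loss ℓ, DLA 𝔤, module ℬ_k) is assumed for illustration, not taken from the paper itself:

    ```latex
    % Standard Lie-algebraic result: when the measured observable lies in the
    % circuit's dynamical Lie algebra (DLA) \mathfrak{g}, the loss variance
    % vanishes inversely with the DLA dimension:
    \mathrm{Var}_{\boldsymbol{\theta}}\big[\ell(\boldsymbol{\theta})\big]
      \in \mathcal{O}\!\left(\frac{1}{\dim \mathfrak{g}}\right)
    % Generalized result sketched in the abstract: for arbitrary states and
    % measurements, the DLA dimension is replaced by the dimension of the
    % Lie-group module \mathcal{B}_k containing the operator:
    \mathrm{Var}_{\boldsymbol{\theta}}\big[\ell(\boldsymbol{\theta})\big]
      \in \mathcal{O}\!\left(\frac{1}{\dim \mathcal{B}_k}\right)
    ```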

    Parallel-in-time quantum simulation via Page and Wootters quantum time

    In the past few decades, researchers have created a veritable zoo of quantum algorithms by drawing inspiration from classical computing, information theory, and even physical phenomena. Here we present quantum algorithms for parallel-in-time simulations that are inspired by the Page and Wootters formalism. In this framework, and thus in our algorithms, the classical time variable of quantum mechanics is promoted to the quantum realm by introducing a Hilbert space of "clock" qubits which are then entangled with the "system" qubits. We show that our algorithms can compute temporal properties over N different times of many-body systems by using only log(N) clock qubits. As such, we achieve an exponential trade-off between time and spatial complexities. In addition, we rigorously prove that the entanglement created between the system qubits and the clock qubits has operational meaning, as it encodes valuable information about the system's dynamics. We also provide a circuit depth estimation of all the protocols, showing an exponential advantage in computation times over traditional sequential-in-time algorithms. In particular, for the case when the dynamics are determined by the Aubry-André model, we present a hybrid method for which our algorithms have a depth that scales only as O(log(N) n). As a by-product, we can relate the previous schemes to the problem of equilibration of an isolated quantum system, thus indicating that our framework enables a new dimension for studying dynamical properties of many-body systems. Comment: 19+15 pages, 18+1 figures
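    The clock-system entanglement described above follows the Page and Wootters construction, in which time is encoded by superposing the system's evolution over all N time steps in a single "history state". A standard textbook form of that state (the symbols |t⟩_C, |ψ_t⟩_S, and U(t) are assumed notation, not taken from the paper) is:

    ```latex
    % Page--Wootters history state over N time steps, which the abstract's
    % algorithms encode on only log2(N) clock qubits:
    |\Psi\rangle \;=\; \frac{1}{\sqrt{N}} \sum_{t=0}^{N-1}
        |t\rangle_{C} \otimes |\psi_t\rangle_{S},
    \qquad
    |\psi_t\rangle_{S} \;=\; U(t)\,|\psi_0\rangle_{S}
    % Conditioning the clock register C on |t> recovers the system state at
    % time t, so temporal correlations are accessible from one entangled state.
    ```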

    Fiber Optic Cable Thermal Preparation to Ensure Stable Operation

    Fiber optic cables are widely used in modern systems that must provide stable operation during exposure to changing environmental conditions. For example, a fiber optic cable on a satellite may have to function reliably over a temperature range of -50 °C up to 125 °C. While the system requirements for a particular application will dictate the exact method by which the fibers should be prepared, this work examines multiple ruggedized fibers prepared in different fashions and subjected to thermal qualification testing. The data show that, if properly conditioned, the fiber cables can provide stable operation; if conditioned incorrectly, they will exhibit large fluctuations in transmission.

    Validity of low-contrast letter acuity as a visual performance outcome measure for multiple sclerosis.

    Low-contrast letter acuity (LCLA) has emerged as the leading outcome measure to assess visual disability in multiple sclerosis (MS) research. As visual dysfunction is one of the most common manifestations of MS, sensitive visual outcome measures are important in examining the effect of treatment. Low-contrast acuity captures visual loss not seen in high-contrast visual acuity (HCVA) measurements. These issues are addressed by the MS Outcome Assessments Consortium (MSOAC), including representatives from advocacy organizations, the Food and Drug Administration (FDA), European Medicines Agency (EMA), National Institute of Neurological Disorders and Stroke (NINDS), academic institutions, and industry partners, along with persons living with MS. MSOAC goals are acceptance and qualification by regulators of performance outcomes that are highly reliable and valid, practical, cost-effective, and meaningful to persons with MS. A critical step is elucidation of clinically relevant benchmarks, well-defined degrees of disability, and gradients of change that are clinically meaningful. This review shows that MS patients and disease-free controls have similar median HCVA, while MS patients have significantly lower LCLA. Deficits in LCLA and vision-specific quality of life are found many years after an episode of acute optic neuritis, even when HCVA has recovered. Studies reveal correlations between LCLA and the Expanded Disability Status Scale (EDSS), Multiple Sclerosis Functional Composite (MSFC), retinal nerve fiber layer (RNFL) and ganglion cell layer plus inner plexiform layer (GCL + IPL) thickness on optical coherence tomography (OCT), brain magnetic resonance imaging (MRI), visual evoked potentials (VEP), electroretinogram (ERG), pupillary function, and King-Devick testing. This review also concludes that a 7-point change in LCLA is clinically meaningful. The overall goal of this review is to describe and characterize the LCLA metric for research and clinical use among persons with MS.

    Space Flight Qualification on a Multi-Fiber Ribbon Cable and Array Connector Assembly

    NASA's Goddard Space Flight Center (GSFC), in cooperation with Sandia National Laboratories, completed a series of tests on three separate configurations of multi-fiber ribbon cable and MTP connector assemblies. These tests simulate the aging process of components during launch and long-term space environmental exposure. The multi-fiber ribbon cable assembly was constructed of non-outgassing materials, with radiation-hardened, graded-index 100/140-micron optical fiber. The results of this characterization presented here include vibration testing, thermal vacuum monitoring, and extended radiation exposure testing data.

    Photonic Component Qualification and Implementation Activities at NASA Goddard Space Flight Center

    The photonics group in Code 562 at NASA Goddard Space Flight Center supports a variety of space flight programs at NASA, including the International Space Station (ISS), Shuttle Return to Flight Mission, Lunar Reconnaissance Orbiter (LRO), Express Logistics Carrier, and the NASA Electronic Parts and Packaging Program (NEPP). Through research, development, and testing of the photonic systems that support these missions, much information has been gathered on practical implementations for space environments. Presented here are the highlights and lessons learned from striving to satisfy project requirements for high-performance, reliable commercial optical fiber components in space flight systems. The approach to qualifying optical fiber components for harsh environmental conditions, the physics of failure, and development lessons learned are discussed.

    Managing toxicities associated with immune checkpoint inhibitors: consensus recommendations from the Society for Immunotherapy of Cancer (SITC) Toxicity Management Working Group.

    Cancer immunotherapy has transformed the treatment of cancer. However, increasing use of immune-based therapies, including the widely used class of agents known as immune checkpoint inhibitors (ICIs), has exposed a discrete group of immune-related adverse events (irAEs). Many of these are driven by the same immunologic mechanisms responsible for the drugs' therapeutic effects, namely blockade of inhibitory mechanisms that suppress the immune system and protect body tissues from an unconstrained acute or chronic immune response. Skin, gut, endocrine, lung, and musculoskeletal irAEs are relatively common, whereas cardiovascular, hematologic, renal, neurologic, and ophthalmologic irAEs occur much less frequently. The majority of irAEs are mild to moderate in severity; however, serious and occasionally life-threatening irAEs are reported in the literature, and treatment-related deaths occur in up to 2% of patients, varying by ICI. Immunotherapy-related irAEs typically have a delayed onset and prolonged duration compared to adverse events from chemotherapy, and effective management depends on early recognition and prompt intervention with immune suppression and/or immunomodulatory strategies. There is an urgent need for multidisciplinary guidance reflecting broad-based perspectives on how to recognize, report, and manage organ-specific toxicities until evidence-based data are available to inform clinical decision-making. The Society for Immunotherapy of Cancer (SITC) established a multidisciplinary Toxicity Management Working Group, which met for a full-day workshop to develop recommendations to standardize management of irAEs. Here we present their consensus recommendations on managing toxicities associated with immune checkpoint inhibitor therapy.