556 research outputs found

    Cutting to Order in the Rough Mill: A Sampling Approach

    Get PDF
    A cutting order is a list of dimension parts along with demanded quantities. The cutting-order problem is to minimize the total cost of filling a cutting order from a given lumber supply. Similar cutting-order problems arise in many industrial situations outside of forest products. This paper adapts an earlier, linear programming approach that was developed for uniform, defect-free stock materials. The adaptation presented here allows the method to handle nonuniform stock material (e.g., lumber) that contains defects that are not known in advance of cutting. The main differences are the use of a random sample to construct the linear program and the use of prices rather than cutting patterns to specify a solution. The primary result of this research is that the expected cost of filling an order under the proposed method is approximately equal to the minimum possible expected cost for sufficiently large order and sample sizes. A secondary result is a lower bound on the minimum possible expected cost. Computer simulations suggest that the proposed method is capable of attaining nearly minimal expected costs in moderately large orders.
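
    For readers unfamiliar with the underlying formulation, here is a minimal sketch of the classic cutting-stock linear program (choose how often to cut each pattern so that part demands are met at minimum cost), the kind of LP the paper adapts. It is not the paper's sampling- or price-based method, and the pattern yields, demands, and costs are made-up illustrative numbers.

```python
# Minimal sketch of the classic cutting-stock LP (pattern-based formulation),
# not the paper's sampling/price-based adaptation. All numbers are illustrative.
import numpy as np
from scipy.optimize import linprog

# Demanded quantities for three dimension parts.
demand = np.array([80, 50, 30])

# Each row is a cutting pattern: how many of each part one board yields.
# (In the paper's setting, patterns would come from sampled, defect-laden boards.)
patterns = np.array([
    [4, 0, 1],
    [2, 3, 0],
    [0, 2, 2],
]).T  # transpose to shape (parts, patterns)

cost_per_board = np.ones(patterns.shape[1])  # assume unit cost per board cut

# Minimize total cost subject to meeting demand:
#   min c^T x   s.t.   patterns @ x >= demand,   x >= 0
res = linprog(
    c=cost_per_board,
    A_ub=-patterns,   # flip signs: -P x <= -d  is equivalent to  P x >= d
    b_ub=-demand,
    bounds=[(0, None)] * patterns.shape[1],
)
print("pattern usage (LP relaxation):", res.x)
print("total boards:", res.fun)
```

    In practice the LP relaxation would be followed by rounding or column generation; the sketch only illustrates the structure being adapted.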

    Building biosecurity for synthetic biology.

    Get PDF
    The fast-paced field of synthetic biology is fundamentally changing the global biosecurity framework. Current biosecurity regulations and strategies are based on previous governance paradigms for pathogen-oriented security, recombinant DNA research, and broader concerns related to genetically modified organisms (GMOs). Many scholars and biosecurity practitioners are therefore concerned that synthetic biology outpaces established biosafety and biosecurity measures to prevent deliberate and malicious or inadvertent and accidental misuse of synthetic biology's processes or products. This commentary proposes three strategies to improve biosecurity: security must be treated as an investment in the future applicability of the technology; social scientists and policy makers should be engaged early in technology development and forecasting; and coordination among global stakeholders is necessary to ensure acceptable levels of risk.

    Characterization of Granulations of Calcium and Apatite in Serum as Pleomorphic Mineralo-Protein Complexes and as Precursors of Putative Nanobacteria

    Get PDF
    Calcium and apatite granulations are demonstrated here to form in both human and fetal bovine serum in response to the simple addition of either calcium or phosphate, or a combination of both. These granulations are shown to represent precipitating complexes of protein and hydroxyapatite (HAP) that display marked pleomorphism, appearing as round, laminated particles, spindles, and films. These same complexes can be found in normal untreated serum, albeit at much lower amounts, and appear to result from the progressive binding of serum proteins with apatite until reaching saturation, upon which the mineralo-protein complexes precipitate. Chemically and morphologically, these complexes are virtually identical to the so-called nanobacteria (NB) implicated in numerous diseases and considered unusual for their small size, pleomorphism, and the presence of HAP. Like NB, serum granulations can seed particles upon transfer to serum-free medium, and their main protein constituents include albumin, complement components 3 and 4A, fetuin-A, and apolipoproteins A1 and B100, as well as other calcium and apatite binding proteins found in the serum. However, these serum mineralo-protein complexes are formed from the direct chemical binding of inorganic and organic phases, bypassing the need for any biological processes, including the long cultivation in cell culture conditions deemed necessary for the demonstration of NB. Thus, these serum granulations may result from physiologically inherent processes that become amplified with calcium phosphate loading or when subjected to culturing in medium. They may be viewed as simple mineralo-protein complexes formed from the deployment of calcification-inhibitory pathways used by the body to cope with excess calcium phosphate so as to prevent unwarranted calcification. Rather than representing novel pathophysiological mechanisms or exotic lifeforms, these results indicate that the entities described earlier as NB most likely originate from calcium and apatite binding factors in the serum, presumably calcification inhibitors, that, upon saturation, form seeds for HAP deposition and growth. These calcium granulations are similar to those found in organisms throughout nature and may represent the products of more general calcium regulation pathways involved in the control of calcium storage, retrieval, tissue deposition, and disposal.

    The transcriptional landscape of Shh medulloblastoma

    Get PDF
    Sonic hedgehog medulloblastoma encompasses a clinically and molecularly diverse group of cancers of the developing central nervous system. Here, we use unbiased sequencing of the transcriptome across a large cohort of 250 tumors to reveal differences among molecular subtypes of the disease, and demonstrate the previously unappreciated importance of non-coding RNA transcripts. We identify alterations within the cAMP dependent pathway (GNAS, PRKAR1A) which converge on GLI2 activity and show that 18% of tumors have a genetic event that directly targets the abundance and/or stability of MYCN. Furthermore, we discover an extensive network of fusions in focally amplified regions encompassing GLI2, and several loss-of-function fusions in tumor suppressor genes PTCH1, SUFU and NCOR1. Molecular convergence on a subset of genes by nucleotide variants, copy number aberrations, and gene fusions highlights the key roles of specific pathways in the pathogenesis of Sonic hedgehog medulloblastoma and opens up opportunities for therapeutic intervention.

    Corporate ethical identity as a determinant of firm performance: a test of the mediating role of stakeholder satisfaction

    Get PDF
    In this article, we empirically assess the impact of corporate ethical identity (CEI) on a firm’s financial performance. Drawing on formulations of normative and instrumental stakeholder theory, we argue that firms with a strong ethical identity achieve a greater degree of stakeholder satisfaction (SS), which, in turn, positively influences a firm’s financial performance. We analyze two dimensions of the CEI of firms: corporate revealed ethics and corporate applied ethics. Our results indicate that revealed ethics has informational worth and enhances shareholder value, whereas applied ethics has a positive impact through the improvement of SS. However, revealed ethics by itself (i.e. decoupled from ethical initiatives) is not sufficient to boost economic performance.
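
    As a rough illustration of the mediation logic (ethical identity raises stakeholder satisfaction, which in turn raises performance), the sketch below runs a Baron-Kenny style comparison of total and direct effects on synthetic data. The variable names, data-generating process, and estimator are assumptions made for illustration, not the authors' measures or econometric specification.

```python
# Hypothetical mediation sketch: CEI -> stakeholder satisfaction (SS) -> performance.
# Synthetic data and an OLS comparison of total vs. direct effects (Baron-Kenny style).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
cei = rng.normal(size=n)                                      # corporate ethical identity score
ss = 0.6 * cei + rng.normal(scale=0.8, size=n)                # stakeholder satisfaction
perf = 0.5 * ss + 0.1 * cei + rng.normal(scale=1.0, size=n)   # financial performance

total = sm.OLS(perf, sm.add_constant(cei)).fit()              # total effect of CEI
direct = sm.OLS(perf, sm.add_constant(np.column_stack([cei, ss]))).fit()  # direct effect given SS

print("total effect of CEI: ", total.params[1])
print("direct effect of CEI:", direct.params[1])
print("share mediated by SS:", 1 - direct.params[1] / total.params[1])
```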

    Readout of a quantum processor with high dynamic range Josephson parametric amplifiers

    Full text link
    We demonstrate a high dynamic range Josephson parametric amplifier (JPA) in which the active nonlinear element is implemented using an array of rf-SQUIDs. The device is matched to the 50 Ω environment with a Klopfenstein-taper impedance transformer and achieves a bandwidth of 250-300 MHz, with input saturation powers up to -95 dBm at 20 dB gain. A 54-qubit Sycamore processor was used to benchmark these devices, providing a calibration for readout power, an estimate of amplifier added noise, and a platform for comparison against standard impedance matched parametric amplifiers with a single dc-SQUID. We find that the high power rf-SQUID array design has no adverse effect on system noise, readout fidelity, or qubit dephasing, and we estimate an upper bound on amplifier added noise at 1.6 times the quantum limit. Lastly, amplifiers with this design show no degradation in readout fidelity due to gain compression, which can occur in multi-tone multiplexed readout with traditional JPAs.
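
    As a rough way to read the "1.6 times the quantum limit" figure, the lines below assume the standard Caves bound for a phase-preserving amplifier, under which the quantum limit corresponds to half a photon of added noise (referred to the input) at high gain; the authors' exact noise convention may differ.

```latex
% Assumed convention: added noise in input-referred photon units for a
% phase-preserving amplifier of power gain G (Caves bound).
\[
  n_{\mathrm{add}} \;\ge\; \tfrac{1}{2}\left|1 - \tfrac{1}{G}\right|
  \;\xrightarrow{\;G \gg 1\;}\; \tfrac{1}{2},
  \qquad
  1.6 \times \tfrac{1}{2} \;\approx\; 0.8 \ \text{photons of added noise}.
\]
```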

    Measurement-Induced State Transitions in a Superconducting Qubit: Within the Rotating Wave Approximation

    Full text link
    Superconducting qubits typically use a dispersive readout scheme, where a resonator is coupled to a qubit such that its frequency is qubit-state dependent. Measurement is performed by driving the resonator, where the transmitted resonator field yields information about the resonator frequency and thus the qubit state. Ideally, we could use arbitrarily strong resonator drives to achieve a target signal-to-noise ratio in the shortest possible time. However, experiments have shown that when the average resonator photon number exceeds a certain threshold, the qubit is excited out of its computational subspace, which we refer to as a measurement-induced state transition. These transitions degrade readout fidelity, and constitute leakage which precludes further operation of the qubit in, for example, error correction. Here we study these transitions using a transmon qubit by experimentally measuring their dependence on qubit frequency, average photon number, and qubit state, in the regime where the resonator frequency is lower than the qubit frequency. We observe signatures of resonant transitions between levels in the coupled qubit-resonator system that exhibit noisy behavior when measured repeatedly in time. We provide a semi-classical model of these transitions based on the rotating wave approximation and use it to predict the onset of state transitions in our experiments. Our results suggest the transmon is excited to levels near the top of its cosine potential following a state transition, where the charge dispersion of higher transmon levels explains the observed noisy behavior of state transitions. Moreover, occupation in these higher energy levels poses a major challenge for fast qubit reset.
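
    For context on why the transmitted field carries qubit-state information, the display below gives the textbook circuit-QED dispersive Hamiltonian; this is the standard form (with dispersive shift χ), not this paper's detailed model of the transition dynamics.

```latex
% Standard dispersive Hamiltonian (rotating wave approximation, textbook form):
% the resonator frequency is pulled by +/- chi depending on the qubit state.
\[
  H/\hbar \;=\; \left(\omega_r + \chi\,\sigma_z\right) a^{\dagger} a
           \;+\; \frac{\omega_q}{2}\,\sigma_z ,
\]
% so driving the resonator and measuring its response at \omega_r \pm \chi
% reads out the qubit state.
```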

    Overcoming leakage in scalable quantum error correction

    Full text link
    Leakage of quantum information out of computational states into higher energy states represents a major challenge in the pursuit of quantum error correction (QEC). In a QEC circuit, leakage builds over time and spreads through multi-qubit interactions. This leads to correlated errors that degrade the exponential suppression of logical error with scale, challenging the feasibility of QEC as a path towards fault-tolerant quantum computation. Here, we demonstrate the execution of a distance-3 surface code and distance-21 bit-flip code on a Sycamore quantum processor where leakage is removed from all qubits in each cycle. This shortens the lifetime of leakage and curtails its ability to spread and induce correlated errors. We report a ten-fold reduction in steady-state leakage population on the data qubits encoding the logical state and an average leakage population of less than 1 × 10⁻³ throughout the entire device. The leakage removal process itself efficiently returns leakage population back to the computational basis, and adding it to a code circuit prevents leakage from inducing correlated error across cycles, restoring a fundamental assumption of QEC. With this demonstration that leakage can be contained, we resolve a key challenge for practical QEC at scale.
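
    As a toy illustration of why removing leakage every cycle keeps its population small, the sketch below iterates a two-rate model: each cycle a small fraction of the computational population leaks and a fraction of the leaked population is returned. Both rates and the model itself are hypothetical assumptions, not the paper's measured values or analysis.

```python
# Toy per-cycle leakage model (hypothetical rates, not the paper's data).
def leakage_trajectory(p_inject, p_remove, cycles):
    """Leakage population after each cycle: a fraction p_inject of the
    computational population leaks, and a fraction p_remove of the leaked
    population is returned to the computational basis."""
    L = 0.0
    out = []
    for _ in range(cycles):
        L = (1 - p_remove) * L + p_inject * (1 - L)
        out.append(L)
    return out

# Without removal, leakage accumulates over cycles; with per-cycle removal it
# saturates near p_inject / (p_inject + p_remove).
no_removal = leakage_trajectory(p_inject=1e-3, p_remove=0.0, cycles=100)
with_removal = leakage_trajectory(p_inject=1e-3, p_remove=0.3, cycles=100)
print("after 100 cycles, no removal:   %.3e" % no_removal[-1])
print("after 100 cycles, with removal: %.3e" % with_removal[-1])
```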

    Search for supersymmetry in final states with two or three soft leptons and missing transverse momentum in proton-proton collisions at √s = 13 TeV

    Get PDF
    A search for supersymmetry in events with two or three low-momentum leptons and missing transverse momentum is performed. The search uses proton-proton collisions at √s = 13 TeV collected in the three-year period 2016–2018 by the CMS experiment at the LHC and corresponding to an integrated luminosity of up to 137 fb⁻¹. The data are found to be in agreement with expectations from standard model processes. The results are interpreted in terms of electroweakino and top squark pair production with a small mass difference between the produced supersymmetric particles and the lightest neutralino. For the electroweakino interpretation, two simplified models are used, a wino-bino model and a higgsino model. Exclusion limits at 95% confidence level are set on χ̃₂⁰/χ̃₁± masses up to 275 GeV for a mass difference of 10 GeV in the wino-bino case, and up to 205 (150) GeV for a mass difference of 7.5 (3) GeV in the higgsino case. The results for the higgsino are further interpreted using a phenomenological minimal supersymmetric standard model, excluding the higgsino mass parameter μ up to 180 GeV with the bino mass parameter M1 at 800 GeV. In the top squark interpretation, exclusion limits are set at top squark masses up to 540 GeV for four-body top squark decays and up to 480 GeV for chargino-mediated decays with a mass difference of 30 GeV.

    Measurement-induced entanglement and teleportation on a noisy quantum processor

    Full text link
    Measurement has a special role in quantum theory: by collapsing the wavefunction it can enable phenomena such as teleportation and thereby alter the "arrow of time" that constrains unitary evolution. When integrated in many-body dynamics, measurements can lead to emergent patterns of quantum information in space-time that go beyond established paradigms for characterizing phases, either in or out of equilibrium. On present-day NISQ processors, the experimental realization of this physics is challenging due to noise, hardware limitations, and the stochastic nature of quantum measurement. Here we address each of these experimental challenges and investigate measurement-induced quantum information phases on up to 70 superconducting qubits. By leveraging the interchangeability of space and time, we use a duality mapping to avoid mid-circuit measurement and access different manifestations of the underlying phases -- from entanglement scaling to measurement-induced teleportation -- in a unified way. We obtain finite-size signatures of a phase transition with a decoding protocol that correlates the experimental measurement record with classical simulation data. The phases display sharply different sensitivity to noise, which we exploit to turn an inherent hardware limitation into a useful diagnostic. Our work demonstrates an approach to realize measurement-induced physics at scales that are at the limits of current NISQ processors.