
    Effect of high temperature heat treatments on the quality factor of a large-grain superconducting radio-frequency niobium cavity

    Large-grain Nb has become a viable alternative to fine-grain Nb for the fabrication of superconducting radio-frequency cavities. In this contribution we report the results of a heat-treatment study of a large-grain 1.5 GHz single-cell cavity made of "medium purity" Nb. The baseline surface preparation prior to heat treatment consisted of standard buffered chemical polishing. The heat treatments, in the range 800-1400 °C, were done in a newly designed vacuum induction furnace. Q0 values of the order of 2×10^10 at 2.0 K and a peak surface magnetic field (Bp) of 90 mT were achieved reproducibly. A Q0 value of (5±1)×10^10 at 2.0 K and Bp = 90 mT was obtained after heat treatment at 1400 °C, the highest value ever reported at this temperature, frequency and field. Samples heat treated with the cavity at 1400 °C were analyzed by secondary ion mass spectrometry, secondary electron microscopy, energy-dispersive X-ray spectroscopy, point-contact tunneling and X-ray diffraction, and revealed a complex surface composition including titanium oxide and increased carbon and nitrogen content but reduced hydrogen concentration compared to a non-heat-treated sample.
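    As a rough sanity check on these numbers, quality factor and surface resistance are related by Q0 = G/Rs. The sketch below assumes a geometry factor G of about 273 Ω, a value typical of 1.5 GHz single-cell elliptical cavities but not quoted in the abstract.

```python
# Back-of-envelope surface resistance implied by the reported quality factors,
# using Q0 = G / Rs. The geometry factor G ~ 273 ohm is an assumed typical value
# for a 1.5 GHz single-cell elliptical cavity; it is not given in the abstract.

G = 273.0  # ohm, assumed geometry factor

for label, q0 in [("reproducible result", 2e10), ("after 1400 C treatment", 5e10)]:
    rs_nano_ohm = G / q0 * 1e9  # surface resistance in nano-ohm
    print(f"{label}: Q0 = {q0:.0e} -> Rs ~ {rs_nano_ohm:.1f} nOhm")
# Roughly 14 nOhm and 5 nOhm, illustrating the reduction after the 1400 C treatment.
```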

    The Quantum Socket: Three-Dimensional Wiring for Extensible Quantum Computing

    Quantum computing architectures are on the verge of scalability, a key requirement for the implementation of a universal quantum computer. The next stage in this quest is the realization of quantum error correction codes, which will mitigate the impact of faulty quantum information on a quantum computer. Architectures with ten or more quantum bits (qubits) have been realized using trapped ions and superconducting circuits. While these implementations are potentially scalable, true scalability will require systems engineering to combine quantum and classical hardware. One technology demanding imminent effort is the realization of a suitable wiring method for the control and measurement of a large number of qubits. In this work, we introduce an interconnect solution for solid-state qubits: the quantum socket. The quantum socket fully exploits the third dimension to connect classical electronics to qubits with higher density and better performance than two-dimensional methods based on wire bonding. It is based on spring-mounted micro wires (three-dimensional wires) that push directly on a micro-fabricated chip, making electrical contact. A small wire cross section (~1 mm), nearly non-magnetic components, and functionality at low temperatures make the quantum socket ideal for operating solid-state qubits. The wires have a coaxial geometry and operate over a frequency range from DC to 8 GHz, with a contact resistance of ~150 mΩ, an impedance mismatch of ~10 Ω, and minimal crosstalk. As a proof of principle, we fabricated and used a quantum socket to measure superconducting resonators at a temperature of ~10 mK.
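    To put the quoted coaxial-line figures in context, the sketch below estimates the reflection caused by a ~10 Ω impedance step, assuming a 50 Ω reference impedance; the nominal line impedance is an assumption, not stated in the abstract.

```python
import math

# Reflection from an impedance step, assuming a 50 ohm reference line
# (the abstract quotes a ~10 ohm mismatch but not the reference impedance).
Z0 = 50.0          # ohm, assumed reference impedance
Z = Z0 + 10.0      # ohm, section with the quoted ~10 ohm mismatch

gamma = (Z - Z0) / (Z + Z0)              # voltage reflection coefficient
return_loss_db = -20 * math.log10(abs(gamma))

print(f"|Gamma| ~ {abs(gamma):.3f}, return loss ~ {return_loss_db:.1f} dB")
# ~0.09 and ~21 dB: a small reflection, consistent with a minimal mismatch.
```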

    Molybdenum targets for production of 99mTc by a medical cyclotron

    Introduction: Alternative methods for producing the medical imaging isotope 99mTc are actively being developed around the world in anticipation of the imminent shutdown of the National Research Universal (NRU) reactor in Chalk River, Ontario, Canada and the High Flux Reactor (HFR) in Petten, Holland, which together currently produce up to 80% of the world's supply through fission. The most promising alternative methods involve accelerators that focus Bremsstrahlung radiation or protons on metallic targets composed of 100Mo and a supporting material used to conduct heat away during irradiation. As an example, the reaction 100Mo(p,2n)99mTc provides a direct route that can be incorporated into routine production in regional nuclear medicine centers that possess medical cyclotrons for the production of other isotopes, such as those used for Positron Emission Tomography (PET). The targets used to produce 99mTc are subject to a number of operational constraints. They must withstand the temperatures generated by the irradiation and be fashioned to accommodate temperature gradients from in situ cooling. The targets must be resilient, meaning they cannot disintegrate during irradiation or post-processing, because of the radioactive nature of the products. Yet the targets must be easily post-processed to separate the 99mTc. In addition, the method used to manufacture the targets must not be wasteful of the 100Mo, because of its cost (~$2/mg). Any manufacturing process should be able to function remotely in a shielded space to accommodate the possibility of radioactive recycled target feedstock. A number of methods have been proposed for large-scale target manufacturing, including electrophoretic deposition, pressing and sintering, electroplating, and carburization [1]. How to develop these methods for routine production is an active business [2,3]. From the industrial perspective, plasma spraying showed promising results initially [4], but the process became very expensive, requiring customized equipment to reduce losses from overspray, which in turn required a large inventory of expensive feedstock. In this paper we report the experimental validation of an industrial process for production of targets comprising a Mo layer and a copper support.
    Materials and Methods (Target Design): Targets have been manufactured for irradiation at 15 MeV. Two targets are shown in FIG. 1: one as-manufactured and another after irradiation; no visible changes were observed following irradiation. The supporting circular copper (C101) disks have diameters of 24 mm and a thickness of 1.6 mm. The molybdenum in the center of the target is fully dense, with a thickness of 230 ”m determined from SEM cross-sections. Targets have also been manufactured for irradiation in a general-purpose target holder designed to be attached to all makes of cyclotrons found in regional nuclear medicine centers. The elliptical targets were designed for high-volume production of 99mTc with 15 MeV protons at currents of 400 ”A with 15% collimation [4]. The elliptical shape reduces the heat flux associated with high-current sources. The cooling channels on the back of the target are designed to withstand the high temperature generated during irradiation. A thermal simulation of expected temperatures during irradiation is shown in FIG. 3; the center of the target is expected to reach 260 °C during irradiation. The elliptical targets were formed from a 27 mm C101 copper plate with width 22 mm and length 55 mm. The molybdenum in the center of the target is fully dense, with a thickness of 60 ”m determined from SEM cross-sections. FIG. 4 shows the molybdenum deposition in the center of the target in the form of an ellipse (38×10 mm).
    Results and Conclusions: Circular targets have been produced and successfully irradiated for up to 5 h with a proton beam of energy 15 MeV and current 50 ”A (FIG. 1). The targets were resilient. Before irradiation the targets were subjected to mechanical shock tests and thermal gradients with no observable effect. After irradiation there was no indication of any degradation. The manufacturing process produced 20 consistently reproducible targets within an hour, with a molybdenum loss of less than 2%. After irradiation the targets were chemically processed and the products characterized by HPGe gamma spectrometry. Only Tc isotopes were found; no other contaminants were identified after processing. The details of the separation and purification are described elsewhere [5]. Circular targets suitable for low-volume production of 99mTc have been manufactured and tested. The targets have been shown to meet the required operational constraints: they are resilient, withstanding mechanical shock and irradiation conditions; they are readily produced with minimal losses; and post-processing after irradiation for 5 h has been shown to produce 99mTc. Elliptical targets suitable for high-volume production of 99mTc with high-power cyclotrons have been manufactured (FIG. 4). Like the circular targets, the elliptical targets are readily produced with minimal losses and are able to withstand mechanical shock and thermal gradients; however, they have yet to be irradiated.
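    As a rough check on the thermal design figures quoted above, the sketch below estimates the beam power and mean heat flux on the elliptical target; reading "15% collimation" as 15% of the beam intercepted before the target is an assumption.

```python
import math

# Rough beam-power and mean heat-flux estimate for the elliptical target.
# Assumption: "15% collimation" is read as 15% of the beam stopped on the
# collimator, so ~85% of the power reaches the 38 x 10 mm deposition area.

energy_mev = 15.0                      # proton energy (MeV)
current_ua = 400.0                     # beam current (uA)
beam_power_w = energy_mev * current_ua # 1 MeV x 1 uA = 1 W, so 6000 W total

on_target_w = 0.85 * beam_power_w      # after the assumed 15% collimator loss

a_mm, b_mm = 38.0 / 2, 10.0 / 2        # semi-axes of the 38 x 10 mm ellipse
area_cm2 = math.pi * a_mm * b_mm / 100.0  # mm^2 -> cm^2

print(f"beam power      ~ {beam_power_w:.0f} W")
print(f"power on target ~ {on_target_w:.0f} W")
print(f"mean heat flux  ~ {on_target_w / area_cm2:.0f} W/cm^2")
# ~1700 W/cm^2 averaged over the spot, which is why the copper backing and
# dedicated cooling channels are needed to keep the centre near 260 C.
```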

    High-throughput discovery of fluoride-ion conductors via a decoupled, dynamic, and iterative (DDI) framework

    Fluoride-ion batteries are a promising alternative to lithium-ion batteries with higher theoretical capacities and working voltages, but they have experienced limited success due to the poor ionic conductivities of known electrolytes and electrodes. Here, we report a high-throughput computational screening of 9747 fluoride-containing materials in search of fluoride-ion conductors. Via a combination of empirical, lightweight DFT, and nudged elastic band (NEB) calculations, we identified >10 crystal systems with high fluoride mobility. We applied a search strategy where calculations are performed in any order (decoupled), computational resources are reassigned based on need (dynamic), and predictive models are repeatedly updated (iterative). Unlike hierarchical searches, our decoupled, dynamic, and iterative (DDI) framework began by calculating high-quality barrier heights for fluoride-ion mobility in a large and diverse group of materials. This high-quality dataset provided a benchmark against which a rapid calculation method could be refined. This accurate method was then used to measure the barrier heights for 6797 fluoride-ion pathways. The final dataset has allowed us to discover many fascinating, high-performance conductors and to derive the design rules that govern their performance. These materials will accelerate experimental research into fluoride-ion batteries, while the design rules will provide an improved foundation for understanding ionic conduction.
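    As an illustration of how a decoupled, dynamic, and iterative (DDI) search might be organized in code, the sketch below uses hypothetical stand-in functions for the empirical, lightweight-DFT, and NEB steps; it is not the authors' implementation.

```python
import random

# Illustrative DDI-style screening loop. All functions are hypothetical
# placeholders; a real workflow would call empirical models, lightweight DFT,
# and NEB codes instead of random numbers.

def cheap_barrier_estimate(material):
    """Placeholder for a fast, low-fidelity migration-barrier prediction (eV)."""
    return random.uniform(0.1, 1.5)

def expensive_neb_barrier(material):
    """Placeholder for a high-quality NEB migration barrier (eV)."""
    return random.uniform(0.1, 1.5)

def refit_surrogate(benchmark):
    """Placeholder: a real implementation would refit the rapid method
    against the accumulated NEB benchmark data."""
    return lambda m: cheap_barrier_estimate(m)

candidates = [f"fluoride_{i}" for i in range(200)]   # stand-in for 9747 materials
benchmark = {}                                       # material -> NEB barrier (eV)
surrogate = cheap_barrier_estimate

for round_ in range(5):                              # iterative
    # decoupled: rank every remaining candidate with the current cheap model
    scores = {m: surrogate(m) for m in candidates if m not in benchmark}
    # dynamic: spend the expensive NEB budget on the most promising candidates
    budget = 10
    for m in sorted(scores, key=scores.get)[:budget]:
        benchmark[m] = expensive_neb_barrier(m)
    # iterative: update the rapid method against the growing benchmark set
    surrogate = refit_surrogate(benchmark)

hits = [m for m, e in benchmark.items() if e < 0.4]  # low-barrier conductors
print(f"{len(benchmark)} NEB calculations, {len(hits)} candidate fast conductors")
```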

    Atomic Beams

    Contains research objectives and reports on four research projects

    Rehabilitating antisocial personalities: treatment through self-governance strategies

    Offenders with antisocial personality disorder (ASPD) are widely assumed to reject psychotherapeutic intervention. Some commentators, therefore, argue that those with the disorder are better managed in the criminal justice system, where, following the introduction of indeterminate sentences, engagement with psychological treatment is coercively linked to the achievement of parole. By comparison, National Institute of Clinical Excellence guidelines on the management and treatment of ASPD recommend that those who are treatment seeking should be considered for admission to specialist psychiatric hospitals. The rationale is that prison-based interventions are under-resourced, and the treatment of ASPD is under-prioritised. The justification is that offenders with ASPD can be rehabilitated if they are motivated. One problem, however, is that little is known about why offenders with ASPD seek treatment or what effect subsequent treatment has on their self-understanding. The aim of this paper is to address these unresolved issues. It draws on the findings of an Economic and Social Research Council (ESRC) funded qualitative study examining the experiences of sentenced male offenders admitted to a specialist personality disorder ward within the medium secure estate and the medical practitioners who treat them. The data are analysed with reference to Michel Foucault’s work on governmentality and strategy in power relations. Two arguments are advanced: first, offenders with ASPD are motivated by legal coercive pressures to implement a variety of Foucauldian-type strategies to give the false impression of treatment progress. Second, and relatedly, treatment does not result in changes in self-understanding in the resistive client with ASPD. This presupposes that, in respect of this group at least, Foucault was mistaken in his claim that resistive behaviours merely mask the effectiveness of treatment norms over time. Nevertheless, the paper concludes that specialist treatment in the hospital setting can effect changes in the resistive offender’s self-understanding, but not if the completion of treatment results, as is commonplace, in his prison readmission.

    Relativistic separable dual-space Gaussian Pseudopotentials from H to Rn

    We generalize the concept of separable dual-space Gaussian pseudopotentials to the relativistic case. This allows us to construct this type of pseudopotential for the whole periodic table, and we present a complete table of pseudopotential parameters for all the elements from H to Rn. The relativistic version of this pseudopotential retains all the advantages of its nonrelativistic version: it is separable by construction, it is optimal for integration on a real-space grid, it is highly accurate and, due to its analytic form, it can be specified by a very small number of parameters. The accuracy of the pseudopotential is illustrated by an extensive series of molecular calculations.
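    For orientation, the analytic form behind this family of pseudopotentials (the Goedecker-Teter-Hutter form and its Hartwigsen-Goedecker-Hutter extension) can be sketched as follows; the expressions are reproduced from memory and should be checked against the paper.

```latex
% Sketch of the dual-space Gaussian pseudopotential form (GTH/HGH scheme),
% written from memory; r_loc, C_i, r_l and h^l_{ij} are the tabulated parameters.
V_{\mathrm{loc}}(r) = -\frac{Z_{\mathrm{ion}}}{r}\,
    \operatorname{erf}\!\Bigl(\frac{r}{\sqrt{2}\,r_{\mathrm{loc}}}\Bigr)
  + e^{-\frac{1}{2}(r/r_{\mathrm{loc}})^{2}}
    \Bigl[C_{1} + C_{2}\bigl(\tfrac{r}{r_{\mathrm{loc}}}\bigr)^{2}
        + C_{3}\bigl(\tfrac{r}{r_{\mathrm{loc}}}\bigr)^{4}
        + C_{4}\bigl(\tfrac{r}{r_{\mathrm{loc}}}\bigr)^{6}\Bigr]

V_{l}(\mathbf{r},\mathbf{r}') = \sum_{m=-l}^{l}\,\sum_{i,j=1}^{3}
    Y_{lm}(\hat{\mathbf{r}})\, p^{l}_{i}(r)\; h^{l}_{ij}\; p^{l}_{j}(r')\,
    Y^{*}_{lm}(\hat{\mathbf{r}}')
```
    The projectors p^l_i(r) are Gaussians multiplied by powers of r, which keeps both the real-space and Fourier-space expressions analytic and is why only a handful of parameters per element need to be tabulated.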

    Teleology and Realism in Leibniz's Philosophy of Science

    This paper argues for an interpretation of Leibniz’s claim that physics requires both mechanical and teleological principles as a view regarding the interpretation of physical theories. Granting that Leibniz’s fundamental ontology remains non-physical, or mentalistic, it argues that teleological principles nevertheless ground a realist commitment about mechanical descriptions of phenomena. The empirical results of the new sciences, according to Leibniz, have genuine truth conditions: there is a fact of the matter about the regularities observed in experience. Taking this stance, however, requires bringing non-empirical reasons to bear upon mechanical causal claims. This paper first evaluates extant interpretations of Leibniz’s thesis that there are two realms in physics as describing parallel, self-sufficient sets of laws. It then examines Leibniz’s use of teleological principles to interpret scientific results in the context of his interventions in debates in seventeenth-century kinematic theory, and in the teaching of Copernicanism. Leibniz’s use of the principle of continuity and the principle of simplicity, for instance, reveals an underlying commitment to the truth-aptness, or approximate truth-aptness, of the new natural sciences. The paper concludes with a brief remark on the relation between metaphysics, theology, and physics in Leibniz.

    Examining the Impact of Imputation Errors on Fine-Mapping Using DNA Methylation QTL as a Model Trait

    Genetic variants disrupting DNA methylation at CpG dinucleotides (CpG-SNPs) provide a set of known causal variants to serve as models for testing fine-mapping methodology. We use 1716 CpG-SNPs to test three fine-mapping approaches (BIMBAM, BSLMM, and the J-test), assessing the impact of imputation errors and the choice of reference panel by using both whole-genome sequence (WGS) and genotype array data on the same individuals (n=1166). The choice of imputation reference panel had a strong effect on imputation accuracy, with the 1000 Genomes Phase 3 (1000G) reference panel (n=2504 from 26 populations) giving a mean non-reference discordance rate between imputed and sequenced genotypes of 3.2%, compared to 1.6% when using the Haplotype Reference Consortium (HRC) reference panel (n=32470 Europeans). These imputation errors affected whether the CpG-SNP was included in the 95% credible set, with a difference of ∌23% and ∌7% between the WGS and the 1000G and HRC imputed datasets, respectively. All of the fine-mapping methods failed to reach the expected 95% coverage of the CpG-SNP. This is attributed to secondary cis genetic effects that are unable to be statistically separated from the CpG-SNP, and to a masking mechanism whereby the effect of the methylation-disrupting allele at the CpG-SNP is hidden by the effect of a nearby SNP in strong LD with the CpG-SNP. The reduced accuracy in fine-mapping a known causal variant in a low-level biological trait with imputed genetic data has implications for the study of higher-order complex traits and disease.
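    The 95% credible sets referred to above follow the standard Bayesian fine-mapping construction: rank variants by posterior probability of being causal and keep the smallest set whose probabilities sum to at least 0.95. The sketch below is a generic illustration of that construction, not the BIMBAM, BSLMM, or J-test implementation.

```python
def credible_set(posteriors, coverage=0.95):
    """Smallest set of variants whose posterior probabilities sum to >= coverage.

    `posteriors` maps variant id -> normalised posterior probability of being
    the causal variant in the region (assumed to sum to 1).
    """
    total, chosen = 0.0, []
    for variant, prob in sorted(posteriors.items(), key=lambda kv: -kv[1]):
        chosen.append(variant)
        total += prob
        if total >= coverage:
            break
    return chosen

# Toy region (hypothetical numbers): a CpG-SNP partially masked by a neighbour
# in strong LD, so both are needed before the set reaches 95% coverage.
region = {"cpg_snp": 0.30, "ld_partner": 0.55, "rs_other1": 0.10, "rs_other2": 0.05}
print(credible_set(region))   # ['ld_partner', 'cpg_snp', 'rs_other1']
```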
    • 
