Structural and Magnetic Properties of Pyrochlore Solid Solutions (Y,Lu)2Ti2-x(Nb,Ta)xO7±y
The synthesis and characterization of the pyrochlore solid solutions
Y2Ti2-xNbxO7-y, Lu2Ti2-xNbxO7-y, Y2Ti2-xTaxO7-y and Lu2Ti2-xTaxO7-y
(-0.4 < y < 0.5) are described. Synthesis at 1600 °C and 10⁻⁵ Torr yields
oxygen deficiency in
all systems. All compounds are found to be paramagnetic and semiconducting,
with the size of the local moments being less, in some cases substantially
less, than the expected value for the number of nominally unpaired electrons
present. Thermogravimetric analysis (TGA) shows that all compounds can be fully
oxidized while retaining the pyrochlore structure, yielding oxygen-rich
pyrochlores as white powders. Powder neutron diffraction was performed on
Y2TiNbO7-based samples. Refinement of the data for oxygen-deficient Y2TiNbO6.76
indicates the presence of a distribution of oxygen over the 8b and 48f sites.
Refinement of the data for oxygen-rich Y2TiNbO7.5 shows these sites to be
completely filled, with an additional half-filling of the 8a site. The magnetic
and TGA data strongly suggest a preference for a Ti3+/(Nb,Ta)5+ combination, as
opposed to Ti4+/(Nb,Ta)4+, in this pyrochlore family. In addition, the evidence
clearly points to Ti3+ as the source of the localized moments, with no evidence
for localized Nb4+ moments.
Comment: Accepted to Journal of Solid State Chemistry
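For context, the "expected value for the number of nominally unpaired electrons" referred to above is conventionally the spin-only effective moment (a standard textbook formula, not quoted from the paper itself):

```latex
\mu_{\mathrm{eff}} = g\sqrt{S(S+1)}\,\mu_B \approx 1.73\,\mu_B
\qquad \text{for } S = \tfrac{1}{2}\ (\mathrm{Ti}^{3+},\ 3d^1,\ g \approx 2)
```

A measured moment substantially below this value for the nominal Ti3+ content is what the authors interpret as partial delocalization of the d electrons.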
Quantum Attractor Flows
Motivated by the interpretation of the Ooguri-Strominger-Vafa conjecture as a
holographic correspondence in the mini-superspace approximation, we study the
radial quantization of stationary, spherically symmetric black holes in four
dimensions. A key ingredient is the classical equivalence between the radial
evolution equation and geodesic motion of a fiducial particle on the moduli
space M^*_3 of the three-dimensional theory after reduction along the time
direction. In the case of N=2 supergravity, M^*_3 is a para-quaternionic-Kähler
manifold; in this case, we show that BPS black holes correspond to a particular
class of geodesics which lift holomorphically to the twistor space Z of M^*_3,
and identify Z as the BPS phase space. We give a natural quantization of the
BPS phase space in terms of the sheaf cohomology of Z, and compute the exact
wave function of a BPS black hole with fixed electric and magnetic charges in
this framework. We comment on the relation to the topological string amplitude,
extensions to N>2 supergravity theories, and applications to automorphic black
hole partition functions.
Comment: 43 pages, 6 figures; v2: typos and references added; v3: published version, minor changes
Projected WIMP sensitivity of the LUX-ZEPLIN dark matter experiment
LUX-ZEPLIN (LZ) is a next-generation dark matter direct detection experiment that will operate 4850 feet underground at the Sanford Underground Research Facility (SURF) in Lead, South Dakota, USA. Using a two-phase xenon detector with an active mass of 7 tonnes, LZ will search primarily for low-energy interactions with weakly interacting massive particles (WIMPs), which are hypothesized to make up the dark matter in our galactic halo. In this paper, the projected WIMP sensitivity of LZ is presented based on the latest background estimates and simulations of the detector. For a 1000 live day run using a 5.6-tonne fiducial mass, LZ is projected to exclude at 90% confidence level spin-independent WIMP-nucleon cross sections above 1.4 × 10⁻⁴⁸ cm² for a 40 GeV/c² mass WIMP. Additionally, a 5σ discovery potential is projected, reaching cross sections below the exclusion limits of recent experiments. For spin-dependent WIMP-neutron (-proton) scattering, a sensitivity of 2.3 × 10⁻⁴³ cm² (7.1 × 10⁻⁴² cm²) for a 40 GeV/c² mass WIMP is expected. With underground installation well underway, LZ is on track for commissioning at SURF in 2020.
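The phrase "exclude at 90% confidence level" rests, in its simplest form, on a Poisson counting argument. The sketch below is not the LZ profile-likelihood analysis; it is only the textbook ingredient behind such limits: the upper limit on the signal mean given an observed event count.

```python
import math

def poisson_upper_limit(n_obs, cl=0.90):
    # Upper limit mu on a Poisson mean given n_obs observed events and no
    # background subtraction: find mu such that P(N <= n_obs | mu) = 1 - cl.
    # Solved by bisection; for n_obs = 0 this reduces to mu = -ln(1 - cl).
    def cdf(mu):
        return sum(math.exp(-mu) * mu**k / math.factorial(k)
                   for k in range(n_obs + 1))
    lo, hi = 0.0, 100.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if cdf(mid) > 1 - cl:
            lo = mid   # mu too small: tail probability still too large
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(poisson_upper_limit(0), 3))   # ≈ 2.303 signal events at 90% CL
```

The excluded cross section then scales as this event limit divided by exposure times efficiency, which is why larger fiducial mass and longer live time push the limit down.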
TRY plant trait database – enhanced coverage and open access
Plant traits—the morphological, anatomical, physiological, biochemical and phenological characteristics of plants—determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait‐based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. Best species coverage is achieved for categorical traits—almost complete coverage for ‘plant growth form’. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait–environmental relationships. These traits have to be measured on individual plants in their respective environment. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We, therefore, conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives
Volume I. Introduction to DUNE
The preponderance of matter over antimatter in the early universe, the dynamics of the supernovae that produced the heavy elements necessary for life, and whether protons eventually decay—these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our universe, its current state, and its eventual fate. The Deep Underground Neutrino Experiment (DUNE) is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model. The DUNE far detector technical design report (TDR) describes the DUNE physics program and the technical designs of the single- and dual-phase DUNE liquid argon TPC far detector modules. This TDR is intended to justify the technical choices for the far detector that flow down from the high-level physics goals through requirements at all levels of the Project. Volume I contains an executive summary that introduces the DUNE science program, the far detector and the strategy for its modular designs, and the organization and management of the Project. The remainder of Volume I provides more detail on the science program that drives the choice of detector technologies and on the technologies themselves. It also introduces the designs for the DUNE near detector and the DUNE computing model, for which DUNE is planning design reports. Volume II of this TDR describes DUNE's physics program in detail. Volume III describes the technical coordination required for the far detector design, construction, installation, and integration, and its organizational structure. Volume IV describes the single-phase far detector technology. A planned Volume V will describe the dual-phase technology.
Large expert-curated database for benchmarking document similarity detection in biomedical literature search
Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that cover a variety of research fields such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research.
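Of the three baselines named in the abstract, Okapi BM25 can be stated compactly. The stdlib-only sketch below scores toy tokenized documents (invented here for illustration, not RELISH data) with the standard BM25 formula and its usual defaults k1 = 1.5, b = 0.75.

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    # Okapi BM25: score each document (a list of tokens) against the query.
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    df = Counter(t for d in docs for t in set(d))   # document frequency per term
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query:
            if t not in tf:
                continue
            idf = math.log((N - df[t] + 0.5) / (df[t] + 0.5) + 1.0)
            # Term-frequency saturation, normalized by document length:
            s += idf * tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

docs = [["dark", "matter", "xenon", "detector"],
        ["plant", "trait", "database"],
        ["xenon", "detector", "background"]]
scores = bm25_scores(["xenon", "detector"], docs)   # one score per document
```

Documents sharing no query terms score exactly zero, which is one reason the abstract's observation holds that different baselines retrieve distinct collections: lexical methods like BM25 cannot reward documents that are relevant but use different vocabulary.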
Deep Underground Neutrino Experiment (DUNE), far detector technical design report, volume III: DUNE far detector technical coordination
The preponderance of matter over antimatter in the early universe, the dynamics of the supernovae that produced the heavy elements necessary for life, and whether protons eventually decay—these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our universe, its current state, and its eventual fate. The Deep Underground Neutrino Experiment (DUNE) is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model. The DUNE far detector technical design report (TDR) describes the DUNE physics program and the technical designs of the single- and dual-phase DUNE liquid argon TPC far detector modules. Volume III of this TDR describes how the activities required to design, construct, fabricate, install, and commission the DUNE far detector modules are organized and managed. This volume details the organizational structures that will carry out and/or oversee the planned far detector activities safely, successfully, on time, and on budget. It presents overviews of the facilities, supporting infrastructure, and detectors for context, and it outlines the project-related functions and methodologies used by the DUNE technical coordination organization, focusing on the areas of integration engineering, technical reviews, quality assurance and control, and safety oversight. Because of its more advanced stage of development, functional examples presented in this volume focus primarily on the single-phase (SP) detector module
Highly-parallelized simulation of a pixelated LArTPC on a GPU
The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
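The workload parallelizes well because each pixel's induced current is independent of every other pixel's. The deliberately simplified, CPU-only sketch below illustrates that structure; the Gaussian response function and all numbers are hypothetical stand-ins, not the simulator's detector model, and in the GPU version each loop iteration would instead map to one thread of a numba.cuda.jit kernel.

```python
import math

def induced_current(pixel_x, pixel_y, charge_x, charge_y, q=1.0, sigma=0.5):
    # Toy response model (hypothetical): Gaussian weighting by the transverse
    # distance between the drifting charge and the pixel centre.
    d2 = (pixel_x - charge_x) ** 2 + (pixel_y - charge_y) ** 2
    return q * math.exp(-d2 / (2.0 * sigma ** 2))

def simulate(pixels, charges):
    # Each pixel sums contributions from all charges, independently of every
    # other pixel: embarrassingly parallel, hence one CUDA thread per pixel
    # in a GPU implementation.
    return [sum(induced_current(px, py, cx, cy) for cx, cy in charges)
            for px, py in pixels]

pixels = [(x, y) for x in range(4) for y in range(4)]   # tiny 4x4 pixel plane
charges = [(1.0, 1.0), (2.5, 2.0)]                      # two drifting charges
currents = simulate(pixels, charges)                    # one value per pixel
```

Because there is no coupling between pixels, the four-orders-of-magnitude speed-up quoted above comes essentially for free once the per-pixel computation is expressed as a kernel over the pixel grid.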
Whole-genome sequencing reveals host factors underlying critical COVID-19
Critical COVID-19 is caused by immune-mediated inflammatory lung injury. Host genetic variation influences the development of illness requiring critical care [1] or hospitalization [2-4] after infection with SARS-CoV-2. The GenOMICC (Genetics of Mortality in Critical Care) study enables the comparison of genomes from individuals who are critically ill with those of population controls to find underlying disease mechanisms. Here we use whole-genome sequencing in 7,491 critically ill individuals compared with 48,400 controls to discover and replicate 23 independent variants that significantly predispose to critical COVID-19. We identify 16 new independent associations, including variants within genes that are involved in interferon signalling (IL10RB and PLSCR1), leucocyte differentiation (BCL11A) and blood-type antigen secretor status (FUT2). Using transcriptome-wide association and colocalization to infer the effect of gene expression on disease severity, we find evidence that implicates multiple genes—including reduced expression of a membrane flippase (ATP11A), and increased expression of a mucin (MUC1)—in critical disease. Mendelian randomization provides evidence in support of causal roles for myeloid cell adhesion molecules (SELE, ICAM5 and CD209) and the coagulation factor F8, all of which are potentially druggable targets. Our results are broadly consistent with a multi-component model of COVID-19 pathophysiology, in which at least two distinct mechanisms can predispose to life-threatening disease: failure to control viral replication; or an enhanced tendency towards pulmonary inflammation and intravascular coagulation. We show that comparison between cases of critical illness and population controls is highly efficient for the detection of therapeutically relevant mechanisms of disease.
Helical Pulse Line Structures for Ion Acceleration
The basic concept of the "Pulse Line Ion Accelerator" is presented: pulse power sources create a ramped traveling-wave voltage pulse on a helical pulse line. Ions can surf on this traveling wave and achieve energy gains much larger than the peak applied voltage. Tapered and untapered lines are compared, and a transformer coupling technique for launching the wave is described.