
    Picosecond time-resolved x-ray diffraction probe of coherent lattice dynamics (abstract) (invited)

    The short pulses of hard x rays from synchrotron and laser-based sources are sensitive probes of lattice dynamics on an ultrafast time scale. Using pump–probe time-resolved x-ray diffraction, we are able to follow the propagation of a picosecond coherent acoustic pulse in an ultrafast laser-strained single crystal. Comparison of the data with dynamical diffraction simulations allows for the quantitative determination of both the surface and bulk components of the associated strain. This technique is scalable to femtosecond and shorter time scales as x-ray pulses become shorter in duration, such as in fourth-generation light sources. In addition, the diffraction of x rays off of coherent optical phonons may lead to the production of a femtosecond x-ray switch. © 2002 American Institute of Physics.

    Coherent control of pulsed X-ray beams

    Synchrotrons produce continuous trains of closely spaced X-ray pulses. Application of such sources to the study of atomic-scale motion requires efficient modulation of these beams on timescales ranging from nanoseconds to femtoseconds. However, ultrafast X-ray modulators are not generally available. Here we report efficient subnanosecond coherent switching of synchrotron beams by using acoustic pulses in a crystal to modulate the anomalous low-loss transmission of X-ray pulses. The acoustic excitation transfers energy between two X-ray beams in a time shorter than the synchrotron pulse width of about 100 ps. Gigahertz modulation of the diffracted X-rays is also observed. We report different geometric arrangements, such as a switch based on the collision of two counter-propagating acoustic pulses: this doubles the X-ray modulation frequency, and also provides a means of observing a localized transient strain inside an opaque material. We expect that these techniques could be scaled to produce subpicosecond pulses, through laser-generated coherent optical phonon modulation of X-ray diffraction in crystals. Such ultrafast capabilities have been demonstrated thus far only in laser-generated X-ray sources, or through the use of X-ray streak cameras [1-6].

    Rare mutations in SQSTM1 modify susceptibility to frontotemporal lobar degeneration

    Mutations in the gene coding for Sequestosome 1 (SQSTM1) have been genetically associated with amyotrophic lateral sclerosis (ALS) and Paget disease of bone. In the present study, we analyzed the SQSTM1 coding sequence for mutations in an extended cohort of 1,808 patients with frontotemporal lobar degeneration (FTLD), ascertained within the European Early-Onset Dementia consortium. As a control dataset, we sequenced 1,625 European control individuals and analyzed whole-exome sequence data of 2,274 German individuals (total n = 3,899). Association of rare SQSTM1 mutations was calculated in a meta-analysis of 4,332 FTLD and 10,240 control alleles. We identified 25 coding variants in FTLD patients, of which 10 had not been described previously. Fifteen mutations were absent in the control individuals (carrier frequency < 0.00026), whilst the others were rare in both patients and control individuals. When pooling all variants with a minor allele frequency < 0.01, an overall frequency of 3.2% was calculated in patients. Rare variant association analysis between patients and controls showed no difference over the whole protein, but suggested that rare mutations clustering in the UBA domain of SQSTM1 may influence disease susceptibility by doubling the risk for FTLD (RR = 2.18, 95% CI 1.24–3.85; corrected p value = 0.042). Detailed histopathology demonstrated that mutations in SQSTM1 associate with widespread neuronal and glial phospho-TDP-43 pathology. With this study, we provide further evidence for a putative role of rare mutations in SQSTM1 in the genetic etiology of FTLD and show that, comparable to other FTLD/ALS genes, SQSTM1 mutations are associated with TDP-43 pathology.

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320–1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time-domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.
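    The quoted cadence can be sanity-checked with back-of-envelope arithmetic. The sketch below takes the field of view, exposure pair, and two-visits-per-night figures from the abstract; the readout overhead, slew time, and usable night length are my own illustrative assumptions, not LSST project numbers.

    ```python
    # Rough consistency check of "10,000 deg^2 in one filter in three nights".
    fov_deg2 = 9.6            # camera field of view, from the abstract
    exposure_s = 15.0         # single exposure, from the abstract
    readout_s = 2.0           # assumed per-visit overhead (illustrative guess)
    visit_s = 2 * exposure_s + readout_s   # one visit = a pair of exposures
    slew_s = 5.0              # assumed average slew/settle time (guess)
    visits_per_pointing = 2   # two visits per pointing per night, from the abstract
    night_s = 10 * 3600.0     # assumed usable night length (guess)

    area_deg2 = 10_000.0
    pointings = area_deg2 / fov_deg2                 # ~1042 fields
    total_s = pointings * visits_per_pointing * (visit_s + slew_s)
    nights_needed = total_s / night_s                # ~2.1 nights open-shutter

    print(f"{pointings:.0f} pointings, {nights_needed:.1f} nights of observing time")
    ```

    Roughly two nights of pure open-shutter time under these assumptions, which is consistent with the three-night figure once weather and other overheads are allowed for.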

    Search for a singly produced third-generation scalar leptoquark decaying to a tau lepton and a bottom quark in proton-proton collisions at √s = 13 TeV

    A search is presented for a singly produced third-generation scalar leptoquark decaying to a tau lepton and a bottom quark. Associated production of a leptoquark and a tau lepton is considered, leading to a final state with a bottom quark and two tau leptons. The search uses proton-proton collision data at a center-of-mass energy of 13 TeV recorded with the CMS detector, corresponding to an integrated luminosity of 35.9 fb⁻¹. Upper limits are set at 95% confidence level on the production cross section of the third-generation scalar leptoquarks as a function of their mass. From a comparison of the results with the theoretical predictions, a third-generation scalar leptoquark decaying to a tau lepton and a bottom quark, assuming unit Yukawa coupling (λ), is excluded for masses below 740 GeV. Limits are also set on λ of the hypothesized leptoquark as a function of its mass. Above λ = 1.4, this result provides the best upper limit on the mass of a third-generation scalar leptoquark decaying to a tau lepton and a bottom quark.

    Constraints on models of scalar and vector leptoquarks decaying to a quark and a neutrino at √s = 13 TeV

    The results of a previous search by the CMS Collaboration for squarks and gluinos are reinterpreted to constrain models of leptoquark (LQ) production. The search considers jets in association with a transverse momentum imbalance, using the M_T2 variable. The analysis uses proton-proton collision data at √s = 13 TeV, recorded with the CMS detector at the LHC in 2016 and corresponding to an integrated luminosity of 35.9 fb⁻¹. Leptoquark pair production is considered with LQ decays to a neutrino and a top, bottom, or light quark. This reinterpretation considers higher mass values than the original CMS search to constrain both scalar and vector LQs. Limits on the cross section for LQ pair production are derived at the 95% confidence level depending on the LQ decay mode. A vector LQ decaying with a 50% branching fraction to tν, and 50% to bτ, has been proposed as part of an explanation of anomalous flavor physics results. In such a model, using only the decays to tν, LQ masses below 1530 GeV are excluded assuming the Yang-Mills case with coupling κ = 1, or 1115 GeV in the minimal coupling case κ = 0, placing the most stringent constraint to date from pair production of vector LQs.

    Towards Equitable, Diverse, and Inclusive science collaborations: The Multimessenger Diversity Network


    Non-standard neutrino interactions in IceCube

    Non-standard neutrino interactions (NSI) may arise in various types of new physics. Their existence would change the potential that atmospheric neutrinos encounter when traversing Earth matter and hence alter their oscillation behavior. This imprint on coherent neutrino forward scattering can be probed using high-statistics neutrino experiments such as IceCube and its low-energy extension, DeepCore. Both provide extensive data samples that include all neutrino flavors, with oscillation baselines between tens of kilometers and the diameter of the Earth. DeepCore event energies reach from a few GeV up to the order of 100 GeV, which marks the lower threshold for the higher-energy IceCube atmospheric samples, ranging up to 10 TeV. In DeepCore data, the large sample size and energy range allow us to consider not only flavor-violating and flavor-nonuniversal NSI in the μ–τ sector, but also those involving electron flavor. The effective parameterization used in our analyses is independent of the underlying model and the new physics mass scale. In this way, competitive limits on several NSI parameters have been set in the past. The eight years of data now available result in significantly improved sensitivities. This improvement stems not only from the increase in statistics but also from substantial improvements in the treatment of systematic uncertainties, background rejection, and event reconstruction.
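    The way NSI modify the matter potential can be sketched schematically: in the effective parameterization, a dimensionless Hermitian matrix ε is added to the standard charged-current term of the flavor-basis Hamiltonian. The sketch below is illustrative only, not the IceCube analysis code; the mixing angles, mass splittings, energy, matter potential, and ε values are all assumed numbers of plausible magnitude.

    ```python
    import numpy as np

    def rotation(i, j, theta, delta=0.0):
        """Complex rotation in the (i, j) plane; a factor of the PMNS matrix."""
        R = np.eye(3, dtype=complex)
        c, s = np.cos(theta), np.sin(theta)
        R[i, i] = R[j, j] = c
        R[i, j] = s * np.exp(-1j * delta)
        R[j, i] = -s * np.exp(1j * delta)
        return R

    # Illustrative mixing angles (radians) and mass splittings (eV^2).
    U = rotation(1, 2, 0.85) @ rotation(0, 2, 0.15) @ rotation(0, 1, 0.59)
    dm21, dm31 = 7.4e-5, 2.5e-3
    E = 20e9  # a ~20 GeV atmospheric neutrino, in eV

    # Vacuum term in the flavor basis.
    H_vac = U @ np.diag([0.0, dm21, dm31]) @ U.conj().T / (2.0 * E)

    # Standard charged-current matter potential (order-of-magnitude value
    # for Earth's mantle, in eV; assumed for illustration).
    Vcc = 1.0e-13

    # NSI: flavor-violating eps_mutau and nonuniversal eps_tautau
    # (sizes chosen arbitrarily for illustration).
    eps = np.zeros((3, 3), dtype=complex)
    eps[1, 2] = 0.01
    eps[2, 1] = np.conj(eps[1, 2])
    eps[2, 2] = 0.02

    # Full flavor-basis Hamiltonian: NSI scale the matter term.
    H = H_vac + Vcc * (np.diag([1.0, 0.0, 0.0]) + eps)
    ```

    Oscillation probabilities follow from the eigensystem of H along the neutrino's path through the Earth; the ε terms shift the eigenvalue differences, and hence the oscillation phases, relative to the standard-interaction case.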