1,968 research outputs found

    The Construction and Operation of a Reaction Time Machine

    The Encyclopedia Britannica states that "one of the cardinal problems of psychophysics is the measurement of the duration of the mental processes." To accomplish this purpose it is necessary to have a chronograph or chronoscope measuring in small fractions of a second, with connections suitable to the experiment.

    Dual modification of Alzheimer’s disease PHF-tau protein by lysine methylation and ubiquitylation: a mass spectrometry approach

    In sporadic Alzheimer’s disease (AD), neurofibrillary lesion formation is preceded by extensive post-translational modification of the microtubule-associated protein tau. To identify the modification signature associated with tau lesion formation at single amino acid resolution, immunopurified paired helical filaments were isolated from AD brain and subjected to nanoflow liquid chromatography–tandem mass spectrometry analysis. The resulting spectra identified monomethylation of lysine residues as a new tau modification. The methyl-lysine was distributed among seven residues located in the projection and microtubule binding repeat regions of tau protein, with one site, K254, being a substrate for a competing lysine modification, ubiquitylation. To characterize methyl-lysine content in intact tissue, hippocampal sections prepared from post-mortem late-stage AD cases were subjected to double-label confocal fluorescence microscopy using anti-tau and anti-methyl-lysine antibodies. Anti-methyl-lysine immunoreactivity colocalized with 78 ± 13% of neurofibrillary tangles in these specimens. Together these data provide the first evidence that tau in neurofibrillary lesions is post-translationally modified by lysine methylation.

    Multi-messenger observations of a binary neutron star merger

    On 2017 August 17 a binary neutron star coalescence candidate (later designated GW170817) with merger time 12:41:04 UTC was observed through gravitational waves by the Advanced LIGO and Advanced Virgo detectors. The Fermi Gamma-ray Burst Monitor independently detected a gamma-ray burst (GRB 170817A) with a time delay of ~1.7 s with respect to the merger time. From the gravitational-wave signal, the source was initially localized to a sky region of 31 deg^2 at a luminosity distance of 40 (+8/-8) Mpc and with component masses consistent with neutron stars. The component masses were later measured to be in the range 0.86 to 2.26 solar masses. An extensive observing campaign was launched across the electromagnetic spectrum, leading to the discovery of a bright optical transient (SSS17a, now with the IAU identification of AT 2017gfo) in NGC 4993 (at ~40 Mpc) less than 11 hours after the merger by the One-Meter, Two Hemisphere (1M2H) team using the 1 m Swope Telescope. The optical transient was independently detected by multiple teams within an hour. Subsequent observations targeted the object and its environment. Early ultraviolet observations revealed a blue transient that faded within 48 hours. Optical and infrared observations showed a redward evolution over ~10 days. Following early non-detections, X-ray and radio emission were discovered at the transient’s position ~9 and ~16 days, respectively, after the merger. Both the X-ray and radio emission likely arise from a physical process that is distinct from the one that generates the UV/optical/near-infrared emission. No ultra-high-energy gamma-rays and no neutrino candidates consistent with the source were found in follow-up searches. These observations support the hypothesis that GW170817 was produced by the merger of two neutron stars in NGC 4993, followed by a short gamma-ray burst (GRB 170817A) and a kilonova/macronova powered by the radioactive decay of r-process nuclei synthesized in the ejecta.

    Measurement of associated Z plus charm production in proton-proton collisions at root s=8TeV

    A study of the associated production of a Z boson and a charm quark jet (Z + c), and a comparison to production with a b quark jet (Z + b), in pp collisions at a centre-of-mass energy of 8 TeV are presented. The analysis uses a data sample corresponding to an integrated luminosity of 19.7 fb(-1), collected with the CMS detector at the CERN LHC. The Z boson candidates are identified through their decays into pairs of electrons or muons. Jets originating from heavy flavour quarks are identified using semileptonic decays of c or b flavoured hadrons and hadronic decays of charm hadrons. The measurements are performed in the kinematic region with two leptons with pT(l) > 20 GeV and |eta(l)| < 2.1, and a heavy flavour jet with pT(jet) > 25 GeV and |eta(jet)| < 2.5. The Z + c production cross section is measured to be sigma(pp -> Z + c + X) B(Z -> l(+)l(-)) = 8.8 +/- 0.5 (stat) +/- 0.6 (syst) pb. The ratio of the Z + c and Z + b production cross sections is measured to be sigma(pp -> Z + c + X)/sigma(pp -> Z + b + X) = 2.0 +/- 0.2 (stat) +/- 0.2 (syst). The Z + c production cross section and the cross section ratio are also measured as a function of the transverse momentum of the Z boson and of the heavy flavour jet. The measurements are compared with theoretical predictions.

    Measurement of the underlying event activity in inclusive Z boson production in proton-proton collisions at root s=13 TeV

    This paper presents a measurement of the underlying event activity in proton-proton collisions at a center-of-mass energy of 13 TeV, performed using inclusive Z boson production events collected with the CMS experiment at the LHC. The analyzed data correspond to an integrated luminosity of 2.1 fb(-1). The underlying event activity is quantified in terms of the charged particle multiplicity, as well as of the scalar sum of the charged particles' transverse momenta, in different topological regions defined with respect to the Z boson direction. The distributions are unfolded to the stable particle level and compared with predictions from various Monte Carlo event generators, as well as with similar CDF and CMS measurements at center-of-mass energies of 1.96 and 7 TeV, respectively.

    Search for a singly produced third-generation scalar leptoquark decaying to a tau lepton and a bottom quark in proton-proton collisions at root s=13 TeV

    A search is presented for a singly produced third-generation scalar leptoquark decaying to a tau lepton and a bottom quark. Associated production of a leptoquark and a tau lepton is considered, leading to a final state with a bottom quark and two tau leptons. The search uses proton-proton collision data at a center-of-mass energy of 13 TeV recorded with the CMS detector, corresponding to an integrated luminosity of 35.9 fb(-1). Upper limits are set at 95% confidence level on the production cross section of the third-generation scalar leptoquarks as a function of their mass. From a comparison of the results with the theoretical predictions, a third-generation scalar leptoquark decaying to a tau lepton and a bottom quark, assuming unit Yukawa coupling (lambda), is excluded for masses below 740 GeV. Limits are also set on lambda of the hypothesized leptoquark as a function of its mass. Above lambda = 1.4, this result provides the best upper limit on the mass of a third-generation scalar leptoquark decaying to a tau lepton and a bottom quark.

    Measurement of differential cross sections in the kinematic angular variable phi* for inclusive Z boson production in pp collisions at root s=8 TeV

    Measurements of differential cross sections d sigma/d phi* and double-differential cross sections d^2 sigma/(d phi* d|y|) for inclusive Z boson production are presented using the dielectron and dimuon final states. The kinematic observable phi* correlates with the dilepton transverse momentum but has better resolution, and y is the dilepton rapidity. The analysis is based on data collected with the CMS experiment at a centre-of-mass energy of 8 TeV, corresponding to an integrated luminosity of 19.7 fb(-1). The normalised cross section (1/sigma) d sigma/d phi*, within the fiducial kinematic region, is measured with a precision of better than 0.5% for phi* < 1. The measurements are compared to theoretical predictions and they agree, typically, within a few percent.
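
    The abstract does not define phi*; as background (not taken from the paper), the sketch below implements the phi*_eta observable in the form commonly used in the literature, built from the lepton directions only, which is what gives it better experimental resolution than the dilepton transverse momentum. The function name and the example lepton kinematics are illustrative.

```python
import math

def phi_star_eta(eta_minus, phi_minus, eta_plus, phi_plus):
    """phi*_eta for a dilepton pair, computed from lepton directions only."""
    # Acoplanarity angle: pi minus the azimuthal opening angle of the two leptons.
    dphi = abs(phi_minus - phi_plus)
    if dphi > math.pi:
        dphi = 2.0 * math.pi - dphi
    phi_acop = math.pi - dphi
    # Scattering angle of the leptons with respect to the beam direction.
    cos_theta_star = math.tanh((eta_minus - eta_plus) / 2.0)
    sin_theta_star = math.sqrt(1.0 - cos_theta_star ** 2)
    return math.tan(phi_acop / 2.0) * sin_theta_star

# Illustrative, nearly back-to-back lepton pair (made-up values):
print(phi_star_eta(eta_minus=0.5, phi_minus=0.1, eta_plus=-0.3, phi_plus=3.1))
```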

    Constraints on models of scalar and vector leptoquarks decaying to a quark and a neutrino at root s=13 TeV

    The results of a previous search by the CMS Collaboration for squarks and gluinos are reinterpreted to constrain models of leptoquark (LQ) production. The search considers jets in association with a transverse momentum imbalance, using the M_T2 variable. The analysis uses proton-proton collision data at root s = 13 TeV, recorded with the CMS detector at the LHC in 2016 and corresponding to an integrated luminosity of 35.9 fb(-1). Leptoquark pair production is considered, with LQ decays to a neutrino and a top, bottom, or light quark. This reinterpretation considers higher mass values than the original CMS search to constrain both scalar and vector LQs. Limits on the cross section for LQ pair production are derived at the 95% confidence level, depending on the LQ decay mode. A vector LQ decaying with a 50% branching fraction to t nu, and 50% to b tau, has been proposed as part of an explanation of anomalous flavor physics results. In such a model, using only the decays to t nu, LQ masses below 1530 GeV are excluded assuming the Yang-Mills case with coupling kappa = 1, or 1115 GeV in the minimal coupling case kappa = 0, placing the most stringent constraint to date from pair production of vector LQs.
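
    The M_T2 variable is only named in this abstract; as an illustration of the underlying construction (not the implementation used in the CMS analysis), the sketch below computes M_T2 numerically for two visible objects plus missing transverse momentum, assuming massless visible objects and a configurable invisible-particle mass.

```python
import numpy as np
from scipy.optimize import minimize

def mt2(p1, p2, ptmiss, m_invis=0.0):
    """Numerical M_T2: minimise, over all splittings of the missing transverse
    momentum into two invisible vectors, the larger of the two transverse masses.
    p1, p2, ptmiss are 2D transverse-momentum vectors (visible objects massless)."""
    p1, p2, ptmiss = (np.asarray(v, dtype=float) for v in (p1, p2, ptmiss))

    def mt_sq(p_vis, q_inv):
        et_vis = np.linalg.norm(p_vis)                     # massless visible object
        et_inv = np.hypot(np.linalg.norm(q_inv), m_invis)  # sqrt(|q|^2 + m^2)
        return m_invis**2 + 2.0 * (et_vis * et_inv - np.dot(p_vis, q_inv))

    def worst_transverse_mass_sq(q):
        q1 = np.asarray(q)
        q2 = ptmiss - q1
        return max(mt_sq(p1, q1), mt_sq(p2, q2))

    # Nelder-Mead copes with the non-smooth max(); the objective is convex,
    # so the local minimum it finds is the global one.
    res = minimize(worst_transverse_mass_sq, x0=ptmiss / 2.0, method="Nelder-Mead")
    return float(np.sqrt(res.fun))

# Illustrative event (GeV, made-up numbers): two jets plus missing momentum.
print(mt2(p1=[120.0, 30.0], p2=[-80.0, 40.0], ptmiss=[-40.0, -70.0]))
```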

    The United States COVID-19 Forecast Hub dataset

    Academic researchers, government agencies, industry groups, and individuals have produced forecasts at an unprecedented scale during the COVID-19 pandemic. To leverage these forecasts, the United States Centers for Disease Control and Prevention (CDC) partnered with an academic research lab at the University of Massachusetts Amherst to create the US COVID-19 Forecast Hub. Launched in April 2020, the Forecast Hub is a dataset with point and probabilistic forecasts of incident cases, incident hospitalizations, incident deaths, and cumulative deaths due to COVID-19 at the county, state, and national levels in the United States. Included forecasts represent a variety of modeling approaches, data sources, and assumptions regarding the spread of COVID-19. The goal of this dataset is to establish a standardized and comparable set of short-term forecasts from modeling teams. These data can be used to develop ensemble models, communicate forecasts to the public, create visualizations, compare models, and inform policies regarding COVID-19 mitigation. These open-source data are available for download from GitHub, through an online API, and through R packages.
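
    As an illustration of how these data might be consumed in practice, the sketch below reads one model's submission file with pandas. The file name is a placeholder, and the column names and target string follow the hub's CSV submission format as I understand it, so they should be checked against the repository documentation.

```python
import pandas as pd

# Placeholder path to one submission file downloaded from the Forecast Hub repository.
path = "2020-06-01-ExampleTeam-ExampleModel.csv"

# Columns assumed from the hub's submission format:
# forecast_date, target, target_end_date, location, type, quantile, value.
df = pd.read_csv(path, dtype={"location": str})

# Select 1-week-ahead incident death forecasts for the national level ("US"),
# then separate the point forecast from the quantile forecasts.
deaths = df[(df["target"] == "1 wk ahead inc death") & (df["location"] == "US")]
point = deaths.loc[deaths["type"] == "point", "value"]
quantiles = (deaths[deaths["type"] == "quantile"]
             .pivot(index="target_end_date", columns="quantile", values="value"))

print("point forecast:", point.tolist())
print(quantiles[[0.025, 0.5, 0.975]])  # median and a central 95% interval
```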

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur`an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, given the human inability to predict the future precisely, as written in Al-Quran surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective is to minimize the variance among all portfolios that achieve at least a certain expected return, or alternatively, to maximize the expected return among all portfolios whose risk does not exceed a given level. This study focuses on optimizing the risk portfolio using the so-called Markowitz MVO (Mean-Variance Optimization) model. The theoretical framework for the analysis includes the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimizing the objective function x^T Q x subject to constraints of the form mu^T x >= R and Ax = b. The outcome of this research is the solution of the optimal risk portfolio for several investments, obtained using MATLAB R2007b together with its graphical analysis.
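
    The study solves this quadratic program in MATLAB R2007b; as a rough Python illustration of the same mean-variance formulation, the sketch below minimizes portfolio variance subject to a minimum expected return and fully-invested, long-only weights. The toy expected returns, covariance matrix, and return target are made up, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy inputs (made up): expected returns and covariance of three assets.
mu = np.array([0.08, 0.12, 0.10])
cov = np.array([[0.09, 0.01, 0.02],
                [0.01, 0.16, 0.03],
                [0.02, 0.03, 0.12]])
r_target = 0.10                       # minimum acceptable expected return

def variance(w):
    # Objective of the quadratic program: portfolio variance w^T Q w.
    return w @ cov @ w

constraints = [
    {"type": "eq",   "fun": lambda w: np.sum(w) - 1.0},    # fully invested
    {"type": "ineq", "fun": lambda w: mu @ w - r_target},  # mu^T w >= r_target
]
bounds = [(0.0, 1.0)] * len(mu)                            # long-only weights

res = minimize(variance, x0=np.full(len(mu), 1.0 / len(mu)),
               method="SLSQP", bounds=bounds, constraints=constraints)

print("optimal weights:", np.round(res.x, 3))
print("expected return:", float(mu @ res.x))
print("portfolio variance:", float(res.fun))
```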