
    Dense matter with eXTP

    In this White Paper we present the potential of the Enhanced X-ray Timing and Polarimetry (eXTP) mission for determining the nature of dense matter; neutron star cores host an extreme density regime that cannot be replicated in a terrestrial laboratory. The tightest statistical constraints on the dense matter equation of state will come from pulse profile modelling of accretion-powered pulsars, burst oscillation sources, and rotation-powered pulsars. Additional constraints will derive from spin measurements, burst spectra, and properties of the accretion flows in the vicinity of the neutron star. Under development by an international Consortium led by the Institute of High Energy Physics of the Chinese Academy of Sciences, the eXTP mission is expected to be launched in the mid 2020s. Comment: Accepted for publication in Sci. China Phys. Mech. Astron. (2019).

    Detection of cannabinoid receptor type 2 in native cells and zebrafish with a highly potent, cell-permeable fluorescent probe.

    Despite its essential role in the (patho)physiology of several diseases, CB2R tissue expression profiles and signaling mechanisms are not yet fully understood. We report the development of a highly potent, fluorescent CB2R agonist probe employing structure-based reverse design. It commences with a highly potent, preclinically validated ligand, which is conjugated to a silicon-rhodamine fluorophore, enabling cell permeability. The probe is the first to preserve interspecies affinity and selectivity for both mouse and human CB2R. Extensive cross-validation (FACS, TR-FRET, and confocal microscopy) set the stage for CB2R detection in endogenously expressing living cells along with zebrafish larvae. Together, these findings will benefit the clinical translatability of CB2R-based drugs.

    PEDIA: prioritization of exome data by image analysis.

    PURPOSE: Phenotype information is crucial for the interpretation of genomic variants. So far it has only been accessible for bioinformatics workflows after encoding into clinical terms by expert dysmorphologists. METHODS: Here, we introduce an approach driven by artificial intelligence that uses portrait photographs for the interpretation of clinical exome data. We measured the value added by computer-assisted image analysis to the diagnostic yield on a cohort consisting of 679 individuals with 105 different monogenic disorders. For each case in the cohort we compiled frontal photos, clinical features, and the disease-causing variants, and simulated multiple exomes of different ethnic backgrounds. RESULTS: The additional use of similarity scores from computer-assisted analysis of frontal photos improved the top-1 accuracy rate by more than 20%, to 89%, and the top-10 accuracy rate by more than 5%, to 99%, for the disease-causing gene. CONCLUSION: Image analysis by deep-learning algorithms can be used to quantify phenotypic similarity (PP4 criterion of the American College of Medical Genetics and Genomics guidelines) and to advance the performance of bioinformatics pipelines for exome analysis.
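    The re-ranking idea described above can be sketched as simple per-gene score aggregation. This is a minimal illustration, not the PEDIA model: the gene names, score components, and equal weighting are all assumptions made for demonstration.

    ```python
    # Toy gene prioritization: sum per-gene evidence scores, where adding an
    # image-similarity score can promote a gene that variant data alone ranks lower.
    # All names and values below are illustrative, not from the study.

    def combined_score(variant_score, phenotype_score, image_score):
        """Sum the evidence scores for one candidate gene; higher is better."""
        return variant_score + phenotype_score + image_score

    candidates = {
        # gene: (variant deleteriousness, clinical-feature match, face-image similarity)
        "GENE_A": (0.7, 0.4, 0.9),
        "GENE_B": (0.8, 0.5, 0.1),
        "GENE_C": (0.3, 0.2, 0.2),
    }

    ranking = sorted(candidates, key=lambda g: combined_score(*candidates[g]),
                     reverse=True)
    print(ranking)  # GENE_A moves to the top once image similarity is included
    ```

    Without the image score, GENE_B (0.8 + 0.5) would outrank GENE_A (0.7 + 0.4); the image term reverses that order, which is the effect the reported accuracy gains quantify.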

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, reflecting the human inability to predict the future precisely as described in Al-Qur'an surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective here is to minimize the variance among all portfolios, or alternatively, to maximize expected return among all portfolios that have at least a certain expected return. Furthermore, this study focuses on optimizing the risk portfolio using the Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks for the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Moreover, finding a minimum-variance portfolio produces a convex quadratic program: minimize the objective function xᵀΣx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the solution of the optimal risk portfolio for a set of investments, computed smoothly using MATLAB R2007b software together with its graphical analysis.
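    The quadratic program above has a closed form when the return constraint is taken with equality. Below is a minimal sketch in Python (the study itself used MATLAB R2007b); the expected returns, covariance matrix, and return target are illustrative assumptions, not data from the study.

    ```python
    import numpy as np

    # Closed-form mean-variance optimization via Lagrange multipliers:
    # minimize x^T Sigma x  subject to  mu^T x = r  and  1^T x = 1.
    # The inputs below are made-up example values.
    mu = np.array([0.08, 0.12, 0.10])          # expected asset returns
    Sigma = np.array([[0.10, 0.01, 0.02],
                      [0.01, 0.12, 0.03],
                      [0.02, 0.03, 0.08]])     # return covariance matrix
    r = 0.10                                   # target portfolio return

    A = np.vstack([mu, np.ones(3)])            # stacked equality constraints A x = b
    b = np.array([r, 1.0])
    Sinv = np.linalg.inv(Sigma)

    # Optimal weights: x = Sigma^-1 A^T (A Sigma^-1 A^T)^-1 b
    x = Sinv @ A.T @ np.linalg.solve(A @ Sinv @ A.T, b)
    print(x.round(4))                          # minimum-variance weights
    ```

    With inequality constraints (for example, long-only weights x ≥ 0) the closed form no longer applies and a numerical quadratic-programming solver is needed, which is what the MVO formulation in the study calls for.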

    Left ventricular function, congestion, and effect of empagliflozin on heart failure risk after myocardial infarction

    Background Empagliflozin reduces the risk of heart failure (HF) hospitalizations but not all-cause mortality when started within 14 days of acute myocardial infarction (AMI). Objective To evaluate the association between left ventricular ejection fraction (LVEF), congestion, or both on outcomes and the impact of empagliflozin in reducing HF risk post-MI. Methods In the EMPACT-MI trial, patients were randomized within 14 days of an AMI complicated by either newly reduced LVEF <45%, congestion, or both to empagliflozin 10 mg daily or placebo and followed for a median of 17.9 months. Results Among 6522 patients, the mean baseline LVEF was 41% ± 9%; 2648 patients (40.6%) presented with LVEF <45% alone, 1483 (22.7%) presented with congestion alone, and 2181 (33.4%) presented with both. Among patients in the placebo arm, the multivariable-adjusted risk for each 10-point reduction in LVEF included all-cause death or HF hospitalization (hazard ratio [HR] 1.49; 95% CI, 1.31-1.69; P<0.0001), first HF hospitalization (HR, 1.64; 95% CI, 1.37-1.96; P<0.0001), and total HF hospitalizations (rate ratio [RR], 1.89; 95% CI, 1.51-2.36; P<0.0001). Presence of congestion was also associated with a significantly higher risk for each of these outcomes (HR 1.52 and 1.94, and RR 2.03, respectively). Empagliflozin reduced the risk for first (HR 0.77, 95% CI 0.60-0.98) and total (RR 0.67, 95% CI 0.50-0.89) HF hospitalization, irrespective of LVEF, congestion, or both. The safety profile of empagliflozin was consistent across baseline LVEF and irrespective of congestion status. Conclusions In patients with AMI, the severity of LV dysfunction and the presence of congestion were associated with worse outcomes. Empagliflozin reduced first and total HF hospitalizations across the range of LVEF, with and without congestion.
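    A per-10-point hazard ratio like the HR 1.49 reported above scales multiplicatively under the usual Cox proportional-hazards assumption that the log hazard is linear in LVEF. The small sketch below shows that arithmetic only; it is an interpretation aid, not part of the trial analysis, and confidence intervals do not scale this simply.

    ```python
    # Scale a per-10-point hazard ratio to other LVEF reductions, assuming
    # log-linearity of the hazard in LVEF (a modelling assumption, not a
    # result from the trial).
    hr_per_10 = 1.49   # reported HR for death or HF hospitalization per 10-point LVEF drop

    def hazard_ratio(lvef_drop_points, hr10=hr_per_10):
        """HR for an arbitrary LVEF reduction: HR10 ** (drop / 10)."""
        return hr10 ** (lvef_drop_points / 10)

    print(round(hazard_ratio(20), 2))  # a 20-point lower LVEF implies HR ~2.22
    ```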

    Search for heavy resonances decaying to two Higgs bosons in final states containing four b quarks

    A search is presented for narrow heavy resonances X decaying into pairs of Higgs bosons (H) in proton-proton collisions collected by the CMS experiment at the LHC at √s = 8 TeV. The data correspond to an integrated luminosity of 19.7 fb⁻¹. The search considers HH resonances with masses between 1 and 3 TeV, having final states of two b quark pairs. Each Higgs boson is produced with large momentum, and the hadronization products of the pair of b quarks can usually be reconstructed as single large jets. The background from multijet and tt̄ events is significantly reduced by applying requirements related to the flavor of the jet, its mass, and its substructure. The signal would be identified as a peak on top of the dijet invariant mass spectrum of the remaining background events. No evidence is observed for such a signal. Upper limits obtained at 95% confidence level for the product of the production cross section and branching fraction σ(gg → X) B(X → HH → bb̄bb̄) range from 10 to 1.5 fb for the mass of X from 1.15 to 2.0 TeV, significantly extending previous searches. For a warped extra dimension theory with a mass scale Λ_R = 1 TeV, the data exclude radion scalar masses between 1.15 and 1.55 TeV.

    Measurement of tt̄ normalised multi-differential cross sections in pp collisions at √s = 13 TeV, and simultaneous determination of the strong coupling strength, top quark pole mass, and parton distribution functions

    Peer reviewed.

    An embedding technique to determine ττ backgrounds in proton-proton collision data

    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated tau leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the tau leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV corresponding to an integrated luminosity of 41.5 fb⁻¹.