    Human-based approaches to pharmacology and cardiology: an interdisciplinary and intersectorial workshop.

    Both biomedical research and clinical practice rely on complex datasets for the physiological and genetic characterization of human hearts in health and disease. Given the complexity and variety of approaches and recordings, there is now growing recognition of the need to embed computational methods in cardiovascular medicine and science for analysis, integration and prediction. This paper describes a Workshop on Computational Cardiovascular Science that created an international, interdisciplinary and inter-sectorial forum to define the next steps for a human-based approach to disease supported by computational methodologies. The main ideas highlighted were: (i) a shift towards human-based methodologies is under way, spurred by advances in new in silico, in vivo, in vitro, and ex vivo techniques and the increasing acknowledgement of the limitations of animal models. (ii) Computational approaches complement, expand, bridge, and integrate in vitro, in vivo, and ex vivo experimental and clinical data and methods, and as such they are an integral part of human-based methodologies in pharmacology and medicine. (iii) The effective implementation of multi- and interdisciplinary approaches, teams, and training combining and integrating computational methods with experimental and clinical approaches across academia, industry, and healthcare settings is a priority. (iv) The human-based cross-disciplinary approach requires experts in specific methodologies and domains who also have the capacity to communicate and collaborate across disciplines and cross-sector environments. (v) This new translational domain for human-based cardiology and pharmacology requires new partnerships supported financially and institutionally across sectors. Institutional, organizational, and social barriers must be identified, understood, and overcome in each specific setting.

    A high efficiency photon veto for the Light Dark Matter eXperiment

    Fixed-target experiments using primary electron beams can be powerful discovery tools for light dark matter in the sub-GeV mass range. The Light Dark Matter eXperiment (LDMX) is designed to measure missing momentum in high-rate electron fixed-target reactions with beam energies of 4 GeV to 16 GeV. A prerequisite for achieving several important sensitivity milestones is the capability to efficiently reject backgrounds associated with few-GeV bremsstrahlung, by twelve orders of magnitude, while maintaining high efficiency for signal. The primary challenge arises from events with photo-nuclear reactions faking the missing-momentum property of a dark matter signal. We present a methodology developed for the LDMX detector concept that is capable of the required rejection. By employing a detailed Geant4-based model of the detector response, we demonstrate that the sampling calorimetry proposed for LDMX can achieve better than 10⁻¹³ rejection of few-GeV photons. This suggests that the luminosity-limited sensitivity of LDMX can be realized at 4 GeV and higher beam energies.

    Measurement of the ratio of the inclusive 3-jet cross section to the inclusive 2-jet cross section in pp collisions at √s = 7 TeV and first determination of the strong coupling constant in the TeV range

    A measurement is presented of the ratio of the inclusive 3-jet cross section to the inclusive 2-jet cross section as a function of the average transverse momentum, ⟨p_T1,2⟩, of the two leading jets in the event. The data sample was collected during 2011 at a proton–proton centre-of-mass energy of 7 TeV with the CMS detector at the LHC, corresponding to an integrated luminosity of 5.0 fb⁻¹. The strong coupling constant at the scale of the Z boson mass is determined to be α_S(M_Z) = 0.1148 ± 0.0014 (exp.) ± 0.0018 (PDF) ± 0.0050 (theory), by comparing the ratio in the range 0.42 < ⟨p_T1,2⟩ < 1.39 TeV to the predictions of perturbative QCD at next-to-leading order. This is the first determination of α_S(M_Z) from measurements at momentum scales beyond 0.6 TeV. The predicted ratio depends only indirectly on the evolution of the parton distribution functions of the proton, so this measurement also serves as a test of the evolution of the strong coupling constant. No deviation from the expected behaviour is observed.

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future from the Perspective of the Al-Qur'an

    Risk portfolio analysis in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk (human beings cannot predict the future precisely, as written in Al-Qur'an surah Luqman verse 34), they have to manage it to yield an optimal portfolio. The objective is to minimize the variance among all portfolios, or alternatively, to maximize expected return among all portfolios that attain at least a certain expected return. This study focuses on optimizing the risk portfolio using the Markowitz MVO (Mean-Variance Optimization) model. The theoretical tools used in the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimize the objective function xᵀΣx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the optimal risk portfolio for a set of investments, computed smoothly using the MATLAB R2007b software together with its graphical analysis.
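    The mean-variance step described in the abstract (minimize xᵀΣx subject to μᵀx ≥ r and Ax = b) can be sketched with a general-purpose solver. The asset data below are hypothetical, and SciPy's SLSQP method stands in for the MATLAB quadratic-programming routine used in the study:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical data for three assets: expected returns (mu) and a
    # positive-definite covariance matrix (Sigma). Illustrative only.
    mu = np.array([0.10, 0.07, 0.05])
    Sigma = np.array([[0.090, 0.020, 0.010],
                      [0.020, 0.040, 0.015],
                      [0.010, 0.015, 0.0225]])
    r_min = 0.08  # required minimum expected portfolio return

    def variance(x):
        # Objective: portfolio variance x^T Sigma x
        return x @ Sigma @ x

    constraints = [
        {"type": "eq",   "fun": lambda x: np.sum(x) - 1.0},  # Ax = b: weights sum to 1
        {"type": "ineq", "fun": lambda x: mu @ x - r_min},   # mu^T x >= r
    ]
    bounds = [(0.0, 1.0)] * 3  # no short selling

    result = minimize(variance, x0=np.ones(3) / 3,
                      method="SLSQP", bounds=bounds, constraints=constraints)
    weights = result.x  # minimum-variance weights meeting the return target
    ```

    The same convex problem can be handed to any quadratic-programming solver; SLSQP is used here only because it ships with SciPy and handles the mixed equality/inequality constraints directly.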

    Juxtaposing BTE and ATE – on the role of the European insurance industry in funding civil litigation