
    Diagnosis of choroidal disease with deep learning-based image enhancement and volumetric quantification of optical coherence tomography

    Purpose: The purpose of this study was to quantify choroidal vessels (CVs) in pathological eyes in three dimensions (3D) using optical coherence tomography (OCT) and a deep-learning analysis. Methods: In a single-center retrospective study, 34 eyes of 34 patients (7 women and 27 men) with treatment-naïve central serous chorioretinopathy (CSC) and 33 eyes of 17 patients (7 women and 10 men) with Vogt-Koyanagi-Harada disease (VKH) or sympathetic ophthalmia (SO) were imaged consecutively between October 2012 and May 2019 with a swept-source OCT. Seventy-seven eyes of 39 age-matched volunteers (26 women and 13 men) with no sign of ocular pathology were imaged for comparison. A deep-learning-based image enhancement pipeline enabled CV segmentation and visualization in 3D, after which quantitative vessel volume maps were acquired to compare normal and diseased eyes and to track the clinical course of eyes in the disease group. Region-based vessel volumes and vessel indices were used for disease diagnosis. Results: OCT-based CV volume maps disclose regional CV changes in patients with CSC, VKH, or SO. Three metrics, (i) choroidal volume, (ii) CV volume, and (iii) CV index, exhibit high sensitivity and specificity in discriminating pathological choroids from healthy ones. Conclusions: The deep-learning analysis of OCT images described here provides a 3D visualization of the choroid and allows quantification of features in the datasets to identify choroidal disease and distinguish between different diseases. Translational Relevance: This novel analysis can be applied retrospectively to existing OCT datasets, and it represents a significant advance toward the automated diagnosis of choroidal pathologies based on observations and quantifications of the vasculature.
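The three metrics named in the Results follow directly from binary segmentation masks once the voxel size is known. A minimal sketch in Python, assuming hypothetical 3D boolean masks for the choroid and the segmented vessels and an illustrative voxel volume (all names and values are made up, not taken from the paper):

```python
import numpy as np

def choroid_metrics(choroid_mask, vessel_mask, voxel_mm3):
    """Compute (i) choroidal volume, (ii) CV volume, (iii) CV index
    from boolean 3D masks and the physical volume of one voxel in mm^3."""
    choroid_volume = choroid_mask.sum() * voxel_mm3                   # (i)
    vessel_volume = (vessel_mask & choroid_mask).sum() * voxel_mm3    # (ii)
    vessel_index = vessel_volume / choroid_volume if choroid_volume else 0.0  # (iii)
    return choroid_volume, vessel_volume, vessel_index

# toy example: a 10x10x10 choroid block containing a 5x5x5 vessel core
choroid = np.ones((10, 10, 10), dtype=bool)
vessels = np.zeros((10, 10, 10), dtype=bool)
vessels[:5, :5, :5] = True
cv, vv, vi = choroid_metrics(choroid, vessels, voxel_mm3=0.001)
```

Here the choroid occupies 1000 voxels (1.0 mm³), the vessels 125 voxels (0.125 mm³), giving a CV index of 0.125; in the study these per-region values would feed the disease-classification step.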

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, human beings being unable to predict the future precisely, as written in the Al-Qur'an, surah Luqman, verse 34, they have to manage it to yield an optimal portfolio. The objective is to minimize the variance among all portfolios that attain at least a certain expected return, or, alternatively, to maximize the expected return among all portfolios whose risk does not exceed a given level. This study therefore focuses on optimizing the risk portfolio with the Markowitz MVO (Mean-Variance Optimization) model. The theoretical framework for the analysis comprises the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio yields a convex quadratic program: minimize the objective function xᵀΣx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the optimal risk portfolio for a set of investments, obtained with MATLAB R2007b software together with its graphical analysis.
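The quadratic program above can be sketched numerically. A minimal illustration, assuming a hypothetical covariance matrix Sigma, mean-return vector mu, and required return r_min for three assets (the study uses MATLAB; Python with SciPy is used here purely as a sketch of the same convex program):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical inputs (illustrative only): covariance and mean returns for 3 assets.
Sigma = np.array([[0.10, 0.02, 0.01],
                  [0.02, 0.08, 0.03],
                  [0.01, 0.03, 0.12]])
mu = np.array([0.05, 0.07, 0.10])
r_min = 0.07  # required minimum expected return

def variance(x):
    # objective: portfolio variance x' Sigma x
    return x @ Sigma @ x

constraints = [
    {"type": "eq",   "fun": lambda x: x.sum() - 1.0},   # budget constraint (Ax = b)
    {"type": "ineq", "fun": lambda x: mu @ x - r_min},  # return constraint (mu' x >= r)
]
bounds = [(0.0, 1.0)] * 3        # no short selling
x0 = np.full(3, 1.0 / 3.0)      # feasible starting point
res = minimize(variance, x0, bounds=bounds, constraints=constraints)
weights = res.x
```

The solver returns weights that sum to one and meet the return floor while minimizing variance; MATLAB's `quadprog` solves the identical problem in the form the abstract states.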

    Search for heavy resonances decaying to two Higgs bosons in final states containing four b quarks

    A search is presented for narrow heavy resonances X decaying into pairs of Higgs bosons (H) in proton-proton collisions collected by the CMS experiment at the LHC at √s = 8 TeV. The data correspond to an integrated luminosity of 19.7 fb⁻¹. The search considers HH resonances with masses between 1 and 3 TeV, having final states of two b quark pairs. Each Higgs boson is produced with large momentum, and the hadronization products of the pair of b quarks can usually be reconstructed as single large jets. The background from multijet and tt̄ events is significantly reduced by applying requirements related to the flavor of the jet, its mass, and its substructure. The signal would be identified as a peak on top of the dijet invariant mass spectrum of the remaining background events. No evidence is observed for such a signal. Upper limits obtained at 95% confidence level for the product of the production cross section and branching fraction σ(gg → X) B(X → HH → bb̄bb̄) range from 10 to 1.5 fb for the mass of X from 1.15 to 2.0 TeV, significantly extending previous searches. For a warped extra dimension theory with a mass scale Λ_R = 1 TeV, the data exclude radion scalar masses between 1.15 and 1.55 TeV.
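The search variable is the invariant mass of the two large-jet four-momenta. A short sketch of that computation, assuming illustrative jet kinematics (the pT, η, φ, and mass values are made up, not taken from the analysis):

```python
import numpy as np

def four_vector(pt, eta, phi, m):
    """(E, px, py, pz) from detector coordinates (pt, eta, phi, mass)."""
    px, py, pz = pt * np.cos(phi), pt * np.sin(phi), pt * np.sinh(eta)
    E = np.sqrt(m**2 + px**2 + py**2 + pz**2)
    return np.array([E, px, py, pz])

def invariant_mass(p1, p2):
    """Invariant mass of the summed four-momentum, m^2 = E^2 - |p|^2."""
    E, px, py, pz = p1 + p2
    return np.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# toy event: two back-to-back Higgs-candidate jets, pT = 600 GeV, m = 125 GeV
j1 = four_vector(600.0, 0.0, 0.0, 125.0)
j2 = four_vector(600.0, 0.0, np.pi, 125.0)
m_jj = invariant_mass(j1, j2)  # candidate resonance mass in GeV
```

In the analysis, a localized excess in the spectrum of m_jj over the smoothly falling background would signal the resonance X.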

    Measurement of the top quark mass using charged particles in pp collisions at √s = 8 TeV

    Peer reviewed.

    Search for Physics beyond the Standard Model in Events with Overlapping Photons and Jets

    Results are reported from a search for new particles that decay into a photon and two gluons, in events with jets. Novel jet substructure techniques are developed that allow photons to be identified in an environment densely populated with hadrons. The analyzed proton-proton collision data were collected by the CMS experiment at the LHC, in 2016 at √s = 13 TeV, and correspond to an integrated luminosity of 35.9 fb⁻¹. The spectra of total transverse hadronic energy of candidate events are examined for deviations from the standard model predictions. No statistically significant excess is observed over the expected background. The first cross section limits on new physics processes resulting in such events are set. The results are interpreted as upper limits on the rate of gluino pair production, utilizing a simplified stealth supersymmetry model. The excluded gluino masses extend up to 1.7 TeV, for a neutralino mass of 200 GeV, and exceed previous mass constraints set by analyses targeting events with isolated photons. Peer reviewed.

    Search for supersymmetry in events with one lepton and multiple jets in proton-proton collisions at √s = 13 TeV

    Peer reviewed.

    Measurement of tt̄ normalised multi-differential cross sections in pp collisions at √s = 13 TeV, and simultaneous determination of the strong coupling strength, top quark pole mass, and parton distribution functions

    Peer reviewed.

    Calibration of the CMS hadron calorimeters using proton-proton collision data at √s = 13 TeV

    Methods are presented for calibrating the hadron calorimeter system of the CMS detector at the LHC. The hadron calorimeters of the CMS experiment are sampling calorimeters of brass and scintillator, and are in the form of one central detector and two endcaps. These calorimeters cover pseudorapidities |η| < 3 and are calibrated using collision data. The energy scale of the outer calorimeters has been determined with test beam data and is confirmed through data with high transverse momentum jets. In this paper, we present the details of the calibration methods and accuracy. Peer reviewed.

    Measurement of the Jet Mass Distribution and Top Quark Mass in Hadronic Decays of Boosted Top Quarks in pp Collisions at √s = 13 TeV

    A measurement is reported of the jet mass distribution in hadronic decays of boosted top quarks produced in pp collisions at √s = 13 TeV. The data were collected with the CMS detector at the LHC and correspond to an integrated luminosity of 35.9 fb⁻¹. The measurement is performed in the lepton + jets channel of tt̄ events, where the lepton is an electron or muon. The products of the hadronic top quark decay t → bW → bqq̄′ are reconstructed as a single jet with transverse momentum larger than 400 GeV. The tt̄ cross section as a function of the jet mass is unfolded at the particle level and used to extract a value of the top quark mass of 172.6 ± 2.5 GeV. A novel jet reconstruction technique is used for the first time at the LHC, which improves the precision by a factor of 3 relative to an earlier measurement. This highlights the potential of measurements using boosted top quarks, where the new technique will enable future precision measurements. Peer reviewed.

    An embedding technique to determine ττ backgrounds in proton-proton collision data

    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated τ leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the τ leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV, corresponding to an integrated luminosity of 41.5 fb⁻¹. Peer reviewed.
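The muon-to-τ replacement step can be caricatured in a few lines. A toy sketch, assuming a simplified event record of (pT, η, φ) tuples; the real CMS workflow operates on fully reconstructed events and subsequently simulates the τ decays, which is omitted here:

```python
TAU_MASS = 1.777  # approximate tau lepton mass in GeV

def embed_taus(muon_pair):
    """Replace each reconstructed muon by a tau lepton carrying identical
    kinematics (pt, eta, phi); only the mass hypothesis is swapped.
    The taus would then be decayed by a dedicated simulation (not shown)."""
    hybrids = []
    for pt, eta, phi in muon_pair:
        hybrids.append({"pt": pt, "eta": eta, "phi": phi, "mass": TAU_MASS})
    return hybrids

# hypothetical Z -> mu mu candidate selected in data
event = [(45.0, 0.3, 1.2), (38.0, -0.7, -1.9)]
taus = embed_taus(event)
```

Because the kinematics are copied from data, the underlying event and associated jets in the hybrid event come from data as well, which is the point of the technique.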