95 research outputs found

    Tetrahedral Volume Reconstruction in X-Ray Tomography using GPU Architecture

    In this paper, we propose the use of the graphics processing unit (GPU) to accelerate a ray-tracing method in the framework of X-ray tomographic image reconstruction. We first describe an innovative iterative reconstruction method we have developed, based on a tetrahedral volume and a conjugate-gradient solver. Instead of voxels we use tetrahedra, which improves reconstruction quality and reduces the amount of data, since a lower-resolution volume suffices to fit the reconstructed object; this is an important point for the GPU implementation. We present the algorithms adapted to the GPU and compare the results obtained with those of the CPU.
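
    A minimal sketch of the kind of iterative loop the abstract describes is given below: a standard conjugate-gradient least-squares (CGLS) solver written in Python against two hypothetical callables, project and backproject, which stand in for the GPU ray-tracing forward and adjoint projectors acting on per-tetrahedron attenuation values. Names, signatures, and the iteration count are illustrative assumptions, not the authors' implementation.

        # CGLS sketch for tomographic reconstruction on a tetrahedral basis.
        # `project` and `backproject` are hypothetical stand-ins for the GPU
        # forward/adjoint ray-tracing operators (assumed, not from the paper).
        import numpy as np

        def cgls_reconstruct(project, backproject, sinogram, n_tets, n_iter=50):
            """Minimise ||A x - b||^2, with A the tetrahedral projector."""
            x = np.zeros(n_tets)          # one attenuation value per tetrahedron
            r = sinogram - project(x)     # residual in detector space
            s = backproject(r)            # gradient in volume space
            p = s.copy()
            gamma = float(s @ s)
            for _ in range(n_iter):
                q = project(p)
                alpha = gamma / float(q @ q)
                x += alpha * p            # update the per-tetrahedron values
                r -= alpha * q
                s = backproject(r)
                gamma_new = float(s @ s)
                p = s + (gamma_new / gamma) * p
                gamma = gamma_new
            return x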

    Adaptive Mesh Reconstruction in X-Ray Tomography

    This paper presents an X-ray tomographic reconstruction method based on an adaptive mesh, designed to directly obtain the usual gray-level reconstructed image together with its segmentation. The approach also reduces the number of unknowns throughout the reconstruction iterations and accelerates algebraic algorithms. Reconstruction is no longer based on a regular grid of voxels but on a mesh of irregular tetrahedra that is progressively adapted to the content of the image. Each iteration consists of two main steps that successively estimate the values of the mesh elements and segment the sample so that the grid fits the image content. The method was applied to numerical and experimental data. The results show that it provides reliable reconstructions and drastically reduces memory storage compared with usual pixel-based reconstructions.
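
    To illustrate the two-step iteration described above, the Python sketch below alternates a value-estimation step on the current mesh with a segmentation-driven mesh-adaptation step. The hooks solve_on_mesh, segment_values, and refine_mesh are hypothetical placeholders assumed for this illustration, not functions from the paper.

        # Schematic outer loop for adaptive-mesh reconstruction.  The three
        # callables are assumed placeholders for the algebraic update, the
        # segmentation, and the mesh adaptation described in the abstract.
        def adaptive_reconstruction(mesh, sinogram, solve_on_mesh,
                                    segment_values, refine_mesh, n_outer=10):
            values = None
            for _ in range(n_outer):
                # Step 1: estimate the value carried by each mesh element
                # from the measured projections on the current mesh.
                values = solve_on_mesh(mesh, sinogram, init=values)
                # Step 2: segment the estimate and adapt the mesh to the
                # image content, reducing the number of unknowns.
                labels = segment_values(values)
                mesh, values = refine_mesh(mesh, values, labels)
            return mesh, values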

    Primary open angle glaucoma: an overview on medical therapy

    The purpose of this review is to discuss the topics relevant to the use of intraocular pressure-lowering strategies, which remains the first-line approach in the management of glaucoma. Estimates of blindness from glaucoma and identification of risk factors remain of interest for all ophthalmologists. New functional tests offer promise for better detection and more accurate diagnosis of glaucoma. We finally discuss the impact of various glaucoma therapies and the principles of monotherapy and of fixed combinations, which offer benefits of convenience, cost, and safety.

    Upgrade of the TOTEM DAQ using the Scalable Readout System (SRS)

    The main goals of the TOTEM Experiment at the LHC are the measurement of the elastic and total p-p cross sections and the study of diffractive dissociation processes. At the LHC, collisions are produced at a rate of 40 MHz, imposing strong requirements on the Data Acquisition System (DAQ) in terms of trigger rate and data throughput. The TOTEM DAQ adopts a modular approach that, in standalone mode, is based on a VME bus system. The VME-based Front End Driver (FED) modules host mezzanines that receive data through optical fibres directly from the detectors. After data checks and formatting are applied in the mezzanine, the data are retransmitted to the VME interface and to another mezzanine card plugged into the FED module. The maximum bandwidth of the VME bus limits the first-level trigger (L1A) rate to 1 kHz. To remove the VME bottleneck and improve the scalability and overall capabilities of the DAQ, a new system was designed and built based on the Scalable Readout System (SRS) developed within the RD51 Collaboration. The project aims to increase the efficiency of the current readout system by providing higher bandwidth and improved data filtering, implementing a second-level trigger event selection based on hardware pattern-recognition algorithms. This goal is to be achieved while preserving maximum backward compatibility with the LHC Timing, Trigger and Control (TTC) system as well as with the CMS DAQ. The results obtained and the outlook of the project are reported. In particular, we describe the system architecture and the new Opto-FEC adapter card developed to connect the SRS to the FED mezzanine modules. A first test bench was built and validated during the last TOTEM data-taking period (February 2013). Readout of a set of three TOTEM Roman Pot silicon detectors was carried out to verify performance in the real LHC environment. In addition, the test allowed a check of data consistency and quality.
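
    As a rough illustration of the bandwidth argument above, the snippet below estimates the sustainable first-level trigger rate as link bandwidth divided by event size. The event size and bandwidth figures are placeholder assumptions for the sake of the arithmetic, not numbers quoted by the paper.

        # Back-of-the-envelope check: readout bandwidth caps the trigger rate.
        def max_trigger_rate_hz(link_bandwidth_bytes_per_s, event_size_bytes):
            """Highest L1A rate sustainable if every accepted event is shipped."""
            return link_bandwidth_bytes_per_s / event_size_bytes

        assumed_event_size = 40e3   # 40 kB per event (assumption)
        vme_like_link      = 40e6   # ~40 MB/s shared bus (assumption)
        srs_like_link      = 1e9    # ~1 GB/s aggregate links (assumption)

        print(max_trigger_rate_hz(vme_like_link, assumed_event_size))  # ~1 kHz
        print(max_trigger_rate_hz(srs_like_link, assumed_event_size))  # ~25 kHz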

    Clinical Features, Cardiovascular Risk Profile, and Therapeutic Trajectories of Patients with Type 2 Diabetes Candidate for Oral Semaglutide Therapy in the Italian Specialist Care

    Introduction: This study aimed to address therapeutic inertia in the management of type 2 diabetes (T2D) by investigating the potential of early treatment with oral semaglutide. Methods: A cross-sectional survey was conducted between October 2021 and April 2022 among specialists treating individuals with T2D. A scientific committee designed a data collection form covering demographics, cardiovascular risk, glucose control metrics, ongoing therapies, and physician judgments on treatment appropriateness. Participants completed anonymous patient questionnaires reflecting routine clinical encounters. The preferred therapeutic regimen for each patient was also identified. Results: The analysis was conducted on 4449 patients initiating oral semaglutide. The population had a relatively short disease duration (42%  60% of patients, and more often than sitagliptin or empagliflozin. Conclusion: The study supports the potential of early implementation of oral semaglutide as a strategy to overcome therapeutic inertia and enhance T2D management.

    Omecamtiv mecarbil in chronic heart failure with reduced ejection fraction, GALACTIC-HF: baseline characteristics and comparison with contemporary clinical trials

    Aims: The safety and efficacy of the novel selective cardiac myosin activator omecamtiv mecarbil in patients with heart failure with reduced ejection fraction (HFrEF) are being tested in the Global Approach to Lowering Adverse Cardiac Outcomes Through Improving Contractility in Heart Failure (GALACTIC-HF) trial. Here we describe the baseline characteristics of participants in GALACTIC-HF and how these compare with other contemporary trials. Methods and Results: Adults with established HFrEF, New York Heart Association (NYHA) functional class ≄ II, EF ≀ 35%, elevated natriuretic peptides, and either a current hospitalization for HF or a history of hospitalization/emergency department visit for HF within a year were randomized to either placebo or omecamtiv mecarbil (pharmacokinetic-guided dosing: 25, 37.5 or 50 mg bid). 8256 patients [male (79%), non-white (22%), mean age 65 years] were enrolled with a mean EF of 27%, ischemic etiology in 54%, NYHA class II in 53% and III/IV in 47%, and a median NT-proBNP of 1971 pg/mL. HF therapies at baseline were among the most effectively employed in contemporary HF trials. GALACTIC-HF randomized patients representative of recent HF registries and trials, with substantial numbers of patients also having characteristics understudied in previous trials, including more from North America (n = 1386), enrolled as inpatients (n = 2084), with systolic blood pressure < 100 mmHg (n = 1127), with estimated glomerular filtration rate < 30 mL/min/1.73 m2 (n = 528), and treated with sacubitril-valsartan at baseline (n = 1594). Conclusions: GALACTIC-HF enrolled a well-treated, high-risk population from both inpatient and outpatient settings, which will provide a definitive evaluation of the efficacy and safety of this novel therapy, as well as inform its potential future implementation.

    Search for a Light Charged Higgs Boson Decaying to a W Boson and a CP-Odd Higgs Boson in Final States with eΌΌ or ΌΌΌ in Proton-Proton Collisions at √s = 13 TeV

    A search for a light charged Higgs boson (H+) decaying to a W boson and a CP-odd Higgs boson (A) in final states with eΌΌ or ΌΌΌ is performed using data from pp collisions at √s = 13 TeV, recorded by the CMS detector at the LHC and corresponding to an integrated luminosity of 35.9 fb−1. In this search, it is assumed that the H+ boson is produced in decays of top quarks, and the A boson decays to two oppositely charged muons. The presence of signals for H+ boson masses between 100 and 160 GeV and A boson masses between 15 and 75 GeV is investigated. No evidence for the production of the H+ boson is found. Upper limits at 95% confidence level are obtained on the combined branching fraction for the decay chain, t→bH+→bW+A→bW+ÎŒ+Ό−, of 1.9×10−6 to 8.6×10−6, depending on the masses of the H+ and A bosons. These are the first limits for these decay modes of the H+ and A bosons.
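
    For context, the "combined branching fraction for the decay chain" quoted above is the product of the branching fractions of the successive decays. The toy snippet below only shows that bookkeeping; the input values are arbitrary placeholders, not measured or predicted numbers.

        # Combined branching fraction of a sequential decay chain is the
        # product of the per-step branching fractions (placeholder inputs).
        def combined_branching_fraction(b_t_to_bHplus, b_Hplus_to_WA, b_A_to_mumu):
            return b_t_to_bHplus * b_Hplus_to_WA * b_A_to_mumu

        print(combined_branching_fraction(1e-3, 1e-1, 3e-2))  # 3e-06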

    Search for dark matter particles produced in association with a Higgs boson in proton-proton collisions at √s = 13 TeV

    A search for dark matter (DM) particles is performed using events with a Higgs boson candidate and large missing transverse momentum. The analysis is based on proton-proton collision data at a center-of-mass energy of 13 TeV collected by the CMS experiment at the LHC in 2016, corresponding to an integrated luminosity of 35.9 fb−1. The search is performed in five Higgs boson decay channels: h → bb̄, γγ, τ+τ−, W+W−, and ZZ. The results from the individual channels are combined to maximize the sensitivity of the analysis. No significant excess over the expected standard model background is observed in any of the five channels or in their combination. Limits are set on DM production in the context of two simplified models. The results are also interpreted in terms of a spin-independent DM-nucleon scattering cross section and compared to those from direct-detection DM experiments. This is the first search for DM particles produced in association with a Higgs boson decaying to a pair of W or Z bosons, and the first statistical combination based on five Higgs boson decay channels.
    • 
