
    Accordion Effect During Percutaneous Coronary Intervention (PCI)

    Pseudo lesions that appear in a coronary artery during intervention due to guide wire manipulation are referred to as the “accordion phenomenon” and are infrequent occurrences during percutaneous coronary intervention of tortuous coronary arteries. The appearance of a pseudo lesion poses a diagnostic challenge to the interventionist and might lead to unnecessary intervention. The differential diagnosis includes coronary dissection, thrombus, and coronary spasm. Ischemia and hemodynamic compromise are possible complications. The common method to overcome this situation is pulling the guide wire out of the affected segments, leaving only the floppy segment in place. Pseudo lesions at times respond to intracoronary nitroglycerine; however, they are sometimes refractory. We hereby report a case of the accordion phenomenon during PCI of the right coronary artery and the technical challenges it imposed

    The Implementation of Cloud Computing as Strategic Technology for Sustainable Development Using Regression Analysis

    As information technology has advanced, organisations have come to rely increasingly on online cloud storage and computing services, and interest in cloud computing has grown rapidly in recent years. Many organisations use this technology as the central component of their information technology infrastructure. Cloud computing increases data processing efficiency across the variety of computing and storage systems available over the internet, and these approaches have advanced as a direct result of the computing procedures underlying the internet's core database and network architecture. Grid computing emerged in the 1990s; cloud computing and utility computing followed as new paradigms in 2005. One of the most distinguishing features of cloud computing services and infrastructure is the consolidation of several virtual computing components, including the central processing unit (CPU), network, storage, and memory, onto a single physical platform. A piece of software known as a hypervisor (used by VirtualBox and VMware, for example) isolates each virtual machine (VM), preventing one virtual disc or machine from directly accessing the memory and programmes of another within the same environment; this isolation can be reinforced with a firewall. Through hardware abstraction, the complexity of operating physical computer systems is concealed while their processing capacity is efficiently increased. Virtualization in the cloud offers several benefits, including scalability and multi-tenancy (one software programme serving many users at once). These properties are essential to cloud computing because they make resource sharing and pooling possible, which in turn yields increased business value, greater flexibility, and cost savings. Provisioning, the process of allocating resources from cloud providers to cloud virtualization users, is a critical step: to fulfil the requirements of its clientele, the cloud service provider must create an appropriate number of virtual machines and make sufficient resources available. This may be accomplished by one of three methods: advanced provisioning, dynamic provisioning, or user self-provisioning. Dynamic provisioning, the mechanism by which cloud services and resources are made available on demand, faces a number of challenges, including the correct configuration of virtual machines (VMs) and technological constraints such as disc space, processing power, memory, and network throughput. The scalability of virtual machines, the configuration of cloud systems, and other aspects of virtualization deployment may present further difficulties
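    For orientation, the following minimal Python sketch illustrates the dynamic-provisioning decision described above: a VM request is placed on the first physical host with enough remaining CPU, memory, and disk, and is rejected otherwise. This is an illustrative sketch only, not taken from the paper; all names (Host, VMRequest, provision) are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class Host:
        # Remaining physical capacity of one machine (illustrative units).
        cpu_cores: int
        memory_gb: int
        disk_gb: int
        vms: list = field(default_factory=list)

    @dataclass
    class VMRequest:
        name: str
        cpu_cores: int
        memory_gb: int
        disk_gb: int

    def provision(hosts, request):
        """Place the VM on the first host that can satisfy the request,
        deducting the resources it will consume (first-fit placement)."""
        for host in hosts:
            if (host.cpu_cores >= request.cpu_cores
                    and host.memory_gb >= request.memory_gb
                    and host.disk_gb >= request.disk_gb):
                host.cpu_cores -= request.cpu_cores
                host.memory_gb -= request.memory_gb
                host.disk_gb -= request.disk_gb
                host.vms.append(request.name)
                return host
        return None  # No capacity left: a real provisioner would queue or scale out.

    hosts = [Host(cpu_cores=16, memory_gb=64, disk_gb=500)]
    placed = provision(hosts, VMRequest("web-1", cpu_cores=4, memory_gb=8, disk_gb=50))
    print("placed" if placed else "rejected")

    A production provisioner would additionally track network throughput and rebalance placements over time, which is precisely where the VM configuration and scalability challenges noted above arise.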

    Soil conservation issues in India

    Despite years of study and substantial investment in remediation and prevention, soil erosion continues to be a major environmental problem with regard to land use in India and elsewhere around the world. Furthermore, changing climate and/or weather patterns are exacerbating the problem. Our objective was to review past and current soil conservation programmes in India to better understand how production-, environmental-, social-, economic- and policy-related issues have affected soil and water conservation and the incentives needed to address the most critical problems. We found that to achieve success in soil and water conservation policies, institutions and operations must be co-ordinated using a holistic approach. Watershed programmes have been shown to be one of the most effective strategies for bringing socio-economic change to different parts of India. Within both dryland and rainfed areas, watershed management has quietly revolutionized agriculture by aligning various sectors through technological soil and water conservation interventions and land-use diversification. Significant results associated with various watershed-scale soil and water conservation programmes and interventions that were effective for reducing land degradation and improving productivity in different parts of the country are discussed

    Search for heavy neutral leptons in final states with electrons, muons, and hadronically decaying tau leptons in proton-proton collisions at √s = 13 TeV

    A search for heavy neutral leptons (HNLs) of Majorana or Dirac type using proton-proton collision data at √s = 13 TeV is presented. The data were collected by the CMS experiment at the CERN LHC and correspond to an integrated luminosity of 138 fb−1. Events with three charged leptons (electrons, muons, and hadronically decaying tau leptons) are selected, corresponding to HNL production in association with a charged lepton and decay of the HNL to two charged leptons and a standard model (SM) neutrino. The search is performed for HNL masses between 10 GeV and 1.5 TeV. No evidence for an HNL signal is observed in data. Upper limits at 95% confidence level are found for the squared coupling strength of the HNL to SM neutrinos, considering exclusive coupling of the HNL to a single SM neutrino generation, for both Majorana and Dirac HNLs. The limits exceed previously achieved experimental constraints for a wide range of HNL masses, and the limits on tau neutrino coupling scenarios with HNL masses above the W boson mass are presented for the first time
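    For context (standard notation assumed here, not quoted from the abstract): in such searches the HNL N is taken to mix with a single SM neutrino flavour, extending the flavour eigenstate as

    \nu_\ell = \sum_{i=1}^{3} U_{\ell i}\,\nu_i + V_{\ell N}\,N, \qquad \ell = e, \mu, \tau,

    and the 95% confidence level upper limits are quoted on the squared mixing parameter |V_{\ell N}|^2 as a function of the HNL mass m_N.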

    Observation of the J/ψ → μ⁺μ⁻μ⁺μ⁻ decay in proton-proton collisions at √s = 13 TeV


    Search for new physics in high-mass diphoton events from proton-proton collisions at √s = 13 TeV

    Results are presented from a search for new physics in high-mass diphoton events from proton-proton collisions at √s = 13 TeV. The data set was collected in 2016–2018 with the CMS detector at the LHC and corresponds to an integrated luminosity of 138 fb−1. Events with a diphoton invariant mass greater than 500 GeV are considered. Two different techniques are used to predict the standard model backgrounds: parametric fits to the smoothly falling background and a first-principles calculation of the standard model diphoton spectrum at next-to-next-to-leading order in perturbative quantum chromodynamics. The first technique is sensitive to resonant excesses, while the second can identify broad differences in the invariant mass shape. The data are used to constrain the production of heavy Higgs bosons, Randall-Sundrum gravitons, the large extra dimensions model of Arkani-Hamed, Dimopoulos, and Dvali (ADD), and the continuum clockwork mechanism. No statistically significant excess is observed. The present results are the strongest limits to date on ADD extra dimensions and on RS gravitons with a coupling parameter greater than 0.1
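    As an illustration of the first technique (the specific functional form is an assumption here, not quoted from the abstract), diphoton resonance searches commonly fit the smoothly falling spectrum with an empirical power law whose exponent runs logarithmically, e.g.

    f(m_{\gamma\gamma}) \propto \left( \frac{m_{\gamma\gamma}}{\sqrt{s}} \right)^{a + b \log\left( m_{\gamma\gamma}/\sqrt{s} \right)},

    with shape parameters a and b determined by the fit; a resonance would then appear as a localized excess over this curve, whereas the NNLO prediction probes broad deviations in the overall shape.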

    Measurement of the polarizations of prompt and non-prompt J/ψ and ψ(2S) mesons produced in pp collisions at √s = 13 TeV

    The polarizations of prompt and non-prompt J/ψ and ψ(2S) mesons are measured in proton-proton collisions at √s = 13 TeV, using data samples collected by the CMS experiment in 2017 and 2018, corresponding to a total integrated luminosity of 103.3 fb−1. Based on the analysis of the dimuon decay angular distributions in the helicity frame, the polar anisotropy, λθ, is measured as a function of the transverse momentum, pT, of the charmonium states, in the 25–120 and 20–100 GeV ranges for the J/ψ and ψ(2S), respectively. The non-prompt polarizations agree with predictions based on the hypothesis that, for pT ≳ 25 GeV, the non-prompt J/ψ and ψ(2S) are predominantly produced in two-body B meson decays. The prompt results clearly exclude strong transverse polarizations, even for pT exceeding 30 times the J/ψ mass, where λθ tends to an asymptotic value around 0.3. Taken together with previous measurements by CMS and LHCb at √s = 7 TeV, the prompt polarizations show a significant variation with pT at low pT
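    For context (standard helicity-frame notation, assumed rather than quoted from the abstract): the dimuon polar angular distribution is commonly parametrized as

    \frac{dN}{d\cos\theta} \propto 1 + \lambda_\theta \cos^2\theta,

    where λθ = +1 corresponds to fully transverse polarization, λθ = −1 to fully longitudinal polarization, and λθ = 0 to an isotropic, unpolarized distribution.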

    Omecamtiv mecarbil in chronic heart failure with reduced ejection fraction, GALACTIC‐HF: baseline characteristics and comparison with contemporary clinical trials

    Aims: The safety and efficacy of the novel selective cardiac myosin activator, omecamtiv mecarbil, in patients with heart failure with reduced ejection fraction (HFrEF) is tested in the Global Approach to Lowering Adverse Cardiac outcomes Through Improving Contractility in Heart Failure (GALACTIC‐HF) trial. Here we describe the baseline characteristics of participants in GALACTIC‐HF and how these compare with other contemporary trials. Methods and Results: Adults with established HFrEF, New York Heart Association (NYHA) functional class ≥ II, EF ≤ 35%, elevated natriuretic peptides, and either current hospitalization for HF or a history of hospitalization/emergency department visit for HF within a year were randomized to either placebo or omecamtiv mecarbil (pharmacokinetic‐guided dosing: 25, 37.5 or 50 mg bid). 8256 patients [male (79%), non‐white (22%), mean age 65 years] were enrolled with a mean EF of 27%, ischemic etiology in 54%, NYHA class II in 53% and III/IV in 47%, and median NT‐proBNP of 1971 pg/mL. HF therapies at baseline were among the most effectively employed in contemporary HF trials. GALACTIC‐HF randomized patients representative of recent HF registries and trials, with substantial numbers of patients also having characteristics understudied in previous trials, including more from North America (n = 1386), enrolled as inpatients (n = 2084), systolic blood pressure < 100 mmHg (n = 1127), estimated glomerular filtration rate < 30 mL/min/1.73 m² (n = 528), and treated with sacubitril‐valsartan at baseline (n = 1594). Conclusions: GALACTIC‐HF enrolled a well‐treated, high‐risk population from both inpatient and outpatient settings, which will provide a definitive evaluation of the efficacy and safety of this novel therapy, as well as informing its potential future implementation