
    Long-term (10 years) prognostic value of a normal thallium-201 myocardial exercise scintigraphy in patients with coronary artery disease documented by angiography

    In order to assess the prognostic significance of a normal exercise thallium-201 myocardial scintigram in patients with documented coronary artery disease, we studied the incidence of cardiac death and non-fatal myocardial infarction in 69 symptomatic patients without prior Q wave myocardial infarction who demonstrated one or more significant coronary lesions (stenosis ≥70%) on an angiogram performed within 3 months of scintigraphy (Group 1). These patients were compared to a second group of 136 patients with an abnormal exercise scintigram, defined by the presence of reversible defect(s), and angiographically proven coronary artery disease (Group 2), and to a third group of 102 patients with normal exercise scintigraphy without significant coronary lesions (stenosis <30%) or with normal coronary angiography (Group 3). In contrast to the coronary lesions observed in Group 2, patients in Group 1 presented more frequently with single-vessel disease (83% vs 35%, P<0·0001) and with more distal lesions (55% vs 23%, P<0·0001). Over a mean follow-up period of 8·6 years, one fatal and eight non-fatal cases of myocardial infarction were observed in Group 1. The majority of patients in Group 1 were treated medically: only 24 (35%) underwent myocardial revascularization, usually by coronary angioplasty. There was no significant difference in the incidence of combined major cardiac events (cardiac death, non-fatal myocardial infarction) between patients with normal exercise scintigraphy with or without documented coronary artery disease (Groups 1 and 3), while the incidence was higher in Group 2. However, while mortality remained very low in Group 1, the incidence of non-fatal myocardial infarction was not different from that of Group 2, where most patients underwent revascularization procedures.
In conclusion, patients with coronary artery disease and a normal exercise thallium-201 myocardial scintigram usually have mild coronary lesions (single-vessel disease, distal location) and a good long-term prognosis, with a low incidence of cardiac death.

    Incorporating scale effect into a failure criterion for predicting stress-induced overbreak around excavations

    The evaluation of the depth of brittle failure around excavations is of major importance in order to optimize the design of underground excavations and ensure the safety of workers and equipment. The approaches currently proposed to evaluate it relate to a single scale of study (intact rock or rock mass scale). They are therefore scale-dependent and cannot be applied to all excavation diameters. In this paper, a generalized failure criterion including the scale effect for predicting stress-induced overbreak around excavations is developed. This failure criterion is based on the damage initiation relation σ₁ = Aσ₃ + Bσc. The scale effect is incorporated through a proposed relation for the B parameter that depends on the scale of study. The fit parameters of the proposed relation were defined using a database at both rock mass and intact rock scales drawn from a literature review. At the intact rock scale, the B parameter is defined as a power-law function of the excavation diameter. At the rock mass scale, the B parameter is set to 0.35, regardless of the excavation diameter. Based on the proposed B parameter relation, the depth and extent of brittle failure around excavations can be evaluated for any scale of study. Funding: Advanced Mining Technology Center (AMTC) through the BASAL Project FB-080.
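    The scale-dependent criterion described above can be sketched as follows. This is a minimal illustration, not the paper's fitted model: the power-law coefficients `b0`, `b1` and the slope `A` are placeholder values, since the abstract does not report the fitted parameters; only the constant B = 0.35 at rock mass scale is taken from the text.

    ```python
    # Sketch of the damage-initiation criterion sigma_1 = A*sigma_3 + B*sigma_c
    # with a scale-dependent B parameter. b0, b1 and A are ILLUSTRATIVE
    # placeholders, not the values fitted in the paper.

    def b_parameter(diameter_m, scale="intact", b0=0.2, b1=-0.3):
        """Scale-dependent B parameter.

        - rock mass scale: constant 0.35, independent of diameter (per the abstract)
        - intact rock scale: power-law function of excavation diameter (placeholder fit)
        """
        if scale == "rock_mass":
            return 0.35
        return b0 * diameter_m ** b1  # power-law form

    def damage_initiation_stress(sigma_3, sigma_c, diameter_m, scale="intact", A=1.5):
        """Major principal stress at damage initiation: sigma_1 = A*sigma_3 + B*sigma_c (MPa)."""
        B = b_parameter(diameter_m, scale)
        return A * sigma_3 + B * sigma_c

    # Example: 5 m excavation in rock mass, sigma_3 = 10 MPa, sigma_c = 100 MPa
    # -> sigma_1 = 1.5*10 + 0.35*100 = 50 MPa with the placeholder A.
    print(damage_initiation_stress(10.0, 100.0, 5.0, scale="rock_mass"))
    ```

    Comparing the predicted σ₁ against the in-situ tangential stress at the excavation boundary would then indicate whether overbreak is expected at that scale.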

    Influence of expertise on rockfall hazard assessment using empirical methods

    To date, many rockfall hazard assessment methods still consider qualitative observations within their analysis. On this basis, knowledge and expertise are supposed to be major parameters of rockfall assessment. To test this hypothesis, an experiment was carried out to evaluate the influence of knowledge and expertise on rockfall hazard assessment. Three populations with different levels of expertise were selected: (1) students in geosciences, (2) researchers in geosciences and (3) confirmed experts. These three populations evaluated the rockfall hazard level on the same site using two different methods: the Laboratoire des Ponts et Chaussées (LPC) method and a method partly based on the "slope mass rating" (SMR) method. To complement the analysis, each population was also asked to complete an "a priori" assessment of the rockfall hazard, without using any method. The LPC method is the most widely used method in France for official hazard mapping. It combines two main indicators: the predisposition to instability and the expected magnitude. Conversely, the SMR-based method was used as an ad hoc quantitative method to investigate the effect of quantification within a method. These procedures were applied on a test site divided into three different sectors. A statistical treatment of the results (descriptive statistical analysis, chi-square independence test and ANOVA) shows that the method used has a significant influence on the rockfall hazard assessment, whatever the sector. However, the level of expertise of the population has no significant influence on sectors 2 and 3. On sector 1, there is a significant influence of the level of expertise, explained by the importance of the temporal probability assessment in the rockfall hazard assessment process. The SMR-based method appears highly sensitive to the "site activity" indicator and exhibits an important dispersion in its results.
By contrast, the results obtained with the qualitative LPC method are more consistent, even in the case of sector 1.
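The chi-square independence test mentioned above can be illustrated with a short sketch. This is not the paper's analysis: the 2x2 contingency table below (populations vs. hazard-level ratings) is synthetic, and only the Pearson statistic is computed, without the p-value lookup.

```python
# Minimal sketch of a Pearson chi-square independence test on a
# contingency table (rows = populations, columns = hazard ratings).
# The counts below are SYNTHETIC, not data from the study.

def chi_square_statistic(table):
    """Pearson chi-square statistic: sum of (observed - expected)^2 / expected."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# Identical rating distributions across populations -> statistic is 0
print(chi_square_statistic([[10, 10], [10, 10]]))  # 0.0
# Completely divergent ratings -> large statistic
print(chi_square_statistic([[20, 0], [0, 20]]))    # 40.0
```

The statistic would then be compared against the chi-square distribution with (rows-1)(cols-1) degrees of freedom to decide significance.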

    Statistical correlation between meteorological and rockfall databases

    Rockfalls are a major and essentially unpredictable source of danger, particularly along transportation routes (roads and railways). The assessment of their probability of occurrence is thus a major challenge for risk management. From a qualitative perspective, it is known that rockfalls occur mainly during periods of rain, snowmelt, or freeze-thaw. Nevertheless, from a quantitative perspective, these generally assumed correlations between rockfalls and their possible meteorological triggering events are often difficult to identify because (i) rockfalls are too rare for the use of classical statistical analysis techniques and (ii) not all intensities of triggering factors have the same probability. In this study, we propose a new approach for investigating the correlation of rockfalls with rain, freezing periods, and strong temperature variations. This approach is tested on three French rockfall databases, the first of which exhibits a high frequency of rockfalls (approximately 950 events over 11 years), whereas the other two are more typical (approximately 140 events over 11 years). These databases come from (1) national highway RN1 on Réunion, (2) a railway in Burgundy, and (3) a railway in Auvergne. Whereas a basic correlation analysis is only able to highlight an already obvious correlation in the case of the "rich" database, the newly suggested method detects correlations even in the "poor" databases. Indeed, the use of this method confirms the positive correlation between rainfall and rockfalls in the Réunion database. It also highlights a correlation between cumulative rainfall and rockfalls in Burgundy, and it detects a correlation between the daily minimum temperature and rockfalls in the Auvergne database. This new approach is easy to use and also serves to determine the conditional probability of rockfall given a meteorological factor.
The approach will help to optimize risk management in the studied areas based on their meteorological conditions.
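The conditional probability mentioned above can be sketched with a simple day-counting estimate. This is a minimal illustration under stated assumptions, not the paper's method: the daily series and the 10 mm rainfall threshold are synthetic placeholders.

```python
# Sketch: estimate P(rockfall day | rainfall above threshold) versus
# P(rockfall day | rainfall below threshold) from two aligned daily series.
# The data and threshold are SYNTHETIC, for illustration only.

def conditional_rockfall_probability(rainfall_mm, rockfall_count, threshold_mm=10.0):
    """Return (P(rockfall | rain >= threshold), P(rockfall | rain < threshold)).

    A 'rockfall day' is any day with at least one recorded event.
    """
    above = [n for r, n in zip(rainfall_mm, rockfall_count) if r >= threshold_mm]
    below = [n for r, n in zip(rainfall_mm, rockfall_count) if r < threshold_mm]
    p_above = sum(n > 0 for n in above) / len(above) if above else 0.0
    p_below = sum(n > 0 for n in below) / len(below) if below else 0.0
    return p_above, p_below

# Six synthetic days: rockfalls occur only on two of the three wet days.
rain = [0.0, 5.0, 12.0, 20.0, 0.0, 15.0]
falls = [0, 0, 1, 1, 0, 0]
print(conditional_rockfall_probability(rain, falls))  # (0.666..., 0.0)
```

A marked gap between the two probabilities is the kind of signal the proposed method formalizes, while also accounting for how rarely each intensity class of the triggering factor occurs.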

    A Clamped Be Window for the Dump of the HiRadMat Experiment at CERN

    At CERN, the High Radiation to Materials facility (HiRadMat) is designed to test accelerator components under the impact of high-intensity pulsed beams and will start operation in 2012. In this frame, an LHC TED-type dump was installed at the end of the line, working in nitrogen over-pressure, and a 254 μm-thick beryllium window was placed as a barrier between the inside of the dump and the external atmosphere. Because of the special loading conditions, a clamped window design was specially developed, optimized and implemented, the more standard welded window not being suitable for such loads. Considering the clamping force and the applied differential pressures, the stresses on the window components were carefully evaluated using empirical as well as numerical models to guarantee the structural integrity of the beryllium foil. This paper reports on the choices and optimizations that led to the final design, also presenting comparative results from different solutions and detailed results for the adopted one.

    Internal H0/H- Dump for the Proton Synchrotron Booster Injection at CERN

    In the frame of the LHC Injectors Upgrade Project at CERN (LIU), the new 160 MeV H- Linac4 will inject into the four existing PS Booster rings after the conversion of H- into H+ in a stripping foil. Given the limited stripping efficiency and possible foil failures, a certain percentage of the beam is foreseen to remain partially (H0) or completely (H-) unstripped. An internal dump installed in the chicane magnet is therefore required to stop these unstripped beams. This paper presents the conceptual design of the internal dump, reviewing loading assumptions, design constraints, limitations and integration studies. Power evacuation through the thermal contact between the core and the external active cooling is addressed and, finally, results from the numerical thermo-mechanical analyses are reported.

    Use of Silicon Carbide as Beam Intercepting Device Material: Tests, Issues and Numerical Simulations

    Silicon carbide (SiC) stands as one of the most promising ceramic materials with respect to its thermal shock resistance and mechanical strength. It has hence been considered as a candidate material for the development of higher-performance beam intercepting devices at CERN. Its brazing to a metal counterpart has been tested and characterized by means of microstructural and ultrasound techniques. Despite the positive results, its use has to be evaluated with care, due to strong evidence in the literature of large and permanent volumetric expansion, called swelling, under neutron and ion irradiation. This may cause premature and sudden failure, and can be mitigated to some extent by operating at high temperature. For this reason, limited information is available for irradiation below 100 °C, which is the typical temperature of interest for beam intercepting devices such as dumps or collimators. This paper describes the brazing campaign carried out at CERN, its results, and the theoretical and numerical approach used to characterize the extent of the swelling phenomenon with radiation, as well as the p+ irradiation test program to be conducted in the near future.

    Development of a proton-to-neutron converter for radioisotope production at ISAC-TRIUMF

    At ISAC-TRIUMF, a 500 MeV proton beam is impinged upon "thick" targets to induce nuclear reactions whose products are delivered as a Radioactive Ion Beam (RIB) to experiments. Uranium carbide is among the most commonly used target materials; it produces a vast radionuclide inventory arising from both spallation and fission events. This can also represent a major limitation for the successful delivery of certain RIBs to experiments since, for a given mass, many isobaric isotopes must be filtered by the dipole mass separator. These contaminants can exceed the yield of the isotope of interest by orders of magnitude, often causing a significant reduction in the sensitivity of experiments or even making them impossible. The design of a 50 kW proton-to-neutron (p2n) converter-target is ongoing to enhance the production of neutron-rich nuclei while significantly reducing the rate of neutron-deficient contaminants. The converter is made of a bulk tungsten block which converts proton beams into neutrons through spallation. The neutrons, in turn, induce pure fission in an upstream UCx target. The present target design and the service infrastructure needed for its operation are discussed in this paper.

    Production and release of ISOL beams from molten fluoride salt targets

    In the framework of the Beta Beams project, a molten fluoride target has been proposed for the production of the required 10^13 Ne-18/s. The production and extraction of such rates are predicted to be possible with a circulating molten salt and 160 MeV proton beams at close to 1 MW power. As a most important step to validate the concept, a prototype has been designed and investigated at CERN-ISOLDE using a static target unit. The target material consisted of a binary fluoride system, NaF:LiF (39:61 mol.%), with a melting point of 649 °C. The production of Ne beams was monitored as a function of target temperature and proton beam intensity. The prototype development and the results of the first online tests with a 1.4 GeV proton beam are presented in this paper.