5 research outputs found

    Millimeter-wave beam scattering and induced broadening by plasma turbulence in the TCV tokamak

    The scattering of millimeter-wave beams from electron density fluctuations and the associated beam broadening are experimentally demonstrated. Using a dedicated setup, instantaneous deflection and (de-)focusing of the beam due to density blobs on the beam path are shown to agree with full-wave simulations. The detected time-averaged wave power transmitted through the turbulent plasma is reproduced by the radiative-transfer model implemented in the WKBeam code, which predicts a ∼50% turbulence-induced broadening of the beam cross-section. The role of core turbulence for the considered geometry is highlighted.
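    The key idea, that rapid random deflections of an otherwise narrow beam widen its time-averaged profile, can be illustrated with a toy Monte-Carlo sketch. The beam radius, deflection statistics, and resulting broadening below are illustrative assumptions, not WKBeam or TCV results.

```python
# Toy Monte-Carlo illustration of turbulence-induced, time-averaged beam
# broadening: random transverse displacements of a Gaussian beam widen its
# averaged profile. All numbers are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-0.1, 0.1, 2001)   # transverse coordinate [m]
w0 = 0.02                          # unperturbed 1/e^2 beam radius [m] (assumed)
sigma_kick = 0.007                 # assumed rms beam-centre displacement [m]

# Average many instantaneous, randomly displaced Gaussian intensity profiles.
shifts = rng.normal(scale=sigma_kick, size=2000)
I_avg = np.mean([np.exp(-2 * (x - s) ** 2 / w0 ** 2) for s in shifts], axis=0)

# Effective radius from the second moment of the time-averaged profile.
w_eff = 2.0 * np.sqrt(np.sum(x ** 2 * I_avg) / np.sum(I_avg))
print(f"time-averaged radius {w_eff * 100:.1f} cm vs. unperturbed {w0 * 100:.1f} cm")
```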

    Disruption prediction with artificial intelligence techniques in tokamak plasmas

    In nuclear fusion reactors, plasmas are heated to very high temperatures of more than 100 million kelvin and, in so-called tokamaks, they are confined by magnetic fields in the shape of a torus. Light nuclei, such as deuterium and tritium, undergo a fusion reaction that releases energy, making fusion a promising option for a sustainable and clean energy source. Tokamak plasmas, however, are prone to disruptions as a result of a sudden collapse of the system terminating the fusion reactions. As disruptions lead to an abrupt loss of confinement, they can cause irreversible damage to present-day fusion devices and are expected to have a more devastating effect in future devices. Disruptions expected in the next-generation tokamak, ITER, for example, could cause electromagnetic forces larger than the weight of an Airbus A380. Furthermore, the thermal loads in such an event could exceed the melting threshold of the most resistant state-of-the-art materials by more than an order of magnitude. To prevent disruptions or at least mitigate their detrimental effects, empirical models obtained with artificial intelligence methods, of which an overview is given here, are commonly employed to predict their occurrence and, ideally, give enough time to introduce counteracting measures.
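    A common pattern behind such empirical predictors is a supervised classifier trained on diagnostic time slices that raises an alarm early enough for mitigation. The sketch below is a minimal, hedged illustration of that pattern with synthetic data; the feature set, classifier, and alarm threshold are assumptions, not the specific models surveyed in the paper.

```python
# Minimal sketch of a data-driven disruption predictor on synthetic data.
# Features, labels, classifier, and threshold are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic per-time-slice features, standing in for diagnostics such as
# locked-mode amplitude, line-averaged density, internal inductance,
# and radiated-power fraction.
n = 2000
X = rng.normal(size=(n, 4))
# Synthetic labels: 1 = slice shortly before a disruption, 0 = safe operation.
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n) > 1.0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

def disruption_alarm(features, threshold=0.7):
    """Raise an alarm when the predicted disruption probability exceeds the
    threshold, ideally early enough to trigger counteracting measures."""
    p = clf.predict_proba(np.atleast_2d(features))[0, 1]
    return p > threshold, p

alarm, prob = disruption_alarm(X[0])
print(f"alarm={alarm}, probability={prob:.2f}")
```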

    Runaway electron beam control

    Post-disruption runaway electron (RE) beams in tokamaks with large plasma current can cause deep melting of the vessel and are one of the major concerns for ITER operations. Consequently, considerable effort is being made by the scientific community to test RE mitigation strategies. We present an overview of the results obtained at FTU and TCV in controlling the current and position of RE beams to improve the safety and repeatability of mitigation studies such as massive gas injection (MGI) and shattered pellet injection (SPI). We show that the proposed RE beam controller (REB-C) implemented at FTU and TCV is effective, and that the beam current can be reduced via the central solenoid, lowering the energy of the REs and providing an alternative or parallel mitigation strategy to MGI/SPI. Experimental results show that, while deuterium pellets injected into a fully formed RE beam are ablated but do not improve the RE energy dissipation rate, heavy metals injected by a laser blow-off system into low-density flat-top discharges with a high level of RE seeding appear to induce disruptions that expel REs. Instabilities during the RE beam plateau phase have been shown to enhance losses of REs expelled from the beam core. With the aim of triggering instabilities to increase RE losses, an oscillating loop voltage was therefore tested during the RE beam plateau phase at TCV, revealing, for the first time, what appears to be a full conversion from runaway to ohmic current. We finally report progress in the design of control strategies at JET in view of the upcoming SPI mitigation experiments.
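    The current ramp-down via the central solenoid can be pictured as a feedback loop driving the RE beam current toward a decreasing reference by adjusting the applied loop voltage. The sketch below is a toy PI loop on a single-circuit model, not the REB-C controller described in the paper; the inductance, resistance, gains, and reference ramp are all assumptions.

```python
# Illustrative PI feedback loop ramping down a runaway-electron (RE) beam
# current via the central-solenoid loop voltage. Toy model only; plant
# parameters, gains, and the reference waveform are assumptions.
L_beam, R_beam = 1e-3, 1e-4   # assumed beam inductance [H] and resistance [Ohm]
dt, t_end = 1e-3, 2.0         # integration step and simulated duration [s]
Kp, Ki = 0.02, 0.1            # assumed PI gains [V/A] and [V/(A*s)]

I = 0.2e6                                              # initial RE current [A]
I_ref = lambda t: 0.2e6 * max(1.0 - t / t_end, 0.0)    # reference ramp-down [A]

integral = 0.0
for step in range(int(t_end / dt)):
    t = step * dt
    error = I_ref(t) - I
    integral += error * dt
    V_loop = Kp * error + Ki * integral          # commanded loop voltage [V]
    I += dt * (V_loop - R_beam * I) / L_beam     # circuit model: L dI/dt = V - R I

print(f"final RE current: {I / 1e3:.1f} kA (target {I_ref(t_end) / 1e3:.1f} kA)")
```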