Experimental vertical stability studies for ITER performance and design
Operating experimental devices have provided key inputs to the design process for ITER axisymmetric control. In particular, experiments have quantified controllability and robustness requirements in the presence of realistic noise and disturbance environments, which are difficult or impossible to characterize with modelling and simulation alone. This kind of information is particularly critical for ITER vertical control, which poses the highest demands on poloidal field system performance, since the consequences of loss of vertical control can be severe. This work describes results of multi-machine studies performed under a joint ITPA experiment (MDC-13) on fundamental vertical control performance and controllability limits. We present experimental results from Alcator C-Mod, DIII-D, NSTX, TCV and JET, along with analysis of these data to provide vertical control performance guidance to ITER. Useful metrics to quantify this control performance include the stability margin and maximum controllable vertical displacement. Theoretical analysis of the maximum controllable vertical displacement suggests effective approaches to improving performance in terms of this metric, with implications for ITER design modifications. Typical levels of noise in the vertical position measurement and several common disturbances which can challenge the vertical control loop are assessed and analysed. (Supported by the United States Department of Energy under DE-FC02-04ER54698, DE-AC52-07NA27344, and DE-FG02-04ER54235.)
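To make the maximum controllable vertical displacement concrete, consider a minimal rigid-plasma, single-mode sketch (an illustration under assumed symbols, not the paper's full analysis). The open-loop displacement grows at rate \gamma, and the vertical control coil, limited to current I_{c,\max} with coupling coefficient b (both hypothetical parameters here), can pull the plasma back only while its restoring action exceeds that growth:

    \dot{z} = \gamma z - b I_c, \qquad |I_c| \le I_{c,\max}
    \text{recovery requires } b I_{c,\max} > \gamma z \;\Rightarrow\; \Delta Z_{\max} \approx \frac{b I_{c,\max}}{\gamma}

On this reading, performance in the \Delta Z_{\max} metric improves by lowering the growth rate (e.g. through passive conducting structure closer to the plasma) or by raising the available coil authority, which is the flavour of design modification the abstract alludes to.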
A comprehensive computing initiative for MFE. Revision 1
The authors propose that a national initiative be launched to develop a comprehensive simulation facility for MFE. The facility would consist of physics codes developed by the national MFE community, tightly but flexibly coupled through a programmable shell, enabling effectively simultaneous solution of the models in the various codes. The word "facility" is chosen to convey the notion that this is where one would go to conduct numerical experiments, using a full set of modules to describe an entire device, a coupled subset to describe particular aspects of a device, or a combination of the facility's modules plus the user's own physics.
White paper: A vision for a computing initiative for MFE. Revised version
The scientific base of magnetic fusion research comprises three capabilities: experimental research, theoretical understanding and computational modeling, with modeling providing the necessary link between the other two. The US now faces a budget climate that will preclude the construction of major new MFE facilities and limit MFE experimental operations. The situation is rather analogous to the one experienced by the DOE Defense Programs (DP), in which continued viability of the nuclear stockpile must be ensured despite the prohibition of underground experimental tests. DP is meeting this challenge, in part, by launching the Accelerated Strategic Computing Initiative (ASCI) to bring advanced algorithms and new hardware to bear on the problems of science-based stockpile stewardship (SBSS). ASCI has as its goal the establishment of a "virtual testing" capability, and it is expected to drive scientific software and hardware development through the next decade. The authors argue that a similar effort is warranted for the MFE program, that is, an initiative aimed at developing a comprehensive simulation capability for MFE, with the goal of enabling "virtual experiments." It would play a role for MFE analogous to that played by present-day and future (ASCI) codes for nuclear weapons design and by LASNEX for ICF, and provide a powerful augmentation to constrained experimental programs. Developing a comprehensive simulation capability could provide an organizing theme for a restructured, science-based MFE program. The code would become a central vehicle for integrating the accumulating science base. In the scenario the authors propose, the relationship would ultimately be reversed: computer simulation would become a primary vehicle for exploration, with experiments providing the necessary confirmatory evidence (or guidance for code improvements).
Generic programming in POOMA and PETE
POOMA is a C++ framework for developing portable scientific applications for serial and parallel computers using high-level physical abstractions. PETE is the expression-template library used by POOMA. This paper discusses generic programming techniques that are used to achieve flexibility and high performance in POOMA and PETE. POOMA uses an engine class that factors the data representation out of its array classes. PETE's expression templates are used to build up and operate efficiently on expressions. PETE itself uses generic techniques to adapt to a variety of client-class interfaces, and to provide a powerful and flexible compile-time expression-tree traversal mechanism.
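To illustrate the expression-template idea the abstract refers to, here is a minimal self-contained sketch in the spirit of PETE (not the actual POOMA/PETE API; all class names are hypothetical). The sum a + b + c is captured as a compile-time expression tree and evaluated element by element in one loop, with no intermediate Array temporaries; the std::vector inside Array plays the role of a (much simplified) engine.

    // Minimal expression-template sketch, not the real PETE interface.
    #include <cstddef>
    #include <iostream>
    #include <vector>

    template <typename L, typename R>
    struct AddExpr {                       // node in the compile-time expression tree
        const L& lhs;
        const R& rhs;
        double operator[](std::size_t i) const { return lhs[i] + rhs[i]; }
        std::size_t size() const { return lhs.size(); }
    };

    class Array {                          // the vector acts as a simplified "engine"
    public:
        explicit Array(std::size_t n, double v = 0.0) : data_(n, v) {}
        double  operator[](std::size_t i) const { return data_[i]; }
        double& operator[](std::size_t i)       { return data_[i]; }
        std::size_t size() const { return data_.size(); }

        template <typename Expr>
        Array& operator=(const Expr& e) {  // one loop, no intermediate Array temporaries
            for (std::size_t i = 0; i < data_.size(); ++i) data_[i] = e[i];
            return *this;
        }
    private:
        std::vector<double> data_;
    };

    template <typename L, typename R>
    AddExpr<L, R> operator+(const L& l, const R& r) { return {l, r}; }

    int main() {
        Array a(4, 1.0), b(4, 2.0), c(4, 3.0), d(4);
        d = a + b + c;                     // builds AddExpr<AddExpr<Array, Array>, Array>
        std::cout << d[0] << '\n';         // prints 6
    }

The design point this sketch shares with PETE is that the expression tree exists only as a type, so the compiler can inline the whole traversal into a single evaluation loop.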
Concept-Controlled Polymorphism
Concepts -- sets of abstractions related by common requirements -- have a central role in generic programming. This paper proposes a general framework for using concepts to control polymorphism in different ways. First, concepts can be used to constrain parametric polymorphism, as exemplified by type classes in Haskell. Second, concepts can be used to provide fine-grained control of function and operator overloading. Finally, generic functions can be overloaded (specialized) based on concepts, rather than simply on types. We describe a C++ implementation of a new mechanism, which we call enable_if, and its role in concept-controlled polymorphism.
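The enable_if mechanism described here was later standardized as std::enable_if; the following minimal sketch (using the standard-library form rather than the paper's original code) shows the overloading use the abstract lists: two generic overloads of a hypothetical function twice are selected by "concepts", here approximated by type traits, so they never conflict.

    #include <iostream>
    #include <type_traits>

    // Participates in overload resolution only when T is integral.
    template <typename T>
    typename std::enable_if<std::is_integral<T>::value, T>::type
    twice(T x) { return x + x; }

    // Participates only when T is floating-point; the enabling
    // conditions are disjoint, so the overloads never clash.
    template <typename T>
    typename std::enable_if<std::is_floating_point<T>::value, T>::type
    twice(T x) { return 2.0 * x; }

    int main() {
        std::cout << twice(21) << ' ' << twice(1.5) << '\n';  // prints "42 3"
    }

When an enabling condition is false, substitution fails and the overload silently drops out of the candidate set (SFINAE), which is what gives the fine-grained control over overloading that the abstract describes.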