Ideal GLM-MHD - a new mathematical model for simulating astrophysical plasmas

Abstract

Magnetic fields are ubiquitous in space. As there is strong evidence that magnetic fields play an important role in a variety of astrophysical processes, they should not be neglected lightly. However, analytic models in astrophysics often either do not take magnetic fields into account or do so only after limiting simplifications that reduce their overall predictive power. Therefore, computational astrophysics has evolved as a modern field of research that uses sophisticated computer simulations to gain insight into physical processes. The ideal MHD equations, which are the most commonly used basis for simulating magnetized plasmas, have two critical drawbacks: firstly, they do not limit the growth of numerically caused magnetic monopoles, and, secondly, most numerical schemes built from the ideal MHD equations are not consistent with thermodynamics. In my work, at the interplay of mathematics and physics, I developed and presented the first thermodynamically consistent model with effective inbuilt divergence cleaning. My new Galilean-invariant model is suitable for simulating magnetized plasmas under extreme conditions such as those typically encountered in astrophysical scenarios. The new model is called the "ideal GLM-MHD" equations and supports nine wave solutions. The accuracy and robustness of my numerical implementation are demonstrated with a number of tests, including comparisons to other schemes available within the multi-physics, multi-scale adaptive mesh refinement (AMR) simulation code FLASH. A possible astrophysical application scenario is discussed in detail.
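For orientation, the sketch below shows the hyperbolic (Dedner-type) GLM cleaning mechanism that systems of this kind build on; the notation (auxiliary scalar field psi, constant cleaning speed c_h) is generic and not necessarily identical to the formulation developed in this work, and the full ideal GLM-MHD system carries additional nonconservative terms to obtain thermodynamic consistency and Galilean invariance.

\[
\begin{aligned}
  % induction equation augmented by the gradient of the cleaning field psi
  \frac{\partial \mathbf{B}}{\partial t} - \nabla \times (\mathbf{u} \times \mathbf{B}) + \nabla \psi &= 0, \\
  % evolution of the cleaning field, coupled to the divergence error
  \frac{\partial \psi}{\partial t} + c_h^{2}\, \nabla \cdot \mathbf{B} &= 0.
\end{aligned}
\]

Taking the divergence of the first equation and the time derivative of the second gives \(\partial_{tt}\psi = c_h^{2}\,\nabla^{2}\psi\) (and the same wave equation for \(\nabla \cdot \mathbf{B}\)), so numerically generated divergence errors are propagated away at the finite speed \(c_h\) instead of accumulating locally. The two associated GLM waves, together with the seven characteristic waves of ideal MHD, account for the nine wave solutions mentioned above.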
