25 research outputs found

    Porting a Hall MHD Code to a Graphic Processing Unit

    Get PDF
    We present our experience porting a Hall MHD code to a Graphics Processing Unit (GPU). The code is a second-order accurate MUSCL-Hancock scheme which makes use of an HLL Riemann solver to compute numerical fluxes and second-order finite differences to compute the Hall contribution to the electric field. The divergence of the magnetic field is controlled with Dedner's hyperbolic divergence cleaning method. Preliminary benchmark tests indicate a speedup (relative to a single Nehalem core) of 58x for a double precision calculation. We discuss scaling issues which arise when distributing work across multiple GPUs in a CPU-GPU cluster.
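
    As a concrete illustration of the Hall-term computation described above, here is a minimal CUDA C sketch. It is not the authors' code: the uniform 2-D cell-centered grid, SI units, array layout, variable names, and the idx() helper are all our own assumptions. It forms J = curl(B)/mu0 with second-order central differences and then the Hall field E_Hall = (J x B)/(n e).

        #include <cuda_runtime.h>

        __device__ __host__ inline int idx(int i, int j, int nx) { return j * nx + i; }

        // Hall electric field E_Hall = (J x B) / (n e) on a uniform 2-D grid,
        // with J = curl(B)/mu0 from second-order central differences.
        __global__ void hall_electric_field(const double* bx, const double* by, const double* bz,
                                            const double* ne,           // electron number density
                                            double* ex, double* ey, double* ez,
                                            int nx, int ny, double dx, double dy)
        {
            const double mu0 = 1.2566370614e-6;    // vacuum permeability (SI)
            const double qe  = 1.602176634e-19;    // elementary charge

            int i = blockIdx.x * blockDim.x + threadIdx.x;
            int j = blockIdx.y * blockDim.y + threadIdx.y;
            if (i < 1 || j < 1 || i >= nx - 1 || j >= ny - 1) return;   // skip boundary cells

            // J = curl(B)/mu0 with central differences (d/dz = 0 in 2-D)
            double jx = ( (bz[idx(i, j+1, nx)] - bz[idx(i, j-1, nx)]) / (2.0*dy) ) / mu0;
            double jy = (-(bz[idx(i+1, j, nx)] - bz[idx(i-1, j, nx)]) / (2.0*dx) ) / mu0;
            double jz = ( (by[idx(i+1, j, nx)] - by[idx(i-1, j, nx)]) / (2.0*dx)
                        - (bx[idx(i, j+1, nx)] - bx[idx(i, j-1, nx)]) / (2.0*dy) ) / mu0;

            // Hall term of the generalized Ohm's law
            int c = idx(i, j, nx);
            double s = 1.0 / (ne[c] * qe);
            ex[c] = s * (jy * bz[c] - jz * by[c]);
            ey[c] = s * (jz * bx[c] - jx * bz[c]);
            ez[c] = s * (jx * by[c] - jy * bx[c]);
        }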

    Does the Hall Effect Solve the Flux Pileup Saturation Problem?

    Get PDF
    It is well known that magnetic flux pileup can significantly speed up the rate of magnetic reconnection in high Lundquist number resistive MHD, allowing reconnection to proceed at a rate which is insensitive to the plasma resistivity over a wide range of Lundquist number. Hence, pileup is a possible solution to the Sweet-Parker time scale problem. Unfortunately, pileup tends to saturate above a critical value of the Lundquist number, S_c, where the value of S_c depends on initial and boundary conditions, with Sweet-Parker scaling returning above S_c. It has been argued (see Dorelli and Birn [2003] and Dorelli [2003]) that the Hall effect can allow flux pileup to saturate (when the scale of the current sheet approaches the ion inertial scale, d_i) before the reconnection rate begins to stall. However, the resulting saturated reconnection rate, while insensitive to the plasma resistivity, was found to depend strongly on d_i. In this presentation, we revisit the problem of magnetic island coalescence (which is a well known example of flux pileup reconnection), addressing the dependence of the maximum coalescence rate on the ratio of d_i to the island wavelength in the "large island" limit in which the following inequality is always satisfied: l_eta << d_i << lambda, where l_eta is the resistive diffusion length and lambda is the island wavelength.
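
    For context, the Sweet-Parker time scale problem referred to above comes from the standard resistive scalings (textbook relations, not results of this abstract), in which the reconnection rate falls off with the Lundquist number:

        % Standard Sweet-Parker scalings, quoted for context.
        % delta and L are the current sheet thickness and length, v_in the inflow
        % speed, v_A the Alfv\'en speed, tau_A = L/v_A, and eta the magnetic diffusivity.
        \[
          \frac{\delta}{L} \sim S^{-1/2}, \qquad
          \frac{v_{\mathrm{in}}}{v_{A}} \sim S^{-1/2}, \qquad
          \tau_{\mathrm{SP}} \sim S^{1/2}\,\tau_{A}, \qquad
          S \equiv \frac{L\,v_{A}}{\eta}.
        \]

    For astrophysically large S, this S^{-1/2} scaling gives reconnection times far too long to explain observed fast reconnection, which is why flux pileup and the Hall effect are invoked above.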

    The role of the Hall effect in the global structure and dynamics of planetary magnetospheres: Ganymede as a case study

    Full text link
    We present high resolution Hall MHD simulations of Ganymede's magnetosphere demonstrating that Hall electric fields in ion-scale magnetic reconnection layers have significant global effects not captured in resistive MHD simulations. Consistent with local kinetic simulations of magnetic reconnection, our global simulations show the development of intense field-aligned currents along the magnetic separatrices. These currents extend all the way down to the moon's surface, where they may contribute to Ganymede's aurora. Within the magnetopause and magnetotail current sheets, Hall currents in the reconnection plane accelerate ions to the local Alfvén speed in the out-of-plane direction, producing a global system of ion drift belts that circulates Jovian magnetospheric plasma throughout Ganymede's magnetosphere. We discuss some observable consequences of these Hall-induced currents and ion drifts: the appearance of a sub-Jovian "double magnetopause" structure, an Alfvénic ion jet extending across the upstream magnetopause, and an asymmetric pattern of magnetopause Kelvin-Helmholtz waves.
    Comment: 14 pages, 12 figures; presented at the Geospace Environment Modeling (GEM) workshop (June 2014) and the Fall American Geophysical Union (AGU) meeting (December 2014); submitted to the Journal of Geophysical Research, December 2014
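
    For readers unfamiliar with why ion-scale layers generate Hall electric fields, the relevant term appears in the generalized Ohm's law; a standard textbook form (not reproduced from the paper) is:

        % Generalized Ohm's law, standard form, quoted for context.
        \[
          \mathbf{E} = -\,\mathbf{v}\times\mathbf{B}
                     + \frac{\mathbf{J}\times\mathbf{B}}{n e}
                     - \frac{\nabla\cdot\mathsf{P}_{e}}{n e}
                     + \eta\,\mathbf{J} + \cdots
        \]

    The Hall term J x B/(n e) becomes comparable to the convective term once gradient scales approach the ion inertial length d_i, which is the regime the ion-scale reconnection layers above reach at Ganymede's magnetopause and magnetotail.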

    A Simple GPU-Accelerated Two-Dimensional MUSCL-Hancock Solver for Ideal Magnetohydrodynamics

    Get PDF
    We describe our experience using NVIDIA's CUDA (Compute Unified Device Architecture) C programming environment to implement a two-dimensional second-order MUSCL-Hancock ideal magnetohydrodynamics (MHD) solver on a GTX 480 Graphics Processing Unit (GPU). Taking a simple approach in which the MHD variables are stored exclusively in the global memory of the GTX 480 and accessed in a cache-friendly manner (without further optimizing memory access by, for example, staging data in the GPU's faster shared memory), we achieved a maximum speed-up of approximately 126 for a 1024 x 1024 grid relative to the sequential C code running on a single Intel Nehalem (2.8 GHz) core. This speedup is consistent with simple estimates based on the known floating point performance, memory throughput and parallel processing capacity of the GTX 480.
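
    The "simple estimates" mentioned above can be illustrated with a back-of-the-envelope calculation like the sketch below. It is our illustration, not the paper's estimate: the GPU figures are nominal published peak specifications, and the per-core numbers are rough assumptions.

        #include <stdio.h>

        /* Shape of a simple GPU-vs-single-core speedup estimate, bounded by
         * arithmetic throughput and by memory bandwidth. */
        int main(void)
        {
            double gpu_gflops = 1345.0, gpu_gbs = 177.4;  /* GTX 480 nominal peak SP GFLOP/s, GB/s */
            double cpu_gflops =   22.4, cpu_gbs =  10.0;  /* one 2.8 GHz Nehalem core (assumed)    */

            printf("compute-bound speedup bound:   ~%.0fx\n", gpu_gflops / cpu_gflops);
            printf("bandwidth-bound speedup bound: ~%.0fx\n", gpu_gbs / cpu_gbs);

            /* A sequential CPU code typically reaches only a fraction of these peak
             * figures, so measured speedups can exceed the naive peak-to-peak ratios. */
            return 0;
        }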

    Extended magnetohydrodynamics with embedded particle‐in‐cell simulation of Ganymede’s magnetosphere

    Full text link
    We have recently developed a new modeling capability to embed the implicit particle‐in‐cell (PIC) model iPIC3D into the Block‐Adaptive‐Tree‐Solarwind‐Roe‐Upwind‐Scheme magnetohydrodynamic (MHD) model. The MHD with embedded PIC domains (MHD‐EPIC) algorithm is a two‐way coupled kinetic‐fluid model. As one of the very first applications of the MHD‐EPIC algorithm, we simulate the interaction between Jupiter’s magnetospheric plasma and Ganymede’s magnetosphere. We compare the MHD‐EPIC simulations with pure Hall MHD simulations and compare both model results with Galileo observations to assess the importance of kinetic effects in controlling the configuration and dynamics of Ganymede’s magnetosphere. We find that the Hall MHD and MHD‐EPIC solutions are qualitatively similar, but there are significant quantitative differences. In particular, the density and pressure inside the magnetosphere show different distributions. For our baseline grid resolution the PIC solution is more dynamic than the Hall MHD simulation, and it compares significantly better with the Galileo magnetic measurements than the Hall MHD solution does. The power spectra of the observed and simulated magnetic field fluctuations agree extremely well for the MHD‐EPIC model. The MHD‐EPIC simulation also produced a few flux transfer events (FTEs) that have magnetic signatures very similar to an observed event. The simulation shows that the FTEs often exhibit complex 3‐D structures, with their orientations changing substantially between the equatorial plane and the Galileo trajectory, which explains the magnetic signatures observed during the magnetopause crossings. The computational cost of the MHD‐EPIC simulation was only about 4 times more than that of the Hall MHD simulation.
    Key Points: (1) First particle‐in‐cell simulation of Ganymede’s magnetosphere. (2) The MHD‐EPIC algorithm makes global kinetic simulations affordable. (3) The MHD‐EPIC simulation suggests that Galileo observed a flux transfer event during the G8 flyby.
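
    The two-way coupling described above can be pictured with a schematic time loop like the sketch below. This is a conceptual illustration only, not the actual BATS-R-US/iPIC3D interface; every type and function name in it is hypothetical.

        #include <stdio.h>

        typedef struct { double t; } FluidState;    /* global MHD state (placeholder)     */
        typedef struct { double t; } KineticState;  /* PIC state in the embedded region   */

        static void advance_mhd(FluidState* f, double dt)                     { f->t += dt; }
        static void advance_pic(KineticState* k, double dt)                   { k->t += dt; }
        static void mhd_to_pic_boundary(const FluidState* f, KineticState* k) { (void)f; (void)k; }
        static void pic_to_mhd_overwrite(const KineticState* k, FluidState* f){ (void)k; (void)f; }

        int main(void)
        {
            FluidState   mhd = {0.0};
            KineticState pic = {0.0};
            double dt = 0.1;

            for (int step = 0; step < 10; ++step) {
                advance_mhd(&mhd, dt);             /* fluid update everywhere             */
                mhd_to_pic_boundary(&mhd, &pic);   /* MHD supplies boundary data to PIC   */
                advance_pic(&pic, dt);             /* kinetic update inside embedded box  */
                pic_to_mhd_overwrite(&pic, &mhd);  /* PIC moments overwrite the MHD cells */
            }
            printf("done: t = %.1f\n", mhd.t);
            return 0;
        }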

    In Flight Calibration of the Magnetospheric Multiscale Mission Fast Plasma Investigation

    Get PDF
    The Fast Plasma Investigation (FPI) on the Magnetospheric Multiscale mission (MMS) combines data from eight spectrometers, each with four deflection states, into a single map of the sky. Any systematic discontinuity, artifact, noise source, etc. present in this map may be incorrectly interpreted as legitimate data, leading to incorrect conclusions. For this reason it is desirable to have all spectrometers return the same output for a given input, and for this output to be low in noise and other errors. While many missions use statistical analyses of data to calibrate instruments in flight, this process is difficult for FPI for two reasons: (1) only a small fraction of the high resolution data is downloaded to the ground due to bandwidth limitations, and (2) the data that is downloaded is, by definition, scientifically interesting and therefore not ideal for calibration. FPI uses a suite of new tools to calibrate in flight. A new method for detection system ground calibration has been developed involving sweeping the detection threshold to fully define the pulse height distribution. This method has now been extended for use in flight as a means to calibrate the MCP voltage and threshold (together forming the operating point) of the Dual Electron Spectrometers (DES) and Dual Ion Spectrometers (DIS). A method of comparing higher energy data (which has a low fractional voltage error) to lower energy data (which has a higher fractional voltage error) will be used to calibrate the high voltage outputs. Finally, a comparison of pitch angle distributions will be used to find remaining discrepancies among sensors.
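
    The threshold-sweep idea described above can be illustrated as follows (our sketch, not FPI flight software): counts accumulated with the discriminator set at threshold q approximate the integral of the pulse height distribution above q, so differencing a sweep recovers the distribution. All numbers below are made up for the example.

        #include <stdio.h>

        #define NLEVELS 8

        int main(void)
        {
            /* counts accumulated at successively higher discriminator thresholds */
            double threshold[NLEVELS] = { 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0 };  /* arbitrary units */
            double counts[NLEVELS]    = { 980, 930, 820, 610, 340, 150,  40,   5 };

            for (int i = 0; i + 1 < NLEVELS; ++i) {
                double dq  = threshold[i + 1] - threshold[i];
                double phd = (counts[i] - counts[i + 1]) / dq;   /* counts per unit threshold */
                printf("q = %.2f - %.2f : PHD ~ %.1f\n", threshold[i], threshold[i + 1], phd);
            }
            return 0;
        }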

    Ion‐scale structure in Mercury’s magnetopause reconnection diffusion region

    Full text link
    The strength and time dependence of the electric field in a magnetopause diffusion region relate to the rate of magnetic reconnection between the solar wind and a planetary magnetic field. Here we use ~150 ms measurements of energetic electrons from the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft observed over Mercury’s dayside polar cap boundary (PCB) to infer such small‐scale changes in magnetic topology and reconnection rates. We provide the first direct measurement of open magnetic topology in flux transfer events at Mercury, structures thought to account for a significant portion of the open magnetic flux transport throughout the magnetosphere. In addition, variations in PCB latitude likely correspond to intermittent bursts of ~0.3–3 mV/m reconnection electric fields separated by ~5–10 s, resulting in average and peak normalized dayside reconnection rates of ~0.02 and ~0.2, respectively. These data demonstrate that structure in the magnetopause diffusion region at Mercury occurs at the smallest ion scales relevant to reconnection physics.
    Key Points: (1) Energetic electrons at Mercury map magnetic topology at ~150 ms. (2) First direct observation of flux transfer event open‐field topology at Mercury. (3) Modulations of the reconnection rate at Mercury occur at ion kinetic scales.
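
    For context, a conventional way to normalize a dayside reconnection rate (the paper's exact choice of upstream parameters is not stated here) is

        % Dimensionless reconnection rate: reconnection electric field divided by
        % the upstream convective field v_A B (standard normalization, quoted for context).
        \[
          R = \frac{E_{\mathrm{rec}}}{v_{A}\,B}.
        \]

    As an illustrative check, with an assumed upstream convective field v_A B of roughly 10 mV/m (an illustrative value, not from the paper), the ~0.3–3 mV/m fields quoted above give R of roughly 0.03–0.3, the same order as the reported average and peak rates.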

    Deep Learning for Space Weather Prediction: Bridging the Gap between Heliophysics Data and Theory

    Full text link
    Traditionally, data analysis and theory have been viewed as separate disciplines, each feeding into fundamentally different types of models. Modern deep learning technology is beginning to unify these two disciplines and will produce a new class of predictively powerful space weather models that combine the physical insights gained from data and theory. We call on NASA to invest in the research and infrastructure necessary for the heliophysics community to take advantage of these advances.
    Comment: Heliophysics 2050 White Paper

    MMS Measurements of the Vlasov Equation: Probing the Electron Pressure Divergence Within Thin Current Sheets

    Get PDF
    We investigate the kinetic structure of electron‐scale current sheets found in the vicinity of the magnetopause and embedded in the magnetosheath within the reconnection exhaust. A new technique for computing terms of the Vlasov equation using Magnetospheric Multiscale (MMS) measurements is presented and applied to study phase space density gradients and the kinetic origins of the electron pressure divergence found within these current sheets. Crescent‐shaped structures in ∇⊥2fe give rise to bipolar and quadrupolar signatures in v·∇fe measured near the maximum ∇·Pe inside the current layers. The current density perpendicular to the magnetic field is strong (J⊥ ~ 2 μA/m2), and the thickness of the current layers ranges from 3 to 5 electron inertial lengths. The electron flows supporting the current layers mainly result from the combination of E×B and diamagnetic drifts. We find nonzero J·E′ within the current sheets even though they are observed apart from typical diffusion region signatures.
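
    For context, the Vlasov-equation terms and drifts referred to above are standard; compact textbook forms (notation may differ from the paper's) are:

        % Electron Vlasov equation and the two drifts mentioned above, quoted for
        % context; E' is the electric field in the electron bulk-flow frame.
        \[
          \frac{\partial f_{e}}{\partial t} + \mathbf{v}\cdot\nabla f_{e}
          - \frac{e}{m_{e}}\,\bigl(\mathbf{E} + \mathbf{v}\times\mathbf{B}\bigr)\cdot\nabla_{\mathbf{v}} f_{e} = 0,
        \]
        \[
          \mathbf{v}_{E\times B} = \frac{\mathbf{E}\times\mathbf{B}}{B^{2}}, \qquad
          \mathbf{v}_{\mathrm{dia},e} = \frac{(\nabla\cdot\mathsf{P}_{e})\times\mathbf{B}}{e\,n_{e}\,B^{2}}, \qquad
          \mathbf{E}' = \mathbf{E} + \mathbf{v}_{e}\times\mathbf{B}.
        \]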