
    Space-Time Block Preconditioning for Incompressible Resistive Magnetohydrodynamics

    This work develops a novel all-at-once space-time preconditioning approach for resistive magnetohydrodynamics (MHD), with a focus on model problems targeting fusion reactor design. We consider parallel-in-time methods because of the long time domains required to capture the physics of interest, and because the complexity of the underlying system makes long-time integration computationally expensive. To ameliorate this cost by using many processors, we develop an approach to solving the whole space-time system that is parallelizable in both space and time. Specifically, we construct a space-time block preconditioner for resistive MHD, following the space-time block preconditioning concept first introduced by Danieli et al. in 2022 for incompressible flow, in which an effective preconditioner for classical sequential time-stepping is extended to the space-time setting. The starting point for our derivation is the continuous Schur complement preconditioner of Cyr et al. in 2021, which we generalise to produce, to our knowledge, the first space-time block preconditioning approach for the challenging equations governing incompressible resistive MHD. The numerical results are promising for the island coalescence and tearing mode model problems: the computational overhead of space-time preconditioning relative to sequential time-stepping is modest, primarily in the range of 2x-5x, which is low for parallel-in-time schemes in general. Additionally, the scaling results for inner (linear) and outer (nonlinear) iterations are flat for fixed time-step size and grow only very slowly under time-step refinement. Comment: 25 pages, 4 figures, 3 tables
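
    For orientation, the sketch below shows the generic saddle-point structure that block-preconditioning approaches of this kind build on. It is an illustrative 2x2 form only; the concrete space-time operators for resistive MHD and the Schur-complement approximation used in the paper differ from this generic layout.

```latex
% Illustrative sketch only: a generic saddle-point system K x = b and a
% block upper-triangular preconditioner P built from an approximate Schur
% complement.  The specific blocks for incompressible resistive MHD are
% not reproduced here.
\[
\mathcal{K} =
\begin{pmatrix}
F & B^{T} \\
B & 0
\end{pmatrix},
\qquad
\mathcal{P} =
\begin{pmatrix}
F & B^{T} \\
0 & \widehat{S}
\end{pmatrix},
\qquad
\widehat{S} \approx - B F^{-1} B^{T}.
\]
```

    In the space-time setting, $F$ couples all time steps at once, so the effectiveness of the preconditioner hinges on replacing the actions of $F^{-1}$ and $\widehat{S}^{-1}$ with solves that are cheap and parallelizable in both space and time.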

    Space-time block preconditioning for incompressible flow

    Parallel-in-time methods have become increasingly popular in the simulation of time-dependent numerical PDEs, allowing for the efficient use of additional MPI processes when spatial parallelism saturates. Most methods treat the solution and parallelism in space and time separately. In contrast, all-at-once methods solve the full space-time system directly, largely treating time as simply another spatial dimension. All-at-once methods offer a number of benefits over separate treatment of space and time, most notably significantly increased parallelism and faster time-to-solution (when applicable). However, the development of fast, scalable all-at-once methods has largely been limited to time-dependent (advection-)diffusion problems. This paper introduces the concept of space-time block preconditioning for the all-at-once solution of incompressible flow. By extending well-known concepts of spatial block preconditioning to the space-time setting, we develop a block preconditioner whose application requires the solution of a space-time (advection-)diffusion equation in the velocity block, coupled with a pressure Schur complement approximation consisting of independent spatial solves at each time-step, and a space-time matrix-vector multiplication. The new method is tested on four classical models in incompressible flow. Results indicate perfect scalability under refinement of the spatial and temporal mesh spacing, perfect scalability of the nonlinear Picard iteration count when applied to a nonlinear Navier-Stokes problem, and minimal overhead in the number of preconditioner applications compared with sequential time-stepping. Comment: 28 pages, 7 figures, 4 tables
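
    As a concrete illustration of the application described above, the sketch below applies a block upper-triangular space-time preconditioner by back-substitution: independent per-time-step Schur-complement solves for the pressure, a space-time matrix-vector product, and a single space-time velocity solve. The operator names (F_solve, S_hat_solve_per_step, Bt_matvec) are hypothetical placeholders, not the paper's implementation.

```python
# Illustrative sketch only: block back-substitution for a block upper-triangular
# space-time preconditioner P = [[F, B^T], [0, S_hat]].  The operators passed in
# are hypothetical placeholders for the paper's space-time blocks.
import numpy as np

def apply_spacetime_block_preconditioner(r_u, r_p, F_solve, S_hat_solve_per_step, Bt_matvec):
    """Return (z_u, z_p) = P^{-1} (r_u, r_p), where r_u stacks the velocity
    residuals of all time steps and r_p is a list of per-time-step pressure
    residuals."""
    # 1. Pressure block: independent spatial Schur-complement solves at each
    #    time step -- embarrassingly parallel across the time dimension.
    z_p = [S_hat_solve_per_step(k, r) for k, r in enumerate(r_p)]

    # 2. Space-time matrix-vector product coupling the pressure update back
    #    into the velocity residual.
    r_u_corrected = r_u - Bt_matvec(np.concatenate(z_p))

    # 3. Velocity block: one space-time (advection-)diffusion solve over all
    #    time steps at once.
    z_u = F_solve(r_u_corrected)
    return z_u, z_p
```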

    Influence of dietary vitamin E supplementation on cholesterol oxidation and fresh colour in beef aged for 3 and 14 days

    The effects of dietary vitamin E supplementation on the susceptibility to lipid oxidation and the colour of the Longissimus thoracis (LT) muscle, aged under vacuum-packaged conditions for 3 or 14 days, were studied. For this purpose, Charolais cattle were fed a diet providing 60 mg (control) or 5500 mg (supplemented) of vitamin E per animal per day for 30 or 60 days before slaughter. Dietary vitamin E supplementation increased the vitamin E content of the liver, but not of the LT muscle, of treated animals. Supplementation for 30 and 60 days showed inconsistent effects in reducing cholesterol oxidation products in vacuum-packed aged meat. However, supplementation for 60 days was effective in stabilising lightness in the LT muscle during vacuum-packed ageing. Overall, from a practical standpoint, this study suggests that supranutritional supplementation for up to 60 days may not increase the vitamin E content of the Charolais LT muscle, giving little, if any, benefit to meat colour and cholesterol oxidation. However, it would be interesting to determine to what extent specific oxysterols are related to meat colour, and whether colour parameters can be useful for predicting the formation of cholesterol oxidation products along the industrial meat production chain.

    DeepPCR: Parallelizing Sequential Operations in Neural Networks

    Parallelization techniques have become ubiquitous for accelerating inference and training of deep neural networks. Despite this, several operations are still performed in a sequential manner. For instance, the forward and backward passes are executed layer-by-layer, and the output of diffusion models is produced by applying a sequence of denoising steps. This sequential approach results in a computational cost proportional to the number of steps involved, presenting a potential bottleneck as the number of steps increases. In this work, we introduce DeepPCR, a novel algorithm which parallelizes typically sequential operations in order to speed up inference and training of neural networks. DeepPCR is based on interpreting a sequence of $L$ steps as the solution of a specific system of equations, which we recover using the Parallel Cyclic Reduction algorithm. This reduces the complexity of computing the sequential operations from $\mathcal{O}(L)$ to $\mathcal{O}(\log_2 L)$, thus yielding a speedup for large $L$. To verify the theoretical lower complexity of the algorithm, and to identify regimes for speedup, we test the effectiveness of DeepPCR in parallelizing the forward and backward pass in multi-layer perceptrons, and reach speedups of up to $30\times$ for the forward and $200\times$ for the backward pass. We additionally showcase the flexibility of DeepPCR by parallelizing training of ResNets with as many as 1024 layers, and generation in diffusion models, enabling up to $7\times$ faster training and $11\times$ faster generation, respectively, when compared to the sequential approach.
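
    To illustrate the core idea in the simplest setting, the sketch below solves a scalar linear recurrence x[i] = a[i]*x[i-1] + b[i] in O(log2 L) parallel elimination rounds via recursive doubling, a close relative of the Parallel Cyclic Reduction used in the paper. It is a toy linear example, not the authors' implementation; real network layers and denoising steps are generally nonlinear, which this toy does not address.

```python
# Illustrative sketch only: solving the length-L recurrence
#     x[i] = a[i] * x[i-1] + b[i],   with x[-1] = x0,
# in ceil(log2 L) rounds of recursive doubling instead of L sequential steps.
# Each round composes affine maps pairwise, so all updates within a round can
# run in parallel.
import numpy as np

def solve_recurrence_parallel(a, b, x0):
    A = np.asarray(a, dtype=float).copy()
    B = np.asarray(b, dtype=float).copy()
    L = len(A)
    s = 1
    while s < L:                              # O(log2 L) rounds
        A_prev, B_prev = A.copy(), B.copy()
        # Compose each affine map with the one s positions earlier.
        A[s:] = A_prev[s:] * A_prev[:-s]
        B[s:] = A_prev[s:] * B_prev[:-s] + B_prev[s:]
        s *= 2
    # After the scan, (A[i], B[i]) composes steps 0..i, so x[i] = A[i]*x0 + B[i].
    return A * x0 + B

# Sanity check against the sequential recurrence.
rng = np.random.default_rng(0)
a, b, x0 = rng.normal(size=8), rng.normal(size=8), 1.0
x_seq, x = [], x0
for ai, bi in zip(a, b):
    x = ai * x + bi
    x_seq.append(x)
assert np.allclose(solve_recurrence_parallel(a, b, x0), x_seq)
```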

    REALM: Robust Entropy Adaptive Loss Minimization for Improved Single-Sample Test-Time Adaptation

    Fully test-time adaptation (F-TTA) can mitigate performance loss due to distribution shifts between train and test data (1) without access to the training data, and (2) without knowledge of the model training procedure. In online F-TTA, a pre-trained model is adapted using a stream of test samples by minimizing a self-supervised objective, such as entropy minimization. However, models adapted online using entropy minimization are unstable, especially in single-sample settings, leading to degenerate solutions and limiting the adoption of TTA inference strategies. Prior works identify noisy, or unreliable, samples as a cause of failure in online F-TTA. One solution is to ignore these samples, but this can lead to bias in the update procedure, slow adaptation, and poor generalization. In this work, we present a general framework for improving the robustness of F-TTA to these noisy samples, inspired by self-paced learning and robust loss functions. Our proposed approach, Robust Entropy Adaptive Loss Minimization (REALM), achieves better adaptation accuracy than previous approaches throughout the adaptation process on corruptions of CIFAR-10 and ImageNet-1K, demonstrating its effectiveness. Comment: Accepted at WACV 2024, 17 pages, 7 figures, 11 tables
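
    For context, the sketch below shows the general setting the abstract describes: single-sample online adaptation by entropy minimization, with a generic robust per-sample weight that down-weights high-entropy (unreliable) samples instead of discarding them. It is not the REALM objective, and the weighting function here is a hypothetical placeholder.

```python
# Illustrative sketch only: single-sample online test-time adaptation by
# entropy minimization with a placeholder robust weighting.  NOT the exact
# REALM loss; the weight w = exp(-entropy) is a hypothetical stand-in.
import torch
import torch.nn.functional as F

def entropy(logits):
    # Shannon entropy of the softmax prediction, per sample.
    log_p = F.log_softmax(logits, dim=-1)
    return -(log_p.exp() * log_p).sum(dim=-1)

@torch.enable_grad()
def adapt_on_sample(model, optimizer, x):
    """One online adaptation step on a single test sample x of shape [1, ...]."""
    logits = model(x)
    h = entropy(logits)
    with torch.no_grad():
        # Down-weight uncertain samples rather than skipping them outright.
        w = torch.exp(-h)
    loss = (w * h).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return logits.detach()
```

    In practice, online TTA methods commonly restrict updates to a small subset of parameters (for example, normalization affine parameters) and use a more carefully designed weighting; the sketch leaves the optimizer and parameter choice open.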

    DUET: 2D Structured and Approximately Equivariant Representations

    Multiview Self-Supervised Learning (MSSL) is based on learning invariances with respect to a set of input transformations. However, invariance partially or totally removes transformation-related information from the representations, which might harm performance on specific downstream tasks that require such information. We propose 2D strUctured and EquivarianT representations (coined DUET), which are 2D representations organized in a matrix structure and equivariant with respect to transformations acting on the input data. DUET representations maintain information about an input transformation while remaining semantically expressive. Compared to SimCLR (Chen et al., 2020) (unstructured and invariant) and ESSL (Dangovski et al., 2022) (unstructured and equivariant), the structured and equivariant nature of DUET representations enables controlled generation with lower reconstruction error, while controllability is not possible with SimCLR or ESSL. DUET also achieves higher accuracy on several discriminative tasks and improves transfer learning. Comment: Accepted at ICML 202
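
    To make the general idea of a matrix-structured, (approximately) equivariant representation concrete, the sketch below reshapes an embedding into a 2D matrix and penalizes the mismatch between encoding a transformed input and applying a fixed action to the encoded original. This is a hypothetical toy construction, not the DUET architecture or objective; MatrixEncoder, transform, and act_on_rows are illustrative placeholders.

```python
# Illustrative sketch only: a generic "structured equivariance" loss on a
# 2D (matrix-shaped) representation.  Hypothetical toy construction; it is
# not the DUET method.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MatrixEncoder(nn.Module):
    def __init__(self, backbone, feat_dim, rows, cols):
        super().__init__()
        self.backbone = backbone                  # any feature extractor -> [B, feat_dim]
        self.head = nn.Linear(feat_dim, rows * cols)
        self.rows, self.cols = rows, cols

    def forward(self, x):
        z = self.head(self.backbone(x))
        return z.view(-1, self.rows, self.cols)   # 2D-structured representation

def equivariance_loss(encoder, x, transform, act_on_rows):
    """Encourage encoder(transform(x)) to match act_on_rows(encoder(x)).

    `transform` acts on the input (e.g. a rotation of the image); `act_on_rows`
    is a fixed, known action on the matrix representation (e.g. a row shift).
    """
    z_transformed_input = encoder(transform(x))
    z_acted_output = act_on_rows(encoder(x))
    return F.mse_loss(z_transformed_input, z_acted_output)
```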