
    Coefficient-Robust A Posteriori Error Estimation for H(curl)-elliptic Problems

    We extend the framework of a posteriori error estimation by preconditioning in [Li, Y., Zikatanov, L.: Computers & Mathematics with Applications 91, 192-201 (2021)] and derive new a posteriori error estimates for H(curl)-elliptic two-phase interface problems. The proposed error estimator provides two-sided bounds for the discretization error and is robust with respect to coefficient variation under mild assumptions. For H(curl) problems with constant coefficients, the performance of this estimator is numerically compared with the one analyzed in [Schöberl, J.: Math. Comp. 77(262), 633-649 (2008)].
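
    The "two-sided bounds" claim can be written schematically as a pair of reliability and efficiency estimates; the notation below (eta for the estimator, u and u_h for the exact and discrete solutions) is generic and not taken from the paper:

    % Schematic reliability-and-efficiency bound; C_1 and C_2 are constants
    % that, per the abstract, do not degrade as the coefficients vary.
    \[
      C_1\,\eta(u_h) \;\le\; \|u - u_h\|_{H(\mathrm{curl})} \;\le\; C_2\,\eta(u_h).
    \]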

    Quasi-optimal adaptive hybridized mixed finite element methods for linear elasticity

    For the planar Navier-Lamé equation in mixed form with symmetric stress tensors, we prove uniform quasi-optimal convergence of an adaptive method based on the hybridized mixed finite element proposed in [Gong, Wu, and Xu: Numer. Math. 141 (2019), pp. 569-604]. The main ingredients of the analysis are a discrete a posteriori upper bound and a quasi-orthogonality result for the stress field under mixed boundary conditions. Compared with existing adaptive methods, the proposed algorithm applies directly to the traction boundary condition and is easy to implement.
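
    For orientation, adaptive methods of this kind iterate the standard solve-estimate-mark-refine loop, typically with Dörfler (bulk-chasing) marking. The Python sketch below is a generic illustration under that assumption; solve, estimate, and refine are hypothetical placeholders, not the paper's implementation:

    # Generic adaptive loop (solve -> estimate -> mark -> refine); a sketch only.
    def dorfler_mark(indicators, theta=0.5):
        """Pick a minimal set of elements carrying a theta-fraction
        of the total error indicator (Dörfler marking)."""
        order = sorted(range(len(indicators)), key=lambda i: -indicators[i])
        total = sum(indicators)
        marked, acc = [], 0.0
        for i in order:
            marked.append(i)
            acc += indicators[i]
            if acc >= theta * total:
                break
        return marked

    def adaptive_fem(mesh, tol, theta=0.5):
        while True:
            u = solve(mesh)                 # discrete solve (placeholder)
            indicators = estimate(mesh, u)  # per-element a posteriori indicators
            if sum(indicators) ** 0.5 <= tol:
                return u, mesh
            marked = dorfler_mark(indicators, theta)
            mesh = refine(mesh, marked)     # e.g., newest-vertex bisection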

    Entropy-based convergence rates of greedy algorithms

    We present convergence estimates for two types of greedy algorithms in terms of the metric entropy of underlying compact sets. In the first part, we measure the error of a standard greedy reduced basis method for parametric PDEs by the metric entropy of the solution manifold in Banach spaces. This contrasts with the classical analysis based on Kolmogorov n-widths and enables us to obtain direct comparisons between the greedy algorithm error and the entropy numbers, where the multiplicative constants are explicit and simple. The entropy-based convergence estimate is sharp and improves upon the classical width-based analysis of reduced basis methods for elliptic model problems. In the second part, we derive a novel and simple convergence analysis of the classical orthogonal greedy algorithm for nonlinear dictionary approximation using the metric entropy of the symmetric convex hull of the dictionary. This also improves upon existing results by giving a direct comparison between the algorithm error and the metric entropy.
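
    For concreteness, the classical orthogonal greedy algorithm over a finite dictionary can be sketched as follows; this is a textbook version in NumPy, not code from the paper, and the names D, f, and n_iter are illustrative:

    import numpy as np

    # Orthogonal greedy algorithm: at each step select the dictionary element
    # best correlated with the residual, then re-project the target onto the
    # span of all selected elements. Columns of D are normalized; n_iter >= 1.
    def orthogonal_greedy(D, f, n_iter):
        selected = []
        r = f.copy()
        for _ in range(n_iter):
            k = int(np.argmax(np.abs(D.T @ r)))  # best-matching element
            if k not in selected:
                selected.append(k)
            A = D[:, selected]
            # Orthogonal projection of f onto span of the selected elements
            coef, *_ = np.linalg.lstsq(A, f, rcond=None)
            r = f - A @ coef
        return selected, coef, r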