
    Multilevel fast multipole algorithm for fields

    An efficient implementation of the multilevel fast multipole algorithm is herein applied to accelerate the calculation of the electromagnetic near and far fields after the equivalent surface currents have been obtained. Although most research effort is devoted to obtaining those currents, the electric and/or magnetic fields (or other parameters derived from them) are ultimately the quantities of interest in most cases. Though straightforward, their calculation can be computationally demanding, hence the importance of a sped-up yet accurate representation of the fields via a suitable setup of the method. A complete, self-contained formulation for both near and far fields, and for problems including multiple penetrable regions, is presented in full detail. Numerical examples show that the efficiency and scalability of the implementation lead to a drastic reduction of the computation time.

    Funding: Ministerio de Economía y Competitividad | Ref. MAT2014-58201-C2-1-R; Ministerio de Economía y Competitividad | Ref. MAT2014-58201-C2-2-R; Gobierno Regional de Extremadura | Ref. IB1318
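    For context, the quantity such an implementation accelerates is essentially the radiation integral over the equivalent surface currents. Below is a minimal sketch of the direct (unaccelerated) far-field evaluation, with hypothetical array names, constant prefactors dropped, and the magnetic-current contribution omitted:

```python
import numpy as np

def far_field_direct(k, r_src, J_w, khat_obs):
    """Direct evaluation of the far-field radiation sum
        E(khat) ~ (I - khat khat^T) . sum_n J_w[n] * exp(1j*k*khat.r_n),
    with constant prefactors (and the magnetic-current term) omitted.

    k        : free-space wavenumber
    r_src    : (N, 3) quadrature points on the equivalent surface
    J_w      : (N, 3) quadrature-weighted electric current samples
    khat_obs : (M, 3) unit observation directions
    """
    phase = np.exp(1j * k * (khat_obs @ r_src.T))   # (M, N) phase factors
    E = phase @ J_w                                 # (M, 3) radiation sums
    # The far field is transverse: remove the radial component.
    E -= np.sum(E * khat_obs, axis=1, keepdims=True) * khat_obs
    return E
```

    The direct sum scales as O(N·M) in the number of sources and observation directions; the multilevel algorithm replaces it with aggregation and translation over an octree, which is where the drastic reduction in computation time comes from.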

    On the 3D electromagnetic quantitative inverse scattering problem: algorithms and regularization

    In this thesis, 3D quantitative microwave imaging algorithms are developed with emphasis on the efficiency of the algorithms and the quality of the reconstruction. First, a fast simulation tool has been implemented which makes use of a volume integral equation (VIE) to solve the forward scattering problem. The resulting linear system is solved iteratively, and two strategies are combined to do this efficiently. First, the matrix-vector multiplications needed in every step of the iterative solution are accelerated using a combination of the Fast Fourier Transform (FFT) method and the Multilevel Fast Multipole Algorithm (MLFMA). It is shown that this hybrid MLFMA-FFT method is best suited for large, sparse scattering problems. Second, the number of iterations is reduced by using an extrapolation technique to determine suitable initial guesses that are already close to the solution. This technique combines a marching-on-in-source-position scheme with a linear extrapolation over the permittivity in the form of a Born approximation. It is shown that this forward simulator indeed exhibits improved efficiency.

    The fast forward simulator is incorporated in an optimization technique which minimizes the discrepancy between measured and simulated data by adjusting the permittivity profile. A Gauss-Newton optimization method with line search is employed to minimize a least-squares data-fit cost function with additional regularization. Two different regularization methods were developed in this research. The first penalizes strong fluctuations in the permittivity by imposing a smoothing constraint, a widely used approach in inverse scattering; in this thesis, however, the constraint is incorporated in a multiplicative way instead of the usual additive way, i.e. its weight in the cost function decreases as the data fit improves. The second is Value Picking (VP) regularization, a new method proposed in this dissertation. It is designed to reconstruct piecewise homogeneous permittivity profiles, which are hard to reconstruct since sharp interfaces between different permittivity regions must be preserved while other strong fluctuations are suppressed. Instead of operating on the spatial distribution of the permittivity, as certain existing edge-preserving methods do, it imposes the restriction that only a few different permittivity values should appear in the reconstruction. These values do not have to be known in advance, and their number is also updated in a stepwise relaxed VP (SRVP) regularization scheme. Both regularization techniques have been incorporated in the Gauss-Newton optimization framework and yield significantly improved reconstruction quality.

    The efficiency of the minimization algorithm can also be improved. In every step of the iterative optimization, a linear Gauss-Newton update system has to be solved. This system is typically large and is therefore solved iteratively; however, these systems are ill-conditioned as a result of the ill-posedness of the inverse scattering problem. Fortunately, the aforementioned regularization techniques allow for the use of a subspace preconditioned LSQR method to solve these systems efficiently, as is shown in this thesis.
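    The FFT half of the hybrid accelerator described above rests on the fact that, on a uniform voxel grid, the discretized VIE operator is (block-)Toeplitz, so each matrix-vector product in the iterative solver is a discrete convolution. A minimal one-dimensional sketch of that circulant-embedding trick (illustrative only; the thesis works with 3D block-Toeplitz operators and hybridizes with the MLFMA):

```python
import numpy as np

def toeplitz_matvec_fft(first_col, first_row, x):
    """O(n log n) product of a Toeplitz matrix with a vector via
    circulant embedding. On a uniform VIE grid the discretized
    Green's-function operator is (block-)Toeplitz, so every
    iterative-solver matvec can be evaluated this way with FFTs.

    first_col : first column of the Toeplitz matrix, length n
    first_row : first row of the Toeplitz matrix, length n
    x         : vector of length n
    """
    n = len(x)
    # First column of the (2n-1)-point circulant embedding the Toeplitz
    # matrix: [t_0, t_1, ..., t_{n-1}, t_{-(n-1)}, ..., t_{-1}].
    c = np.concatenate([first_col, first_row[:0:-1]])
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x, 2 * n - 1))
    return y[:n]  # matches scipy.linalg.toeplitz(first_col, first_row) @ x
```

    In 3D the same embedding is applied dimension by dimension; how the thesis partitions the work between the FFT and the MLFMA for sparse scatterers is specific to that work and not reproduced here.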
    Finally, the incorporation of constraints on the permittivity through a modified line search path helps to keep the forward problem well-posed and thus the number of forward iterations low.

    Another contribution of this thesis is the proposal of a new Consistency Inversion (CI) algorithm. It is based on the same principles as another well-known reconstruction algorithm, the Contrast Source Inversion (CSI) method, which considers the contrast currents (equivalent currents that generate a field identical to the scattered field) as fundamental unknowns together with the permittivity. In the CI method, however, the permittivity variables are eliminated from the optimization and are only reconstructed in a final step. This avoids alternating updates of the permittivity and the contrast currents, which may result in faster convergence. The CI method has also been supplemented with VP regularization, yielding the VPCI method.

    The quantitative electromagnetic imaging methods developed in this work have been validated on both synthetic and measured data, for both homogeneous and inhomogeneous objects, and yield high reconstruction quality in all these cases. The successful, completely blind reconstruction of an unknown target from measured data, provided by the Institut Fresnel in Marseille, France, demonstrates at once the validity of the forward scattering code, the performance of the reconstruction algorithm, and the quality of the measurements. The reconstruction of a numerical MRI-based breast phantom is encouraging for the further development of biomedical microwave imaging, and of microwave breast cancer screening in particular.
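    As a rough illustration of the Gauss-Newton machinery summarized above, the following sketch performs one damped update solved with LSQR; the Jacobian operator, residual, and scalar damping weight are stand-ins, since the actual cost function uses multiplicative smoothing or VP regularization and a subspace preconditioner, none of which is reproduced here:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

def gauss_newton_step(J_op, residual, eps, damp=1e-2, step=1.0):
    """One damped Gauss-Newton update for min_eps ||residual(eps)||^2.

    J_op     : LinearOperator applying the Jacobian of the residual
               (must provide both matvec and rmatvec)
    residual : current data misfit, simulated minus measured fields
    eps      : current permittivity unknowns
    damp     : Tikhonov-style damping, a crude placeholder for the
               thesis's multiplicative-smoothing / VP regularizers
    step     : step length; the thesis uses a line search instead
    """
    # Solve the linearized, ill-conditioned system J d = -residual
    # in the least-squares sense with LSQR.
    d = lsqr(J_op, -residual, damp=damp)[0]
    return eps + step * d

if __name__ == "__main__":
    # Toy linear problem standing in for the real forward operator.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 25))
    J_op = LinearOperator(A.shape, matvec=A.dot, rmatvec=A.T.dot)
    eps0 = np.zeros(25)
    res0 = A @ eps0 - rng.standard_normal(40)   # toy data misfit
    eps1 = gauss_newton_step(J_op, res0, eps0)
    res1 = res0 + A @ (eps1 - eps0)
    print(np.linalg.norm(res0), "->", np.linalg.norm(res1))
```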

    Hybridization of fast multipole techniques and geometrical optics for the modeling of large electromagnetic problems


    Software for Exascale Computing - SPPEXA 2016-2019

    This open access book summarizes the research done and the results obtained in the second funding phase of the Priority Program 1648 "Software for Exascale Computing" (SPPEXA) of the German Research Foundation (DFG), presented at the SPPEXA Symposium in Dresden during October 21-23, 2019. In that respect, it both represents a continuation of Vol. 113 in Springer's series Lecture Notes in Computational Science and Engineering, the corresponding report of SPPEXA's first funding phase, and provides an overview of SPPEXA's contributions towards exascale computing in today's supercomputer technology. The individual chapters address one or more of the research directions (1) computational algorithms, (2) system software, (3) application software, (4) data management and exploration, (5) programming, and (6) software tools. The book has an interdisciplinary appeal: scholars from computational sub-fields in computer science, mathematics, physics, or engineering will find it of particular interest.

    Towards a more efficient spectrum usage: spectrum sensing and cognitive radio techniques

    The traditional approach to spectrum management in wireless communications has been the definition of a licensed user that is granted exclusive exploitation rights for a specific frequency.

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as the final publication of the COST Action IC1406 "High-Performance Modelling and Simulation for Big Data Applications" (cHiPSet) project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering, and as their level of abstraction rises to afford better discernment of the domain at hand, their representations become increasingly demanding of computational and data resources. High Performance Computing, on the other hand, typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    International Workshop on Finite Elements for Microwave Engineering

    When Courant prepared the text of his 1942 address to the American Mathematical Society for publication, he added a two-page Appendix to illustrate how the variational methods first described by Lord Rayleigh could be put to wider use in potential theory. Choosing piecewise-linear approximants on a set of triangles which he called elements, he dashed off a couple of two-dimensional examples, and the finite element method was born. … Finite element activity in electrical engineering began in earnest about 1968-1969. A paper on waveguide analysis was published in Alta Frequenza in early 1969, giving the details of a finite element formulation of the classical hollow waveguide problem. It was followed by a rapid succession of papers on magnetic fields in saturable materials, dielectric-loaded waveguides, and other well-known boundary value problems of electromagnetics. … In the decade of the eighties, finite element methods spread quickly; in several technical areas, they assumed a dominant role in field problems. (P.P. Silvester, San Miniato (PI), Italy, 1992.) Early in the nineties, the International Workshop on Finite Elements for Microwave Engineering started. This volume contains the history of the Workshop and the Proceedings of its 13th edition, Florence (Italy), 2016. The 14th Workshop will be held in Cartagena (Colombia) in 2018.