Higher Order Bases in a 2D Hybrid BEM/FEM Formulation
The advantages of using higher order, interpolatory basis functions are examined in the analysis of transverse electric (TE) plane wave scattering by homogeneous, dielectric cylinders. A boundary-element/finite-element (BEM/FEM) hybrid formulation is employed in which the interior dielectric region is modeled with the vector Helmholtz equation, and a radiation boundary condition is supplied by an Electric Field Integral Equation (EFIE). An efficient method of handling the singular self-term arising in the EFIE is presented. The iterative solution of the partially dense system of equations is obtained using the Quasi-Minimal Residual (QMR) algorithm with an Incomplete LU Threshold (ILUT) preconditioner. Numerical results are shown for the case of an incident wave impinging upon a square dielectric cylinder. The convergence of the solution is shown versus the number of unknowns as a function of the completeness order of the basis functions.
Optimization and synthesis of multilayer frequency selective surfaces via bioinspired hybrid techniques
In this study, two bioinspired computation (BIC) techniques are discussed and applied to the design and synthesis of multilayer frequency selective surfaces (FSS) in the microwave range, specifically the C, X, and Ku bands. The proposed BIC techniques combine a general regression artificial neural network with a genetic algorithm (GA) and with a cuckoo search algorithm, respectively. The objective is to find the optimal separation between the layers of the investigated FSS. Numerical analysis of the electromagnetic properties of the device is carried out with the finite integration technique (FIT) and validated with the finite element method (FEM), using the CST Microwave Studio and Ansys HFSS software packages, respectively. The BIC-optimized devices present good phase and angular stability for incidence angles of 10°, 20°, 30°, and 40°, and are polarization independent. The cutoff frequencies delimiting the operating band of the FSS, read from the transmission coefficient in decibels (dB), were obtained at a threshold of −10 dB. Numerical results show good agreement with measured data.
Non-monotone Submodular Maximization with Nearly Optimal Adaptivity and Query Complexity
Submodular maximization is a general optimization problem with a wide range
of applications in machine learning (e.g., active learning, clustering, and
feature selection). In large-scale optimization, the parallel running time of
an algorithm is governed by its adaptivity, which measures the number of
sequential rounds needed if the algorithm can execute polynomially-many
independent oracle queries in parallel. While low adaptivity is ideal, it is
not sufficient for an algorithm to be efficient in practice---there are many
applications of distributed submodular optimization where the number of
function evaluations becomes prohibitively expensive. Motivated by these
applications, we study the adaptivity and query complexity of submodular
maximization. In this paper, we give the first constant-factor approximation
algorithm for maximizing a non-monotone submodular function subject to a
cardinality constraint that runs in a nearly optimal number of adaptive
rounds and makes a nearly optimal number of oracle queries in expectation.
(The exact bounds were typeset as inline math and lost in extraction.) In
our empirical study, we use
three real-world applications to compare our algorithm with several benchmarks
for non-monotone submodular maximization. The results demonstrate that our
algorithm finds competitive solutions using significantly fewer rounds and
queries.

Comment: 12 pages, 8 figures
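For reference, the sequential baseline this line of work improves on can be sketched as a random-greedy selection, where each of the k steps is one adaptive round. The graph and the `cut` objective below are illustrative stand-ins (a cut function is a standard example of a non-monotone submodular function), not the paper's benchmarks or algorithm:

```python
import random

def random_greedy(ground, f, k, seed=0):
    # Random greedy for non-monotone submodular maximization under a
    # cardinality constraint: at each of k steps, choose uniformly among
    # the k elements with the largest marginal gain. Each step depends on
    # the previous one, so this takes k adaptive rounds; low-adaptivity
    # algorithms batch the oracle queries of many steps in parallel.
    rng = random.Random(seed)
    S = set()
    for _ in range(k):
        candidates = sorted(
            ((f(S | {e}) - f(S), e) for e in ground - S),
            reverse=True,
        )[:k]
        gain, e = rng.choice(candidates)
        if gain > 0:  # adding a non-positive gain can only hurt
            S.add(e)
    return S

# Illustrative non-monotone submodular objective: graph-cut value.
edges = {(1, 2), (2, 3), (3, 4), (1, 4), (2, 4)}
def cut(S):
    return sum(1 for u, v in edges if (u in S) != (v in S))

S = random_greedy({1, 2, 3, 4}, cut, k=2)
print(S, cut(S))
```

The point of the sketch is the adaptivity cost: every marginal-gain evaluation inside one loop iteration can be parallelized, but the k iterations themselves cannot.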
Hierarchical stack filtering: a bitplane-based algorithm for massively parallel processors
With the development of novel parallel architectures for image processing, the implementation
of well-known image operators needs to be reformulated to take advantage of the so-called
massive parallelism. In this work, we propose a general algorithm that implements a large
class of nonlinear filters, called stack filters, with a 2D-array processor. The proposed method consists of decomposing an image into bitplanes via bitwise decomposition and then processing every bitplane hierarchically. The filtered image is reconstructed by simply stacking the filtered bitplanes according to their order of significance. Owing to its hierarchical structure, our algorithm allows us to trade off image quality against processing time and to significantly reduce the computation time of low-entropy images. Also, experimental tests show that the processing time of our method is substantially lower than that of classical methods when using large structuring elements. All these features are of interest to a variety of real-time applications based on morphological operations, such as video segmentation and video enhancement.
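The stacking property underlying stack filters can be illustrated with classical threshold decomposition and a binary median (a minimal 1D sketch; the paper's hierarchical, bitwise bitplane scheme is a more economical variant of this level-by-level decomposition):

```python
import numpy as np

def stack_median_filter(signal, width=3, levels=4):
    # Threshold-decompose the signal into binary slices, apply a binary
    # median (a positive Boolean function: majority vote over the window)
    # to each slice, then stack (sum) the filtered slices. By the
    # stacking property, the result equals a grayscale median filter.
    out = np.zeros_like(signal)
    for t in range(1, levels):
        binary = (signal >= t).astype(int)
        padded = np.pad(binary, width // 2, mode='edge')
        filtered = np.array([
            1 if padded[i:i + width].sum() * 2 > width else 0
            for i in range(len(signal))
        ])
        out += filtered  # stack the filtered binary slices
    return out

x = np.array([0, 3, 0, 2, 2, 2, 0, 3, 0])
print(stack_median_filter(x))  # → [0 0 2 2 2 2 2 0 0]
```

Each binary slice can be filtered independently, which is what makes the decomposition attractive for a massively parallel 2D-array processor.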
Assessing and Remedying Coverage for a Given Dataset
Data analysis impacts virtually every aspect of our society today. Often,
this analysis is performed on an existing dataset, possibly collected through a
process that the data scientists had limited control over. The existing data
analyzed may not include the complete universe, but it is expected to cover the
diversity of items in the universe. Lack of adequate coverage in the dataset
can result in undesirable outcomes such as biased decisions and algorithmic
racism, as well as creating vulnerabilities such as opening up room for
adversarial attacks.
In this paper, we assess the coverage of a given dataset over multiple
categorical attributes. We first provide efficient techniques for traversing
the combinatorial explosion of value combinations to identify any regions of
attribute space not adequately covered by the data. Then, we determine the
least amount of additional data that must be obtained to resolve this lack of
adequate coverage. We confirm the value of our proposal through both
theoretical analyses and comprehensive experiments on real data.

Comment: in ICDE 201
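A brute-force version of the coverage check described above can be sketched as follows (the attribute names and threshold are hypothetical; the paper's contribution lies precisely in pruning this combinatorial traversal, which the naive sketch does not do):

```python
from itertools import combinations, product

def uncovered_patterns(rows, domains, threshold=1):
    # Enumerate value combinations over every subset of categorical
    # attributes and report those matched by fewer than `threshold`
    # rows, i.e. regions of attribute space the dataset fails to cover.
    attrs = list(domains)
    uncovered = []
    for k in range(1, len(attrs) + 1):
        for subset in combinations(attrs, k):
            for values in product(*(domains[a] for a in subset)):
                pattern = dict(zip(subset, values))
                count = sum(
                    all(row[a] == v for a, v in pattern.items())
                    for row in rows
                )
                if count < threshold:
                    uncovered.append(pattern)
    return uncovered

# Hypothetical toy dataset with two categorical attributes.
rows = [
    {"gender": "F", "degree": "PhD"},
    {"gender": "M", "degree": "PhD"},
    {"gender": "M", "degree": "MS"},
]
domains = {"gender": ["F", "M"], "degree": ["PhD", "MS"]}
print(uncovered_patterns(rows, domains))
# → [{'gender': 'F', 'degree': 'MS'}]
```

The number of candidate patterns grows exponentially in the number of attributes, which is why efficient traversal and pruning, rather than this exhaustive loop, are needed in practice.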