Auto-generation of passive scalable macromodels for microwave components using scattered sequential sampling
This paper presents a method for the automatic construction of stable and passive scalable macromodels for parameterized frequency responses. The method requires very little prior knowledge to build the scalable macromodels, thereby considerably reducing the burden on designers. The proposed method uses an efficient scattered sequential sampling strategy, with as few expensive simulations as possible, to generate accurate macromodels for the system using state-of-the-art scalable macromodeling methods. The scalable macromodels can be used as a replacement for the actual simulator in the overall design process. Pertinent numerical results validate the proposed sequential sampling strategy.
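The abstract does not spell out the sampling criterion, so the following is only a generic sketch of the idea behind sequential sampling: start from a coarse scattered sample, and spend each new expensive simulation where the current surrogate is least trustworthy. Here "least trustworthy" is proxied by the disagreement between two surrogates of different flexibility; `expensive_sim` and all other names are hypothetical stand-ins, not the paper's method.

```python
import numpy as np
from numpy.polynomial import Polynomial

def expensive_sim(x):
    """Stand-in for a costly EM simulation (hypothetical 1-D response)."""
    return np.sin(3 * x) / (1 + x**2)

# coarse initial scattered sample over the design range [0, 2]
X = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = expensive_sim(X)

candidates = np.linspace(0.0, 2.0, 201)
for _ in range(10):
    # fit two surrogates of different flexibility; their disagreement
    # flags regions where the macromodel is not yet reliable
    lo = Polynomial.fit(X, y, deg=min(3, len(X) - 1))
    hi = Polynomial.fit(X, y, deg=min(6, len(X) - 1))
    gap = np.abs(lo(candidates) - hi(candidates))
    x_new = candidates[np.argmax(gap)]        # next expensive simulation
    X = np.append(X, x_new)
    y = np.append(y, expensive_sim(x_new))
```

Each loop iteration costs exactly one call to the expensive simulator, which is the point of sequential sampling: the budget goes where the surrogate is weakest instead of into a dense up-front grid.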
Measures of scalability
Scalable frames are frames with the property that the frame vectors can be
rescaled so as to yield tight frames. However, if a frame is not scalable, one
must settle for an approximate procedure. To this end, in this paper we introduce
three novel quantitative measures of the closeness to scalability for frames in
finite dimensional real Euclidean spaces. Besides the natural measure of
scalability given by the distance of a frame to the set of scalable frames,
another measure is obtained by optimizing a quadratic functional, while the
third is given by the volume of the ellipsoid of minimal volume containing the
symmetrized frame. After proving that these measures are equivalent in a
certain sense, we establish bounds on the probability that a randomly selected
frame is scalable. In the process, we also derive new necessary and
sufficient conditions for a frame to be scalable.
A note on scalable frames
We study the problem of determining whether a given frame is scalable, and
when it is, understanding the set of all possible scalings. We show that for
most frames this is a relatively simple task in that the frame is either not
scalable or is scalable in a unique way, and to find this scaling we just have
to solve a linear system. We also provide some insight into the set of all
scalings when there is not a unique scaling. In particular, we show that this
set is a convex polytope whose vertices correspond to minimal scalings.
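The observation above, that finding a scaling reduces to solving a linear system, can be illustrated concretely: a frame $\{f_i\}$ in $\mathbb{R}^2$ is scalable iff there are weights $w_i = c_i^2 \ge 0$ with $\sum_i w_i f_i f_i^T = I$, and matching the independent entries of that symmetric matrix is linear in the $w_i$. The sketch below uses the Mercedes-Benz frame (three equiangular unit vectors) as an illustrative choice of frame, not one taken from the paper.

```python
import numpy as np

# Frame vectors as rows; the Mercedes-Benz frame is known to become
# tight under equal rescaling, so it should come out scalable.
frame = np.array([
    [1.0,  0.0],
    [-0.5,  np.sqrt(3) / 2],
    [-0.5, -np.sqrt(3) / 2],
])

# Column i holds (x_i^2, y_i^2, x_i*y_i) -- the independent entries of
# f_i f_i^T -- so that A @ w stacks the entries of sum_i w_i f_i f_i^T.
A = np.stack([frame[:, 0]**2, frame[:, 1]**2, frame[:, 0] * frame[:, 1]])
target = np.array([1.0, 1.0, 0.0])   # independent entries of the identity

w, *_ = np.linalg.lstsq(A, target, rcond=None)
scalable = np.allclose(A @ w, target) and np.all(w >= 0)
scalings = np.sqrt(w)                # the c_i rescaling each frame vector
```

For this frame the system has the unique solution $w_i = 2/3$ for every $i$, matching the uniqueness phenomenon described in the abstract; a non-scalable frame would instead yield either no exact solution or a negative weight.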
zfit: scalable pythonic fitting
Statistical modeling is a key element in many scientific fields and
especially in High-Energy Physics (HEP) analysis. The standard framework to
perform this task in HEP is the C++ ROOT/RooFit toolkit, whose Python bindings
are only loosely integrated into the scientific Python ecosystem. In this
paper, zfit, a new alternative to RooFit written in pure Python, is presented.
Most importantly, zfit provides a well-defined high-level API and workflow for
advanced model building and fitting, together with an implementation on top of
TensorFlow, allowing a transparent usage of CPUs and GPUs. It is designed to be
extendable in a very simple fashion, allowing cutting-edge developments from
the scientific Python ecosystem to be used seamlessly. The main features of
zfit are introduced, and its extension to data analysis, especially in the
context of HEP experiments, is discussed.
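The abstract does not show zfit's API, so the following is a generic NumPy/SciPy sketch of the workflow such a toolkit automates, building a probabilistic model and minimizing an unbinned negative log-likelihood; it is not zfit code, and the toy Gaussian sample and parameter names are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=5000)  # toy "measured" sample

def nll(params):
    """Unbinned negative log-likelihood of a Gaussian model."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf                 # keep the minimizer in-bounds
    return -np.sum(norm.logpdf(data, loc=mu, scale=sigma))

# model building + loss + minimizer, the three stages a fitting
# framework like zfit wraps behind a high-level API
result = minimize(nll, x0=[0.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = result.x
```

A framework built on TensorFlow can additionally compile the loss as a computation graph and evaluate it on a GPU, which is the transparent CPU/GPU usage the abstract refers to.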
Scalable Peer-to-Peer Indexing with Constant State
We present a distributed indexing scheme for peer-to-peer networks. Past work on distributed indexing traded off fast search times against non-constant-degree topologies or network-unfriendly behavior such as flooding. In contrast, the scheme we present optimizes all three of these performance measures: it provides logarithmic-round searches while maintaining connections to a fixed number of peers and avoiding network flooding. In comparison to the well-known Chord scheme, we provide competitive constant factors. Finally, we observe that arbitrary linear speedups are possible, and we discuss both a general brute-force approach and specific economical optimizations.
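The abstract does not describe the construction itself, but the combination it claims, constant degree with logarithmically many routing hops, is classically achieved by de Bruijn graph routing, sketched below as a generic illustration (not necessarily the authors' scheme): each of the $2^k$ nodes keeps exactly two out-neighbors, and routing shifts one bit of the target id into the current id per hop.

```python
# De Bruijn routing over 2^k node ids: node `v` links only to
# (2*v) mod 2^k and (2*v + 1) mod 2^k (constant degree 2), yet any
# target is reached in at most k = log2(N) hops.
def route(src: int, dst: int, k: int) -> list[int]:
    mask = (1 << k) - 1
    path = [src]
    cur = src
    for i in range(k - 1, -1, -1):        # target bits, MSB first
        bit = (dst >> i) & 1
        cur = ((cur << 1) | bit) & mask   # hop to one of the two neighbors
        path.append(cur)
    return path

path = route(src=5, dst=12, k=4)   # 16-node network: 4 hops, degree 2
```

After $k$ shifts the current id consists entirely of the target's bits, so the path always terminates at `dst` regardless of where it started, which is how constant state per peer coexists with logarithmic search.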
