2,496 research outputs found
An introduction to Multitrace Formulations and Associated Domain Decomposition Solvers
Multitrace formulations (MTFs) are based on a decomposition of the problem
domain into subdomains, and thus domain decomposition solvers are of interest.
The fully rigorous mathematical MTF can, however, be daunting for the
non-specialist. We introduce in this paper MTFs on a simple model problem using
concepts familiar to researchers in domain decomposition. This gives us a
new understanding of MTFs and a natural block Jacobi iteration, for which we
determine optimal relaxation parameters. We then show how iterative multitrace
formulation solvers are related to a well-known domain decomposition method
called the optimal Schwarz method: a method which uses Dirichlet-to-Neumann maps
in the transmission conditions. We finally show that the insight gained from the
simple model problem leads to remarkable identities for Calderón projectors and
related operators, and that the convergence results and the optimal choice of the
relaxation parameter we obtained are independent of the geometry, the space
dimension of the problem, and the precise form of the spatial elliptic operator,
as for optimal Schwarz methods. We illustrate our analysis with numerical
experiments.
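As a concrete illustration of a relaxed block Jacobi iteration (the general technique the abstract builds on, not the paper's MTF-specific solver), here is a minimal Python sketch; the matrix, the two-block splitting, and the value theta = 0.5 are hypothetical placeholders, whereas the paper derives the optimal relaxation parameter for the multitrace setting.

    import numpy as np

    # Hypothetical test problem: 1D Laplacian on n points, split into two blocks.
    n = 20
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    blocks = [slice(0, n // 2), slice(n // 2, n)]
    theta = 0.5  # relaxation parameter; the paper derives an optimal choice

    x = np.zeros(n)
    for it in range(100):
        x_new = np.empty_like(x)
        for blk in blocks:
            # Local solve: A_ii x_i = b_i - sum over j != i of A_ij x_j
            rhs = b[blk] - A[blk, :] @ x + A[blk, blk] @ x[blk]
            x_new[blk] = np.linalg.solve(A[blk, blk], rhs)
        x = (1 - theta) * x + theta * x_new  # relaxed block Jacobi update

    print("residual:", np.linalg.norm(b - A @ x))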
Correlation kernels for sums and products of random matrices
Let X be a random matrix whose squared singular value density is a
polynomial ensemble. We derive double contour integral formulas for the
correlation kernels of the squared singular values of GX and TX, where
G is a complex Ginibre matrix and T is a truncated unitary matrix. We also
consider the product of X and several complex Ginibre/truncated unitary
matrices. As an application, we derive the precise condition for the squared
singular values of the product of several truncated unitary matrices to follow
a polynomial ensemble. We also consider the sum H = M + H0, where M is a GUE
matrix and H0 is a random matrix whose eigenvalue density is a polynomial
ensemble. We show that the eigenvalues of H follow a polynomial ensemble
whose correlation kernel can be expressed as a double contour integral. As an
application, we point out a connection to the two-matrix model.
Comment: 33 pages; some changes suggested by the referee are made and some
references are added.
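For orientation, "polynomial ensemble" has a standard meaning in the random matrix literature; the following definition is supplied here for context and is not quoted from the paper. A polynomial ensemble is a joint density on points x_1, ..., x_n of the form

    P(x_1, \dots, x_n) = \frac{1}{Z_n} \prod_{1 \le i < j \le n} (x_j - x_i) \, \det\left[ f_k(x_j) \right]_{j,k=1}^{n},

where Z_n is a normalization constant and f_1, ..., f_n are given functions.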
PopCORN: Hunting down the differences between binary population synthesis codes
Binary population synthesis (BPS) modelling is a very effective tool to study
the evolution and properties of close binary systems. The uncertainty in the
parameters of the model and their effect on a population can be tested in a
statistical way, which then leads to a deeper understanding of the underlying
physical processes involved. To understand the predictive power of BPS codes,
we study the similarities and differences in the predicted populations of four
different BPS codes for low- and intermediate-mass binaries. We investigate
whether the differences are caused by different assumptions made in the BPS
codes or by numerical effects. To simplify the complex problem of comparing BPS
codes, we equalise the inherent assumptions as much as possible. We find that
the simulated populations are similar between the codes. Regarding the
population of binaries with one white dwarf (WD), there is very good agreement between the
physical characteristics, the evolutionary channels that lead to the birth of
these systems, and their birthrates. Regarding the double WD population, there
is a good agreement on which evolutionary channels exist to create double WDs
and a rough agreement on the characteristics of the double WD population.
Regarding which progenitor systems lead to a single and double WD system and
which systems do not, the four codes agree well. Most importantly, we find that
for these two populations, the differences in the predictions from the four
codes are not due to numerical differences, but because of different inherent
assumptions. We identify critical assumptions for BPS studies that need to be
studied in more detail.
Comment: 13 pages plus a 21-page appendix, 35 figures; accepted for publication
in A&A. Minor changes to match the published version, most importantly the added
link to the website http://www.astro.ru.nl/~silviato/popcorn for more detailed
figures and information.
Semantic validation of affinity constrained service function chain requests
Network Function Virtualization (NFV) has been proposed as a paradigm to increase the cost-efficiency, flexibility, and innovation in network service provisioning. By leveraging IT virtualization techniques in combination with programmable networks, NFV is able to decouple network functionality from the physical devices on which it is deployed. This opens up new business opportunities for both Infrastructure Providers (InPs) and Service Providers (SPs), where an SP can request to deploy a chain of Virtual Network Functions (VNFs) on top of which its service can run. However, current NFV approaches lack the possibility for SPs to define location requirements and constraints on the mapping of virtual functions and paths onto physical hosts and links. Nevertheless, many scenarios can be envisioned in which the SP would like to attach placement constraints for efficiency, resilience, legislative, privacy, or economic reasons. Therefore, we propose a set of affinity and anti-affinity constraints, which SPs can use to define such placement restrictions. This newfound ability to add constraints to Service Function Chain (SFC) requests also introduces the risk that SFCs with conflicting constraints are requested or automatically generated. We therefore propose a framework that allows the InP to check the validity of a set of constraints and provide feedback to the SP. To achieve this, the SFC request and relevant information on the physical topology are modeled as an ontology whose consistency can be checked using a semantic reasoner. Enabling semantic validation of SFC requests prevents inconsistent requests from being passed on to the embedding algorithm.
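To make the notion of conflicting constraints concrete, here is a minimal Python sketch of the most direct kind of conflict check; it is our illustration under simplifying assumptions, not the paper's ontology-based framework, and all VNF names are hypothetical.

    def find_conflicts(affinity, anti_affinity):
        """Return VNF pairs that appear in both constraint sets (order-insensitive)."""
        as_sets = lambda pairs: {frozenset(p) for p in pairs}
        return as_sets(affinity) & as_sets(anti_affinity)

    # Hypothetical SFC request: the firewall and the IDS must share a host
    # (affinity), yet the same pair is also forbidden from sharing one.
    affinity = [("firewall", "ids"), ("ids", "cache")]
    anti_affinity = [("ids", "firewall")]

    for pair in find_conflicts(affinity, anti_affinity):
        print("conflicting constraints on:", tuple(pair))

A semantic reasoner over an ontology, as proposed in the paper, goes further than such a pairwise check: it can also expose indirect inconsistencies that only emerge when constraints are combined with one another and with the physical topology.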