Stability of Correction Procedure via Reconstruction With Summation-by-Parts Operators for Burgers' Equation Using a Polynomial Chaos Approach
In this paper, we consider Burgers' equation with uncertain boundary and
initial conditions. The polynomial chaos (PC) approach yields a hyperbolic
system of deterministic equations, which can be solved by several numerical
methods. Here, we apply the correction procedure via reconstruction (CPR) using
summation-by-parts operators. We focus especially on stability, which is proven
for CPR methods and the systems arising from the PC approach. Due to the use
of split forms, the major challenge is to construct entropy stable numerical
fluxes. For the first time, such numerical fluxes are constructed for all
systems resulting from the PC approach for Burgers' equation. In numerical
tests, we verify our results and also show the advantage of the proposed
ansatz using CPR methods. Moreover, one of the simulations, namely Burgers'
equation equipped with an initial shock, yields a striking observation: the
numerical solutions obtained with several methods (finite volume, finite
difference, CPR) differ significantly from one another. Through careful
investigation, we conclude that the reason is the system's high sensitivity
to varying dissipation. Furthermore, it should be stressed that the system
is not strictly hyperbolic with genuinely nonlinear or linearly degenerate
fields.
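As a concrete (and much simplified) illustration of the entropy-stability issue, the sketch below solves the deterministic scalar Burgers' equation with Tadmor's well-known entropy-conservative flux plus local Lax-Friedrichs dissipation. The fluxes constructed in the paper for the PC systems are more involved; all names and parameters here are illustrative.

```python
import numpy as np

def ec_flux(ul, ur):
    """Tadmor's entropy-conservative flux for scalar Burgers' equation."""
    return (ul**2 + ul*ur + ur**2) / 6.0

def es_flux(ul, ur):
    """Entropy-stable flux: entropy-conservative part plus
    local Lax-Friedrichs dissipation."""
    lam = np.maximum(np.abs(ul), np.abs(ur))
    return ec_flux(ul, ur) - 0.5 * lam * (ur - ul)

def step(u, dx, dt):
    """One forward-Euler finite-volume step on a periodic grid."""
    ul, ur = u, np.roll(u, -1)            # states at each interface i+1/2
    f = es_flux(ul, ur)
    return u - dt/dx * (f - np.roll(f, 1))

# Smooth initial data that steepens into a shock.
n = 200
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.sin(2*np.pi*x)
dx, dt = 1.0/n, 0.002                     # CFL = 0.4 for max|u| = 1
e0 = np.sum(u**2) * dx                    # discrete entropy (energy)
for _ in range(200):
    u = step(u, dx, dt)
assert np.sum(u**2) * dx < e0             # entropy dissipated at the shock
```

The dissipation term is what distinguishes an entropy-stable flux from a merely entropy-conservative one; without it, the discrete entropy would be (approximately) conserved and spurious oscillations would appear at the shock.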
A posteriori error analysis and adaptive non-intrusive numerical schemes for systems of random conservation laws
In this article we consider one-dimensional random systems of hyperbolic
conservation laws. We first establish existence and uniqueness of random
entropy admissible solutions for initial value problems of conservation laws
which involve random initial data and random flux functions. Based on these
results we present an a posteriori error analysis for a numerical approximation
of the random entropy admissible solution. For the stochastic discretization,
we consider a non-intrusive approach, the Stochastic Collocation method. The
spatio-temporal discretization relies on the Runge-Kutta Discontinuous
Galerkin method. We derive the a posteriori estimator using continuous
reconstructions of the discrete solution. Combined with the relative entropy
stability framework this yields computable error bounds for the entire
space-stochastic discretization error. The estimator admits a splitting into a
stochastic and a deterministic (space-time) part, allowing for a novel
residual-based space-stochastic adaptive mesh refinement algorithm. We conclude
with various numerical examples investigating the scaling properties of the
residuals and illustrating the efficiency of the proposed adaptive algorithm
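The non-intrusive collocation idea can be sketched in a few lines: run a deterministic solver at quadrature nodes of the random input, then combine the samples with the quadrature weights. The toy "solver" below (an exact linear advection solution with an uncertain wave speed) and all parameter names are placeholders for an actual RKDG solve.

```python
import numpy as np

def deterministic_solve(a, x, t):
    """Toy deterministic solver: exact solution of u_t + a*u_x = 0 with
    initial data sin(2*pi*x). Stands in for a full RKDG solve."""
    return np.sin(2*np.pi*(x - a*t))

# Uncertain wave speed a(xi) = 1 + 0.1*xi with xi ~ U(-1, 1):
# Gauss-Legendre nodes/weights (weights sum to 2, so halve them).
nodes, weights = np.polynomial.legendre.leggauss(8)
weights = weights / 2.0

x = np.linspace(0.0, 1.0, 100, endpoint=False)
t = 0.25
samples = np.array([deterministic_solve(1.0 + 0.1*xi, x, t) for xi in nodes])

mean = weights @ samples                  # E[u](x, t) by quadrature
var = weights @ samples**2 - mean**2      # Var[u](x, t) by quadrature
assert np.all(var >= -1e-12)
```

The solver is treated as a black box at each node, which is exactly what makes the approach non-intrusive: no reformulation of the underlying scheme is required, only repeated deterministic solves.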
An Economist's Guide to the Kalman Filter
Almost since its appearance, the Kalman Filter (KF) has been successfully used in control engineering. Unfortunately, most of its important results have been published in engineering journals, in the language, notation, and style of engineers. In this paper, we present the KF in a way that is accessible to economists, using information theory and Bayesian inference.
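For readers who prefer code to notation, the KF's predict/update recursion can be sketched as follows. The model matrices and the scalar tracking example are illustrative, not taken from the paper.

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of the standard (linear-Gaussian) KF."""
    # Predict: propagate state estimate and its covariance.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct with the new observation z.
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy example: estimate a constant scalar level from noisy observations.
rng = np.random.default_rng(0)
F = np.array([[1.0]]); Q = np.array([[1e-4]])   # near-constant state
H = np.array([[1.0]]); R = np.array([[0.25]])   # obs. noise std = 0.5
x, P = np.array([0.0]), np.array([[1.0]])
true_level = 1.0
for _ in range(200):
    z = np.array([true_level + 0.5*rng.standard_normal()])
    x, P = kalman_step(x, P, z, F, Q, H, R)
assert abs(x[0] - true_level) < 0.3       # estimate has converged
```

The Bayesian reading is direct: the predict step is the prior for the next period, and the update step is conditioning that prior on the new observation.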
Injecting Uncertainty in Graphs for Identity Obfuscation
Data collected nowadays by social-networking applications create fascinating
opportunities for building novel services, as well as expanding our
understanding about social structures and their dynamics. Unfortunately,
publishing social-network graphs is considered an ill-advised practice due to
privacy concerns. To alleviate this problem, several anonymization methods have
been proposed, aiming at reducing the risk of a privacy breach on the published
data, while still allowing them to be analyzed and relevant conclusions drawn. In
this paper we introduce a new anonymization approach that is based on injecting
uncertainty in social graphs and publishing the resulting uncertain graphs.
While existing approaches obfuscate graph data by adding or removing edges
entirely, we propose using a finer-grained perturbation that adds or removes
edges partially: this way we can achieve the same desired level of obfuscation
with smaller changes in the data, thus maintaining higher utility. Our
experiments on real-world networks confirm that at the same level of identity
obfuscation our method provides higher usefulness than existing randomized
methods that publish standard graphs.
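A minimal sketch of the partial-perturbation idea: publish an existence probability for every node pair instead of a hard edge, so the published object is an uncertain graph. The uniform noise model and parameter names below are placeholders for the paper's calibrated obfuscation scheme.

```python
import random

def obfuscate(nodes, edges, sigma=0.3, seed=0):
    """Publish an uncertain graph: every node pair gets an edge-existence
    probability rather than a hard 0/1 edge. Uniform noise bounded by
    sigma is a simplified stand-in for calibrated uncertainty injection."""
    rng = random.Random(seed)
    edge_set = {frozenset(e) for e in edges}
    uncertain = {}
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            r = rng.uniform(0.0, sigma)   # partial, not all-or-nothing
            # Existing edges keep probability near 1, non-edges near 0,
            # so less distortion is needed for the same ambiguity.
            p = 1.0 - r if frozenset((u, v)) in edge_set else r
            uncertain[(u, v)] = p
    return uncertain

g = obfuscate(["a", "b", "c", "d"], [("a", "b"), ("b", "c")])
assert len(g) == 6                        # one probability per node pair
assert all(0.0 <= p <= 1.0 for p in g.values())
```

Because each pair is perturbed only partially, expected graph statistics stay close to the original, which is the utility advantage claimed over methods that flip edges entirely.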
Quantitative analysis of the leakage of confidential data
Basic information theory is used to analyse the amount of confidential information which may be leaked by programs written in a very simple imperative language. In particular, a detailed analysis is given of the possible leakage due to equality tests and if statements. The analysis is presented as a set of syntax-directed inference rules and can readily be automated.
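The equality-test case can be made concrete: under a uniform prior on the secret, the information an attacker gains from observing the outcome of one equality test is the Shannon entropy of that outcome. The small sketch below is an illustration of this standard calculation, not the paper's inference rules.

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p*log2(p) - (1 - p)*log2(1 - p)

def equality_leakage(n):
    """Bits leaked by one equality test (e.g. `if secret == guess`)
    on a secret uniform over n values: the entropy of the outcome."""
    return h(1.0 / n)

# A test against a 16-value uniform secret leaks h(1/16) ~ 0.34 bits,
# far less than the 4 bits needed to identify the secret outright.
assert equality_leakage(2) == 1.0
assert 0.33 < equality_leakage(16) < 0.35
```

This already shows the qualitative point of the analysis: a single equality test on a large secret space leaks very little, but repeated tests compound.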