Analysis of a probabilistic system often requires learning the joint
probability distribution of its random variables. Computing the exact
distribution usually requires an exhaustive, precise analysis of all
executions of the system. To avoid the high computational cost of such an
exhaustive search,
statistical analysis has been studied to efficiently obtain approximate
estimates by analyzing only a small but representative subset of the system's
behavior. In this paper we propose a hybrid statistical estimation method that
combines precise and statistical analyses to estimate mutual information,
Shannon entropy, and conditional entropy, together with their confidence
intervals. We show how to combine analyses performed with different levels of
accuracy on different components of a discrete system to obtain an estimate for
the whole system. The new method performs weighted statistical analysis with
different
sample sizes over different components and dynamically finds their optimal
sample sizes. Moreover, it can reduce sample sizes by using prior knowledge
about systems and a new abstraction-then-sampling technique based on
qualitative analysis. To apply the method to the source code of a system, we
show how to decompose the code into components and how to determine the
analysis method for each component, giving an overview of how these techniques
are implemented in the HyLeak tool. We demonstrate with case studies that the
new method
outperforms the state of the art in quantifying information leakage.
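
As a concrete illustration of the estimation targets, the sketch below (a
minimal Python example, not taken from the HyLeak implementation; all names
are illustrative) computes plug-in estimates of Shannon entropy, conditional
entropy, and mutual information from sampled (x, y) pairs, and derives a rough
confidence interval by the percentile bootstrap. The hybrid method refines
such purely statistical estimates; this sketch only fixes the quantities being
estimated.

import math
import random
from collections import Counter

def entropy(counts, n):
    # Plug-in Shannon entropy (in bits) from a frequency table.
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def conditional_entropy(pairs):
    # Plug-in H(X|Y) = H(X,Y) - H(Y) from sampled (x, y) pairs.
    n = len(pairs)
    return entropy(Counter(pairs), n) - entropy(Counter(y for _, y in pairs), n)

def mutual_information(pairs):
    # Plug-in I(X;Y) = H(X) + H(Y) - H(X,Y) from sampled (x, y) pairs.
    n = len(pairs)
    h_x = entropy(Counter(x for x, _ in pairs), n)
    h_y = entropy(Counter(y for _, y in pairs), n)
    return h_x + h_y - entropy(Counter(pairs), n)

def bootstrap_ci(pairs, estimator, level=0.95, reps=1000):
    # Percentile-bootstrap confidence interval for any estimator that
    # maps a list of samples to a real number.
    n = len(pairs)
    stats = sorted(estimator(random.choices(pairs, k=n)) for _ in range(reps))
    lower = stats[int(reps * (1 - level) / 2)]
    upper = stats[int(reps * (1 + level) / 2) - 1]
    return lower, upper

For example, bootstrap_ci(samples, mutual_information) brackets the plug-in
estimate with a generic resampling interval; the confidence intervals in the
paper are instead tied to the variance of the hybrid estimator itself.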
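The component-wise weighting can be pictured as stratified sampling. The
sketch below (again illustrative Python, not HyLeak's algorithm) assumes
components with known execution probabilities, e.g. obtained by precise
analysis, and rough per-component standard-deviation estimates; it allocates a
total sampling budget in Neyman-style proportion to probability times spread
and recombines the per-component estimates weighted by their probabilities.
The paper's optimal sample sizes are instead found dynamically from the
variance of the combined estimate.

def allocate_samples(budget, probs, stddevs):
    # Split a total sample budget across components, giving each
    # component a share proportional to its execution probability
    # times its estimated standard deviation (Neyman-style allocation).
    weights = [p * s for p, s in zip(probs, stddevs)]
    total = sum(weights)
    return [max(1, round(budget * w / total)) for w in weights]

def combine_estimates(probs, estimates):
    # Recombine per-component estimates, weighted by the probability
    # that the system executes each component.
    return sum(p * e for p, e in zip(probs, estimates))

For instance, allocate_samples(10000, [0.7, 0.2, 0.1], [0.5, 2.0, 0.1])
assigns the largest share to the second component, which is both reasonably
likely and highly variable, while the near-deterministic third component
receives very few samples.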