A Bayesian Approach to the Partitioning of Workflows
When partitioning workflows in realistic scenarios, knowledge of the
processing units' characteristics is often vague or missing. A naive approach to
this issue is to perform many controlled experiments for different workloads,
each consisting of multiple trials, in order to estimate the mean and
variance of each specific workload. Since this controlled experimental approach
can be quite costly in terms of time and resources, we propose a variant of the
Gibbs Sampling algorithm that uses a sequence of Bayesian inference updates to
estimate the processing characteristics of the processing units. Using the
inferred characteristics of the processing units, we are able to determine the
best way to split a workflow for parallel processing with the lowest expected
completion time and the least variance.
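The core idea of estimating a workload's mean and variance by Bayesian updating can be illustrated with a minimal Gibbs sampler for a Normal model with conjugate priors. This is a generic sketch, not the paper's algorithm: the hyperparameters, the `gibbs_normal` function, and the sample timing data are all hypothetical.

```python
import random

def gibbs_normal(data, iters=5000, burn=1000, seed=0):
    """Gibbs sampler for the mean and variance of a workload's
    processing time under a Normal likelihood with conjugate priors:
        mu     ~ Normal(mu0, tau0^2)
        sigma2 ~ InvGamma(a0, b0)
    Alternates draws from the two full conditionals.
    Hyperparameters below are weakly informative placeholders."""
    rng = random.Random(seed)
    n = len(data)
    ybar = sum(data) / n
    mu0, tau0_sq = 0.0, 100.0     # vague prior on the mean
    a0, b0 = 2.0, 1.0             # vague prior on the variance
    mu, sigma2 = ybar, 1.0        # initial state of the chain
    mus, sig2s = [], []
    for t in range(iters):
        # mu | sigma2, data  --  Normal full conditional
        prec = 1.0 / tau0_sq + n / sigma2
        mean = (mu0 / tau0_sq + n * ybar / sigma2) / prec
        mu = rng.gauss(mean, (1.0 / prec) ** 0.5)
        # sigma2 | mu, data  --  Inverse-Gamma full conditional
        a = a0 + n / 2.0
        b = b0 + 0.5 * sum((y - mu) ** 2 for y in data)
        sigma2 = 1.0 / rng.gammavariate(a, 1.0 / b)
        if t >= burn:  # discard burn-in samples
            mus.append(mu)
            sig2s.append(sigma2)
    # posterior means as point estimates
    return sum(mus) / len(mus), sum(sig2s) / len(sig2s)

# Hypothetical processing times (seconds) for one workload on one unit
times = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2]
mu_hat, var_hat = gibbs_normal(times)
```

In such a scheme, each new observation tightens the posterior over a unit's processing characteristics, so the estimates improve as the system runs rather than requiring a costly up-front experimental campaign.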