Loss of precision due to the conservative nature of compile-time
dataflow analysis is a general problem and impacts a wide variety
of optimizations. We propose a limited form of runtime dataflow
analysis, called deferred dataflow analysis (DDFA), which
attempts to sharpen dataflow results by using control-flow
information that is available at runtime. The overheads of
runtime analysis are minimized by performing the bulk of the
analysis at compile-time and deferring only a summarized version
of the dataflow problem to runtime. Caching and reusing of
dataflow results reduces these overheads further.
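For illustration only, the following is a minimal sketch, in C with hypothetical names and an assumed encoding (not the framework's actual interface), of how deferral might look: the compiler summarizes per-edge transfer functions as (kill, gen) bit masks, and a small runtime routine composes them along the path actually taken, caching the resolved facts per call site.

    /* Hypothetical sketch of compile-time summarization plus runtime
     * resolution with caching; names and encoding are assumptions. */
    #include <stdint.h>
    #include <stdio.h>

    #define NUM_SITES 4

    /* Compile-time product: per-edge transfer functions summarized as
     * (kill, gen) bit masks over the dataflow facts of interest. */
    typedef struct { uint32_t kill, gen; } Summary;

    /* Runtime cache: resolved facts per call site, plus a "filled" flag. */
    static uint32_t cache[NUM_SITES];
    static int cached[NUM_SITES];

    /* Evaluate the summarized problem over the edges taken at runtime. */
    static uint32_t resolve(uint32_t facts, const Summary *path, int len) {
        for (int i = 0; i < len; i++)
            facts = (facts & ~path[i].kill) | path[i].gen;
        return facts;
    }

    /* Before a heavyweight operation, consult (or fill) the cache and
     * pick a cheaper variant when the sharpened facts allow it. */
    static void heavy_op(int site, uint32_t facts_in,
                         const Summary *path, int len) {
        if (!cached[site]) {
            cache[site] = resolve(facts_in, path, len);
            cached[site] = 1;
        }
        if (cache[site] & 0x1)          /* e.g., "data already local" */
            printf("site %d: specialized variant\n", site);
        else
            printf("site %d: general variant\n", site);
    }

    int main(void) {
        Summary path[] = { {0x0, 0x1}, {0x2, 0x0} };  /* edges actually taken */
        heavy_op(0, 0x0, path, 2);
        heavy_op(0, 0x0, path, 2);      /* second call hits the cache */
        return 0;
    }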
DDFA is an interprocedural framework and can handle arbitrary
control structures, including multi-way forks, recursion,
separately compiled functions and higher-order functions. It is
primarily targeted towards optimization of heavy-weight
operations such as communication calls, where one can expect
significant benefits from sharper dataflow analysis. We outline
how DDFA can be used to optimize different kinds of heavy-weight
operations such as bulk-prefetching on distributed systems and
dynamic linking in mobile programs. We prove that DDFA is safe
and that it yields better dataflow information than strictly
compile-time dataflow analysis. (Also cross-referenced as UMIACS-TR-98-46)