This paper proposes an adaptive neural-compilation framework to address the
problem of efficient program learning. Traditional code optimisation strategies
used in compilers are based on applying a pre-specified set of transformations
that make the code faster to execute without changing its semantics. In
contrast, our work involves adapting programs to make them more efficient while
considering correctness only on a target input distribution. Our approach is
inspired by the recent works on differentiable representations of programs. We
show that it is possible to compile programs written in a low-level language to
a differentiable representation. We also show how programs in this
representation can be optimised to make them efficient on a target distribution
of inputs. Experimental results demonstrate that our approach enables learning
specifically-tuned algorithms for given data distributions with a high success
rate.

Comment: Submitted to NIPS 2016; code and supplementary materials will be
available on the author's page.