(Submitted on 25 Oct 2016 (v1), last revised 9 Dec 2016 (this version, v2))
Abstract: In this paper we develop operational calculus on programming spaces that generalizes existing approaches to automatic differentiation of computer programs and provides a rigorous framework for program analysis through calculus.
We present an abstract computing machine that models automatically differentiable computer programs. Computer programs are viewed as maps on a finite-dimensional vector space called the virtual memory space, which we extend by the tensor algebra of its dual to accommodate derivatives. The extended virtual memory is itself an algebra of programs, and its elements give the expansion of the original program as an infinite tensor series at the program's input values.
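As an illustrative sketch in standard Taylor notation (the paper's own operator notation may differ), the expansion of a program P at an input x with increment h reads

    P(x + h) = \sum_{n=0}^{\infty} \frac{1}{n!} \, \partial^{n} P(x) \, h^{\otimes n},

where the n-th coefficient \partial^{n} P(x) is the n-th derivative tensor of P at x, an element of the tensor-algebra extension described above.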
We define the operator of differentiation on programming spaces and implement higher-order derivatives, as well as a generalized shift operator, in terms of its powers. Operational calculus is used to prove properties of the defined operators. Several possible applications to computer science are presented, most notably trainable general tensor neural networks, which can provide a meaningful way of initializing neural networks and in some cases yield better-performing approximations of programs.
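To make the differentiation operator concrete, the following is a minimal first-order forward-mode sketch in Python; the Dual class and the example program are hypothetical illustrations, not the paper's construction:

    import math

    class Dual:
        """A value paired with the first coefficient of its tensor series."""
        def __init__(self, val, der=0.0):
            self.val = val   # P(x)
            self.der = der   # dP/dx

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.der + other.der)
        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # Leibniz rule: (fg)' = f'g + fg'
            return Dual(self.val * other.val,
                        self.der * other.val + self.val * other.der)
        __rmul__ = __mul__

    def d_sin(x):
        # Chain rule through sin
        return Dual(math.sin(x.val), math.cos(x.val) * x.der)

    def program(x):
        # A "program" viewed as a map on a 1-dimensional memory space
        return x * x + d_sin(x)

    x = Dual(2.0, 1.0)    # seed der = 1 to differentiate w.r.t. x
    y = program(x)
    print(y.val, y.der)   # P(2) and P'(2) = 2*2 + cos(2)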
Our approach offers a powerful tool for program analysis and approximation, as well as a unified approach to automatic differentiation that covers both forward and reverse modes of arbitrary order under a single operator. General tensor networks enable the generalization of existing state-of-the-art methods for analyzing neural networks to arbitrary computer programs.
From: Žiga Sajovic
[v1] Tue, 25 Oct 2016 00:45:10 GMT (32kb)
[v2] Fri, 9 Dec 2016 19:08:27 GMT (26kb)