The importance of algorithms is now recognized in all mathematical sciences, thanks to the development of computability and computational complexity theory in the 20th century. The basic understanding of computability theory developed in the nineteen thirties with the pioneering work of mathematicians like Gödel, Church, Turing and Post. Their work provided the mathematical basis for the study of algorithms as a formalized concept. The work of Hartmanis, Stearns, Karp, Cook and others in the nineteen sixties and seventies showed that the refinement of the theory to resource-bounded computations gave the means to explain the many intuitions concerning the complexity or hardness of algorithmic problems in a precise and rigorous framework.

The theory has its roots in the older questions of definability, provability and decidability in formal systems. The breakthrough in the nineteen thirties was the formalisation of the intuitive concept of algorithmic computability by Turing. In his famous 1936 paper, Turing presented a model of computation that was both mathematically rigorous and general enough to capture the possible actions that any human computer could carry out. Although the model was presented well before digital computers arrived on the scene, it has the generality of describing computations at the individual bit level, using very basic control commands. Computability and computational complexity theory are now firmly founded on the Turing machine paradigm and its ramifications in recursion theory. In this paper we will extend the Turing machine paradigm to include several key features of contemporary information processing systems.
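The bit-level generality of the model can be made concrete in a few lines: a tape, a head, and a finite transition table are all that is needed. The following sketch is purely illustrative (the function and rule names are our own, not from the paper); it simulates a one-tape machine whose finite control maps a (state, symbol) pair to a new state, a symbol to write, and a head move.

```python
# A minimal Turing machine simulator: tape, head, and a finite control
# given as a transition table. Illustrative sketch only; names are
# hypothetical. A rule maps (state, symbol) -> (state', write, move),
# echoing the "very basic control commands" of Turing's model.

def run_tm(rules, tape, state, blank="_", max_steps=10_000):
    cells = dict(enumerate(tape))  # sparse tape, extends in both directions
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in rules:  # no applicable rule: halt
            break
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example: a one-state machine that complements a binary string.
flip = {
    ("scan", "0"): ("scan", "1", "R"),
    ("scan", "1"): ("scan", "0", "R"),
}
print(run_tm(flip, "1011", "scan"))  # -> 0100
```

Despite its austerity, a table of such rules suffices, in principle, to express any algorithmic computation; that universality is what the extensions discussed in this paper build on.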