A Call-by-Need Strategy for Higher-Order Functional-Logic Programming
We present an approach to truly higher-order functional-logic programming based on higher-order narrowing. Roughly speaking, we model a higher-order functional core language by higher-order rewriting and extend it with logic variables. For the integration of logic programs, conditional rules are supported. For solving goals in this framework, we present a complete calculus for higher-order conditional narrowing. We develop several refinements that utilize the determinism of functional programs. These refinements can be combined into a narrowing strategy which generalizes call-by-need as in functional programming, where dedicated higher-order methods are used only for fully higher-order goals. Furthermore, we propose an implementation model for this narrowing strategy which delays computations until they are needed.
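The core idea behind narrowing, solving equational goals over rewrite rules by instantiating logic variables, can be illustrated in a simplified first-order setting. The following is a minimal Python sketch of naive first-order narrowing over Peano addition; the term encoding, rule set, and depth bound are our own illustration, not the paper's calculus, which is higher-order, supports conditional rules, and uses a call-by-need strategy rather than this naive root-narrowing search.

```python
# Terms are tuples: ("var", name) or ("con", symbol, args).
def var(v): return ("var", v)
def con(c, *ts): return ("con", c, ts)

def apply(s, t):
    """Apply substitution s to term t, chasing variable bindings."""
    if t[0] == "var":
        return apply(s, s[t[1]]) if t[1] in s else t
    return ("con", t[1], tuple(apply(s, u) for u in t[2]))

def occurs(v, t):
    if t[0] == "var":
        return t[1] == v
    return any(occurs(v, u) for u in t[2])

def unify(l, r, s):
    """Extend substitution s so that l and r become equal, or return None."""
    l, r = apply(s, l), apply(s, r)
    if l == r:
        return s
    if l[0] == "var":
        return None if occurs(l[1], r) else {**s, l[1]: r}
    if r[0] == "var":
        return unify(r, l, s)
    if l[1] != r[1] or len(l[2]) != len(r[2]):
        return None
    for a, b in zip(l[2], r[2]):
        s = unify(a, b, s)
        if s is None:
            return None
    return s

# Rewrite rules for Peano addition: add(z, y) -> y; add(s(x), y) -> s(add(x, y)).
RULES = [
    (con("add", con("z"), var("y")), var("y")),
    (con("add", con("s", var("x")), var("y")),
     con("s", con("add", var("x"), var("y")))),
]

def rename(t, i):
    """Give rule variables fresh names for narrowing step i."""
    if t[0] == "var":
        return ("var", "%s#%d" % (t[1], i))
    return ("con", t[1], tuple(rename(u, i) for u in t[2]))

def is_defined(t):
    return t[0] == "con" and t[1] == "add"   # "add" is the only defined symbol

def solve(goals, s, fuel, fresh=0):
    """Enumerate substitutions solving a list of (lhs, rhs) equations."""
    if not goals:
        yield s
        return
    if fuel == 0:          # depth bound: the narrowing search need not terminate
        return
    (l, r), rest = goals[0], goals[1:]
    l, r = apply(s, l), apply(s, r)
    if is_defined(r) and not is_defined(l):
        l, r = r, l
    if is_defined(l):      # narrow at the root with every applicable rule
        for lhs, rhs in RULES:
            s1 = unify(l, rename(lhs, fresh), s)
            if s1 is not None:
                yield from solve([(rename(rhs, fresh), r)] + rest,
                                 s1, fuel - 1, fresh + 1)
    elif l[0] == "var" or r[0] == "var":
        s1 = unify(l, r, s)
        if s1 is not None:
            yield from solve(rest, s1, fuel, fresh)
    elif l[1] == r[1] and len(l[2]) == len(r[2]):   # decompose constructors
        yield from solve(list(zip(l[2], r[2])) + rest, s, fuel, fresh)
    # otherwise: constructor clash, this branch has no solutions

# Solve the goal add(X, s(z)) == s(s(z)); narrowing finds X = s(z).
goal = (con("add", var("X"), con("s", con("z"))),
        con("s", con("s", con("z"))))
sols = list(solve([goal], {}, fuel=20))
print(apply(sols[0], var("X")))   # -> ('con', 's', (('con', 'z', ()),))
```

Note how the first rule's attempt (binding X to z) fails against the right-hand side and the search backtracks; the refinements the abstract describes prune exactly this kind of wasted work by exploiting the determinism of functional rules.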
Aspects of Qualitative Consciousness: A Computer Science Perspective
John Searle [Sear84] has characterised the domain of artificial intelligence (AI) by distinguishing between weak AI, according to which computers are useful tools for studying the mind, and strong AI, according to which mind and programs are equated such that computers executing programs actually possess minds. This dissertation explores a third alternative, namely the prospects and promise of mild AI, according to which a suitable computer is capable of possessing species of mentality that may differ from or be weaker than ordinary human mentality, but qualify as “mentality” nonetheless.
The approach adopted explores whether mind can be replicated, as opposed to merely simulated, in digital machines. This requires a definition of mind in order to judge success. James Fetzer [Fetz90] has suggested that minds can be defined as sign-using systems in the sense of Charles Peirce’s semiotic (theory of signs) and, on this basis, argues convincingly against strong AI. Determining whether his negative conclusion applies to mild AI requires rejoining Fetzer’s analysis of the analogical argument for strong AI and redressing his account of the laws governing human beings and digital machines. This is tackled by focusing on the nature and form of the operational relationship between the physical machine and mind, and by suggesting some operational requirements for a minimal semiotic system, independently of any underlying physical implementation. This involves four steps.
Firstly, as a formal foundation, a characterisation of systems is developed in terms of the causal structure and ontological levels in the system, where an ontological level is individuated by the laws that are in effect. This is in contrast to levels of organisation, such as levels of software abstraction. This exploration suggests that, as a matter of natural law, a mediating level between the physical machine and mind is, or at least appears to be, necessary for producing forms of mentality. The lawful structure that appears to be required within this level and between levels is examined with respect to the prospects for implementing a semiotic system.
Secondly, how a system can operate in terms of semiotic processes based on a network of instantiated dispositions is explored. These processes are modelled as the temporal counterparts of state-transitions and stationary-representations, termed causal-flows and temporal-representations respectively. They highlight the varying interactive structure of temporal patterns of causal activity. For the purposes of replicating mind, preserving the causal-flow structure of mental processes emerges as an important requirement.
Thirdly, the system structure sufficient for generating consciousness is explored — a necessary condition for a cognitive semiotic system. This suggests a requirement relating to the causal accessibility of the contents of consciousness. This structuring is driven by the system’s need to signify reality by categorising aspects of that reality as operational entities upon which decisions can be made. Consciousness arises through the manner in which this signified reality is generated, making mind and consciousness the result of co-ordinated, occurrent, system-wide activity.
Fourthly, in a mathematical sense, brains and computers can be classified as types of numeric and symbolic systems, respectively. These systems are compared and conditions formulated under which they may give rise to equivalent ontological levels. Peirce’s triadic sign relation is analysed in terms of ontological levels and the results used to clarify the nature of the ground relation in machine forms of mentality.
According to the theorems developed, the introduction of a dispositional mediating level might effectively enable a suitable computer to replicate species of mentality. An important factor in determining whether a computer is suitable for this purpose is its performance capacity, and some estimates of this capacity are therefore calculated. It is shown how these requirements, along with a number of others, can help in the development of semiotic systems and variants, such as the iconic state machine of Igor Aleksander [Alek96].