
    A Swiss Pocket Knife for Computability

    This research is about operational- and complexity-oriented aspects of the classical foundations of computability theory. The approach is to re-examine some classical theorems and constructions, but with new criteria for success that are natural from a programming language perspective. Three cornerstones of computability theory are the S-m-n theorem; Turing's "universal machine"; and Kleene's second recursion theorem. In today's programming language parlance these are, respectively, partial evaluation, self-interpretation, and reflection. In retrospect it is fascinating that Kleene's 1938 proof is constructive and, in essence, builds a self-reproducing program. Computability theory originated in the 1930s, long before the invention of computers and programs. Its emphasis was on delimiting the boundaries of computability. Some milestones include 1936 (Turing), 1938 (Kleene), 1967 (isomorphism of programming languages), 1985 (partial evaluation), 1989 (theory implementation), 1993 (efficient self-interpretation) and 2006 (term register machines). The "Swiss pocket knife" of the title is a programming language that allows efficient computer implementation of all three computability cornerstones, emphasising the third: Kleene's second recursion theorem. We describe experiments with a tree-based computational model aiming for both fast program generation and fast execution of the generated programs.
    Comment: In Proceedings Festschrift for Dave Schmidt, arXiv:1309.455
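    To make the third cornerstone concrete, here is a minimal Python sketch (illustrative only; the paper itself works with a tree-based language, not Python) of the constructive core of Kleene's second recursion theorem: a program that computes on its own text. The simplest instance is a quine, a self-reproducing program.

        # A quine: the two lines below, run as a program on their own,
        # print their own source text exactly. Kleene's 1938 construction
        # generalises this trick so that any computable transformation can
        # be applied to the program's own text.
        s = 's = %r\nprint(s %% s)'
        print(s % s)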

    On the Semantics of Intensionality and Intensional Recursion

    Intensionality is a phenomenon that occurs in logic and computation. In the most general sense, a function is intensional if it operates at a level finer than (extensional) equality. This is a familiar setting for computer scientists, who often study different programs or processes that are interchangeable, i.e. extensionally equal, even though they are not implemented in the same way and are thus intensionally distinct. Concomitant with intensionality is the phenomenon of intensional recursion, which refers to the ability of a program to have access to its own code. In computability theory, intensional recursion is enabled by Kleene's Second Recursion Theorem. This thesis is concerned with the crafting of a logical toolkit through which these phenomena can be studied. Our main contribution is a framework in which mathematical and computational constructions can be considered either extensionally, i.e. as abstract values, or intensionally, i.e. as fine-grained descriptions of their construction. Once this is achieved, it may be used to analyse intensional recursion.
    Comment: DPhil thesis, Department of Computer Science & St John's College, University of Oxford
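    As a minimal illustration of the distinction the abstract draws (an assumed example, not the thesis's own formalism), the two Python functions below are extensionally equal, agreeing on every input, yet intensionally distinct, since their descriptions differ. Run it as a script so that inspect can retrieve the source.

        import inspect

        def sort_a(xs):
            # Insertion sort: a quadratic-time construction.
            out = []
            for x in xs:
                i = 0
                while i < len(out) and out[i] <= x:
                    i += 1
                out.insert(i, x)
            return out

        def sort_b(xs):
            # Delegates to the built-in sort (Timsort).
            return sorted(xs)

        # Extensionally equal: the same value at every tested input.
        assert sort_a([3, 1, 2]) == sort_b([3, 1, 2])

        # Intensionally distinct: their source code differs.
        print(inspect.getsource(sort_a) == inspect.getsource(sort_b))  # False

    Intensional recursion goes further: it lets a program access its own description in this fine-grained sense, which is exactly what Kleene's Second Recursion Theorem guarantees is possible.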

    Parafiction as Matter and Method

    The thesis examines the different ways in which artists have engaged with parafiction in the twentieth and twenty-first centuries. Parafiction, a fiction experienced as fact, has become an important mode of practice within contemporary art, a shift concurrent with the exponential growth of digital technology. The term contemporary art is applied here in an expanded sense, to acknowledge the effect of digital processes and matter on art and to include practices that use technology as form, as subject, or as a combination of the two. Parafiction appears in various materialities, both digital and physical, and could be described as having neomateriality. Parafiction as Matter and Method inevitably locates the research within the context of the digital. The research investigates how the usage of parafiction has changed since 1989 with the rapid advancement of technology and widespread access to the internet. Changes in the social and political landscape have also affected the function of parafiction in contemporary society; these conditions are not necessarily time-bound or linear. Drawing upon and extending Carrie Lambert-Beatty's concept of parafictions (2009), the research is rooted in art history and contemporary art for its theoretical frameworks, while utilising an interdisciplinary approach that combines artistic practice, digital cultures, media studies, performance art, philosophy and politics. By synthesising this broad range of fields, the research takes an original approach, aiming to consider the topic at a planetary scale within the bounds of the possible. As an overarching method, the research applies fiction itself to produce new knowledge. It uses primary and secondary methods, including the production of a body of artwork and diagrammatic reasoning, to augment the theoretical proposal. The art practice is employed to synthesise theory with practice and to apply the knowledge learnt outside of its text-based constraints; the practice appears as interludes interspersed throughout the thesis, producing a duo-linear narrative with the aim of the thesis becoming an artwork in its own right. Primary data collection included interviews with relevant artists, speaking at and attending international conferences, peer-reviewed publication, interaction with academic peers, and research visits to exhibitions. The thesis evaluates how parafiction renegotiates physical and digital spatio-temporal parameters to offer alternatives for pasts, presents and futures, for both human and nonhuman users of those spaces. As parafiction becomes matter, it has the ability to converge the digital and the physical and to extend the lives of artworks beyond their initial existence. It is argued that fictioning methods have the most impact within contemporary art in its most expanded sense. The research advocates for parafiction as a vital method, found within artistic practice in the twentieth and twenty-first centuries, which produces new information and perspectives. The thesis uniquely concludes that parafiction is matter: material that intersects and interacts with the modularity of digital technologies. Significantly, the research finds that parafiction acts as an additional module that connects physical and digital spatio-temporal parameters, with alternative potential for pasts, presents, and futures.

    An analyser and generator for Irish inflectional morphology using finite-state transducers

    Computational morphology is an important step in natural language processing. Finite-state techniques have been applied successfully in computational phonology and morphology to many of the world's major languages. Celtic languages, such as Modern Irish, present unique and challenging morphological features that to date have not been addressed using finite-state technology. This thesis presents a finite-state morphology of Irish developed using the Xerox Finite-State Tools; to the best of our knowledge, no such resource previously existed. The computational model, implemented as a finite-state transducer, encodes the inflectional morphology of nouns, adjectives, and verbs. Other parts of speech are also included in the interests of language coverage. The implementation is a strictly lexicalised design: the morphotactics of stems and affixes are encoded in the lexicon using replace-rule triggers. Word mutations are then implemented as a series of replace rules written as regular expressions. Both components are compiled into finite-state transducers and then combined to produce a single two-level morphological transducer for the language. A major advantage of finite-state implementations of morphology is their inherent bi-directionality: the same system is used for both analysis and generation of word forms in the language. This resource can be used as a component in parsing and generation for natural language processing (NLP) applications such as spelling checkers/correctors, stemmers and text-to-speech synthesisers. It can also be used for tokenising text, lemmatising, and as an input to automatic part-of-speech tagging of a corpus. The system is designed for broad coverage of the language, and this is evaluated by comparing it with a list of the 1000 most frequently occurring word forms in a corpus of contemporary Irish texts. Finally, maintainability of the system is discussed and possible extensions are suggested, such as derivational morphology and the inclusion of dialectal or historical word forms.
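    A toy Python sketch of the lexicalised design described above (illustrative only: the thesis uses the Xerox lexc/xfst toolchain compiled into a single transducer, and the trigger symbol ^LEN and the feature tags here are hypothetical). The lexicon attaches a rule trigger to a stem, and a replace rule written as a regular expression realises the corresponding initial mutation, in this case lenition.

        import re

        # Lexical side: stems paired with morphosyntactic tags; the
        # trigger symbol ^LEN marks forms that must undergo lenition.
        lexicon = {
            ("bean", "+Fem+Sg+Len"): "bean^LEN",
            ("cat",  "+Masc+Sg+Nom"): "cat",
        }

        # A replace rule in the spirit of xfst's "A -> B || L _ R":
        # insert 'h' after a lenitable initial consonant when ^LEN is present.
        def lenite(form):
            if form.endswith("^LEN"):
                stem = form[: -len("^LEN")]
                return re.sub(r"^([bcdfgmpst])", r"\1h", stem)
            return form

        # Generation: lexical form -> surface form.
        print(lenite(lexicon[("bean", "+Fem+Sg+Len")]))   # bhean
        print(lenite(lexicon[("cat", "+Masc+Sg+Nom")]))   # cat

    Analysis runs the same relation in the opposite direction, which is the bi-directionality the abstract highlights; a real two-level transducer composes the lexicon and all replace rules into one machine rather than applying them as separate functions.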

    THE REALISM OF ALGORITHMIC HUMAN FIGURES A Study of Selected Examples 1964 to 2001

    It is more than forty years since the first wireframe images of the Boeing Man revealed a stylized human pilot in a simulated pilot's cabin. Since then, it has almost become standard for Hollywood movies to include scenes with virtual human actors. A trait particularly recognizable in the games industry worldwide is the eagerness to render athletic muscular young men, and young women with hourglass body shapes, traversing dangerous cyberworlds as invincible heroic figures. Tremendous efforts in algorithmic modeling, animation and rendering are spent to produce a realistic and believable appearance for these algorithmic humans. This thesis develops two main strands of research by interpreting a selection of examples. Firstly, in the computer graphics context, it documents the development over those forty years of the creation of a naturalistic appearance in images (usually called photorealism). In particular, it describes and reviews the impact of key algorithms in the course of the algorithmic human figures' journey towards realism. Secondly, taking a historical perspective, this work provides an analysis of computer graphics in relation to the concept of realism. A comparison of realistic images of human figures throughout history with their algorithmically-generated counterparts shows that computer graphics has both learned from previous and contemporary art movements, such as photorealism, and taken elements, symbols and properties out of context from these movements with a questionable naivety. Therefore, this work also offers a critique of the justification for their typical conceptualization in computer graphics. Although the astounding technical achievements in the field of algorithmically-generated human figures are paralleled by an equally astounding disregard for the history of visual culture, in the period of the digital information processing machine, from the beginnings in 1964 to the breakthrough in 2001, a new approach has emerged to meet the apparently incessant desire of humans to create artificial counterparts of themselves. Conversely, the theories of traditional realism have to be extended to include the new problems that these active algorithmic human figures present.

    A philosophical essay on artifacts and norms


    The Machine as Art/ The Machine as Artist

    The articles collected in this volume from the two companion Arts Special Issues, "The Machine as Art (in the 20th Century)" and "The Machine as Artist (in the 21st Century)", represent a unique scholarly resource: analyses by artists, scientists, and engineers, as well as art historians, covering not only the current (and astounding) rapprochement between art and technology but also the vital post-World War II period that led up to it. The collection is further distinguished by the fact that several of its contributors are prominent figures within their own fields, or artists who have themselves participated in the still-unfolding events with which it is concerned.

    Machine Medical Ethics

    In medical settings, machines are in close proximity to human beings: to patients who are in vulnerable states of health or who have disabilities of various kinds, to the very young or very old, and to medical professionals. Machines in these contexts undertake important medical tasks that require emotional sensitivity, knowledge of medical codes, and respect for human dignity and privacy. As machine technology advances, ethical concerns become more urgent: should medical machines be programmed to follow a code of medical ethics? What theory or theories should constrain medical machine conduct? What design features are required? Should machines share responsibility with humans for the ethical consequences of medical actions? How ought clinical relationships involving machines be modeled? Is a capacity for empathy and emotion detection necessary? What about consciousness? The essays in this collection, by researchers from both the humanities and the sciences, describe various theoretical and experimental approaches to adding medical ethics to a machine, the design features necessary to achieve this, philosophical and practical questions concerning justice, rights, decision-making and responsibility, and the accurate modeling of essential physician-machine-patient relationships. This collection is the first book to address these 21st-century concerns.