Designing Normative Theories for Ethical and Legal Reasoning: LogiKEy Framework, Methodology, and Tool Support
A framework and methodology, termed LogiKEy, for the design and engineering of ethical reasoners, normative theories and deontic logics is presented. The overall motivation is the development of suitable means for the control and governance of intelligent autonomous systems. LogiKEy's unifying formal framework is based on semantical embeddings of deontic logics, logic combinations and ethico-legal domain theories in expressive classical higher-order logic (HOL). This meta-logical approach enables powerful tool support in LogiKEy: off-the-shelf theorem provers and model finders for HOL assist the LogiKEy designer of ethical intelligent agents in experimenting flexibly with underlying logics and their combinations, with ethico-legal domain theories, and with concrete examples, all at the same time. Ongoing improvements to these off-the-shelf provers directly benefit LogiKEy's reasoning performance, with no further adaptation required. Case studies in which the LogiKEy framework and methodology have been applied and tested give evidence that HOL's undecidability often does not hinder efficient experimentation.
Advanced Knowledge Technologies at the Midterm: Tools and Methods for the Semantic Web
The University of Edinburgh and research sponsors are authorised to reproduce and distribute reprints and on-line copies for their purposes notwithstanding any copyright annotation hereon. The views and conclusions contained herein are the author's and shouldn't be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of other parties.

In a celebrated essay on the new electronic media, Marshall McLuhan wrote in 1962:

Our private senses are not closed systems but are endlessly translated into each other in that experience which we call consciousness. Our extended senses, tools, technologies, through the ages, have been closed systems incapable of interplay or collective awareness. Now, in the electric age, the very instantaneous nature of co-existence among our technological instruments has created a crisis quite new in human history. Our extended faculties and senses now constitute a single field of experience which demands that they become collectively conscious. Our technologies, like our private senses, now demand an interplay and ratio that makes rational co-existence possible. As long as our technologies were as slow as the wheel or the alphabet or money, the fact that they were separate, closed systems was socially and psychically supportable. This is not true now when sight and sound and movement are simultaneous and global in extent. (McLuhan 1962, p.5, emphasis in original)

Over forty years later, the seamless interplay that McLuhan demanded between our technologies is still barely visible. McLuhan's predictions of the spread, and increased importance, of electronic media have of course been borne out, and the worlds of business, science and knowledge storage and transfer have been revolutionised. Yet the integration of electronic systems as open systems remains in its infancy.

Advanced Knowledge Technologies (AKT) aims to address this problem: to create a view of knowledge and its management across its lifecycle, and to research and create the services and technologies that such unification will require. Halfway through its six-year span, the results are beginning to come through, and this paper will explore some of the services, technologies and methodologies that have been developed. We hope in this paper to give a sense of the potential for the next three years, to discuss the insights and lessons learnt in the first phase of the project, and to articulate the challenges and issues that remain.

The WWW provided the original context that made the AKT approach to knowledge management (KM) possible. When AKT was initially proposed in 1999, it brought together an interdisciplinary consortium with the technological breadth and complementarity to create the conditions for a unified approach to knowledge across its lifecycle. The combination of this expertise, and the time and space afforded the consortium by the IRC structure, suggested the opportunity for a concerted effort to develop an approach to advanced knowledge technologies, based on the WWW as a basic infrastructure.

The technological context of AKT altered for the better in the short period between the development of the proposal and the beginning of the project itself, with the arrival of the semantic web (SW), which foresaw much more intelligent manipulation and querying of knowledge. The opportunities that the SW provided for, e.g., more intelligent retrieval put AKT at the centre of information technology innovation and knowledge management services; the AKT skill set would clearly be central to the exploitation of those opportunities.

The SW, as an extension of the WWW, provides an interesting set of constraints on the knowledge management services AKT tries to provide. As a medium for the semantically informed coordination of information, it has suggested a number of ways in which the objectives of AKT can be achieved, most obviously through the provision of knowledge management services delivered over the web, as opposed to the creation and provision of technologies to manage knowledge.

AKT is working on the assumption that many web services will be developed and provided for users. The KM problem in the near future will be one of deciding which services are needed and of coordinating them. Many of these services will be largely or entirely legacies of the WWW, and so their capabilities will vary. As well as providing useful KM services in their own right, AKT will be aiming to exploit this opportunity by reasoning over services, brokering between them, and providing essential meta-services for SW knowledge service management.

Ontologies will be a crucial tool for the SW. The AKT consortium brings together a great deal of expertise on ontologies, and ontologies were always going to be a key part of the strategy. All kinds of knowledge sharing and transfer activities will be mediated by ontologies, and ontology management will be an important enabling task. Different applications will need to cope with inconsistent ontologies, or with the problems that will follow the automatic creation of ontologies (e.g. merging pre-existing ontologies to create a third). Ontology mapping, and the elimination of conflicts of reference, will be important tasks (a toy sketch of the merging problem follows this abstract). All of these issues are discussed along with our proposed technologies.

Similarly, specifications of tasks will be used for the deployment of knowledge services over the SW, but in general it cannot be expected that in the medium term there will be standards for task (or service) specifications. The brokering meta-services that are envisaged will have to deal with this heterogeneity.

The emerging picture of the SW is one of great opportunity, but it will not be a well-ordered, certain or consistent environment. It will comprise many repositories of legacy data, outdated and inconsistent stores, and requirements for common understandings across divergent formalisms. There is clearly a role for standards to play in bringing much of this context together, and AKT is playing a significant role in these efforts. But standards take time to emerge, they take political power to enforce, and they have been known to stifle innovation (in the short term). AKT is keen to understand the balance between principled inference and statistical processing of web content. Logical inference on the Web is tough: complex queries using traditional AI inference methods bring most distributed computer systems to their knees. Do we set up semantically well-behaved areas of the Web? Is any part of the Web in which semantic hygiene prevails interesting enough to reason in? These and many other questions need to be addressed if we are to provide effective knowledge technologies for our content on the web.
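As a toy illustration of the ontology-merging and conflict-of-reference problems mentioned in the AKT abstract above (a Python sketch of my own; the terms and the naive detection strategy are invented, not AKT technology):

    # merge two toy ontologies, each mapping a term to its parent class,
    # and flag conflicts of reference: one term bound to two different meanings
    ONTOLOGY_A = {"Paper": "Document", "Author": "Person", "Journal": "Publication"}
    ONTOLOGY_B = {"Paper": "Document", "Author": "Agent", "Dataset": "Resource"}

    def merge(a: dict, b: dict):
        merged, conflicts = dict(a), []
        for term, parent in b.items():
            if term in merged and merged[term] != parent:
                conflicts.append((term, merged[term], parent))  # needs mapping/resolution
            else:
                merged[term] = parent
        return merged, conflicts

    merged, conflicts = merge(ONTOLOGY_A, ONTOLOGY_B)
    print(conflicts)  # [('Author', 'Person', 'Agent')]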
Applications of formal methods in engineering
The main idea presented in this thesis is to propose and justify a general framework for the development of safety-related systems based on the selection of a criticality level and the required level of integrity. We show that formal methods can be practically and consistently introduced into the system design lifecycle without incurring excessive development cost.
The process of generating and validating a formal specification is examined from an engineering point of view, in conjunction with formal definitions of specification models, safety criteria and risk assessments. Engineering specifications are classified into two main classes of system: memoryless and memory-bearing systems. Heuristic approaches to specification generation and validation for these systems are presented and discussed, with a brief summary of currently available formal systems and their supporting tools.
It is further shown that, to address different aspects of real-world problems efficiently, the concept of embedding one logic within another mechanised logic, in order to provide mechanical support for proofs and reasoning, is practical. A temporal logic framework, embedded in Higher Order Logic (HOL), is used to verify and validate the design of a real-time system. Formal definitions and properties of temporal operators are given in HOL, and real-time concepts such as timing markers, interrupts and timeouts are presented. A second major case study is presented on the specification of a solid model for mechanical parts. This work discusses the modelling theory with set-theoretic topology and Boolean operations. The theory is used to specify the mechanical properties of large distribution transformers; associated mechanical properties such as volumetric operations are also discussed.
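To give a flavour of such an embedding, the following sketch (written in Lean as an illustration only; the thesis works in the HOL system, and these definitions are textbook material rather than its actual development) defines temporal operators by shallow embedding over discrete traces, including a bounded-response pattern of the kind needed for timeouts:

    def Trace (σ : Type) := Nat → σ           -- a behaviour: the state at each tick
    def TProp (σ : Type) := Trace σ → Nat → Prop

    def always {σ : Type} (p : TProp σ) : TProp σ :=
      fun tr t => ∀ u, t ≤ u → p tr u         -- p holds from t onwards

    def eventually {σ : Type} (p : TProp σ) : TProp σ :=
      fun tr t => ∃ u, t ≤ u ∧ p tr u         -- p holds at some tick no earlier than t

    -- bounded response, a timeout-style property: p at t is answered by q within d ticks
    def within {σ : Type} (d : Nat) (p q : TProp σ) : TProp σ :=
      fun tr t => p tr t → ∃ u, t ≤ u ∧ u ≤ t + d ∧ q tr u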
A contextual study of Boris Asafiev's Musical Form as a Process and an application of concepts to his Sonata for Solo Viola
This dissertation examines the work of the Russian composer, critic, and musicologist Boris Vladimirovich Asafiev (1884-1949) against contemporaneous systems of cultural activity associated with Soviet communism. Over the course of his lifetime, Asafiev designed and developed a unique aesthetic-philosophical theory of the process of musical formation and perception. This study examines the political and ideological forces that contributed to the appearance of socialist realism, and places Asafiev within this context, evaluating his life and works. Central to this dissertation are two musicological volumes taken from Asafiev's immense catalogue of works: Musical Form as a Process (1930), and Musical Form as a Process: Intonations (1947). The theories developed in these works are applied in an analysis and close reading of Asafiev's Sonata for Solo Viola (1938). This study includes insights gleaned from a recital performance of Asafiev's works (including the Sonata for Solo Viola). The recital, which took place on March 29th, 2015, forms the creative/performance component of this research and is attached as a DVD-ROM.
On the Extensibility of Formal Methods Tools
Modern software systems often have long lifespans over which they must continually evolve to meet new, and sometimes unforeseen, requirements. One way to effectively deal with this is by developing the system as a series of extensions. As requirements change, the system evolves through the addition of new extensions and, potentially, the removal of existing extensions. In order for this kind of development process to thrive, it is necessary that the system have a high level of extensibility. Extensibility is the capability of a system to support the gradual addition of new, unplanned functionalities. This dissertation investigates the extensibility of software systems and focuses on a particular class of software: formal methods tools. The approach is broad in scope. Extensibility of systems is addressed in terms of design, analysis and improvement, which are carried out at the level of source code and software architecture. For additional perspective, extensibility is also considered in the context of formal modelling. The work carried out in this dissertation led to the development of various extensions to the Overture tool supporting the Vienna Development Method, including a new proof obligation generator and integration with theorem provers. Additionally, the extensibility of Overture itself was improved, and it now better supports the development and integration of various kinds of extensions. Finally, extensibility techniques have been applied to formal modelling, leading to an extensible architectural style for formal models.
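The extension-point pattern that this kind of extensibility relies on can be sketched in a few lines. The Python below is purely illustrative and assumes nothing about Overture's real (Java-based) plug-in architecture; all names are invented:

    # a core that only knows a stable registration interface; new, unplanned
    # analyses (e.g. a proof obligation generator) plug in without core changes
    from typing import Callable, Dict

    EXTENSIONS: Dict[str, Callable[[str], str]] = {}

    def extension(name: str):
        def register(fn: Callable[[str], str]):
            EXTENSIONS[name] = fn
            return fn
        return register

    @extension("proof-obligations")
    def generate_pos(model: str) -> str:
        return f"proof obligations for {model}"

    def run_all(model: str):
        for name, fn in EXTENSIONS.items():  # core iterates over whatever is installed
            print(name, "->", fn(model))

    run_all("example.vdmsl")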
"The Wind in the Burlap Trees": Vachel Lindsay's Utopian Film Theory
This thesis argues that Vachel Lindsay's utopian film theory has new relevance today. In 1915, in his first book of film theory, The Art of the Moving Picture, Lindsay put forward a concept of film as an intermedial art form which could restore an imagistic consciousness and revive regional cultures. While out of step with the mechanised concept of film which dominated early 20th-century film theory, his work can now be seen to anticipate the breakdown of medium essentialism, the ascent of the image in modern life, the amateurisation of media, and the rise of maker culture.
Lindsay's film theory is best understood within the context of his utopian vision of American modernity, in which preindustrial sensibilities are sustained alongside urbanisation and industrialisation, and this thesis draws heavily on his utopian writings. In approaching his film theory from this vantage point, the cultural eclecticism and strains of antimodernism which inform it are no longer problems to be overcome but, on the contrary, are revealed to be central to his concept of film as a hybrid, intermedial technology which can revive important elements of pre-modern life. Moreover, central to Lindsay's utopian social programme was the democratisation of culture and the localisation of artistic production, and viewing his film theory in this context illuminates the relevance of his aesthetic theories to contemporary developments in digital technology and maker culture.
While interest in Lindsay has increased in recent years, his work still exists on the margins of film theory. This thesis seeks to show not only the prescience of his ideas, but also the various contributions he makes to key debates in aesthetic theory, including the relationship between text and image, the value of amateur aesthetics, and the politics of artifice. Too long neglected, Lindsay's work enriches the field of film theory by providing a unique vision of film's relationship to modernity, while also illuminating the utopian possibilities of the contemporary media landscape.
Interaction Design: Foundations, Experiments
Interaction Design: Foundations, Experiments is the result of a series of projects, experiments and curricula aimed at investigating the foundations of interaction design in particular and design research in general.
The first part of the book - Foundations - deals with foundational theoretical issues in interaction design. An analysis of two categorical mistakes - the empirical and interactive fallacies - forms a background to a discussion of interaction design as act design and of computational technology as material in design.
The second part of the book - Experiments - describes a range of design methods, programs and examples that have been used to probe foundational issues through systematic questioning of what is given. Based on experimental design work such as Slow Technology, Abstract Information Displays, Design for Sound Hiders, Zero Expression Fashion, and IT+Textiles, this section also explores how design experiments can play a central role when developing new design theory.
Mechanising and evolving the formal semantics of WebAssembly: the Web's new low-level language
WebAssembly is the first new programming language to be supported natively by all major Web browsers since JavaScript. It is designed to be a natural low-level compilation target for languages such as C, C++, and Rust, enabling programs written in these languages to be compiled and executed efficiently on the Web. WebAssembly's specification is managed by the W3C WebAssembly Working Group (made up of representatives from a number of major tech companies). Uniquely, the language is specified by way of a full pen-and-paper formal semantics.
This thesis describes a number of ways in which I have both helped to shape the specification of WebAssembly and built upon it. By mechanising the WebAssembly formal semantics in Isabelle/HOL while it was being drafted, I discovered a number of errors in the specification, drove the adoption of official corrections, and provided the first type soundness proof for the corrected language. This thesis also details a verified type checker and interpreter, and a security type system extension for cryptography primitives, all of which have been mechanised as extensions of my initial WebAssembly mechanisation.
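The discipline such a type checker enforces can be shown in miniature: each instruction consumes operand types from a type stack and pushes its results. The Python sketch below covers a tiny subset of instructions and is only my illustration of the idea, not the thesis's verified Isabelle/HOL artefact:

    # instruction -> (operand types consumed, result types produced)
    I32, F64 = "i32", "f64"
    SIG = {
        "i32.const": ([], [I32]),
        "f64.const": ([], [F64]),
        "i32.add":   ([I32, I32], [I32]),
        "f64.add":   ([F64, F64], [F64]),
    }

    def check(instrs):
        stack = []
        for op in instrs:
            ins, outs = SIG[op]
            for want in reversed(ins):          # pop and match expected operands
                if not stack or stack.pop() != want:
                    raise TypeError(f"{op}: expected {want}")
            stack.extend(outs)
        return stack                            # result type of the sequence

    print(check(["i32.const", "i32.const", "i32.add"]))  # ['i32']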
A major component of the thesis is my work on the specification of shared memory concurrency in Web languages: correcting and verifying properties of JavaScript's existing relaxed memory model, and defining the WebAssembly-specific extensions to the corrected model which have been adopted as the basis of WebAssembly's official threads specification. A number of deficiencies in the original JavaScript model are detailed. Some errors have been corrected, with the verified fixes officially adopted into subsequent editions of the language specification. However, one discovered deficiency, an instance of the well-known "thin-air problem", is fundamental to the model.
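The "thin-air problem" is usually shown with a load-buffering litmus test. The sketch below (my illustration, not drawn from the thesis) enumerates every sequentially consistent interleaving of the two threads to confirm that each register can only read 0; the difficulty for relaxed models is to forbid self-justifying outcomes such as r1 = r2 = 42 without also forbidding legitimate compiler and hardware reorderings:

    # Thread 1: r1 = x; y = r1      Thread 2: r2 = y; x = r2   (x = y = 0 initially)
    T1 = [("load", "r1", "x"), ("store", "y", "r1")]
    T2 = [("load", "r2", "y"), ("store", "x", "r2")]

    def run(schedule):
        mem, regs = {"x": 0, "y": 0}, {}
        for op, dst, src in schedule:
            if op == "load":
                regs[dst] = mem[src]
            else:
                mem[dst] = regs[src]
        return regs["r1"], regs["r2"]

    def interleavings(a, b):       # all schedules preserving each thread's order
        if not a or not b:
            yield list(a) + list(b)
            return
        for rest in interleavings(a[1:], b):
            yield [a[0]] + rest
        for rest in interleavings(a, b[1:]):
            yield [b[0]] + rest

    print({run(s) for s in interleavings(T1, T2)})  # {(0, 0)}: no value from thin air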
My work demonstrates the value of formalisation and mechanisation in industrial programming language design, not only in discovering and correcting specification errors, but also in building confidence both in the correctness of the language's design and in the design of proposed extensions.
2019 Google PhD Fellowship in Programming Technology and Software Engineering; Peterhouse Research Fellowship