Formalizing graphical notations
The thesis describes research into graphical notations for software engineering, with a principal interest in ways of formalizing them. The research seeks to provide a theoretical basis that will help in designing both notations and the software tools that process them.
The work starts from a survey of literature on notation, followed by a review of techniques for formal description and for computational handling of notations. The survey concentrates on collecting views of the benefits and the problems attending notation use in software development; the review covers picture description languages, grammars and tools such as generic editors and visual programming environments. The main problem of notation is found to be the lack of any coherent, rigorous description method. The current approaches to this problem are analysed as lacking in consensus on syntax specification and also lacking a clear focus on a defined concept of notated expression.
To address these deficiencies, the thesis embarks upon an exploration of semiotic, linguistic and logical theory; this culminates in a proposed formalization of semiosis in notations, using categorial model theory as a mathematical foundation. An argument about the structure of sign systems leads to an analysis of notation into a layered system of tractable theories, spanning the gap between expressive pictorial medium and subject domain. This notion of 'tectonic' theory aims to treat both diagrams and formulae together.
The research gives details of how syntactic structure can be sketched in a mathematical sense, with examples applying to software development diagrams, offering a new solution to the problem of notation specification. Based on these methods, the thesis discusses directions for resolving the harder problems of supporting notation design, processing and computer-aided generic editing. A number of future research areas are thereby opened up. For practical trial of the ideas, the work proceeds to the development and partial implementation of a system to aid the design of notations and editors. Finally the thesis is evaluated as a contribution to theory in an area which has not attracted a standard approach.
Diagrammatic Representations in Domain-Specific Languages
One emerging approach to reducing the labour and costs of software development favours the specialisation of techniques to particular application domains. The rationale is that programs within a given domain often share enough common features and assumptions to enable the incorporation of substantial support mechanisms into domain-specific programming languages and associated tools. Instead of being machine-oriented, algorithmic implementations, programs in many domain-specific languages (DSLs) are rather user-level, problem-oriented specifications of solutions. Taken further, this view suggests that the most appropriate representation of programs in many domains is diagrammatic, in a way which derives from existing design notations in the domain.
This thesis conducts an investigation, using mathematical techniques and supported by case studies, of issues arising from the use of diagrammatic representations in DSLs. Its structure is conceptually divided into two parts: the first is concerned with semantic and reasoning issues; the second introduces an approach to describing the syntax and layout of diagrams, in a way which addresses some pragmatic aspects of their use.
The empirical context of our work is that of IEC 1131-3, an industry standard programming language for embedded control systems. The diagrammatic syntax of IEC 1131-3 consists of circuit (i.e. box-and-wire) diagrams, emphasising a data-flow view, and variants of Petri net diagrams, suited to a control-flow view.
The first contribution of the thesis is the formalisation of the diagrammatic syntax and the semantics of IEC 1131-3 languages, as a prerequisite to the application of algebraic techniques. More generally, we outline an approach to the design of diagrammatic DSLs, emphasising compositionality in the semantics of the language so as to allow the development of simple proof systems for inferring properties which are deemed essential in the domain. The control-flow subset of IEC 1131-3 is carefully evaluated, and is subsequently re-designed, to yield a straightforward proof system for a restricted, yet commonly occurring, class of safety properties.
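The flavour of safety property in question can be illustrated with a toy check on a Petri-net-like control-flow model. The thesis develops a syntactic proof system rather than the exhaustive marking exploration sketched below, and all names here are illustrative, not taken from IEC 1131-3 itself:

```python
# Toy safety check on a Petri-net-like control-flow model:
# explore all reachable markings and verify that a forbidden
# combination of active steps never occurs.
from collections import deque

def reachable(initial, transitions):
    """BFS over markings; each transition is a (consumed, produced)
    pair of place sets, enabled when all consumed places are marked."""
    seen, queue = {initial}, deque([initial])
    while queue:
        m = queue.popleft()
        for consumed, produced in transitions:
            if consumed <= m:
                m2 = (m - consumed) | produced
                if m2 not in seen:
                    seen.add(m2)
                    queue.append(m2)
    return seen

# A small cyclic control flow: idle -> stepA -> stepB -> idle.
trans = [
    (frozenset({"idle"}), frozenset({"stepA"})),
    (frozenset({"stepA"}), frozenset({"stepB"})),
    (frozenset({"stepB"}), frozenset({"idle"})),
]
marks = reachable(frozenset({"idle"}), trans)
# Safety: stepA and stepB are never active simultaneously.
assert all(not {"stepA", "stepB"} <= m for m in marks)
```

A syntactic proof system, as pursued in the thesis, establishes such properties from the structure of the diagram without enumerating markings.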
A substantial part of the thesis deals with DSLs in which programs may be represented both textually and diagrammatically, as indeed is the case with IEC 1131-3. We develop a formalisation of the data-flow diagrams in IEC 1131-3
Improving the programming language translation process via static structure abstraction and algorithmic code transliteration
Fully automated programming language translation has been described as an unrealistic goal, with previous research being limited by a ceiling of 90% successful code translation. The key issues hindering automatic translation efficacy are: the maintainability of the translated constructs; full utilisation of the target language's features; and the amount of manual intervention required to complete the translation process. This study has concentrated on demonstrating improvements to the translation process by introducing the programming-language-independent Unified Modelling Language (UML) and Computer Assisted Software Engineering (CASE) tools to the legacy-system language migration project. UML and CASE tools may be used to abstract the static framework of the source application to reduce the so-called opaqueness of the translated constructs, yielding a significantly more maintainable product. The UML and CASE tools also enhance use of the target language features, through forward engineering of the native constructs of the target language during the reproduction of the static framework. Source application algorithmic code translation, performed as a separate process using transliteration, may preserve maximum functionality of the source application after completion of the static structure translation process. Introduction of the UML and CASE tools in conjunction with algorithmic code transliteration offers a reduction of the manual intervention required to complete the translation process.
Well-Formed and Scalable Invasive Software Composition
Software components provide essential means to structure and organize software effectively. However, frequently, required component abstractions are not available in a programming language or system, or are not adequately combinable with each other. Invasive software composition (ISC) is a general approach to software composition that unifies component-like abstractions such as templates, aspects and macros. ISC is based on fragment composition, and composes programs and other software artifacts at the level of syntax trees. Therefore, a unifying fragment component model is related to the context-free grammar of a language to identify extension and variation points in syntax trees as well as valid component types. By doing so, fragment components can be composed by transformations at the respective extension and variation points, so that composition always yields results that are valid with respect to the underlying context-free grammar. However, given a language's context-free grammar, the composition result may still be incorrect.
Context-sensitive constraints such as type constraints may be violated, so that the program cannot be compiled and/or interpreted correctly. While a compiler can detect such errors after composition, it is difficult to relate them back to the original transformation step in the composition system, especially in the case of complex compositions with several hundreds of such steps. To tackle this problem, this thesis proposes well-formed ISC, an extension to ISC that uses reference attribute grammars (RAGs) to specify fragment component models and fragment contracts to guard compositions with context-sensitive constraints. Additionally, well-formed ISC provides composition strategies as a means to configure composition algorithms and handle interferences between composition steps.
Developing ISC systems for complex languages such as programming languages is a complex undertaking. Composition-system developers need to supply or develop adequate language and parser specifications that can be processed by an ISC composition engine. Moreover, the specifications may need to be extended with rules for the intended composition abstractions.
Current approaches to ISC require complete grammars to be able to compose fragments in the respective languages. Hence, the specifications need to be developed exhaustively before any component model can be supplied. To tackle this problem, this thesis introduces scalable ISC, a variant of ISC that uses island component models to define component models for partially specified languages while still supporting the whole language. Additionally, a scalable workflow for agile composition-system development is proposed which supports the development of ISC systems in small increments using modular extensions.
All theoretical concepts introduced in this thesis are implemented in the Skeletons and Application Templates framework SkAT. It supports "classic", well-formed and scalable ISC by leveraging RAGs as its main specification and implementation language. Moreover, several composition systems based on SkAT are discussed, e.g., a well-formed composition system for Java and a C preprocessor-like macro language. In turn, those composition systems are used as composers in several example applications such as a library of parallel algorithmic skeletons.
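The core ISC idea of composing fragments at typed extension and variation points can be sketched in a few lines. The names below (Node, Slot, compose) are illustrative inventions for this summary, not the SkAT API:

```python
# Minimal sketch of invasive software composition on syntax trees:
# fragments contain typed slots (variation points), and composition
# binds fragments into slots only when the syntactic category matches.

class Node:
    def __init__(self, kind, children=None, value=None):
        self.kind = kind            # syntactic category of this node
        self.children = children or []
        self.value = value

class Slot(Node):
    """A variation point: a named hole that only accepts fragments
    whose root category matches the declared one."""
    def __init__(self, name, category):
        super().__init__("slot")
        self.name = name
        self.category = category

def compose(fragment, bindings):
    """Replace each slot by its bound fragment, enforcing the
    grammar-level guarantee that categories agree."""
    if isinstance(fragment, Slot):
        bound = bindings[fragment.name]
        if bound.kind != fragment.category:
            raise TypeError(f"slot {fragment.name} expects "
                            f"{fragment.category}, got {bound.kind}")
        return bound
    return Node(fragment.kind,
                [compose(c, bindings) for c in fragment.children],
                fragment.value)

# An "if" template with slots for its condition and body:
template = Node("if", [Slot("cond", "expr"), Slot("body", "stmt")])
result = compose(template, {"cond": Node("expr", value="x > 0"),
                            "body": Node("stmt", value="x -= 1")})
assert result.kind == "if" and result.children[0].value == "x > 0"
```

Well-formed ISC goes further than this category check: fragment contracts also guard context-sensitive constraints such as typing, which a context-free check like the one above cannot express.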
A Pattern-based Foundation for Language-Driven Software Engineering
This work brings together two fundamental ideas for modelling, programming and analysing software systems. The first idea is of a methodological nature: engineering software by systematically creating and relating languages. The second idea is of a technical nature: using patterns as a practical foundation for computing. The goal is to show that the systematic creation and layering of languages can be reduced to the elementary operations of pattern matching and instantiation and that this pattern-based approach provides a formal and practical foundation for language-driven modelling, programming and analysis.
The underpinning of the work is a novel formalism for recognising, deconstructing, creating, searching, transforming and generally manipulating data structures. The formalism is based on typed sequences, a generic structure for representing trees. It defines basic pattern expressions for matching and instantiating atomic values and variables. Horizontal, vertical, diagonal and hierarchical operators are different ways of combining patterns. Transformations combine matching and instantiating patterns and they are patterns themselves. A quasiquotation mechanism allows arbitrary levels of meta-pattern functionality and forms the basis of pattern abstraction. Path polymorphic operators are used to specify fine-grained search of structures. A range of core concepts such as layering, parsing and pattern-based computing can naturally be defined through pattern expressions.
Three language-driven tools that utilise the pattern formalism showcase the applicability of the pattern-approach. Concat is a self-sustaining (meta-)programming system in which all computations are expressed by matching and instantiation. This includes parsing, executing and optimising programs. By applying its language engineering tools to its own meta-language, Concat can extend itself from within. XMF (XML Modeling Framework) is a browser-based modelling- and meta-modelling framework that provides flexible means to create and relate modelling languages and to query and validate models. The pattern functionality that makes this possible is partly exposed as a schema language and partly as a JavaScript library. CFR (Channel Filter Rule Language) implements a language-driven approach for layered analysis of communication in complex networked systems. The communication on each layer is visible in the language of an "abstract protocol" that is defined by communication patterns.
Inductive Acquisition of Expert Knowledge
Expert systems divide neatly into two categories: those in which (1) the expert decisions result in changes to some external environment (control systems), and those in which (2) the expert decisions merely seek to describe the environment (classification systems). Both the explanation of computer-based reasoning and the "bottleneck" (Feigenbaum, 1979) of knowledge acquisition are major issues in expert systems research. We have contributed to these areas of research in two ways. Firstly, we have implemented an expert system shell, the Mugol environment, which facilitates knowledge acquisition by inductive inference and provides automatic explanation of run-time reasoning on demand. RuleMaster, a commercial version of this environment, has been used to advantage industrially in the construction and testing of two large classification systems. Secondly, we have investigated a new technique called sequence induction which can be used in the construction of control systems. Sequence induction is based on theoretical work in grammatical learning. We have improved existing grammatical learning algorithms as well as suggesting and theoretically characterising new ones. These algorithms have been successfully applied to the acquisition of knowledge for a diverse set of control systems, including inductive construction of robot plans and chess end-game strategies.
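As a minimal illustration of learning from example sequences, one can build a prefix-tree acceptor from positive traces, which is the usual starting point of grammatical inference; the sequence-induction algorithms in the thesis are considerably more sophisticated, and the robot-plan vocabulary below is invented for the example:

```python
# Build a prefix-tree acceptor (PTA) from positive example sequences.
# States are integers; 0 is the start state.

def build_pta(sequences):
    """Return (transitions, accepting): transitions maps
    state -> {symbol: next_state}; accepting is the set of states
    where some example sequence ends."""
    trans, accepting, next_state = {0: {}}, set(), 1
    for seq in sequences:
        state = 0
        for sym in seq:
            if sym not in trans[state]:
                trans[state][sym] = next_state
                trans[next_state] = {}
                next_state += 1
            state = trans[state][sym]
        accepting.add(state)
    return trans, accepting

def accepts(trans, accepting, seq):
    """Run a sequence through the acceptor."""
    state = 0
    for sym in seq:
        if sym not in trans[state]:
            return False
        state = trans[state][sym]
    return state in accepting

# Two observed action sequences sharing a common prefix:
trans, acc = build_pta([["grasp", "lift", "move"],
                        ["grasp", "lift", "place"]])
assert accepts(trans, acc, ["grasp", "lift", "move"])
assert not accepts(trans, acc, ["lift", "grasp"])
```

Grammatical learning algorithms then generalise such an acceptor, typically by merging states, so that it accepts plausible unseen sequences as well as the training examples.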
Declarative operations on nets
To increase the expressiveness of knowledge representations, the graph-theoretical basis of semantic networks is reconsidered. Directed labeled graphs are generalized to directed recursive labelnode hypergraphs, which permit a most natural representation of multi-level structures and n-ary relationships. This net formalism is embedded into the relational/functional programming language RELFUN. Operations on (generalized) graphs are specified in a declarative fashion to enhance readability and maintainability. For this, nets are represented as nested RELFUN terms kept in a normal form by rules associated directly with their constructors. These rules rely on equational axioms postulated in the formal definition of the generalized graphs as a constructor algebra. Certain kinds of sharing in net diagrams are mirrored by binding common subterms to logical variables. A package of declarative transformations on net terms is developed. It includes generalized set operations, structure-reducing operations, and extended path searching. The generation of parts lists is given as an application in mechanical engineering. Finally, imperative net storage and retrieval operations are discussed.
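The idea of nets as nested terms kept in a normal form by rules attached to their constructors can be sketched in a few lines. The constructors below are illustrative Python, not RELFUN syntax, and the parts-list relation is an invented example:

```python
# Nets as nested terms with constructor-level normalization:
# the net constructor sorts and deduplicates its arcs, so that
# equal nets have identical term representations and set-like
# operations reduce to simple term manipulation.

def arc(label, source, target):
    """Constructor for a directed labeled arc."""
    return ("arc", label, source, target)

def net(*arcs):
    """Net constructor; its normalization rule sorts and
    deduplicates arcs, enforcing the normal form."""
    return ("net",) + tuple(sorted(set(arcs)))

def net_union(n1, n2):
    """Generalized set union on nets: thanks to the normal form,
    union is just re-normalizing the combined arc lists."""
    return net(*(n1[1:] + n2[1:]))

a = net(arc("part_of", "wheel", "car"),
        arc("part_of", "engine", "car"))
b = net(arc("part_of", "engine", "car"),
        arc("part_of", "piston", "engine"))
u = net_union(a, b)
assert len(u) - 1 == 3  # three distinct arcs after union
```

Keeping normalization in the constructor, rather than in each operation, mirrors the equational axioms of the constructor algebra: every term-building step re-establishes the normal form, so downstream operations can assume it.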
On the automatic construction of program translators with minimal read/write storage requirement
Abstract available: p. vi-vii.
Models and Modelling between Digital and Humanities: A Multidisciplinary Perspective
This Supplement of Historical Social Research stems from the contributions on the topic of modelling presented at the workshop "Thinking in Practice", held at Wahn Manor House in Cologne on January 19-20, 2017. With Digital Humanities as starting point, practical examples of model building from different disciplines are considered, with the aim of contributing to the dialogue on modelling from several perspectives. Combined with theoretical considerations, this collection illustrates how the process of modelling is one of coming to know, in which the purpose of each modelling activity and the form in which models are expressed have to be taken into consideration in tandem. The modelling processes presented in this volume belong to specific traditions of scholarly and practical thinking as well as to specific contexts of production and use of models. The claim that supported the project workshop was indeed that establishing connections between different traditions of and approaches toward modelling is vital, whether these connections are complementary or intersectional. The workshop proceedings address an underpinning goal of the research project itself, namely that of examining the nature of the epistemological questions in the different traditions and how they relate to the nature of the modelled objects and the models being created. This collection is an attempt to move beyond simple representational views on modelling in order to understand modelling processes as scholarly and cultural phenomena as such.