
    Analysis and improvement of a multi-pass compiler for a pipeline architecture

    In this thesis a parallel environment for the execution of a multi-pass Pascal compiler is considered. Possible and appropriate ways to speed up each pass of the parallelized compiler are investigated. In addition, a new approach, using the concepts of software science, is explored for obtaining gross performance characteristics of a multi-pass compiler.

    A pipeline architecture is used for the parallel compilation. The performance characteristics of the pipelined compiler are determined by a trace-driven simulation, and the actions in the multi-processor system are synchronized by an event-driven simulation of the pipeline system. The pipelined compiler and possible improvements are analyzed in terms of the location of the bottleneck, queue size, overhead factor, and partition policy. The lexical analysis phase is found to be the initial bottleneck. The improvement of this phase and its effects on the other phases are presented. Possible methods for improving the non-lexical analysis phases are also investigated, based on a study of the data structures and operations of these phases.

    For obtaining gross performance characteristics of a multi-pass compiler, an analysis based only on the intermediate code files is performed. One of the key concepts in Halstead's software science, the language level, is applied to this analysis. The experimental results and their statistical verification show that there exists a strong correlation between the stand-alone execution time and the language level.
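
    For reference, Halstead's language level is computed from four basic counts over a program (or, as in the thesis, over an intermediate code file). A minimal Python sketch follows; the formulas are the standard software-science definitions, and the example counts are purely illustrative, not taken from the thesis.

        import math

        def halstead_language_level(n1, n2, N1, N2):
            """Compute Halstead's language level (lambda) from basic counts.

            n1 -- number of distinct operators
            n2 -- number of distinct operands
            N1 -- total occurrences of operators
            N2 -- total occurrences of operands
            """
            vocabulary = n1 + n2
            length = N1 + N2
            volume = length * math.log2(vocabulary)   # V = N * log2(n)
            level = (2.0 / n1) * (n2 / N2)            # estimated program level L
            return level * level * volume             # lambda = L^2 * V

        # Illustrative counts for a small intermediate-code file (hypothetical)
        print(halstead_language_level(n1=12, n2=30, N1=120, N2=90))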

    Ubiquitous systems and Petri nets

    Several years before the popularization of the Internet, Mark Weiser proposed the concept of ubiquitous computing with the purpose of enhancing the use of computers by making many computers available throughout the physical environment, while making them effectively invisible to the user. Nowadays, this idea affects all areas of computing science, including both hardware and software. In this paper, a formal model for ubiquitous systems based on Petri nets is introduced and motivated with examples and applications. This simple model allows the definition of two-level ubiquitous systems, composed of a collection of processor nets providing services and a collection of process nets requesting those services. The modeled systems abstract from middleware details, such as service discovery protocols, and from security infrastructures, such as PKIs or trust policies, but not from mobility or component compatibility.
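
    As a rough illustration of the two-level idea (not the paper's actual formalism, which is given in terms of Petri net transitions and markings), the following Python sketch pairs co-located process nets requesting a service with processor nets offering it; all names are hypothetical.

        from dataclasses import dataclass, field

        @dataclass
        class Net:
            """A toy net: labelled with the services it offers or requests."""
            name: str
            location: str
            offers: set = field(default_factory=set)
            requests: set = field(default_factory=set)

        def synchronize(processors, processes):
            """Match every co-located offer/request pair on a service name."""
            fired = []
            for proc in processes:
                for srv in sorted(proc.requests):
                    for host in processors:
                        if host.location == proc.location and srv in host.offers:
                            fired.append((proc.name, srv, host.name))
                            break
            return fired

        printer = Net("printer", "room1", offers={"print"})
        client  = Net("client",  "room1", requests={"print"})
        print(synchronize([printer], [client]))  # [('client', 'print', 'printer')]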

    Mobile Synchronizing Petri Nets: A Choreographic Approach for Coordination in Ubiquitous Systems

    The term Ubiquitous Computing was coined by Mark Weiser almost two decades ago. Despite all the time that has passed since Weiser's vision, ubiquitous computing still has a long way to go before becoming a pervasive reality. One of the reasons for this may be the lack of widely accepted formal models capable of capturing and analyzing the complexity of the new paradigm. We propose a simple Petri net based model to study some of its main characteristics. We model both devices and software components as a special kind of coloured Petri nets that reside at locations, can move to other locations, and can synchronize with other co-located nets, offering and requesting services. Thanks to its graphical representation, we obtain an amenable model for ubiquitous computing. We present our proposal in a progressive way, first presenting a basic model in which coordination is formalized by the synchronized firing of pairs of compatible transitions that offer and request a specific service, and ad hoc networks are modeled by constraining mobility through the dynamic acquisition of locality names. Next, we introduce a mechanism for the treatment of robust security properties, namely the generation of fresh private names, to be used for authentication.
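
    The following Python sketch illustrates, under heavy simplification, the two mechanisms named above: movement constrained by dynamically acquired locality names, and the generation of fresh private names for authentication. It abstracts away the coloured Petri net machinery entirely, and all identifiers are hypothetical.

        import itertools

        _counter = itertools.count()

        def fresh_name(prefix="n"):
            """Generate a globally fresh private name (authentication token)."""
            return f"{prefix}{next(_counter)}"

        class MobileNet:
            def __init__(self, name, location, known_localities=()):
                self.name = name
                self.location = location
                self.known = set(known_localities)  # locality names acquired so far

            def learn(self, locality):
                """Dynamic acquisition of a locality name (ad hoc connectivity)."""
                self.known.add(locality)

            def move(self, locality):
                """Movement is enabled only toward localities whose names are known."""
                if locality not in self.known:
                    raise ValueError(f"{self.name} does not know {locality!r}")
                self.location = locality

        agent = MobileNet("agent", "home")
        agent.learn("office")
        agent.move("office")
        secret = fresh_name("key")  # a private name no other net can guess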

    The Standard Model for Programming Languages: The Birth of a Mathematical Theory of Computation

    Despite the insight of some of the pioneers (Turing, von Neumann, Curry, Böhm), programming the early computers was a matter of fiddling with small architecture-dependent details. Only in the sixties did some form of "mathematical program development" appear on the agenda of some of the most influential players of that time. A "Mathematical Theory of Computation" is the name chosen by John McCarthy for his approach, which uses a class of recursively computable functions as an (extensional) model of a class of programs. It is the beginning of the grand endeavour to present programming as a mathematical activity, and reasoning about programs as a form of mathematical logic. An important part of this process is the standard model of programming languages: the informal assumption that the meaning of programs should be understood on an abstract machine with unbounded resources and with true arithmetic. We present some crucial moments of this story, concluding with the emergence, in the seventies, of the need for more "intensional" semantics, such as the sequential algorithms on concrete data structures. The paper is a small step in a larger project of reflecting on and tracing the interaction between mathematical logic and programming (languages), identifying some of the driving forces of this process. To Maurizio Gabbrielli, on his 60th birthday.

    Coloured Petri Nets - a Pragmatic Formal Method for Designing and Analysing Distributed Systems

    The thesis consists of six individual papers: the present paper contains the mandatory overview, while the remaining five papers are found separately. The five papers can roughly be divided into three areas of research, namely case studies, education, and extensions to the CPN method.

    The primary purpose of the PhD thesis is to study the pragmatics, practical aspects, and intuition of CP-nets viewed as a formal method for describing and reasoning about concurrent systems. The perspective of pragmatics is our leitmotif, and at the same time, in the context of CP-nets, it serves as a kind of hypothesis for this thesis. This overview paper summarises the research conducted as an investigation of that hypothesis in the three areas of case studies, education, and extensions.

    The provocative claim of pragmatics should not be underestimated. In the present overview of the thesis, the CPN method is compared with a representative selection of formal methods. The graphical form and simplicity of the semantics, together with the generality and expressiveness of the language constructs, make CP-nets a viable and attractive alternative to other formal methods. Similar graphical formal methods, such as SDL and Statecharts, typically have significantly more complicated semantics, or are domain-specific languages.

    The research conducted in this thesis opens up a new set of problems. Firstly, to get wider acceptance of CP-nets in industry, it is important to identify fruitful areas for the effective introduction of the CPN method. Secondly, it would be useful to identify a few extensions to the CPN method, inspired by specific domains, for easier adoption in industry. Thirdly, which analysis methods will future systems make use of?

    Translating expert system rules into Ada code with validation and verification

    The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code, and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system and detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code are discussed: the rules are converted into Ada code modules, which are then linked with an Activation Framework based run-time environment to form an executable load module. This method is based upon the use of Evidence Flow Graphs, a data flow representation for intelligent systems. The development of prototype test-generation and evaluation software, used to test the resultant code, is also discussed. This testing was performed automatically using Monte-Carlo techniques based upon a constraint-based description of the required performance of the system.
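
    As a hedged illustration of the rule-to-module idea (the actual prototype's output format, Evidence Flow Graph machinery, and Activation Framework run-time are not described here), a small Python sketch that renders an if-then rule as an Ada procedure body might look as follows; the rule and all identifiers are invented.

        def rule_to_ada(name, condition, action):
            """Emit a tiny Ada procedure body for an if-then rule.

            Hypothetical shape only: the real tool linked generated modules
            with an Activation Framework run-time, which this sketch omits.
            """
            return (
                f"procedure {name} is\n"
                f"begin\n"
                f"   if {condition} then\n"
                f"      {action};\n"
                f"   end if;\n"
                f"end {name};\n"
            )

        print(rule_to_ada("Check_Pressure",
                          "Sensor_Pressure > Max_Pressure",
                          "Raise_Alarm (Pressure_Fault)"))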

    Symbolic and connectionist learning techniques for grammatical inference

    This thesis is structured in four parts, for a total of ten chapters.

    The first part, introduction and review (Chapters 1 to 4), presents an extensive state-of-the-art review of both symbolic and connectionist GI methods, which also serves to present most of the basic material needed to describe the contributions of the thesis. These contributions constitute the contents of the remaining parts (Chapters 5 to 10).

    The second part, contributions on symbolic and connectionist techniques for regular grammatical inference (Chapters 5 to 7), describes the contributions related to the theory and methods for regular GI, including lateral subjects such as the representation of finite-state machines (FSMs) in recurrent neural networks (RNNs).

    The third part of the thesis, augmented regular expressions and their inductive inference, comprises Chapters 8 and 9. The augmented regular expressions (or AREs) are defined and proposed as a new representation for a subclass of context-sensitive languages (CSLs) that does not contain all the context-free languages, but does cover a large class of languages capable of describing patterns with symmetries and other (context-sensitive) structures of interest in pattern recognition problems.

    The fourth part of the thesis consists of Chapter 10: conclusions and future research. Chapter 10 summarizes the main results obtained and points out the lines of further research that should be followed, both to deepen some of the theoretical aspects raised and to facilitate the application of the developed GI tools to real-world problems in the area of computer vision.
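
    As a small illustration of the kind of regular GI machinery reviewed in the second part, the Python sketch below builds a prefix tree acceptor from positive samples, the usual starting point of state-merging inference algorithms such as RPNI; this is generic textbook material, not code from the thesis.

        def prefix_tree_acceptor(samples):
            """Build a prefix tree acceptor (PTA) from positive strings.

            Returns a transition map (state, symbol) -> state and the set
            of accepting states; state 0 is the root.
            """
            delta = {}
            accepting = set()
            next_state = 1
            for word in samples:
                state = 0
                for symbol in word:
                    if (state, symbol) not in delta:
                        delta[(state, symbol)] = next_state
                        next_state += 1
                    state = delta[(state, symbol)]
                accepting.add(state)
            return delta, accepting

        delta, accepting = prefix_tree_acceptor(["ab", "abb", "b"])
        print(delta, accepting)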