
    A study of the methodologies currently available for the maintenance of the knowledge-base in an expert system

    This research studies currently available maintenance methodologies for expert system knowledge bases and taxonomically classifies them according to the functions they perform. The classification falls into two broad categories: (1) Methodologies for building a more maintainable expert system knowledge base. This section covers techniques applicable to the development phases; software engineering approaches as well as other approaches are discussed. (2) Methodologies for maintaining an existing knowledge base. This section is concerned with the continued maintenance of an existing knowledge base and is divided into three subsections. The first subsection discusses tools and techniques which aid the understanding of a knowledge base. The second looks at tools which facilitate the actual modification of the knowledge base, while the last section examines tools used for the verification or validation of the knowledge base. Every main methodology or tool selected for this study is analysed according to the function it was designed to perform (its objective); the concept or principles behind the tool or methodology; and its implementation details. This is followed by a general comment at the end of the analysis. Although expert systems as a rule contain a significant amount of information related to the user interface, database interface, integration with conventional software for numerical calculations, and integration with other knowledge bases through blackboard systems or network interactions, this research is confined to the maintenance of the knowledge base only and does not address the maintenance of these interfaces. Also not included in this thesis are Truth Maintenance Systems. While a Truth Maintenance System (TMS) automatically updates a knowledge base during execution time, these update operations are not considered 'maintenance' in the sense used in this thesis. Maintenance in the context of this thesis refers to perfective, adaptive, and corrective maintenance (see introduction to chapter 4). A TMS, on the other hand, refers to a collection of techniques for doing belief revision (Martin, 1990). That is, a TMS maintains a set of beliefs or facts in the knowledge base to ensure that they remain consistent during execution time. From this perspective, a TMS is not regarded as a knowledge base maintenance tool for the purpose of this study.
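
    The verification and validation tools covered in the last subsection typically look for structural anomalies in a rule base, such as redundant or conflicting rules. As a rough illustration of that idea only (the rule representation and the checks below are a simplification invented for this sketch, not one of the surveyed tools), a minimal anomaly check in Python might look like this:

        # Minimal sketch of a rule-base anomaly check; the rule representation
        # is an invented simplification, not one of the surveyed tools.
        from itertools import combinations

        # A rule is (frozenset of condition literals, conclusion literal).
        # A literal of the form "not:x" is treated as the negation of "x".
        rules = [
            (frozenset({"fever", "cough"}), "flu"),
            (frozenset({"cough", "fever"}), "flu"),      # redundant duplicate
            (frozenset({"fever", "cough"}), "not:flu"),  # conflicts with the first rule
        ]

        def negates(a, b):
            return a == f"not:{b}" or b == f"not:{a}"

        def check_rules(rules):
            findings = []
            for (c1, r1), (c2, r2) in combinations(rules, 2):
                if c1 == c2 and r1 == r2:
                    findings.append(f"redundant: {sorted(c1)} -> {r1}")
                elif c1 == c2 and negates(r1, r2):
                    findings.append(f"conflict: {sorted(c1)} -> {r1} vs {r2}")
            return findings

        for finding in check_rules(rules):
            print(finding)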

    Mathematics in Software Reliability and Quality Assurance

    This monograph concerns the mathematical aspects of software reliability and quality assurance and consists of 11 technical papers in this emerging area. Included are the latest research results related to formal methods and design, automatic software testing, software verification and validation, coalgebra theory, automata theory, hybrid systems, and software reliability modeling and assessment.
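
    One standard example of the reliability modelling covered by such collections is the Goel-Okumoto NHPP growth model, whose mean-value function m(t) = a(1 - e^(-b t)) gives the expected number of failures observed by test time t. The sketch below merely evaluates that function for assumed parameter values; it illustrates the model class rather than reproducing any code or data from the monograph.

        # Evaluate the Goel-Okumoto mean-value function m(t) = a * (1 - exp(-b*t)).
        # The parameter values below are assumptions chosen for illustration.
        import math

        def expected_failures(t, a=120.0, b=0.05):
            """Expected cumulative failures by test time t, where a is the total
            expected number of faults and b the per-fault detection rate."""
            return a * (1.0 - math.exp(-b * t))

        for t in (10, 50, 100, 200):
            print(f"t={t:>3}  m(t)={expected_failures(t):6.1f}")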

    Combining SOA and BPM Technologies for Cross-System Process Automation

    This paper summarizes the results of an industry case study that introduced a cross-system business process automation solution based on a combination of SOA and BPM standard technologies (i.e., BPMN, BPEL, WSDL). Besides discussing major weaknesses of the existing custom-built solution and comparing them against experiences with the developed prototype, the paper presents a course of action for transforming the current solution into the proposed solution. This includes a general approach, consisting of four distinct steps, as well as specific action items that are to be performed for every step. The discussion also covers language and tool support and challenges arising from the transformation.
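
    The standards named above divide the work roughly as follows: BPMN for modelling the process, WSDL for describing the service contracts of the participating systems, and BPEL for orchestrating the calls. As a language-neutral illustration of that orchestration idea only (the service names, operations, and compensation steps are invented and do not come from the case study), a cross-system step sequence could be sketched as:

        # Sketch of BPEL-style orchestration: an ordered sequence of service
        # invocations across systems, with compensation of completed steps on
        # failure. All service names and payloads are invented for illustration.

        def invoke(service, operation, payload):
            # Placeholder for a WSDL/SOAP or REST call to another system.
            print(f"calling {service}.{operation}({payload})")
            return {"status": "ok"}

        def order_process(order):
            completed = []
            steps = [
                ("erp", "reserveStock", order),
                ("billing", "createInvoice", order),
                ("shipping", "scheduleShipment", order),
            ]
            try:
                for service, operation, payload in steps:
                    result = invoke(service, operation, payload)
                    if result["status"] != "ok":
                        raise RuntimeError(f"{service}.{operation} failed")
                    completed.append((service, operation))
            except RuntimeError:
                # Compensation: undo the completed steps in reverse order.
                for service, operation in reversed(completed):
                    invoke(service, f"cancel_{operation}", order)
                raise

        order_process({"orderId": "A-123", "quantity": 2})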

    Ingénierie et Architecture d’Entreprise et des Systèmes d’Information - Concepts, Fondements et Méthodes

    Information systems engineering has long confined itself to modelling the product (the object) that is the information system, without concern for the processes through which that system is used. In an increasingly changing environment, modelling how the information system operates within the enterprise seems to me essential. Over the last two decades, management, engineering, and operational practices have undergone deep and multifaceted transformations. Research in information systems engineering must take these transformations into account in order to produce formalisms and methodological approaches capable of anticipating and satisfying new needs, grouped in this document under four themes: 1) The information system is the very place where the coordination of actions and information is worked out, without which an enterprise (or any organisation), given the diversity of trades and skills it brings together, can only exist in mediocrity. Understanding the requirements of cooperation in all its dimensions (communication, coordination, collaboration), and the support that computing can and must provide for it, therefore becomes a subject worthy of interest for information systems research. 2) The business process management (BPM) paradigm stands in sharp opposition to traditional information systems development, which for several decades crystallised the vertical division of organisational activities and thereby encouraged the construction of islands of information and applications. However, traditional process modelling approaches do not meet the needs of process engineering in this constantly changing context, whether the change is contextual or permanent. We therefore need formalisms that (i) can represent not only business processes and their links with the software components of the existing or future system, but (ii) can also represent the variable and/or evolving (and thus sometimes eminently decisional) nature of these processes. 3) Information systems today continue to support classic needs such as automating and coordinating the production chain and improving the quality of the products and/or services offered. However, a new role has been assigned to them: the potential of information systems to act in support of the enterprise's strategy. Information, communication, and knowledge technologies have thus positioned themselves as a strategic resource, a support for organisational transformation, and even a lever for change. Enterprise models can represent the current state of the organisation in order to understand it, share a common representation, measure performance, and possibly identify dysfunctions. They also make it possible to represent a desired future state, so as to define a target towards which to move through the implementation of projects. Since the enterprise is in perpetual motion, its evolution is one of its many dimensions. We therefore need to represent, at a minimum, a future state and the transformation path to be constructed in order to move towards that target.
However, planning for, imagining, or projecting oneself towards a single target and, assuming it is reached, believing that there can be only one path to reach it, seems unrealistic. We must therefore propose formalisms that make it possible to specify scenarios both for the targets to be reached and for the paths to be followed. We must also develop methodological approaches to guide, in a systematic way, the construction of these enterprise models and the rationale underlying them. 4) In less than fifty years, the purpose of the information system has evolved and become more complex. Today, the information system must support not only the support functions taken in isolation and in silos (1970-1990) and the activities belonging to the enterprise's value chain [Porter, 1985] (1980-2000), but also the activities of control, steering, and strategic planning, as well as the coherence and harmony of the whole set of processes tied to business activities (2000-201x); in a word, the activities of strategic management and corporate governance. Corporate governance is the set of processes, regulations, laws, and institutions that influence the way an enterprise is directed, administered, and controlled. These processes, which produce 'decisions' as their 'product', need to be instrumented by information systems just as much as the enterprise's more operational processes do. Likewise, these strategic processes (also called 'development' processes) require representation formalisms whose expressive power goes far beyond the notations widely adopted in recent years for representing business processes. It therefore seems unwise to want (or to believe it possible) to isolate the object that is the 'information system' from its execution environment while it is being built. If the meaning given to information depends on the person who receives it, that meaning cannot be entirely captured in the technical system. It should rather be understood as an essential component of a socio-technical system that includes the users of the technology-based information system, in other words the acting actors of the enterprise. From my point of view, this socio-technical system, which deserves the scientific attention of our discipline, is the enterprise. The research that I have carried out, led, or supervised, structured into four themes in this document, aims to solve problems tied to the contexts of use (the enterprise and its environment) of information systems.
The distinguishing feature of my research is my interest in the capacity to represent: (i) the evolvability and flexibility of business processes, in particular those supported by a software system, from both a microscopic point of view (the model of a single process) and a macroscopic one (the representation and configuration of a network of processes): theme 2; (ii) the enterprise system in all its dimensions (strategy, organisation of processes, information system, and change): theme 3. To work with these motivations, it was also necessary: (iii) to study the very nature of cooperative work and the intentionality of the acting actors, so as to identify and/or propose appropriate formalisms to describe and understand them: theme 1; (iv) to question the management processes whose role is to monitor, measure, and steer the enterprise, so as to give them the support they deserve from the information system: theme 4.

    Evidence-based Software Process Recovery

    Developing a large software system involves many complicated, varied, and inter-dependent tasks, and these tasks are typically implemented using a combination of defined processes, semi-automated tools, and ad hoc practices. Stakeholders in the development process, including software developers, managers, and customers, often want to be able to track the actual practices being employed within a project. For example, a customer may wish to be sure that the process is ISO 9000 compliant, a manager may wish to track the amount of testing that has been done in the current iteration, and a developer may wish to determine who has recently been working on a subsystem that has had several major bugs appear in it. However, extracting the software development processes from an existing project is expensive if one must rely upon manual inspection of artifacts and interviews of developers and their managers. Previously, researchers have suggested the live observation and instrumentation of a project to allow for more measurement, but this is costly, invasive, and also requires a live running project. In this work, we propose an approach that we call software process recovery, which is based on after-the-fact analysis of various kinds of software development artifacts. We use a variety of supervised and unsupervised techniques from machine learning, topic analysis, natural language processing, and statistics on software repositories such as version control systems, bug trackers, and mailing list archives. We show how we can combine all of these methods to recover process signals that we map back to software development processes such as the Unified Process. The Unified Process has been visualized using a time-line view that shows effort per parallel discipline occurring across time; this visualization is called the Unified Process diagram. We use this diagram as inspiration to produce Recovered Unified Process Views (RUPVs), a concrete version of this theoretical Unified Process diagram. We then validate these methods using case studies of multiple open source software systems.
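
    A very small illustration of the kind of process signal recovery described here is to bucket commit messages by month and label each one with a development discipline; the keyword heuristics, discipline names, and example commits below are assumptions made for this sketch, standing in for the machine-learning and NLP techniques actually used in the thesis.

        # Toy recovery of a per-discipline effort signal from commit messages.
        # Keyword heuristics stand in for the supervised/unsupervised techniques
        # used in the thesis; disciplines and example commits are invented.
        from collections import Counter, defaultdict

        DISCIPLINE_KEYWORDS = {
            "implementation": ("add", "implement", "refactor"),
            "testing": ("test", "junit", "coverage"),
            "deployment": ("release", "deploy", "package"),
        }

        def classify(message):
            msg = message.lower()
            for discipline, keywords in DISCIPLINE_KEYWORDS.items():
                if any(k in msg for k in keywords):
                    return discipline
            return "other"

        def recover_signal(commits):
            """commits: iterable of (iso_month, message) pairs from a VCS log."""
            signal = defaultdict(Counter)
            for month, message in commits:
                signal[month][classify(message)] += 1
            return signal

        commits = [
            ("2010-01", "implement parser for config files"),
            ("2010-01", "unit tests for parser edge cases"),
            ("2010-02", "release 1.0 and package installers"),
        ]
        for month, counts in sorted(recover_signal(commits).items()):
            print(month, dict(counts))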

    Formal patterns for Web-based systems design

    The ubiquitous and simple interface of Web browsers has opened the door for the development of a new class of distributed applications, which have become known as Web applications. As more and more systems become Web-enabled, we become increasingly dependent on Web applications. The reliability of such systems is therefore a crucial factor for the successful operation of many modern organisations and institutes. In the first part of this thesis we review how Web systems have evolved from simple static pages, in their early days, to their current situation as distributed applications with sophisticated functionalities. We also find out how design methods have evolved to align with the rapid changes both in the new emerging technologies and in growing functionalities. Although design approaches for Web applications have improved during the last decade, we conclude that dependability should be given more consideration. In Chapter 2 we explain how this could be achieved through the application of formal methods, and we provide an overview of dependability and formal methods in this chapter. In the second part of this research we follow a practical approach to the formal modelling of Web applications. Accordingly, in Chapter 3 we have developed a series of formal models for an integrated holiday booking system. Our main objectives are to gain some common knowledge of the domain and to identify some key areas and features with regard to our formal modelling approach. Formal modelling of large Web applications can be a very complex process. In Chapter 4 we have introduced the idea of formal patterns for specification and refinement to accelerate the modelling process and to help alleviate the burden of formal modelling. In a further attempt to tackle the complexity of the formal modelling of Web applications, we have introduced the idea of specification partitioning in Chapter 5. Specification partitioning is closely related to the notion of composition. In this chapter we have extended some CSP-like composition techniques to build the system specification from subsystems or parts. The summary of our research, related findings, and some suggestions for future work are presented in Chapter 6.
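
    The CSP-like composition techniques mentioned above can be pictured as the parallel composition of small labelled transition systems that must synchronise on shared events while interleaving their private ones. The toy sketch below is only a rendering of that general idea under invented state and event names; it is not a specification or pattern taken from the thesis.

        # Toy parallel composition of two labelled transition systems that
        # synchronise on shared events (a rough analogue of CSP-style
        # composition); states and events are invented for illustration.

        def compose(lts1, lts2, alpha1, alpha2, start=("s0", "t0")):
            """Each lts maps (state, event) -> next state. Shared events must be
            enabled in both components; private events fire in their owner alone."""
            transitions = {}
            frontier, seen = [start], {start}
            while frontier:
                s, t = frontier.pop()
                for event in alpha1 | alpha2:
                    n1 = lts1.get((s, event), None if event in alpha1 else s)
                    n2 = lts2.get((t, event), None if event in alpha2 else t)
                    if n1 is None or n2 is None:
                        continue  # a participating component blocks the event
                    transitions[((s, t), event)] = (n1, n2)
                    if (n1, n2) not in seen:
                        seen.add((n1, n2))
                        frontier.append((n1, n2))
            return transitions

        # Booking and payment components synchronising on the "confirm" event.
        booking = {("s0", "select"): "s1", ("s1", "confirm"): "s2"}
        payment = {("t0", "pay"): "t1", ("t1", "confirm"): "t2"}
        composed = compose(booking, payment, {"select", "confirm"}, {"pay", "confirm"})
        for (state, event), nxt in sorted(composed.items()):
            print(state, f"--{event}-->", nxt)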

    INTERACT 2015 Adjunct Proceedings. 15th IFIP TC.13 International Conference on Human-Computer Interaction 14-18 September 2015, Bamberg, Germany

    INTERACT is among the world’s top conferences in Human-Computer Interaction. Starting with the first INTERACT conference in 1990, this conference series has been organised under the aegis of the Technical Committee 13 on Human-Computer Interaction of the UNESCO International Federation for Information Processing (IFIP). This committee aims at developing the science and technology of the interaction between humans and computing devices. The 15th IFIP TC.13 International Conference on Human-Computer Interaction - INTERACT 2015 took place from 14 to 18 September 2015 in Bamberg, Germany. The theme of INTERACT 2015 was "Connection.Tradition.Innovation". This volume presents the Adjunct Proceedings: it contains the position papers of the students of the Doctoral Consortium as well as the position papers of the participants of the various workshops.

    The Automated analysis of object-oriented designs

    This thesis concerns the use of software measures to assess the quality of object-oriented designs. It examines the ways in which design assessment can be assisted by measurement and the areas in which it cannot. Other work in software measurement looks at defining and validating measures, or building prediction systems. This work is distinctive in that it examines the use of measures to help improve design quality during design time. To evaluate a design based on measurement results requires a means of relating measurement values to particular design problems or quality levels. Design heuristics were used to make this connection between measurement and quality. A survey was carried out to find suggestions for guidelines, rules and heuristics from the OO design literature. This survey resulted in a catalogue of 288 suggestions for OO design heuristics. The catalogue was structured around the OO constructs to which the heuristics relate, and includes information on various heuristic attributes. This scheme is intended to allow suitable heuristics to be quickly located and correctly applied. Automation requires tool support. A tool was built which augments the functionality available in existing tools, taking input from multiple sources of design information (e.g., CASE tools and source code). The approach described so far presents a potential method for automated design assessment; the tool provides the means of automation. An empirical study was then required to consider the efficacy of the method and evaluate the novel features of the tool. A case study was used to explore the approach taken by, and evaluate the effectiveness of, 15 subjects using measures and heuristics to assess the design of a small OO system (15 classes). This study showed that semantic heuristics tended to highlight significant problems, but attempts to automate these often led to false problems being identified. This result, along with a previous finding that around half of quality criteria are not automatically assessable at design time, strongly suggests that people are still a necessary part of design assessment. The main result of the case study was that the subjects correctly identified 90% of the major design problems and were very positive about their experience of using measurement to support design assessment.
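
    A design measure of the kind used in this work can be paired with a heuristic threshold and checked mechanically. The snippet below is only a rough illustration of that pairing under assumed values: the measure (methods per class), the threshold, and the example classes are invented, and it does not reproduce the catalogue of 288 heuristics or the tool built for the thesis.

        # Rough illustration of pairing an OO design measure with a heuristic:
        # count methods per class and flag classes exceeding an assumed threshold.
        # The threshold and example classes are invented for this sketch.
        import inspect

        MAX_METHODS = 3  # assumed threshold for an "avoid very large classes" heuristic

        class Account:
            def deposit(self): ...
            def withdraw(self): ...

        class GodObject:
            def load(self): ...
            def save(self): ...
            def render(self): ...
            def email(self): ...

        def method_count(cls):
            return len(inspect.getmembers(cls, inspect.isfunction))

        for cls in (Account, GodObject):
            count = method_count(cls)
            verdict = "ok" if count <= MAX_METHODS else "review: possible god class"
            print(f"{cls.__name__}: {count} methods -> {verdict}")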