
    A Formal Approach based on Fuzzy Logic for the Specification of Component-Based Interactive Systems

    Formal methods are widely recognized as a powerful engineering method for the specification, simulation, development, and verification of distributed interactive systems. However, most formal methods rely on a two-valued logic, and are therefore limited to the axioms of that logic: a specification is valid or invalid, component behavior is realizable or not, safety properties hold or are violated, systems are available or unavailable. Especially when the problem domain entails uncertainty, impreciseness, and vagueness, the application of such methods becomes a challenging task. In order to overcome the limitations resulting from the strict modus operandi of formal methods, the main objective of this work is to relax the Boolean notion of formal specifications by using fuzzy logic. The present approach is based on Focus theory, a model-based and strictly formal method for component-based interactive systems. The contribution of this work is twofold: i) we introduce a specification technique based on fuzzy logic which can be used on top of Focus to develop formal specifications in a qualitative fashion; ii) we partially extend Focus theory to a fuzzy one which allows the specification of fuzzy components and fuzzy interactions. While the former provides a methodology for approximating I/O behaviors under imprecision, the latter makes it possible to capture a more quantitative view of specification properties such as realizability. Comment: In Proceedings FESCA 2015, arXiv:1503.0437
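
    The core idea the abstract describes, relaxing a two-valued verdict on a specification property into a graded one, can be sketched in a few lines. This is only an illustration under assumed names and membership functions (a 'timely response' property over a latency trace), not the Focus-based formalism of the paper.

    ```python
    # Illustrative sketch only: a fuzzy relaxation of a Boolean specification
    # property. The property, names and membership function are assumptions,
    # not the Focus-based formalism of the paper.

    def timely(latency_ms: float) -> float:
        """Degree to which one response is 'timely': 1.0 up to 100 ms,
        falling linearly to 0.0 at 500 ms."""
        if latency_ms <= 100:
            return 1.0
        if latency_ms >= 500:
            return 0.0
        return (500 - latency_ms) / 400

    def always(degrees):
        """Fuzzy 'globally' operator: minimum degree over a trace (Goedel t-norm)."""
        return min(degrees, default=1.0)

    trace = [80, 120, 95, 310]                        # observed response latencies
    boolean_verdict = all(l <= 100 for l in trace)    # two-valued: property violated
    fuzzy_verdict = always(timely(l) for l in trace)  # graded degree of satisfaction

    print(boolean_verdict)          # False
    print(round(fuzzy_verdict, 3))  # 0.475
    ```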

    A Semiotics View of Modeling Method Complexity - The Case of UML

    Unified Modeling Language (UML) is the standard modeling language for object-oriented system development. Despite its status as a standard, UML's formal specification is fuzzy and its theoretical foundation is weak. Semiotics, the study of signs, provides a good theoretical foundation for UML research, as UML graphical notations are themselves a kind of sign. In this research, we use semiotics to study the graphical notations in UML. We hypothesized that using iconic signs as UML graphical notations leads to more accurate representation and arouses fewer connotations than using symbolic signs. Since symbolic signs involve more learning effort, we assume that expert users of UML will perform better with symbolic signs than novice users. We created an open-ended survey to test these hypotheses. The qualitative analysis of the survey process can help us gain an in-depth understanding of the complexity of modeling language graphical notations. In addition, the introduction of semiotics in this research helps build a solid theoretical foundation for IS modeling method research.

    An overview of decision table literature 1982-1995.

    This report gives an overview of the literature on decision tables over the past 15 years. As much as possible, for each reference an author-supplied abstract, a number of keywords and a classification are provided. In some cases our own comments are added. The purpose of these comments is to show where, how and why decision tables are used. The literature is classified according to application area, theoretical versus practical character, year of publication, country of origin (not necessarily country of publication) and the language of the document. After a description of the scope of the review, classification results and the classification by topic are presented. The main body of the paper is the ordered list of publications with abstract, classification and comments.
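
    For readers unfamiliar with the subject of the surveyed literature, a decision table maps combinations of condition outcomes to actions. The sketch below is a hypothetical discount policy invented purely for illustration; it is not an example taken from the report.

    ```python
    # Minimal illustration of a decision table: each rule pairs a combination
    # of condition outcomes with an action. The policy is hypothetical.

    RULES = [
        # (is_member, order_over_100) -> action
        ((True,  True ), "15% discount"),
        ((True,  False), "5% discount"),
        ((False, True ), "free shipping"),
        ((False, False), "no discount"),
    ]

    def decide(is_member: bool, order_over_100: bool) -> str:
        """Evaluate the table: return the action of the matching rule."""
        for conditions, action in RULES:
            if conditions == (is_member, order_over_100):
                return action
        raise ValueError("incomplete decision table")

    print(decide(True, False))  # 5% discount
    ```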

    Application Of Fuzzy Mathematics Methods To Processing Geometric Parameters Of Degradation Of Building Structures

    The aim of this research is the formalization, by means of fuzzy mathematics, of the expert experience used in processing geometric parameters of building structure degradation. The materials used to specify the fuzzy models are contained in expert assessments and in scientific and technical reports on the technical condition of buildings. The information contained in the reports and assessments is presented in text form and is accompanied by a large number of photographs and diagrams. Model specification methods based on the analysis of such information about the technical state of structures with damages and defects of various types primarily lead to difficulties associated with the representation of knowledge, and require the formalization of expert knowledge and experience in the form of fuzzy rules. Testing and adaptation of the rules are carried out in the course of further research, taking into account the influence of random loads and fields. The scientific novelty of the work lies in expanding the knowledge base with the geometric parameters of structural degradation, on the basis of which a fuzzy conclusion about the technical state of structures is drawn in systems of fuzzy production rules at different stages of the object's life cycle. The results of the work are presented in the form of a formalized description of the geometric parameters of degradation. The knowledge presented in the work is intended for the development of technical documentation used at the pre-design stage of building reconstruction, but the experience gained also serves as a source of information from which structural solutions are selected when designing analogous objects. In addition, the knowledge gained from the analysis of expert assessments of the state of various structures is necessary for the development of automated systems for processing expert evaluations. The use of such evaluation systems will significantly reduce human-factor risks associated with errors in the specification of models that predict structural failure processes at the various stages of ensuring the reliability and safety of buildings.
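
    The kind of formalization the abstract refers to can be hinted at with a small sketch: a single geometric degradation parameter (crack width) is mapped through fuzzy membership functions and fuzzy rules to a technical-state conclusion. The parameter, membership functions, labels and rules below are assumptions made for illustration, not the knowledge base of the paper.

    ```python
    # Hypothetical sketch of formalizing expert experience as fuzzy rules for
    # one geometric degradation parameter (crack width, mm). Membership
    # functions, labels and rules are invented for illustration only.

    def tri(x, a, b, c):
        """Triangular membership function with peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def crack_memberships(width_mm):
        return {
            "narrow": tri(width_mm, -0.1, 0.0, 0.5),
            "medium": tri(width_mm, 0.3, 0.8, 1.5),
            "wide":   tri(width_mm, 1.0, 2.0, 4.0),
        }

    # Fuzzy rules: IF crack width IS <label> THEN technical state IS <state>
    RULES = {"narrow": "satisfactory", "medium": "limited", "wide": "unsatisfactory"}

    def assess(width_mm):
        """Degree of each technical-state conclusion (max of firing rule strengths)."""
        verdict = {}
        for label, degree in crack_memberships(width_mm).items():
            state = RULES[label]
            verdict[state] = max(verdict.get(state, 0.0), degree)
        return verdict

    print(assess(0.9))  # {'satisfactory': 0.0, 'limited': 0.857..., 'unsatisfactory': 0.0}
    ```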

    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided, covering inter alia rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of being able to infer misuse by the correlation of individual, temporally distributed events within a multiple-data-stream environment is explored, and a range of techniques is examined, covering model-based approaches, 'programmed' AI and machine learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base, and the lack of ability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and use this to screen events. This approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to 'learn' the features of event patterns that constitute normal behaviour and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred. This approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise. Contemporary approaches are seen to favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour, the former to capture known cases of misuse, the latter to capture unknown cases. In some systems, these mechanisms even work together to update each other to increase detection rates and lower false-positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule- or state-based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems such that learning, generalisation and adaptation are more readily facilitated.
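
    The two complementary failure modes described above (signature rules missing unseen misuses, learned-normality models flagging innocent behaviour) and their hybrid combination can be seen in a toy sketch. The event format, rule and threshold are invented for illustration and do not correspond to any particular system in the survey.

    ```python
    # Toy hybrid misuse detector: a signature (rule) component for known misuse
    # patterns plus a learned-normality component for unseen ones. Event format,
    # rule and thresholds are invented for illustration only.

    from statistics import mean, pstdev

    # --- signature component: encode a known misuse as an explicit rule --------
    def known_misuse(events):
        """Rule: more than 3 failed logins from one subscriber is a known misuse."""
        failures = {}
        for e in events:
            if e["type"] == "login_failed":
                failures[e["subscriber"]] = failures.get(e["subscriber"], 0) + 1
        return {s for s, n in failures.items() if n > 3}

    # --- anomaly component: learn 'normal' call volume, flag large deviations --
    def anomalous(call_counts, history, z_threshold=3.0):
        """Flag subscribers whose call count deviates strongly from past behaviour."""
        mu, sigma = mean(history), pstdev(history) or 1.0
        return {s for s, n in call_counts.items() if abs(n - mu) / sigma > z_threshold}

    # Hybrid verdict: union of both components (rules catch known misuse 'A',
    # the anomaly score catches the unseen volume spike from 'B').
    events = [{"type": "login_failed", "subscriber": "A"}] * 5
    suspects = known_misuse(events) | anomalous({"B": 120}, history=[10, 12, 9, 11])
    print(suspects)  # {'A', 'B'}
    ```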