
    Developing and Measuring Parallel Rule-Based Systems in a Functional Programming Environment

    This thesis investigates the suitability of functional programming for building parallel rule-based systems. A functional version of the well-known rule-based system OPS5 was implemented, and the suitability of functional languages for both building compilers and manipulating state is discussed. Functional languages can be used to build compilers that reflect the structure of the original grammar of a language and are, therefore, very suitable. Particular attention is paid to the state requirements and the state-manipulation structures of applications such as a rule-based system because, traditionally, functional languages have been considered unable to manipulate state. From the implementation work, issues have arisen that are important for functional programming as a whole, in the areas of algorithms, data structures, and development environments. There is a more general discussion of state and state manipulation in functional programs and how theoretical work, such as monads, can be used. Techniques are presented for interpreting descriptions of graph algorithms more abstractly in order to build functional graph algorithms. Beyond the scope of programming, there are issues relating both to the functional language's interaction with the operating system and to tools, such as debugging and measurement tools, which help programmers write efficient programs. In both of these areas functional systems are lacking. To address the complete lack of measurement tools for functional languages, a profiling technique was designed which can accurately measure the number of calls to a function, the time spent in a function, and the amount of heap space used by a function. From this design, a profiler was developed for higher-order, lazy functional languages which allows the programmer to measure and verify the behaviour of a program. This profiling technique is designed primarily for application programmers rather than functional language implementors, and the results presented by the profiler directly reflect the lexical scope of the original program rather than some run-time representation. Finally, there is a discussion of generally available techniques for parallelizing functional programs so that they may execute on a parallel machine. The techniques that are easiest for the parallel-systems builder to implement are shown to be the least suitable for large functional applications, while those techniques that best suit functional programmers are not yet generally available and usable.
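The two countable metrics the profiler reports, calls per function and time per function, can be illustrated with a small sketch. This is not the thesis's profiler (which targets higher-order lazy languages and attributes costs by lexical scope); it is a minimal strict-language analogue, with the `fib` example invented for illustration:

```python
import time
from collections import defaultdict

# Per-function call counts and cumulative wall-clock time.
call_counts = defaultdict(int)
cumulative_time = defaultdict(float)

def profiled(fn):
    """Wrap fn so every call is counted and timed under fn's own name."""
    def wrapper(*args, **kwargs):
        call_counts[fn.__name__] += 1
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            # Cumulative time includes time spent in nested recursive calls.
            cumulative_time[fn.__name__] += time.perf_counter() - start
    return wrapper

@profiled
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(10)
print(call_counts["fib"])  # 177 calls in the naive recursion
```

In a lazy language the hard part, which the thesis addresses, is attributing these numbers to the function in the source text rather than to the thunk that happens to force the work at run time.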

    Reasoning strategies for semantic Web rule languages

    Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2008. Includes bibliographical references (p. 101-104). Dealing with data in open, distributed environments is an increasingly important problem today. The processing of heterogeneous data in formats such as RDF is still being researched, and using rules and rule engines is one technique being applied. In doing so, the problem of handling heterogeneous rules from multiple sources becomes important. Over the course of this thesis, I wrote several kinds of reasoners, including backward, forward, and hybrid reasoners, for RDF rule languages. These were used on a variety of problems and data in a wide range of settings for solving real-world problems. During my investigations, I encountered several interesting problems with RDF. First, making the term space large and well namespaced, and keeping the language's expressivity low, did not necessarily make computation easier. Next, checking proofs in an RDF environment proved to be hard, because the basic features of RDF that make it possible to represent heterogeneous data effectively also make proofs difficult. Further work is needed to see whether some of these problems can be mitigated. Though rules are useful, using them correctly and efficiently for processing RDF data proved to be difficult. by Joseph Scharf. M.Eng.
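The forward-reasoning style mentioned above can be sketched as a fixpoint computation over RDF-style triples. This is a minimal illustration, not the thesis's reasoner; the triples and the single rdfs:subClassOf typing rule are invented for the example:

```python
# A tiny forward-chaining reasoner: repeatedly apply one entailment rule
# (if s rdf:type C and C rdfs:subClassOf D, then s rdf:type D) until no
# new triples are produced.
facts = {
    ("socrates", "rdf:type", "Human"),
    ("Human", "rdfs:subClassOf", "Mortal"),
}

def forward_chain(facts):
    """Compute the closure of the facts under the subclass typing rule."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        new = set()
        for (s, p1, c) in facts:
            if p1 != "rdf:type":
                continue
            for (c2, p2, sup) in facts:
                if p2 == "rdfs:subClassOf" and c2 == c:
                    triple = (s, "rdf:type", sup)
                    if triple not in facts:
                        new.add(triple)
        if new:
            facts |= new
            changed = True
    return facts

closed = forward_chain(facts)
print(("socrates", "rdf:type", "Mortal") in closed)  # True
```

A backward reasoner inverts this: it starts from the goal triple and searches for rules and facts that could establish it, which avoids materializing the whole closure.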

    MPS: a multiagent production system


    A distributed rule-based expert system for large event stream processing

    Rule-based expert systems (RBSs) provide an efficient solution to many problems that involve event stream processing. With today’s need to process larger streams, many approaches have been proposed to distribute the rule engines behind RBSs. However, some issues limit the potential of distributed RBSs in the current big data era, such as the load imbalance caused by their distribution methods and the low parallelism originating from the continuous-operator model. To address these issues, we propose a new architecture for distributing rule engines. This architecture adopts the dynamic job assignment and micro-batching strategies, which have recently arisen in the big data community, to remove the load imbalance and increase the parallelism of distributed rule engines. An automated transformation framework based on Model-driven Architecture (MDA) is presented, which can be used to transform current rule engines to work on the proposed architecture. This work is validated by a two-step verification. In addition, we propose a generic benchmark for evaluating the performance of distributed rule engines. The performance of the proposed architecture is discussed and directions for future research are suggested. The contribution of this study can be viewed from two angles: for the rule-based system community, this thesis documents an improvement to rule engines by fully adopting big data technologies; for the big data community, it is an early proposal to process large event streams using a well-crafted rule-based system. Our results show the proposed approach can benefit both research communities.
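The micro-batching strategy the abstract contrasts with the continuous-operator model can be sketched briefly: instead of pushing each event through the engine one at a time, events are grouped into small batches that can be handed to whichever worker is free, which is what makes dynamic job assignment possible. The batch size, event shape, and toy threshold rule below are invented for illustration:

```python
from itertools import islice

def micro_batches(events, batch_size):
    """Yield successive fixed-size batches from an event stream."""
    it = iter(events)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

def apply_rules(batch):
    # Stand-in for one rule-engine pass over a batch:
    # keep only the events whose value exceeds a threshold.
    return [e for e in batch if e["value"] > 10]

stream = [{"id": i, "value": i * 3} for i in range(10)]
# Each batch is an independent unit of work that a scheduler could
# assign to any idle worker.
matched = [e for batch in micro_batches(stream, 4) for e in apply_rules(batch)]
print(len(matched))  # 6 events match across the 3 batches
```

In the distributed setting each batch becomes a job; because jobs are assigned dynamically rather than pinned to one operator, a slow or overloaded node no longer stalls the whole stream.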

    Towards computerizing intensive care sedation guidelines: design of a rule-based architecture for automated execution of clinical guidelines

    Background: Computerized ICUs rely on software services to convey the medical condition of their patients and to assist the staff in taking treatment decisions. Such services are useful for following clinical guidelines quickly and accurately. However, the development of services is often time-consuming and error-prone. Consequently, many care-related activities are still conducted based on manually constructed guidelines, which are often ambiguous and lead to unnecessary variations in treatments and costs. The goal of this paper is to present a semi-automatic verification and translation framework capable of turning manually constructed diagrams into ready-to-use programs. This framework combines the strengths of the manual and service-oriented approaches while reducing their disadvantages, with the aim of closing the communication gap between the IT and medical domains. This leads to a less time-consuming and error-prone development phase and a shorter clinical evaluation phase.
    Methods: A framework is proposed that semi-automatically translates a clinical guideline, expressed as an XML-based flow chart, into a Drools Rule Flow by employing semantic technologies such as ontologies and SWRL. An overview of the architecture is given and all the technology choices are thoroughly motivated. Finally, it is shown how this framework can be integrated into a service-oriented architecture (SOA).
    Results: The applicability of the Drools Rule language to express clinical guidelines is evaluated by translating an example guideline, namely the sedation protocol used for the anaesthetization of patients, into a Drools Rule Flow and executing and deploying this rule-based application as part of a SOA. The results show that the performance of Drools is comparable to other technologies such as Web Services and increases with the number of decision nodes present in the Rule Flow. Most delays are introduced by loading the Rule Flows.
    Conclusions: The framework is an effective solution for computerizing clinical guidelines, as it allows for quick development, evaluation, and human-readable visualization of the rules, and it performs well. By monitoring the parameters of the patient to automatically detect exceptional situations and problems, and by notifying the medical staff of tasks that need to be performed, the computerized sedation guideline improves the execution of the guideline.
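The mapping from a flow chart's decision nodes to executable rules can be sketched as a small condition/action table. This is a hypothetical fragment, not the actual sedation protocol or its Drools translation; the parameter name, score thresholds, and actions are invented for illustration:

```python
# Each decision node becomes a (condition, action) pair; the guideline
# is executed by firing the first rule whose condition matches the
# patient's current parameters. Thresholds here are invented examples.
rules = [
    (lambda p: p["sedation_score"] < 2, "decrease sedative infusion"),
    (lambda p: p["sedation_score"] > 4, "increase sedative infusion"),
    (lambda p: True, "maintain current infusion"),  # default branch
]

def evaluate(patient):
    """Return the action of the first rule whose condition holds."""
    for condition, action in rules:
        if condition(patient):
            return action

print(evaluate({"sedation_score": 5}))  # increase sedative infusion
print(evaluate({"sedation_score": 3}))  # maintain current infusion
```

A rule engine such as Drools generalizes this pattern: conditions are matched continuously against incoming patient data, so exceptional situations trigger the corresponding action without the flow chart being traversed by hand.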

    Gamification as a Service: Conceptualization of a Generic Enterprise Gamification Platform

    Gamification is a novel method to improve engagement, motivation, or participation in non-game contexts using game mechanics. To a large extent, gamification is a psychological- and design-oriented discipline, i.e., a lot of effort has to be spent already in the design phase of a gamification project. Subsequently, the design is implemented in information systems such as portals or enterprise resource planning applications. These systems act as mediators to transport a gameful design to its users. However, the effort for the subsequent development and integration process is often underestimated. In fact, most conceptual gamification designs are never implemented due to the high development costs that arise from building the gamification solution from scratch, imprecise design or technical requirements, and communication conflicts between different stakeholders in the project. This thesis addresses these problems by systematically defining the phases and stakeholders of the overall gamification process. Furthermore, the thesis rigorously defines the conceptual requirements of gamification based on a broad literature review. The identified conceptual requirements are mapped to a domain-specific language, called the Gamification Modeling Language. Moreover, this thesis analyzes 29 existing gamification solutions that aim to decrease the implementation effort of gamification. Using the language elements, however, it is shown that none of the existing solutions satisfies all the requirements. Therefore, a generic and reusable platform is proposed as a runtime environment for gamification, fulfilling all presented functional and non-functional requirements. As another benefit, it is shown how the Gamification Modeling Language can be automatically compiled into code for the gamification runtime environment, further reducing development effort. Based on the developed artifacts and five real gamified applications from industry, it is shown that the effort for implementing the gamification can be significantly reduced from several months or weeks to a few days. Since the technology is designed as a reusable service, future projects benefit continuously with regard to time and effort.
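The platform idea above, game mechanics declared once and interpreted by a generic runtime, can be sketched briefly. The dict below stands in for a Gamification Modeling Language description; the event names, point values, and badge threshold are invented for illustration and are not taken from the actual language:

```python
# Game mechanics as data: a generic runtime interprets this declarative
# description, so no per-project engine code has to be written.
config = {
    "rules": [
        {"event": "document_uploaded", "points": 10},
        {"event": "comment_posted", "points": 2},
    ],
    "badges": [{"name": "Contributor", "threshold": 20}],
}

def process(events, config):
    """Interpret the declarative config against a user's event stream."""
    points = sum(
        rule["points"]
        for e in events
        for rule in config["rules"]
        if rule["event"] == e
    )
    badges = [b["name"] for b in config["badges"] if points >= b["threshold"]]
    return points, badges

points, badges = process(
    ["document_uploaded", "document_uploaded", "comment_posted"], config
)
print(points, badges)  # 22 ['Contributor']
```

Compiling a modeling language into such a configuration, rather than into bespoke application code, is what lets one runtime serve many gamification projects.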

    Semantic In-Network Complex Event Processing for an Energy Efficient Wireless Sensor Network

    Wireless Sensor Networks (WSNs) consist of spatially distributed sensor nodes that perform monitoring tasks in a region and gateway nodes that provide the acquired sensor data to the end user. With advances in WSN technology, it has now become possible to have different types of sensor nodes within a region to monitor the environment. This provides the flexibility to monitor the environment more extensively than before. Sensor nodes are severely constrained devices with very limited battery sources, and their resource scarcity remains a challenge. In traditional WSNs, the sensor nodes are used only for capturing data that is analysed later in more powerful gateway nodes. This continuous communication of data between sensor nodes and gateway nodes wastes energy at the sensor nodes, and consequently, the overall network lifetime is greatly reduced. Existing approaches that reduce energy consumption by processing at the sensor node level only work for homogeneous networks. This thesis presents a sensor node architecture for heterogeneous WSNs, called SEPSen, where data is processed locally at the sensor node level to reduce energy consumption. We use ontology fragments at the sensor nodes to enable data exchange between heterogeneous sensor nodes within the WSN. We employ a rule engine based on a pattern-matching algorithm for filtering events at the sensor node level. Event routing towards the gateway nodes is performed using a context-aware routing scheme that takes both the energy consumption and the heterogeneity of the sensor nodes into account. As a proof of concept, we present a prototypical implementation of the SEPSen design in a simulation environment. By providing semantic support, in-network data processing capabilities, and context-aware routing in SEPSen, the sensor nodes (1) communicate with each other despite their different sensor types, (2) filter events at their own level to conserve the limited sensor node energy resources, and (3) share the nodes' knowledge bases for collaboration between the sensor nodes using node-centric context-awareness in changing conditions. The SEPSen prototype has been evaluated based on a test case for water quality management. The results from the experiments show that the energy saved in SEPSen reaches almost 50% by processing events at the sensor node level, and the overall network lifetime is increased by at least a factor of two compared with the shortest-path-first (Min-Hop) routing approach.
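The energy argument behind in-network filtering can be made concrete with a toy model: a node evaluates a simple rule locally and transmits only matching events, so radio use, the dominant energy cost, drops with the filter's selectivity. The readings, threshold, and cost figure below are invented for illustration and do not come from the SEPSen evaluation:

```python
TX_COST = 1.0  # assumed energy units per transmitted reading

def node_filter(readings, threshold):
    """Forward only the readings that exceed the alert threshold."""
    return [r for r in readings if r > threshold]

readings = [18.0, 19.5, 26.0, 21.0, 30.5, 19.0, 27.5, 20.0]
forwarded = node_filter(readings, 25.0)

naive_energy = TX_COST * len(readings)      # ship every reading to the gateway
filtered_energy = TX_COST * len(forwarded)  # filter at the node first
saving = 1 - filtered_energy / naive_energy
print(len(forwarded), saving)  # 3 of 8 readings forwarded
```

SEPSen's actual rule engine matches patterns over semantically annotated events rather than a bare threshold, but the lifetime gain comes from the same effect: fewer transmissions per node.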