
    An overview of decision table literature 1982-1995.

    This report gives an overview of the literature on decision tables over the past 15 years. As much as possible, an author-supplied abstract, a number of keywords, and a classification are provided for each reference. In some cases our own comments are added; their purpose is to show where, how, and why decision tables are used. The literature is classified according to application area, theoretical versus practical character, year of publication, country of origin (not necessarily the country of publication), and the language of the document. After a description of the scope of the overview, classification results and the classification by topic are presented. The main body of the paper is the ordered list of publications with abstracts, classifications, and comments.
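The decision-table idea surveyed above can be illustrated with a short sketch: conditions are enumerated exhaustively and each combination maps to exactly one action. The rules below are hypothetical, purely for illustration, not drawn from the surveyed literature.

```python
# A decision table as a dictionary: every combination of condition outcomes
# (is_member, order_over_100) maps to exactly one shipping action.
rules = {
    (True,  True):  "free_shipping",
    (True,  False): "discount_shipping",
    (False, True):  "discount_shipping",
    (False, False): "standard_shipping",
}

def decide(is_member: bool, order_over_100: bool) -> str:
    """Look up the action for a combination of condition outcomes."""
    return rules[(is_member, order_over_100)]

print(decide(True, False))  # discount_shipping
```

Because the table is exhaustive over its condition space, completeness and contradiction checks reduce to checking the key set, which is one reason decision tables are attractive for specification and verification.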

    Software framework for the development of context-aware reconfigurable systems

    In this project we propose a new software framework for the development of context-aware, secure control software for distributed reconfigurable systems. Context-awareness is a key feature that allows a system to adapt its behaviour to a changing environment. We introduce a new definition of the term "context" for reconfigurable systems and then define a new context modelling and reasoning approach. Afterwards, we define a meta-model of context-aware reconfigurable applications that paves the way to the proposed framework. The framework has a three-layer architecture (reconfiguration, context control, and services layers), where each layer has a well-defined role. We also define a new secure conversation protocol between distributed trustless parties based on blockchain technology and elliptic-curve cryptography. To obtain better correctness and deployment guarantees for application models in early development stages, we propose a new UML profile called GR-UML that adds semantics for modelling probabilistic scenarios running under memory and energy constraints, and then propose a methodology using transformations between GR-UML, the GR-TNCES Petri net formalism, and IEC 61499 function blocks. A software tool implementing the methodology concepts has been developed. To show the suitability of these contributions, two case studies (a baggage handling system and microgrids) are considered.
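The core context-awareness mechanism described above, observing context attributes and reconfiguring behaviour when they change, can be sketched minimally. This is a hypothetical observer-style illustration, not the GR-UML/GR-TNCES toolchain or the framework's actual API; the `battery_pct` attribute and the 20% threshold are invented for the example.

```python
from typing import Callable, Dict, List

class ContextManager:
    """Holds context attributes and notifies subscribers on every change."""
    def __init__(self) -> None:
        self._context: Dict[str, object] = {}
        self._subscribers: List[Callable[[Dict[str, object]], None]] = []

    def subscribe(self, callback: Callable[[Dict[str, object]], None]) -> None:
        self._subscribers.append(callback)

    def update(self, key: str, value: object) -> None:
        self._context[key] = value
        for cb in self._subscribers:
            cb(dict(self._context))  # pass a snapshot of the context

class ReconfigurableService:
    """Switches configuration when the observed context changes."""
    def __init__(self) -> None:
        self.mode = "normal"

    def on_context(self, ctx: Dict[str, object]) -> None:
        # Reconfiguration rule: degrade gracefully when battery is low.
        self.mode = "low_power" if ctx.get("battery_pct", 100) < 20 else "normal"

ctx = ContextManager()
svc = ReconfigurableService()
ctx.subscribe(svc.on_context)
ctx.update("battery_pct", 15)
print(svc.mode)  # low_power
```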

    A heuristic-based approach to code-smell detection

    Encapsulation and data hiding are central tenets of the object-oriented paradigm. Deciding what data and behaviour to form into a class, and where to draw the line between its public and private details, can make the difference between a class that is an understandable, flexible, and reusable abstraction and one that is not. This decision is difficult and may easily result in poor encapsulation, which in turn can have serious implications for a number of system qualities. Such encapsulation problems are often hard to identify within large software systems until they cause a maintenance problem (by which point it is usually too late), and attempting to perform such analysis manually can be tedious and error-prone. Two of the common encapsulation problems that arise from this decomposition process are data classes and god classes. Typically, the two occur together: data classes lack functionality that has been absorbed into an over-complicated and domineering god class. This paper describes the architecture of a tool, developed as a plug-in for the Eclipse IDE, that automatically detects data classes and god classes. The technique has been evaluated in a controlled study on two large open-source systems, comparing the tool's results with similar work by Marinescu, who employs a metrics-based approach to detecting such features. The study provides some valuable insights into the strengths and weaknesses of the two approaches.
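A metrics-based detection strategy of the kind the paper compares against can be sketched as threshold rules over per-class metrics. The metrics and thresholds below are illustrative (loosely in the spirit of Marinescu-style strategies), not the tool's actual heuristics.

```python
from dataclasses import dataclass

@dataclass
class ClassMetrics:
    name: str
    methods: int               # number of methods (NOM-like)
    attributes: int            # number of attributes
    foreign_data_access: int   # accesses to other classes' data (ATFD-like)
    complexity: int            # summed cyclomatic complexity (WMC-like)

def is_god_class(m: ClassMetrics) -> bool:
    # Large, complex, and reaching into other classes' data.
    return m.complexity > 47 and m.foreign_data_access > 5 and m.methods > 15

def is_data_class(m: ClassMetrics) -> bool:
    # Mostly state, almost no behaviour of its own.
    return m.methods <= 2 and m.attributes >= 5 and m.complexity <= 2

hub = ClassMetrics("OrderManager", methods=40, attributes=12,
                   foreign_data_access=9, complexity=80)
bag = ClassMetrics("OrderData", methods=1, attributes=8,
                   foreign_data_access=0, complexity=1)
print(is_god_class(hub), is_data_class(bag))  # True True
```

The pairing visible in the example, a behaviour-heavy hub next to a behaviour-free data bag, is exactly the co-occurrence pattern the abstract describes.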

    Model-Driven Engineering Method to Support the Formalization of Machine Learning using SysML

    Methods: This work introduces a method that supports the collaborative definition of machine learning tasks by leveraging model-based engineering and the systems modeling language SysML for their formalization. The method supports the identification and integration of various data sources, the definition of the required semantic connections between data attributes, and the definition of data processing steps within the machine learning support. Results: By consolidating the knowledge of domain experts and machine learning experts, a powerful tool for describing machine learning tasks through knowledge formalized in SysML is introduced. The method is evaluated on two use cases: a smart weather system that predicts the weather from sensor data, and a waste-prevention case for 3D-printer filament that cancels printing if the intended result cannot be achieved (image processing). Further, a user study is conducted to gather insights from potential users regarding the perceived workload and usability of the elaborated method. Conclusion: Integrating machine learning-specific properties into systems engineering techniques allows non-data scientists to understand formalized knowledge and define specific aspects of a machine learning problem, to document knowledge about the data, and to support data scientists in using the formalized knowledge as input for an implementation via (semi-)automatic code generation. In this respect, this work consolidates knowledge from various domains and thereby fosters the integration of machine learning in industry by involving several stakeholders. Comment: 43 pages, 24 figures, 3 tables
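The idea of a formalized task description driving (semi-)automatic code generation can be sketched as follows. The `MLTask` structure and the generated skeleton are hypothetical stand-ins for what a SysML model would capture; none of these names come from the paper.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MLTask:
    """A formalized ML task: roughly what a SysML model might capture."""
    name: str
    data_sources: List[str]
    features: List[str]          # semantically connected data attributes
    target: str
    preprocessing: List[str] = field(default_factory=list)

def generate_pipeline_stub(task: MLTask) -> str:
    """Emit a pipeline skeleton from the formalized task description."""
    steps = "\n".join(f"    # step: {p}" for p in task.preprocessing)
    return (f"def {task.name}_pipeline(data):\n"
            f"{steps}\n"
            f"    # fit a model on {task.features} predicting {task.target}\n"
            f"    return data\n")

weather = MLTask("weather_forecast",
                 data_sources=["sensor_db"],
                 features=["temperature", "humidity", "pressure"],
                 target="rain_next_hour",
                 preprocessing=["impute_missing", "normalize"])
print(generate_pipeline_stub(weather))
```

The point of the sketch is the division of labour the abstract describes: domain experts fill in the declarative description, data scientists take the generated skeleton from there.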

    Dynamic simulation of a modular flotation plant

    The aim of this thesis work is to develop a dynamic simulator of a modular flotation plant, with the final purpose of using the simulator as a training tool for operators. The thesis report describes basic theory about the froth flotation process, process modelling, automated control, and the development process of the final simulator tool. The simulator was developed for Outotec Oy® Finland based on their designs and calculations, with the aid of the following programs: HSC Sim 9, SIEMENS PCS 7, Virtual Experience Manager, and Virtual Experience Client. HSC Sim performs the dynamic simulation of the flotation process. SIEMENS PCS 7 is the distributed control system (DCS) in which the automated control, the human-machine interface, and the instrumentation are created. The final training platform and the training exercises were created using Virtual Experience Manager, while Virtual Experience Client was used to run the training simulator. The final version of the training simulator was capable of simulating the required flotation flowsheet with the input ore feed. It can be used as a base platform on which further development regarding instrumentation, the flotation circuit, and mineral processing can be carried out to fit different configurations as required. Further development of the simulator requires improving the capabilities of the programs used for the development.
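Dynamic flotation simulation of the kind performed by HSC Sim can be illustrated with a single-cell toy model under first-order flotation kinetics. All parameter values, and the model itself, are illustrative assumptions, not Outotec's designs or calculations.

```python
def simulate_cell(feed_rate=1.0, k=0.5, residence=2.0, dt=0.01, t_end=20.0):
    """Euler-integrate the floatable mineral mass M held in the pulp:
    dM/dt = feed_rate - k*M (flotation to concentrate) - M/residence (to tails).
    Returns the final mass and the steady-state recovery to concentrate."""
    M, t = 0.0, 0.0
    while t < t_end:
        dM = feed_rate - k * M - M / residence
        M += dM * dt
        t += dt
    recovery = k * M / feed_rate  # fraction of feed reporting to concentrate
    return M, recovery

M, rec = simulate_cell()
print(round(rec, 3))  # ≈ k / (k + 1/residence) = 0.5
```

At steady state the mass balance gives M = feed/(k + 1/residence), so the simulated recovery can be checked against the analytic value, a useful sanity check when building larger flowsheets from such cells.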

    Software Systems Engineering for Cyber Physical Production Systems

    This project addresses the problem of easing the adoption and use of cyber-physical production systems (CPPS) by small-scale industries. It develops a requirements engineering methodology for CPPS and, ultimately, for the whole system: the approach spans requirements engineering, mapping to IEC 61499 function blocks, and deployment to physical devices. This work can serve as a foundation for scientific communities or industrialists to implement requirements engineering for small-scale CPPS and thereby build a 21st-century production system and reap its benefits. Cyber-physical production systems are the future of production systems, not only in Europe but worldwide. They bring huge benefits and are popularly associated with Industry 4.0. These are automated systems in which physical processes are monitored and controlled by computer-based algorithms in real time. Traditional systems have certain disadvantages: they are limited in hours of operation because they depend on manpower, in the range of products that can be produced without major changes to the production configuration, and in production speed. A great deal of research is under way in Europe, particularly in Germany, and in the United States to upgrade major physical and manufacturing systems; examples include smart factories, smart grids, autonomous automobile systems, automatic pilot avionics, and robotics systems. The main goal of this thesis is to define a set of methodologies for easing the implementation of CPPS in small and medium industries so that the adoption rate in such industries can be high. No methodology yet exists specifically for CPPS in small and medium industries, although methodologies are in place for large industries.
    To that end, the challenges of developing a requirements engineering process, and how it differs from that of a typical software system, are first studied in section 3. An approach was then developed based on existing information about large systems and CPPS and on software engineering frameworks such as MODAF and TOGAF; a proposal for the process, together with diagrams and tools, is made in section 4. To validate the proposed approach, a synthetic test case of a pizza production system was used, applying all the approaches to transform it into a cyber-physical production system, from requirements and UML diagrams to the final function block design. With this set of approaches there is now a basis for a software development methodology aimed particularly at small and medium industries, which can raise their adoption rate and bring traditional industries to the 21st-century forefront.
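The IEC 61499 target of the mapping above is an event-driven function block model: event inputs trigger algorithms that read data inputs and update data outputs. The sketch below mimics those semantics in plain Python; the class, the `REQ` event name, and the oven example values are illustrative, not an IEC 61499 runtime.

```python
from typing import Callable, Dict

class FunctionBlock:
    """Toy event-driven block: events trigger algorithms over input/output data,
    loosely imitating IEC 61499 basic function block semantics."""
    def __init__(self, name: str) -> None:
        self.name = name
        self.inputs: Dict[str, object] = {}
        self.outputs: Dict[str, object] = {}
        self._handlers: Dict[str, Callable[["FunctionBlock"], None]] = {}

    def on_event(self, event: str, algorithm: Callable[["FunctionBlock"], None]) -> None:
        self._handlers[event] = algorithm

    def fire(self, event: str) -> None:
        if event in self._handlers:
            self._handlers[event](self)

# An oven block in the spirit of the thesis' synthetic pizza-production case.
oven = FunctionBlock("Oven")
oven.inputs["temperature"] = 180

def check_temp(fb: FunctionBlock) -> None:
    fb.outputs["ready"] = fb.inputs["temperature"] >= 200

oven.on_event("REQ", check_temp)
oven.fire("REQ")
print(oven.outputs["ready"])  # False: oven not yet at baking temperature
```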

    BIM-based decision support for building condition assessment

    Building condition assessment requires the integration of various types of data, such as building characteristics, the properties of elements/systems, and maintenance records. Previous research has focused on identifying these data and developing a building condition risk assessment model based on Bayesian networks (BN). However, due to interoperability issues, the data transfer process is performed manually, which requires considerable time and effort. To address this issue, this paper presents a data model to integrate the building condition risk assessment model into BIM. The proposed data model is implemented in existing software as a case study and is tested and evaluated on three scenarios. Addressing interoperability will leverage the BIM tool as a data repository, automating the data transfer process and improving its consistency and reliability. It will also make BIM a more effective tool for visualizing building condition and causality analysis. This work was supported by Agència de Gestió d'Ajuts Universitaris i de Recerca (AGAUR) from Generalitat de Catalunya under Grant 2019 FI_B00064. Postprint (published version)
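The Bayesian-network style of reasoning used by the risk assessment model can be illustrated with a single Bayes'-rule update. The two-node structure and all probabilities below are made up for the sketch; they are not the cited model's parameters.

```python
# Prior probability that an element is defective, and likelihoods of an
# inspection observation (e.g. cracking in a maintenance record).
p_defect = 0.1
p_obs_given_defect = 0.8
p_obs_given_ok = 0.05

def posterior_defect(observed: bool) -> float:
    """Bayes' rule: update the defect probability given the observation."""
    if observed:
        num = p_obs_given_defect * p_defect
        den = num + p_obs_given_ok * (1 - p_defect)
    else:
        num = (1 - p_obs_given_defect) * p_defect
        den = num + (1 - p_obs_given_ok) * (1 - p_defect)
    return num / den

print(round(posterior_defect(True), 3))  # 0.64
```

Automating the data transfer from BIM, as the paper proposes, amounts to feeding such evidence nodes directly from the model's element properties and maintenance records instead of entering them by hand.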

    OPERATION AND PROCESS CONTROL DEVELOPMENT FOR A PILOT-SCALE LEACHING AND SOLVENT EXTRACTION CIRCUIT RECOVERING RARE EARTH ELEMENTS FROM COAL-BASED SOURCES

    In 2010, the US Department of Energy identified several rare earth elements (REEs) as critical materials for enabling clean technologies. As part of ongoing research into REE recovery from coal sources, the University of Kentucky has designed, developed, and is demonstrating a ¼-ton/hour pilot-scale processing plant to produce high-grade REEs from coal-based sources. Because critical variables (e.g., pH and tank level) must be kept within limits, process control is required. To ensure adequate process control, a study was conducted on leaching and solvent extraction control to evaluate the potential of achieving low-cost REE recovery, in addition to developing a PLC-based process control system. The overall operational design and the use of Six Sigma methodologies are discussed, as is the application of the controls design, both procedural and electronic, to the control of process variables such as pH. Variations in output parameters were quantified as a function of time, and data trends show that the mean process variable was maintained within prescribed limits. Future work on data analysis and integration for data-based decision-making is also discussed.
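The kind of loop a PLC would run to hold a variable such as pH at a setpoint can be sketched as a discrete PID controller against a toy process model. The gains, the setpoint, and the linear process response are all invented for the illustration; they are not the plant's actual control program.

```python
def pid_controller(setpoint, kp=2.0, ki=0.5, kd=0.1, dt=1.0):
    """Return a step function implementing a discrete PID law."""
    integral, prev_err = 0.0, 0.0
    def step(measured):
        nonlocal integral, prev_err
        err = setpoint - measured
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        return kp * err + ki * integral + kd * deriv  # reagent dosing signal
    return step

# Toy self-regulating process: pH settles at 2.0 plus a gain times the dose.
ctrl = pid_controller(setpoint=4.5)
ph = 2.0
for _ in range(200):
    dose = ctrl(ph)
    ph = 2.0 + 0.05 * dose
print(abs(ph - 4.5) < 0.2)  # True: the loop settles near the setpoint
```

The integral term is what removes the steady-state offset a proportional-only controller would leave, which is why keeping the mean process variable within prescribed limits, as reported above, typically relies on PI or PID action.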