    Using CLIPS in the domain of knowledge-based massively parallel programming

    Get PDF
    The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency with respect to the exploitation of parallelism. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about the application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS in the C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering is discussed.
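
    For illustration only, here is a minimal Python sketch of the skeleton-hierarchy idea described above; the class, attribute and skeleton names are invented, and the PDE's actual knowledge base is written in CLIPS/COOL, not Python.

        # Hypothetical sketch: algorithmic skeletons kept in a tree hierarchy,
        # with a depth-first lookup that returns the most specialized skeleton
        # matching an abstract operation kind from the user's specification.
        from dataclasses import dataclass, field

        @dataclass
        class Skeleton:
            name: str
            applies_to: set                                # operation kinds this skeleton covers
            children: list = field(default_factory=list)   # more specialized skeletons

            def find(self, op_kind):
                if op_kind not in self.applies_to:
                    return None
                for child in self.children:
                    hit = child.find(op_kind)
                    if hit is not None:
                        return hit                         # prefer the deeper, more specific node
                return self

        # Toy hierarchy: a generic data-parallel map with two specializations.
        root = Skeleton("map", {"elementwise", "stencil"}, [
            Skeleton("map-elementwise", {"elementwise"}),
            Skeleton("map-stencil", {"stencil"}),
        ])
        print(root.find("stencil").name)   # -> map-stencil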

    A Domain-Specific Language and Editor for Parallel Particle Methods

    Full text link
    Domain-specific languages (DSLs) are of increasing importance in scientific high-performance computing to reduce development costs, raise the level of abstraction and, thus, ease scientific programming. However, designing and implementing DSLs is not an easy task, as it requires knowledge of the application domain and experience in language engineering and compilers. Consequently, many DSLs follow a weak approach using macros or text generators, which lack many of the features that make a DSL comfortable for programmers. Some of these features (e.g., syntax highlighting, type inference, error reporting, and code completion) are easily provided by language workbenches, which combine language engineering techniques and tools in a common ecosystem. In this paper, we present the Parallel Particle-Mesh Environment (PPME), a DSL and development environment for numerical simulations based on particle methods and hybrid particle-mesh methods. PPME uses the Meta Programming System (MPS), a projectional language workbench. PPME is the successor of the Parallel Particle-Mesh Language (PPML), a Fortran-based DSL that used conventional implementation strategies. We analyze and compare both languages and demonstrate how the programmer's experience can be improved using static analyses and projectional editing. Furthermore, we present an explicit domain model for particle abstractions and the first formal type system for particle methods. Comment: Submitted to ACM Transactions on Mathematical Software on Dec. 25, 201
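
    As a purely illustrative sketch of what a type rule for particle abstractions might enforce (PPME itself is built on the MPS language workbench; the particle type, properties and check below are invented), consider:

        # Hypothetical particle domain model with a rudimentary property type check.
        from dataclasses import dataclass

        @dataclass
        class ParticleType:
            name: str
            properties: dict          # property name -> declared type, e.g. "float"

        def check_access(ptype: ParticleType, prop: str, expected: str) -> None:
            """Reject accesses to undeclared or wrongly typed particle properties."""
            actual = ptype.properties.get(prop)
            if actual is None:
                raise TypeError(f"{ptype.name} has no property '{prop}'")
            if actual != expected:
                raise TypeError(f"'{prop}' has type {actual}, expected {expected}")

        tracer = ParticleType("Tracer", {"pos": "vec3", "conc": "float"})
        check_access(tracer, "conc", "float")     # passes
        # check_access(tracer, "mass", "float")   # would raise: undeclared property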

    Exploranative Code Quality Documents

    Full text link
    Good code quality is a prerequisite for efficiently developing maintainable software. In this paper, we present a novel approach to generate exploranative (explanatory and exploratory) data-driven documents that report code quality in an interactive, exploratory environment. We employ a template-based natural language generation method to create textual explanations about the code quality, based on data from software metrics. The interactive document is enriched by different kinds of visualization, including parallel coordinates plots and scatterplots for data exploration and graphics embedded into text. We devise an interaction model that allows users to explore code quality with consistent linking between text and visualizations; through integrated explanatory text, users are taught background knowledge about code quality aspects. Our approach to interactive documents was developed in a design study process that included software engineering and visual analytics experts. Although the solution is specific to the software engineering scenario, we discuss how the concept could generalize to multivariate data and report lessons learned in a broader scope. Comment: IEEE VIS VAST 201
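
    The following Python fragment is a minimal sketch of the general template-based idea (the metrics, thresholds and wording are invented here, not taken from the paper):

        # Generate one explanatory sentence per software metric that violates
        # an (invented) threshold; real systems would use richer templates.
        METRICS = {"cyclomatic_complexity": 14, "comment_ratio": 0.08}

        TEMPLATES = [
            ("cyclomatic_complexity", lambda v: v > 10,
             "The method is hard to test: its cyclomatic complexity is {v}, above 10."),
            ("comment_ratio", lambda v: v < 0.10,
             "Only {v:.0%} of the lines are comments, below the recommended 10%."),
        ]

        def explain(metrics):
            for name, violated, template in TEMPLATES:
                value = metrics[name]
                if violated(value):
                    yield template.format(v=value)

        print("\n".join(explain(METRICS)))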

    Exchange of knowledge in customized product development processes

    Get PDF
    If Customized Product Development is understood as developing products that fulfil the customer's individual requirements while also reflecting production constraints, such as manufacturing capabilities, a direct demand can be derived for solutions that adapt a given design easily and quickly to new requirements based upon the company's production knowledge - at best in an automated way. The latter is usually covered by Knowledge Based Engineering (KBE) systems. KBE systems are capable of automating repetitive engineering tasks, such as the automated calculation of ship structural design. However, while the efficiency of implemented KBE projects is not controversial, the development or modification of an existing KBE solution usually requires substantial investment in knowledge acquisition, codification and software implementation. In addition, most solutions are still case-based and not grounded in structural frameworks. Knowledge is often written in a proprietary language; rules and algorithms are not compatible with other KBE frameworks and are usually not expressed at a level that is comprehensible for engineers or domain experts. While this may not be crucial for long development cycles, it may become a hurdle in Customized Product Development with its short cycles. In other words, future KBE must support the incorporation of knowledge from different domains and business units. The objective of the paper is therefore to explain the need for a change in collaborative knowledge sharing and re-use in the context of KBE. Based upon this, the constraints for a KBE-related interchange format are drafted. A three-layered approach is proposed in order to adequately represent and exchange KBE knowledge. Each layer addresses a different level of abstraction: an upper layer describing just the core knowledge at a glance, a middle layer codifying the knowledge at an abstract level but with a view to software development, and a base layer covering the software code itself. Using an independent format for the management of KBE knowledge, the users of CAx systems are able to exchange codified knowledge and obtain the rationale behind it. The full paper thus attempts to deliver a substantial contribution to the development of systems that can easily adapt a given design to upcoming user requirements while facing the production challenges
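
    A minimal sketch of the three-layer idea, with invented field names and an invented example rule (the paper does not prescribe a concrete syntax):

        # Each knowledge item carries a glance-level summary (upper layer),
        # an abstract codification (middle layer) and executable code (base layer).
        from dataclasses import dataclass

        @dataclass
        class KBEKnowledgeItem:
            summary: str          # upper layer: core knowledge at a glance
            abstract_rule: str    # middle layer: abstract, tool-neutral codification
            implementation: str   # base layer: concrete code for a specific KBE system

        plate_rule = KBEKnowledgeItem(
            summary="Plate thickness follows class rules for the midship section.",
            abstract_rule="thickness = f(frame_spacing, design_pressure, material)",
            implementation="def thickness(s, p, mat): return k * s * (p / mat.sigma) ** 0.5",
        )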

    Automatic Generation of Trace Links in Model-driven Software Development

    Get PDF
    Traceability data provides knowledge about the dependencies and logical relations that exist amongst artefacts created during software development. By reasoning over traceability data, conclusions can be drawn to increase the quality of software. The paradigm of Model-driven Software Engineering (MDSD) promotes the generation of software out of models. The latter are specified through different modelling languages. In subsequent model transformations, these models are used to generate programming code automatically. Traceability data of the artefacts involved in an MDSD process can be used to increase software quality by providing the necessary knowledge described above. Existing traceability solutions in MDSD are based on the integral model mapping of transformation execution to generate traceability data. Yet, these solutions still entail a wide range of open challenges. One challenge is that the collected traceability data does not adhere to a unified formal definition, which leads to poorly integrated traceability data and aggravates reasoning over it. Furthermore, these traceability solutions all depend on the existence of a transformation engine. However, a transformation engine cannot be accessed in all MDSD settings, for instance with proprietary transformation engines or manually implemented transformations. In these cases it is not possible to instrument the transformation engine for the sake of generating traceability data, resulting in a lack of traceability data. In this work, we address these shortcomings. We propose a generic traceability framework for augmenting arbitrary transformation approaches with a traceability mechanism. To integrate traceability data from different transformation approaches, our approach features a methodology for augmentation based on a design pattern. The design pattern supplies the engineer with recommendations for designing the traceability mechanism and for modelling traceability data. Additionally, to provide a traceability mechanism for inaccessible transformation engines, we leverage parallel model matching to generate traceability data for arbitrary source and target models. This approach is based on a language-agnostic concept of three similarity measures for matching. To realise the similarity measures, we exploit metamodel matching techniques for graph-based model matching. Finally, we evaluate our approach on a set of transformations from an SAP business application and the domain of MDSD
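
    As a hedged illustration of the matching idea (the three similarity measures, weights and threshold below are placeholders, not those defined in this work):

        # Pair each source model element with its best-matching target element
        # and emit a trace link whenever the combined similarity is high enough.
        from difflib import SequenceMatcher

        def name_similarity(a, b):
            return SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()

        def type_similarity(a, b):
            return 1.0 if a["type"] == b["type"] else 0.0

        def neighbour_similarity(a, b):
            sa, sb = set(a["neighbours"]), set(b["neighbours"])
            return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

        def trace_links(source_elems, target_elems, threshold=0.6):
            links = []
            for s in source_elems:
                scored = [(0.5 * name_similarity(s, t)
                           + 0.2 * type_similarity(s, t)
                           + 0.3 * neighbour_similarity(s, t), t) for t in target_elems]
                score, best = max(scored, key=lambda pair: pair[0])
                if score >= threshold:
                    links.append((s["name"], best["name"], round(score, 2)))
            return links

        src = [{"name": "Customer", "type": "Class", "neighbours": ["Order"]}]
        tgt = [{"name": "CustomerTable", "type": "Class", "neighbours": ["Order"]}]
        print(trace_links(src, tgt))   # -> one link between Customer and CustomerTable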

    ELISA, a demonstrator environment for information systems architecture design

    Get PDF
    This paper describes an approach to reusing software engineering technology in the area of ground space system design. System engineers have many needs similar to those of software developers: sharing of a common database, capitalization of knowledge, definition of a common design process, and communication between different technical domains. Moreover, system designers need to simulate their system dynamically as early as possible. Software development environments, methods and tools have now become operational and widely used. Their architecture is based on a unique object base and a set of common management services, and they host a family of tools for each life-cycle activity. In late '92, CNES decided to develop a demonstrative software environment supporting some system activities. The design of ground space data processing systems was chosen as the application domain. ELISA (Integrated Software Environment for Architectures Specification) was specified as a 'demonstrator', i.e. a sufficient basis for demonstrations, evaluation and future operational enhancements. A process with three phases was implemented: system requirements definition, design of system architecture models, and selection of physical architectures. Each phase is composed of several activities that can be performed in parallel, supported by commercial off-the-shelf tools. ELISA was delivered to CNES in January 94 and is currently used for demonstrations and evaluations on real projects (e.g. the SPOT4 Satellite Control Center). New evolutions are under way

    Knowledge-based Engineering in Product Development Processes - Process, IT and Knowledge Management perspectives

    Get PDF
    Product development as a field of practice and research has changed significantly due to the general trends of globalization changing the enterprise landscapes in which products are realized. The access to partners and suppliers with high technological specialization has also led to an increased specialization of original equipment manufacturers (OEMs). Furthermore, products are becoming increasingly complex, with a high functional and technological content and many variants. Combined with shorter lifecycles, which require reuse of technologies and solutions, this has resulted in an overall increased knowledge intensity which necessitates a more explicit approach towards knowledge and knowledge management in product development. In parallel, methods and IT tools for managing knowledge have been developed and are more accessible and usable today. One such approach is knowledge-based engineering (KBE), a term that was coined in the mid-1980s as a label for applications which automate the design of rule-driven geometries. In this thesis the term KBE embraces the capture and application of engineering knowledge to automate engineering tasks, regardless of the domain of application, and the thesis aims at contributing to a wider utilization of KBE in product development (PD). The thesis focuses on two perspectives on KBE: as a process improvement IT method and as a knowledge management (KM) method. In the first perspective, the lack of explicit regard for the constraints of the product lifecycle management (PLM) architecture, which governs the interaction of processes and IT in PD, has been identified to negatively affect the utilization of KBE in PD processes. In the second perspective, KM theories and models can complement existing methods for identifying potential for KBE applications. Regarding the first perspective, it is concluded that explicit regard for the PLM architecture decreases the need to develop and maintain software code related to hard-coded redundant data and functions in the KBE application. The concept of service-oriented architecture (SOA) has been found to enable this explicit regard for the PLM architecture. Regarding the second perspective, it is concluded that potential for KBE applications is indicated by: 1) application of certain types of knowledge in PD processes, 2) high maturity and formalization of the applied knowledge, 3) a codification strategy for KM, and 4) agreement and transparency regarding how the knowledge is applied, captured and transferred. It is also concluded that the formulation of explicit KM strategies in PD should be guided by knowledge application and its relation to strategic objectives, focusing on types of knowledge, their role in the PD process and the methods and tools for their application. These, in turn, affect the methods and tools deployed for knowledge capture in order for it to integrate with the processes of knowledge origin. Finally, roles and processes for knowledge transfer have to be transparent to ensure the motivation of individuals to engage in the KM strategy

    On Acceleration of Evolutionary Algorithms Taking Advantage of A Posteriori Error Analysis

    Get PDF
    A variety of important engineering and scientific tasks may be formulated as non-linear, constrained optimization problems. Their solution often demands high computational power, which may be obtained by means of appropriate hardware, software or algorithmic improvements. The Evolutionary Algorithm (EA) approach to the solution of such problems is considered here. EAs are rather slow methods; however, their main advantage is observed in the case of non-convex problems. Particularly high efficiency is demanded when solving large optimization problems. Examples of such problems in engineering include the analysis of residual stresses in railroad rails and vehicle wheels, as well as the Physically Based Approximation (PBA) approach to smoothing experimental and/or numerical data. With such analyses in mind, we focus our current research on a significant increase of EA efficiency. Acceleration of the EA is understood here, first of all, as decreasing the total computational time required to solve an optimization problem. Such acceleration may be obtained in various ways. There are at least two gains from EA acceleration, namely (i) saving computational time, and (ii) opening the possibility of solving larger optimization problems than would be possible with the standard EA. In our recent research we have preliminarily proposed several new speed-up techniques based on simple concepts. In this paper we mainly develop acceleration techniques based on simultaneous solution averaging, supported by a non-standard application of parallel calculations, and on a posteriori solution error analysis. The knowledge about the solution error is used for EA acceleration by means of appropriately modified standard evolutionary operators such as selection, crossover, and mutation. The efficiency of the proposed techniques is evaluated using several benchmark tests, which indicate a significant speed-up of the optimization process. Further concepts and improvements are currently being developed and tested
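
    A toy Python sketch of the two acceleration ideas named above, with invented details (population sizes, the error estimate and the way it scales mutation are placeholders, not the authors' algorithm):

        # Real-coded EA: the mutation step size is driven by an a posteriori error
        # estimate, and the best individuals of several (conceptually parallel)
        # runs are averaged into one final solution.
        import random

        def evolve(fitness, error_estimate, dim=4, pop=20, gens=50, runs=4):
            def run_once():
                population = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
                for _ in range(gens):
                    population.sort(key=fitness)
                    parents = population[: pop // 2]                   # selection
                    step = error_estimate(parents[0])                  # error-driven mutation size
                    children = [[(x + y) / 2 + random.gauss(0, step)   # crossover + mutation
                                 for x, y in zip(random.choice(parents), random.choice(parents))]
                                for _ in range(pop - len(parents))]
                    population = parents + children
                return min(population, key=fitness)

            bests = [run_once() for _ in range(runs)]                  # "parallel" runs
            return [sum(xs) / runs for xs in zip(*bests)]              # solution averaging

        sphere = lambda x: sum(v * v for v in x)
        print(evolve(sphere, error_estimate=lambda x: max(1e-3, sphere(x) ** 0.5)))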

    International conference on software engineering and knowledge engineering: Session chair

    Get PDF
    The Thirtieth International Conference on Software Engineering and Knowledge Engineering (SEKE 2018) will be held at the Hotel Pullman, San Francisco Bay, USA, from July 1 to July 3, 2018. SEKE 2018 will also be dedicated to the memory of Professor Lotfi Zadeh, a great scholar, pioneer and leader in fuzzy set theory and soft computing. The conference aims at bringing together experts in software engineering and knowledge engineering to discuss relevant results in either software engineering or knowledge engineering or both. Special emphasis will be put on the transfer of methods between both domains. The theme this year is soft computing in software engineering & knowledge engineering. Submissions of papers and demos are both welcome