
    The application of knowledge based systems to the abstraction of design and costing rules in bespoke pipe jointing systems

    This thesis presents the work undertaken in the creation of a knowledge-based system aimed at facilitating the design and cost estimation of bespoke pipe jointing systems. An overview of the problem domain is provided, and the findings from a literature review on knowledge-based systems and their applications in manufacturing were used to provide initial guidance to the research. The overall investigation and development process involved the abstraction of design and costing rules from domain experts using a subset of the techniques reviewed, and the development and implementation of the knowledge-based system using an expert system approach, the soft systems methodology (SSM) and the system development lifecycle methodology. Based on the abstracted design and costing rules, the developed system automates the design of pipe jointing systems and facilitates the cost estimation process within third-party configuration software. The developed system was validated using two case studies and was shown to provide the required outputs.
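    To illustrate how abstracted design and costing rules of this kind are commonly encoded, the sketch below shows a minimal rule-based estimator in Python. The joint parameters, rule conditions and cost figures are hypothetical placeholders, not values taken from the thesis.

        from dataclasses import dataclass

        @dataclass
        class JointSpec:
            """Hypothetical parameters describing a bespoke pipe joint."""
            diameter_mm: float
            pressure_bar: float
            material: str

        def select_design(spec: JointSpec) -> str:
            """Apply abstracted design rules to choose a joint type (illustrative only)."""
            if spec.pressure_bar > 40:
                return "welded_flange"
            if spec.material == "PVC":
                return "solvent_socket"
            return "compression_coupling"

        def estimate_cost(spec: JointSpec, joint_type: str) -> float:
            """Apply abstracted costing rules: a base cost scaled by size and material factors."""
            base = {"welded_flange": 120.0, "solvent_socket": 15.0,
                    "compression_coupling": 45.0}[joint_type]
            size_factor = 1.0 + spec.diameter_mm / 200.0      # larger joints cost more
            material_factor = 1.3 if spec.material == "stainless" else 1.0
            return round(base * size_factor * material_factor, 2)

        spec = JointSpec(diameter_mm=100, pressure_bar=60, material="stainless")
        design = select_design(spec)
        print(design, estimate_cost(spec, design))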

    Semantic-guided predictive modeling and relational learning within industrial knowledge graphs

    The ubiquitous availability of data in today’s manufacturing environments, mainly driven by the extended usage of software and built-in sensing capabilities in automation systems, enables companies to embrace more advanced predictive modeling and analysis in order to optimize processes and usage of equipment. While the potential insight gained from such analysis is high, it often remains untapped, since integration and analysis of data silos from different production domains require high manual effort and are therefore not economic. Addressing these challenges, digital representations of production equipment, so-called digital twins, have emerged, leading the way to semantic interoperability across systems in different domains. From a data modeling point of view, digital twins can be seen as industrial knowledge graphs, which are used as the semantic backbone of manufacturing software systems and data analytics. Due to the prevalent, historically grown and scattered manufacturing software system landscape comprising numerous proprietary information models, data sources are highly heterogeneous. Therefore, there is an increasing need for semi-automatic support in data modeling, enabling end-user engineers to model their domain and maintain a unified semantic knowledge graph across the company. Once the data modeling and integration are done, further challenges arise, since there has been little research on how knowledge graphs can contribute to the simplification and abstraction of statistical analysis and predictive modeling, especially in manufacturing. In this thesis, new approaches for modeling and maintaining industrial knowledge graphs with a focus on the application of statistical models are presented. First, concerning data modeling, we discuss requirements from several existing standard information models and analytic use cases in the manufacturing and automation system domains and derive a fragment of the OWL 2 language that is expressive enough to cover the required semantics for a broad range of use cases. The prototypical implementation enables domain end-users, i.e. engineers, to extend the base ontology model with intuitive semantics. Furthermore, it supports efficient reasoning and constraint checking via translation to rule-based representations. Based on these models, we propose an architecture for the end-user facilitated application of statistical models using ontological concepts and ontology-based data access paradigms. In addition, we present an approach for domain knowledge-driven preparation of predictive models in terms of feature selection and show how schema-level reasoning in the OWL 2 language can be employed for this task within knowledge graphs of industrial automation systems. A production cycle time prediction model in an example application scenario serves as a proof of concept and demonstrates that axiomatized domain knowledge about features can give competitive performance compared to purely data-driven approaches. In the case of high-dimensional data with small sample size, we show that graph kernels of domain ontologies can provide additional information on the degree of variable dependence. Furthermore, a special application of feature selection in graph-structured data is presented, and we develop a method that incorporates domain constraints derived from meta-paths in knowledge graphs into a branch-and-bound pattern enumeration algorithm.
Lastly, we discuss the maintenance of facts in large-scale industrial knowledge graphs, focused on latent variable models for the automated population and completion of missing facts. State-of-the-art approaches cannot deal with time-series data in the form of events that naturally occur in industrial applications. Therefore, we present an extension of learning knowledge graph embeddings in conjunction with data in the form of event logs. Finally, we design several use case scenarios of missing information and evaluate our embedding approach on data coming from a real-world factory environment. We draw the conclusion that industrial knowledge graphs are a powerful tool that can be used by end-users in the manufacturing domain for data modeling and model validation. They are especially suitable for the facilitated application of statistical models in conjunction with background domain knowledge, by providing information about features upfront. Furthermore, relational learning approaches showed great potential to semi-automatically infer missing facts and provide recommendations to production operators on how to keep stored facts in sync with the real world.
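    As a rough illustration of the relational learning mentioned above, the sketch below implements a TransE-style knowledge graph embedding scorer in Python with NumPy, with one pass of margin-based updates against corrupted triples. The triples, dimensionality and settings are invented for illustration and are not the models or data used in the thesis.

        import numpy as np

        # Toy industrial triples (head, relation, tail) -- hypothetical examples
        triples = [("pump_01", "partOf", "line_A"),
                   ("line_A", "produces", "gearbox"),
                   ("pump_01", "hasSensor", "vib_sensor_3")]

        entities = sorted({t[0] for t in triples} | {t[2] for t in triples})
        relations = sorted({t[1] for t in triples})
        e_idx = {e: i for i, e in enumerate(entities)}
        r_idx = {r: i for i, r in enumerate(relations)}

        dim = 8
        rng = np.random.default_rng(0)
        E = rng.normal(scale=0.1, size=(len(entities), dim))   # entity embeddings
        R = rng.normal(scale=0.1, size=(len(relations), dim))  # relation embeddings

        def score(h, r, t):
            """TransE score: a smaller ||h + r - t|| means a more plausible triple."""
            return np.linalg.norm(E[e_idx[h]] + R[r_idx[r]] - E[e_idx[t]])

        # One pass of margin-based updates using a corrupted (negative) tail per triple
        lr, margin = 0.01, 1.0
        for h, r, t in triples:
            t_neg = rng.choice([e for e in entities if e != t])
            if score(h, r, t) + margin > score(h, r, t_neg):   # margin violated: nudge embeddings
                grad = E[e_idx[h]] + R[r_idx[r]] - E[e_idx[t]]
                E[e_idx[h]] -= lr * grad
                R[r_idx[r]] -= lr * grad
                E[e_idx[t]] += lr * grad

        print(score("pump_01", "partOf", "line_A"))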

    Development of a semantic knowledge modelling approach for evaluating offsite manufacturing production processes

    The housing sector in the UK and across the globe is constantly under pressure to deliver enough affordable houses to meet the increasing demand. Offsite Manufacturing (OSM), a modern method of construction, is considered a key means of meeting these demands given its potential to increase efficiency and boost productivity. Although the use of OSM to increase the supply of affordable and efficient homes is gaining popularity, the focus has been on ‘what’ method of construction is used (i.e. whether to implement OSM or a traditional approach) rather than on ‘how’ the alternative construction approach should be carried out (i.e. the choice of OSM method to meet set objectives). The approaches used by professionals implementing OSM have been criticised as unstructured and as differing little from conventional onsite methods, offering little process gain. Previous studies have compared the performance of OSM and other modern methods of construction with conventional methods of construction. However, there has been hardly any attempt, and little quantitative evidence, to compare the performance of competing OSM approaches (i.e. methods with standardised and non-standardised processes) in order to support stakeholders in making an informed decision on the choice of method. To address this research gap, this research aims to develop a proof-of-concept knowledge-based process analysis tool that enables OSM practitioners to efficiently evaluate the performance of their chosen OSM methods, supporting informed decision-making and continuous improvement. To achieve this aim, an ontology knowledge modelling approach was adopted for leveraging data and information sources with semantics, and an offsite production workflow (OPW) ontology was developed to enable a detailed analysis of OSM production methods. The research first undertook an extensive critical review of the OSM domain to identify existing OSM knowledge and how this knowledge can be formalised to aid communication in the OSM domain. In addition, a separate review of process analysis methods and knowledge-based modelling methods was conducted concurrently to identify suitable approaches for analysing and systematising OSM knowledge, respectively. The lean manufacturing value system analysis (VSA) approach was used for the analysis in this study, with two units of analysis consisting of examples of a typical non-standardised (i.e. static) and a standardised (i.e. semi-automated) OSM production method. The knowledge systematisation was done using an ontology knowledge modelling approach to develop the process analysis tool, the OPW ontology. The OPW ontology was further evaluated by mapping a case of lightweight steel frame modular house production to model a real-life context. A two-stage validation approach was then implemented to test the ontology, consisting first of an internal validation of the logic and consistency of the results and then an expert validation process using an industry-approved set of criteria. The results of the study revealed that the non-standardised, ad-hoc OSM production method, involving a significant amount of manual tasks, offers little process improvement over the conventional onsite method when measured by process time and cost. In comparison with a structured method, e.g. the semi-automated OSM production method, the process cost and time were found to be 82% and 77% higher, respectively, in the static method, based on a like-for-like production schedule. The study also evaluates the root causes of process waste, accounting for the non-value-added time and cost consumed. The results contribute to supporting informed decision-making on the choice of OSM production methods for continuous improvement. The main contributions to knowledge and practice are as follows: i. The output of this research contributes to the body of literature on offsite concepts, definitions and classification, through the generic classification framework developed for the OSM domain. This provides a means of supporting clear communication and knowledge sharing in the domain and supports knowledge systematisation. ii. The approach used in this research, integrating the value system analysis (VSA) and activity-based costing (ABC) methods for process analysis, is a novel approach that bridges the gap by using the ABC method to generate detailed process-related data to support cost/time-based analysis of OSM processes. iii. The developed generic process map, which represents the OSM production process and captures activity sequences, resources and information flow within the process, will help to disseminate knowledge on OSM and improve best practice in the industry. iv. The developed process analysis tool (the OPW ontology) has been tested with a real-life OSM project and validated by domain experts as a competent tool. The knowledge structure and rules integrated into the OPW ontology have been published on the web for knowledge sharing and re-use. This tool can be adapted by OSM practitioners to develop a company-specific tool that captures their specific business processes, which can then support the evaluation of those processes to enable continuous improvement.
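    As a rough illustration of the activity-based costing (ABC) comparison described above, the sketch below totals process time and cost for two hypothetical OSM workflows in Python. The activities, durations and rates are invented placeholders, not figures from the study.

        from dataclasses import dataclass

        @dataclass
        class Activity:
            name: str
            duration_h: float        # process time consumed by the activity
            resource_rate: float     # cost per hour of the resources involved
            value_adding: bool       # False marks non-value-added time (process waste)

        def totals(activities):
            """Activity-based costing: roll activity durations and rates up to process level."""
            time = sum(a.duration_h for a in activities)
            cost = sum(a.duration_h * a.resource_rate for a in activities)
            waste = sum(a.duration_h for a in activities if not a.value_adding)
            return time, cost, waste

        static_method = [Activity("manual panel assembly", 12, 35, True),
                         Activity("material double-handling", 4, 35, False),
                         Activity("rework after inspection", 3, 40, False)]
        semi_automated = [Activity("automated panel assembly", 5, 55, True),
                          Activity("line changeover", 1, 55, False)]

        for label, acts in [("static", static_method), ("semi-automated", semi_automated)]:
            t, c, w = totals(acts)
            print(f"{label}: time={t}h, cost={c:.0f}, non-value-added time={w}h")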

    Reusable Knowledge-based Components for Building Software Applications: A Knowledge Modelling Approach

    In computer science, different types of reusable components for building software applications have been proposed as a direct consequence of the emergence of new software programming paradigms. The success of these components for building applications depends on factors such as the flexibility of their combination or the ease of their selection in centralised or distributed environments such as the Internet. In this article, we propose a general type of reusable component, called a primitive of representation, inspired by a knowledge-based approach that can promote reusability. The proposal can be understood as a generalisation of existing partial solutions that is applicable to both software and knowledge engineering for the development of hybrid applications that integrate conventional and knowledge-based techniques. The article presents the structure and use of the component and describes our recent experience in the development of real-world applications based on this approach.
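    The article's component interface is not reproduced here, so the sketch below is only a guess, in Python, at what a primitive of representation might look like: a reusable unit that packages a declarative knowledge base together with the inference method that interprets it. All names and structure are assumptions made for illustration.

        from typing import Any, Callable, Dict

        class PrimitiveOfRepresentation:
            """Hypothetical reusable component: declarative knowledge plus an inference method."""
            def __init__(self, name: str, knowledge: Dict[str, Any],
                         inference: Callable[[Dict[str, Any], Dict[str, Any]], Dict[str, Any]]):
                self.name = name
                self.knowledge = knowledge      # e.g. rules, tables, constraints
                self.inference = inference      # strategy that interprets the knowledge

            def run(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
                return self.inference(self.knowledge, inputs)

        # Example: a diagnostic primitive that can be reused by swapping its knowledge base
        def forward_chain(knowledge, facts):
            derived = dict(facts)
            for condition, conclusion in knowledge["rules"]:
                if all(derived.get(k) == v for k, v in condition.items()):
                    derived.update(conclusion)
            return derived

        diagnosis = PrimitiveOfRepresentation(
            "pump-diagnosis",
            {"rules": [({"vibration": "high"}, {"fault": "bearing wear"})]},
            forward_chain)
        print(diagnosis.run({"vibration": "high"}))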

    Development of a manufacturing feature-based design system

    Traditional CAD systems are based on the serial approach to the product development cycle: the design process is not integrated with other activities and thus cannot provide information for subsequent phases of product development. In order to eliminate this problem, many modern CAD systems allow the composition of designs from building blocks at a higher level of abstraction called features. Although features used in current systems tend to be named after manufacturing processes, they do not, in reality, provide valuable manufacturing data. Apart from the obvious disadvantage that process engineers need to re-evaluate the design and capture the intent of the designer, this approach also prohibits early detection of possible manufacturing problems. This research attempts to bring the design and manufacturing phases together by implementing manufacturing features. A design is composed entirely in a bottom-up manner using manufacturable entities in the same way as they would be produced during the manufacturing phase. Each feature consists of parameterised geometry, manufacturing information (including machine tool, cutting tools, cutting conditions, fixtures, and relative cost information), design limitations, functionality rules, and design-for-manufacture rules. The designer selects features from a hierarchical feature library. Upon insertion of a feature, the system ensures that no functionality or manufacturing rules are violated. If a feature is modified, the system validates the feature by making sure that it remains consistent with its original functionality, and design-for-manufacture rules are re-applied. The system also allows analysis of designs, from a manufacturing point of view, that were not composed using features. In order to reduce the complexity of the system, design functionality and design-for-manufacture rules are organised into a hierarchical system and are linked to the appropriate entries of the feature hierarchy. The system makes it possible to avoid costly designs by eliminating possible manufacturing problems early in the product development cycle. It also makes computer-aided process planning feasible. The system is developed as an extension of a commercially available CAD/CAM system (Pro/Engineer) and at its current stage only deals with machining features. However, using the same principles, it can be expanded to cover other kinds of manufacturing processes.
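    To make the feature concept more concrete, the sketch below shows a minimal manufacturing feature with rule checking in Python. The attributes, rule and values are illustrative assumptions; the actual system is implemented as an extension of Pro/Engineer rather than in Python.

        from dataclasses import dataclass, field
        from typing import Callable, List

        @dataclass
        class Feature:
            """A manufacturable design entity carrying geometry, manufacturing data and rules."""
            name: str
            parameters: dict                      # parameterised geometry, e.g. {"depth": 50}
            manufacturing: dict                   # machine tool, cutting data, relative cost, ...
            rules: List[Callable[["Feature"], bool]] = field(default_factory=list)

            def validate(self) -> bool:
                """Re-check functionality and design-for-manufacture rules after insert or modify."""
                return all(rule(self) for rule in self.rules)

        # Hypothetical design-for-manufacture rule: a drilled hole deeper than five times
        # its diameter is flagged as a manufacturing problem early in the design cycle.
        def hole_depth_rule(f: Feature) -> bool:
            return f.parameters["depth"] <= 5 * f.parameters["diameter"]

        hole = Feature("drilled_hole",
                       {"diameter": 8, "depth": 50},
                       {"machine": "CNC mill", "tool": "8 mm drill", "relative_cost": 1.2},
                       [hole_depth_rule])
        print(hole.validate())   # False: the feature violates the rule and is rejected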

    Computational Augmentation of Model Based System Engineering: Supporting Mechatronic System Model Development with AI Technologies

    Efforts to apply computational support for automatic design synthesis and configuration generation, as well as efforts to support descriptive and computational model development for system design and verification, have been approached through semantic formalisation of modelling languages and of generic structural and functional concepts using meta-models. Modelling the system using descriptive models helps the designer to explicitly document dependencies between properties and parameters of the system and of external entities. The descriptive models thus produced often do not consider physics-based justification for the presence and/or absence of relations. It is often the case that simulation results obtained at later stages require changing the requirements as well as modifying the logical architecture (modelling relations between the parameters/properties of high-level functions and those of high-level entities) and the physical architecture (modelling relations between component parameters and properties) to accommodate those requirements. Current MBSE (Model Based System Engineering) tools can verify the construction of models against predefined model formats, i.e. meta-models. However, these tools, and current research into augmenting their capabilities, lack a focus on evaluating the content of the models, i.e. whether the modelled system can be physically realised. This work attempts to exploit the potential of available AI (Artificial Intelligence) technologies to assist the modelling activities performed during the requirement definition and analysis, architecture design and verification phases of the system development process, by directing the designer to tools that can formalise the outputs of model development activities. The proposed problem formulation is based on the insight that a system modelled at both conceptual and detailed design levels can be represented by logical and mathematical relations between the properties and parameters of internal and external components or functions of the system and its domain. The formulation therefore defines the concepts used in requirement, logical architecture and physical architecture models through relations between the parameters and properties in those models. Concepts such as operational requirements (or non-functional requirements for a particular use case scenario) are defined through the use of sets, linking the value domains of those sets to the particular system application domain for which the system model is being developed. These relations enable the systematic elaboration of requirements into logical and physical architecture models, as well as the storage and retrieval of existing model knowledge using existing AI tools. A novel framework has been developed to retrieve existing descriptive structure and function models using logical reasoning, and to retrieve existing simulation models stored in the embedding space of an auto-encoder neural network. Besides adopting the concepts of semantic formalisation and meta-model-based descriptive knowledge retrieval, it makes novel use of the unsupervised representation learning capability of neural network auto-encoders to store known physically and technologically feasible designs in a low-dimensional representation that clusters similar designs, thereby inducing a similarity or distance metric that can be used to retrieve a known design whose behaviour is similar to a newly required behaviour.
The framework also enables the application of generic and domain-specific logical constraints (as other works have done before) and introduces the new concept of a system application domain to ensure that, at every stage of model development leading to the conceptual physical design, the architecture stays inside the physical constraints of the system usage domain. Instantiated meta-model elements that are classified into a system application domain (SAD) are implicitly constrained by system usage context constraints (e.g. parameter value restrictions); similarly, known simulation models can also be categorised into different SADs. The proposed framework extends the conventional approach to automated design synthesis, which is based only on decomposing a high-level function (summarising an input-to-output mapping) into basic functions and selecting components to realise those basic functions. "A system is designed with the aim that it can execute its function(s) according to the performance requirements of those function(s) in the required operational conditions." Concentrating on this statement, it can be seen that the conventional approach of functional decomposition and function allocation to known structural components cannot guarantee a working system in the required scenarios, because it ignores the dependencies between the environment or operating conditions and the operating modes of prospective designs satisfying the high-level function. The results obtained from the implementation of domain-specific knowledge representation and retrieval (involving a mixture of numerical and logical constraints), as well as from the implementation of a neural network auto-encoder for the representation and retrieval of domain-specific simulation models, demonstrate the viability of these technologies to support the proposed framework.
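    The embedding-based retrieval described above can be illustrated with a minimal sketch: a linear auto-encoder trained by gradient descent in NumPy, followed by nearest-neighbour lookup in the latent space. The design vectors, dimensions and training settings are invented for illustration and do not reflect the actual network or data used in the work.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical library of known feasible designs, each described by a
        # fixed-length vector of simulation parameters / behaviour samples.
        designs = rng.normal(size=(50, 20))

        # A linear auto-encoder trained with plain gradient descent on reconstruction error.
        latent_dim, lr = 3, 0.01
        W_enc = rng.normal(scale=0.1, size=(20, latent_dim))
        W_dec = rng.normal(scale=0.1, size=(latent_dim, 20))

        for _ in range(500):
            z = designs @ W_enc                   # encode to the low-dimensional representation
            err = z @ W_dec - designs             # reconstruction error
            grad_dec = z.T @ err / len(designs)
            grad_enc = designs.T @ (err @ W_dec.T) / len(designs)
            W_dec -= lr * grad_dec
            W_enc -= lr * grad_enc

        def retrieve(query, k=3):
            """Return indices of the k stored designs closest to the query in latent space."""
            dists = np.linalg.norm(designs @ W_enc - query @ W_enc, axis=1)
            return np.argsort(dists)[:k]

        new_requirement = rng.normal(size=20)     # behaviour vector of a newly required design
        print(retrieve(new_requirement))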

    A Posture Sequence Learning System for an Anthropomorphic Robotic Hand

    The paper presents a cognitive architecture for posture learning by an anthropomorphic robotic hand. Our approach aims to allow the robotic system to perform complex perceptual operations, to interact with a human user and to integrate its perceptions through a cognitive representation of the scene and the observed actions. The anthropomorphic robotic hand imitates the gestures acquired by the vision system in order to learn meaningful movements, to build its knowledge through different conceptual spaces and to perform complex interactions with the human operator.
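    The abstract does not detail the representation, but conceptual-space approaches are often realised as nearest-prototype classification in a geometric space. The toy Python sketch below, with invented joint-angle prototypes, shows how an observed posture sequence could be mapped onto symbolic postures for imitation; it is purely illustrative.

        import numpy as np

        # Hypothetical prototypes of hand postures in a conceptual space whose
        # dimensions are normalised finger flexion values.
        prototypes = {
            "open_hand": np.array([0.9, 0.9, 0.9, 0.9, 0.9]),
            "fist":      np.array([0.1, 0.1, 0.1, 0.1, 0.1]),
            "point":     np.array([0.1, 0.9, 0.1, 0.1, 0.1]),
        }

        def classify(posture):
            """Nearest-prototype classification: the closest region of the conceptual space wins."""
            return min(prototypes, key=lambda name: np.linalg.norm(posture - prototypes[name]))

        # A gesture acquired by the vision system, as a sequence of posture vectors.
        observed = [np.array([0.80, 0.85, 0.90, 0.88, 0.90]),
                    np.array([0.15, 0.20, 0.10, 0.10, 0.12])]

        # The symbolic posture sequence the robotic hand would learn to imitate.
        print([classify(p) for p in observed])   # ['open_hand', 'fist']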