
    Quality costs and Industry 4.0: inspection strategy modelling and reviewing

    Inspection strategy (IS) is a key component impacting quality costs. Although often considered an inflexible output of initial quality plans, it may require revisions given the dynamic quality situation of the manufacturing system. Against this background, the present study models and compares different ISs based on the cost of quality (CoQ) approach for a case study in the automotive manufacturing industry. While many computational inspection strategy models (ISMs) are available in the literature, most face application challenges and struggle to incorporate real-world data. The present study addresses this gap by developing a model that not only represents a real testing station in a manufacturing line but also uses historical production data. Additionally, in relation to model inputs, this study explores the challenges and opportunities of acquiring reliable quality cost estimates in the Industry 4.0 context. Among the main contributions of this work, the developed CoQ-based ISM can be used as a decision-making aid for inspection revision and improvement, while conclusions about quality cost data collection in the context of industrial digitalization can help advance the CoQ approach in practice.
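The CoQ comparison of inspection strategies can be illustrated with a minimal sketch. All cost figures, defect rates, and the single-station setting below are invented placeholders, not data or formulas from the study:

```python
# Hedged sketch: comparing inspection strategies by total cost of quality (CoQ).
# Every number here is hypothetical; the study's model uses real historical
# production data from an automotive testing station.

def coq(appraisal, internal_failure, external_failure, prevention=0.0):
    """Total cost of quality = prevention + appraisal + failure costs."""
    return prevention + appraisal + internal_failure + external_failure

def evaluate_strategy(units, defect_rate, inspect_cost, detect_prob,
                      rework_cost, field_failure_cost):
    """Expected per-batch CoQ for a single-station inspection strategy."""
    appraisal = units * inspect_cost
    defects = units * defect_rate
    caught = defects * detect_prob          # found in-house -> rework
    escaped = defects - caught              # reach the customer
    return coq(appraisal=appraisal,
               internal_failure=caught * rework_cost,
               external_failure=escaped * field_failure_cost)

# 100% inspection vs. no inspection on a 1,000-unit batch (illustrative)
full = evaluate_strategy(1000, 0.02, 0.50, 0.95, 10.0, 200.0)
no_inspect = evaluate_strategy(1000, 0.02, 0.0, 0.0, 10.0, 200.0)
print(f"100% inspection: {full:.2f}, no inspection: {no_inspect:.2f}")
```

Comparing such expected totals across candidate strategies is the kind of decision support the abstract describes, here reduced to a toy expected-value calculation.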

    The interaction of lean and building information modeling in construction

    Lean construction and Building Information Modeling are quite different initiatives, but both are having profound impacts on the construction industry. A rigorous analysis of the myriad specific interactions between them indicates that a synergy exists which, if properly understood in theoretical terms, can be exploited to improve construction processes beyond the degree to which they might be improved by applying either paradigm independently. Using a matrix that juxtaposes BIM functionalities with prescriptive lean construction principles, fifty-six interactions have been identified, all but four of which represent constructive interaction. Although evidence for the majority of these has been found, the matrix is not considered complete, but rather a framework for research to explore the degree of validity of the interactions. Construction executives, managers, designers, and developers of IT systems for construction can also benefit from the framework as an aid to recognizing the potential synergies when planning their lean and BIM adoption strategies.
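The interaction matrix idea can be sketched minimally. The functionalities, principles, and cell entries below are invented placeholders, not the fifty-six interactions the paper identifies:

```python
# Hedged sketch of the BIM x lean interaction matrix: rows are BIM
# functionalities, columns are lean principles, and a cell marks an
# interaction as constructive (+1) or negative (-1); absent pairs have
# no identified interaction. All entries are illustrative only.

bim_functions = ["visualization", "clash_detection", "4d_scheduling"]
lean_principles = ["reduce_variability", "reduce_cycle_time",
                   "increase_transparency"]

matrix = {
    ("visualization", "increase_transparency"): +1,
    ("clash_detection", "reduce_variability"): +1,
    ("4d_scheduling", "reduce_cycle_time"): +1,
    ("4d_scheduling", "reduce_variability"): -1,  # hypothetical negative case
}

constructive = sum(1 for v in matrix.values() if v > 0)
negative = sum(1 for v in matrix.values() if v < 0)
print(f"{constructive} constructive, {negative} negative interactions")
```

Tallying constructive versus negative cells is how the paper's "all but four of fifty-six" summary would fall out of such a matrix.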

    Self-resilient production systems : framework for design synthesis of multi-station assembly systems

    Product design changes are inevitable in the current trend of time-based competition, where product models such as automotive bodies and aircraft fuselages are frequently upgraded, causing assembly process design changes. In recent years, several studies in engineering change management and reconfigurable systems have been conducted to address the challenges of frequent product and process design changes. However, the results of these studies are limited in their applications due to shortcomings in three aspects: (i) they rely heavily on past records, which might comprise only a few relevant cases and be insufficient for a reliable analysis; (ii) they focus mainly on managing design changes in the product architecture instead of both the product and process architecture; and (iii) they consider design changes at a station level instead of a multi-station level. To address these challenges, this thesis proposes three interrelated research areas to simulate the design adjustments of the existing process architecture: (i) methodologies to model the existing process architecture design so that the developed models can be used as assembly response functions for assessing Key Performance Indices (KPIs); (ii) KPIs to assess the quality, cost, and design complexity of the existing process architecture design, used when deciding whether to change it; and (iii) a methodology to change the process architecture design to new optimal design solutions at a multi-station level. In the first research area, the methodology for modeling the functional dependence of process variables within the process architecture design is presented, as well as the relations between process variables and the product architecture design.
To understand the engineering change propagation chain among process variables within the process architecture design, a functional dependence model is introduced to represent the design dependency among process variables by cascading relationships from customer requirements, product architecture, process architecture, and design tasks to optimise the process variable design. This model is used to estimate the level of process variable design change propagation in the existing process architecture design. Next, process yield, cost, and complexity indices are introduced and used as KPIs to measure product quality, the cost of changing the current process design, and the dependency of process variables (i.e., change propagation), respectively. The process yield and complexity indices are obtained using the Stream-of-Variation (SOVA) model and the functional dependence model, respectively. The costing KPI is obtained by determining the cost of optimizing the tolerances of process variables. The implication of the costing KPI for the overall cost of changing the process architecture design is also discussed. These three comprehensive indices support decision-making when redesigning the existing process architecture. Finally, a framework driven by functional optimisation is proposed to adjust the existing process architecture to meet the engineering change requirements. The framework provides a platform to integrate and analyze the individual design synthesis tasks necessary to optimise multi-stage assembly processes, such as the tolerances of process variables, fixture layouts, or part-to-part joints. The framework is based on hypergraph transversal and a task connectivity matrix, which lead to the optimal sequence of these design tasks. To enhance visibility into the dependencies and hierarchy of design tasks, a Design Structure Matrix and a Task Flow Chain are also adopted.
Three scenarios of engineering changes in industrial automotive design illustrate the application of the proposed redesign methodology. The thesis concludes that it is not necessary to optimise all functional designs of process variables to accommodate the engineering changes. Selecting only the relevant functional designs is sufficient, but the design optimisation of the process variables has to be conducted at the system level, with consideration of the dependencies between the selected functional designs.
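The change-propagation estimate at the heart of the functional dependence model can be sketched as a reachability computation over a directed graph of process variables. The variable names and dependency edges below are invented for illustration; the thesis's model cascades from customer requirements through product and process architecture:

```python
# Hedged sketch: estimating engineering-change propagation over a functional
# dependence model, reduced here to a directed graph of process variables.
# Edge A -> B means "a design change to A propagates to B".

from collections import deque

depends_on = {
    "fixture_layout": ["part_position", "joint_location"],
    "part_position": ["assembly_gap"],
    "joint_location": ["assembly_gap"],
    "assembly_gap": [],
    "clamp_force": ["part_position"],
}

def propagation_set(changed, graph):
    """All process variables affected by changing `changed` (BFS closure)."""
    seen, queue = set(), deque([changed])
    while queue:
        v = queue.popleft()
        for nxt in graph.get(v, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

affected = propagation_set("fixture_layout", depends_on)
print(sorted(affected))
```

The size of such a propagation set is one plausible reading of the complexity index: the more variables a change reaches, the costlier the redesign.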

    Integrating the Cost of Quality into Multi-Products Multi-Components Supply Chain Network Design

    More than ever before, the success of a company heavily depends on its supply chain and on how efficient the network is. A supply chain needs to be configured so as to minimize cost while maintaining a quality level good enough to satisfy the end user; to achieve efficiency, the design of the network and of the whole chain is important. Including the cost of quality in the network design process can be rewarding and revealing. In this research, the concept of cost of quality as a performance measure was integrated into the network design process for a supply chain concerned with multiple products and multiple components. This research discusses how this supply chain can be mathematically modeled, presents solutions for the resulting model, and finally studies the effect of including quality as a parameter on the outcome of the design process. A nonlinear mixed-integer mathematical model was developed for the problem, and two solution methods, based on a genetic algorithm and on tabu search, were developed and compared. The results and analysis show that the genetic-algorithm-based solution outperforms the tabu-search-based solution, especially on large problem instances. In addition, the analysis showed that including the cost of quality in the model affects the design process and changes the resulting routes.
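A genetic algorithm for this kind of quality-aware selection problem can be sketched in miniature. The supplier options, costs, defect rates, penalty, and GA settings below are all invented; the paper's actual model is a much richer nonlinear mixed-integer formulation:

```python
# Hedged sketch of the GA idea: pick one supplier per component so that
# transport cost plus expected quality (defect) cost is minimized.

import random

random.seed(42)  # deterministic for the illustration

# per component: list of (transport_cost, defect_rate) per candidate supplier
options = [
    [(10.0, 0.05), (12.0, 0.01)],
    [(8.0, 0.10), (9.5, 0.02)],
]
PENALTY = 100.0  # hypothetical cost per expected defective unit

def cost(chromosome):
    total = 0.0
    for comp, choice in enumerate(chromosome):
        transport, defect = options[comp][choice]
        total += transport + PENALTY * defect
    return total

def ga(pop_size=20, generations=50):
    pop = [[random.randrange(len(o)) for o in options] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)                     # elitist selection
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(options))
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < 0.1:          # mutation
                i = random.randrange(len(options))
                child[i] = random.randrange(len(options[i]))
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = ga()
print(best, cost(best))
```

On this toy instance the GA picks the higher-transport, lower-defect supplier for both components, which is exactly the effect the paper reports: adding quality cost to the objective changes the resulting routes.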

    Bench-Ranking: a prescriptive analysis method for querying large knowledge graphs

    Leveraging relational Big Data (BD) processing frameworks to process large knowledge graphs offers great opportunities for optimizing query performance. Modern BD systems are, however, complicated data systems whose configurations notably affect performance. Benchmarking different frameworks and configurations provides the community with best practices for better performance. However, most of these benchmarking efforts can be classified as descriptive and diagnostic analytics, and there is no standard for comparing these benchmarks based on quantitative ranking techniques. Moreover, designing mature pipelines for processing big graphs entails additional design decisions that emerge with the non-native (relational) graph processing paradigm. These design decisions cannot be made automatically, e.g., the choice of the relational schema, partitioning technique, and storage formats. This thesis discusses how our work fills this timely research gap. First, we show the impact of the trade-offs among these design decisions on the replicability of BD systems' performance when querying large knowledge graphs, and we show the limitations of descriptive and diagnostic analyses of BD frameworks' performance for querying large graphs. We then investigate how to enable prescriptive analytics via ranking functions and multi-dimensional optimization techniques (called "Bench-Ranking"). This approach abstracts away the complexity of descriptive performance analysis, guiding the practitioner directly to actionable, informed decisions.
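One simple ranking function of the kind Bench-Ranking aggregates can be sketched as follows. The configurations, dimensions, and runtimes are fabricated placeholders; the thesis's actual ranking functions and dimensions may differ:

```python
# Hedged sketch of a Bench-Ranking-style score: rank each configuration
# within every evaluation dimension (lower runtime is better), then
# aggregate by mean rank across dimensions.

runtimes = {  # dimension -> {configuration: runtime in seconds}
    "schema":       {"A": 12.0, "B": 9.0,  "C": 15.0},
    "partitioning": {"A": 7.0,  "B": 11.0, "C": 8.0},
    "storage":      {"A": 5.0,  "B": 6.0,  "C": 7.0},
}

def mean_rank(data):
    """Mean per-dimension rank for each configuration (1 = best)."""
    configs = sorted(next(iter(data.values())))
    ranks = {c: [] for c in configs}
    for scores in data.values():
        ordered = sorted(scores, key=scores.get)   # best (lowest) first
        for pos, name in enumerate(ordered, start=1):
            ranks[name].append(pos)
    return {c: sum(r) / len(r) for c, r in ranks.items()}

scores = mean_rank(runtimes)
best = min(scores, key=scores.get)
print(scores, "->", best)
```

The point of such an aggregate is prescriptive: instead of three per-dimension plots the practitioner gets one actionable ordering of configurations.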

    Monitoring applications with process mining

    This research presents a novel method for leveraging Process Mining techniques to monitor and support Information Systems applications, combining a Systematic Literature Review and the Design Science Research Methodology to address the research objectives comprehensively. The Systematic Literature Review was conducted to explore the existing landscape of Process Mining for application monitoring and support. The review process followed rigorous inclusion and exclusion criteria, selecting a collection of pertinent studies that served as a foundational knowledge base. Through this review, key insights were reported regarding current methodologies, their applications, limitations, and the unexplored dimensions within the field. From these insights, the Design Science Research Methodology was employed to conceptualize and develop a new method that outlines a structured and systematic approach to applying Process Mining techniques specifically tailored to monitoring and supporting complex Information Systems applications. Emphasizing practical utility, the method encompasses detailed steps, components, and guidelines for effective implementation. The proposed method was subsequently evaluated and validated through a real-world use-case scenario, affirming its efficacy and potential impact in actual application environments. The evaluation focused on assessing the method's ability to derive actionable insights, identify process inefficiencies, and support decision-making within Information Systems applications.
The findings derived from this study contribute to the field of Process Mining by introducing a tailored methodology aimed at enhancing the monitoring and support capabilities of Information Systems applications. This research reinforces the practical relevance and potential transformative impact of integrating Process Mining into the domain of Information Systems management and lays the groundwork for future advancements in this field.
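The monitoring idea, deriving performance signals from an application's event log, can be sketched minimally. The event log, activities, and two-hour threshold below are fabricated; real process-mining tooling works on much richer logs and models:

```python
# Hedged sketch: a minimal process-mining-style monitor over an event log.
# Events are (case_id, activity, timestamp); slow cases are flagged for
# support or decision-making follow-up.

from datetime import datetime

log = [
    ("c1", "start",  datetime(2024, 1, 1, 9, 0)),
    ("c1", "review", datetime(2024, 1, 1, 9, 30)),
    ("c1", "end",    datetime(2024, 1, 1, 10, 0)),
    ("c2", "start",  datetime(2024, 1, 1, 9, 0)),
    ("c2", "end",    datetime(2024, 1, 1, 13, 0)),
]

def case_durations(events):
    """Duration in hours per case, from its first to its last event."""
    spans = {}
    for case, _, ts in events:
        first, last = spans.get(case, (ts, ts))
        spans[case] = (min(first, ts), max(last, ts))
    return {c: (last - first).total_seconds() / 3600
            for c, (first, last) in spans.items()}

durations = case_durations(log)
slow = [c for c, h in durations.items() if h > 2]   # flag cases over 2h
print(durations, "slow:", slow)
```

Flagging outlier cases like this is the simplest form of the inefficiency detection the evaluated method targets.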

    Advancing Carbon Sequestration through Smart Proxy Modeling: Leveraging Domain Expertise and Machine Learning for Efficient Reservoir Simulation

    Geological carbon sequestration (GCS) offers a promising solution for managing excess carbon and mitigating the impact of climate change. This doctoral research introduces a cutting-edge Smart Proxy Modeling-based framework, integrating artificial neural networks (ANNs) and domain expertise, to re-engineer and empower numerical reservoir simulation for efficient modeling of CO2 sequestration and to demonstrate the predictive conformance and replicative capabilities of smart proxy modeling. Creating well-performing proxy models requires extensive human intervention and trial-and-error processes. Additionally, a large training database is essential for the ANN model in complex tasks such as deep saline aquifer CO2 sequestration, since it supplies the neural network's input and output data. One major limitation in CCS programs is the scarcity of real field data, owing to the small number of field applications and to confidentiality issues. Considering these drawbacks, and given the high-dimensional nonlinearity, heterogeneity, and coupling of multiple physical processes associated with numerical reservoir simulation, novel research is needed to handle these complexities, as it allows for the creation of possible CO2 sequestration scenarios that may be used as a training set. This study addresses several types of static and dynamic, realistic and practical, field-based data augmentation techniques, spanning spatial complexity, spatio-temporal complexity, and heterogeneity of reservoir characteristics. By incorporating domain-expertise-based feature generation, the framework honors a precise representation of the reservoir while overcoming the computational challenges associated with numerical reservoir tools. The developed ANN accurately replicated fluid flow behavior, resulting in significant computational savings compared to traditional numerical simulation models. The results showed that all the ML models achieved very good accuracy and high efficiency.
The findings revealed that the quality of the path between the focal cell and the injection wells emerged as the most crucial factor in both the CO2 saturation and the pressure estimation models. These insights contribute significantly to our understanding of CO2 plume monitoring, paving the way for investigating reservoir behavior at minimal computational cost. The study's commitment to replicating numerical reservoir simulation results underscores the model's potential to contribute valuable insights into the behavior and performance of CO2 sequestration systems, as a complementary tool to numerical reservoir simulation when no measured field data are available. The transformative nature of this research has vast implications for advancing carbon storage modeling technologies. By addressing the computational limitations of traditional numerical reservoir models and harnessing the synergy between machine learning and domain expertise, this work provides a practical workflow for efficient decision-making in sequestration projects.
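The proxy-modeling workflow, run the expensive simulator a few times, fit a cheap surrogate, then predict without re-simulating, can be sketched with a toy one-parameter "simulator" and a one-neuron linear fit. Everything here is an assumption for illustration; the dissertation trains full ANNs on numerical reservoir simulation outputs:

```python
# Hedged sketch of the proxy-modeling idea: fit a cheap surrogate to a few
# "simulator" runs, then predict new cases at negligible cost.

def expensive_simulator(injection_rate):
    # stand-in for a numerical reservoir run (toy linear response)
    return 3.0 * injection_rate + 2.0

# "training database" generated from a handful of simulator runs
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [expensive_simulator(x) for x in xs]

# fit y = w*x + b by gradient descent on mean squared error
w, b, lr = 0.0, 0.0, 0.02
for _ in range(5000):
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b

print(f"proxy: y = {w:.2f}*x + {b:.2f}")
```

Once the surrogate is fitted, each new prediction is a single arithmetic expression instead of a full simulation run, which is the source of the computational savings the abstract reports.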

    Multi Agent Systems in Logistics: A Literature and State-of-the-art Review

    Based on a literature survey, we aim to answer our main question: “How should we plan and execute logistics in supply chains that aim to meet today’s requirements, and how can we support such planning and execution using IT?” Today’s requirements in supply chains include inter-organizational collaboration and more responsive and tailored supply to meet specific demand. Enterprise systems fall short of meeting these requirements. The focus of planning and execution systems should move towards an inter-enterprise and event-driven mode. Inter-organizational systems may support planning, progressing from supporting information exchange, to enabling synchronized planning within the organizations, towards the capability to do network planning based on information available throughout the network. We provide a framework for planning systems that constitutes a rich landscape of possible configurations, with the centralized and fully decentralized approaches as two extremes. We define and discuss agent-based systems and in particular multi-agent systems (MAS). We emphasize the role of MAS coordination architectures, and explain that transportation is, next to production, an important domain in which MAS can be, and actually are, applied. However, implementation is not widespread, and some implementation issues are explored. We conclude that planning problems in transportation have characteristics that comply with the specific capabilities of agent systems. In particular, these systems are capable of dealing with inter-organizational and event-driven planning settings, hence meeting today’s requirements in supply chain planning and execution.
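A contract-net-style allocation, one of the classic MAS coordination mechanisms a review like this covers, can be sketched as a single announce-bid-award round. The agents, task, and cost models below are invented for illustration:

```python
# Hedged sketch of contract-net-style coordination for transportation:
# a shipper announces a transport task, carrier agents bid their cost,
# and the cheapest valid bid wins. All agents and figures are hypothetical.

class CarrierAgent:
    def __init__(self, name, cost_per_km, available):
        self.name = name
        self.cost_per_km = cost_per_km
        self.available = available

    def bid(self, distance_km):
        """Return a price, or None to abstain (no capacity)."""
        return self.cost_per_km * distance_km if self.available else None

def award(task_distance_km, carriers):
    """Collect bids and award the task to the cheapest bidder."""
    bids = {c.name: c.bid(task_distance_km) for c in carriers}
    valid = {name: price for name, price in bids.items() if price is not None}
    winner = min(valid, key=valid.get)
    return winner, valid[winner]

carriers = [
    CarrierAgent("alpha", 1.2, True),
    CarrierAgent("beta", 0.9, True),
    CarrierAgent("gamma", 0.7, False),   # no capacity, abstains
]
winner, price = award(250, carriers)
print(winner, price)
```

Because each carrier decides locally whether and what to bid, this kind of protocol suits the inter-organizational, event-driven settings the review argues MAS handle well.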