85 research outputs found

    Machine learning methods for quality prediction in thermoplastics injection molding

    Get PDF
    © 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
    Nowadays, competitiveness is a reality in all industrial fields, and the plastic injection industry is no exception. Owing to the complex intrinsic changes that the parameters undergo during the injection process, it is essential to monitor the parameters that influence the quality of the final part in order to guarantee a superior quality of service to customers. Quality requirements call for intelligent systems capable of detecting defects in the produced parts. This article presents a first step towards building an intelligent system for classifying the quality of produced parts. The approach relies on machine learning methods (Artificial Neural Networks and Support Vector Machines) and on techniques that combine the two (ensemble methods). These are trained as classifiers to detect conformity, or even specific defect types, in parts. The data analyzed were collected at a plastic injection company in Portugal. The results show that these techniques can capture the non-linear relationships between the process variables, which yields good accuracy (≈99%) in the identification of defects. Although these techniques already achieve good accuracy, we show that taking the history of the last cycles into account and combining techniques improves performance even further.
    The approach presented in this article has a number of potential advantages for the online prediction of part quality in injection molding processes.
    This work was partially supported by the Spanish State Research Agency through the project CHLOE-GRAPH (PID2020-118649RB-l00) and by FCT—Portuguese Foundation for Science and Technology under project grant UIDB/00308/2020. Postprint (author's final draft).
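    As a rough illustration of the approach this abstract describes — two base classifiers combined in an ensemble to predict part conformity from process variables — the following is a minimal scikit-learn sketch on synthetic data. It is not the paper's implementation; the feature set, model sizes, and data are assumptions chosen only to make the example self-contained.

    ```python
    import numpy as np
    from sklearn.ensemble import VotingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for process variables (e.g. injection pressure,
    # melt temperature, cycle time) -- purely illustrative, not the paper's data.
    X = rng.normal(size=(600, 3))
    y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0.3).astype(int)  # 1 = defective part

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    ann = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                      random_state=0))
    svm = make_pipeline(StandardScaler(), SVC(probability=True, random_state=0))

    # Soft-voting ensemble: average the two classifiers' predicted probabilities.
    ensemble = VotingClassifier([("ann", ann), ("svm", svm)], voting="soft")
    ensemble.fit(X_train, y_train)
    accuracy = ensemble.score(X_test, y_test)
    ```

    The abstract's further improvement from cycle history would correspond to appending features from the previous cycles to each sample before training.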

    An approach to integrating manufacturing data from legacy Injection Moulding Machines using OPC UA

    Get PDF
    To achieve the ambitions related to the concept of a Smart Factory, manufacturers of new industrial devices have been developing and releasing products capable of integrating themselves into fully connected environments, with the communication capabilities and advanced specifications required. In these environments, the automatic retrieval of data across the shop floor is a must, allowing the analysis of machine performance for increased production quality and output. On most recently released industrial devices this machine data is readily available. However, the same is not true of legacy devices. It is also well established that most SMEs are unable or unwilling to radically replace their industrial devices for this purpose alone, since that would imply a high investment and, more importantly, because many of these legacy machines remain highly productive. Hence, there is a need to develop integration methodologies for these legacy industrial devices and provide them with the communication capabilities that make them suitable for the new Smart Factory environments. In this work, an approach is proposed, using an industrial shop floor as a case study, to integrate data from a range of injection moulding machines of different generations and different models/manufacturers. This equipment diversity renders automatic interconnection extremely challenging, but it is also representative of many existing industrial scenarios. This research will contribute to the development of integration methodologies and, consequently, improve equipment compatibility. To apply these methodologies, information about specific machines within the shop floor was gathered, as well as their communication and I/O capabilities, together with other features deemed relevant.
    A trend can be identified in recently released machines, revealing a special focus on the OPC UA standard and on the use of its address space based on the structured Euromap information models. Legacy devices, on the other hand, mainly allow outputting a text file, containing machine and injection-cycle information, to an external storage unit connected to the machine. Regarding the communication interfaces available, Ethernet proves to be the most common among the recently acquired machines, while USB is the main interface on older equipment. An experimental solution was developed for the presented case study, which uses the machine's USB interface to access these files at each injection cycle, mapping the acquired data to structured information-model variables according to the Euromap specifications and making them available through an OPC UA server address space. The developed server provides a standardized, interoperable, scalable, and secure approach for data exchange between the injection moulding machines and various OPC UA clients, allowing device monitoring and control during operation, as well as transmitting this data to higher-level management systems, e.g., MES and ERP systems. This solution shows that older legacy devices, available across shop floors, can be retrofitted and integrated in Smart Factory scenarios side by side with recently released equipment, giving production managers access to the information needed to monitor and improve the production process, thus moving towards the Factories of the Future.
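    The acquisition step described above — reading the cycle file a legacy machine exports and mapping its columns to information-model variables — can be sketched as follows. The file layout, delimiter, column names, and Euromap-style variable names are all assumptions for illustration; the actual formats vary by machine and are not specified in the abstract.

    ```python
    import csv
    import io

    # Assumed export format: one semicolon-delimited row per injection cycle.
    # In the real setting this file would be read from the machine's USB storage.
    cycle_log = io.StringIO(
        "CycleNo;InjectionPressure_bar;MeltTemp_C;CycleTime_s\n"
        "1001;852.4;231.5;24.8\n"
        "1002;849.9;230.8;24.9\n"
    )

    # Hypothetical mapping from file columns to Euromap-style information-model
    # variable names (illustrative names, not taken from the specification).
    euromap_map = {
        "InjectionPressure_bar": "InjectionUnit.ActualInjectionPressure",
        "MeltTemp_C": "InjectionUnit.ActualMeltTemperature",
        "CycleTime_s": "Machine.ActualCycleTime",
    }

    cycles = []
    for row in csv.DictReader(cycle_log, delimiter=";"):
        # Map each raw column to its information-model variable name.
        cycles.append({var: float(row[col]) for col, var in euromap_map.items()})
    ```

    In the paper's solution, each such per-cycle record would then be written into the corresponding variables of an OPC UA server's address space rather than kept in a list.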

    Identifying key interactions between process variables of different material categories using mutual information-based network inference method

    Get PDF
    This paper analyzes production data from injection molding processes to identify key interactions between the process variables of different material categories using the network inference method called "bagging conservative causal core network" (BC3net). This approach is an ensemble method in which mutual information is measured between process variables to select pairs that share significant information. We construct networks for different time intervals and aggregate them by calculating the proportion of significant pairs of process variables (weighted edges) for each production process over time. The weighted edges of the aggregated network for each product are used in a machine learning model to optimize the network interval size (interval split) and feature selection, where edge weights are the input features and material categories are the output classification labels. The time intervals are optimized based on the classification accuracy of the machine learning model. Our analysis shows that the aggregated edge features of inferred networks can classify different material categories and identify critical features that represent interdependence in the associated process variables. We further used "one vs. other" labels for the machine learning models to identify material-specific interactions for each material category. Additionally, we constructed an aggregated network over all samples in which the process-variable interactions were steady over time. The resulting network showed modular characteristics, where process variables of similar categories were grouped in the same community.
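    The core building block described above — scoring pairs of process variables by mutual information and keeping only the pairs whose score is significant — can be illustrated with a toy sketch. This is not the BC3net implementation; it uses simple discretization and a permutation-based significance test on synthetic variables whose names are invented for the example.

    ```python
    from itertools import combinations

    import numpy as np
    from sklearn.metrics import mutual_info_score

    rng = np.random.default_rng(1)
    n = 500
    temp = rng.normal(size=n)
    pressure = 0.8 * temp + 0.2 * rng.normal(size=n)  # strongly dependent on temp
    cycle_time = rng.normal(size=n)                   # independent of both
    data = {"temp": temp, "pressure": pressure, "cycle_time": cycle_time}

    def discretize(x, bins=8):
        """Bin a continuous variable so mutual information can be estimated."""
        return np.digitize(x, np.histogram_bin_edges(x, bins=bins))

    edges = {}
    for a, b in combinations(data, 2):
        xa, xb = discretize(data[a]), discretize(data[b])
        mi = mutual_info_score(xa, xb)
        # Null distribution: MI after randomly permuting one variable,
        # which destroys any real dependence between the pair.
        null = [mutual_info_score(xa, rng.permutation(xb)) for _ in range(100)]
        if mi > np.percentile(null, 99):
            edges[(a, b)] = mi  # keep the pair as a weighted network edge
    ```

    In the paper, such edges are inferred per time interval and then aggregated, with the edge weights serving as classifier features.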

    Scientific Advances in STEM: From Professor to Students

    Get PDF
    This book collects the publications of the special Topic "Scientific Advances in STEM: From Professor to Students". The aim is to contribute to the advancement of the science and engineering fields and their impact on the industrial sector, which requires a multidisciplinary approach. The university generates and transmits knowledge to serve society. Social demands continuously evolve, mainly because of cultural, scientific, and technological development. Researchers must contextualize the subjects they investigate in terms of their application to local industry and community organizations, frequently from a multidisciplinary point of view, to enhance progress in a wide variety of fields (aeronautics, automotive, biomedical, electrical and renewable energy, communications, environmental, electronic components, etc.). Most investigations in the fields of science and engineering require the work of multidisciplinary teams, representing a stockpile of research projects at different stages (final-year projects, master's or doctoral studies). In this context, this Topic offers a framework for integrating interdisciplinary research, drawing together experimental and theoretical contributions in a wide variety of fields.

    Personalised antimicrobial management in secondary care

    Get PDF
    Background: The growing threat of Antimicrobial Resistance (AMR) requires innovative methods to promote the sustainable effectiveness of antimicrobial agents. Hypothesis: This thesis aimed to explore the hypothesis that personalised decision-support interventions can enhance antimicrobial management across secondary care. Methods: Different research methods were used to investigate this hypothesis. Individual physician decision making was mapped, and patient experiences of engagement with decision making were explored using semi-structured interviews. Cross-specialty engagement with antimicrobial management was investigated through cross-sectional analysis of conference abstracts and educational training curricula. Artificial intelligence tools were developed to explore their ability to predict the likelihood of infection and provide individualised prescribing recommendations using routine patient data. Dynamic, individualised dose optimisation was explored through: (i) development of a microneedle-based electrochemical biosensor for minimally invasive monitoring of beta-lactams; and (ii) pharmacokinetic (PK)-pharmacodynamic (PD) modelling of a new PK-PD index using C-Reactive Protein (CRP) to predict the pharmacodynamics of vancomycin. Ethics approval was granted for all aspects of work explored within this thesis. Results: Mapping of individual physician decision making during infection management demonstrated several areas where personalised, technological interventions could enhance antimicrobial management. At the specialty level, non-infection specialties have little engagement with antimicrobial management. The importance of engaging surgical specialties, which have relatively high rates of antimicrobial usage and healthcare-associated infections, was observed.
    An individualised information leaflet, co-designed with patients to provide personalised infection information to in-patients receiving antibiotics, significantly improved knowledge and reported engagement with decision making. Artificial intelligence was able to enhance the prediction of infection and the prescribing of antimicrobials using routinely available clinical data. Real-time, continuous penicillin monitoring was demonstrated in vivo using a microneedle-based electrochemical sensor. A new PK-PD index, using C-Reactive Protein, was able to predict individual patient response to vancomycin therapy at 96-120 hours of therapy. Conclusion: Through co-design and the application of specific technologies it is possible to provide personalised antimicrobial management within secondary care.

    When costs from being a constraint become a driver for concept generation

    Get PDF
    Managing innovation requires solving issues related to the internal development and engineering processes of a company (supply side), in addition to facing the market and competition (demand side). In this context, the product development process is crucial, as different trade-offs and issues that require managerial attention tend to arise. The main challenges are that managers require practical support tools to help them plan and control the process, while designers require tools to support their design decisions. Hence, the thesis focuses on product costs to understand their influence on design decisions as well as on the overall management of the product development process. The core part of the thesis is based on the models and methods developed for enhancing cost analysis at the beginning of the product development process. This investigation aims to determine the importance of cost estimation in improving the overall performance of a newly designed product. The focus on post-sales and, more generally, on the customer has become so relevant that manufacturers have to take into account not only the most obvious aspects of the product and related services, but also the associated implications for customers during product use. However, implementing a product life-cycle perspective is still a challenging process for companies. From a methodological perspective, the reasons include uncertainty regarding the available approaches and ambiguity about their application. In terms of implementation, the main challenge is long-term cost management, when one considers uncertainty in process duration, data collection, and other supply chain issues. In fact, helping designers and managers efficiently understand the strategic and operational consequences of a cost-analysis implementation is still a problem, although advanced methodologies for more in-depth and timely analyses are available.
    This is all the more true when one considers that the product life cycle represents a critical area of investment, particularly in light of the new challenges and opportunities provided by big data analysis in Industry 4.0 contexts. This dissertation addresses these aspects and provides a methodological approach to a rigorous implementation of life-cycle costing, while discussing the evidence derived from its operational and strategic impacts. The novelty lies in the way the data and information are collected, dynamically moving the focus of the investigation with regard to the data aggregation level and the product structure. The way the techniques have been combined represents a further aspect of novelty. In fact, the introduced approach contributes to a new trend in the Product Cost Estimation (PCE) literature, which suggests the integration of different techniques for product life-cycle cost analysis. The findings obtained at the end of the process can be employed to assess the impact of platform design strategy and variety proliferation on total life-cycle costs. By evaluating the possible mix of options, and hence offering the optimal product configuration, a more conscious way of planning the product portfolio is provided. In this sense, a detailed operational analysis (such as cost estimation) is used to inform and drive the strategic planning of the portfolio. Finally, the thesis discusses the future opportunities and challenges for product cost analysis, assessing how the digitalisation of manufacturing operations may affect the data gathering and analysis process. In this new environment, the opportunities for more informed, cost-driven decision making will multiply, leading to varied opportunities in this research field.

    Systems Engineering: Availability and Reliability

    Get PDF
    Current trends in Industry 4.0 are largely related to issues of reliability and availability. As a result of these trends and the complexity of engineering systems, research and development in this area needs to focus on new solutions for the integration of intelligent machines or systems, with an emphasis on changes in production processes aimed at increasing production efficiency or equipment reliability. The emergence of innovative technologies and new business models based on innovation, cooperation networks, and the enhancement of endogenous resources is assumed to be a strong contribution to the development of competitive economies all around the world. Innovation and engineering, focused on sustainability, reliability, and availability of resources, play a key role in this context. The scope of this Special Issue is closely associated with that of the ICIE'2020 conference. Both the conference and this Special Issue aim to present current innovations and engineering achievements of top world scientists and industrial practitioners in thematic areas related to reliability and risk assessment, innovations in maintenance strategies, production process scheduling, management and maintenance, and systems analysis, simulation, design, and modelling.

    Technologies and Applications for Big Data Value

    Get PDF
    This open access book explores cutting-edge solutions and best practices for big data and data-driven AI applications in the data-driven economy. It provides the reader with a basis for understanding how technical issues can be overcome to offer real-world solutions to major industrial areas. The book starts with an introductory chapter that provides an overview of the book by positioning the following chapters in terms of their contributions to technology frameworks which are key elements of the Big Data Value Public-Private Partnership and the upcoming Partnership on AI, Data and Robotics. The remainder of the book is arranged in two parts. The first part, "Technologies and Methods", contains horizontal contributions of technologies and methods that enable data value chains to be applied in any sector. The second part, "Processes and Applications", details experience reports and lessons from using big data and data-driven approaches in processes and applications. Its chapters are co-authored with industry experts and cover domains including health, law, finance, retail, manufacturing, mobility, and smart cities. Contributions emanate from the Big Data Value Public-Private Partnership and the Big Data Value Association, which have acted as the nucleus of the European data community, bringing businesses together with leading researchers to harness the value of data to benefit society, business, science, and industry. The book is of interest to two primary audiences: first, undergraduate and postgraduate students and researchers in various fields, including big data, data science, data engineering, machine learning, and AI; and second, practitioners and industry experts engaged in data-driven systems and software design and deployment projects who are interested in employing these advanced methods to address real-world problems.

    Ensuring the resilience of wireless sensor networks to malicious data injections through measurements inspection

    Get PDF
    Malicious data injections pose a severe threat to systems based on Wireless Sensor Networks (WSNs), since they give the attacker control over the measurements and, in turn, over the system's status and response. Malicious measurements are particularly threatening when used to spoof or mask events of interest, thus eliciting or preventing desirable responses. Spoofing and masking attacks are particularly difficult to detect since they depict plausible behaviours, especially if multiple sensors have been compromised and collude to inject a coherent set of malicious measurements. Previous work has tackled the problem through measurements inspection, which analyses the inter-measurement correlations induced by the physical phenomena. However, these techniques consider simplistic attacks and are not robust to collusion. Moreover, they assume highly predictable patterns in the measurements distribution, which are invalidated by the unpredictability of events. We design a set of techniques that effectively detect malicious data injections in the presence of sophisticated collusion strategies, when one or more events manifest. Moreover, we build a methodology to characterise the likely compromised sensors. We also design diagnosis criteria that allow us to distinguish anomalies arising from malicious interference from those arising from faults. In contrast with previous work, we test the robustness of our methodology against automated and sophisticated attacks, where the attacker aims to evade detection. We conclude that our approach outperforms state-of-the-art approaches. Moreover, we quantitatively estimate the WSN's degree of resilience and provide a methodology that gives a WSN owner an assured degree of resilience by automatically designing the WSN deployment.
    To also cope with the extreme scenario in which the attacker has compromised most of the WSN, we propose a combination with software attestation techniques, which are more reliable when malicious data originates from compromised software, but also more expensive; this combination achieves an excellent trade-off between cost and resilience.
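    The measurements-inspection idea described above — exploiting the correlations that a shared physical phenomenon induces between sensors to flag implausible readings — can be illustrated with a toy residual check. This is only a minimal sketch on synthetic data, not the thesis's detection, characterisation, or diagnosis methodology, and the threshold rule is an invented simplification.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    phenomenon = np.cumsum(rng.normal(size=300))  # shared physical signal

    # Five sensors observe the same phenomenon with independent noise.
    readings = np.stack([phenomenon + 0.1 * rng.normal(size=300)
                         for _ in range(5)])
    readings[3, 150:] += 5.0  # injected malicious offset on sensor 3

    flags = []
    for i in range(readings.shape[0]):
        others = np.delete(readings, i, axis=0)
        # Residual against the median of the other sensors; the median is
        # robust to a single colluding or faulty outlier among them.
        residual = readings[i] - np.median(others, axis=0)
        # Threshold calibrated on an assumed-clean initial window.
        threshold = 5 * residual[:100].std()
        flags.append(bool(np.abs(residual).max() > threshold))
    ```

    A collusion-robust scheme such as the one in this work must go well beyond this: a single median check fails once enough sensors inject mutually coherent values, which is precisely the case the thesis addresses.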
