
    A technical perspective on integrating artificial intelligence to solid-state welding

    The implementation of artificial intelligence (AI) techniques in industrial applications, especially solid-state welding (SSW), has transformed the modeling, optimization, forecasting, and control of sophisticated systems. SSW is a favourable joining method because minimal melting of the material preserves the integrity of the nugget region. This study thoroughly investigates how AI-based prediction has impacted SSW, examining methods such as artificial neural networks (ANN), fuzzy logic (FL), machine learning (ML), meta-heuristic algorithms, and hybrid methods (HM) as applied to friction stir welding (FSW), ultrasonic welding (UW), and diffusion bonding (DB). Studies on diffusion bonding reveal that ANN and genetic algorithms (GA) can predict outcomes with accuracies of 85-99%, while optimization strategies such as response surface methodology can achieve up to 95% confidence levels in improving bonding strength and optimizing process parameters. For FSW, ANNs achieve prediction accuracies of about 95%, and refining them with metaheuristics improves accuracy by a further 2% or so. In UW, ANN, hybrid ANN, and ML models predict output parameters with accuracies ranging from 85 to 96%. Integrating AI techniques with optimization algorithms such as GA and particle swarm optimization (PSO) significantly improves accuracy, enhancing parameter prediction and optimizing UW processes. ANN attains high accuracy of nearly 95% in predicting welding parameters compared with techniques such as FL and ML. Hybrid methods exhibit superior precision, showcasing their potential to enhance weld quality, minimize trial welds, and reduce cost and time; various emerging hybrid methods offer still better prediction accuracy.
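
    As a concrete illustration of the kind of ANN-based prediction surveyed here, the sketch below trains a small multilayer perceptron to map friction stir welding parameters to joint strength. This is a minimal sketch under assumed data: the generator, parameter ranges and network size are placeholders for demonstration, not values from the reviewed studies.

```python
# Illustrative ANN weld-property prediction; all data are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Assumed FSW process parameters: tool rotation (rpm),
# traverse speed (mm/min), axial force (kN).
n = 400
rpm = rng.uniform(600, 1400, n)
feed = rng.uniform(20, 120, n)
force = rng.uniform(4, 10, n)
X = np.column_stack([rpm, feed, force])

# Synthetic ultimate tensile strength (MPa): a peaked response plus noise.
uts = (250 - 4e-5 * (rpm - 1000)**2 - 0.01 * (feed - 70)**2
       + 3.0 * force + rng.normal(0, 5, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, uts, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 16),
                                 max_iter=5000, random_state=0))
ann.fit(X_tr, y_tr)
print("held-out R^2:", ann.score(X_te, y_te))
```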

    Expert System Applications in Sheet Metal Forming

    17th Symposium „Materijali i metalurgija“ (Materials and Metallurgy) – Supplement to the Book of Abstracts

    In Metalurgija 63 (2024) 2, 303-320, the Book of Abstracts (224 abstracts) was published. The deadline for receipt of abstracts was 30 November 2023. Many authors requested a new deadline of 25 March 2024, which the organizing committee accepted. A supplement of 103 additional abstracts is now published.

    A survey of AI in operations management from 2005 to 2009

    Purpose: the use of AI for operations management, with its ability to evolve solutions, handle uncertainty and perform optimisation, continues to be a major field of research. The growing body of publications over the last two decades means that it can be difficult to keep track of what has been done previously, what has worked, and what really needs to be addressed. Hence this paper presents a survey of the use of AI in operations management, aimed at presenting the key research themes, trends and directions of research. Design/methodology/approach: the paper builds upon our previous survey of this field, which was carried out for the ten-year period 1995-2004 (Kobbacy et al., 2007). Like the previous survey, it uses Elsevier’s ScienceDirect database as a source, and the framework and methodology are kept as similar as possible to enable continuity and comparison of trends. Thus, the application categories adopted are: design; scheduling; process planning and control; and quality, maintenance and fault diagnosis. Research utilising neural networks, case-based reasoning (CBR), fuzzy logic (FL), knowledge-based systems (KBS), data mining, and hybrid AI in the four application areas is identified. Findings: the survey categorises over 1,400 papers, identifying the uses of AI in the four categories of operations management, and concludes with an analysis of the trends, gaps and directions for future research. The findings include: the trends for design and scheduling show a dramatic increase in the use of genetic algorithms since 2003 that reflects recognition of their success in these areas; there is a significant decline in research on the use of KBS, reflecting their transition into practice; there is an increasing trend in the use of FL in quality, maintenance and fault diagnosis; and there are surprising gaps in the use of CBR and hybrid methods in operations management that offer opportunities for future research. Originality/value: this is the largest and most comprehensive study to date classifying research on the use of AI in operations management. The survey and trends identified provide a useful reference point and directions for future research.

    Predicting the Future

    Due to the increased capabilities of microprocessors and the advent of graphics processing units (GPUs) in recent decades, machine learning methodologies have become popular in many fields of science and technology. This fact, together with the availability of large amounts of information, means that machine learning and Big Data now have an important presence in the field of energy. This Special Issue, entitled “Predicting the Future—Big Data and Machine Learning”, focuses on applications of machine learning methodologies in the energy field. Topics include, but are not limited to: big data architectures for power supply systems, energy-saving and efficiency models, environmental effects of energy consumption, prediction of occupational health and safety outcomes in the energy industry, price forecasting of raw materials, and energy management of smart buildings.

    Soft computing for tool life prediction: a manufacturing application of neural-fuzzy systems

    Tooling technology is recognised as an element of vital importance within the manufacturing industry. Critical tooling decisions related to tool selection, tool life management, optimal determination of cutting conditions and on-line machining process monitoring and control are based on the existence of reliable, detailed process models. Among the decisive factors of process planning and control activities, tool wear and tool life considerations hold a dominant role. Yet both off-line tool life prediction and real-time tool wear identification and prediction are still issues open to research. The main reason lies with the large number of factors influencing tool wear, some of them being of a stochastic nature. The inherent variability of workpiece materials, cutting tools and machine characteristics further increases the uncertainty of the machining optimisation problem. In machining practice, tool life prediction is based on the availability of data provided by tool manufacturers, machining data handbooks or the shop floor. This thesis recognises the need for a data-driven, flexible and yet simple approach to predicting tool life. Model building from sample data depends on the availability of a sufficiently rich cutting data set. Flexibility requires a tool-life model with high adaptation capacity. Simplicity calls for a solution with low complexity that is easily interpretable by the user. A neural-fuzzy systems approach is adopted, which meets these targets and predicts tool life for a wide range of turning operations. A literature review has been carried out, covering areas such as tool wear and tool life, neural networks, fuzzy set theory and neural-fuzzy systems integration. Various sources of tool life data have been examined. It is concluded that a combined use of simulated data from existing tool life models and real-life data is the best policy to follow. The neuro-fuzzy tool life model developed is constructed by employing neural network-like learning algorithms. The trained model stores the learned knowledge in the form of fuzzy IF-THEN rules in its structure, thus featuring the desired transparency. Low model complexity is ensured by employing an algorithm which constructs a rule base of reduced size from the available data. In addition, the flexibility of the developed model is demonstrated by the ease, speed and efficiency of its adaptation on the basis of new tool life data. The development of the neuro-fuzzy tool life model is based on the Fuzzy Logic Toolbox (v1.0) of MATLAB (v4.2c.1), a dedicated tool which facilitates the design and evaluation of fuzzy logic systems. Extensive results are presented which demonstrate the neuro-fuzzy model's predictive performance. The model can be directly employed within a process planning system, facilitating the optimisation of turning operations. Recommendations are made for further enhancements in this direction.
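
    To make the neuro-fuzzy idea concrete, below is a minimal sketch of a first-order Sugeno (ANFIS-style) tool-life model: fixed Gaussian membership functions form the premise layer, and the linear rule consequents are fitted in one least-squares pass, echoing the hybrid-learning idea. The extended Taylor data generator and all constants are illustrative assumptions, not the thesis's model or data.

```python
# Minimal first-order Sugeno (ANFIS-style) tool-life sketch.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic turning data from an assumed extended Taylor tool-life law:
# T = C / (V^a * f^b); V = cutting speed (m/min), f = feed (mm/rev).
C, a, b = 4e8, 3.0, 1.2
V = rng.uniform(100, 300, 300)
f = rng.uniform(0.1, 0.4, 300)
T = C / (V**a * f**b) * rng.lognormal(0.0, 0.05, 300)

X = np.column_stack([np.log(V), np.log(f)])  # log space is near-linear
y = np.log(T)

# Premise layer: two Gaussian membership functions ("low", "high") per input.
lo, hi = X.min(axis=0), X.max(axis=0)
centers = np.stack([lo, hi])                 # shape (2 MFs, 2 inputs)
sigma = (hi - lo) / 2.0

def firing(Xq):
    """Normalized firing strengths of the 2x2 rule grid (product t-norm)."""
    d = Xq[:, None, :] - centers[None, :, :]
    m = np.exp(-0.5 * (d / sigma) ** 2)      # (n, MF, input)
    w = np.einsum('ni,nj->nij', m[:, :, 0], m[:, :, 1]).reshape(len(Xq), 4)
    return w / w.sum(axis=1, keepdims=True)

def design(Xq):
    """Stack rule-weighted linear consequent regressors [x1, x2, 1]."""
    ext = np.hstack([Xq, np.ones((len(Xq), 1))])
    Wq = firing(Xq)
    return np.hstack([Wq[:, [r]] * ext for r in range(4)])

# Consequent layer: solve all rule consequents in one least-squares pass.
theta, *_ = np.linalg.lstsq(design(X), y, rcond=None)

def predict_tool_life(V, f):
    Xq = np.column_stack([np.log(np.atleast_1d(V)), np.log(np.atleast_1d(f))])
    return np.exp(design(Xq) @ theta)        # predicted tool life (min)

print(predict_tool_life(200.0, 0.2))
```

    Working in log space makes the synthetic Taylor relationship nearly linear, so a four-rule base suffices for this toy data; the thesis's rule-base reduction and adaptation mechanisms are beyond this sketch.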

    Tool wear monitoring in machining of stainless steel

    Monitoring systems for automated machines must be capable of operating on-line and interpreting the working condition of the machining process at any given point in time, because the system is automated and unmanned. This poses the challenge that led to this research study. Generally, optimisation of a machining process can be categorised as minimisation of tool wear, minimisation of operating cost, maximisation of process output and optimisation of machine parameters. Tool wear is a complex phenomenon capable of reducing surface quality, increasing power consumption and increasing the rejection rate of machined parts. Tool wear has a direct effect on the quality of the surface finish of any given workpiece, on dimensional precision and ultimately on the cost of the parts produced. Tool wear usually occurs in combination with a principal wear mode that depends on cutting conditions, tool insert geometry, workpiece and tool material. Therefore, there is a need to develop a continuous tool monitoring system that notifies the operator of the state of the tool, to avoid tool failure or other undesirable circumstances. A tool wear monitoring system for macro-milling has been studied using a design and analysis of experiments (DOE) approach; regression analysis, analysis of variance (ANOVA), Box-Behnken design and response surface methodology (RSM) were used to model the tool wear. Further investigations were then carried out on the acquired data using signal processing and a neural network framework to validate the model. The effects of the cutting parameters are evaluated and the optimal cutting conditions determined. The interaction of cutting parameters is established to illustrate the intrinsic relationship between cutting parameters, tool wear and material removal rate. It was observed that, when working with stainless steel 316, a maximum tool wear value of 0.29 mm was reached through optimisation at a low feed of about 0.06 mm/rev, a speed of 4050 mm/min and a depth of cut of about 2 mm.
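
    As a hedged sketch of the response-surface modelling step described above, the snippet below fits a full quadratic (second-order RSM) model of flank wear against speed, feed and depth of cut. The data generator and coefficients are synthetic placeholders, not the study's measurements.

```python
# Illustrative second-order response-surface model of tool wear.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)

# Assumed factors: speed v (mm/min), feed f (mm/rev), depth of cut d (mm).
v = rng.uniform(3000, 5000, 60)
f = rng.uniform(0.05, 0.15, 60)
d = rng.uniform(1.0, 3.0, 60)
X = np.column_stack([v, f, d])

# Synthetic flank wear (mm) with main, interaction and quadratic effects.
VB = (0.05 + 4e-5 * v + 0.8 * f + 0.02 * d
      + 2e-4 * v * f + 0.01 * d**2 + rng.normal(0, 0.01, 60))

# Full quadratic (RSM) model: linear + interaction + squared terms.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), VB)

print("R^2:", model.score(poly.transform(X), VB))
print("Predicted VB at v=4050, f=0.06, d=2:",
      model.predict(poly.transform([[4050, 0.06, 2.0]]))[0])
```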

    Artificial cognitive architecture with self-learning and self-optimization capabilities. Case studies in micromachining processes

    Unpublished doctoral thesis, defended at the Universidad Autónoma de Madrid, Escuela Politécnica Superior, Departamento de Ingeniería Informática. Date of defence: 22-09-201

    An investigation into the prognosis of electromagnetic relays

    Electrical contacts provide a well-proven solution to switching various loads in a wide variety of applications, such as power distribution, control applications, automotive and telecommunications. However, electrical contacts have limited reliability because the switching contacts degrade through arcing and fretting; in essence, the life of the device may be determined by the limited life of its contacts. Failure to trip, spurious tripping and contact welding can, in critical applications such as control systems for avionics and nuclear power, cause significant downtime costs as well as safety implications. Prognostics provides a way to assess the remaining useful life (RUL) of a component based on its current state of health and its anticipated future usage and operating conditions. In this thesis, the effect of contact wear on a set of electromagnetic relays used in an avionic power controller is examined, together with how contact resistance, combined with a prognostic approach, can be used to ascertain the RUL of the device. Two methodologies are presented: first, a physics-based model (PbM) of the degradation using the predicted material loss due to arc damage; second, a computationally efficient technique that uses posterior degradation data to form a state-space model in real time via a sliding window recursive least squares (SWRLS) algorithm. Health monitoring using the presented techniques can provide knowledge of impending failure in high-reliability applications where the risks associated with loss of functionality are too high to endure. The future states of the system have been estimated by projecting the models through particle and Kalman filters within a Bayesian framework. The performance of the prognostic health management algorithm over the contacts' life has been quantified using performance evaluation metrics, and the model predictions have been correlated with experimental data. Prognostic metrics including prognostic horizon (PH), alpha-lambda (α-λ) and relative accuracy have been used to assess the performance of the damage proxies, and a comparison of the two models is made.
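
    A toy version of the second methodology is sketched below: a sliding-window least-squares fit of a linear resistance-drift model, extrapolated to a failure threshold to yield an RUL estimate. The drift model, window length and threshold are assumptions for illustration, not the thesis's parameters.

```python
# Illustrative sliding-window RUL estimate from contact-resistance drift.
import numpy as np

rng = np.random.default_rng(2)

# Synthetic contact resistance (milliohm) drifting upward over cycles.
cycles = np.arange(0, 100_000, 500)
r = 1.0 + 8e-6 * cycles + rng.normal(0, 0.05, cycles.size)

THRESHOLD = 1.6   # assumed failure criterion: resistance limit (milliohm)
WINDOW = 40       # assumed sliding-window length (samples)

def swrls_rul(t, y, now):
    """Fit y = a + b*t over the last WINDOW samples ending at index `now`
    and extrapolate the trend to the failure threshold."""
    s = slice(max(0, now - WINDOW), now + 1)
    A = np.column_stack([np.ones(t[s].size), t[s]])
    (a, b), *_ = np.linalg.lstsq(A, y[s], rcond=None)
    if b <= 0:
        return np.inf                 # no degradation trend detected yet
    return max((THRESHOLD - a) / b - t[now], 0.0)

for i in (80, 120, 160, 190):
    print(f"cycle {cycles[i]:>6}: estimated RUL ~ {swrls_rul(cycles, r, i):,.0f} cycles")
```

    A true recursive implementation would update the parameter estimate and its covariance incrementally as each sample arrives; refitting per window yields the same estimate here and keeps the sketch short.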

    Machine Learning in Tribology

    Tribology has been and continues to be one of the most relevant fields, being present in almost all aspects of our lives. The understanding of tribology provides us with solutions for future technical challenges. At the root of all advances made so far are multitudes of precise experiments and an increasing number of advanced computer simulations across different scales and multiple physical disciplines. Based upon this sound and data-rich foundation, advanced data handling, analysis and learning methods can be developed and employed to expand existing knowledge. Modern machine learning (ML) and artificial intelligence (AI) methods therefore provide opportunities to explore the complex processes in tribological systems and to classify or quantify their behaviour efficiently, even in real time. Their potential thus also extends beyond purely academic aspects into actual industrial applications. To help pave the way, this article collection aimed to present the latest research on ML and AI approaches for solving tribology-related issues, generating true added value beyond mere buzzwords. In this sense, this Special Issue can support researchers in identifying initial selections and best-practice solutions for ML in tribology.