
    A semantically constrained Bayesian network for manufacturing diagnosis

    The diagnostic problem is posed as recognizing patterns in rejection data and the subsequent mapping to causes. A new network architecture has been proposed which should overcome many of the disadvantages of the existing diagnostic tools. The network is based on the authors’ earlier work (Ransing et al. 1995) on representing the causal relationship in the defect-metacause-rootcause form. Although the algorithm is based on Bayesian analysis, many of the laws of probability have been altered to suit the complexities involved. For example, the notion of conditional probability has been generalized to enable belief revision even in the presence of partial evidence. The inherent presence of a degree of ignorance or uncertainty in the quantification of a relationship has also been considered. Rigorous constraints, again based on the laws of probability, have been developed to check the consistency among the network values. The network needs to be initialized with only a few values, or ranges for them, and a set of globally consistent values is then generated automatically and efficiently. Using the most suitable set of consistent values, the diagnosis is performed using the generalized Bayesian analysis. The network has been tested for a pressure die casting process; however, it is generic in nature and can also be applied to other manufacturing processes.
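
    The belief revision step can be illustrated with a minimal sketch. The interpolation weight used here is a hypothetical stand-in for the paper’s generalized conditional probability, and the causes, priors and likelihoods are invented for illustration:

        # Jeffrey-style belief revision over defect -> root-cause links under
        # partial evidence; the `strength` weighting is an assumption, not the
        # paper's exact generalization of conditional probability.
        def revise_beliefs(priors, likelihoods, strength):
            # priors: {cause: P(cause)}; likelihoods: {cause: P(defect | cause)}
            # strength in [0, 1]: 0 = defect not observed, 1 = fully observed.
            posterior = {c: priors[c] * likelihoods[c] for c in priors}
            z = sum(posterior.values())
            posterior = {c: p / z for c, p in posterior.items()}
            # Partial evidence: interpolate between prior and full posterior.
            return {c: (1 - strength) * priors[c] + strength * posterior[c]
                    for c in priors}

        priors = {"cold_die": 0.2, "low_pressure": 0.5, "alloy_mix": 0.3}
        likelihoods = {"cold_die": 0.7, "low_pressure": 0.2, "alloy_mix": 0.1}
        print(revise_beliefs(priors, likelihoods, strength=0.6))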

    “If only my foundry knew what it knows …”: A 7Epsilon perspective on root cause analysis and corrective action plans for ISO9001:2008

    The famous quote “if only we knew what we know”, attributed to a former Chairman, President and CEO of Texas Instruments and to a former Chairman of HP, is very much applicable to the foundry industry. Although many advances have been made in the field of foundry technologies relating to simulation software, moulding machines, binder formulation and alloy development, poor quality remains a major issue that affects many foundries, not only in terms of lost revenue but also through negative environmental impacts. On an annual casting production of 95 million tonnes, assuming that on average 5% defective castings are produced at a production cost of €1.2 per kg for ferrous alloys, the foundry industry is losing €5.7 billion, producing landfill waste well in excess of two million tonnes and releasing just under two million tonnes of CO2 emissions. Foundries hold a vast body of knowledge that is waiting to be tapped, documented, shared and reused in order to realise this saving potential of €5.7 billion per year. This ambitious goal can only be achieved by developing effective knowledge management strategies to create, retain and re-use foundry and product specific process knowledge whilst supporting a smart and sustainable growth strategy. This is the focus of 7Epsilon (7ε), an innovative methodology led by Swansea University along with a consortium of European universities and research organisations. At the core of 7ε capabilities is casting process optimisation, which is defined as a methodology of using existing casting process knowledge to discover new process knowledge by studying patterns in data [1]. According to the 7ε terminology, casting process knowledge is actionable information in the form of a list of measurable factors and their optimal ranges to achieve a desired business goal [1, 2]. In this paper a penalty matrix approach is described for discovering main effects and interactions among process factors and responses by analysing data collected during a stable casting process. Through a practical case study it is shown how this technique can be used as an effective tool in the root cause analysis of nonconforming products in the implementation of ISO9001:2008 requirements for continual improvement. In addition, some practical aspects concerning the development of a knowledge management repository to store and retrieve foundry process knowledge are discussed. A template to document and structure foundry and product specific process knowledge is proposed so that knowledge can be stored and retrieved more efficiently by process engineers and managers, with the final aim of improving process operations and reducing defect rates, taking a significant step towards achieving zero defect manufacturing.
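
    The penalty matrix idea can be sketched as below, assuming a simplified scheme in which nonconforming castings are mapped to penalty scores and tallied per quantile range of a process factor; the column names, penalty scale and three-bin split are illustrative rather than 7Epsilon’s published rules:

        import pandas as pd

        df = pd.DataFrame({
            "pour_temp": [1395, 1410, 1402, 1388, 1420, 1407, 1399, 1415],
            "reject":    [1, 0, 0, 1, 0, 0, 1, 0],      # 1 = nonconforming casting
        })
        df["penalty"] = df["reject"].map({0: 0, 1: 10})  # hypothetical penalty scale
        df["range"] = pd.qcut(df["pour_temp"], 3, labels=["low", "mid", "high"])
        matrix = df.groupby("range", observed=True)["penalty"].agg(["count", "mean"])
        print(matrix)  # ranges with a high mean penalty flag candidate root causes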

    Seven Steps to Energy Efficiency for Foundries

    Steve Robinson of the American Foundrymen’s Society once argued that foundries with a 4% profit margin need to find new sales revenue of US$1 million to generate US$40,000 in operating profit. The case study presented in this paper highlights how an in-process quality improvement exercise resulted in an annual saving of US$144,000 by studying in-process data on 25 process inputs in a melt room. The foundry industry is energy intensive. Energy costs for foundries are around 15% of the cost of castings. In recent years foundries have become energy aware and many have installed energy meters with on-line energy monitoring systems to report energy consumption (kWh) per tonne, charge or furnace with varying sampling frequency. This paper highlights how the 7 Steps of 7Epsilon were implemented and how in-process data for a foundry were visualised using penalty matrices to discover energy saving opportunities. With ISO 9001:2015 on the horizon there is an urgent need to change the foundry culture, across the world, towards capturing, storing and reusing in-process data as well as organisational knowledge in order to demonstrate in-process quality improvement. The 7Epsilon approach offers a structured methodology for organisational knowledge management as well as in-process quality improvement.
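
    The revenue arithmetic quoted above is easy to verify: at a 4% margin, the case study’s US$144,000 saving delivers the same operating profit as US$3.6 million of new sales.

        margin = 0.04
        saving = 144_000                      # annual saving from the case study (US$)
        equivalent_sales = saving / margin    # new sales needed for the same profit
        print(f"US${equivalent_sales:,.0f}")  # US$3,600,000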

    Comparison of point foot, collisional and smooth rolling contact models on the bifurcations and stability of bipedal walking

    Traditional biped walkers based on passive dynamic walking usually have flat or circular feet. This foot contact may be modelled with an effective rocker, represented as a roll-over shape, to describe the function of the knee-ankle-foot complex in human ambulation. Mahmoodi et al. represented this roll-over shape as a polygon with a discretized set of collisions. In this paper, point foot, collisional and smooth rolling contact models are compared. An approach based on Lagrangian mechanics is used to formulate the equations for the swing phase that conserve mechanical energy. Qualitative insight can be gained by studying the bifurcation diagrams of gait descriptors such as average velocity, step period, mechanical energy and inter-leg angle for different gain and length values for the feet, as well as different mass and length ratios. The results from the three approaches are compared and discussed. In the case of a rolling disk, the collisional contact model gives a negligible energy loss; when incorporated into the double inverted pendulum system, however, it reveals much greater errors. This research is useful not only for understanding the stability of bipedal walking, but also for the design of rehabilitative devices such as prosthetic feet and orthoses.
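
    The negligible collisional loss for a rolling disk can be illustrated with the classic rimless-wheel collision law from the passive walking literature (a standard idealisation, not this paper’s polygonal roll-over model): at each spoke impact the angular velocity scales by cos(2α), so the kinetic energy retained per step is cos²(2α) and approaches 1 as the spokes become dense:

        import math

        def energy_retained(n_spokes):
            alpha = math.pi / n_spokes       # half the inter-spoke angle
            return math.cos(2 * alpha) ** 2  # kinetic energy kept per impact

        for n in (6, 12, 48, 360):
            print(n, round(energy_retained(n), 6))   # tends to 1 as n grows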

    Enhancing undergraduate student skills to meet research challenges: Case Studies and Examples

    The objective of embedding research activities in the undergraduate teaching programs at State Engineering colleges in India is worth pursuing and is certainly beneficial to all stakeholders: local industries, students and lecturers. One of the main objectives of incorporating research at an undergraduate level is to enhance students’ lateral thinking skills. We want them to develop the ability to analyse an engineering problem from first principles and to recognise, as well as take into account, the work done by others in the field. In a knowledge driven society that is becoming truly global and interlinked, we need to equip students with the ability to challenge the status quo with independent and logical thinking. The expectation is that students will provide solutions that offer better returns on investment for their employers. This is likely to enhance the culture of innovation in industry and increase the profitability of the business. In this article, I share some examples and case studies of how research can be embedded in teaching, based on my personal experience at Swansea University. My examples range across activities that we undertake for the benefit of Swansea University undergraduate students, including Government assisted programs that link students, academia and industry.

    A novel imputation based predictive algorithm for reducing common cause variation from small and mixed datasets with missing values

    Most process control algorithms need a predetermined target value as an input for a process variable so that the deviation can be observed and minimized. In this paper, a novel machine learning algorithm is proposed that is able not only to suggest new target values for both categorical and continuous variables to minimize process output variation but also to predict the extent to which the variation can be minimized. In foundry processes, an average rejection rate of 3%–5% within batches of castings produced is considered acceptable and is attributed to common cause variation. As a result, the operating range for process input values is often not changed during the root cause analysis. The relevant available historical process data is normally limited, contains missing values and combines both categorical and continuous variables (a mixed dataset). However, technological advancements in manufacturing processes provide opportunities to further refine process inputs in order to minimize undesired variation in process outputs. A new linear regression based algorithm is proposed to achieve lower prediction error in comparison with the commonly used linear factor analysis for mixed data (FAMD) method. This algorithm is further coupled with a novel missing data algorithm to predict the process response values corresponding to a given set of values for process inputs. This enables the novel imputation based predictive algorithm to quantify the effect of a confirmation trial based on the proposed changes in the operating ranges of one or more process inputs. A set of values for optimal process inputs is generated, using a Bootstrap sampling method, from operating ranges discovered by a recently proposed quality correlation algorithm (QCA). The odds ratio, which represents the ratio between the probabilities of occurrence of desired and undesired process output values, is used to quantify the effect of a confirmation trial. The limitations of the underlying PCA based linear model are discussed and future research areas are identified.
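
    The general recipe (impute missing values, encode the mixed dataset, fit a linear model, predict the response of a proposed trial) can be sketched as follows; this is an illustration with invented column names and simple mean/mode imputation, not the paper’s algorithm:

        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LinearRegression

        df = pd.DataFrame({
            "binder": ["A", "B", np.nan, "A", "B", "A"],       # categorical input
            "temp":   [1400, 1420, 1410, np.nan, 1415, 1405],  # continuous input
            "reject_pct": [4.1, 2.8, 3.5, 4.6, 2.5, 3.9],      # process response
        })
        df["binder"] = df["binder"].fillna(df["binder"].mode()[0])  # mode imputation
        df["temp"] = df["temp"].fillna(df["temp"].mean())           # mean imputation
        X = pd.get_dummies(df[["binder", "temp"]], columns=["binder"])
        model = LinearRegression().fit(X, df["reject_pct"])
        # Predict the response for a proposed confirmation trial.
        trial = pd.DataFrame({"temp": [1418], "binder_A": [0], "binder_B": [1]})
        print(model.predict(trial[X.columns]))   # predicted rejection rate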

    A quality correlation algorithm for tolerance synthesis in manufacturing operations

    Clause 6.1 of the ISO9001:2015 quality standard requires organisations to take specific actions to determine and address risks and opportunities in order to minimize undesired effects in the process and achieve process improvement. This paper proposes a new quality correlation algorithm to optimise tolerance limits of process variables across multiple processes. The algorithm uses reduced p-dimensional principal component scores to determine optimal tolerance limits and also embeds ISO9001:2015’s risk based thinking approach. The corresponding factor and response variable pairs are chosen by analysing the mixed data set formulation proposed by Giannetti et al. (2014) and the co-linearity index algorithm proposed by Ransing et al. (2013). The goal of this tolerance limit optimisation problem is to make several small changes to the process in order to reduce undesired process variation. The optimal and avoid ranges of multiple process parameters are determined by analysing in-process data on categorical as well as continuous variables, with process responses transformed using the risk based thinking approach. The proposed approach is illustrated by analysing in-process chemistry data for a nickel based alloy used to manufacture cast components in an aerospace foundry. It is also demonstrated how the approach embeds risk based thinking into the in-process quality improvement process, as required by the ISO9001:2015 standard.
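
    The principal component scores step can be illustrated with a minimal sketch (an illustrative pipeline on synthetic data, not the quality correlation algorithm itself): scale the in-process data, project it onto the leading components, and compare where good and bad outputs fall in the reduced score space:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 6))                   # 6 process variables (synthetic)
        y = (X[:, 0] + 0.5 * X[:, 2] > 1).astype(int)   # 1 = undesired process output
        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        good, bad = scores[y == 0], scores[y == 1]
        print("good PC1 range:", good[:, 0].min(), good[:, 0].max())
        print("bad  PC1 range:", bad[:, 0].min(), bad[:, 0].max())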

    A novel mathematical formulation for predicting symmetric passive bipedal walking motion with unbalanced masses

    Commercial prosthetic feet weigh about 25% as much as their physiological counterparts. The human body is able to overcome the walking asymmetry resulting from this mass imbalance by exerting more energy. It is hypothesised that passive walking dynamics coupled with roll-over shapes have the potential to suggest energy efficient walking solutions. A two-link passive walking kinematic model is proposed to study the gait pattern with unbalanced leg masses. An optimal roll-over shape for the prosthetic foot that minimises the asymmetry in the inter-leg angle and the step period is determined. The proposed mathematical formulation provides insights into the variation of step length and inter-leg angle with respect to the location of the centres of mass of both the prosthetic and physiological legs.
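
    An asymmetry objective of the kind minimised above might take the following form; the normalisation and equal weighting are assumptions for illustration, not the paper’s formulation:

        def asymmetry(angle_pros, angle_phys, period_pros, period_phys, w=0.5):
            # Normalised absolute differences in inter-leg angle and step period,
            # blended by an assumed weight w; 0 means a perfectly symmetric gait.
            da = abs(angle_pros - angle_phys) / angle_phys
            dt = abs(period_pros - period_phys) / period_phys
            return w * da + (1 - w) * dt

        print(asymmetry(0.52, 0.58, 0.71, 0.66))  # radians and seconds, illustrative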

    A bootstrap method for uncertainty estimation in quality correlation algorithm for risk based tolerance synthesis

    Risk based tolerance synthesis builds on the ISO9001:2015 quality standard’s risk based thinking. It analyses in-process data to discover correlations among regions of input data scatter and desired or undesired process outputs. Recently, Ransing, Batbooti, Giannetti, and Ransing (2016) proposed a quality correlation algorithm (QCA) for risk based tolerance synthesis. The quality correlation algorithm is based on principal component analysis (PCA) and a co-linearity index concept (Ransing, Giannetti, Ransing, & James, 2013). The uncertainty in QCA results on mixed data sets is quantified and analysed in this paper. The uncertainty is quantified using a bootstrap sampling method with bias-corrected and accelerated confidence intervals. The co-linearity indices use the lengths and cosine angles of loading vectors in a p-dimensional space. The uncertainty for all p loading vectors is shown in a single co-linearity index plot and is used to quantify the uncertainty in predicting optimal tolerance limits. The effects of re-sampling distributions are analysed. The QCA tolerance limits are revised after estimating the uncertainty in the limits via bootstrap sampling. The proposed approach is demonstrated by analysing in-process data from a previously published case study.
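
    Bias-corrected and accelerated (BCa) bootstrap intervals of this kind can be computed with SciPy; the sketch below uses a simple correlation statistic on synthetic data rather than the paper’s co-linearity indices:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        x = rng.normal(size=100)
        y = 0.6 * x + rng.normal(scale=0.8, size=100)
        # Paired resampling of (x, y); BCa is SciPy's default interval method.
        res = stats.bootstrap((x, y), lambda x, y: stats.pearsonr(x, y)[0],
                              paired=True, vectorized=False, method="BCa",
                              random_state=rng)
        print(res.confidence_interval)   # BCa interval for the correlation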

    BPGD-AG: A New Improvement Of Back-Propagation Neural Network Learning Algorithms With Adaptive Gain

    The back propagation algorithm is one of the most popular learning algorithms for training feed forward neural networks. However, its convergence is slow, mainly because the algorithm requires designers to arbitrarily select parameters such as the network topology, initial weights and biases, learning rate, activation function, value of the gain in the activation function and momentum. An improper choice of these parameters can cause the training process to come to a standstill or get stuck in local minima. Previous research demonstrated that in a back propagation algorithm, the slope of the activation function is directly influenced by a parameter referred to as ‘gain’. In this paper, the influence of the variation of ‘gain’ on the learning ability of a back propagation neural network is analysed. Multi-layer feed forward neural networks have been assessed. A physical interpretation of the relationship between the gain value and the learning rate and weight values is given. Instead of a constant ‘gain’ value, we propose an algorithm to change the gain value adaptively for each node. The efficiency of the proposed algorithm is verified by means of simulation on a function approximation problem using sequential as well as batch modes of training. The results show that the proposed algorithm significantly improves the learning speed of the general back-propagation algorithm.
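
    The role of the gain can be seen directly from the logistic activation: with gain c, f(x) = 1 / (1 + exp(-c·x)) has slope c·f(x)·(1 - f(x)), so raising the gain steepens the activation and scales the gradient back-propagated through that node. A minimal sketch:

        import numpy as np

        def sigmoid(x, gain=1.0):
            return 1.0 / (1.0 + np.exp(-gain * x))

        def sigmoid_grad(x, gain=1.0):
            s = sigmoid(x, gain)
            return gain * s * (1.0 - s)     # the gain multiplies the local slope

        for c in (0.5, 1.0, 2.0):
            print(c, sigmoid_grad(0.0, gain=c))   # slope at the origin scales with c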