Ergonomic Chair Design by Fusing Qualitative and Quantitative Criteria using Interactive Genetic Algorithms
This paper emphasizes the necessity of formally bringing the qualitative and quantitative criteria of ergonomic design together, and provides a novel complementary design framework to this end. Within this framework, the different design criteria are viewed as optimization objectives, and design solutions are iteratively improved through the cooperative efforts of computer and user. The framework is rooted in multi-objective optimization, genetic algorithms, and interactive user evaluation. Three different algorithms based on the framework are developed and tested on an ergonomic chair design problem. The parallel and multi-objective approaches show promising results in fitness convergence, design diversity, and user satisfaction metrics.
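To make the framework concrete, here is a minimal illustrative sketch of an interactive genetic algorithm loop in Python. The chair genome, anthropometric ideals, scalarisation, and the console prompt standing in for user evaluation are all invented for illustration; this is not the paper's algorithms.

```python
import random

# Hypothetical chair genome: seat height, seat depth, backrest recline,
# all normalised to [0, 1]. Layout, ideals, and weights are invented.
GENOME_LEN, POP_SIZE, GENERATIONS = 3, 8, 5

def quantitative_fitness(genome):
    # Stand-in anthropometric objective: closeness to assumed ergonomic ideals.
    ideals = [0.45, 0.50, 0.60]
    return -sum((g - i) ** 2 for g, i in zip(genome, ideals))

def qualitative_fitness(genome):
    # In the framework this score comes from interactive user evaluation;
    # a console prompt stands in for the user here.
    print("Candidate chair:", [round(g, 2) for g in genome])
    return float(input("Rate comfort/aesthetics 0-10: ")) / 10.0

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.2):
    return [min(1.0, max(0.0, g + random.gauss(0, 0.1)))
            if random.random() < rate else g for g in genome]

population = [[random.random() for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    # Weighted-sum scalarisation of the two criteria; the multi-objective
    # variants in the paper would instead keep the objectives separate.
    scored = sorted(population,
                    key=lambda g: quantitative_fitness(g) + qualitative_fitness(g),
                    reverse=True)
    parents = scored[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children
```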
Multi-objective optimisation of reliable product-plant network configuration
Ensuring manufacturing reliability is key to satisfying product orders when production plants are subject to disruptions. The reliability of a supply network is closely related to the redundancy of products, as production in disrupted plants can be replaced by alternative plants. However, the benefits of incorporating redundancy must be balanced against the costs of doing so. Models in the literature are highly case-specific and do not consider the complex network structures and redundant distributions of products over suppliers that are evident in the empirical literature. In this paper we first develop a simple generic measure for evaluating the reliability of a network of plants in a given product-plant configuration. Second, we frame the problem as a multi-objective evolutionary optimisation model to show that such a measure can be used to optimise the cost-reliability trade-off. The model has been applied to a producer's automotive light and lamp production network using three popular genetic algorithms designed for multi-objective problems, namely NSGA2, SPEA2, and PAES. Using the model in conjunction with the genetic algorithms, we were able to find trade-off solutions successfully. NSGA2 achieved the best results in terms of Pareto front spread. The algorithms differed considerably in their performance, meaning that the choice of algorithm has a significant impact on the resulting search-space exploration.
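As an illustration of the kind of measure the abstract describes, the following Python sketch scores a hypothetical product-plant configuration by the probability that every product retains at least one operational plant under independent plant disruptions. The configuration, costs, and failure probability are invented; the two printed values are the objectives one would hand to NSGA2, SPEA2, or PAES.

```python
import itertools

# Hypothetical configuration: config[p] is the set of plants able to make product p.
config = {"headlamp": {0, 1}, "tail_light": {1}, "fog_lamp": {0, 2}}
plant_cost = {0: 5.0, 1: 4.0, 2: 6.0}  # invented per-plant tooling cost
FAIL_P = 0.1  # assumed independent plant disruption probability

def reliability(config, fail_p=FAIL_P):
    # Probability that every product still has at least one operational plant:
    # a simple generic stand-in for the paper's redundancy-based measure.
    plants = sorted({pl for pls in config.values() for pl in pls})
    total = 0.0
    for state in itertools.product([True, False], repeat=len(plants)):
        up = {pl for pl, ok in zip(plants, state) if ok}
        prob = 1.0
        for ok in state:
            prob *= (1 - fail_p) if ok else fail_p
        if all(pls & up for pls in config.values()):
            total += prob
    return total

def cost(config):
    return sum(plant_cost[pl] for pls in config.values() for pl in pls)

# Maximise reliability, minimise cost: the cost-reliability trade-off.
print(reliability(config), cost(config))
```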
The moderating impact of supply network topology on the effectiveness of risk management
While supply chain risk management offers a rich toolset for dealing with risk at the dyadic level, less attention has been given to the effectiveness of risk management in complex supply networks. We bridge this gap by building an agent-based model to explore the relationship between the topological characteristics of complex supply networks and their ability to recover through inventory mitigation and contingent rerouting. We simulate upstream supply networks, where each agent represents a supplier. Suppliers' connectivity patterns are generated through random and preferential attachment models. Each supplier manages its inventory using an anchor-and-adjust ordering policy. We then randomly disrupt suppliers and observe how different topologies recover when risk management strategies are applied. Our results show that topology has a moderating effect on the effectiveness of risk management strategies. Scale-free supply networks generate lower costs, have higher fill rates, and need less inventory to recover when exposed to random disruptions than random networks. Random networks need significantly more inventory distributed across the network to achieve the same fill rates as scale-free networks. Inventory mitigation improves fill rates more than contingent rerouting regardless of network topology. Contingent rerouting is not effective for scale-free networks due to the low number of alternative suppliers, particularly for short-lasting disruptions. We also find that applying inventory mitigation to the most disrupted suppliers is only effective when the network is exposed to frequent disruptions, and is not cost-effective otherwise. Our work contributes to the emerging field of research on the relationship between complex supply network topology and resilience.
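The anchor-and-adjust policy mentioned above can be sketched as follows. This is a minimal stand-in loosely following Sterman's stock-management heuristic, with invented parameter values rather than those used in the paper.

```python
class Supplier:
    """One supplier agent's anchor-and-adjust ordering rule (invented parameters)."""

    def __init__(self, target_inventory=20.0, alpha=0.3, beta=0.15):
        self.inventory = target_inventory
        self.supply_line = 0.0        # orders placed but not yet received
        self.demand_forecast = 10.0   # exponentially smoothed demand (the "anchor")
        self.target_inventory = target_inventory
        self.alpha = alpha            # inventory-gap adjustment rate
        self.beta = beta              # supply-line adjustment rate

    def place_order(self, observed_demand, smoothing=0.2):
        # Anchor: update the demand forecast.
        self.demand_forecast += smoothing * (observed_demand - self.demand_forecast)
        # Adjust: correct the inventory gap, discounting stock already on order.
        inventory_gap = self.target_inventory - self.inventory
        order = (self.demand_forecast
                 + self.alpha * inventory_gap
                 - self.beta * self.supply_line)
        return max(0.0, order)

# Receipt/shipment dynamics are omitted; this only shows the ordering rule.
s = Supplier()
print([round(s.place_order(d), 1) for d in (10, 12, 15, 9, 11)])
```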
Fast Machine Unlearning Without Retraining Through Selective Synaptic Dampening
Machine unlearning, the ability for a machine learning model to forget, is
becoming increasingly important to comply with data privacy regulations, as
well as to remove harmful, manipulated, or outdated information. The key
challenge lies in forgetting specific information while protecting model
performance on the remaining data. While current state-of-the-art methods
perform well, they typically require some level of retraining over the retained
data, in order to protect or restore model performance. This adds computational
overhead and mandates that the training data remain available and accessible,
which may not be feasible. In contrast, other methods employ a retrain-free paradigm; however, these approaches are prohibitively computationally expensive
and do not perform on par with their retrain-based counterparts. We present
Selective Synaptic Dampening (SSD), a novel two-step, post hoc, retrain-free
approach to machine unlearning which is fast, performant, and does not require
long-term storage of the training data. First, SSD uses the Fisher information
matrix of the training and forgetting data to select parameters that are
disproportionately important to the forget set. Second, SSD induces forgetting
by dampening these parameters in proportion to their relative importance to the
forget set with respect to the wider training data. We evaluate our method
against several existing unlearning methods in a range of experiments using
ResNet18 and Vision Transformer. Results show that the performance of SSD is
competitive with retrain-based post hoc methods, demonstrating the viability of
retrain-free post hoc unlearning approaches.
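A minimal PyTorch sketch of the two-step procedure described in the abstract is given below. The batch-level Fisher approximation and the hyperparameters alpha and lam are illustrative assumptions, not the paper's exact formulation or values.

```python
import torch

def diag_fisher(model, loader, loss_fn, device="cpu"):
    # Batch-level approximation of the diagonal Fisher information:
    # squared gradients averaged over the batches of a dataset.
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for x, y in loader:
        model.zero_grad()
        loss_fn(model(x.to(device)), y.to(device)).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / max(len(loader), 1) for n, f in fisher.items()}

@torch.no_grad()
def selective_synaptic_dampening(model, fisher_train, fisher_forget,
                                 alpha=10.0, lam=1.0):
    for n, p in model.named_parameters():
        # Step 1: select parameters disproportionately important to the forget set.
        mask = fisher_forget[n] > alpha * fisher_train[n]
        # Step 2: dampen them in proportion to their relative importance,
        # never scaling a parameter up (factor capped at 1).
        scale = torch.clamp(lam * fisher_train[n] / (fisher_forget[n] + 1e-12),
                            max=1.0)
        p[mask] *= scale[mask]
```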
Identifying contributors to supply chain outcomes in a multi-echelon setting: a decentralised approach
Organisations often struggle to identify the causes of change in metrics such
as product quality and delivery duration. This task becomes increasingly
challenging when the cause lies outside of company borders in multi-echelon
supply chains that are only partially observable. Although traditional supply
chain management has advocated for data sharing to gain better insights, this
does not take place in practice due to data privacy concerns. We propose the
use of explainable artificial intelligence for the decentralised computation of estimated contributions to a metric of interest in a multi-stage production process. This approach removes the need to convince supply chain actors to share data, as all computations occur in a decentralised manner. Our method is
empirically validated using data collected from a real multi-stage
manufacturing process. The results demonstrate the effectiveness of our
approach in detecting the source of quality variations compared to a
centralised approach using Shapley additive explanations.
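The decentralised idea can be illustrated with a small sketch: each stage fits a model on its own private data and shares only aggregate SHAP attribution summaries, never the raw data. The two-stage process, the features, and all numbers below are invented for illustration.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def local_attribution(X, y, feature_names):
    # Each actor runs this locally on its private data and shares only the summary.
    model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
    shap_values = shap.TreeExplainer(model).shap_values(X)
    # Mean absolute SHAP value per feature = this stage's estimated contribution.
    return dict(zip(feature_names, np.abs(shap_values).mean(axis=0)))

# Stage 1: upstream supplier's private process data (invented).
X1 = rng.normal(size=(200, 2))
quality_after_stage1 = 0.8 * X1[:, 0] + rng.normal(scale=0.1, size=200)

# Stage 2: downstream plant's private data; it sees only stage 1's output quality.
X2 = np.column_stack([quality_after_stage1, rng.normal(size=200)])
final_quality = 0.6 * X2[:, 0] + 0.3 * X2[:, 1] + rng.normal(scale=0.1, size=200)

# Only these aggregate summaries cross company borders, not the raw data.
print(local_attribution(X1, quality_after_stage1, ["temp", "pressure"]))
print(local_attribution(X2, final_quality, ["incoming_quality", "line_speed"]))
```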
Supply Networks as Complex Systems: A Network-Science-Based Characterization
Outsourcing, internationalization, and complexity characterize today's aerospace supply chains, making aircraft manufacturers structurally dependent on each other. Despite several complexity-related supply chain issues reported in the literature, aerospace supply chain structure has not been studied, owing to a lack of empirical data and suitable analytical toolsets for studying system structure. In this paper, we assemble a large-scale empirical dataset on the supply network of Airbus and apply the new science of networks to analyze how the industry is structured. Our results show that the system under study is a network formed by communities connected by hub firms. Hub firms also tend to connect to each other, providing cohesiveness, yet making the network vulnerable to disruptions at these hubs. We also show how network science can be used to identify firms that are operationally critical and key to disseminating information.
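The kind of analysis described can be sketched with networkx on a toy graph; the firms and links below are invented stand-ins for the empirical Airbus dataset, and the centrality measures are common choices rather than the paper's exact ones.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy supplier-buyer edges standing in for the empirical dataset.
edges = [("TierA", "Hub1"), ("TierB", "Hub1"), ("Hub1", "OEM"),
         ("TierC", "Hub2"), ("TierD", "Hub2"), ("Hub2", "OEM"),
         ("Hub1", "Hub2")]  # hub-to-hub link providing cohesiveness
G = nx.DiGraph(edges)

# Communities connected by hub firms.
communities = greedy_modularity_communities(G.to_undirected())
print("communities:", [sorted(c) for c in communities])

# Hub identification: degree as a proxy for operational criticality,
# betweenness as a proxy for information dissemination potential.
degree = dict(G.degree())
betweenness = nx.betweenness_centrality(G)
hubs = sorted(G.nodes, key=lambda n: (degree[n], betweenness[n]), reverse=True)[:2]
print("candidate hubs:", hubs)
```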
Topological robustness of the global automotive industry
The manufacturing industry is characterized by large-scale interdependent networks, as companies buy goods from one another but do not control or design the overall flow of materials. The result is a complex emergent structure through which companies connect to each other. The topology of this structure impacts the industry's robustness to disruptions in companies, countries, and regions. In this work, we propose an analysis framework for examining robustness in the manufacturing industry and validate it using an empirical dataset. Focusing on two key angles, suppliers and products, we highlight macroscopic and microscopic characteristics of the network and shed light on vulnerabilities of the system. It is shown that large-scale data on structural interdependencies can be examined with measures based on network science.
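A simple topological robustness analysis in the spirit of the abstract can be sketched as follows; the synthetic Barabási-Albert graph stands in for the empirical automotive dataset, and degree-targeted removal is one common attack strategy rather than the paper's exact framework.

```python
import random
import networkx as nx

# Synthetic scale-free network standing in for the empirical dataset.
G = nx.barabasi_albert_graph(n=200, m=2, seed=1)

def robustness_curve(G, order):
    # Fraction of remaining nodes in the largest connected component
    # as nodes are removed one by one.
    H, sizes = G.copy(), []
    for node in order:
        H.remove_node(node)
        if H.number_of_nodes():
            sizes.append(len(max(nx.connected_components(H), key=len))
                         / H.number_of_nodes())
    return sizes

random_order = random.Random(0).sample(list(G.nodes), 50)
targeted_order = sorted(G.nodes, key=lambda n: G.degree(n), reverse=True)[:50]
print("after random removals:  ", robustness_curve(G, random_order)[-1])
print("after targeted removals:", robustness_curve(G, targeted_order)[-1])
```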