210 research outputs found

    Intelligent systems in manufacturing: current developments and future prospects

    Get PDF
    Global competition and rapidly changing customer requirements are driving ever greater change in manufacturing environments. Enterprises are required to constantly redesign their products and continuously reconfigure their manufacturing systems. Traditional approaches to manufacturing systems do not fully satisfy this new situation. Many authors have proposed that artificial intelligence will bring the flexibility and efficiency needed by manufacturing systems. This paper reviews the artificial intelligence techniques used in manufacturing systems. It first defines the components of a simplified intelligent manufacturing system (IMS) and the Artificial Intelligence (AI) techniques to be considered, and then shows how these AI techniques are used for the components of an IMS.

    Appreciating the Performance of Neuroscience Mining in NeuroIS research: A Case Study on Consumer's Product Perceptions in the Two UI Modes—Dark UI vs. Light UI

    Get PDF
    The goal of the current study was to provide information on the potential of neuroscience mining (NSM) for comprehending NeuroIS paradigms. NSM is an interdisciplinary field that combines neuroscience and business mining, the application of big data analytics, computational social science, and related fields to business problems. NSM therefore makes it possible to apply predictive models such as machine learning and deep learning to NeuroIS datasets to find intricate patterns that are hidden from conventional regression-based analysis. We used a Random Forest (RF) model to make predictions from power-spectra-separated EEG brainwave data of 28 individuals. Next, we used NSM to precisely predict how consumers would perceive a product online, depending on whether a light or dark user interface (UI) mode was being used. Sensitivity analysis was then applied to the model to extract more precise results that could not be obtained with conventional linear analytical models. The benefits of using NSM in NeuroIS research are as follows: (1) it can relieve the burden of the three-horned dilemma described by Runkel and McGrath; (2) it enables more temporal data to be analyzed directly against the target variables; and (3) sensitivity analysis can be performed on a per-condition or per-individual basis, strengthening the rigor of findings by reducing the sample bias that can arise when data are grand-averaged and analyzed with methods like the GLM.
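    A minimal sketch of the NSM workflow this abstract describes, assuming band-power EEG features per trial, a continuous product-perception rating, and a simple one-at-a-time sensitivity analysis; the feature layout, data, and perturbation scheme are illustrative assumptions, not details taken from the study.

```python
# Random Forest prediction on synthetic EEG band-power features, followed by a
# one-at-a-time sensitivity analysis (all data and dimensions are stand-ins).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_features = 560, 40               # hypothetical: 28 participants x 20 trials; 8 electrodes x 5 bands
X = rng.normal(size=(n_trials, n_features))  # EEG band-power features (stand-in data)
ui_mode = rng.integers(0, 2, n_trials)       # 0 = light UI, 1 = dark UI
y = 0.5 * X[:, 0] + 0.3 * ui_mode + rng.normal(scale=0.5, size=n_trials)  # perception rating

features = np.column_stack([X, ui_mode])
model = RandomForestRegressor(n_estimators=200, random_state=0)
print("cross-validated R^2:", cross_val_score(model, features, y, cv=5).mean())

# One-at-a-time sensitivity analysis: perturb each feature by one standard
# deviation and record the mean change in the model's predictions.
model.fit(features, y)
baseline = model.predict(features)
for j in range(features.shape[1]):
    perturbed = features.copy()
    perturbed[:, j] += features[:, j].std()
    delta = np.abs(model.predict(perturbed) - baseline).mean()
    print(f"feature {j}: mean |prediction change| = {delta:.3f}")
```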

    New product entry success: an examination of variable, dimensional and model evolution

    Get PDF
    This thesis examines the evolution of antecedents, dimensions and initial screening models which discriminate between new product success and failure. It advances on previous empirical new product success/failure comparative studies by developing a discrete simulation procedure in which participating new product managers supply judgements retrospectively on new product strategies and orientations for two distinct time periods in the new product program: (1) the initial screening stage and (2) a period approximately 1 year after market entry. Unique linear regression functions are derived for each event and offer different, but complementary, temporally appropriate sets of determining factors. Model predictive accuracy ascends over time and conditional process moderators alter success factors at both time periods. Whilst the work validates and synthesises much from the new product development literature, it exposes probable measurement timing error when single retrospective models assess success dimension rank at the initial screen. Six of seven hypotheses are accepted and demonstrate that: 1. Many antecedents of success and measures of objective attainment are perceived by NPD (new product development) managers to differ significantly over time. 2. Reactive strategy, NPD multigenerational history and a superior product are the most important dimensions of success through one year post launch. 3. Current linear screening models constructed using retrospective methods produce average prescriptive dimensions which exhibit measurement timing error when used at the initial screen. 4. Success dimensions evolve from somewhat deterministic to more stochastic over time, with model forecasting accuracy rising as launch approaches based on better data availability. 5. Product market PiLC (the life expectancy of an introduction before modification is necessary, calculated in years and months) and its order of entry and level of innovation alter aggregate success model accuracy and dimension rank. 6. Proper initial dimensional alignment and intra-process realignment based on changing environments are critical to a successful project through one year post launch. The work cautions practitioners not to wait for better models to be developed but immediately: (1) benchmark reasons for their current product market success, failure and kill historical "batting average"; (2) enhance and/or replace contributing/offending processes and systems based on these history lessons; (3) choose or reject aggregate or conditional success/failure models based on team forecasting ability; (4) concentrate on the selected model's time-specific dimensions of success; and (5) provide/reserve adequate resources to adapt strategically over time to both internal and external antecedent changes in the NPD environment. Finally, it recommends new research into temporal, conditional and strategic tradeoffs in internal and external antecedents/dimensions of success. Best results should come from using both linear and curvilinear methods to validate more complex yet statistically elegant NPD simulations.
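    As a rough illustration of the two-period modelling idea (not the thesis's actual variables or data), the sketch below fits separate success/failure models from judgements supplied at the initial screen and at roughly one year post launch, and compares their predictive accuracy; logistic regression is used here as a stand-in for the thesis's discriminant/regression functions.

```python
# Two separate success/failure models, one per time period, with predictive
# accuracy compared across them. Predictors and data are invented placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200
# Retrospective manager judgements for each period (stand-in data).
screen_X = rng.normal(size=(n, 3))                          # ratings at the initial screen
launch_X = screen_X + rng.normal(scale=0.5, size=(n, 3))    # same dimensions re-rated ~1 year after entry
success = (launch_X[:, 0] + 0.5 * launch_X[:, 1] + rng.normal(scale=0.8, size=n)) > 0

for label, X in [("initial screen", screen_X), ("one year post launch", launch_X)]:
    acc = cross_val_score(LogisticRegression(), X, success, cv=5).mean()
    print(f"{label} model accuracy: {acc:.2f}")   # accuracy is expected to rise as launch approaches
```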

    Real-time Tactical and Strategic Sales Management for Intelligent Agents Guided By Economic Regimes

    Get PDF
    Many enterprises that participate in dynamic markets need to make product pricing and inventory resource utilization decisions in real-time. We describe a family of statistical models that address these needs by combining characterization of the economic environment with the ability to predict future economic conditions to make tactical (short-term) decisions, such as product pricing, and strategic (long-term) decisions, such as the level of finished goods inventories. Our models characterize economic conditions, called economic regimes, in the form of recurrent statistical patterns that have clear qualitative interpretations. We show how these models can be used to predict prices, price trends, and the probability of receiving a customer order at a given price. These “regime” models are developed using statistical analysis of historical data, and are used in real-time to characterize observed market conditions and predict the evolution of market conditions over multiple time scales. We evaluate our models using a testbed derived from the Trading Agent Competition for Supply Chain Management (TAC SCM), a supply chain environment characterized by competitive procurement and sales markets, and dynamic pricing. We show how regime models can be used to inform both short-term pricing decisions and long-term resource allocation decisions. Results show that our method outperforms more traditional short- and long-term predictive modeling approaches.
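    A minimal sketch of the regime idea, assuming regimes are modelled as a Gaussian mixture over normalized prices and that order probability is estimated per regime; the regime count, price data, and the per-regime order-probability curve are illustrative assumptions rather than the paper's exact formulation.

```python
# Identify economic regimes from historical prices and use the current regime
# to estimate the probability of winning an order at a given offer price.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Historical normalized prices drawn from three synthetic "regimes".
prices = np.concatenate([rng.normal(0.6, 0.05, 400),    # over-supply: low prices
                         rng.normal(0.8, 0.05, 400),    # balanced market
                         rng.normal(1.0, 0.05, 400)])   # scarcity: high prices

gmm = GaussianMixture(n_components=3, random_state=0).fit(prices.reshape(-1, 1))

# Characterize the currently observed market: which regime are we likely in?
current_price = np.array([[0.78]])
regime_probs = gmm.predict_proba(current_price)[0]
print("P(regime | current price):", regime_probs.round(2))

# Tactical use: probability of receiving an order at an offer price, modeled
# here with a simple per-regime logistic curve (placeholder for calibrated data).
def order_probability(offer_price, regime):
    slope, midpoint = 25.0, gmm.means_[regime, 0]
    return 1.0 / (1.0 + np.exp(slope * (offer_price - midpoint)))

best_regime = regime_probs.argmax()
print("P(order at price 0.75):", round(order_probability(0.75, best_regime), 2))
```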

    State of the Art of Purchasing 2023

    Get PDF

    Architecting Fail-Safe Supply Chains / Networks

    Get PDF
    Disruptions are large-scale stochastic events that rarely happen but have a major effect on supply networks’ topology. Some examples include: air traffic being suspended due to weather or terrorism, labor union strikes, sanctions imposed or lifted, company mergers, etc. Variations are small-scale stochastic events that frequently happen but only have a trivial effect on the efficiency of flow planning in supply networks. Some examples include: fluctuations in market demands (e.g. demand is always stochastic in competitive markets) and in the performance of production facilities (e.g. no production system is perfect in reality). A fail-safe supply network is one that mitigates the impact of variations and disruptions and provides an acceptable level of service. This is achieved by maintaining connectivity in its topology against disruptions (structurally fail-safe) and coordinating the flow through the facilities against variations (operationally fail-safe). In this talk, I will show that to have a structurally fail-safe supply network, its topology should be robust against disruptions by positioning mitigation strategies and resilient in executing these strategies. Considering "Flexibility" as a risk mitigation strategy, I answer the question "What are the best flexibility levels and flexibility speeds for facilities in structurally fail-safe supply networks?" Also, I will show that to have an operationally fail-safe supply network, its flow dynamics should be reliable against demand- and supply-side variations. In the presence of these variations, I answer the question "What is the most profitable flow dynamics throughout a supply network that is reliable against variations?" The method is verified using data from an engine maker. Findings include: i) there is a tradeoff between robustness and resilience in profit-based supply networks; ii) this tradeoff is more stable in larger supply networks with higher product supply quantities; and iii) supply networks with higher reliability in their flow planning require more flexibility to be robust. Finally, I will touch upon possible extensions of the work into non-profit relief networks for disaster management.
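    As a very reduced illustration of flow planning that stays reliable against demand-side variations (a newsvendor-style stand-in, not the talk's supply-network model), the sketch below picks the flow quantity that maximizes expected profit across a few demand scenarios; all numbers are invented.

```python
# Choose a production/flow quantity that maximizes expected profit over demand
# scenarios; a minimal proxy for "most profitable flow dynamics reliable
# against variations". Prices, costs, and scenarios are illustrative only.
scenarios = {"low": 80, "base": 100, "high": 120}    # demand variations (units)
prob = {"low": 0.25, "base": 0.5, "high": 0.25}
price, unit_cost, capacity = 10.0, 6.0, 110

best_q, best_profit = None, float("-inf")
for q in range(capacity + 1):                        # candidate flow quantities
    expected_profit = sum(prob[s] * price * min(q, d) for s, d in scenarios.items()) - unit_cost * q
    if expected_profit > best_profit:
        best_q, best_profit = q, expected_profit

print(f"flow quantity {best_q} gives expected profit {best_profit:.1f}")
```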

    Fostering trust and overcoming psychological resistance towards cryptocurrencies and cryptoassets

    Get PDF
    This research investigates the extent to which sponsorships can be utilised to foster trust and reduce barriers to adopting new technologies. Using Crypto.com's sponsorship of the 2022 FIFA World Cup as the context, this mixed-methods study utilises innovation resistance theory (IRT) and trust transfer theory (TTT) to investigate the extent to which such a sponsorship can increase trust in, and reduce barriers to, innovative technologies such as cryptoassets, while also filling a research gap concerning consumer resistance to innovations in digital financial products and services. The findings of study 1, using a survey (n = 1081), and study 2, using interviews (n = 24), reveal that a positive image of the sponsorship significantly influences favourability towards, interest in, and trust of the sponsor's product, which subsequently reduces psychological barriers to adoption. Integrating the theoretical viewpoints of IRT and TTT, this study enhances our conceptual understanding of the psychological dimension of sponsorship and the extent to which a sponsorship generates interest, gives assurance and trust in the sponsor's product, and removes uncertainty, thus reducing barriers to adoption.

    Application of nature-inspired optimization algorithms to improve the production efficiency of small and medium-sized bakeries

    Get PDF
    Increasing production efficiency through schedule optimization is one of the most influential topics in operations research that contributes to the decision-making process. It is the concept of allocating tasks among available resources within the constraints of a manufacturing facility in order to minimize costs. It is carried out with a model that resembles real-world task distribution, with variables and relevant constraints, in order to complete a planned production. In addition to a model, an optimizer is required to assist in evaluating and improving the task allocation procedure in order to maximize overall production efficiency. The entire procedure is usually carried out on a computer, where these two distinct segments combine to form a solution framework for production planning and support decision-making in various manufacturing industries. Small and medium-sized bakeries lack access to cutting-edge tools, and most of their production schedules are based on personal experience. This makes a significant difference in production costs compared to large bakeries, as evidenced by the latter's market dominance. In this study, a hybrid no-wait flow shop model is proposed to produce a production schedule based on actual data, featuring the constraints of the production environment in small and medium-sized bakeries. Several single-objective and multi-objective nature-inspired optimization algorithms were implemented to find efficient production schedules. While makespan is the most widely used quality criterion of production efficiency because it dominates production costs, high oven idle time in bakeries also wastes energy. Combining these quality criteria allows for additional cost reduction through energy savings as well as shorter production times. Therefore, to obtain an efficient production plan, makespan and oven idle time were both included as objectives of the optimization. To find optimal production plans for an existing production line, particle swarm optimization, simulated annealing, and the Nawaz-Enscore-Ham algorithm were used. The weighting factor method was used to combine the two objectives into a single objective. The classical optimization algorithms were found to be good enough at finding optimal schedules in a reasonable amount of time, reducing makespan by 29 % and oven idle time by 8 % for one of the analyzed production datasets. Nonetheless, the algorithms' convergence was found to be poor, with a low probability of obtaining the best or nearly the best result. In contrast, a modified particle swarm optimization (MPSO) proposed in this study demonstrated significantly improved convergence, with a higher probability of obtaining better results. To obtain trade-offs between the two objectives, state-of-the-art multi-objective optimization algorithms were implemented: the non-dominated sorting genetic algorithm (NSGA-II), the strength Pareto evolutionary algorithm, generalized differential evolution, improved multi-objective particle swarm optimization (OMOPSO), and speed-constrained multi-objective particle swarm optimization (SMPSO). The optimization algorithms provided efficient production plans with up to a 12 % reduction in makespan and a 26 % reduction in oven idle time based on data from different production days. The performance comparison revealed significant differences between these multi-objective optimization algorithms, with NSGA-II performing best and OMOPSO and SMPSO performing worst.
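    A minimal sketch of the scheduling core described above, under simplifying assumptions: a plain (non-hybrid) no-wait flow shop, the last stage taken as the oven, a weighted-sum objective combining makespan and oven idle time, and simulated annealing over job permutations as a stand-in for the PSO/SA/NEH variants; all processing times are made up.

```python
# No-wait flow shop schedule evaluation plus a simple simulated-annealing search
# over job sequences, minimizing a weighted sum of makespan and oven idle time.
import math
import random

def no_wait_schedule(perm, p):
    """Start times per job and machine for a no-wait flow shop; p[j][m] = processing time."""
    starts = []
    for idx, j in enumerate(perm):
        offsets = [sum(p[j][:k]) for k in range(len(p[j]))]   # no-wait: fixed machine offsets
        if idx == 0:
            s0 = 0
        else:
            prev = perm[idx - 1]
            prev_s0 = starts[idx - 1][0]
            # earliest start so that job j never waits and no machine is double-booked
            s0 = max(prev_s0 + sum(p[prev][:k + 1]) - offsets[k] for k in range(len(p[j])))
        starts.append([s0 + o for o in offsets])
    return starts

def weighted_objective(perm, p, w=0.7):
    starts = no_wait_schedule(perm, p)
    last = len(p[0]) - 1                                      # treat the last stage as the oven
    makespan = starts[-1][last] + p[perm[-1]][last]
    oven_busy = sum(p[j][last] for j in perm)
    oven_idle = makespan - starts[0][last] - oven_busy        # idle time inside the oven's usage window
    return w * makespan + (1 - w) * oven_idle

# Simulated annealing over job permutations (stand-in for the PSO/SA/NEH variants).
random.seed(0)
p = [[random.randint(5, 20) for _ in range(3)] for _ in range(8)]   # 8 products x 3 stages (minutes)
current_perm = list(range(len(p)))
current_val = weighted_objective(current_perm, p)
best_perm, best_val = current_perm[:], current_val
temperature = 50.0
for _ in range(5000):
    i, k = random.sample(range(len(p)), 2)
    candidate = current_perm[:]
    candidate[i], candidate[k] = candidate[k], candidate[i]
    val = weighted_objective(candidate, p)
    if val < current_val or random.random() < math.exp(-(val - current_val) / temperature):
        current_perm, current_val = candidate, val
        if current_val < best_val:
            best_perm, best_val = current_perm[:], current_val
    temperature *= 0.999

print("best weighted objective:", best_val, "sequence:", best_perm)
```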
Proofing is a key processing stage that contributes to the quality of the final product by developing flavor and a fluffy texture in the bread. However, the duration of proofing is uncertain due to the complex interaction of multiple parameters: yeast condition, temperature in the proofing chamber, and chemical composition of the flour. Due to this uncertainty, a production plan optimized for the shortest makespan can be significantly inefficient. The computational results show that schedules with the shortest and nearly shortest makespan can show a significant (up to 18 %) increase in makespan when the proofing time deviates from its expected duration. In this thesis, a method for developing resilient production planning that takes uncertain proofing time into account is proposed, so that even if the deviation in proofing time is extreme, the fluctuation in makespan is minimal. The experimental results with a production dataset revealed a proactive production plan whose makespan is only 5 minutes longer than the shortest makespan but fluctuates by only 21 minutes when the proofing time varies from -10 % to +10 % of its actual value. This study proposed a common framework for small and medium-sized bakeries to improve their production efficiency in three steps: collecting production data, simulating production planning with the hybrid no-wait flow shop model, and running the optimization algorithm. The study suggests using MPSO for single-objective optimization problems and NSGA-II for multi-objective optimization problems. Based on real bakery production data, the results revealed that existing plans were significantly inefficient and could be optimized in a reasonable computational time using a robust optimization algorithm. Implementing such a framework in small and medium-sized bakery manufacturing operations could help to achieve an efficient and resilient production system.
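    The robustness idea can be illustrated in the same spirit: among candidate schedules, prefer one whose makespan fluctuates least when proofing time deviates by ±10 %. The 3-stage no-wait model (proofing, preparation, oven) and all processing times below are illustrative assumptions, not the thesis's data.

```python
# Enumerate small no-wait flow shop schedules and pick a "proactive" plan that
# is close to the shortest nominal makespan but least sensitive to proofing-time
# deviations of -10 % / +10 %. All job data are invented for illustration.
from itertools import permutations

def makespan(perm, p):
    """Completion time of the last job in a no-wait flow shop; p[j][m] = processing time."""
    prev, prev_s0 = None, 0
    for j in perm:
        if prev is None:
            s0 = 0
        else:
            s0 = max(prev_s0 + sum(p[prev][:k + 1]) - sum(p[j][:k]) for k in range(len(p[j])))
        prev, prev_s0 = j, s0
    return prev_s0 + sum(p[prev])

jobs = [[30, 10, 25], [45, 12, 20], [35, 8, 30], [40, 15, 18], [25, 10, 22]]  # [proofing, prep, oven] minutes

def scaled_proofing(factor):
    return [[round(proof * factor), prep, oven] for proof, prep, oven in jobs]

candidates = []
for perm in permutations(range(len(jobs))):
    nominal = makespan(perm, jobs)
    worst = max(makespan(perm, scaled_proofing(f)) for f in (0.9, 1.0, 1.1))
    candidates.append((perm, nominal, worst - nominal))

shortest = min(c[1] for c in candidates)
# Proactive plan: smallest makespan fluctuation among plans close to the shortest one.
robust = min((c for c in candidates if c[1] <= shortest + 10), key=lambda c: c[2])
print("shortest nominal makespan:", shortest, "minutes")
print("proactive plan:", robust[0], "makespan:", robust[1], "fluctuation:", robust[2], "minutes")
```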