    Undergraduate Catalog of Studies, 2023-2024

    Graduate Catalog of Studies, 2023-2024

    The AddACO: A bio-inspired modified version of the ant colony optimization algorithm to solve travel salesman problems

    The Travelling Salesman Problem (TSP) consists of finding the minimal-length closed tour that connects the entire group of nodes of a given graph. We propose to solve this combinatorial optimization problem with the AddACO algorithm: a version of the Ant Colony Optimization method characterized by a modified probabilistic law at the basis of the exploratory movement of the artificial insects. In particular, the ant decisional rule is here defined as a linear convex combination of competing behavioral stimuli and therefore has an additive form (hence the name of our algorithm), rather than the canonical multiplicative one. The AddACO is intended to address two conceptual shortcomings of classical ACO methods: (i) the population of artificial insects is in principle allowed to simultaneously minimize/maximize all migratory guidance cues (which is implausible from a biological/ecological point of view) and (ii) a given edge of the graph has a null probability of being explored if at least one of the movement traits is therein equal to zero, regardless of the intensity of the others (which in principle reduces the exploratory potential of the ant colony). Three variants of our method are then specified: the AddACO-V1, which includes pheromone trail and visibility as insect decisional variables, and the AddACO-V2 and AddACO-V3, which add random effects and inertia, respectively, to the two classical migratory stimuli. The three versions of our algorithm are tested on benchmark middle-scale TSP instances in order to assess their performance and to find their optimal parameter setting. The best-performing variant is finally applied to large-scale TSPs, compared to the naive Ant-Cycle Ant System proposed by Dorigo and colleagues, and evaluated in terms of quality of the solutions, computational time, and convergence speed. The aim is to show that the proposed transition probability, along with its conceptual advantages, is competitive from a performance perspective, i.e., that it does not reduce the exploratory capacity of the ant population with respect to the canonical one (at least in the case of the selected TSPs). A theoretical study of the asymptotic behavior of the AddACO is given in the appendix of the work, whose conclusive section contains some hints for further improvements of our algorithm, also in the perspective of its application to other optimization problems.
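    To make the additive-versus-multiplicative contrast concrete, here is a minimal Python sketch of the two transition rules over a set of candidate edges. The single weight w, the per-stimulus normalization, and the function names are illustrative assumptions, not the paper's exact formulation.

        def multiplicative_rule(tau, eta, alpha=1.0, beta=2.0):
            # Canonical Ant System rule: an edge with zero pheromone (tau) or
            # zero visibility (eta) gets zero probability, whatever the other
            # stimulus says -- shortcoming (ii) above.
            scores = [(t ** alpha) * (e ** beta) for t, e in zip(tau, eta)]
            total = sum(scores)
            return [s / total for s in scores]

        def additive_rule(tau, eta, w=0.5):
            # AddACO-style rule (sketch): a linear convex combination of the
            # normalized stimuli, so one zero trait no longer vetoes an edge.
            tau_n = [t / sum(tau) for t in tau]
            eta_n = [e / sum(eta) for e in eta]
            scores = [w * t + (1 - w) * e for t, e in zip(tau_n, eta_n)]
            total = sum(scores)
            return [s / total for s in scores]

        tau = [1.0, 0.0, 2.0]  # pheromone trail on three candidate edges
        eta = [0.5, 0.8, 0.2]  # visibility, i.e. inverse edge length
        print(multiplicative_rule(tau, eta))  # second edge is never explored
        print(additive_rule(tau, eta))        # second edge stays reachable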

    The Application of Data Analytics Technologies for the Predictive Maintenance of Industrial Facilities in Internet of Things (IoT) Environments

    In industrial production environments, the maintenance of equipment has a decisive influence on costs and on the plannability of production capacities. In particular, unplanned failures during production times cause high costs, unplanned downtimes, and possibly additional collateral damage. Predictive Maintenance starts here and tries to predict a possible failure and its cause early enough that its prevention can be prepared and carried out in time. In order to predict malfunctions and failures, the industrial plant with its characteristics, as well as its wear and ageing processes, must be modelled. Such modelling can be done by replicating the plant's physical properties. However, this is very complex and requires enormous expert knowledge about the plant and about the wear and ageing processes of each individual component. Neural networks and machine learning make it possible to train such models from data and offer an alternative, especially when very complex and non-linear behaviour is evident. For models to make predictions, as much data as possible about the condition of a plant, its environment, and production planning is needed.
In Industrial Internet of Things (IIoT) environments, the amount of available data is constantly increasing. Intelligent sensors and highly interconnected production facilities produce a steady stream of data. The sheer volume of data, but also the steady stream in which it is transmitted, places high demands on data processing systems. If a participating system wants to perform live analyses on the incoming data streams, it must be able to process the incoming data at least as fast as the continuous stream delivers it; otherwise, the system falls further and further behind in its processing and thus in its analyses (a back-of-the-envelope sketch of this constraint follows below). This also applies to Predictive Maintenance systems, especially if they use complex and computationally intensive machine learning models. If sufficiently scalable hardware resources are available, this may not be a problem at first. However, if this is not the case, or if processing takes place on decentralised units with limited hardware resources (e.g. edge devices), the runtime behaviour and resource requirements of the type of neural network used become an important criterion.
This thesis addresses Predictive Maintenance systems in IIoT environments using neural networks and Deep Learning, where the runtime behaviour and the resource requirements are relevant. The question is whether it is possible to achieve better runtimes with similar result quality using a new type of neural network. The focus is on reducing the complexity of the network and improving its parallelisability. Inspired by projects in which complexity was distributed to less complex neural subnetworks by upstream measures, two hypotheses presented in this thesis emerged: a) the distribution of complexity into simpler subnetworks leads to faster processing overall, despite the overhead this creates, and b) if a neural cell has a deeper internal structure, this leads to a less complex network. Within the framework of a qualitative study, an overall impression of Predictive Maintenance applications in IIoT environments using neural networks was developed. Based on the findings, a novel model layout named Sliced Long Short-Term Memory Neural Network (SlicedLSTM) was developed. The SlicedLSTM implements the assumptions made in the aforementioned hypotheses in its inner model architecture.
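    The streaming constraint referenced above is simple arithmetic, sketched here in Python with hypothetical rates: if samples arrive faster than the model can score them, the unprocessed backlog grows without bound.

        def backlog_after(seconds, arrival_hz, service_hz):
            # Backlog growth of a live-analysis system: zero while the model
            # keeps up, linear in time once the stream outpaces it.
            return max(0.0, (arrival_hz - service_hz) * seconds)

        # e.g. 1000 sensor samples/s against a model scoring 900 samples/s:
        print(backlog_after(3600, arrival_hz=1000, service_hz=900))  # 360000.0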
Within the framework of a quantitative study, the runtime behaviour of the SlicedLSTM was compared with that of a reference model in laboratory tests. The study uses synthetically generated data from a NASA project to predict failures of modules of aircraft gas turbines. The dataset contains 1,414 multivariate time series with 104,897 samples of test data and 160,360 samples of training data. For the specific application and the data used, it was shown that the SlicedLSTM delivers faster processing times with similar result accuracy and thus clearly outperforms the reference model in this respect. The hypotheses about the influence of complexity in the internal structure of the neural cells were confirmed by the study carried out in the context of this thesis.
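    The abstract does not spell out the SlicedLSTM layout, but hypothesis a) can be illustrated with a PyTorch sketch that slices the input features across several smaller, independently runnable LSTMs and merges their outputs; the slicing scheme, layer sizes, and output head below are assumptions for illustration only, not the thesis architecture.

        import torch
        import torch.nn as nn

        class SlicedLSTMSketch(nn.Module):
            # Illustrative only: k small LSTMs over feature slices instead of
            # one large LSTM, trading a merge step for parallelisable parts.
            def __init__(self, n_features=8, hidden_per_slice=16, k=4):
                super().__init__()
                assert n_features % k == 0, "features must split evenly"
                self.width = n_features // k
                self.subnets = nn.ModuleList(
                    [nn.LSTM(self.width, hidden_per_slice, batch_first=True)
                     for _ in range(k)])
                self.head = nn.Linear(k * hidden_per_slice, 1)  # failure score

            def forward(self, x):  # x: (batch, time, n_features)
                slices = torch.split(x, self.width, dim=-1)
                last = [net(s)[0][:, -1, :]  # last time step of each subnet
                        for net, s in zip(self.subnets, slices)]
                return self.head(torch.cat(last, dim=-1))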

    2023-2024 Catalog

    The 2023-2024 Governors State University Undergraduate and Graduate Catalog is a comprehensive listing of current information regarding: Degree Requirements, Course Offerings, and Undergraduate and Graduate Rules and Regulations.

    Undergraduate Catalog of Studies, 2022-2023

    Current issues of the management of socio-economic systems in terms of globalization challenges

    The authors of the scientific monograph have come to the conclusion that the management of socio-economic systems in terms of global challenges requires the use of mechanisms to ensure security, optimise the use of resource potential, increase competitiveness, and provide state support to economic entities. The basic research focuses on the assessment of economic entities in terms of global challenges, analysis of the financial system, migration flows, logistics and product exports, and territorial development. The research results have been implemented in different decision-making models in the context of global challenges, strategic planning, financial and food security, education management, information technology, and innovation. The results of the study can be used in developing directions, programmes, and strategies for the sustainable development of economic entities and regions, increasing the competitiveness of products and services, and decision-making at the level of ministries and agencies that regulate the processes of managing socio-economic systems. The results can also be used by students and young scientists in the educational process and in conducting scientific research on the management of socio-economic systems in terms of global challenges.

    Application of nature-inspired optimization algorithms to improve the production efficiency of small and medium-sized bakeries

    Increasing production efficiency through schedule optimization is one of the most influential topics in operations research that contribute to the decision-making process. Scheduling means allocating tasks among the available resources within the constraints of a manufacturing facility in order to minimize costs. It is carried out with a model that resembles real-world task distribution, with variables and relevant constraints, in order to complete a planned production. In addition to a model, an optimizer is required to assist in evaluating and improving the task allocation procedure in order to maximize overall production efficiency. The entire procedure is usually carried out on a computer, where these two distinct segments combine to form a solution framework for production planning and support decision-making in various manufacturing industries. Small and medium-sized bakeries lack access to cutting-edge tools, and most of their production schedules are based on personal experience. This makes a significant difference in production costs compared to large bakeries, as evidenced by their market dominance. In this study, a hybrid no-wait flow shop model is proposed to produce a production schedule based on actual data, featuring the constraints of the production environment in small and medium-sized bakeries. Several single-objective and multi-objective nature-inspired optimization algorithms were implemented to find efficient production schedules. While makespan is the most widely used quality criterion of production efficiency because it dominates production costs, high oven idle time in bakeries also wastes energy. Combining these quality criteria allows for additional cost reduction through energy savings as well as shorter production times. Therefore, to obtain an efficient production plan, makespan and oven idle time were included in the objectives of the optimization. To find the optimal production planning for an existing production line, particle swarm optimization, simulated annealing, and the Nawaz-Enscore-Ham algorithm were used. The weighting factor method was used to combine the two objectives into a single objective (a toy illustration follows below). The classical optimization algorithms were found to be good enough at finding optimal schedules in a reasonable amount of time, reducing makespan by 29 % and oven idle time by 8 % for one of the analyzed production datasets. Nonetheless, the algorithms' convergence was found to be poor, with a lower probability of obtaining the best or nearly the best result. In contrast, a modified particle swarm optimization (MPSO) proposed in this study demonstrated a significant improvement in convergence, with a higher probability of obtaining better results. To obtain trade-offs between the two objectives, state-of-the-art multi-objective optimization algorithms were implemented: the Non-dominated Sorting Genetic Algorithm (NSGA-II), the Strength Pareto Evolutionary Algorithm, Generalized Differential Evolution, Improved Multi-objective Particle Swarm Optimization (OMOPSO), and Speed-constrained Multi-objective Particle Swarm Optimization (SMPSO). The optimization algorithms provided efficient production planning with up to a 12 % reduction in makespan and a 26 % reduction in oven idle time based on data from different production days. The performance comparison revealed a significant difference between these multi-objective optimization algorithms, with NSGA-II performing best and OMOPSO and SMPSO performing worst.
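    As a toy illustration of the weighting factor method, the Python sketch below scores a bake order on a single oven using a deliberately simplified model (each job has a fixed ready time and bake duration) rather than the hybrid no-wait flow shop of the study; the job data and the weight w are hypothetical.

        import itertools

        def weighted_cost(sequence, w=0.7):
            # Scalarize the two criteria: w * makespan + (1 - w) * oven idle.
            t, idle = 0.0, 0.0
            for ready, bake in sequence:   # job = (ready minute, bake minutes)
                start = max(t, ready)      # oven may wait for the dough
                idle += start - t
                t = start + bake
            return w * t + (1 - w) * idle

        jobs = [(30, 20), (10, 25), (45, 15)]  # hypothetical jobs
        best = min(itertools.permutations(jobs), key=weighted_cost)
        print(best, weighted_cost(best))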
Proofing is a key processing stage that contributes to the quality of the final product by developing flavor and a fluffy texture in bread. However, the duration of proofing is uncertain due to the complex interaction of multiple parameters: the yeast condition, the temperature in the proofing chamber, and the chemical composition of the flour. Because of this uncertainty, a production plan optimized for the shortest makespan can be significantly inefficient. The computational results show that the schedules with the shortest and nearly the shortest makespan suffer a significant (up to 18 %) increase in makespan when the proofing time deviates from its expected duration. In this thesis, a method for developing resilient production planning that takes uncertain proofing times into account is proposed, so that even if the deviation in proofing time is extreme, the fluctuation in makespan is minimal. The experimental results with a production dataset revealed a proactive production plan whose makespan is only 5 minutes longer than the shortest one, but which fluctuates by only 21 minutes when the proofing time varies from -10 % to +10 % of the actual proofing time. This study proposes a common framework for small and medium-sized bakeries to improve their production efficiency in three steps: collecting production data, simulating production planning with the hybrid no-wait flow shop model, and running the optimization algorithm. The study suggests using MPSO for the single-objective optimization problem and NSGA-II for the multi-objective optimization problem. Based on real bakery production data, the results revealed that the existing plans were significantly inefficient and could be optimized in a reasonable computational time using a robust optimization algorithm. Implementing such a framework in small and medium-sized bakery manufacturing operations could help to achieve an efficient and resilient production system.
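    The resilience criterion can be sketched in the same simplified single-oven model: re-evaluate a candidate plan while scaling the uncertain proofing time by -10 % to +10 % and prefer plans whose makespan swings least. The jobs below are hypothetical, not the thesis dataset, and proofing is assumed to start at time zero for every job.

        def makespan(sequence, proof_scale=1.0):
            # Each job is (proof minutes, bake minutes); the oven loads a job
            # only once its proofing has finished (no-wait simplification).
            t = 0.0
            for proof, bake in sequence:
                t = max(t, proof * proof_scale) + bake
            return t

        def fluctuation(sequence):
            # Spread of the makespan across proofing-time deviations.
            spans = [makespan(sequence, 1 + d) for d in (-0.10, 0.0, 0.10)]
            return max(spans) - min(spans)

        plans = [[(40, 20), (60, 15)], [(60, 15), (40, 20)]]
        for plan in plans:
            print(makespan(plan), fluctuation(plan))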

    Workshop Proceedings of the 12th edition of the KONVENS conference

    The 2014 edition of KONVENS is even more of a forum for exchange: its main topic is the interaction between Computational Linguistics and Information Science, and the synergies that such interaction, cooperation, and integrated views can produce. This topic, at the crossroads of different research traditions that deal with natural language as a container of knowledge and with methods to extract and manage linguistically represented knowledge, is close to the heart of many researchers at the Institut für Informationswissenschaft und Sprachtechnologie of Universität Hildesheim: it has long been one of the institute's research topics and has received even more attention over the last few years.