2,563 research outputs found

    Multidisciplinary perspectives on Artificial Intelligence and the law

    This open access book presents an interdisciplinary, multi-authored, edited collection of chapters on Artificial Intelligence (‘AI’) and the Law. AI technology has come to play a central role in the modern data economy. Through a combination of increased computing power, the growing availability of data, and the advancement of algorithms, AI has become an umbrella term for some of the most transformational technological breakthroughs of this age. The importance of AI stems from both the opportunities that it offers and the challenges that it entails. While AI applications hold the promise of economic growth and efficiency gains, they also create significant risks and uncertainty. The potential and perils of AI have thus come to dominate modern discussions of technology and ethics. Although AI was initially allowed to develop largely without guidelines or rules, few would deny that the law is set to play a fundamental role in shaping the future of AI. As the debate over AI is far from over, the need for rigorous analysis has never been greater. This book thus brings together contributors from different fields and backgrounds to explore how the law might provide answers to some of the most pressing questions raised by AI. An outcome of the Católica Research Centre for the Future of Law and its interdisciplinary working group on Law and Artificial Intelligence, it includes contributions by leading scholars in the fields of technology, ethics and the law.

    An intelligent decision support system for groundwater supply management and electromechanical infrastructure controls

    This study presents an intelligent Decision Support System (DSS) aimed at bridging the theoretical-practical gap in groundwater management. The ongoing demand for sophisticated systems capable of interpreting extensive data to inform sustainable groundwater decision-making underscores the critical nature of this research. To meet this challenge, telemetry data from six randomly selected wells were used to establish a comprehensive database of groundwater pumping parameters, including flow rate, pressure, and current intensity. Statistical analysis of these parameters led to the determination of threshold values for critical factors such as water pressure and electrical current. Additionally, a soft sensor was developed using a Random Forest (RF) machine learning algorithm, enabling real-time forecasting of key variables by continuously comparing live telemetry data to pump design specifications and results from regular field testing. The proposed machine learning model ensures robust empirical monitoring of well and pump health. Furthermore, expert operational knowledge from water management professionals, gathered through a Classical Delphi (CD) technique, was integrated, culminating in a data-driven framework for sustainable groundwater facility monitoring. In conclusion, this innovative DSS not only addresses the theory-application gap but also leverages the power of data analytics and expert knowledge to provide high-precision online insights, thereby optimizing groundwater management practices.
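As a minimal illustration of the RF soft-sensor idea described above, the sketch below trains a Random Forest on telemetry features and flags readings whose forecast exceeds a statistically derived threshold. The file name, column names, target choice, and threshold value are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of an RF soft sensor for pump telemetry (illustrative only).
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

df = pd.read_csv("well_telemetry.csv")           # hypothetical telemetry file
features = df[["flow_rate", "current"]]          # assumed predictor columns
target = df["pressure"]                          # assumed forecast target

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, shuffle=False)  # preserve time order

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Compare forecasts against a threshold, mirroring the paper's idea of
# flagging unhealthy well/pump behaviour from live telemetry.
PRESSURE_THRESHOLD = 4.5                         # bar; illustrative value
pred = model.predict(X_test)
alerts = pred > PRESSURE_THRESHOLD
print(f"{alerts.sum()} of {len(pred)} readings exceed the threshold")
```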

    Modern computing: Vision and challenges

    Over the past six decades, the field of computing systems has experienced significant transformations, profoundly impacting society with developments such as the Internet and the commodification of computing. Underpinned by technological advancements, computer systems, far from being static, have continuously evolved and adapted to cover multifaceted societal niches. This has led to new paradigms such as cloud, fog, and edge computing and the Internet of Things (IoT), which offer fresh economic and creative opportunities. Nevertheless, this rapid change poses complex research challenges, especially in maximizing potential and enhancing functionality. As such, to maintain an economical level of performance that meets ever-tighter requirements, one must understand the drivers of the emergence and expansion of new models, and how contemporary challenges differ from past ones. To that end, this article investigates and assesses the factors influencing the evolution of computing systems, covering established systems and architectures as well as newer developments such as serverless computing, quantum computing, and on-device AI at the edge. Trends emerge when one traces the technological trajectory: the rapid obsolescence of frameworks due to business and technical constraints, a move towards specialized systems and models, and varying approaches to centralized and decentralized control. This comprehensive review of modern computing systems looks ahead to the future of research in the field, highlighting key challenges and emerging trends, and underscoring their importance in cost-effectively driving technological progress.

    Review of Path Selection Algorithms with Link Quality and Critical Switch Aware for Heterogeneous Traffic in SDN

    Software Defined Networking (SDN) introduced network management flexibility that eludes traditional network architecture. Nevertheless, the pervasive demand for various cloud computing services with different levels of Quality of Service (QoS) requirements has made network service provisioning challenging. One of these challenges is path selection (PS) for routing heterogeneous traffic with end-to-end QoS support specific to each traffic class. The challenge has attracted the research community's attention to the extent that many path selection algorithms (PSAs) have been proposed. However, a gap still exists that calls for further study. This paper reviews the existing PSAs and the Baseline Shortest Path Algorithms (BSPAs) upon which many relevant PSAs are built, to help identify these gaps. The paper categorizes the PSAs into four groups based on their path selection criteria: (1) PSAs that use static or dynamic link quality to guide the path selection decision (PSD); (2) PSAs that consider the criticality of a switch in terms of update operations, FlowTable limitations, or port capacity to guide the PSD; (3) PSAs that consider flow variabilities to guide the PSD; and (4) PSAs that use machine learning optimization in their PSD. We then reviewed and compared the techniques' designs in each category against the identified SDN PSA design objectives, solution approaches, BSPAs, and validation approaches. Finally, the paper recommends directions for further research.
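As a toy illustration of the first category (link-quality-guided PSD), the sketch below folds per-link delay and loss into one composite edge weight and runs Dijkstra over it. The graph, the weighting rule, and the alpha penalty are illustrative assumptions, not one of the reviewed PSAs.

```python
# Sketch: link-quality-aware path selection (category 1 above).
# Each edge carries (delay_ms, loss_rate); the composite weight is assumed.
import heapq

def composite_weight(delay_ms: float, loss_rate: float, alpha: float = 100.0) -> float:
    # Penalize lossy links: weight grows with both delay and loss.
    return delay_ms + alpha * loss_rate

def select_path(graph: dict, src: str, dst: str) -> list:
    # graph: node -> list of (neighbor, delay_ms, loss_rate)
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, delay, loss in graph.get(u, []):
            nd = d + composite_weight(delay, loss)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    # Reconstruct the path from dst back to src (assumes dst is reachable).
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1]

g = {"s1": [("s2", 5, 0.01), ("s3", 2, 0.20)],
     "s2": [("s4", 5, 0.01)],
     "s3": [("s4", 2, 0.20)]}
print(select_path(g, "s1", "s4"))  # prefers the low-loss route via s2
```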

    SCALING UP TASK EXECUTION ON RESOURCE-CONSTRAINED SYSTEMS

    The ubiquity of machine learning tasks on embedded systems has made the efficient execution of neural networks under CPU, memory, and energy constraints increasingly important. Unlike high-end computing systems, where resources are abundant and reliable, resource-constrained systems have only limited computational capability, limited memory, and a limited energy supply. This dissertation focuses on how to take full advantage of the limited resources of these systems in order to improve task execution efficiency across different stages of the execution pipeline. While the existing literature primarily aims at solving the problem by shrinking the model size according to the resource constraints, this dissertation aims to improve the execution efficiency for a given set of tasks in the following two ways. First, we propose SmartON, the first batteryless active event detection system that considers both the event arrival pattern and the harvested energy to determine when the system should wake up and what the duty cycle should be. Second, we propose Antler, which exploits the affinity between all pairs of tasks in a multitask inference system to construct a compact graph representation of the task set for a given overall size budget. To support these algorithmic proposals, we propose the following hardware solutions: a controllable capacitor array that can expand the system's energy storage on the fly, and an FRAM array that can accommodate multiple neural networks running on one system.
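The abstract does not spell out Antler's algorithm, but the general idea of packing a task set under a size budget by exploiting pairwise affinity can be illustrated with a simple greedy merge. Everything below (affinity scores, task sizes, the sharing model) is an assumption for illustration, not the dissertation's method.

```python
# Illustrative greedy sketch of affinity-driven task merging under a size
# budget; the merging rule and size model are assumptions, not Antler itself.
from itertools import combinations

def pack_tasks(sizes: dict, affinity: dict, budget: float,
               shared_fraction: float = 0.6) -> list:
    """Greedily merge the most affine pair of groups until the total fits."""
    groups = [{t} for t in sizes]              # start: one group per task

    def group_size(g):
        s = sorted((sizes[t] for t in g), reverse=True)
        # Assumed: largest member kept in full; others share that fraction.
        return s[0] + sum(x * (1 - shared_fraction) for x in s[1:])

    def total():
        return sum(group_size(g) for g in groups)

    while total() > budget and len(groups) > 1:
        # Pick the pair of groups with the highest mean cross-affinity.
        i, j = max(combinations(range(len(groups)), 2),
                   key=lambda ij: sum(affinity[tuple(sorted((a, b)))]
                                      for a in groups[ij[0]]
                                      for b in groups[ij[1]]) /
                                  (len(groups[ij[0]]) * len(groups[ij[1]])))
        groups[i] |= groups[j]
        del groups[j]
    return groups

sizes = {"kws": 80, "vad": 60, "gesture": 90}     # KB, illustrative
affinity = {("gesture", "kws"): 0.2, ("gesture", "vad"): 0.3,
            ("kws", "vad"): 0.9}
print(pack_tasks(sizes, affinity, budget=180))
```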

    Technology for Low Resolution Space Based RSO Detection and Characterisation

    Space Situational Awareness (SSA) refers to all activities to detect, identify, and track objects in Earth orbit. SSA is critical to all current and future space activities and protects space assets by providing access control, conjunction warnings, and monitoring of the status of active satellites. Current SSA methods and infrastructure are not sufficient to account for the proliferation of space debris. In response to the need for better SSA, many different areas of research have sought to improve it, most of them requiring dedicated ground- or space-based infrastructure. In this thesis, a novel approach for the characterisation of RSOs (Resident Space Objects) from passive low-resolution space-based sensors is presented, along with all the background work performed to enable this novel method. Low-resolution space-based sensors are common on current satellites; with so many of these sensors already in orbit, using them passively to detect RSOs can greatly augment SSA without expensive infrastructure or long lead times. One of the largest hurdles for research in this area is the lack of publicly available labelled data with which to test and confirm results. To overcome this hurdle, a simulation software package, ORBITALS, was created. To verify and validate the ORBITALS simulator, it was compared with images from the Fast Auroral Imager, one of the only publicly available sources of low-resolution space-based images with auxiliary data. During the development of the ORBITALS simulator, it was found that generating these simulated images is computationally intensive when propagating the entire space catalog. To overcome this, the propagation method in use, Simplified General Perturbations 4 (SGP4), was upgraded to run in parallel, reducing the computational time required to propagate entire catalogs of RSOs. From the results, it was found that the standard facet model with particle swarm optimisation performed best, estimating an RSO's attitude with 0.66 degree RMSE accuracy across a sequence and ~1 % MAPE accuracy for the optical properties. This accomplishes the thesis goal of demonstrating the feasibility of low-resolution passive RSO characterisation from space-based platforms in a simulated environment.
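The thesis parallelizes the SGP4 propagator itself; as a rough illustration of propagating a catalog in parallel, the sketch below instead fans TLE records out over a process pool using the standard `sgp4` Python package. The file name and epoch are placeholders.

```python
# Sketch: propagating a whole catalog with SGP4 across a process pool.
from multiprocessing import Pool
from sgp4.api import Satrec, jday

JD, FR = jday(2024, 1, 1, 0, 0, 0)          # placeholder epoch

def propagate(tle: tuple) -> tuple:
    line1, line2 = tle
    sat = Satrec.twoline2rv(line1, line2)   # build inside the worker
    err, r, v = sat.sgp4(JD, FR)            # TEME position/velocity, km
    return sat.satnum, err, r

def load_tles(path: str) -> list:
    with open(path) as f:
        lines = [l.rstrip() for l in f if l.strip()]
    return [(lines[i], lines[i + 1]) for i in range(0, len(lines), 2)]

if __name__ == "__main__":
    catalog = load_tles("catalog.tle")      # placeholder TLE file
    with Pool() as pool:
        states = pool.map(propagate, catalog)
    print(f"propagated {len(states)} objects")
```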

    OPTIMIZING PRODUCTION SCHEDULING THROUGH HYBRID DYNAMIC GENETIC-ADAPTIVE IMPROVED GRAVITATIONAL OPTIMIZATION ALGORITHM

    Mass customization places ever more emphasis on production optimization. In many manufacturing and service organizations, production planning and scheduling are daily decision-making procedures, and the significance of these decisions shows in work orders, manufacturing, transportation, and distribution of the finished goods. Production scheduling is the process of regulating, determining, and maximizing the restricted resources of the production system. In this study, a novel Hybrid Dynamic Genetic-Adaptive Improved Gravitational Optimization Algorithm (HDG-AIGOA) approach is introduced to optimize the production schedule. Here, the classification effectiveness of the AIGOA is increased by using the HDG method. The small and benchmark iMOPSE datasets were used to assess the success of the suggested approach. Noisy data are removed from the raw samples using an Adaptive Median Filter (AMF), and a Kernel Principal Component Analysis (KPCA) is performed to extract features from the segmented data. The results show that the recommended methodology beats earlier approaches in terms of accuracy, Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and Mean Square Error (MSE). The proposed method can be considered for improving production scheduling in a dynamic environment.
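The abstract names AMF filtering and KPCA feature extraction as preprocessing steps; a minimal sketch of the KPCA step with scikit-learn might look as follows. The kernel choice, gamma, and component count are assumptions, and the random data stands in for the segmented samples.

```python
# Sketch of the KPCA preprocessing step named in the abstract.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))                # stand-in for segmented data

X_scaled = StandardScaler().fit_transform(X)  # KPCA is scale-sensitive
kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1)
X_features = kpca.fit_transform(X_scaled)     # nonlinear components fed to
                                              # the downstream optimizer
print(X_features.shape)                       # (200, 5)
```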

    Application of nature-inspired optimization algorithms to improve the production efficiency of small and medium-sized bakeries

    Increasing production efficiency through schedule optimization is one of the most influential topics in operations research that contributes to the decision-making process. It is the concept of allocating tasks among available resources within the constraints of any manufacturing facility in order to minimize costs. It is carried out by a model that resembles real-world task distribution with variables and relevant constraints in order to simulate a planned production run. In addition to a model, an optimizer is required to assist in evaluating and improving the task allocation procedure in order to maximize overall production efficiency. The entire procedure is usually carried out on a computer, where these two distinct segments combine to form a solution framework for production planning and support decision-making in various manufacturing industries. Small and medium-sized bakeries lack access to cutting-edge tools, and most of their production schedules are based on personal experience. This makes a significant difference in production costs when compared to large bakeries, as evidenced by their market dominance. In this study, a hybrid no-wait flow shop model is proposed to produce a production schedule based on actual data, featuring the constraints of the production environment in small and medium-sized bakeries. Several single-objective and multi-objective nature-inspired optimization algorithms were implemented to find efficient production schedules. While makespan is the most widely used quality criterion of production efficiency because it dominates production costs, high oven idle time in bakeries also wastes energy. Combining these quality criteria allows for additional cost reduction through energy savings as well as shorter production times. Therefore, to obtain an efficient production plan, makespan and oven idle time were both included in the optimization objectives. To find the optimal production planning for an existing production line, particle swarm optimization, simulated annealing, and Nawaz-Enscore-Ham algorithms were used, with the weighting-factor method combining the two objectives into a single objective. The classical optimization algorithms were found to be good enough at finding optimal schedules in a reasonable amount of time, reducing makespan by 29 % and oven idle time by 8 % for one of the analyzed production datasets. Nonetheless, the algorithms' convergence was found to be poor, with a lower probability of obtaining the best or nearly the best result. In contrast, a modified particle swarm optimization (MPSO) proposed in this study demonstrated significantly improved convergence, with a higher probability of obtaining better results. To obtain trade-offs between the two objectives, state-of-the-art multi-objective optimization algorithms were implemented: the Non-dominated Sorting Genetic Algorithm (NSGA-II), the Strength Pareto Evolutionary Algorithm, Generalized Differential Evolution, Improved Multi-objective Particle Swarm Optimization (OMOPSO), and Speed-constrained Multi-objective Particle Swarm Optimization (SMPSO). These algorithms provided efficient production planning with up to a 12 % reduction in makespan and a 26 % reduction in oven idle time based on data from different production days. The performance comparison revealed a significant difference between these multi-objective optimization algorithms, with NSGA-II performing best and OMOPSO and SMPSO performing worst.
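To make the scheduling model and the weighted-sum objective above concrete, here is a minimal sketch that evaluates a plain (non-hybrid) no-wait flow shop schedule on makespan and oven idle time and combines the two with weights. The pairwise start-to-start delay is the classical no-wait construction; the processing times, stage roles, and weights are illustrative assumptions, not the thesis data.

```python
# Sketch: weighted-sum evaluation of a no-wait flow shop schedule.

def delay(p_i, p_j):
    """Minimum start-to-start delay between consecutive jobs in a
    no-wait flow shop (classical pairwise formula)."""
    return max(sum(p_i[:m + 1]) - sum(p_j[:m]) for m in range(len(p_i)))

def evaluate(seq, p, oven, w_makespan=0.8, w_idle=0.2):
    # Start offsets of each job (entry to the first stage), from the delays.
    starts = [0.0]
    for a, b in zip(seq, seq[1:]):
        starts.append(starts[-1] + delay(p[a], p[b]))
    makespan = starts[-1] + sum(p[seq[-1]])
    # Windows during which each job occupies the oven stage.
    oven_start = [s + sum(p[j][:oven]) for s, j in zip(starts, seq)]
    busy = sum(p[j][oven] for j in seq)
    span = oven_start[-1] + p[seq[-1]][oven] - oven_start[0]
    idle = span - busy
    return w_makespan * makespan + w_idle * idle, makespan, idle

# Three doughs over three stages: mixing, proofing, baking (oven), in minutes.
p = {"rye": (20, 40, 30), "wheat": (15, 50, 25), "roll": (10, 30, 20)}
score, mk, idle = evaluate(["roll", "wheat", "rye"], p, oven=2)
print(f"makespan={mk} min, oven idle={idle} min, weighted={score}")
```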
Proofing is a key processing stage that contributes to the quality of the final product by developing flavor and a fluffy texture in the bread. However, the duration of proofing is uncertain due to the complex interaction of multiple parameters: yeast condition, temperature in the proofing chamber, and the chemical composition of the flour. Because of this uncertainty, a production plan optimized for the shortest makespan can be significantly inefficient. The computational results show that schedules with the shortest and nearly shortest makespan suffer a significant (up to 18 %) increase in makespan when the proofing time deviates from its expected duration. In this thesis, a method for developing resilient production planning that takes uncertain proofing times into account is proposed, so that even if the deviation in proofing time is extreme, the fluctuation in makespan is minimal. The experimental results with a production dataset revealed a proactive production plan whose makespan is only 5 minutes longer than the shortest, yet fluctuates by only 21 minutes as the proofing time varies from -10 % to +10 % of its expected value. This study proposes a common framework for small and medium-sized bakeries to improve their production efficiency in three steps: collecting production data, simulating production planning with the hybrid no-wait flow shop model, and running the optimization algorithm. The study suggests using MPSO for single-objective and NSGA-II for multi-objective optimization problems. Based on real bakery production data, the results revealed that existing plans were significantly inefficient and could be optimized in a reasonable computational time using a robust optimization algorithm. Implementing such a framework in small and medium-sized bakery manufacturing operations could help to achieve an efficient and resilient production system.
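Reusing the toy model and `evaluate` function from the sketch above, the resilience idea can be illustrated by perturbing the proofing stage by ±10 % and recording the resulting makespan spread; a plan with a small spread is robust in the sense described here. The ±10 % range follows the abstract; everything else is illustrative.

```python
# Sketch: stress-testing a schedule against proofing-time uncertainty
# (reuses delay/evaluate and the toy data p from the previous sketch).
def makespan_spread(seq, p, proof_stage=1, deviations=(-0.10, 0.0, 0.10)):
    spans = []
    for d in deviations:
        perturbed = {j: tuple(t * (1 + d) if m == proof_stage else t
                              for m, t in enumerate(times))
                     for j, times in p.items()}
        spans.append(evaluate(seq, perturbed, oven=2)[1])  # [1] = makespan
    return max(spans) - min(spans)

print(makespan_spread(["roll", "wheat", "rye"], p))  # fluctuation in minutes
```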

    Optimization framework for improving energy flexibility in residential buildings

    Energy flexibility is the balancing of a building's energy supply and demand according to climate conditions, user preferences, and grid constraints. Energy flexibility in households is a practical approach to achieving sustainability in the building sector. However, the diversity in the flexibility potential of energy systems and climatic variability complicate the selection of envelope parameters and building energy systems (BESs). This study aimed to design a framework to improve the energy flexibility of buildings. For this purpose, a single-family house and diversified BESs were simulated in a TRNSYS-Python co-simulation platform. Initially, a bi-objective optimization identified flexible building envelopes in twenty-four locations. Then, a multi-criteria assessment of BESs was conducted using life-cycle energy flexibility indicators. Lastly, the energy flexibility potential of the BES was evaluated by employing steady-state optimization and model predictive control (MPC). The findings of this work set a benchmark for flexible household envelopes. The systematic approach for selecting BESs could guide energy system design, providing insight into energy flexibility. Further, this investigation established that a dataset of building thermal loads, boundary conditions, and control disturbances can be used to develop an MPC-based dynamic controller, which could be employed on different BESs to achieve energy flexibility.
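In the spirit of the MPC evaluation described above, the sketch below solves one receding-horizon step for a one-zone building using cvxpy: it shifts heating into a cheap-price window while keeping the zone inside a comfort band. The first-order thermal model, its coefficients, the prices, and the comfort limits are all assumptions, not the study's model.

```python
# Sketch: one MPC step for a toy one-zone building thermal model.
import cvxpy as cp
import numpy as np

H = 12                      # horizon steps (e.g., hours)
a, b, c = 0.9, 0.05, 0.1    # assumed dynamics: T[t+1] = a*T[t] + b*u[t] + c*T_out[t]
T0 = 19.0                   # initial zone temperature, degC
T_out = np.full(H, 5.0)                        # outdoor forecast, degC
price = np.where(np.arange(H) < 6, 0.1, 0.3)   # cheap early, expensive later

T = cp.Variable(H + 1)      # zone temperature trajectory
u = cp.Variable(H)          # heating power

constraints = [T[0] == T0, u >= 0, u <= 50]
for t in range(H):
    constraints += [T[t + 1] == a * T[t] + b * u[t] + c * T_out[t]]
constraints += [T[1:] >= 20, T[1:] <= 24]      # comfort band, degC

problem = cp.Problem(cp.Minimize(cp.sum(cp.multiply(price, u))), constraints)
problem.solve()
print("cost:", problem.value)
print("heating profile:", np.round(u.value, 1))
# Flexibility shows up as load shifted into the cheap-price window.
```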