
    Tactical supply chain planning under a carbon tax policy scheme: a case study

    Greenhouse gas emissions are receiving greater scrutiny in many countries due to international pressure to reduce anthropogenic climate change. Industry and its supply chains are a major source of these emissions. This paper presents a tactical supply chain planning model that integrates economic and carbon emission objectives under a carbon tax policy scheme. A modified Cross-Entropy solution method is adopted to solve the proposed nonlinear supply chain planning model. Numerical experiments are completed using data from an actual organization in Australia where a carbon tax is in operation. The analyses of the numerical results provide important organizational and policy insights on (1) the financial and emissions reduction impacts of a carbon tax at the tactical planning level, (2) the use of cost/emission tradeoff analysis for making informed investment decisions, and (3) how to price carbon for maximum environmental return per dollar increase in supply chain cost.
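    The modified Cross-Entropy solution method is only named above, not specified; as a minimal sketch of the generic cross-entropy idea (the function names, parameters and toy cost below are hypothetical, not the paper's model), the Python code samples candidate plans, keeps the cheapest ones and refits the sampling distribution.

```python
import numpy as np

def cross_entropy_minimize(cost, dim, n_samples=200, n_elite=20, n_iters=50, seed=0):
    """Generic cross-entropy method: sample candidate plans from a Gaussian,
    keep the elite (lowest-cost) fraction, and refit the sampling distribution."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.ones(dim) * 5.0
    for _ in range(n_iters):
        samples = rng.normal(mu, sigma, size=(n_samples, dim))
        costs = np.apply_along_axis(cost, 1, samples)
        elite = samples[np.argsort(costs)[:n_elite]]
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return mu, cost(mu)

def toy_cost(x):
    """Toy nonlinear 'production plan' cost: production cost plus a carbon-tax-like penalty."""
    production = np.sum((x - 3.0) ** 2)
    emissions_penalty = 0.5 * np.sum(np.abs(x)) ** 1.2   # nonlinear emissions term
    return production + emissions_penalty

best_plan, best_cost = cross_entropy_minimize(toy_cost, dim=4)
print(best_plan, best_cost)
```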

    Improving decision making for incentivised and weather-sensitive projects

    The field of project management originated from the domain of operational research, which focuses on the mathematical optimization of operational problems. In recent decades, however, an increasingly broad perspective has been applied to the field. As such, project management has spawned a number of very active sub-domains, which focus not solely on the scheduling of the project’s baseline, but also on the analysis of risk and the control of project execution. This dissertation focuses on two areas where existing literature is still lacking. The first is the use of incentivised contractual agreements between the owner of a project and the contractor who is hired to execute it. Although this area has received growing attention in recent years, the majority of studies remain strongly descriptive. Hence, the aim of the first part of this dissertation is to develop a more prescriptive approach from both the owner’s and the contractor’s perspective. The second part investigates the use of dedicated weather models to improve the operational performance of weather-sensitive projects. During recent decades, significant effort has been made to improve the quality of weather simulation models, and the amount of available weather data has been steadily increasing. This opens up many new possibilities for using more precise weather models to support operational decision making. In spite of this, the number of applications of these weather models in operational research has remained rather limited. As such, the aim of the second part of this dissertation is to leverage these weather models to improve the scheduling of offshore construction projects, as well as the preventive maintenance of offshore wind turbines.
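    Purely as an illustration of how a weather model can feed such operational decisions (the dissertation's own models are not reproduced here; the wave process and limits below are invented), this sketch estimates when an offshore task that needs a continuous calm-weather window could be completed.

```python
import numpy as np

def expected_completion_hour(task_hours=12, wave_limit=2.0, horizon=24 * 30,
                             n_sims=2000, seed=1):
    """Monte Carlo estimate of when a task needing `task_hours` consecutive
    hours of wave height below `wave_limit` can first be completed."""
    rng = np.random.default_rng(seed)
    finishes = []
    for _ in range(n_sims):
        # Synthetic hourly significant wave height: AR(1)-style persistence around 1.5 m.
        waves = np.empty(horizon)
        waves[0] = 1.5
        for t in range(1, horizon):
            waves[t] = max(0.0, 0.8 * waves[t - 1] + 0.3 + rng.normal(0, 0.4))
        workable = waves < wave_limit
        run, finish = 0, horizon        # if no window is found, the task slips past the horizon
        for t, ok in enumerate(workable):
            run = run + 1 if ok else 0
            if run >= task_hours:
                finish = t + 1
                break
        finishes.append(finish)
    return float(np.mean(finishes))

print(f"expected completion hour: {expected_completion_hour():.1f}")
```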

    How green is a lean supply chain?

    This article presents a supply chain planning model that can be used to investigate tradeoffs between cost and environmental degradation, including carbon emissions, energy consumption and waste generation. The model also incorporates other aspects of real-world supply chains, such as multiple transport lot sizing and flexible warehouse holding capacity. The application of the model and solution method is investigated in an actual case problem. Our analysis of the numerical results focuses on the relationship between lean practices and green outcomes. We find that (1) not all lean interventions at the tactical supply chain planning level result in green benefits, and (2) an agile supply chain is the greenest and most efficient alternative when compared to strictly lean and centralized configurations.
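    The cost/emission tradeoff analysis can be pictured with a simple weighted-sum scan; the toy lot-sizing model below uses invented numbers, not the paper's case data, and merely shows how shifting weight from cost to emissions changes the preferred transport lot size.

```python
import numpy as np

def plan_cost_emissions(lot_size, demand=1000, ship_cost=50, hold_cost=0.2,
                        ship_emis=5.0, hold_emis=2.0):
    """Toy tactical plan: larger lots mean fewer shipments but more inventory."""
    shipments = demand / lot_size
    avg_inventory = lot_size / 2
    cost = shipments * ship_cost + avg_inventory * hold_cost
    emissions = shipments * ship_emis + avg_inventory * hold_emis
    return cost, emissions

lot_sizes = np.arange(10, 501, 10)
for w in (0.0, 0.25, 0.5, 0.75, 1.0):      # weight placed on emissions
    scores = [(1 - w) * c + w * e for c, e in map(plan_cost_emissions, lot_sizes)]
    best = lot_sizes[int(np.argmin(scores))]
    print(f"emissions weight {w:.2f} -> preferred lot size {best}")
```

    Scanning the weight from 0 to 1 traces a simple cost/emission frontier: the cost-only optimum favours large lots (few shipments), while the emissions-weighted optima shrink the lot size as inventory-related emissions are penalized more heavily.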

    Data analytics for mobile traffic in 5G networks using machine learning techniques

    This thesis collects the research work I pursued as a Ph.D. candidate at the Universitat Politecnica de Catalunya (UPC). Most of the work was carried out in the Mobile Network Department of the Centre Tecnologic de Telecomunicacions de Catalunya (CTTC). The main topic of my research is the study of mobile network traffic through the analysis of operative network datasets using machine learning techniques. Understanding actual network deployments is fundamental for next-generation (5G) networks to improve users’ performance and Quality of Service (QoS). The work starts from the collection of a novel type of dataset, using an over-the-air monitoring tool that extracts control information from the radio-link channel without compromising users’ identities. The subsequent analysis comprises a statistical characterization of the traffic and the derivation of prediction models for the network traffic. A wide set of algorithms is implemented and compared in order to identify the best-performing ones. Moreover, the thesis addresses a set of applications that will be central to future mobile networks: the detection of urban anomalies, the classification of users based on the network services they demand, and the design of a proactive wake-up scheme for energy-efficient devices.
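    As a generic illustration of traffic prediction on such data (the series, window length and regressor below are assumptions, not the models derived in the thesis), this sketch fits a sliding-window least-squares predictor to a synthetic hourly cell-load series.

```python
import numpy as np

def make_windows(series, window=24):
    """Turn an hourly traffic series into (past-window, next-hour) training pairs."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = np.array(series[window:])
    return X, y

rng = np.random.default_rng(42)
hours = np.arange(24 * 60)
# Synthetic cell load: daily cycle plus noise, a stand-in for real monitored traffic.
traffic = 100 + 40 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)

X, y = make_windows(traffic)
X = np.hstack([X, np.ones((X.shape[0], 1))])          # bias term
w, *_ = np.linalg.lstsq(X, y, rcond=None)             # ordinary least squares fit

last_day = np.append(traffic[-24:], 1.0)
print(f"forecast for the next hour: {last_day @ w:.1f}")
```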

    Production Engineering and Management

    The annual International Conference on Production Engineering and Management takes place for the sixth time this year and can therefore be considered a well-established event, the result of the joint effort of the OWL University of Applied Sciences and the University of Trieste. The conference was established as an annual meeting under the Double Degree Master Program ‘Production Engineering and Management’ by the two partner universities. The main goal of the conference is to provide an opportunity for students, researchers and professionals from Germany, Italy and abroad to meet and exchange information, discuss experiences, specific practices and technical solutions used in the planning, design and management of production and service systems. In addition, the conference is a platform for presenting research projects, introducing young academics to the tradition of symposia and promoting the exchange of ideas between industry and academia. Contributions from successful graduates of the Double Degree Master Program ‘Production Engineering and Management’ and from other postgraduate researchers from several European countries have been particularly emphasized. This year’s special focus is on Direct Digital Manufacturing in the context of Industry 4.0, a topic of great interest for global industry. The concept is spreading, but actual solutions must be presented in order to highlight the practical benefits to industry and customers. Indeed, as Henning Banthien, Secretary General of the German ‘Plattform Industrie 4.0’ project office, has recently remarked, “Industry 4.0 requires a close alliance amongst the private sector, academia, politics and trade unions” in order to be “translated into practice and be implemented now”. PEM 2016 takes place on September 29 and 30, 2016 at the OWL University of Applied Sciences in Lemgo. The program is defined by the Organizing and Scientific Committees and clustered into scientific sessions covering topics of main interest and importance to the participants of the conference. The scientific sessions deal with technical and engineering issues as well as management topics, and include contributions by researchers from academia and industry. The extended abstracts and full papers of the contributions underwent a double-blind review process. The 24 accepted presentations are assigned, according to their subject, to one of the following sessions: ‘Direct Digital Manufacturing in the Context of Industry 4.0’, ‘Industrial Engineering and Lean Management’, ‘Management Techniques and Methodologies’, ‘Wood Processing Technologies and Furniture Production’ and ‘Innovation Techniques and Methodologies’.

    Quantum Algorithms for Solving Hard Constrained Optimization Problems

    In this research, combinatorial optimization techniques for solving constraint problems have been examined, together with a study of the quantum era and its market leaders, such as IBM, D-Wave, Google, Xanadu, AWS-Braket and Microsoft. We have learned about their communities, their platforms and the status of their research, and have studied the postulates of quantum mechanics that underpin the most efficient quantum systems and algorithms. To determine whether Constraint Search Problem (CSP) instances can be solved more efficiently with quantum computing, a scenario was defined so that both classical and quantum computing would have a good point of reference. The proof of concept first focuses on the social worker scheduling problem and later on batch preparation and order picking as a generalization of the Social Workers Problem (SWP). The social worker scheduling problem is a combinatorial optimization problem that, at best, can be solved in exponential time; since the SWP is NP-hard, this motivates looking beyond classical computation for its resolution. Today, the interest in quantum computing lies not only in its enormous computing power but also in exploiting its imperfection in this Noisy Intermediate-Scale Quantum (NISQ) era to create a powerful machine learning device that uses the variational principle to solve optimization problems by reducing their complexity class. The thesis proposes a (quadratic) formulation to solve the social worker scheduling problem efficiently using the Variational Quantum Eigensolver (VQE), the Quantum Approximate Optimization Algorithm (QAOA), the Minimal Eigen Optimizer and the ADMM optimizer. The quantum feasibility of the algorithm has been modelled in QUBO form with Docplex, simulated with Cirq and Or-Tools, and tested on IBMQ computers. After analyzing the results of this approach, a scenario was designed to solve the SWP with case-based reasoning, both quantum (qCBR) and classical, thus contributing a quantum algorithm focused on artificial intelligence and machine learning. The qCBR is a machine learning technique that solves new problems by drawing on experience, as humans do. Experience is represented as a case memory containing previously solved problems, and a synthesis technique adapts that experience to the new problem. In the SWP definition, if batches of orders replace patients and mobile robots replace social workers, the objective function and constraints are generalized. To this end, a proof of concept and a new formulation, called qRobot, have been proposed to solve picking and batching problems. This part of the project was prototyped on a Raspberry Pi 4, testing the ability to integrate quantum computing into mobile robotics on one of the most demanded problems in this industrial sector: picking and batching. It was tested on different technologies, and the results were promising. Moreover, in case of computational need, the robot parallelizes part of the operations in hybrid (quantum + classical) computing, accessing CPUs and QPUs distributed in a public or private cloud. A stable environment (ARM64) was also developed inside the robot (Raspberry) to run gradient operations and other quantum algorithms on IBMQ, Amazon Braket (D-Wave) and Pennylane, locally or remotely. To improve the execution time of variational algorithms in this NISQ era and the next, EVA has been proposed: a quantum Exponential Value Approximation algorithm. To date, the VQE is the flagship of quantum computing. On today's market-leading quantum cloud computing platforms, the cost of experimenting with quantum circuits is proportional to the number of circuits executed on those platforms: more circuits, higher cost. One thing the VQE, the flagship of this low-qubit era, achieves is shallow depth by dividing the Hamiltonian into a list of many small circuits (Pauli matrices), but this very fact makes simulating with the VQE very expensive in the cloud. For this reason, EVA was designed to calculate the expected value with a single circuit. Even though the hypothesis of this thesis has been answered by the studies carried out, further research remains to propose new quantum algorithms that improve combinatorial optimization.
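    The thesis encodes the SWP as a QUBO and feeds it to VQE/QAOA-style optimizers; as a rough, library-free sketch of that kind of encoding (a toy two-worker, two-patient instance, not the thesis formulation), the code below builds a small assignment QUBO with penalty constraints and minimizes it by brute force.

```python
import itertools
import numpy as np

# Toy instance: 2 workers x 2 patients, binary variable x[w, p] = "worker w visits patient p".
travel = np.array([[1.0, 4.0],
                   [3.0, 2.0]])        # hypothetical travel costs
n_vars = travel.size                    # 4 binary variables, flattened row-major
P = 10.0                                # penalty weight for constraint violations

Q = np.zeros((n_vars, n_vars))
# Objective: travel costs go on the diagonal.
for idx, c in enumerate(travel.flatten()):
    Q[idx, idx] += c
# Constraint "each patient is visited exactly once" as a quadratic penalty
# P * (sum_w x[w, p] - 1)^2, expanded into linear and pairwise QUBO terms.
for p in range(2):
    vars_p = [w * 2 + p for w in range(2)]
    for i in vars_p:
        Q[i, i] += -P                   # P*x^2 - 2*P*x collapses to -P*x for binary x
    for i, j in itertools.combinations(vars_p, 2):
        Q[i, j] += 2 * P

def qubo_energy(x, Q):
    x = np.asarray(x)
    return float(x @ Q @ x)

best = min(itertools.product([0, 1], repeat=n_vars), key=lambda x: qubo_energy(x, Q))
print("best assignment:", best, "energy:", qubo_energy(best, Q))
```

    On quantum hardware the same Q matrix would be handed to a variational optimizer rather than enumerated exhaustively; the brute-force loop here only serves to check the encoding on a tiny instance.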

    Hybrid Advanced Optimization Methods with Evolutionary Computation Techniques in Energy Forecasting

    More accurate and precise energy demand forecasts are required when energy decisions are made in a competitive environment. Particularly in the Big Data era, forecasting models are always based on complex function combinations, and energy data are always complicated, exhibiting seasonality, cyclicity, fluctuation, dynamic nonlinearity, and so on. When models lack the ability to capture these data characteristics and patterns, the result is an over-reliance on informal judgment and higher expenses. The hybridization of optimization methods and superior evolutionary algorithms can provide important improvements through good parameter determination in the optimization process, which is of great assistance to the actions taken by energy decision-makers. This book aimed to attract researchers with an interest in these research areas. Specifically, it sought contributions on the development of hybrid optimization methods (e.g., quadratic programming techniques, chaotic mapping, fuzzy inference theory, quantum computing, etc.) combined with advanced algorithms (e.g., genetic algorithms, ant colony optimization, particle swarm optimization, etc.) that have capabilities superior to traditional optimization approaches, overcoming some of their embedded drawbacks, and on the application of these advanced hybrid approaches to significantly improve forecasting accuracy.
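    As a minimal sketch of the hybridization the book describes (the data, the Holt-style forecaster and all parameter values are invented for illustration), the code below uses a bare-bones particle swarm to tune the smoothing parameters of an exponential forecaster on synthetic seasonal demand.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(200)
# Synthetic monthly demand: seasonality, slight trend, and noise.
demand = 50 + 10 * np.sin(2 * np.pi * t / 12) + 0.1 * t + rng.normal(0, 2, t.size)

def forecast_error(params, series):
    """One-step-ahead squared error of Holt's exponential smoothing with trend."""
    alpha, beta = np.clip(params, 0.01, 0.99)
    level, trend, sse = series[0], 0.0, 0.0
    for y in series[1:]:
        pred = level + trend
        sse += (y - pred) ** 2
        new_level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
    return sse

def pso(objective, n_particles=20, n_iters=60):
    """Bare-bones particle swarm over (alpha, beta) in [0, 1]^2."""
    pos = rng.random((n_particles, 2))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([objective(p, demand) for p in pos])
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        vals = np.array([objective(p, demand) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest

print("tuned (alpha, beta):", pso(forecast_error))
```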