22 research outputs found

    Applying MILP/Heuristic algorithms to automated job-shop scheduling problems in aircraft-part manufacturing

    This work presents efficient algorithms based on Mixed-Integer Linear Programming (MILP) and heuristic strategies for complex job-shop scheduling problems arising in Automated Manufacturing Systems. The aim of this work is to find an alternative solution approach for scheduling production and transportation operations in a multi-product, multi-stage production system that can solve industrial-scale problems with reasonable computational effort. The MILP model developed must take into account heterogeneous recipes, a single unit per stage, possible recycle flows, sequence-dependent free transferring times, and load transfer movements performed by a single automated material-handling device. In addition, heuristic-based strategies are proposed to iteratively find and improve the solutions generated over time. These approaches were tested on real-world problems arising in the surface-treatment process of metal components in the aircraft manufacturing industry. Affiliations: Aguirre, Adrian Marcelo (Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Científico Tecnológico Santa Fe, Instituto de Desarrollo Tecnológico para la Industria Química, Argentina; Universidad Nacional del Nordeste, Argentina); Mendez, Carlos Alberto (same affiliations); García Sanchez, Alvaro (Universidad Politecnica de Madrid, España); Ortega Mier, Miguel (Universidad Politecnica de Madrid, España).
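    The heuristic side of such approaches can be illustrated with a simple list-scheduling dispatching rule; this is a generic sketch with made-up data, not the authors' MILP/heuristic method:

```python
# Minimal list-scheduling heuristic for job-shop scheduling (illustrative
# sketch only). Each job is an ordered sequence of (machine, duration)
# operations; we repeatedly dispatch the operation that can start earliest.

def greedy_schedule(jobs):
    """Greedy dispatching: returns (schedule, makespan)."""
    job_ready = [0.0] * len(jobs)       # time each job's next op may start
    mach_ready = {}                     # time each machine becomes free
    next_op = [0] * len(jobs)           # index of each job's next operation
    schedule = []                       # (job, machine, start, end)
    remaining = sum(len(j) for j in jobs)
    while remaining:
        best = None
        for j, ops in enumerate(jobs):
            if next_op[j] >= len(ops):
                continue                # job finished
            mach, dur = ops[next_op[j]]
            start = max(job_ready[j], mach_ready.get(mach, 0.0))
            if best is None or start < best[0]:
                best = (start, j, mach, dur)
        start, j, mach, dur = best
        end = start + dur
        schedule.append((j, mach, start, end))
        job_ready[j] = end
        mach_ready[mach] = end
        next_op[j] += 1
        remaining -= 1
    makespan = max(e for _, _, _, e in schedule)
    return schedule, makespan

# Two hypothetical jobs on two machines:
jobs = [[("M1", 3), ("M2", 2)], [("M2", 2), ("M1", 4)]]
sched, makespan = greedy_schedule(jobs)  # makespan 7
```

    Iterative improvement schemes of the kind the abstract mentions would then perturb such a schedule (e.g. by reordering operations on a machine) and keep changes that shorten the makespan.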

    Capse.jl: efficient and auto-differentiable CMB power spectra emulation

    We present Capse.jl, a novel emulator that uses neural networks to predict Cosmic Microwave Background (CMB) temperature, polarization and lensing angular power spectra. The emulator computes predictions in just a few microseconds, with emulation errors below 0.1σ for all the scales relevant to the planned CMB-S4 survey. Capse.jl can also be trained in about an hour on a CPU. As a test case, we use Capse.jl to analyze Planck 2018 data and ACT DR4 data. We obtain the same results as standard analysis methods with a computational efficiency 3 to 6 orders of magnitude higher. We take advantage of the differentiability of our emulators to use gradient-based methods, such as Pathfinder and Hamiltonian Monte Carlo (HMC), which speed up convergence and increase sampling efficiency. Together, these features make Capse.jl a powerful tool for studying the CMB and its implications for cosmology. When using the fastest combination of our likelihoods, emulators, and analysis algorithm, we are able to perform a Planck TT + TE + EE analysis in less than a second. To ensure full reproducibility, we provide open access to the code and data required to reproduce all the results of this work. Comment: 16 pages, 4 figures.
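    The gradient-based sampling that differentiable emulators enable rests on the leapfrog integrator at the core of HMC. A minimal pure-Python sketch for a 1D standard-normal target (a generic illustration; Capse.jl itself is Julia and uses its own samplers):

```python
# Leapfrog integration of Hamiltonian dynamics for target log p(q) = -q^2/2,
# so grad log p(q) = -q. Near-conservation of the Hamiltonian is what gives
# HMC its high acceptance rates.

def grad_logp(q):
    return -q

def leapfrog(q, p, step, n_steps):
    """Integrate for n_steps; returns the proposed (q, p)."""
    p = p + 0.5 * step * grad_logp(q)      # initial half-step for momentum
    for _ in range(n_steps - 1):
        q = q + step * p                   # full position step
        p = p + step * grad_logp(q)        # full momentum step
    q = q + step * p
    p = p + 0.5 * step * grad_logp(q)      # final half-step for momentum
    return q, p

def hamiltonian(q, p):
    return 0.5 * q * q + 0.5 * p * p       # potential + kinetic energy

q, p = 1.0, 0.5
h0 = hamiltonian(q, p)
q_new, p_new = leapfrog(q, p, step=0.05, n_steps=20)
energy_error = abs(hamiltonian(q_new, p_new) - h0)  # small for small steps
```

    A full HMC sampler would wrap this in a Metropolis accept/reject step with freshly drawn momenta; the point here is only that gradients of the (emulated) likelihood are what make the integrator possible.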

    Integrated management of chemical processes in a competitive environment

    This Thesis aims to enhance the decision-making process in supply chain management (SCM), distinguishing between optimizing a supply chain (SC) to be competitive on its own and optimizing it to be competitive in a global market under cooperative and competitive environments. The work is divided into four main parts. Part I consists of a general introduction to the main topics covered in this manuscript (Chapter 1), a review of the state of the art that allows us to identify new open issues in Process Systems Engineering (Chapter 2), and an introduction to the main optimization techniques and methods used in this contribution (Chapter 3). 
    Part II focuses on the integration of decision-making levels in order to improve the decisions of a single SC. Chapter 4 presents a novel formulation that integrates synthesis and scheduling decision-making models; this chapter also presents an integrated operational and control decision-making model for distributed generation systems (EGS). Chapter 5 integrates the tactical and operational decision-making levels: a knowledge-based approach captures information from the operational level, which is then included in the tactical decision-making model. In Chapter 6 a simplified approach for integrated SCs is developed, introducing detailed information about the typical production-distribution SC echelons (suppliers, production plants, distributors, and markets) into a coordinated SC model. Part III proposes the explicit integration of the decision-making of several SCs in order to face realistic global-market situations; a novel optimization tool is developed that combines an MILP model with Game Theory (GT). Chapter 7 includes the tactical and operational analysis of several SCs cooperating or competing for the global market demand. Chapter 8 compares the previous results (the MILP-GT optimization tool) with a two-stage stochastic optimization model; the results allow the competitors' behavior to be interpreted as an exogenous source of the uncertainty typically associated with market demand. Results from both chapters show that cooperating to serve the global demand jointly yields a substantial improvement in the overall total cost. Consequently, Chapter 9 presents a bargaining tool obtained by the multi-objective (MO) resolution of the model presented in Chapter 7. Finally, conclusions and further work are provided in Part IV. Postprint (published version).

    Optimization of crude oil operations scheduling by applying a two-stage stochastic programming approach with risk management

    This paper focuses on the problem of crude oil operations scheduling carried out in a system composed of a refinery and a marine terminal, considering uncertainty in the arrival dates of the ships that supply the crudes. To tackle this problem, we develop a two-stage stochastic mixed-integer nonlinear programming (MINLP) model based on a continuous-time representation. Furthermore, we extend the proposed model to include risk management by using the Conditional Value-at-Risk (CVaR) measure as the objective function, and we analyze the solutions obtained for different risk levels. Finally, to evaluate the solution obtained, we calculate the Expected Value of Perfect Information (EVPI) and the Value of the Stochastic Solution (VSS) to assess whether the two-stage stochastic programming model offers any advantage over simpler deterministic approaches. Funding: Gobierno de España, projects a-CIDiT (PID2021-123654OB-C31) and InCo4In (PGC 2018-099312-B-C31); Junta de Castilla y León - EU-FEDER (CLU 2017-09, CL-EI-2021-07, UIC 233).
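    The EVPI and VSS diagnostics mentioned above can be illustrated on a toy two-stage problem. The sketch below uses a simple newsvendor stand-in with made-up numbers, not the paper's crude-oil MINLP; for a maximization problem, EVPI = WS − RP and VSS = RP − EEV, both nonnegative:

```python
# Toy two-stage problem: first stage picks an order quantity x at unit
# cost; second stage sells min(x, demand) at a unit price, with demand
# revealed according to a discrete scenario distribution.

COST, PRICE = 1.0, 2.5
scenarios = [(10, 0.5), (20, 0.3), (30, 0.2)]   # (demand, probability)

def profit(x, d):
    return PRICE * min(x, d) - COST * x

def expected_profit(x):
    return sum(p * profit(x, d) for d, p in scenarios)

candidates = [d for d, _ in scenarios]           # candidate order sizes

# RP: here-and-now stochastic solution (best expected profit)
rp = max(expected_profit(x) for x in candidates)

# WS: wait-and-see bound (demand known before ordering)
ws = sum(p * profit(d, d) for d, p in scenarios)

# EEV: fix x at the optimum of the mean-value problem, then evaluate it
# against the full scenario set
mean_demand = sum(p * d for d, p in scenarios)
eev = expected_profit(mean_demand)

evpi = ws - rp   # what perfect forecasts would be worth
vss = rp - eev   # gain from modeling uncertainty explicitly
```

    A deterministic mean-value model looks adequate until EEV is evaluated against all scenarios; a strictly positive VSS is exactly the advantage the paper's assessment is checking for.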

    Primary schools as community hubs: a review of the literature

    Considerable work has been undertaken over several years to establish primary schools as community hubs in the City of Hume, through the Hubs Strategy Group for the Hume Communities for Children Initiative and, more recently, the Supporting Parents Developing Children project. This work has highlighted the need for a primary school community hub toolkit. The purpose of this review is to inform the development of a resource (e.g. a toolkit) that other schools can use to establish themselves as community hubs. The literature has not reached an agreed definition of schools as community hubs; rather, the notion is understood in a variety of ways. For the purposes of this review we draw on the definitions provided by Black (2008) and the Hubs Strategy Group for the Broadmeadows Communities for Children Initiative (2009). Black (2008) describes hubs as involving 'collaboration between school education systems and the other sectors (community, business, local government and philanthropy) to support the learning and wellbeing of young people, especially those facing disadvantage' (p. 6). These collaborations can range from sharing, co-locating or joint use of physical facilities, through to schools as the centre of a hub or precinct that offers multiple services for the whole community. In the City of Hume, the Hubs Strategy Group has conceptualised a hub as a welcoming place for families that engages key service providers to work collaboratively. A hub can be a single location or a network of places working together to provide services, such as schools, kindergartens, maternal and child health, and other relevant agencies. Hubs facilitate connections between key services and professionals and represent a paradigm shift in the planning and practice of service provision. Services and their staff are required to rethink existing practice and move to an inclusive-practices framework at a professional and community level.

    Catalan Health Institute: 2012 Annual Report

    Keywords: Public health system; Healthcare activity; Annual report. The Annual Report of the Institut Català de la Salut aims to reflect the reality of the largest healthcare organisation in the country. This document gathers the work carried out by the institution's professionals, which is oriented towards improving the health of the public.

    Compositional construction and analysis of Petri net systems


    Explainable methods for knowledge graph refinement and exploration via symbolic reasoning

    Knowledge Graphs (KGs) have applications in many domains such as Finance, Manufacturing, and Healthcare. While recent efforts have created large KGs, their content is far from complete and sometimes includes invalid statements. Therefore, it is crucial to refine the constructed KGs to enhance their coverage and accuracy via KG completion and KG validation. It is also vital to provide human-comprehensible explanations for such refinements, so that humans have trust in the KG quality. Enabling KG exploration, by search and browsing, is also essential for users to understand the KG's value and limitations with respect to downstream applications. However, the large size of KGs makes KG exploration very challenging. While the type taxonomy of a KG is a useful asset along these lines, it remains insufficient for deep exploration. In this dissertation we tackle the aforementioned challenges of KG refinement and KG exploration by combining logical reasoning over the KG with other techniques, such as KG embedding models and text mining, introducing methods that provide human-understandable output. Concretely, we tackle KG incompleteness by learning exception-aware rules over the existing KG, which are then used to accurately infer missing links. Furthermore, we propose a framework for constructing human-comprehensible explanations for candidate facts from both the KG and text; the extracted explanations are used to ensure the validity of KG facts. Finally, to facilitate KG exploration, we introduce a method that combines KG embeddings with rule mining to compute informative entity clusters with explanations. 
    The dissertation makes the following contributions: 
    • For KG completion, we present ExRuL, a method for revising Horn rules by adding exception conditions to the rule bodies. The revised rules can infer new facts and thus close gaps in the KG. Experiments on large KGs show that this method substantially reduces errors in the inferred facts and yields user-friendly explanations. 
    • With RuLES, we present a rule-learning method based on probabilistic representations of missing facts. The method iteratively extends rules induced from a KG by combining neural KG embeddings with information from text corpora, using new rule-quality metrics during rule generation. Experiments show that RuLES substantially improves the quality of the learned rules and their predictions. 
    • To support KG validation, we present ExFaKT, a framework for constructing explanations for fact candidates. The method uses rules to transform a candidate into a set of statements that are easier to find and to validate or refute. The output of ExFaKT is a set of semantic evidences for fact candidates, extracted from text corpora and the KG. Experiments show that these transformations substantially improve the yield and quality of the discovered explanations, which support both manual KG validation by curators and automatic validation. 
    • To support KG exploration, we present ExCut, a method for generating informative entity clusters with explanations, using KG embeddings and automatically induced rules. A cluster explanation is a combination of relations among the entities that identifies the cluster. ExCut improves cluster quality and cluster explainability simultaneously by iteratively interleaving the learning of embeddings and rules. Experiments show that ExCut computes high-quality clusters and that the cluster explanations are informative for users.

    Comparative study of in vivo and in vitro embryogenesis in Arabidopsis thaliana L.
