17 research outputs found

    Interaction-Aware Motion Planning for Automated Vehicles

    Motion planning for automated vehicles (AVs) in mixed traffic is a challenging task. Here, mixed traffic denotes traffic consisting of human-driven as well as automated vehicles. To reduce the complexity of the task, state-of-the-art planning approaches often use the simplifying assumption that the future behavior of surrounding vehicles can be predicted independently of the AV's plan. While separating prediction and planning is a helpful simplification for many traffic situations, it ignores interactions between traffic participants, which can lead to suboptimal, overly conservative driving behavior, especially in interactive traffic situations. In this work, two interaction-aware motion planning algorithms are proposed that are able to reduce overly conservative driving behavior. The core aspect of these algorithms is that prediction and planning are solved simultaneously. With these algorithms, demanding driving maneuvers, such as zipper merging in dense traffic, can be performed that are not possible with state-of-the-art planning approaches. The first algorithm is based on methods from multi-agent planning. Interactions between traffic participants are approximated by optimizing coupled trajectories under a joint cost function. The centerpiece of the algorithm is a novel multi-agent trajectory planning formulation based on mixed-integer quadratic programming (MIQP). The formulation guarantees globally optimal solutions and is thus able to solve the combinatorial problem that restricts continuous methods to locally optimal solutions. Furthermore, the presented approach can produce maneuver-neutral behavior that postpones maneuver decisions in uncertain situations. The second approach formulates interactions between a human driver and an AV as a Stackelberg game. In contrast to existing work, the algorithm can take general nonlinear state and input constraints into account. Furthermore, we introduce mechanisms for integrating cooperation and courtesy into the planning. This prevents overly aggressive driving behavior, which has been identified in the literature as a problem of interaction-aware planning methods. The effectiveness, robustness, and real-time capability of the algorithm are demonstrated through numerical experiments.
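    As a rough illustration of the core idea, prediction and planning solved jointly, the following sketch optimizes the trajectories of two coupled vehicles under a single cost function. It is not the thesis's MIQP or Stackelberg formulation: the one-dimensional dynamics, horizon, weights, and goals are invented for illustration.

```python
# Illustrative sketch of coupled trajectory optimization with a joint cost,
# as opposed to predicting the other vehicle first and planning afterwards.
# NOT the thesis's MIQP/Stackelberg formulation; the 1-D double-integrator
# dynamics, horizon, weights, and goals below are invented for illustration.
import numpy as np
from scipy.optimize import minimize

T, dt = 20, 0.2                     # horizon steps and step size [s]
goal_av, goal_hv = 40.0, 35.0       # hypothetical longitudinal goals [m]

def rollout(accels, x0, v0):
    """Integrate 1-D double-integrator dynamics for one vehicle."""
    xs, vs = [x0], [v0]
    for a in accels:
        vs.append(vs[-1] + a * dt)
        xs.append(xs[-1] + vs[-1] * dt)
    return np.array(xs)

def joint_cost(u):
    """Single cost over BOTH trajectories: goals, effort, and separation."""
    u_av, u_hv = u[:T], u[T:]
    x_av = rollout(u_av, 0.0, 10.0)
    x_hv = rollout(u_hv, 5.0, 9.0)
    gap = x_hv - x_av                      # soft minimum-gap penalty
    return ((x_av[-1] - goal_av) ** 2 + (x_hv[-1] - goal_hv) ** 2
            + 0.1 * (u ** 2).sum()
            + 100.0 * np.clip(2.0 - np.abs(gap), 0.0, None).sum())

res = minimize(joint_cost, np.zeros(2 * T), method="L-BFGS-B",
               bounds=[(-3.0, 2.0)] * (2 * T))   # comfort limits on accel
print("joint cost:", round(res.fun, 2))
```

    A continuous solver like this finds only a local optimum; the thesis's MIQP formulation adds integer variables for discrete maneuver choices, which is what enables globally optimal solutions.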

    Non-Traditional Flow Shop Scheduling Using CSP

    This paper addresses the problem of scheduling in a flow shop manufacturing environment with non-traditional requirements, where some jobs must be scheduled at their earliest time and others at their latest, depending on the priority established by the characteristics of the demand to be met. The problem is formulated mathematically, and given its nonlinearity, we propose a CSP (Constraint Satisfaction Problem) model, formulated using constraint programming in the software OPL Studio®. A set of experiments was performed by varying the weighting of the jobs; the deadlines and the waiting times between machines were varied as well. Different production schedules were obtained according to the type of experiment, thus solving the non-traditional scheduling problem. Keywords: Scheduling, Operations Programming, Flow-Shop Manufacturing Environment, Constraint Programming.
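    The paper's CSP model is written in OPL Studio and is not reproduced here; the following is an analogous constraint-programming sketch in Python using OR-Tools CP-SAT, with invented durations, weights, and deadlines.

```python
# Constraint-programming sketch of a weighted flow shop, loosely analogous to
# the paper's CSP model (which uses OPL Studio). Durations, weights, and
# deadlines are invented; requires `pip install ortools`.
from ortools.sat.python import cp_model

durations = [[3, 2], [2, 4], [4, 3]]   # job x machine processing times
weights = [5, 1, 3]                    # demand-driven priorities
deadlines = [12, 20, 15]
n_jobs, n_mach = len(durations), len(durations[0])
horizon = sum(sum(row) for row in durations)

model = cp_model.CpModel()
start, end = {}, {}
intervals = [[] for _ in range(n_mach)]
for j in range(n_jobs):
    for m in range(n_mach):
        s = model.NewIntVar(0, horizon, f"s_{j}_{m}")
        e = model.NewIntVar(0, horizon, f"e_{j}_{m}")
        intervals[m].append(
            model.NewIntervalVar(s, durations[j][m], e, f"i_{j}_{m}"))
        start[j, m], end[j, m] = s, e
        if m > 0:                                  # flow shop: fixed route
            model.Add(s >= end[j, m - 1])
    model.Add(end[j, n_mach - 1] <= deadlines[j])  # per-job deadline
for m in range(n_mach):
    model.AddNoOverlap(intervals[m])               # one job per machine

# Weighted completion time: high-weight jobs are pushed earlier.
model.Minimize(sum(weights[j] * end[j, n_mach - 1] for j in range(n_jobs)))

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for j in range(n_jobs):
        print(f"job {j}: " + ", ".join(
            f"m{m}@{solver.Value(start[j, m])}" for m in range(n_mach)))
```

    Re-running with different weights or deadlines reproduces the spirit of the paper's experiments: the schedule shifts to favor whichever jobs the demand prioritizes.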

    Models, Algorithms, and Software for Forecasting Electricity Prices Based on Statistical Methods and Artificial Neural Networks

    Object of research: the process of optimizing models for time series forecasting. Subject of research: methods and models for forecasting electricity prices. Aim of the thesis: reducing enterprises' electricity costs by optimizing their consumption and/or generation schedules. Research methods: correlation analysis, autoregressive and moving-average models, the seasonal Box-Jenkins model, and spectral analysis were used for data analysis; mathematical and statistical methods as well as feedforward neural networks were used to build the forecasting models. The scientific novelty of the thesis lies in the improvement of methods for forecasting electricity prices. Its practical value is that the models and methods proposed in the study allow enterprises to reduce costs or maximize profit by optimizing their electricity consumption or generation schedule.
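    As a sketch of the statistical side of the abstract, the following fits a seasonal Box-Jenkins (SARIMA) model to a synthetic hourly price series and forecasts the next day. The model orders and the load-shifting step are placeholders, not the thesis's fitted models.

```python
# Seasonal Box-Jenkins sketch on synthetic hourly electricity prices.
# Orders and data are invented for illustration; requires statsmodels.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
t = np.arange(21 * 24)                           # three weeks of hourly prices
prices = (50 + 10 * np.sin(2 * np.pi * t / 24)   # daily seasonality
          + rng.normal(0, 2, t.size))            # noise

model = SARIMAX(prices, order=(1, 0, 1), seasonal_order=(1, 0, 1, 24))
fit = model.fit(disp=False)
forecast = fit.forecast(steps=24)                # next day's hourly prices

# A consumer could shift flexible load into the cheapest forecast hours:
cheapest = np.argsort(forecast)[:6]
print("cheapest forecast hours of next day:", sorted(cheapest.tolist()))
```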

    Scalable Automatic Service Composition using Genetic Algorithms

    A composition of simple web services, each dedicated to performing a specific sub-task involved, proves to be a more competitive solution than an equivalent atomic web service for a complex requirement comprising several sub-tasks. Composite services have been extensively researched and perfected in many aspects for over two decades, owing to benefits such as component re-usability, broader options for composition requesters, and the liberty to specialize for component providers. However, most studies in this field fail to acknowledge that each web service has a limited context in which it can successfully perform its tasks, with boundaries defined by the internal constraints imposed on the service by its providers. The restricted context-spaces of all such component services together define the contextual boundaries of the composite service as a whole, making internal constraints an essential factor in composite service functionality. Due to their limited exposure, no systems have yet been proposed that cater to the specific verification of internal constraints imposed on components of a composite service. In this thesis, we propose a scalable automatic service composition system capable of not only automatically constructing context-aware composite web services with internal constraints positioned for optimal resource utilization, but also validating the generated compositions on a large-scale solution repository, using the General Intensional Programming System (GIPSY) as a time- and cost-efficient simulation/execution environment.
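    A toy sketch of the genetic-algorithm idea follows: one candidate service is chosen per sub-task, and compositions whose components' internal constraints are violated in the required context are penalized. The candidate pool, costs, and the max-load constraint model are invented for illustration.

```python
# Toy genetic algorithm for service composition: pick one candidate service
# per sub-task so that cost is low and each service's internal constraint
# (here: a max-load context bound) holds. All data is invented.
import random

random.seed(1)
N_TASKS, N_CANDS, POP, GENS = 6, 4, 30, 40
# candidates[task] = list of (cost, max_load) per provider
candidates = [[(random.randint(1, 9), random.randint(3, 8))
               for _ in range(N_CANDS)] for _ in range(N_TASKS)]
REQUIRED_LOAD = 5   # context the composition must operate in

def fitness(genome):
    cost = sum(candidates[t][g][0] for t, g in enumerate(genome))
    # internal-constraint violations: provider cannot handle required load
    violations = sum(1 for t, g in enumerate(genome)
                     if candidates[t][g][1] < REQUIRED_LOAD)
    return cost + 100 * violations      # heavy penalty drives GA to feasible

def crossover(a, b):
    cut = random.randrange(1, N_TASKS)
    return a[:cut] + b[cut:]

def mutate(g):
    g = list(g)
    g[random.randrange(N_TASKS)] = random.randrange(N_CANDS)
    return g

pop = [[random.randrange(N_CANDS) for _ in range(N_TASKS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    elite = pop[: POP // 3]              # truncation selection
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(POP - len(elite))]
best = min(pop, key=fitness)
print("best composition:", best, "fitness:", fitness(best))
```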

    Execution/Simulation of Context/Constraint-aware Composite Services using GIPSY

    For fulfilling a complex requirement comprising several sub-tasks, a composition of simple web services, each of which is dedicated to performing a specific sub-task involved, proves to be a more competent solution than an equivalent atomic web service. Owing to advantages such as re-usability of components, broader options for composition requesters, and the liberty to specialize for component providers, composite services have been extensively researched for over two decades, to the point of being perfected in many aspects. Yet most of the studies undertaken in this field fail to acknowledge that every web service has a limited context in which it can successfully perform its tasks, the boundaries of which are defined by the internal constraints placed on the service by its providers. When used as part of a composition, the restricted context-spaces of all such component services together define the contextual boundaries of the composite service as a unit, which makes internal constraints an influential factor in composite service functionality. However, due to the limited exposure they have received, no systems have yet been proposed to cater to the specific verification of internal constraints imposed on components of a composite service. To address this gap in service composition research, in this thesis we propose a multi-faceted solution capable of not only automatically constructing context-aware composite web services with their internal constraints positioned for optimum resource utilization, but also of validating the generated compositions using the General Intensional Programming SYstem (GIPSY) as a time- and cost-efficient simulation/execution environment.
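    The validation step can be sketched independently of GIPSY: each component's internal constraints are checked against the composition's actual execution context. The Service record and the constraints below are invented; the thesis performs this validation within GIPSY itself.

```python
# Plain-Python sketch of internal-constraint validation: before (or while)
# executing a composition, check each component's internal constraints
# against the actual execution context. GIPSY itself is not modeled here;
# the Service record and example constraints are invented for illustration.
from dataclasses import dataclass
from typing import Callable, Dict, List

Context = Dict[str, float]

@dataclass
class Service:
    name: str
    constraints: List[Callable[[Context], bool]]  # internal constraints

def validate(composition: List[Service], ctx: Context) -> List[str]:
    """Return the names of components whose internal constraints fail."""
    return [s.name for s in composition
            if not all(check(ctx) for check in s.constraints)]

payment = Service("payment", [lambda c: c["amount"] <= 1000.0])
shipping = Service("shipping", [lambda c: c["weight_kg"] <= 30.0,
                                lambda c: c["distance_km"] <= 500.0])

failures = validate([payment, shipping],
                    {"amount": 250.0, "weight_kg": 42.0, "distance_km": 120.0})
print("constraint violations in:", failures)   # -> ['shipping']
```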

    An Energy Economic Model for Electricity Generation in the United States


    Formal Foundations for Information-Preserving Model Synchronization Processes Based on Triple Graph Grammars

    Restoring consistency between different information-sharing artifacts after one of them has been changed is an important problem that arises in several areas of computer science. In this thesis, we provide a solution to the basic model synchronization problem. There, a pair of such artifacts (models), one of which has been changed, is given, and consistency shall be restored. Triple graph grammars (TGGs) are an established and suitable formalism to address this and related problems. Being based on the algebraic theory of graph transformation and (double-)pushout rewriting, they are especially suited to develop solutions whose properties can be formally proven. Despite being established, many TGG-based solutions do not satisfactorily deal with the problem of information loss. When one model is changed, in the process of restoring consistency such solutions may lose information that is only present in the second model, because the synchronization process resorts to restoring consistency by re-translating (large parts of) the updated model. We introduce a TGG-based approach that supports advanced features of TGGs (attributes and negative constraints), is comprehensively formalized, implemented, and is incremental in the sense that it drastically reduces the amount of information loss compared to former approaches. Up to now, a TGG-based approach with these characteristics has not been available. The central contribution of this thesis is to formally develop that approach and to prove its essential properties, namely correctness, completeness, and termination. The crucial new idea in our approach is the use of repair rules, which are special rules that allow one to directly propagate changes from one model to the other instead of resorting to re-translation. To be able to construct and apply these repair rules, we contribute more fundamentally to the theory of algebraic graph transformation. First, we develop a new kind of sequential rule composition. Whereas the conventional composition of rules leads to rules that delete and re-create elements, we can compute rules that preserve such elements instead. Furthermore, technically the setting in which the synchronization process we develop takes place is the category of partial triple graphs, not the one of ordinary triple graphs. Hence, we have to ensure that the elaborate theory of double-pushout rewriting still applies. Therefore, we develop a (category-theoretic) construction of new categories from given ones and show that (i) this construction preserves the axioms that are necessary to develop the theory of double-pushout rewriting and (ii) partial triple graphs can be constructed as such a category. Together, these two more fundamental contributions enable us to develop our solution to the basic model synchronization problem in a fully formal manner and to prove its central properties.
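    The repair-rule idea can be illustrated with a deliberately simplified sketch: after a source-model change, a repair rule patches the corresponding target element in place, so information present only in the target survives. The dictionary-based models and the single rename rule are invented; actual TGG-based synchronization operates on typed, attributed triple graphs.

```python
# Minimal sketch of incremental synchronization via a repair rule: propagate
# a source-model change directly instead of re-translating the model (which
# would discard target-only information). Models, correspondence links, and
# the rename rule are invented; real TGGs work on (triple) graphs.
source = {"c1": {"kind": "Class", "name": "Order"}}
target = {"t1": {"kind": "Table", "name": "Order",
                 "comment": "reviewed by DBA"}}   # target-only information
corr = {"c1": "t1"}                               # correspondence graph

def repair_rename(src_id, new_name):
    """Repair rule for renames: propagate the attribute, keep the rest."""
    source[src_id]["name"] = new_name
    tgt = target[corr[src_id]]
    tgt["name"] = new_name     # propagate the change
    # tgt["comment"] survives; naive re-translation would have dropped it

repair_rename("c1", "PurchaseOrder")
print(target["t1"])            # name updated, 'comment' preserved
```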

    A New Dialect of SOFL: Syntax, Formal Semantics and Tool Support

    Structured Object-Oriented Formal Language (SOFL) is a formal design methodology that combines data flow diagrams and predicates in order to describe processes that can be refined. This yields a very versatile way of describing a system whose properties can be proven rigorously. Data flows are grouped by ports that define from which flows data can be consumed and on which flows data can be generated. Predicates use the Logic of Partial Functions (LPF), whose undefined element is also used to indicate that a data flow carries no data. Over time, SOFL "evolved organically" and a number of features were added, with usability the main consideration for adding a feature. For a formal language to be useful, there must be no uncertainty about a specific design's meaning. In SOFL, there is a possible contradiction between the requirement that a process's precondition must be true when the process fires and the firing rules; this contradiction is due to the use of LPF. Moreover, the semantics (the meaning) of SOFL was not always updated to keep track of the changes made to the language, resulting in outdated and incomplete semantics. This incompleteness is a significant factor motivating the work done in this dissertation. In this dissertation, a dialect of SOFL is created in order to define a semantics. Not all elements of SOFL are included, so that a simpler semantics can be defined; elements that were removed include LPF, classes, and non-deterministic broadcast nodes. The semantics of the dialect is created in a two-step process: first, an intuitive understanding of the dialect is established; second, both static and dynamic semantics are defined by means of translations. A translation is a mapping from the dialect to a formal language that describes a certain aspect of the dialect. The static semantics defines the meaning of the elements that are "fixed" in their state; SMT-LIB is used as the target language for the static semantics. The dynamic semantics describes how the elements of a design change over time; the process algebra mCRL2 is used as the formal language describing the dynamic behaviour of the dialect. The SMT solver Z3 and the tools included in mCRL2 are used to analyse the translations of a design. These tools allow the properties a design needs in order to have a well-defined meaning to be proven, including that a process can fire, that a process can fire infinitely often, and that a predicate describing a desired property holds. An Eclipse plug-in was created so that the translations do not have to be done manually: after a design is translated, Z3 and mCRL2 are run via script files and the results of the analysis are displayed on the screen. The desired properties could be proven for moderately sized designs, but as the size of a design increased, the analysis of the translations could not be completed due to computational cost. Usability of the tool could be further improved by complementing the textual representation of a design with visual representations, as in SOFL. Dissertation (MSc), University of Pretoria, 2018.
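    One of the properties listed, "a process can fire", amounts to satisfiability of the process's precondition. The dissertation emits SMT-LIB for Z3 (and mCRL2 for the dynamic behaviour); the sketch below uses Z3's Python API directly on an invented 'withdraw' process.

```python
# Sketch of one provable property from the abstract, "a process can fire",
# checked as satisfiability of the process's precondition. The dissertation
# translates designs to SMT-LIB text; here Z3's Python API is used directly,
# and the 'withdraw' process is invented for illustration.
from z3 import And, Int, Solver, sat

balance, amount = Int("balance"), Int("amount")
# Precondition of a hypothetical 'withdraw' process on its input flows:
pre = And(amount > 0, balance >= amount)

s = Solver()
s.add(pre)
if s.check() == sat:
    print("process can fire, e.g. with", s.model())
else:
    print("precondition unsatisfiable: process can never fire")
```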

    A Survey of Challenges for Runtime Verification from Advanced Application Domains (Beyond Software)

    Runtime verification is an area of formal methods that studies the dynamic analysis of execution traces against formal specifications. Typically, the two main activities in runtime verification efforts are the process of creating monitors from specifications, and the algorithms for the evaluation of traces against the generated monitors. Other activities involve the instrumentation of the system to generate the trace and the communication between the system under analysis and the monitor. Most applications of runtime verification have focused on the dynamic analysis of software, even though there are many more potential applications to other computational devices and target systems. In this paper, we present a collection of challenges for runtime verification extracted from concrete application domains, focusing on the difficulties that must be overcome to tackle these specific challenges. The computational models that characterize these domains require devising new techniques beyond the current state of the art in runtime verification.
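    The two main activities named above, generating a monitor from a specification and evaluating a trace against it, can be sketched for a simple response property. The property and event names below are invented; real RV tools synthesize such monitors from temporal-logic specifications.

```python
# Sketch of the monitor-generation and trace-evaluation activities. The
# property, "every 'req' is eventually followed by an 'ack'", and the event
# names are invented for illustration.
def make_response_monitor(trigger: str, response: str):
    """Build a monitor: every `trigger` must be answered by a `response`."""
    def monitor(trace):
        pending = 0
        for event in trace:            # online: one event at a time
            if event == trigger:
                pending += 1
            elif event == response and pending:
                pending -= 1
        return pending == 0            # verdict at the end of a finite trace
    return monitor

monitor = make_response_monitor("req", "ack")
print(monitor(["req", "log", "ack", "req", "ack"]))   # True
print(monitor(["req", "req", "ack"]))                 # False: one unanswered
```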

    Estimates of Forest Characteristics Derived from Remotely Sensed Imagery and Field Samples: Applicable Scales, Appropriate Study Design, and Relevance to Forest Management

    Information and knowledge about a given forested landscape drive forest management decisions. Within forest management, though, information that adequately describes various characteristics of the forested environment in the spatial detail desired to make fully informed management decisions is often limited. Key metrics such as species composition, tree basal area, and tree density are typically too expensive to collect using ground-based inventory methods alone across the broad extents required for forest-level planning (thousands of ha) at the fine spatial detail that permits use at tactical spatial scales (tens of ha). However, quantifying these metrics accurately, in spatial detail, across broad landscapes is important to inform the management process. While relating remotely sensed data to classical ground-based survey data through modeling has shown promise for describing landscapes at the spatial detail needed to inform planning- and tactical-scale projects, questions remain related to integrating both sources of data, sample design, and linking plots to remotely sensed data. This dissertation addresses critical aspects of these questions by: quantifying and mitigating the impact of co-registration errors; comparing various sample designs and estimation techniques using simulated ground-based information, remotely sensed data, and a variety of modeling techniques; developing enhanced image normalization routines; and creating an ensemble approach to estimating various forest characteristics that describe species composition, basal area, and tree density. In doing so, it addresses knowledge gaps in the fields of forestry, remote sensing, data science, and decision science that can be used to efficiently and effectively inform the natural resource management decision-making process at fine spatial resolutions across broad extents.
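    The plot-to-pixel modeling step described above can be sketched as follows: ground-measured basal area is related to remotely sensed predictors and then predicted for unvisited raster cells. Synthetic data and a single random forest stand in for the dissertation's ensemble approach and real imagery.

```python
# Sketch of relating field plots to remotely sensed predictors, then
# predicting wall-to-wall. Synthetic data and one random forest stand in for
# the dissertation's ensemble and real imagery; requires scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_plots = 200
# Hypothetical per-plot predictors: two spectral bands, NDVI, elevation.
X = rng.uniform(0, 1, size=(n_plots, 4))
basal_area = (30 * X[:, 2] + 5 * X[:, 3]        # driven by NDVI/elevation
              + rng.normal(0, 2, n_plots))      # plus plot-level noise

model = RandomForestRegressor(n_estimators=300, random_state=0)
print("CV R^2:", cross_val_score(model, X, basal_area, cv=5).mean().round(2))

model.fit(X, basal_area)
pixels = rng.uniform(0, 1, size=(5, 4))         # unvisited raster cells
print("predicted basal area (m^2/ha):", model.predict(pixels).round(1))
```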