
    Hedging of Financial Derivative Contracts via Monte Carlo Tree Search

    The construction of approximate replication strategies for derivative contracts in incomplete markets is a key problem of financial engineering. Recently, Reinforcement Learning algorithms for pricing and hedging under realistic market conditions have attracted significant interest. While financial research has mostly focused on variations of Q-learning, in Artificial Intelligence Monte Carlo Tree Search is the recognized state-of-the-art method for a variety of planning problems, such as the games of Hex, Chess, and Go. This article introduces Monte Carlo Tree Search as a method to solve the stochastic optimal control problem underlying the pricing and hedging of financial derivatives. Compared to Q-learning, it combines reinforcement learning with tree search techniques. As a consequence, Monte Carlo Tree Search has higher sample efficiency, is less prone to overfitting to specific market models, and generally learns stronger policies faster. In our experiments we find that Monte Carlo Tree Search, the world-champion method in games such as Chess and Go, is easily capable of directly maximizing the utility of the investor's terminal wealth without an intermediate mathematical theory.
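
    As a rough illustration of the general approach (not the authors' implementation), the sketch below runs a plain open-loop UCT-style Monte Carlo Tree Search over a small grid of hedge ratios for a short European call, scoring rollouts by an exponential utility of terminal wealth. The market model, the utility, the action grid, and the open-loop simplification are all assumptions made for this example.

        # Minimal open-loop UCT sketch for hedging a short European call.
        # Assumptions (not from the paper): geometric Brownian motion paths,
        # exponential utility, a coarse grid of hedge ratios, open-loop search.
        import math
        import random

        S0, STRIKE, SIGMA, STEPS = 100.0, 100.0, 0.2, 4
        DT = 1.0 / STEPS
        ACTIONS = [i / 10 for i in range(11)]        # hedge ratios 0.0 .. 1.0
        RISK_AVERSION = 0.1

        def rollout_utility(actions):
            """Utility of terminal wealth for one random price path under 'actions'."""
            price, cash, held = S0, 0.0, 0.0
            for target in actions:
                cash -= (target - held) * price      # rebalance to the target ratio
                held = target
                price *= math.exp(-0.5 * SIGMA ** 2 * DT
                                  + SIGMA * math.sqrt(DT) * random.gauss(0.0, 1.0))
            wealth = cash + held * price - max(price - STRIKE, 0.0)  # short call payoff
            return -math.exp(-RISK_AVERSION * wealth)

        class Node:
            def __init__(self):
                self.children, self.visits, self.value_sum = {}, 0, 0.0

        def uct_child(node):
            """Pick the child maximizing mean value plus an exploration bonus."""
            return max(node.children.items(),
                       key=lambda kv: kv[1].value_sum / kv[1].visits
                       + math.sqrt(2.0 * math.log(node.visits) / kv[1].visits))

        def mcts(iterations=20000):
            root = Node()
            for _ in range(iterations):
                node, actions = root, []
                while len(actions) < STEPS:          # selection / expansion
                    if len(node.children) < len(ACTIONS):
                        a = random.choice([a for a in ACTIONS if a not in node.children])
                        node.children[a] = Node()
                    else:
                        a, _ = uct_child(node)
                    node = node.children[a]
                    actions.append(a)
                utility = rollout_utility(actions)   # simulation on a fresh path
                node = root                          # backpropagation
                node.visits += 1
                for a in actions:
                    node = node.children[a]
                    node.visits += 1
                    node.value_sum += utility
            best_action, child = max(root.children.items(), key=lambda kv: kv[1].visits)
            return best_action, child.value_sum / child.visits

        if __name__ == "__main__":
            print(mcts())

    A closed-loop variant would key tree nodes on the observed market state (time and discretised price) rather than on the action prefix alone, which is closer to how tree search is normally applied to stochastic control problems.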

    Fuzzy Description Logics with General Concept Inclusions

    Description logics (DLs) are used to represent knowledge of an application domain and provide standard reasoning services to infer consequences of this knowledge. However, classical DLs are not suited to represent vagueness in the description of the knowledge. We consider a combination of DLs and Fuzzy Logics to address this task. In particular, we consider the t-norm-based semantics for fuzzy DLs introduced by Hájek in 2005. Since then, many tableau algorithms have been developed for reasoning in fuzzy DLs. Another popular approach is to reduce fuzzy ontologies to classical ones and use existing highly optimized classical reasoners to deal with them. However, a systematic study of the computational complexity of the different reasoning problems is so far missing from the literature on fuzzy DLs. Recently, some of the developed tableau algorithms have been shown to be incorrect in the presence of general concept inclusion axioms (GCIs). In some fuzzy DLs, reasoning with GCIs has even turned out to be undecidable. This work provides a rigorous analysis of the boundary between decidable and undecidable reasoning problems in t-norm-based fuzzy DLs, in particular for GCIs. Existing undecidability proofs are extended to cover large classes of fuzzy DLs, and decidability is shown for most of the remaining logics considered here. Additionally, the computational complexity of reasoning in fuzzy DLs with semantics based on finite lattices is analyzed. For most decidability results, tight complexity bounds can be derived.
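
    Purely as an illustration of the t-norm-based semantics mentioned above (the interpretation, the names, and the grid approximation are assumptions for this example, not material from the thesis), the following sketch computes the degree to which a GCI C ⊑ D holds in a toy finite fuzzy interpretation under the Gödel, product, and Łukasiewicz t-norms.

        # Illustrative only: three common t-norms, their residua (approximated
        # on a grid), and the degree of a GCI  C ⊑ D  in a toy interpretation.
        def goedel(x, y):
            return min(x, y)

        def product(x, y):
            return x * y

        def lukasiewicz(x, y):
            return max(0.0, x + y - 1.0)

        def residuum(tnorm, x, y, grain=1000):
            """sup{z in [0,1] : tnorm(x, z) <= y}, approximated on a finite grid."""
            return max(z / grain for z in range(grain + 1) if tnorm(x, z / grain) <= y)

        def gci_degree(tnorm, c, d):
            """Degree of C ⊑ D: infimum over the domain of the residuum of C(a) and D(a)."""
            return min(residuum(tnorm, c[a], d[a]) for a in c)

        # toy fuzzy interpretation over the domain {a, b}
        C = {"a": 0.8, "b": 0.3}
        D = {"a": 0.6, "b": 0.9}
        for name, tnorm in [("Goedel", goedel), ("product", product), ("Lukasiewicz", lukasiewicz)]:
            print(name, round(gci_degree(tnorm, C, D), 3))   # 0.6, 0.75, 0.8

    In the finite-lattice setting also treated by the thesis, the grid approximation above would be replaced by the exact lattice operations, since suprema and infima are then computable directly.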

    Algebraic decoder specification: coupling formal-language theory and statistical machine translation

    The specification of a decoder, i.e., a program that translates sentences from one natural language into another, is an intricate process, driven by the application and lacking a canonical methodology. The practical nature of decoder development inhibits the transfer of knowledge between theory and application, which is unfortunate because many contemporary decoders are in fact related to formal-language theory. This thesis proposes an algebraic framework in which a decoder is specified by an expression built from a fixed set of operations. So far, this framework accommodates contemporary syntax-based decoders, it spans two levels of abstraction, and, above all, it encourages mutual stimulation between the theory of weighted tree automata and the application.
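
    To make the idea of a decoder specified as an expression over a fixed set of operations more concrete, here is a minimal sketch in which the decoder is just the composition of three operations on weighted hypothesis sets. The operations, the toy phrase table, and the scoring are invented for this example and are not the algebra developed in the thesis.

        # Illustrative sketch only: a "decoder" as an expression built from a
        # fixed set of operations on weighted hypothesis sets.
        TABLE = {"das": [("the", 0.7), ("that", 0.3)],
                 "haus": [("house", 0.9), ("home", 0.1)]}

        def translate_options(sentence):
            """Operation 1: map each source word to its weighted target options."""
            return [TABLE.get(word, [(word, 1.0)]) for word in sentence.split()]

        def compose(options):
            """Operation 2: combine per-word options into weighted sentence hypotheses."""
            hypotheses = [([], 1.0)]
            for opts in options:
                hypotheses = [(words + [t], score * w)
                              for words, score in hypotheses for t, w in opts]
            return hypotheses

        def best(hypotheses):
            """Operation 3: project onto the best-scoring hypothesis."""
            words, score = max(hypotheses, key=lambda h: h[1])
            return " ".join(words), score

        def decoder(sentence):
            """The decoder is the expression  best ∘ compose ∘ translate_options."""
            return best(compose(translate_options(sentence)))

        print(decoder("das haus"))   # ('the house', 0.63)

    In the thesis the objects being operated on are tree-structured and weighted (the abstract mentions syntax-based decoders and weighted tree automata); the flat word-by-word model above is only meant to show the compositional shape of such a specification.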

    Logische Grundlagen von Datenbanktransformationen für Datenbanken mit komplexen Typen

    Database transformations consist of queries and updates, which are the two fundamental types of computations in any database: the first provides the capability to retrieve data, and the second is used to maintain databases in light of ever-changing application domains. With the rising popularity of web-based applications and service-oriented architectures, the development of database transformations must address new challenges, which frequently call for establishing a theoretical framework that unifies both queries and updates over complex-value databases. This dissertation aims to lay the foundations for such a theoretical framework of database transformations in the context of complex-value databases. We use an approach that has successfully been applied to the characterisation of sequential algorithms: the sequential Abstract State Machine (ASM) thesis captures the semantics and behaviour of sequential algorithms. The dissertation exploits the similarity of general computations and database transformations to characterise the latter by five postulates: the sequential time postulate, the abstract state postulate, the bounded exploration postulate, the background postulate, and the bounded non-determinism postulate. The last two postulates reflect the specific form of transformations for databases. The five postulates exactly capture database transformations. Furthermore, we provide a logical proof system for database transformations that is sound and complete.
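
    The following toy sketch is only meant to convey the flavour of the ASM-style view used above: a rule inspects a bounded part of a complex-value state, yields a finite update set, and the update set is applied atomically. The rule, the state layout, and the names are assumptions for this example, not the dissertation's formal model.

        # Illustrative only: one ASM-style step on a complex-value state.
        def rule_mark_overdue(state):
            """Bounded exploration: look only at 'orders' and 'today'; emit updates."""
            updates = set()
            for oid, order in state["orders"].items():
                if order["due"] < state["today"] and not order["overdue"]:
                    updates.add((("orders", oid, "overdue"), True))
            return updates

        def apply_updates(state, updates):
            """Apply a consistent update set atomically (no partial application)."""
            values = {}
            for loc, val in updates:
                if loc in values and values[loc] != val:
                    raise ValueError(f"inconsistent update set at {loc}")
                values[loc] = val
            for (rel, key, field), val in values.items():
                state[rel][key][field] = val
            return state

        state = {"today": 20,
                 "orders": {1: {"due": 10, "overdue": False},
                            2: {"due": 30, "overdue": False}}}
        state = apply_updates(state, rule_mark_overdue(state))
        print(state["orders"][1]["overdue"], state["orders"][2]["overdue"])  # True False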

    Equivalences and calculi for formal verification of cryptographic protocols

    Security protocols are essential to the proper functioning of any distributed system running over an insecure network but often have flaws that can be exploited even without breaking the cryptography. Formal cryptography, the assumption that the cryptographic primitives are flawless, facilitates the construction of formal models and verification tools. Such models are often based on process calculi, small formal languages for modelling communicating systems. The spi calculus, a process calculus for the modelling and formal verification of cryptographic protocols, is an extension of the pi calculus with cryptography. In the spi calculus, security properties can be formulated as equations on process terms, so no external formalism is needed. Moreover, the contextual nature of observational process equivalences takes into account any attacker/environment that can be expressed in the calculus. We set out to address the problem of automatic verification of observational equivalence in an extension of the spi calculus: a channel-passing calculus with a more general expression language. As a first step, we study existing non-contextual proof techniques for a particular canonical contextual equivalence. In contrast to standard process calculi, reasoning about cryptographic processes must take into account the partial knowledge of the environment about transmitted messages. In the setting of the spi calculus, several notions of environment-sensitive bisimulation have been developed to treat this environment knowledge. We exhibit distinguishing examples between several of these notions, including ones previously believed to coincide. We then give a general framework for the comparison of environment-sensitive relations, based on a comparison of the corresponding kinds of environment and notions of environment consistency. Within this framework we perform an exhaustive comparison of the different bisimulations, where every possible relation that is not proven is disproven. For the second step, we consider the question of which expression languages are suitable. Extending the expression language to account for more sophisticated cryptographic primitives or other kinds of data terms quickly leads to decidability issues. Two important problems in this area are the knowledge problem and an indistinguishability problem called static equivalence. It is known that decidability of static equivalence implies decidability of knowledge in many cases; we exhibit an expression language where knowledge is decidable but static equivalence is not. We then define a class of constructor-destructor expression languages and prove that environment consistency over any such language directly corresponds to static equivalence in a particular extension thereof. We proceed to place some loose constraints on deterministic expression evaluation, and redefine the spi calculus in this more general setting. Once we have chosen an expression language, we encounter a third problem, which is inherent in the operational semantics of message-passing process calculi: the possibility to receive arbitrary messages gives rise to infinite branching on process input. To mitigate this problem, we define a symbolic semantics, where the substitution of received messages for input variables never takes place. Instead, input variables are only subject to logical constraints. We then use this symbolic semantics to define a symbolic bisimulation that is sound and complete with respect to its concrete counterpart, extending the possibilities for automated bisimulation checkers.
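
    As a very small illustration of the constructor-destructor flavour of message languages mentioned above, the sketch below encodes pairs and symmetric encryption as tuples and runs a naive saturation-based check of what an observer can deduce from a set of received messages. The term encoding and the procedure are assumptions for this example; this is a knowledge-style deduction check, not a static-equivalence or bisimulation procedure from the thesis.

        # Illustrative only: a tiny constructor-destructor message language
        # (pairs and symmetric encryption) and a naive deduction check.
        def saturate(knowledge):
            """Close a set of messages under projection and decryption with known keys."""
            known = set(knowledge)
            changed = True
            while changed:
                changed = False
                for msg in list(known):
                    derived = set()
                    if msg[0] == "pair":                     # destructors fst / snd
                        derived |= {msg[1], msg[2]}
                    if msg[0] == "enc" and msg[2] in known:  # dec(enc(m, k), k) = m
                        derived.add(msg[1])
                    if not derived <= known:
                        known |= derived
                        changed = True
            return known

        # messages are tuples: ("name", x), ("pair", m1, m2), ("enc", m, k)
        k, secret = ("name", "k"), ("name", "s")
        frame = {("pair", ("enc", secret, k), k)}            # pair of enc(s, k) and k
        print(secret in saturate(frame))                     # True: key is exposed
        print(secret in saturate({("enc", secret, k)}))      # False: key not known

    Static equivalence asks the finer question of whether two such message sets can be distinguished by any test the observer can build, which is why, as noted above, it can be harder than the deduction (knowledge) problem.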

    Abstraktní studium úplnosti pro infinitární logiky

    In this thesis we study completeness properties of infinitary propositional logics from the perspective of abstract algebraic logic. The goal is to understand how the basic tool in proofs of completeness, the so-called Lindenbaum lemma, generalizes beyond finitary logics. To this end, we study a few properties closely related to the Lindenbaum lemma (and hence to completeness properties). We will see that these properties give rise to a new hierarchy of infinitary propositional logics. We also study these properties in scenarios where a given logic has some (possibly very generally defined) connectives of implication, disjunction, and negation. Among other things, we will see that the presence of these connectives can ensure provability of the Lindenbaum lemma. Keywords: abstract algebraic logic, infinitary logics, Lindenbaum lemma, disjunction, implication, negation
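
    For context, a standard finitary formulation of the result in question can be stated as follows (generic notation, not the thesis's): if a theory does not derive a formula, it extends to a theory that is maximal among those not deriving it. For a finitary consequence relation this follows from Zorn's lemma, because the union of a chain of theories not deriving φ again does not derive φ; the thesis asks which forms of the statement survive when the consequence relation is no longer finitary.

        % Finitary Lindenbaum lemma, generic statement (not the thesis's notation):
        \[
          T \nvdash_{\mathcal{L}} \varphi
          \;\Longrightarrow\;
          \exists\, T' \supseteq T :\;
          T' \nvdash_{\mathcal{L}} \varphi
          \ \text{and $T'$ is maximal with this property.}
        \]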