4,193 research outputs found

    Taylor Expansion Diagrams: A Canonical Representation for Verification of Data Flow Designs

    Full text link

    Extended Seiberg-Witten Theory and Integrable Hierarchy

    Full text link
    The prepotential of the effective N=2 super-Yang-Mills theory perturbed in the ultraviolet by the descendants of the single-trace chiral operators is shown to be a particular tau-function of the quasiclassical Toda hierarchy. In the case of noncommutative U(1) theory (or U(N) theory with 2N-2 fundamental hypermultiplets at the appropriate locus of the moduli space of vacua), or a theory on a single fractional D3 brane at the ADE singularity, the hierarchy is the dispersionless Toda chain. We present its explicit solutions. Our results generalize the limit shape analysis of Logan-Shepp and Vershik-Kerov, support the prior work hep-th/0302191, which established the equivalence of these N=2 theories with the topological A string on CP^1, and clarify the origin of the Eguchi-Yang matrix integral. In the higher rank case we find an appropriate variant of the quasiclassical tau-function, show how the Seiberg-Witten curve is deformed by Toda flows, and fix the contact term ambiguity. Comment: 49 pages

    Constraints First: A New MDD-based Model to Generate Sentences Under Constraints

    Full text link
    This paper introduces a new approach to generating strongly constrained texts. We consider standardized sentence generation for the typical application of vision screening. To solve this problem, we formalize it as a discrete combinatorial optimization problem and utilize multivalued decision diagrams (MDD), a well-known data structure for dealing with constraints. In our context, one key strength of MDD is to compute an exhaustive set of solutions without performing any search. Once the sentences are obtained, we apply a language model (GPT-2) to keep the best ones. We detail this for English and also for French, where the agreement and conjugation rules are known to be more complex. Finally, with the help of GPT-2, we get hundreds of bona fide candidate sentences. When compared with the few dozen sentences usually available in the well-known vision screening test (MNREAD), this brings a major breakthrough in the field of standardized sentence generation. Also, as it can be easily adapted for other languages, it has the potential to make the MNREAD test even more valuable and usable. More generally, this paper highlights MDD as a convincing alternative for constrained text generation, especially when the constraints are hard to satisfy, but also for many other prospects. Comment: To be published in Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence, IJCAI 2023
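
    A minimal sketch of the enumerate-then-rank idea described above, assuming a toy vocabulary, a single character-budget constraint, and a stub in place of the GPT-2 scorer; a real MDD would merge equivalent partial states and prune infeasible prefixes during construction rather than filter complete sentences afterwards:

        # Illustrative only: layered enumeration of constrained sentences
        # followed by language-model re-ranking.  Vocabulary, constraint and
        # scorer are invented for this example, not taken from the paper.
        from itertools import product

        LAYERS = [                       # one layer of candidates per position
            ["the", "a"],
            ["small", "old", "bright"],
            ["cat", "dog", "lamp"],
            ["sleeps", "shines"],
        ]

        def satisfies_constraints(words):
            """Toy stand-in for the hard constraints an MDD would encode
            (e.g. a fixed character budget, agreement rules)."""
            n_chars = sum(len(w) for w in words) + len(words) - 1
            return 14 <= n_chars <= 20

        def lm_score(sentence):
            """Placeholder for a real language-model score (GPT-2 in the paper)."""
            return -len(sentence)        # dummy: prefer shorter sentences

        candidates = [" ".join(ws) for ws in product(*LAYERS)
                      if satisfies_constraints(ws)]
        best = sorted(candidates, key=lm_score, reverse=True)[:5]
        print(best)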

    Model Predictive Control for Signal Temporal Logic Specification

    Get PDF
    We present a mathematical programming-based method for model predictive control of cyber-physical systems subject to signal temporal logic (STL) specifications. We describe the use of STL to specify a wide range of properties of these systems, including safety, response and bounded liveness. For synthesis, we encode STL specifications as mixed integer-linear constraints on the system variables in the optimization problem at each step of a receding horizon control framework. We prove correctness of our algorithms, and present experimental results for controller synthesis for building energy and climate control.
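
    The key synthesis step, encoding an STL requirement as mixed integer-linear constraints inside a receding-horizon optimization, can be sketched as follows. The toy integrator dynamics, the horizon, the big-M constant and the use of the PuLP modelling library are assumptions made for this example, not the authors' formulation:

        # Illustrative big-M MILP encoding of the STL requirement
        # "eventually within H steps, x >= 3" for a toy integrator x[t+1] = x[t] + u[t].
        import pulp

        H, M = 6, 100.0
        prob = pulp.LpProblem("stl_mpc_step", pulp.LpMinimize)

        x = [pulp.LpVariable(f"x_{t}", -10, 10) for t in range(H + 1)]   # state
        u = [pulp.LpVariable(f"u_{t}", -1, 1) for t in range(H)]         # input
        a = [pulp.LpVariable(f"absu_{t}", 0, 1) for t in range(H)]       # |u[t]|
        z = [pulp.LpVariable(f"z_{t}", cat="Binary") for t in range(H + 1)]

        prob += pulp.lpSum(a)                 # objective: minimise control effort
        prob += x[0] == 0                     # measured current state
        for t in range(H):
            prob += x[t + 1] == x[t] + u[t]   # system dynamics
            prob += a[t] >= u[t]
            prob += a[t] >= -u[t]

        # STL "eventually x >= 3" over the horizon: z[t] = 1 forces the
        # predicate to hold at step t, and at least one z[t] must be 1.
        for t in range(H + 1):
            prob += x[t] >= 3 - M * (1 - z[t])
        prob += pulp.lpSum(z) >= 1

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        print([pulp.value(v) for v in u])     # apply u[0], then re-solve next step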

    A Survey of Prediction and Classification Techniques in Multicore Processor Systems

    Get PDF
    In multicore processor systems, being able to accurately predict the future provides new optimization opportunities that otherwise could not be exploited. For example, an oracle able to predict a certain application's behavior when running on a smartphone could direct the power manager to switch to appropriate dynamic voltage and frequency scaling (DVFS) modes that guarantee minimum levels of desired performance while reducing energy consumption and thereby prolonging battery life. Using predictions enables systems to become proactive rather than continuing to operate in a reactive manner. This prediction-based proactive approach has become increasingly popular in the design and optimization of integrated circuits and of multicore processor systems. Prediction ranges from simple forecasting to sophisticated machine-learning-based prediction and classification that learns from existing data, employs data mining, and predicts future behavior, and it can be exploited by novel optimization techniques spanning all layers of the computing stack. In this survey paper, we discuss the most popular prediction and classification techniques in the general context of computing systems, with emphasis on multicore processors. The paper is far from comprehensive, but it will help the reader interested in employing prediction in the optimization of multicore processor systems.
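
    As a concrete (and invented) illustration of the prediction-driven power management the survey discusses, the sketch below forecasts the next interval's core utilisation with an exponentially weighted moving average and maps the forecast to a DVFS mode; the thresholds, frequency levels and smoothing factor are arbitrary assumptions:

        # Illustrative only: a simple EWMA load predictor feeding a DVFS choice.
        def ewma_predictor(history, alpha=0.5):
            """Forecast the next interval's load from past per-interval loads."""
            pred = history[0]
            for load in history[1:]:
                pred = alpha * load + (1 - alpha) * pred
            return pred

        def choose_dvfs_mode(predicted_load):
            """Map the predicted load to a frequency level (thresholds invented)."""
            if predicted_load > 0.75:
                return "high-frequency"
            if predicted_load > 0.40:
                return "medium-frequency"
            return "low-frequency"

        recent_utilisation = [0.20, 0.35, 0.60, 0.80]   # per-interval core load
        print(choose_dvfs_mode(ewma_predictor(recent_utilisation)))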

    Partially-shared zero-suppressed multi-terminal BDDs: concept, algorithms and applications

    Get PDF
    Multi-Terminal Binary Decision Diagrams (MTBDDs) are a well-accepted technique for the state graph (SG) based quantitative analysis of large and complex systems specified by means of high-level model description techniques. However, this type of Decision Diagram (DD) is not always the best choice, since finite functions with small satisfaction sets, whose fulfilling assignments possess many 0-assigned positions, may yield relatively large MTBDD-based representations. Therefore, this article introduces zero-suppressed MTBDDs and proves that they are canonical representations of multi-valued functions on finite input sets. For manipulating DDs of this new type, possibly defined over different sets of function variables, the concept of partially-shared zero-suppressed MTBDDs and respective algorithms are developed. The efficiency of this new approach is demonstrated by comparing it to the well-known standard type of MTBDDs, where both types of DDs have been implemented by us within the C++-based DD-package JINC. The benchmarking takes place in the context of Markovian analysis and probabilistic model checking of systems. In total, the presented work extends existing approaches, since it not only allows one to directly employ (multi-terminal) zero-suppressed DDs in the field of quantitative verification, but also clearly demonstrates their efficiency.
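
    The zero-suppression rule at the heart of this representation can be sketched as follows; this simplified Python fragment is illustrative only and is unrelated to the JINC implementation mentioned above:

        # Sketch of zero-suppressed reduction for multi-terminal DDs: a node whose
        # 1-edge leads to the 0-terminal is skipped entirely, and a unique table
        # (hash-consing) keeps the representation canonical.
        unique_table = {}

        def terminal(value):
            return ("T", value)

        ZERO = terminal(0.0)

        def make_node(var, low, high):
            """Return a canonical node for `var` with children (low, high)."""
            if high == ZERO:                 # zero-suppression rule
                return low
            key = (var, low, high)
            if key not in unique_table:      # share identical nodes
                unique_table[key] = ("N", var, low, high)
            return unique_table[key]

        # f(x0, x1) = 2.5 if x0 = 1 and x1 = 1, else 0: a function with a small
        # satisfaction set and many 0-assignments, where zero-suppression pays off.
        f = make_node(0, ZERO, make_node(1, ZERO, terminal(2.5)))
        print(f)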

    Probabilistic Inference Using Partitioned Bayesian Networks: Introducing a Compositional Framework

    Get PDF
    Probability theory offers an intuitive and formally sound way to reason in situations that involve uncertainty. The automation of probabilistic reasoning has many applications such as predicting future events or prognostics, providing decision support, action planning under uncertainty, dealing with multiple uncertain measurements, making a diagnosis, and so forth. Bayesian networks in particular have been used to represent probability distributions that model the various applications of uncertainty reasoning. However, present-day automated reasoning approaches involving uncertainty struggle when models increase in size and complexity to fit real-world applications. In this thesis, we explore and extend a state-of-the-art automated reasoning method, called inference by Weighted Model Counting (WMC), when applied to increasingly complex Bayesian network models. WMC is comprised of two distinct phases: compilation and inference. The computational cost of compilation has limited the applicability of WMC. To overcome this limitation we have proposed theoretical and practical solutions that have been tested extensively in empirical studies using real-world Bayesian network models. We have proposed a weighted variant of OBDDs, called Weighted Positive Binary Decision Diagrams (WPBDD), which in turn is based on the new notion of positive Shannon decomposition. WPBDDs are particularly well suited to represent discrete probabilistic models. The conciseness of WPBDDs leads to a reduction in the cost of probabilistic inference. We have introduced Compositional Weighted Model Counting (CWMC), a language-agnostic framework for probabilistic inference that partitions a Bayesian network into subproblems. These subproblems are then compiled and subsequently composed in order to perform inference. This approach significantly reduces the cost of compilation, yet increases the cost of inference. The best results are obtained by seeking a partitioning that allows compilation to (barely) become feasible, but no more, as compilation cost can be amortized over multiple inference queries. Theoretical concepts have been implemented in a readily available open-source tool called ParaGnosis. Further implementational improvements have been found through parallelism, by exploiting independencies that are introduced by CWMC. The proposed methods combined push the boundaries of WMC, allowing this state-of-the-art method to be used on much larger models than before.
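
    For orientation, the quantity computed by weighted model counting can be shown on a tiny CNF with literal weights (in a Bayesian-network encoding, such weights come from the CPTs). The brute-force Shannon expansion below only illustrates the semantics; the thesis instead compiles the formula into a WPBDD once and amortises that cost over inference queries. The clause set and weights are invented:

        # Weighted model count = sum over satisfying assignments of the product
        # of literal weights.  Brute-force recursion for illustration only.
        CNF = [(1, -2), (2, 3)]                    # clauses as tuples of literals
        WEIGHT = {1: 0.3, -1: 0.7, 2: 0.6, -2: 0.4, 3: 0.5, -3: 0.5}
        VARS = [1, 2, 3]

        def wmc(assignment):
            for clause in CNF:                     # prune falsified branches early
                if all(-lit in assignment for lit in clause):
                    return 0.0
            if len(assignment) == len(VARS):
                return 1.0
            var = VARS[len(assignment)]            # Shannon expansion on next variable
            return (WEIGHT[var] * wmc(assignment | {var}) +
                    WEIGHT[-var] * wmc(assignment | {-var}))

        print(wmc(frozenset()))                    # weighted count over all models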

    An overview of decision table literature 1982-1995.

    Get PDF
    This report gives an overview of the literature on decision tables over the past 15 years. As much as possible, for each reference an author-supplied abstract, a number of keywords and a classification are provided. In some cases our own comments are added. The purpose of these comments is to show where, how and why decision tables are used. The literature is classified according to application area, theoretical versus practical character, year of publication, country of origin (not necessarily country of publication) and the language of the document. After a description of the scope of the review, classification results and the classification by topic are presented. The main body of the paper is the ordered list of publications with abstract, classification and comments.

    Witness generation in existential CTL model checking

    Get PDF
    Hardware and software systems are widely used in applications where failure is prohibitively costly or even unacceptable. The main obstacle to making such systems more reliable and capable of more complex and sensitive tasks is our limited ability to design and implement them with a sufficiently high degree of confidence in their correctness under all circumstances. As an automated technique that verifies the system early in the design phase, model checking explores the state space of the system exhaustively and rigorously to determine whether the system satisfies the specifications and to detect fatal errors that may be missed by simulation and testing. One essential advantage of model checking is the capability to generate witnesses and counterexamples. They are simple and straightforward forms of evidence for proving an existential specification or falsifying a universal specification. Besides enhancing the credibility of the model checker's conclusion, they either strengthen engineers' confidence in the system or provide hints to reveal potential defects. In this dissertation, we focus on symbolic model checking with specifications expressed in computation tree logic (CTL), which describes branching-time behaviors of the system, and investigate witness generation techniques for the existential fragment of CTL, i.e., ECTL, covering both decision-diagram-based and SAT-based approaches. Since witnesses provide important debugging information and may be inspected by engineers, smaller ones are always preferable to ease their interpretation and understanding. To the best of our knowledge, no existing witness generation technique guarantees minimality for a general ECTL formula with nested existential CTL operators. One contribution of this dissertation is to fill this gap with a minimality guarantee. With the help of the saturation algorithm, our approach computes the minimum witness size for the given ECTL formula in every state, stored in an additive edge-valued multiway decision diagram (EV+MDD), a variant of the well-known binary decision diagram (BDD), and then builds a minimum witness. Though computationally intensive, this has promising applications in reducing engineers' workload. SAT-based model checking, in particular bounded model checking, reduces a model checking problem to a satisfiability problem and leverages a SAT solver to solve it. Another contribution of this dissertation is to improve the translation of the bounded semantics of ECTL into propositional formulas. By realizing the possibility of path reuse, i.e., a state may build its own witness by reusing its successor's, we may generate a significantly smaller formula, which is often easier for a SAT solver to answer, and thus boost the performance of bounded model checking.
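
    An explicit-state analogue of the minimum-witness computation may help fix intuition: for an EF p formula, the minimum witness length of each state is its backward-BFS distance to a p-state, which is the kind of per-state value the dissertation stores symbolically in an EV+MDD. The toy Kripke structure below is invented for illustration:

        # Explicit-state sketch (not the symbolic EV+MDD algorithm): shortest
        # distance from every state to a p-state via backward BFS.
        from collections import deque

        EDGES = {0: [1, 2], 1: [3], 2: [3], 3: []}   # toy transition relation
        P_STATES = {3}                               # states satisfying atomic p

        def min_witness_length(edges, targets):
            """Shortest distance from each state to a target state."""
            preds = {s: [] for s in edges}
            for s, succs in edges.items():
                for t in succs:
                    preds[t].append(s)
            dist = {s: 0 for s in targets}
            queue = deque(targets)
            while queue:
                s = queue.popleft()
                for p in preds[s]:
                    if p not in dist:
                        dist[p] = dist[s] + 1
                        queue.append(p)
            return dist                              # states absent here violate EF p

        print(min_witness_length(EDGES, P_STATES))   # {3: 0, 1: 1, 2: 1, 0: 2}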