
    Solvable Polynomial Ideals: The Ideal Reflection for Program Analysis

    This paper presents a program analysis method that generates program summaries involving polynomial arithmetic. Our approach builds on prior techniques that use solvable polynomial maps for summarizing loops. These techniques can generate all polynomial invariants for a restricted class of programs, but cannot be applied to programs outside that class -- for instance, programs with nested loops, conditional branching, unstructured control flow, etc. No existing approach extends these prior methods to general programs. This paper bridges that gap. Instead of restricting the kinds of programs we can handle, our method abstracts every loop into a model that can be solved with prior techniques, bringing prior work on solvable polynomial maps to bear on general programs. While no method can generate all polynomial invariants for arbitrary programs, our method establishes its merit through a monotonicity result. We have implemented our techniques and tested them on a suite of benchmarks from the literature. Our experiments indicate that our techniques show promise on challenging verification tasks requiring non-linear reasoning. Comment: Long version of an article to appear at the 51st ACM SIGPLAN Symposium on Principles of Programming Languages (POPL 2024). This version replaces an earlier long version in which typos have been fixed, DOIs have been added to references where possible, and a data availability statement has been added.
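The solvable-polynomial-map idea behind this line of work can be illustrated on a toy loop. The following sketch is a hypothetical example, not code from the paper or its artifact: the loop body (x, y) <- (x + y, y + 1) is a solvable polynomial update, so both variables admit polynomial closed forms in the iteration counter, from which a polynomial invariant falls out.

```python
# Toy illustration of a solvable polynomial map (hypothetical example,
# not the paper's method): the update (x, y) <- (x + y, y + 1) induces
# recurrences with polynomial closed forms in the loop counter n.

def step(x, y):
    """One iteration of the loop body: a solvable polynomial update."""
    return x + y, y + 1

def closed_form(n, x0=0, y0=0):
    """Closed forms of the induced recurrences:
    y_n = y0 + n,  x_n = x0 + n*y0 + n*(n-1)/2."""
    return x0 + n * y0 + n * (n - 1) // 2, y0 + n

# Executing the loop confirms the closed forms and the derived
# polynomial invariant 2x = y^2 - y (for x0 = y0 = 0).
x, y = 0, 0
for n in range(20):
    assert (x, y) == closed_form(n)
    assert 2 * x == y * y - y
    x, y = step(x, y)
```

Eliminating the counter n from the closed forms is what yields counter-free polynomial invariants such as 2x = y^2 - y above.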

    Templates and Recurrences: Better Together

    This paper is the confluence of two streams of ideas in the literature on generating numerical invariants, namely: (1) template-based methods, and (2) recurrence-based methods. A template-based method begins with a template that contains unknown quantities, and finds invariants that match the template by extracting and solving constraints on the unknowns. A disadvantage of template-based methods is that they require fixing the set of terms that may appear in an invariant in advance. This disadvantage is particularly prominent for non-linear invariant generation, because the user must supply maximum degrees on polynomials, bases for exponents, etc. On the other hand, recurrence-based methods are able to find sophisticated non-linear mathematical relations, including polynomials, exponentials, and logarithms, because such relations arise as the solutions to recurrences. However, a disadvantage of past recurrence-based invariant-generation methods is that they are primarily loop-based analyses: they use recurrences to relate the pre-state and post-state of a loop, so it is not obvious how to apply them to a recursive procedure, especially if the procedure is non-linearly recursive (e.g., a tree-traversal algorithm). In this paper, we combine these two approaches and obtain a technique that uses templates in which the unknowns are functions rather than numbers, and the constraints on the unknowns are recurrences. The technique synthesizes invariants involving polynomials, exponentials, and logarithms, even in the presence of arbitrary control-flow, including any combination of loops, branches, and (possibly non-linear) recursion. For instance, it is able to show that (i) the time taken by merge-sort is O(n log n), and (ii) the time taken by Strassen's algorithm is O(n^{log_2 7}). Comment: 20 pages, 3 figures
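The merge-sort bound cited above arises from the divide-and-conquer recurrence T(n) = 2T(n/2) + n with T(1) = 0, whose solution for powers of two is exactly n log2(n). A quick numeric check (an illustrative sketch, not the paper's tool):

```python
# Illustrative sketch (not the paper's technique): merge-sort's running
# time satisfies T(n) = 2*T(n/2) + n with T(1) = 0; for n = 2^k the
# exact solution is T(n) = n * k = n * log2(n).

def T(n):
    """Evaluate the recurrence for n a power of two."""
    if n == 1:
        return 0
    return 2 * T(n // 2) + n

for k in range(1, 12):
    n = 2 ** k
    assert T(n) == n * k  # T(n) = n * log2(n), consistent with O(n log n)
```

The paper's contribution is synthesizing such function-valued unknowns and their recurrence constraints automatically, rather than solving a hand-supplied recurrence as done here.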

    Direct foam writing in microgravity

    Herein we report 2D printing in microgravity of aqueous-based foams containing metal oxide nanoparticles. Such hierarchical foams have potential space applications, for example for in situ habitat repair work, or for UV shielding. Foam line patterns of a TiO2-containing foam have been printed onto glass substrates via Direct Foam Writing (DFW) under microgravity conditions through a parabolic aircraft flight. Initial characterization of the foam properties (printed foam line width, bubble size, etc.) is presented. It has been found that gravity plays a significant role in the process of direct foam writing. The foam spread less over the substrate when deposited in microgravity as compared to Earth gravity. This had a direct impact on the cross-sectional area and surface roughness of the printed lines. Additionally, the contact angle of deionized water on a film exposed to microgravity was higher than that of a film not exposed to microgravity, due to the increased surface roughness of films exposed to microgravity.

    Cooperation between Mast Cells and Neurons Is Essential for Antigen-Mediated Bronchoconstriction

    Mast cells are important sentinels guarding the interface between the environment and the body: a breach in the integrity of this interface can lead to the release of a plethora of mediators which engage the foreign agent, recruit leukocytes, and initiate adaptive physiological changes in the organism. While these capabilities make mast cells critical players in immune defense, they also make them important contributors to the pathogenesis of diseases such as asthma. Mast cell mediators induce dramatic changes in smooth muscle physiology, and the expression of receptors for these factors by smooth muscle suggests that they act directly to initiate constriction. Contrary to this view, we show here that mast cell-mediated bronchoconstriction is observed only in animals with intact innervation of the lung and that serotonin release alone is required for this action. While ablation of sensory neurons does not limit bronchoconstriction, constriction after antigen challenge is absent in mice in which the cholinergic pathways are compromised. Linking mast cell function to the cholinergic system likely provides an important means of coupling the function of these resident immune cells to the physiology of the lung, but may also provide a safeguard against life-threatening anaphylaxis during mast cell degranulation.

    Compositional, Monotone, and Non-linear Program Analysis

    The presence of bugs in deployed software can lead to great economic and/or human cost. One strategy for mitigating these losses is to prove the functional correctness of programs---or sometimes aspects of a program's functional correctness---during development. With an appropriate analysis technique, one can guarantee that deployed software will satisfy important properties for all possible inputs. This dissertation presents several lines of research that advance the state-of-the-art in the topic area of automatically characterizing program behavior to prove functional correctness. More specifically, this dissertation focuses on building program-analysis techniques and tools that exhibit some combination of (1) producing non-linear invariants, (2) reasoning compositionally by building up more complex program summaries from simpler ones, and (3) being predictable by satisfying a monotonicity property.
The first line of research presented in this dissertation gives a program-analysis technique that is compositional and produces non-linear invariants. The key feature of the method is how it analyzes loops. To analyze loops, loop bodies are abstracted with the newly introduced wedge abstract domain. Furthermore, wedges admit a recurrence-extraction procedure that results in a set of c-finite recurrence relations that are implied by the original loop body. In this dissertation, we solve these c-finite recurrences using a technique based on operational calculus. In combination, these advancements yield a program-analysis tool that is able to produce program summaries that include equalities and inequalities between expressions containing polynomial, exponential, and logarithmic terms. Experimental results show that our method generates precise non-linear summaries that prove more programs correct in comparison to other state-of-the-art tools.
The second line of research presented in this dissertation focuses on automatically improving the precision of a base compositional analysis by modifying the way the base analysis summarizes loops. Specifically, this line of research presents a method that automatically rewrites a program-analysis problem to a semantically sound alternative problem, with the goal of achieving a more-precise analysis result. In pursuit of this goal, we introduce the notion of a pre-Kleene algebra (PKA) as a model for reasoning about analysis precision. A key property of PKAs is that they have a monotonicity property. Our method then refines a program-analysis problem with respect to the laws of PKAs. Then, as long as the base analysis satisfies the axioms of PKAs, our method guarantees that the refined program-analysis problem will yield a result that is at least as good as (and often better than) the result obtained from the original analysis formulation. Although this result is technically a "no-degradation" result, an experimental evaluation showed that our method allows an analysis to prove roughly 25% more programs correct at the expense of an approximately 50% increase in analysis time.
The third line of research presented in this dissertation introduces the optimal symbolic-bound synthesis (OSB) problem. In short, an instance of the OSB problem takes as input a term t and a formula phi, and asks to find a symbolic term t* such that (i) phi implies that t* upper-bounds t, and (ii) t* exhibits some "term-desirability" properties. We present a heuristic method for finding a symbolic term t* when t and phi may contain non-linear terms. Our method works by extracting an implied cone of polynomials from phi and then reducing t by the cone of polynomials to obtain a sound upper bound t*. At a high level, our method makes use of Groebner-basis techniques for reducing with respect to equations, and a novel local-projection method for reducing with respect to inequalities. To show the utility of our method, we apply our techniques to the setting of bounding relevant terms, such as those representing the value of some financial asset, in Solidity smart contracts.
The fourth line of research presented in this dissertation gives another compositional program analysis that is able to produce non-linear invariants. The method follows a similar structure to the method from the first line of research; however, the subsequent method has the additional benefit of being monotone. We implemented our monotone technique in a tool named Abstractionator. Instead of wedges, Abstractionator uses solvable transition ideals as an intermediate abstraction of a loop. Abstractionator then uses a summarization technique inspired by prior complete methods based on solvable polynomial maps to summarize abstracted solvable transition ideals. Thus, by utilizing an abstraction procedure for solvable transition ideals, Abstractionator brings prior complete polynomial-invariant-generation methods to bear on programs with a more general syntactic structure. Experiments show that Abstractionator compares favorably with other program-analysis tools, especially in the case where non-linear invariants are required.
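The c-finite recurrences extracted by the first line of research have closed forms built from polynomials and exponentials. A minimal hand-worked instance (an assumed example, not taken from the dissertation's benchmarks): the loop body x <- 2x + 1 induces the recurrence x_{n+1} = 2 x_n + 1, whose closed form is x_n = (x_0 + 1) 2^n - 1.

```python
# Hand-worked c-finite recurrence (assumed example, not from the
# dissertation): the update x <- 2*x + 1 yields x_{n+1} = 2*x_n + 1,
# with closed form x_n = (x0 + 1) * 2^n - 1, mixing an exponential
# term with a constant.

def step(x):
    """One iteration of the loop body."""
    return 2 * x + 1

def closed_form(n, x0):
    """Solution of the induced c-finite recurrence."""
    return (x0 + 1) * 2 ** n - 1

# Executing the loop from x0 = 5 confirms the closed form at every step.
x = 5
for n in range(16):
    assert x == closed_form(n, 5)  # here x_n = 6 * 2^n - 1
    x = step(x)
```

A recurrence solver turns such closed forms into loop summaries; the dissertation's contribution is extracting and solving these recurrences automatically and compositionally rather than by hand as above.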

    Solvable Polynomial Ideals: The Ideal Reflection for Program Analysis Artifact

    Artifact for POPL24 paper 122: Solvable Polynomial Ideals: The Ideal Reflection for Program Analysis