A Simple and Scalable Static Analysis for Bound Analysis and Amortized Complexity Analysis
We present the first scalable bound analysis that achieves amortized
complexity analysis. In contrast to earlier work, our bound analysis is not
based on general purpose reasoners such as abstract interpreters, software
model checkers or computer algebra tools. Rather, we derive bounds directly
from abstract program models, which we obtain from programs by comparatively
simple invariant generation and symbolic execution techniques. As a result, we
obtain an analysis that is more predictable and more scalable than earlier
approaches. Our experiments demonstrate that our analysis is fast and at the
same time able to compute bounds for challenging loops in a large real-world
benchmark. Technically, our approach is based on lossy vector addition systems
with states (VASS). Our bound analysis first computes a lexicographic ranking function that
proves the termination of a VASS, and then derives a bound from this ranking
function. Our methodology achieves amortized analysis based on a new insight
into how lexicographic ranking functions can be used for bound analysis.
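
To make the amortized claim concrete, the following minimal Python sketch (our own illustrative example, not code from the paper) shows the kind of loop such an analysis targets: a nested loop whose naive bound is quadratic but whose inner-loop work is amortized linear. In VASS terms, the outer transition decrements a counter i and increments a counter j (i' <= i - 1, j' <= j + 1), while the inner transition only decrements j (j' <= j - 1); the lexicographic ranking function over (i, j) proves termination, and counting how often j can be replenished yields the linear bound.

    def total_inner_steps(xs):
        """Nested loop whose inner loop runs at most len(xs) times in total:
        each element is pushed once and popped at most once."""
        stack = []                          # plays the role of counter j
        inner_steps = 0
        for x in xs:                        # outer transition: i' <= i - 1, j' <= j + 1
            while stack and stack[-1] < x:  # inner transition: j' <= j - 1
                stack.pop()
                inner_steps += 1
            stack.append(x)
        return inner_steps                  # amortized bound: inner_steps <= len(xs)

    assert total_inner_steps([3, 1, 2, 5, 4]) <= 5
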
HBP-enhancing hepatocellular adenomas and how to discriminate them from FNH in Gd-EOB MRI
Background: Recent studies provide evidence that hepatocellular adenomas (HCAs) frequently take up gadoxetic acid (Gd-EOB) during the hepatobiliary phase (HBP). The purpose of our study was to investigate how to differentiate between Gd-EOB-enhancing HCAs and focal nodular hyperplasias (FNHs). We therefore retrospectively included 40 HCAs classified as HBP Gd-EOB-enhancing lesions from a sample of 100 histopathologically proven HCAs in 65 patients. These enhancing HCAs were matched retrospectively with 28 FNH lesions (standard of reference: surgical resection). Two readers (experienced abdominal radiologists blinded to clinical data) reviewed the images, evaluating morphologic features and subjectively scoring Gd-EOB uptake (25-50%, 50-75% and 75-100%) for each lesion. Quantitative lesion-to-liver enhancement was measured in the arterial, portal venous (PV), transitional and hepatobiliary phases. Additionally, multivariate regression analyses were performed.
Results: Subjective scoring of intralesional Gd-EOB uptake showed the highest discriminatory accuracies (AUC: 0.848 (R#1); 0.920 (R#2); p < 0.05).
Conclusion: Even in HBP-enhancing HCA, characterization of Gd-EOB uptake was found to provide the strongest discriminatory power in differentiating HCA from FNH. Furthermore, a lobulated appearance and a central scar are seen more frequently in FNH than in HCA.
Increased Cell-Free DNA Plasma Concentration Following Liver Transplantation Is Linked to Portal Hepatitis and Inferior Survival
Donor organ quality is crucial for transplant survival and the long-term survival of patients after liver transplantation. Besides bacterial and viral infections, endogenous damage-associated molecular patterns (DAMPs) can stimulate immune responses. Cell-free DNA (cfDNA) is one such DAMP that exhibits highly proinflammatory effects via DNA sensors. Herein, we measured cfDNA after liver transplantation and found elevated levels when organs from resuscitated donors were transplanted. High levels of cfDNA were associated with high C-reactive protein, leukocytosis as well as granulocytosis in the recipient. In addition to increased systemic immune responses, portal hepatitis was observed, which was associated with increased interface activity and higher numbers of infiltrating neutrophils and eosinophils in the graft. In fact, cfDNA was an independent significant factor in multivariate analysis, and an increased concentration of cfDNA was associated with inferior 1-year survival. Moreover, cfDNA levels were found to decrease significantly during the postoperative course when patients underwent continuous veno-venous haemofiltration. In conclusion, patients receiving livers from resuscitated donors were characterised by high postoperative cfDNA levels. These patients showed pronounced portal hepatitis and systemic inflammatory responses in the short term, leading to high mortality. Further studies are needed to evaluate the clinical relevance of cfDNA clearance by haemoadsorption and haemofiltration in vitro and in vivo.
Cost analysis of nondeterministic probabilistic programs
We consider the problem of expected cost analysis over nondeterministic probabilistic programs,
which aims at automated methods for analyzing the resource-usage of such programs.
Previous approaches for this problem could only handle nonnegative bounded costs.
However, in many scenarios, such as queuing networks or analysis of cryptocurrency protocols,
both positive and negative costs are necessary and the costs are unbounded as well.
In this work, we present a sound and efficient approach to obtain polynomial bounds on the
expected accumulated cost of nondeterministic probabilistic programs.
Our approach can handle (a) general positive and negative costs with bounded updates in
variables; and (b) nonnegative costs with general updates to variables.
We show that several natural examples which could not be
handled by previous approaches are captured in our framework.
Moreover, our approach leads to an efficient polynomial-time algorithm, while no
previous approach for cost analysis of probabilistic programs could guarantee polynomial runtime.
Finally, we show the effectiveness of our approach using experimental results on a variety of programs for which we efficiently synthesize tight resource-usage bounds.
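
As a small, concrete illustration of the setting (a simulation sketch of our own with made-up probabilities and costs, not the paper's static analysis), the following Python program mimics a toy queue in which arrivals incur a positive cost and completed services yield a negative cost; the expected accumulated cost is estimated here by sampling, whereas the approach above bounds it symbolically.

    import random

    def accumulated_cost(rounds):
        """One run of a toy probabilistic queue with positive and negative costs."""
        queue, cost = 0, 0
        for _ in range(rounds):
            if random.random() < 0.4:                # arrival (probabilistic choice)
                queue += 1
                cost += 2                            # positive cost, bounded update
            if queue > 0 and random.random() < 0.6:  # service completes
                queue -= 1
                cost -= 1                            # negative cost (reward)
        return cost

    # Monte Carlo estimate of the expected accumulated cost after 100 rounds.
    samples = [accumulated_cost(100) for _ in range(10_000)]
    print(sum(samples) / len(samples))
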
Recurrent hotspot mutations in HRAS Q61 and PI3K-AKT pathway genes as drivers of breast adenomyoepitheliomas
Adenomyoepithelioma of the breast is a rare tumor characterized by epithelial-myoepithelial differentiation, of which a subset will progress to invasive or metastatic cancer. We sought to define the genomic landscape of adenomyoepitheliomas. Massively parallel sequencing revealed highly recurrent somatic mutations in HRAS and PI3K-AKT pathway-related genes. Strikingly, HRAS mutations were restricted to estrogen receptor (ER)-negative tumors, all affected codon 61, and all but one co-occurred with PIK3CA or PIK3R1 mutations. To interrogate the functional significance of HRAS Q61 mutations in adenomyoepithelial differentiation, we expressed HRASQ61R alone or in combination with PIK3CAH1047R in non-transformed ER-negative breast epithelial cells. HRASQ61R induced characteristic phenotypes of adenomyoepitheliomas, such as the expression of myoepithelial markers and loss of E-cadherin, hyperactivation of AKT signaling, and transformative properties that were arrested by combination therapy with AKT and MEK inhibitors. Our results indicate that breast adenomyoepitheliomas often manifest a unique transformation program featuring HRAS activation.
Market Power Rents and Climate Change Mitigation: A Rationale for Coal Taxes?
In this paper we investigate the introduction of an export tax on steam coal levied by an individual country (Australia) or by a group of major exporting countries. The policy motivation would be twofold: generating tax revenues against the background of improved terms of trade, while reducing CO2 emissions. We construct and numerically apply a two-level game consisting of an optimal policy problem at the upper level and an equilibrium model of the international steam coal market (based on COALMOD-World) at the lower level. We find that a unilaterally introduced Australian export tax on steam coal has little impact on global emissions and may be welfare reducing. By contrast, a tax jointly levied by a "climate coalition" of major coal exporters may well leave its members better off while significantly reducing global CO2 emissions from steam coal, by up to 200 Mt CO2 per year. Comparable production-based tax scenarios consistently yield higher tax revenues but may be hard to implement against the opposition of disproportionately affected local stakeholders who depend on low domestic coal prices.
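
To sketch the two-level structure (a drastic simplification with illustrative numbers of our own, not COALMOD-World), the lower level below is a competitive market equilibrium under linear inverse demand, and the upper level searches over export tax rates, trading off tax revenue against emissions:

    # All parameters are made up for illustration.
    A, B, C = 100.0, 1.0, 40.0   # demand intercept, demand slope, marginal cost
    EMISSION_FACTOR = 2.4        # assumed t CO2 per unit of steam coal

    def equilibrium_quantity(tax):
        """Lower level: quantity where willingness to pay equals cost plus tax."""
        return max((A - C - tax) / B, 0.0)

    def leader_objective(tax, carbon_weight=0.5):
        """Upper level: tax revenue minus a weighted emissions penalty."""
        q = equilibrium_quantity(tax)
        return tax * q - carbon_weight * EMISSION_FACTOR * q

    # A grid search stands in for the optimal policy problem at the upper level.
    best_payoff, best_tax = max((leader_objective(t), t) for t in range(0, 61))
    print(best_tax, round(best_payoff, 1))
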
Fiscal Divergence, Current Account and TARGET2 Imbalances in the EMU
The paper scrutinizes the reasons for the European debt crisis, the implications for TARGET2 imbalances, and options for surplus liquidity absorption within an asymmetric EMU. It is argued that, starting from the turn of the millennium, diverging fiscal policy paths and diverging unit labour costs were the driving forces behind rising intra-European current account imbalances within the euro area, which were enhanced by post-2001 low interest rate policies and changing financing conditions for the German banking sector. The paper shows how, since the outbreak of the crisis, the adjustment of intra-EMU current account imbalances has been postponed by a rising divergence of TARGET2 balances, as the repatriation of private international credit and deposit flight from the crisis economies is substituted by central bank credit. Given that this process has brought the Deutsche Bundesbank into a debtor position to the domestic financial system, we discuss options for liquidity absorption by the Deutsche Bundesbank to forestall asset price bubbles in Germany.
Automated Complexity Analysis for Imperative Programs
Our work contributes to the field of automated complexity and resource bound analysis (bound analysis), which develops source code analyses for estimating the cost of running a given piece of software. The term cost is usually defined by a cost model which assigns an execution cost to each program instruction or operation. Depending on the application domain, the cost is estimated, e.g., in terms of consumed time, consumed power, or the number of executed statements. A bound analysis treats the program under scrutiny as a mathematical object, inferring a bound on the execution cost (with respect to the given cost model) by means of automated formal reasoning, without executing the program. The computed bound is usually expressed as a symbolic expression over the program's parameters. We argue that bound analysis could be applied with great benefit, e.g., in the areas of software profiling, program understanding, program verification, software security, and automatic parallelization; we state several application scenarios in this thesis. In recent years, research on bound analysis has made great progress: state-of-the-art techniques can automatically infer execution costs for impressively complicated code examples.
Nevertheless, we see two main drawbacks of current bound analysis techniques: (1) present approaches do not scale sufficiently for analyzing real code, which often consists of thousands or hundreds of thousands of lines with many nested conditionals; (2) though existing approaches can infer tight bounds for many challenging examples, they nevertheless grossly over-approximate the cost of certain code patterns that are common in real source code. With our work we aim to overcome both problems: (1) The bound analysis we present tackles the scalability problem by simple static analysis: in contrast to existing techniques, which employ general-purpose reasoners such as abstract interpreters, computer algebra tools or linear optimizers, we take an orthogonal approach based on the well-known program abstraction methodology: we first abstract a given program into an abstract program model by means of static analysis, and then apply our bound algorithm to the abstracted program, a simplified version of the original. (2) Our bound algorithm extends the range of bound analysis: it infers tight bounds for a class of loop iteration patterns on which existing approaches fail or infer bounds that are not tight. Instances of such loop iterations can often be found in parsing and string-matching routines; we state several examples in this work. At the same time, our approach is general and can handle most of the bound analysis problems discussed in the literature.
Our bound analysis is based on the abstract program model of difference constraints. Difference constraints have been used for termination analysis in the literature, where they denote relational inequalities of the form x' <= y + c, describing that the value of x in the current state is at most the value of y in the previous state plus some integer constant c. Intuitively, difference constraints are also a good choice for complexity and resource bound analysis because the complexity of imperative programs typically arises from counter increments and decrements of the form x := x + c and resets of the form x := y (where x != y), which can be modeled naturally by the difference constraints x' <= x + c and x' <= y. Our work contributes to the field of automated complexity and resource bound analysis by: (1) providing efficient abstraction techniques and demonstrating that difference constraints are a suitable abstract program model for automatic complexity and resource bound analysis; (2) providing a new bound algorithm, proven sound, which extends the range of bound analysis to a class of challenging but natural loop iteration patterns found in real source code; (3) presenting a bound analysis technique which is more scalable than existing approaches. Our contributions are supported by a thorough experimental comparison on three benchmarks: we compare our implementation to other state-of-the-art bound analysis tools (a) on a large benchmark of real-world C code, (b) on a benchmark built from examples taken from the bound analysis literature, and (c) on a benchmark of challenging iteration patterns which we found in real source code. Like most approaches to bound analysis, our analysis infers upper bounds on the execution cost of a program.
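
To fix ideas, here is a minimal data-model sketch (our own hypothetical code, not the thesis implementation) of how counter updates are abstracted into difference constraints: each transition of the abstract program records, per variable, an inequality x' <= y + c.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class DifferenceConstraint:
        lhs: str     # post-state variable x'
        rhs: str     # pre-state variable y
        const: int   # integer offset c, meaning x' <= y + c

    # Abstraction of:  while x > 0: { x -= 1; y = x; while y > 0: y -= 1 }
    outer_transition = [
        DifferenceConstraint("x", "x", -1),  # decrement x := x - 1
        DifferenceConstraint("y", "x", 0),   # reset     y := x
    ]
    inner_transition = [
        DifferenceConstraint("y", "y", -1),  # decrement y := y - 1
    ]
    # A bound algorithm charges each decrement of y against the values x can
    # take when y is reset, giving a quadratic bound on the inner loop here.
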
Complexity and Resource Bound Analysis of Imperative Programs Using Difference Constraints
Difference constraints have been used for termination analysis in the literature, where they denote relational inequalities of the form x' ≤ y + c, and describe that the value of x in the current state is at most the value of y in the previous state plus some constant c ∈ Z. We believe that difference constraints are also a good choice for complexity and resource bound analysis because the complexity of imperative programs typically arises from counter increments and resets, which can be modeled naturally by difference constraints. In this article we propose a bound analysis based on difference constraints. We make the following contributions: (1) Our analysis handles bound analysis problems of high practical relevance which current approaches cannot handle: we extend the range of bound analysis to a class of challenging but natural loop iteration patterns which typically appear in parsing and string-matching routines. (2) We advocate the idea of using bound analysis to infer invariants: our soundness-proven algorithm obtains invariants through bound analysis, and the inferred invariants are in turn used for obtaining bounds. Our bound analysis therefore does not rely on external techniques for invariant generation. (3) We demonstrate that difference constraints are a suitable abstract program model for automatic complexity and resource bound analysis: we provide efficient abstraction techniques for obtaining difference constraint programs from imperative code. (4) We report on a thorough experimental comparison of state-of-the-art bound analysis tools: we set up a tool comparison on (a) a large benchmark of real-world C code, (b) a benchmark built from examples taken from the bound analysis literature and (c) a benchmark of challenging iteration patterns which we found in real source code. (5) Our analysis is more scalable than existing approaches: we discuss how we achieve scalability.
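
As an illustration of the parsing/string-matching iteration pattern the abstract refers to (our own toy example, not one from the article): nested loops where the inner cursor's progress is transferred to the outer cursor by a reset, so the total work is linear even though the loops are nested. The reset i := j is exactly the kind of update the difference constraint i' <= j captures.

    def longest_equal_run(xs):
        """Runs in O(len(xs)) overall despite the nested loop: the outer
        cursor i jumps to the inner cursor j, so no position is rescanned."""
        i, best, n = 0, 0, len(xs)
        while i < n:
            j = i + 1
            while j < n and xs[j] == xs[i]:  # inner loop: j only advances
                j += 1
            best = max(best, j - i)
            i = j                            # reset i := j (i' <= j)
        return best

    assert longest_equal_run([7, 7, 7, 1, 2, 2]) == 3
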