    A Transformational Approach to Resource Analysis with Typed-Norms

    In order to automatically infer the resource consumption of programs, analyzers track how data sizes change along a program's execution. Typically, analyzers measure the sizes of data by applying norms, which are mappings from data to natural numbers that represent the sizes of the corresponding data. When norms are defined by taking type information into account, they are named typed-norms. The main contribution of this paper is a transformational approach to resource analysis with typed-norms. The analysis is based on a transformation of the program into an intermediate abstract program in which each variable is abstracted with respect to all considered norms which are valid for its type. We also sketch a simple analysis that can be used to automatically infer the required, useful typed-norms from programs.
    This work was funded partially by the EU project FP7-ICT-610582 ENVISAGE: Engineering Virtualized Services (http://www.envisage-project.eu) and by the Spanish projects TIN2008-05624 and TIN2012-38137. RaĂșl GutiĂ©rrez is also partially supported by a Juan de la Cierva Fellowship from the Spanish MINECO, ref. JCI-2012-13528.
    Albert Albiol, E.M.; Genaim, S.; GutiĂ©rrez Gil, R. (2014). A Transformational Approach to Resource Analysis with Typed-Norms. Lecture Notes in Computer Science 8901:38-53. https://doi.org/10.1007/978-3-319-14125-1_3
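    To make the idea concrete, the following sketch (my own illustration in Python, not the paper's formalism or notation) shows two classic norms, the list-length norm and the term-size norm, and how a single value can be abstracted with respect to every norm that is valid for its type.

        # Minimal sketch, assuming nothing beyond the abstract: a norm maps data
        # to a natural number standing for its size; a typed-norm chooses the
        # mapping based on the value's type.

        def length_norm(xs):
            """Norm for list-typed data: the number of elements."""
            return len(xs)

        def term_size_norm(t):
            """Norm for term/tree-typed data: the number of constructors."""
            if isinstance(t, (list, tuple)):
                return 1 + sum(term_size_norm(c) for c in t)
            return 1

        def abstract(value, norms):
            """Abstract one variable with respect to all norms valid for its type."""
            return {norm.__name__: norm(value) for norm in norms}

        # A list of pairs: both the length view and the term-size view are useful.
        print(abstract([(1, 2), (3, 4)], [length_norm, term_size_norm]))
        # {'length_norm': 2, 'term_size_norm': 7}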

    Worst-case analysis of heap allocations

    In object-oriented languages, dynamic memory allocation is a fundamental concept. When using such a language in hard real-time systems, it becomes important to bound both the worst-case execution time and the worst-case memory consumption. In this paper, we present an analysis to determine the worst-case heap allocations of tasks. The analysis builds upon techniques that are well established for worst-case execution time analysis. The difference is that the cost function is not the execution time of instructions in clock cycles, but the allocation in bytes. In contrast to worst-case execution time analysis, worst-case heap allocation analysis is not processor dependent. However, the cost function depends on the object layout of the runtime system. The analysis is evaluated with several real-time benchmarks to establish the usefulness of the analysis, and to compare the memory consumption of different object layouts.
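    A minimal sketch of the idea (my own illustration, not the analysis presented in the paper): each allocation site is costed in bytes according to an assumed object layout, and per-block costs are summed along the worst-case path scaled by loop bounds, exactly as a WCET analysis would sum cycle counts. The layout constants and helper names below are hypothetical placeholders.

        # Hypothetical object layout: 8-byte header plus 4 bytes per field.
        HEADER_BYTES = 8
        FIELD_BYTES = 4

        def alloc_cost(num_fields: int) -> int:
            """Bytes consumed by one allocation of an object with num_fields fields."""
            return HEADER_BYTES + FIELD_BYTES * num_fields

        def worst_case_alloc(blocks, loop_bounds):
            """blocks: list of (block_id, [field count per allocation site]).
            loop_bounds: block_id -> maximum iteration count (1 for straight-line code).
            Sums each block's allocation, scaled by its loop bound."""
            total = 0
            for block_id, sites in blocks:
                per_iteration = sum(alloc_cost(f) for f in sites)
                total += loop_bounds.get(block_id, 1) * per_iteration
            return total

        # Example: a loop bounded by 10 iterations, allocating one 2-field object per pass.
        print(worst_case_alloc([("loop", [2])], {"loop": 10}))  # -> 160 bytes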

    Invariant Patterns for Program Reasoning

    A Fistful of Dollars: Formalizing Asymptotic Complexity Claims via Deductive Program Verification

    Held as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2018.
    We present a framework for simultaneously verifying the functional correctness and the worst-case asymptotic time complexity of higher-order imperative programs. We build on top of Separation Logic with Time Credits, embedded in an interactive proof assistant. We formalize the O notation, which is key to enabling modular specifications and proofs. We cover the subtleties of the multivariate case, where the complexity of a program fragment depends on multiple parameters. We propose a way of integrating complexity bounds into specifications, present lemmas and tactics that support a natural reasoning style, and illustrate their use with a collection of examples.
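    As a reminder of what such a formalization has to pin down, one common statement of multivariate domination is the following (a sketch of the standard textbook definition, not necessarily the exact filter-based formalization used in the paper):

        % f is dominated by g when, beyond some threshold N in every parameter,
        % f is bounded by a constant multiple of g.
        f \in O(g)
          \;\iff\;
          \exists c > 0,\ \exists N,\
          \forall n_1, \dots, n_k \ge N:\quad
          |f(n_1, \dots, n_k)| \;\le\; c \cdot g(n_1, \dots, n_k)

    The subtlety the abstract alludes to is precisely how "beyond some threshold" is interpreted when several parameters can grow independently, since different choices yield different, incomparable O relations.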

    Tight polynomial bounds for Loop programs in polynomial space

    We consider the following problem: given a program, find tight asymptotic bounds on the values of some variables at the end of the computation (or at any given program point) in terms of its input values. We focus on the case of polynomially-bounded variables, and on a weak programming language for which we have recently shown that tight bounds for polynomially-bounded variables are computable. These bounds are sets of multivariate polynomials. While their computability has been settled, the complexity of this program-analysis problem remained open. In this paper, we show the problem to be PSPACE-complete. The main contribution is a new, space-efficient analysis algorithm. This algorithm is obtained in a few steps. First, we develop an algorithm for univariate bounds, a sub-problem which is already PSPACE-hard. Then, a decision procedure for multivariate bounds is achieved by reducing this problem to the univariate case; this reduction is orthogonal to the solution of the univariate problem and uses observations on the geometry of a set of vectors that represent multivariate bounds. Finally, we transform the univariate-bound algorithm to produce multivariate bounds.
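    For intuition only, here is a hypothetical program in the spirit of the weak loop language studied (written as Python for readability; the program and variable names are my own example, not taken from the paper): the analysis problem is to find, for each variable, a tight polynomial in the inputs that bounds its final value.

        # Hypothetical example: loops iterate at most as many times as a
        # variable's value, and the body may only copy and add variables.
        def loop_program(x1, x2):
            x3 = 0
            for _ in range(x2):   # "loop x2 { ... }"
                x3 = x3 + x1      # add x1 on every pass
            return x3             # final value satisfies x3 <= x1 * x2

        # For non-negative inputs, the analysis would report the multivariate
        # polynomial x1*x2 as a tight bound on the final value of x3.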

    Potential Impact of Antiretroviral Chemoprophylaxis on HIV-1 Transmission in Resource-Limited Settings

    Background. The potential impact of pre-exposure chemoprophylaxis (PrEP) on heterosexual transmission of HIV-1 infection in resource-limited settings is uncertain.
    Methodology/Principal Findings. A deterministic mathematical model was used to simulate the effects of antiretroviral PrEP on an HIV-1 epidemic in sub-Saharan Africa under different scenarios (optimistic, neutral, and pessimistic), both with and without sexual disinhibition. Sensitivity analyses were used to evaluate the effect of uncertainty in input parameters on model output and included calculation of partial rank correlations and standardized rank regressions. In the scenario without sexual disinhibition after PrEP initiation, key parameters influencing infections prevented were effectiveness of PrEP (partial rank correlation coefficient (PRCC) = 0.94), PrEP discontinuation rate (PRCC = -0.94), level of coverage (PRCC = 0.92), and time to achieve target coverage (PRCC = -0.82). In the scenario with sexual disinhibition, PrEP effectiveness and the extent of sexual disinhibition had the greatest impact on prevention. An optimistic scenario of PrEP with 90% effectiveness and 75% coverage of the general population predicted a 74% decline in cumulative HIV-1 infections after 10 years, and a 28.8% decline with PrEP targeted to the highest-risk groups (16% of the population). Even with a 100% increase in at-risk behavior from sexual disinhibition, a beneficial effect (23.4%-62.7% decrease in infections) was seen with 90% effective PrEP across a broad range of coverage (25%-75%). Similar disinhibition led to a rise in infections with lower effectiveness of PrEP (≀50%).
    Conclusions/Significance. Mathematical modeling supports the potential public health benefit of PrEP. Approximately 2.7 to 3.2 million new HIV-1 infections could be averted in southern sub-Saharan Africa over 10 years by targeting PrEP (with 90% effectiveness) to those at highest behavioral risk and by preventing sexual disinhibition. This benefit could be lost, however, through sexual disinhibition and high PrEP discontinuation, especially with lower PrEP effectiveness (≀50%). © 2007 Abbas et al.
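    To illustrate the kind of deterministic model involved (a toy sketch of my own, far simpler than the authors' model, with hypothetical parameter values throughout), the following difference-equation simulation shows how PrEP coverage and effectiveness reduce cumulative infections:

        # Toy two-compartment (susceptible/infected) model with placeholder values;
        # a fraction `coverage` of susceptibles is on PrEP with given `effectiveness`.
        def simulate(years=10, beta=0.10, coverage=0.75, effectiveness=0.90,
                     s0=0.95, i0=0.05):
            """Return cumulative new infections per capita over `years`."""
            s, i, cumulative = s0, i0, 0.0
            for _ in range(years):
                risk = beta * i  # per-year force of infection
                new_inf = s * ((1 - coverage) * risk
                               + coverage * (1 - effectiveness) * risk)
                s, i = s - new_inf, i + new_inf
                cumulative += new_inf
            return cumulative

        baseline = simulate(coverage=0.0)
        with_prep = simulate(coverage=0.75, effectiveness=0.90)
        print(f"infections averted: {1 - with_prep / baseline:.1%}")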

    The neurocognitive functioning in bipolar disorder: a systematic review of data

    Unlimp -- uniqueness as a leitmotiv for implementation

    When evaluation in functional programming languages is explained using the λ-calculus and/or term rewriting systems, expressions and function definitions are often defined as terms, that is, as trees. Similarly, the collection of all terms is defined as a forest, that is, a directed acyclic graph where every vertex has at most one incoming edge. Concrete implementations usually drop the last restriction (and sometimes acyclicity as well), i.e. many terms can share a common subterm, meaning that different paths of subterm edges reach the same vertex in the graph. Any vertex in such a graph represents a term. A term is represented uniquely in such a graph if no two different vertices represent it. Such a representation can be established by using hash-consing for the creation of heap objects. We investigate the consequences of adopting uniqueness in this sense as a leitmotiv for implementation (called Unlimp), i.e. not allowing any two different vertices in a graph to represent the same term.
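    A minimal sketch of hash-consing in this sense (my own illustration in Python, not Unlimp itself): every constructor call goes through a table keyed on the operator and the identities of the already-unique children, so structurally equal terms are always represented by the same heap object.

        # Global table mapping (constructor, child identities) to the unique node.
        _table = {}

        class Term:
            __slots__ = ("op", "args")
            def __init__(self, op, args):
                self.op = op
                self.args = args

        def make(op, *args):
            """Create, or reuse, the unique node representing op(args)."""
            key = (op, tuple(id(a) for a in args))
            node = _table.get(key)
            if node is None:
                node = Term(op, args)
                _table[key] = node
            return node

        # Building f(x, x) twice yields the very same object: unique representation.
        x = make("x")
        t1 = make("f", x, x)
        t2 = make("f", x, x)
        print(t1 is t2)  # True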

    An overview of the ECL programming system
