23,813 research outputs found

    Quasi-friendly sup-interpretations

    In a previous paper, the sup-interpretation method was proposed as a new tool to control, by static analysis, the memory resources of first-order functional programs with pattern matching. Basically, a sup-interpretation provides an upper bound on the size of function outputs. In that earlier work, a criterion applicable to terminating as well as non-terminating programs was developed in order to bound the stack frame size polynomially. In this paper, we suggest a new criterion which captures more algorithms computing values polynomially bounded in the size of the inputs. Since this work is related to quasi-interpretations, we compare the two notions and obtain two main results. The first is that, given a program, we have heuristics for finding a sup-interpretation when we restrict to polynomials of bounded degree. The second is a characterization of the sets of functions computable in polynomial time and in polynomial space.
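
    As a minimal illustration of the idea (a hypothetical example, not taken from the paper), consider a first-order function defined by pattern matching together with a candidate sup-interpretation written as an ordinary function on sizes:

        -- A first-order function defined by pattern matching: list append.
        append :: [a] -> [a] -> [a]
        append []     ys = ys
        append (x:xs) ys = x : append xs ys

        -- A sup-interpretation assigns to append a monotone function that
        -- bounds the size of its output in terms of the sizes of its inputs;
        -- here the polynomial n + m is such a bound, since
        -- length (append xs ys) = length xs + length ys.
        supAppend :: Int -> Int -> Int
        supAppend n m = n + m

        main :: IO ()
        main = do
          let xs = [1, 2, 3] :: [Int]
              ys = [4, 5]
          print (length (append xs ys))               -- 5
          print (supAppend (length xs) (length ys))   -- 5, the bound holds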

    Synthesis of sup-interpretations: a survey

    In this paper, we survey the complexity of distinct methods that allow the programmer to synthesize a sup-interpretation, a function providing an upper bound on the size of the output values computed by a program. It is a static space analysis tool that does not take time consumption into account. Although clearly related, sup-interpretation is independent of termination, since it only provides an upper bound on the terminating computations. First, we study some undecidable properties of sup-interpretations from a theoretical point of view. Next, we fix term rewriting systems as our computational model and show that a sup-interpretation can be obtained through the use of a well-known termination technique, polynomial interpretations. The drawback is that such a method only applies to total functions (strongly normalizing programs). To overcome this problem, we also study sup-interpretations through the notion of quasi-interpretation. Quasi-interpretations, however, suffer from a drawback that lies in the subterm property, which drastically restricts the shape of the considered functions. Again we overcome this problem by introducing a new notion of interpretation mainly based on the dependency pair method. We study the decidability and complexity of the sup-interpretation synthesis problem for all three tools over sets of polynomials. Finally, we build on previous work on termination and runtime complexity to infer sup-interpretations. Comment: (2012
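
    As a hedged sketch of the first synthesis route mentioned above (the rewrite rules and coefficients below are illustrative, not taken from the survey), a polynomial interpretation that proves termination also yields a sup-interpretation:

        % Illustrative rewrite system for addition over unary numerals.
        \begin{align*}
          \mathit{add}(0, y)    &\to y \\
          \mathit{add}(s(x), y) &\to s(\mathit{add}(x, y))
        \end{align*}
        % A polynomial interpretation over the naturals:
        %   [0] = 0, \quad [s](x) = x + 1, \quad [\mathit{add}](x, y) = 2x + y + 1.
        % Every rule strictly decreases under it (e.g. 2x + y + 3 > 2x + y + 2
        % for the second rule), so the system is terminating, and since
        % [\mathit{add}](x, y) \ge x + y it also bounds the size of every value
        % computed by add, i.e. it can serve as a sup-interpretation.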

    Resource Control for Synchronous Cooperative Threads

    We develop new methods to statically bound the resources needed for the execution of systems of concurrent, interactive threads. Our study is concerned with a synchronous model of interaction based on cooperative threads whose execution proceeds in synchronous rounds called instants. Our contribution is a system of compositional static analyses to guarantee that each instant terminates and to bound the size of the values computed by the system as a function of the size of its parameters at the beginning of the instant. Our method generalises an approach designed for first-order functional languages that relies on a combination of standard termination techniques for term rewriting systems and an analysis of the size of the computed values based on the notion of quasi-interpretation. We show that these two methods can be combined to obtain an explicit polynomial bound on the resources needed for the execution of the system during an instant. As a second contribution, we introduce a virtual machine and a related bytecode, thus producing a precise description of the resources needed for the execution of a system. In this context, we present a suitable control flow analysis that allows us to formulate the static analyses for resource control at the bytecode level.
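
    A toy sketch of the property being certified (assumed names and a made-up thread body, not the paper's calculus): during one instant every thread takes one cooperative step, and the sizes produced stay polynomially bounded in the sizes held at the start of the instant.

        import Data.List (foldl')

        -- Each thread owns a list; its size is the length of that list.
        type Store = [[Int]]

        -- One cooperative step of a toy thread: it may read the other lists,
        -- and what it writes is at most its own size plus the total size read.
        step :: Store -> Int -> Store
        step store i =
          let grown = (store !! i) ++ concat store
          in take i store ++ [grown] ++ drop (i + 1) store

        -- One instant: every thread takes exactly one step, then cooperates.
        instant :: Store -> Store
        instant store = foldl' step store [0 .. length store - 1]

        -- For a fixed set of threads, the total size after an instant is at
        -- most a constant multiple of the total size at its start; the static
        -- analysis certifies such a polynomial bound without running the code.
        main :: IO ()
        main = print (map length (instant [[1, 2], [3], [4, 5, 6]]))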

    Mandarin Singing Voice Synthesis Based on Harmonic Plus Noise Model and Singing Expression Analysis

    The purpose of this study is to investigate how humans interpret musical scores expressively, and then to design machines that sing like humans. We consider six factors that have a strong influence on the expression of human singing. The factors are related to the acoustic, phonetic, and musical features of a real singing signal. Given real singing voices recorded following the MIDI scores and lyrics, our analysis module can extract the expression parameters from the real singing signals semi-automatically. The expression parameters are used to control the singing voice synthesis (SVS) system for Mandarin Chinese, which is based on the harmonic plus noise model (HNM). The results of perceptual experiments show that integrating the expression factors into the SVS system yields a notable improvement in perceptual naturalness, clearness, and expressiveness. By mapping the real singing signal and expression controls one-to-one onto the synthesizer, our SVS system can simulate the interpretation of a real singer with the timbre of a speaker. Comment: 8 pages, technical report
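
    A rough sketch of the harmonic plus noise idea in its generic form (made-up parameter values, a stand-in noise source, and not the authors' implementation): a voiced frame is a sum of harmonics of the fundamental plus a small noise component.

        -- One synthetic frame: harmonics of the fundamental f0 plus noise.
        hnmFrame :: Double -> [Double] -> Double -> Int -> [Double]
        hnmFrame f0 amps sampleRate nSamples =
          let ts         = [fromIntegral n / sampleRate | n <- [0 .. nSamples - 1]]
              harmonic t = sum [ a * sin (2 * pi * fromIntegral k * f0 * t)
                               | (k, a) <- zip [1 :: Int ..] amps ]
          in zipWith (\t e -> harmonic t + e) ts (noise nSamples)

        -- A tiny deterministic pseudo-noise source standing in for the HNM
        -- noise component (kept dependency-free on purpose).
        noise :: Int -> [Double]
        noise n = take n
          [ fromIntegral (x `mod` 1000) / 1000 * 0.02 - 0.01
          | x <- iterate (\s -> (1103515245 * s + 12345) `mod` 2147483648) 7 ]

        main :: IO ()
        main = print (take 5 (hnmFrame 220 [0.5, 0.25, 0.125] 16000 160))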

    On Quasi-Interpretations, Blind Abstractions and Implicit Complexity

    Quasi-interpretations are a technique to guarantee complexity bounds on first-order functional programs: together with termination orderings they give, in particular, a sufficient condition for a program to be executable in polynomial time, called here the P-criterion. We study properties of the programs satisfying the P-criterion in order to better understand its intensional expressive power. Given a program on binary lists, its blind abstraction is the nondeterministic program obtained by replacing lists by their lengths (natural numbers). A program is blindly polynomial if its blind abstraction terminates in polynomial time. We show that all programs satisfying a variant of the P-criterion are in fact blindly polynomial. We then give two extensions of the P-criterion: one relaxes the termination ordering condition, and the other (the bounded value property) gives a necessary and sufficient condition for a program to be executable in polynomial time with memoisation. Comment: 18 pages
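
    A small hypothetical illustration of blind abstraction (not an example from the paper): a function on binary lists is abstracted into a nondeterministic function on lengths, with nondeterminism rendered here as a list of possible results.

        -- A first-order function on binary lists: drop the leading zero bits.
        dropZeros :: [Bool] -> [Bool]
        dropZeros (False : xs) = dropZeros xs
        dropZeros xs           = xs

        -- Its blind abstraction: the list is replaced by its length, so the
        -- test on the head bit becomes a nondeterministic choice and we collect
        -- every possible result.  The original program is blindly polynomial
        -- because this abstraction runs in time polynomial in n.
        dropZerosBlind :: Integer -> [Integer]
        dropZerosBlind 0 = [0]
        dropZerosBlind n = dropZerosBlind (n - 1) ++ [n]

        main :: IO ()
        main = do
          print (dropZeros [False, False, True, False])   -- [True,False]
          print (dropZerosBlind 4)                        -- [0,1,2,3,4]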

    Quasi-interpretations: a way to control resources

    This paper presents, in a reasoned way, our work on resource analysis by quasi-interpretations. The controlled resources are typically the runtime, the runspace or the size of a result in a program execution. Quasi-interpretations allow analyzing system complexity. A quasi-interpretation is a numerical assignment which provides an upper bound on the computed functions and which is compatible with the program's operational semantics. The quasi-interpretation method offers several advantages: (i) it provides hints in order to optimize an execution, (ii) it gives resource certificates, and (iii) finding quasi-interpretations is decidable for a broad class which is relevant for feasible computations. By combining the quasi-interpretation method with termination tools (here, term orderings), we obtained several characterizations of complexity classes, starting from Ptime and Pspace.
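
    To make the definition concrete (an illustrative program and assignment of our own, not taken verbatim from the paper):

        % A quasi-interpretation is a weakly monotone assignment with the
        % subterm property that never increases across the rules.
        \begin{align*}
          \mathit{double}(0)    &\to 0 \\
          \mathit{double}(s(x)) &\to s(s(\mathit{double}(x)))
        \end{align*}
        % Take, over the non-negative reals,
        %   [0] = 0, \quad [s](x) = x + 1, \quad [\mathit{double}](x) = 2x.
        % It is weakly monotone, satisfies the subterm property
        % ([f](\dots, x_i, \dots) \ge x_i), and for each rule [l] \ge [r],
        % e.g. [\mathit{double}(s(x))] = 2x + 2 \ge 2x + 2 = [s(s(\mathit{double}(x)))].
        % Hence 2x bounds the size of every value computed by double, even
        % without any termination argument.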

    Confidence limits of evolutionary synthesis models. IV. Moving forward to a probabilistic formulation

    Synthesis models predict the integrated properties of stellar populations. Several problems exist in this field, mostly related to the fact that integrated properties are distributed. To date, this aspect has been either ignored (as in standard synthesis models, which are inherently deterministic) or interpreted phenomenologically (as in Monte Carlo simulations, which describe distributed properties rather than explain them). We approach population synthesis as a problem in probability theory, in which stellar luminosities are random variables extracted from the stellar luminosity distribution function (sLDF). We derive the population LDF (pLDF) for clusters of any size from the sLDF, obtaining the scale relations that link the sLDF to the pLDF. We recover the predictions of standard synthesis models, which are shown to compute the mean of the sLDF. We provide diagnostic diagrams and a simplified recipe for testing the statistical richness of observed clusters, thereby assessing whether standard synthesis models can be safely used or a statistical treatment is mandatory. We also recover the predictions of Monte Carlo simulations, with the additional bonus of being able to interpret them in mathematical and physical terms. We give examples of problems that can be addressed through our probabilistic formalism. Though still under development, ours is a powerful approach to population synthesis. In an era of resolved observations and pipelined analyses of large surveys, this paper is offered as a signpost in the field of stellar populations. Comment: Accepted by A&A. Substantially modified with respect to the 1st draft. 26 pages, 14 figures
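
    Stated compactly (a standard convolution argument paraphrasing the abstract, assuming independently drawn stellar luminosities, not the paper's full derivation):

        % If a single star's luminosity \ell is a random variable with stellar
        % LDF \varphi_1(\ell), the integrated luminosity of an N-star cluster,
        % L = \sum_{i=1}^{N} \ell_i, follows the population LDF
        %   \varphi_N(L) = (\underbrace{\varphi_1 * \cdots * \varphi_1}_{N})(L),
        % the N-fold convolution, with
        %   \langle L \rangle = N \langle \ell \rangle, \qquad
        %   \sigma_L^2 = N \sigma_\ell^2 .
        % Standard (deterministic) synthesis models return N\langle\ell\rangle,
        % i.e. the mean of the sLDF; the scatter around it only becomes
        % negligible for statistically rich clusters.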

    Complexity Bounds for Ordinal-Based Termination

    "What more than its truth do we know if we have a proof of a theorem in a given formal system?" We examine Kreisel's question in the particular context of program termination proofs, with an eye to deriving complexity bounds on program running times. Our main tools for this are length function theorems, which provide complexity bounds on the use of well quasi orders. We illustrate how to prove such theorems in the simple, yet until now untreated, case of ordinals. We then show how to apply this new theorem to derive complexity bounds on programs when they are proven to terminate thanks to a ranking function into some ordinal. Comment: Invited talk at the 8th International Workshop on Reachability Problems (RP 2014, 22-24 September 2014, Oxford)
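
    An illustrative toy program (assumed, not taken from the talk) whose termination proof uses a ranking function into the ordinal omega^2; length function reasoning then turns the ordinal descent into an explicit step count.

        -- A loop over two counters: either the inner counter y decreases, or
        -- the outer counter x decreases and y is reset to some value bounded
        -- by n.  The ranking function  rank(x, y) = omega*x + y  (an ordinal
        -- below omega^2) strictly decreases at every step, proving
        -- termination; counting the descents gives the bound y0 + x0 * (n + 1).
        loop :: Int -> (Int, Int) -> Int
        loop n (x, y)
          | y > 0     = 1 + loop n (x, y - 1)   -- omega*x + y  >  omega*x + (y - 1)
          | x > 0     = 1 + loop n (x - 1, n)   -- omega*x      >  omega*(x - 1) + n
          | otherwise = 0

        main :: IO ()
        main = print (loop 3 (2, 3))   -- 3 + 2 * (3 + 1) = 11 steps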

    Polynomial Size Analysis of First-Order Shapely Functions

    We present a size-aware type system for first-order shapely function definitions. Here, a function definition is called shapely when the size of the result is determined exactly by a polynomial in the sizes of the arguments. Examples of shapely function definitions include implementations of matrix multiplication and of the Cartesian product of two lists. The type system is proved to be sound w.r.t. the operational semantics of the language. The type checking problem is shown to be undecidable in general. We define a natural syntactic restriction such that type checking becomes decidable, even though size polynomials are not necessarily linear or monotonic. Furthermore, we show that the type-inference problem is at least semi-decidable under this restriction. We have implemented a procedure that combines run-time testing and type-checking to automatically obtain size dependencies. It terminates on total typable function definitions. Comment: 35 pages, 1 figure
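
    A brief illustration of shapeliness using one of the examples named in the abstract (the size annotations in the comments are ours):

        -- Cartesian product of two lists: for argument sizes n and m the
        -- result size is exactly the polynomial n * m, so this definition is
        -- shapely.
        cartesian :: [a] -> [b] -> [(a, b)]
        cartesian xs ys = [ (x, y) | x <- xs, y <- ys ]

        -- By contrast, filtering is not shapely: the size of its result
        -- depends on the values of the argument, not only on its size.

        main :: IO ()
        main = print (length (cartesian [1, 2, 3 :: Int] "ab"))   -- 3 * 2 = 6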

    Feasible reactivity in a synchronous pi-calculus

    Reactivity is an essential property of a synchronous program. Informally, it guarantees that at each instant the program, fed with an input, will 'react' and produce an output. In the present work, we consider a refined property that we call 'feasible reactivity'. Beyond reactivity, this property guarantees that at each instant both the size of the program and its reaction time are bounded by a polynomial in the size of the parameters at the beginning of the computation and the size of the largest input. We propose a method to annotate programs and we develop related static analysis techniques that guarantee feasible reactivity for programs expressed in the S-pi-calculus, a synchronous version of the pi-calculus based on the SL synchronous programming model.
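
    Stated schematically (our paraphrase of the property, not the paper's formal definition):

        % Feasible reactivity: there exist polynomials Q and R such that, at
        % every instant i of a run of the (annotated) program with parameters
        % of total size p at the beginning of the computation and inputs
        % I_1, \dots, I_i received so far,
        %   \lvert P_i \rvert \le Q\bigl(p, \max_{j \le i} \lvert I_j \rvert\bigr),
        %   \qquad
        %   \mathit{time}(P_i) \le R\bigl(p, \max_{j \le i} \lvert I_j \rvert\bigr).
        % Plain reactivity only asks each instant to terminate; feasibility in
        % addition pins the program size and the reaction time to these bounds.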