
    Towards an aesthetic reading of communicative competence models: the relevance of creativeness

    Antonio Garcés Rodríguez, Facultad de Ciencias de la Educación, Universidad de Granada. Received: 10/02/2019 | Accepted: 25/05/2019. Correspondence via ORCID: Antonio Garcés-Rodríguez - 0000-0002-3426-7841. This paper presents a literature review of the different concepts and models of communicative competence, revisiting them according to the creative and aesthetic development they stimulate in students. The classic models of Hymes (1972); Halliday (1971, 1972); Munby (1978); Widdowson (1978); Savignon (1983); Canale and Swain (1980); Canale (1983); Spitzberg (1988); Saville-Troike (1989, 1996); Stern (1986); Spolsky (1990) and Bachman (1990) are reviewed with a double objective: (i) to collect and integrate the findings on communicative competence (CC) so as to offer future teachers of English as a Foreign Language clear notions and models, and (ii) to re-examine the model of communicative competence through the filter of creativity and aesthetic reading. The research methodology selects the models under review according to their high degree of acceptance in the field of Language and Literature Didactics and through specialized academic-scientific search engines. The discussion and conclusions suggest the need for a more open model of communicative competence whose central axis is creativity and aesthetics as supports of linguistic acquisition and learning and their transformation into communicative competence.

    Soft Contract Verification

    Behavioral software contracts are a widely used mechanism for governing the flow of values between components. However, run-time monitoring and enforcement of contracts impose significant overhead and delay the discovery of faulty components until run time. To overcome these issues, we present soft contract verification, which aims to statically prove either complete or partial contract correctness of components written in an untyped, higher-order language with first-class contracts. Our approach uses higher-order symbolic execution, leveraging contracts as a source of symbolic values, including unknown behavioral values, and employs an updatable heap of contract invariants to reason about flow-sensitive facts. We prove that the symbolic execution soundly approximates the dynamic semantics and that verified programs can't be blamed. The approach is able to analyze first-class contracts, recursive data structures, unknown functions, and control-flow-sensitive refinements of values, which are all idiomatic in dynamic languages. It makes effective use of an off-the-shelf solver to decide problems without heavy encodings. The approach is competitive with a wide range of existing tools---including type systems, flow analyzers, and model checkers---on their own benchmarks. Comment: ICFP '14, September 1-6, 2014, Gothenburg, Sweden.
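
    As a rough illustration of the kind of behavioral contract being verified, the sketch below wraps a higher-order function with domain and range checks and records which party to blame on violation. It is written in Scala rather than the paper's untyped, Racket-like language, and the names (wrap, Flat, the blame parties) are illustrative assumptions, not the paper's API; soft contract verification aims to discharge such checks statically so that no blame can arise at run time.

    // Illustrative sketch only: behavioral contracts as run-time wrappers.
    // The names Contract, wrap and blame are hypothetical, not the paper's API.
    object SoftContractsSketch {
      final case class ContractViolation(party: String, msg: String)
          extends RuntimeException(s"blame $party: $msg")

      // A flat contract is just a predicate on values.
      type Flat[A] = A => Boolean

      // A higher-order (function) contract checks the argument against `dom`
      // (blaming the caller) and the result against `rng` (blaming the callee).
      def wrap[A, B](f: A => B, dom: Flat[A], rng: Flat[B],
                     caller: String, callee: String): A => B = { a =>
        if (!dom(a)) throw ContractViolation(caller, s"bad argument $a")
        val b = f(a)
        if (!rng(b)) throw ContractViolation(callee, s"bad result $b")
        b
      }

      def main(args: Array[String]): Unit = {
        val sqrt = wrap[Double, Double](math.sqrt, _ >= 0, _ >= 0, "client", "sqrt")
        println(sqrt(4.0))   // fine
        // sqrt(-1.0)        // would blame "client" at run time; soft contract
        //                   // verification tries to prove such blame impossible.
      }
    }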

    Automating Verification of Functional Programs with Quantified Invariants

    We present the foundations of a verifier for higher-order functional programs with generics and recursive algebraic data types. Our verifier supports finding sound proofs and counterexamples even in the presence of certain quantified invariants and recursive functions. Our approach uses the same language to describe programs and invariants and uses semantic criteria for establishing termination. Our implementation makes effective use of SMT solvers by encoding first-class functions and quantifiers into a quantifier-free fragment of first-order logic with theories. We are able to specify properties of data structure operations involving higher-order functions with minimal annotation overhead and verify them with a high degree of automation. Our system is also effective at reporting counterexamples, even in the presence of first-order quantification.
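
    The specification style such a verifier targets can be sketched in plain Scala with require/ensuring; the example below (an illustrative insert into a sorted list, not taken from the paper) states a recursive, quantifier-like sortedness invariant. Plain Scala only checks these conditions at run time, whereas the verifier described above attempts to prove them statically via SMT.

    // Sketch of the spec style: a recursive ADT plus a recursive invariant.
    // Plain Scala checks these at run time; the verifier proves them statically.
    object SortedInsertSpec {
      sealed trait IntList
      case object Nil extends IntList
      final case class Cons(head: Int, tail: IntList) extends IntList

      // "for all adjacent pairs, head <= next" -- a recursive invariant.
      def isSorted(l: IntList): Boolean = l match {
        case Cons(x, t @ Cons(y, _)) => x <= y && isSorted(t)
        case _                       => true
      }

      def contains(l: IntList, v: Int): Boolean = l match {
        case Cons(h, t) => h == v || contains(t, v)
        case Nil        => false
      }

      // Precondition and postcondition in Scala's Predef style.
      def insert(l: IntList, v: Int): IntList = {
        require(isSorted(l))
        (l match {
          case Nil                  => Cons(v, Nil)
          case Cons(h, _) if v <= h => Cons(v, l)
          case Cons(h, t)           => Cons(h, insert(t, v))
        }) ensuring (res => isSorted(res) && contains(res, v))
      }

      def main(args: Array[String]): Unit =
        println(insert(Cons(1, Cons(3, Nil)), 2))
    }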

    Programming with Specifications

    This thesis explores the use of specifications for the construction of correct programs. We go beyond their standard use as run-time assertions, and present algorithms, techniques and implementations for the tasks of 1) program verification, 2) declarative programming and 3) software synthesis. These results are made possible by our advances in the domains of decision procedure design and implementation. In the first part of this thesis, we present a decidability result for a class of logics that support user-defined recursive function definitions. Constraints in this class can encode expressive properties of recursive data structures, such as sortedness of a list, or balancing of a search tree. As a result, complex verification conditions can be stated concisely and solved entirely automatically. We also present a new decision procedure for a logic to reason about sets and constraints over their cardinalities. The key insight lies in a technique to decompose constraints according to mutual dependencies. Compared to previous techniques, our algorithm brings significant improvements in running times, and for the first time integrates reasoning about cardinalities within the popular DPLL(T) setting. We integrated our algorithmic advances into Leon, a static analyzer for functional programs. Leon can reason about constraints involving arbitrary recursive function definitions, and has the desirable theoretical property that it will always find counter-examples to assertions that do not hold. We illustrate the flexibility and efficiency of Leon through experimental evaluation, where we used it to prove detailed correctness properties of data structure implementations. We then illustrate how program specifications can be used as a high-level programming construct; we present Kaplan, an extension of Scala with first-class logical constraints. Kaplan allows programmers to create, manipulate and combine constraints as they would any other data structure. Our implementation of Kaplan illustrates how declarative programming can be incorporated into an existing mainstream programming language. Moreover, we examine techniques to transform, at compile-time, program specifications into efficient executable code. This approach of software synthesis combines the correctness benefits of declarative programming with the efficiency of imperative or functional programming.
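
    To give a flavour of "constraints as first-class values", here is a deliberately naive Scala sketch: a constraint is an ordinary predicate value that can be combined and handed to a toy enumeration-based solve. The Constraint type and solve method are hypothetical illustrations of the idea only; they are not Kaplan's actual API, which relies on Leon's decision procedures rather than enumeration.

    // Illustration only: constraints as ordinary first-class values that can be
    // created, combined, and (here, naively) solved by bounded enumeration.
    // This is a hypothetical sketch, not Kaplan's actual API or implementation.
    object FirstClassConstraints {
      // A constraint over Int is just a predicate; conjunction composes them.
      final case class Constraint(pred: Int => Boolean) {
        def &&(other: Constraint): Constraint =
          Constraint(x => pred(x) && other.pred(x))
        // Naive "solver": search a bounded domain for a witness.
        def solve(lo: Int, hi: Int): Option[Int] = (lo to hi).find(pred)
      }

      def main(args: Array[String]): Unit = {
        val even    = Constraint(_ % 2 == 0)
        val inRange = Constraint(x => x > 10 && x < 20)
        println((even && inRange).solve(0, 100)) // Some(12)
      }
    }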

    Analyse statique de transformations pour l'élimination de motifs (Static analysis of transformations for pattern elimination)

    Program transformation is an extremely common practice in computer science. From compilation to test generation, through many approaches to code analysis and formal verification of programs, it is a process that is both ubiquitous and critical to properly functioning programs and information systems. This thesis studies program transformation mechanisms in order to express and verify syntactic guarantees on the behaviour of these transformations and on their results. Characterising the shape of the terms returned by such a transformation is, indeed, a common approach to the formal verification of programs. In order to express some properties often used by this type of approach, we propose a formalism inspired by the model of compilation passes, which describe the overall compilation of a program as a sequence of minimal transformations each affecting only a small number of the language's constructs, and which is based on the notions of pattern matching and term rewriting. This formalism relies on a mechanism for annotating function symbols in order to express a set of specifications describing the behaviour of the associated functions. We then propose a static analysis method to check that a transformation, expressed as a term rewrite system, actually satisfies its specifications.
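
    The "pass" shape the thesis formalizes can be illustrated with a small, hypothetical Scala example (not taken from the thesis): a transformation over an ADT of terms that pattern-matches on one constructor and rewrites it away, together with the syntactic guarantee, checkable on the output, that no occurrence of that constructor remains.

    // Hypothetical sketch of a minimal "pattern-eliminating" pass: the
    // transformation rewrites every Not node into Implies(e, False), so its
    // output should contain no Not constructor -- the kind of syntactic
    // guarantee the thesis's annotations and static analysis target.
    object PatternEliminationPass {
      sealed trait Expr
      case object True                           extends Expr
      case object False                          extends Expr
      final case class Not(e: Expr)              extends Expr
      final case class And(l: Expr, r: Expr)     extends Expr
      final case class Implies(l: Expr, r: Expr) extends Expr

      // The pass: eliminate Not by rewriting Not(e) into Implies(e, False).
      def elimNot(e: Expr): Expr = e match {
        case Not(x)        => Implies(elimNot(x), False)
        case And(l, r)     => And(elimNot(l), elimNot(r))
        case Implies(l, r) => Implies(elimNot(l), elimNot(r))
        case leaf          => leaf
      }

      // The guarantee the pass would be annotated with: no Not remains.
      def noNot(e: Expr): Boolean = e match {
        case Not(_)        => false
        case And(l, r)     => noNot(l) && noNot(r)
        case Implies(l, r) => noNot(l) && noNot(r)
        case _             => true
      }

      def main(args: Array[String]): Unit = {
        val in  = And(Not(True), Not(And(True, False)))
        val out = elimNot(in)
        assert(noNot(out))
        println(out)
      }
    }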