8 research outputs found

    Multi-Agent Spiral Software Engineering: A Lakatosian Approach

    This paper presents an epistemological approach to the development and validation of an original agent-oriented software development methodology (see [Wautelet05a, Wautelet05b]). Agent orientation has been widely presented as a new modeling, design and programming paradigm that could be adopted to build systems, owing to the decisive advantages it offers. These claims are examined and put into perspective in the paper through the Lakatosian approach. Spiral development (see [Boehm00a]) has become popular, especially in object-oriented software project development, since it allows efficient software project management, continuous organizational modeling and requirements acquisition, early implementation, continuous testing, modularity, and so on. The iterative nature of this requirements engineering process is studied here through Herbert Simon's bounded rationality principle and Popper's principle of knowledge growth, nuanced by Lakatos's criticism of the falsification principle.

    A JML-Based strategy for incorporating formal specifications into the software development process

    This thesis presents a JML-based strategy that incorporates formal specifications into the software development process of object-oriented programs. The strategy evolves functional requirements into a “semi-formal” requirements form and then expresses them as JML formal specifications. The strategy is implemented as a formal-specification pseudo-phase that runs in parallel with the other phases of software development. What makes our strategy different from other software development strategies in the literature is the particular use we make of JML specifications all along the way, from requirements to validation and verification. Advisor: Néstor Catañ
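
    For readers unfamiliar with JML, the sketch below shows what such a specification can look like. It is purely illustrative and not taken from the thesis: an informal requirement such as “a withdrawal must not exceed the balance” is restated as pre- and postconditions that JML tools (for example, OpenJML) can check statically or at run time.

        // Illustrative JML contract (not from the thesis): the annotations live
        // in comments, so the class still compiles as ordinary Java.
        public class Account {
            private /*@ spec_public @*/ int balance;

            //@ requires amount > 0 && amount <= balance;
            //@ ensures balance == \old(balance) - amount;
            //@ ensures \result == balance;
            public int withdraw(int amount) {
                balance -= amount;
                return balance;
            }
        }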

    Compiler Support for Operator Overloading and Algorithmic Differentiation in C++

    Multiphysics software needs derivatives for, e.g., solving a system of non-linear equations, conducting model verification, or sensitivity studies. In C++, algorithmic differentiation (AD), based on operator overloading, can be used to calculate derivatives up to machine precision. To that end, the built-in floating-point type is replaced by a user-defined AD type. It overloads all required operators and calculates the original value and the corresponding derivative based on the chain rule of calculus. While changing the underlying type seems straightforward, several complications arise concerning software and performance engineering. These include (1) fundamental language restrictions of C++ w.r.t. user-defined types, (2) type correctness of distributed computations with the Message Passing Interface (MPI) library, and (3) identification and mitigation of AD-induced overheads. To handle these issues, AD experts may spend a significant amount of time enhancing a code with AD, verifying the derivatives, and ensuring optimal application performance. Hence, in this thesis, we propose a modern compiler-based tooling approach to support and accelerate the AD-enhancement process of C++ target codes. In particular, we make contributions to three aspects of AD.

    The initial type change – While the change to the AD type in a target code is conceptually straightforward, it often leads to a multitude of compiler error messages. This is due to the different treatment of built-in floating-point types and user-defined types by the C++ language standard: previously legal code constructs in the target code violate the standard once the built-in floating-point type is replaced with a user-defined AD type. We identify and classify these problematic code constructs, show their root causes, and propose solutions based on localized source transformations. To automate this rather mechanical process, we develop a static code analyser and source transformation tool, called OO-Lint, based on the Clang compiler framework. It flags instances of these problematic code constructs and applies source transformations to make the code compliant with the requirements of the language standard. To show the overall relevance of complications with user-defined types, OO-Lint is applied to several well-known scientific codes, some of which have already been AD-enhanced by others. In all of these applications, except the ones manually treated for AD overloading, problematic code constructs are detected.

    Type correctness of MPI communication – MPI is the de facto standard for programming high-performance, distributed applications. At the same time, MPI has a complex interface whose usage can be error-prone. For instance, MPI derived data types require manual construction by specifying memory locations of the underlying data, and specifying wrong offsets can lead to subtle bugs that are hard to detect. In the context of AD, special libraries exist that handle the required derivative book-keeping by replacing the MPI communication calls with overloaded variants. However, on top of the AD type change, the MPI communication routines have to be changed manually. In addition, the AD type fundamentally changes memory layout assumptions, as it has a different extent than the built-in types; previously valid layout assumptions must therefore be re-verified. As a remedy, to detect any type-related errors, we developed a memory sanitizer tool, called TypeART, based on the LLVM compiler framework and the MPI correctness checker MUST. It tracks all memory allocations relevant to MPI communication to allow checking the underlying type and extent of the typeless memory buffer address passed to any MPI routine. The overhead induced by TypeART on several target applications is manageable.

    AD domain-specific profiling – Applying AD in a black-box manner, without consideration of the target code structure, can have a significant impact on both runtime and memory consumption. An AD expert is usually required to apply further AD-related optimizations to reduce these induced overheads. Traditional profiling techniques are, however, insufficient, as they do not reveal any AD domain-specific metrics. Of interest for AD code optimization are, for example, specific code patterns, especially at the function level, that can be treated efficiently with AD. To that end, we developed a static profiling tool, called ProAD, based on the LLVM compiler framework. For each function, it generates the computational graph based on the static data flow of the floating-point variables. The framework supports pattern analysis on the computational graph to identify the optimal application of the chain rule. We show the potential of the optimal application of AD with two case studies. In both cases, significant runtime improvements can be achieved when the knowledge of the code structure, provided by our tool, is exploited. For instance, with a stencil code, a speedup factor of about 13 is achieved compared to a naive application of AD, and a factor of 1.2 compared to hand-written derivative code.
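
    To make the idea of the AD type concrete, the sketch below shows a minimal forward-mode “dual number” that propagates a value and its derivative through each operation via the chain rule. The thesis relies on C++ operator overloading; this listing is kept in Java for consistency with the other examples on this page, and since Java has no operator overloading, explicit methods stand in for the overloaded operators. It illustrates the general technique only and is not code from the thesis.

        // Minimal forward-mode AD sketch: each operation computes the primal
        // value and the derivative together, following the chain rule.
        final class ADDouble {
            final double value;       // primal value
            final double derivative;  // derivative w.r.t. the chosen input

            ADDouble(double value, double derivative) {
                this.value = value;
                this.derivative = derivative;
            }

            ADDouble add(ADDouble other) {
                // sum rule: (u + v)' = u' + v'
                return new ADDouble(value + other.value, derivative + other.derivative);
            }

            ADDouble mul(ADDouble other) {
                // product rule: (u * v)' = u' * v + u * v'
                return new ADDouble(value * other.value,
                        derivative * other.value + value * other.derivative);
            }

            public static void main(String[] args) {
                ADDouble x = new ADDouble(3.0, 1.0);   // seed dx/dx = 1 at x = 3
                ADDouble f = x.mul(x).add(x);          // f(x) = x*x + x
                System.out.println(f.value);           // 12.0
                System.out.println(f.derivative);      // f'(3) = 2*3 + 1 = 7.0
            }
        }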

    Ferramentas de verificação formal de protocolos criptográficos

    Master's dissertation in Informatics Engineering. Over recent years, the number of distributed applications and the use of the Internet have increased considerably. Many of these critical applications perform sensitive operations, frequently manipulating private and confidential data. It is therefore important, before relying on such an application, to examine its security policies so that the user is not faced with unwelcome surprises later. Cryptographic protocols are a key resource of the system components responsible for providing the desired security guarantees. They are usually small communication protocols that employ cryptographic techniques and have well-specified goals, such as establishing a session key or guaranteeing the authenticity of a communication. The critical nature of these protocols justifies investing in their rigorous (i.e., formal) analysis, thereby ensuring that they fulfil the purpose for which they were designed. However, the analysis of these protocols has proven to be a complex problem, even when an idealized behaviour is attributed to the cryptographic operations employed. For this reason, the scientific community has been proposing new methodologies and tools that assist in the formal verification of cryptographic protocols. In this work, we present a representative set of tools for the formal verification of cryptographic protocols. After describing their main characteristics and the models that support them, we apply them to the analysis of fragments of a concrete protocol that has established itself as a de facto standard on the Internet: the “OpenID” centralized authentication (Single Sign-On) protocol.

    Automated specification-based testing of graphical user interfaces

    Doctoral thesis. Electronic and Computer Engineering. 2006. Faculdade de Engenharia, Universidade do Porto; Departamento de Informática, Escola de Engenharia, Universidade do Minh

    JML-Based formal development of a Java Card application for managing medical appointments

    Although formal methods can dramatically increase the quality of software systems, they have not been widely adopted in the software industry. Many software companies have the perception that formal methods are not cost-effective because they are full of mathematical symbols that are difficult for non-experts to assimilate. The Java Modeling Language (JML, Section 3.3) is an academic initiative towards the development of a common formal specification language for Java programs, and the implementation of tools to check program correctness. This master's thesis shows how JML-based formal methods can be used to formally develop a privacy-sensitive Java application: a smart card application for managing medical appointments, named HealthCard. We follow the software development strategy introduced by João Pestana, presented in Section 3.4. Our work influenced the development of this strategy by providing hands-on insight into the challenges of developing a privacy-sensitive application in Java. Pestana's strategy is based on a three-step evolution of software specifications, from informal ones, through semi-formal ones, to JML formal specifications. We further prove that this strategy can be automated by implementing a tool that generates JML formal specifications from a well-defined subset of informal software specifications. Hence, our work proves that JML-based formal methods techniques are cost-effective and that they can be made popular in the software industry. Although formal methods are not popular in many software development companies, we endeavour to integrate formal methods into general software practice, and we hope our work can contribute to a better acceptance of mathematically based formalisms and tools by software engineers.

    The structure of this document is as follows. Section 2 describes the preliminaries of this thesis work: it introduces the application for managing medical appointments we have implemented, describes the technologies used in its development, and illustrates the Java Card Remote Method Invocation communication model used in the medical application for the client and server applications. Section 3 introduces software correctness, including design by contract and the concept of a contract in JML. Section 4 presents the design structure of the application. Section 5 shows the implementation of the HealthCard. Section 6 describes how the HealthCard is verified and validated using JML formal methods tools. Section 7 includes some metrics of the HealthCard implementation and specification. Section 8 presents a short example of how the client side of a smart card application can be implemented while respecting formal specifications. Section 9 describes a prototype tool that generates JML formal specifications from informal specifications automatically. Section 10 describes some challenges and main ideas that came up during the development of the HealthCard. The full formal specification and implementation of the HealthCard smart card application presented in this document can be found at https://sourceforge.net/projects/healthcard/. Advisor: Néstor Catañ
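
    As a hedged illustration of the design-by-contract style described above, the sketch below shows a JML class invariant together with method contracts for a small appointment store. The class name, methods, and limit are invented for this example and are not taken from the HealthCard sources.

        // Hypothetical sketch, not from the HealthCard code: a class invariant
        // bounds the number of stored appointments, and every method contract
        // must preserve it.
        public class AppointmentBook {
            public static final int MAX_APPOINTMENTS = 10;

            private /*@ spec_public @*/ int count;

            //@ public invariant 0 <= count && count <= MAX_APPOINTMENTS;

            //@ requires count < MAX_APPOINTMENTS;
            //@ ensures count == \old(count) + 1;
            public void addAppointment() {
                count++;
            }

            //@ ensures \result == count;
            public /*@ pure @*/ int getCount() {
                return count;
            }
        }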

    The logic of correctness in software engineering


    The Logic of Correctness in Software Engineering

    Abstract. This paper uses a framework drawn from work in the philosophy of science to characterize the concepts of program correctness that have been used in software engineering, and the contrasting methodological approaches of formal methods and testing. It is argued that software engineering has neglected performative accounts of software development in favour of those inspired by formal logic.