    Security analyses for detecting deserialisation vulnerabilities : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Computer Science at Massey University, Palmerston North, New Zealand

    An important task in software security is to identify potential vulnerabilities. Attackers exploit security vulnerabilities in systems to obtain confidential information, to breach system integrity, and to make systems unavailable to legitimate users. In recent years, particularly in 2012, there has been a rise in reported Java vulnerabilities. One type of vulnerability involves (de)serialisation, a commonly used feature for storing objects or data structures in an external format and restoring them. In 2015, a deserialisation vulnerability was reported in Apache Commons Collections, a popular Java library, which affected numerous Java applications. Another major deserialisation-related vulnerability, affecting 55% of Android devices, was also reported in 2015. Both of these vulnerabilities allowed malicious users to execute arbitrary code on vulnerable systems, a serious risk that prompted the Java community to issue patches fixing serialisation-related vulnerabilities in both the Java Development Kit and libraries. Despite attention to coding guidelines and defensive strategies, deserialisation remains a risky feature and a potential weakness in object-oriented applications; in fact, deserialisation-related vulnerabilities (both denial-of-service and remote code execution) continue to be reported for Java applications. Further, deserialisation is a case of parsing, in which external data is converted from its external representation into a program's internal data structures; hence similar vulnerabilities can potentially be present in parsers for file formats and serialisation languages. The problem is, given a software package, to detect either injection or denial-of-service vulnerabilities and to propose strategies for preventing attacks that exploit them. The research reported in this thesis casts the detection of deserialisation-related vulnerabilities as a program analysis task. The goal is to automatically discover this class of vulnerabilities using program analysis techniques, and to experimentally evaluate the efficiency and effectiveness of the proposed methods on real-world software. We use multiple techniques to detect reachability to sensitive methods, and taint analysis to detect whether untrusted user input can result in security violations. Challenges in using program analysis for detecting deserialisation vulnerabilities include addressing soundness issues in analysing dynamic features of Java (e.g., native code); another hurdle is that available techniques mostly target the analysis of applications rather than library code. In this thesis, we develop techniques to address soundness issues related to analysing Java code that uses serialisation, and we adapt dynamic techniques such as fuzzing to address precision issues in the results of our analysis. We also use the results of our analysis to study libraries in other languages and check whether they are vulnerable to deserialisation-type attacks. We then discuss mitigation measures that engineers can use to protect their software against such vulnerabilities. In our experiments, we show that we can find unreported vulnerabilities in Java code and that similar vulnerabilities are also present in widely used serialisers for popular languages such as JavaScript, PHP and Rust. In our study, we discovered previously unknown denial-of-service security bugs in applications and libraries that parse external data formats such as YAML, PDF and SVG.
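    To make the class of weakness concrete, the following Java sketch (a generic illustration of the pattern, not code from the thesis) contrasts unfiltered deserialisation of attacker-controlled bytes with the JEP 290 serialisation-filter mitigation available since Java 9; the class name UnsafeDeserialisationDemo and the package com.example.dto are hypothetical placeholders.

        import java.io.ByteArrayInputStream;
        import java.io.IOException;
        import java.io.ObjectInputFilter;
        import java.io.ObjectInputStream;

        public class UnsafeDeserialisationDemo {

            // Risky pattern: deserialising attacker-controlled bytes lets the attacker
            // choose which Serializable classes get instantiated, the root cause of
            // gadget-chain exploits such as the Apache Commons Collections attack.
            static Object deserialiseUnfiltered(byte[] untrustedBytes)
                    throws IOException, ClassNotFoundException {
                try (ObjectInputStream in =
                        new ObjectInputStream(new ByteArrayInputStream(untrustedBytes))) {
                    return in.readObject(); // arbitrary object graph reconstructed here
                }
            }

            // Mitigation (Java 9+, JEP 290): a serialisation filter that allows only
            // expected classes (com.example.dto is a hypothetical package) and bounds
            // graph depth and reference count, rejecting unexpected gadget classes and
            // deeply nested graphs used for denial-of-service.
            static Object deserialiseFiltered(byte[] untrustedBytes)
                    throws IOException, ClassNotFoundException {
                ObjectInputFilter filter = ObjectInputFilter.Config.createFilter(
                        "maxdepth=10;maxrefs=1000;com.example.dto.*;!*");
                try (ObjectInputStream in =
                        new ObjectInputStream(new ByteArrayInputStream(untrustedBytes))) {
                    in.setObjectInputFilter(filter);
                    return in.readObject();
                }
            }
        }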

    Acta Cybernetica : Volume 21. Number 3.

    Execution/Simulation of Context/Constraint-aware Composite Services using GIPSY

    For fulfilling a complex requirement comprising several sub-tasks, a composition of simple web services, each dedicated to a specific sub-task, proves to be a more capable solution than an equivalent atomic web service. Owing to advantages such as reusability of components, broader options for composition requesters, and the freedom for component providers to specialize, composite services have been extensively researched for over two decades and perfected in many respects. Yet most studies in this field fail to acknowledge that every web service has a limited context in which it can successfully perform its tasks, the boundaries of which are defined by the internal constraints placed on the service by its providers. When used as part of a composition, the restricted context-spaces of the component services together define the contextual boundaries of the composite service as a unit, which makes internal constraints an influential factor in composite service functionality. However, because this issue has received little attention, no systems have yet been proposed specifically to verify the internal constraints imposed on the components of a composite service. To address this gap in service composition research, this thesis proposes a multi-faceted solution capable not only of automatically constructing context-aware composite web services with their internal constraints positioned for optimum resource utilization, but also of validating the generated compositions using the General Intensional Programming SYstem (GIPSY) as a time- and cost-efficient simulation/execution environment.

    A Framework for Interoperability Across Heterogeneous Service Description Models

    Automated Web service processing, composition and execution is a research area that has yielded different service description models, which can be implemented using a wide array of standards and technologies. Due to the diversity of their underlying service description models and of the specific operational standards and platforms they use, it has been difficult to fairly compare the various research solutions pertaining to Web service processing, composition and execution. All solutions require a Web service description repository, for example, in order to automatically compose and execute composite services. Different research endeavors have provided original and diverse solutions to these Web service processing problems, many of them extensions of existing solutions based on standard service description models. We propose a highly modular Web service description framework that allows the user to import, export, search, and modify services described using different service description models and/or standards, and to enable their composition and execution. Our service description framework offers a simple and flexible interface for testing and comparing Web service processing, composition and execution models and algorithms. We evaluate the framework by instantiating it for concrete motivating scenarios, all of which it handles successfully, achieving our goals.

    A Novel Approach to Ontology Management

    The term ontology is defined as the explicit specification of a conceptualization. While much of the prior research has focused on technical aspects of ontology management, little attention has been paid to investigating the issues that limit the widespread use of ontologies, or to evaluating the effectiveness of ontologies in improving task performance. This dissertation addresses this void through the development of approaches to ontology creation, refinement, and evaluation. It follows a multi-paper model, with one study devoted to each of these aspects. The first study develops and evaluates a method for ontology creation using knowledge available on the Web. The second study develops a methodology for ontology refinement through pruning and empirically evaluates the effectiveness of this method. The third study investigates the impact of an ontology on use case modeling, which is a complex, knowledge-intensive organizational task in the context of IS development. The three studies follow the design science research approach, and each builds and evaluates IT artifacts. Together, they contribute to knowledge by developing solutions to three important issues in the effective development and use of ontologies.

    Automated and Effective Security Testing for XML-based Vulnerabilities

    Nowadays, the eXtensible Markup Language (XML) is the most commonly used technology in web services for enabling service providers and consumers to exchange data. XML is also widely used to store data and configuration files that control the operation of software systems. Nevertheless, XML suffers from several well-known vulnerabilities such as XML Injections (XMLi). Any exploitation of these vulnerabilities may cause serious and undesirable consequences, e.g., denial of service and access to or modification of highly confidential data. Fuzz testing techniques have been investigated in the literature to detect XMLi vulnerabilities; however, their success rate tends to be very low, since they cannot generate the complex test inputs required to detect these vulnerabilities. Furthermore, these approaches are not effective for real-world, complex XML-based enterprise systems, which are composed of several components including front-end web applications, an XML gateway/firewall, and back-end web services. In this dissertation, we propose several automated security testing strategies for detecting XML-based vulnerabilities; in particular, we tackle the challenges of security testing in an industrial context. Our proposed strategies target various and complementary aspects of security testing for XML-based systems, e.g., test case generation for the XML gateway/firewall. The development and evaluation of these strategies have been carried out in close collaboration with a leading financial service provider in Luxembourg/Switzerland, namely SIX Payment Services (formerly known as CETREL S.A.), which processes several thousand financial transactions daily and provides a range of financial services, e.g., online payments and issuing of credit and debit cards. The main research contributions of this dissertation are:
    - A large-scale and systematic experimental assessment of vulnerability detection in numerous widely used XML parsers and the systems built on them, targeting two common XML parser vulnerabilities: (i) XML Billion Laughs (BIL) and (ii) XML External Entities (XXE).
    - A novel automated testing approach, based on constraint-solving and input-mutation techniques, to detect XMLi vulnerabilities in the XML gateway/firewall and back-end web services.
    - A black-box, search-based testing approach to detect XMLi vulnerabilities in front-end web applications, using genetic algorithms to search for inputs that manipulate the application into generating malicious XML messages.
    - An in-depth analysis of various search algorithms and fitness functions to improve the search-based testing approach for front-end web applications.
    - Extensive evaluations of our proposed testing strategies on numerous real-world industrial web services, XML gateways/firewalls, and web applications, as well as several open-source systems.
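    For context on the two parser-level weaknesses named above, the sketch below shows a commonly recommended hardened configuration of a Java JAXP DOM parser that resists entity-expansion bombs (Billion Laughs) and XML External Entity (XXE) injection; it is a generic illustration, not the dissertation's own tooling, and the class name HardenedXmlParser is a hypothetical placeholder.

        import javax.xml.XMLConstants;
        import javax.xml.parsers.DocumentBuilder;
        import javax.xml.parsers.DocumentBuilderFactory;
        import javax.xml.parsers.ParserConfigurationException;

        public class HardenedXmlParser {

            // Builds a DOM parser configured against the two vulnerability classes
            // named in the assessment: entity-expansion bombs and XXE.
            public static DocumentBuilder newHardenedBuilder()
                    throws ParserConfigurationException {
                DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();

                // Rejecting DOCTYPE declarations outright removes both internal entity
                // expansion (Billion Laughs) and external entity resolution (XXE).
                dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);

                // Defence in depth for deployments that must still accept DTDs.
                dbf.setFeature("http://xml.org/sax/features/external-general-entities", false);
                dbf.setFeature("http://xml.org/sax/features/external-parameter-entities", false);
                dbf.setXIncludeAware(false);
                dbf.setExpandEntityReferences(false);

                // JAXP secure processing enforces entity-expansion and size limits.
                dbf.setFeature(XMLConstants.FEATURE_SECURE_PROCESSING, true);

                return dbf.newDocumentBuilder();
            }
        }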

    An approach to architecture-centric domain-specific modelling and implementation for software development and reuse.

    Model-driven development has long been expected to improve software productivity significantly, yet this promise has not been realised even after many years of research and application. Models are still used only at the analysis and design stages, and they gradually deviate from the system implementation. This thesis integrates domain-specific modelling and web service techniques with model-driven development and proposes a unified approach, SODSMI (Service Oriented executable Domain-Specific Modelling and Implementation), to build executable domain-specific models and achieve the goal of model-driven development. The approach is organised around the domain space at the architectural level, which is the elementary unit of the domain-specific modelling and implementation framework. The research on SODSMI comprises three main parts. Firstly, xDSM (eXecutable Domain-Specific Model) is proposed as the core construct for domain-specific modelling, with behaviour scenarios adopted to build its meta-modelling framework. Secondly, the XDML language (eXecutable Domain-specific Meta-modelling Language) is designed to describe the xDSM meta-model and its application models. Thirdly, DSMEI (Domain-Specific Model Execution Infrastructure) is designed as the execution environment for xDSM, with web services adopted as the implementation entities mapped to the core functions of xDSM, so as to realise service-oriented domain-specific applications. The thesis embodies the core value of models and provides a feasible approach to real model-driven development, from modelling through to system implementation, making domain-specific software development and reuse a reality.

    Improving Schema Mapping by Exploiting Domain Knowledge

    This dissertation addresses the problem of semi-automatically creating schema mappings. The need to develop schema mappings is a pervasive problem in many integration scenarios. Although the problem is well known and a large body of work exists in the area, the development of schema mappings is today largely performed manually in industrial integration scenarios. In this thesis, an approach for the semi-automatic creation of high-quality schema mappings is developed.

    Dynamic Connector Synthesis: Principles, Methods, Tools and Assessment

    CONNECT adopts a revolutionary approach to the seamless networking of digital systems, namely on-the-fly synthesis of the connectors via which networked systems communicate. Within CONNECT, the role of the WP3 work package is to devise automated and efficient approaches to the synthesis of such emergent connectors, given the behavioral specifications of the components to be connected. Thanks to WP3's scientific and technological development, emergent connectors can be synthesized on the fly as networked systems are discovered, implementing the necessary mediation between the networked systems' protocols, from the application layer down to the middleware layer. As the final report on WP3 achievements, this document outlines both (i) specific contributions over the reporting period and (ii) overall contributions in the area of automated, on-the-fly protocol mediation, from theory to supporting tools.