
    An executable formal semantics of PHP with applications to program analysis

    Nowadays, many important activities in our lives involve the web. However, the software and protocols on which web applications are based were not designed with the appropriate level of security in mind. Many web applications have reached a level of complexity for which testing, code reviews and human inspection are no longer sufficient quality-assurance guarantees. Tools that employ static analysis techniques are needed in order to explore all possible execution paths through an application and guarantee the absence of undesirable behaviours. To make sure that an analysis captures the properties of interest, and to navigate the trade-offs between efficiency and precision, it is necessary to base the design and the development of static analysis tools on a firm understanding of the language to be analysed. When this underlying knowledge is missing or erroneous, tools cannot be trusted no matter what advanced techniques they use to perform their task. In this thesis, we introduce KPHP, the first executable formal semantics of PHP, one of the most popular languages for server-side web programming. Then, we demonstrate its practical relevance by developing two verification tools of increasing complexity on top of it: a simple verifier based on symbolic execution and LTL model checking, and a general-purpose, fully configurable and extensible static analyser based on Abstract Interpretation. Our LTL-based tool leverages the existing symbolic execution and model checking support offered by K, our semantics framework of choice, and constitutes a first proof of concept of the usefulness of our semantics. Our abstract interpreter, on the other hand, represents a more significant and novel contribution to the field of static analysis of dynamic scripting languages (PHP in particular). Although our tool is still a prototype and therefore not well suited to handling large real-world codebases, we demonstrate how our semantics-based, principled approach to the development of verification tools has led to the design of static analyses that outperform existing tools and approaches in terms of supported language features, precision, and breadth of possible applications.
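
    The abstract does not describe the analyser's internals, but the core idea of Abstract Interpretation that it builds on can be conveyed with a minimal sketch: a program is evaluated over abstract values (here, signs) instead of concrete ones, so a single analysis run covers every concrete execution at the cost of some precision. The toy expression language and sign domain below are purely illustrative assumptions and are unrelated to KPHP or the K framework.

```python
# Minimal sketch of abstract interpretation over a sign domain.
# Illustrative only: this is not the KPHP analyser; the toy expression
# language and domain are invented for the example.

NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"   # TOP = "any sign"

def alpha(n):
    """Abstraction: map a concrete integer to its sign."""
    return ZERO if n == 0 else (POS if n > 0 else NEG)

def abs_add(a, b):
    """Abstract addition on signs."""
    if ZERO in (a, b):
        return b if a == ZERO else a
    return a if a == b else TOP        # pos+neg (or anything with TOP) is unknown

def abs_mul(a, b):
    """Abstract multiplication on signs."""
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

def analyse(expr, env):
    """Evaluate an expression tree over abstract values instead of integers."""
    kind = expr[0]
    if kind == "const":
        return alpha(expr[1])
    if kind == "var":
        return env[expr[1]]
    left, right = analyse(expr[1], env), analyse(expr[2], env)
    return abs_add(left, right) if kind == "add" else abs_mul(left, right)

# If x is known to be positive, the analysis proves x*x + 1 is positive
# without enumerating concrete executions; with x unknown (TOP) the result
# degrades to TOP, illustrating the precision/efficiency trade-off.
expr = ("add", ("mul", ("var", "x"), ("var", "x")), ("const", 1))
print(analyse(expr, {"x": POS}))   # -> "pos"
print(analyse(expr, {"x": TOP}))   # -> "top"
```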

    Building in web application security at the requirements stage : a tool for visualizing and evaluating security trade-offs : a thesis presented in partial fulfilment of the requirements for the degree of Master of Information Science in Information Systems at Massey University, Albany, New Zealand

    One dimension of Internet security is web application security. The purpose of this design-science study was to design, build and evaluate a computer-based tool to support security vulnerability and risk assessment in the early stages of web application design. The tool facilitates risk assessment by managers and helps developers to model security requirements using an interactive tree diagram. The tool calculates residual risk for each component of a web application and for the application overall, so developers are provided with better information for making decisions about which countermeasures to implement given limited resources for doing so. The tool supports taking a proactive approach to building in web application security at the requirements stage, as opposed to the more common reactive approach of putting countermeasures in place after an attack and loss have been incurred. The primary contribution of the proposed tool is its ability to make known security-related information (e.g. known vulnerabilities, attacks and countermeasures) more accessible to developers who are not security experts, and to translate a lack of security measures into an understandable measure of relative residual risk. The latter is useful for managers who need to prioritize security spending. Keywords: web application security, security requirements modelling, attack trees, threat trees, risk assessment
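
    The abstract does not give the tool's actual risk formulas, but the general idea of rolling residual risk up an attack tree can be sketched as follows. The node structure, likelihoods, countermeasure effectiveness values and OR/AND combination rules below are illustrative assumptions, not the model used in the thesis.

```python
# Rough sketch of residual-risk aggregation over an attack tree.
# All numbers and combination rules are invented for illustration.

def residual_risk(node):
    """Return the residual probability of a successful attack at this node."""
    if "children" not in node:                        # leaf: a single attack step
        p = node["likelihood"]
        for c in node.get("countermeasures", []):
            p *= (1.0 - c["effectiveness"])           # each countermeasure reduces risk
        return p
    child_risks = [residual_risk(c) for c in node["children"]]
    if node["gate"] == "OR":                          # attack succeeds if any child does
        prob_all_fail = 1.0
        for r in child_risks:
            prob_all_fail *= (1.0 - r)
        return 1.0 - prob_all_fail
    risk = 1.0                                        # "AND": every step must succeed
    for r in child_risks:
        risk *= r
    return risk

# Tiny example tree for a hypothetical login component.
tree = {
    "gate": "OR",
    "children": [
        {"likelihood": 0.4,   # e.g. SQL injection attempt
         "countermeasures": [{"name": "parameterised queries", "effectiveness": 0.9}]},
        {"likelihood": 0.2,   # e.g. credential brute force
         "countermeasures": [{"name": "rate limiting", "effectiveness": 0.5}]},
    ],
}
print(f"residual risk: {residual_risk(tree):.2f}")    # ~0.14 with these invented numbers
```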

    The IUCN Red List of Ecosystems: motivations, challenges, and applications

    In response to growing demand for ecosystem-level risk assessment in biodiversity conservation, and rapid proliferation of locally tailored protocols, the IUCN recently endorsed new Red List criteria as a global standard for ecosystem risk assessment. Four qualities were sought in the design of the IUCN criteria: generality, precision, realism and simplicity. Drawing from extensive global consultation, we explore trade-offs among these qualities when dealing with key challenges, including ecosystem classification, measuring ecosystem dynamics, degradation and collapse, and setting decision thresholds to delimit ordinal categories of threat. Experience from countries with national lists of threatened ecosystems demonstrates well-balanced trade-offs in current and potential applications of Red Lists of Ecosystems in legislation, policy, environmental management and education. The IUCN Red List of Ecosystems should be judged by whether it achieves conservation ends and improves natural resource management, whether its limitations are outweighed by its benefits, and whether it performs better than alternative methods. Future development of the Red List of Ecosystems will benefit from the history of the Red List of Threatened Species, which was trialed and adjusted iteratively over 50 years from rudimentary beginnings. We anticipate the Red List of Ecosystems will promote policy focus on conservation outcomes in situ across whole landscapes and seascapes.

    Datacenter Traffic Control: Understanding Techniques and Trade-offs

    Datacenters provide cost-effective and flexible access to scalable compute and storage resources necessary for today's cloud computing needs. A typical datacenter is made up of thousands of servers connected with a large network and usually managed by one operator. To provide quality access to the variety of applications and services hosted on datacenters and to maximize performance, it is necessary to use datacenter networks effectively and efficiently. Datacenter traffic is often a mix of several classes with different priorities and requirements, including user-generated interactive traffic, traffic with deadlines, and long-running traffic. To this end, custom transport protocols and traffic management techniques have been developed to improve datacenter network performance. In this tutorial paper, we review the general architecture of datacenter networks, various topologies proposed for them, their traffic properties, general traffic control challenges in datacenters and general traffic control objectives. The purpose of this paper is to bring out the important characteristics of traffic control in datacenters, not to survey all existing solutions (which is virtually impossible given the massive body of existing research). We hope to provide readers with a broad view of the options and factors to weigh when considering a variety of traffic control mechanisms. We discuss various characteristics of datacenter traffic control including management schemes, transmission control, traffic shaping, prioritization, load balancing, multipathing, and traffic scheduling. Next, we point to several open challenges as well as new and interesting networking paradigms. At the end of this paper, we briefly review inter-datacenter networks, which connect geographically dispersed datacenters, have been receiving increasing attention recently, and pose interesting and novel research problems.
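
    As a concrete taste of the load-balancing and multipathing mechanisms this kind of survey covers, the sketch below shows ECMP-style per-flow hashing: packets of the same flow always follow one path (avoiding reordering), while different flows spread across equal-cost paths. The five-tuple fields and path count are illustrative; this is a simplified model, not code from the paper, and real switches perform the hashing in hardware.

```python
# Simplified sketch of ECMP-style per-flow load balancing.
import hashlib

def pick_path(src_ip, dst_ip, src_port, dst_port, proto, num_paths):
    """Hash a flow's five-tuple to choose one of the equal-cost paths."""
    key = f"{src_ip}|{dst_ip}|{src_port}|{dst_port}|{proto}".encode()
    digest = hashlib.sha256(key).digest()
    return int.from_bytes(digest[:4], "big") % num_paths

# Every packet of a flow maps to the same path; different flows between the
# same pair of hosts can land on different paths.
print(pick_path("10.0.0.1", "10.0.1.7", 40211, 443, "tcp", num_paths=4))
print(pick_path("10.0.0.1", "10.0.1.7", 40212, 443, "tcp", num_paths=4))
```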

    Analysis of the potentials of multi criteria decision analysis methods to conduct sustainability assessment

    Sustainability assessments require the management of a wide variety of information types, parameters and uncertainties. Multi criteria decision analysis (MCDA) has been regarded as a suitable set of methods to perform sustainability evaluations as a result of its flexibility and the possibility of facilitating the dialogue between stakeholders, analysts and scientists. However, it has been reported that researchers do not usually properly define the reasons for choosing a certain MCDA method instead of another; familiarity and affinity with a certain approach seem to drive the choice of procedure. This review paper presents the performance of five MCDA methods (i.e. MAUT, AHP, PROMETHEE, ELECTRE and DRSA) with respect to ten crucial criteria that sustainability assessment tools should satisfy, among which are a life cycle perspective, thresholds and uncertainty management, software support and ease of use. The review shows that MAUT and AHP are fairly simple to understand and have good software support, but they are cognitively demanding for the decision makers and can only embrace a weak sustainability perspective, as trade-offs are the norm. Mixed information and uncertainty can be managed by all the methods, while robust results can only be obtained with MAUT. ELECTRE, PROMETHEE and DRSA are non-compensatory approaches which allow a strong sustainability concept to be adopted and accept a variety of thresholds, but suffer from rank reversal. DRSA is less demanding in terms of preference elicitation, is very easy to understand and provides a straightforward set of decision rules expressed in the form of elementary “if … then …” conditions. Dedicated software is available for all the approaches, with a medium to wide range of capabilities for representing results. DRSA emerges as the easiest method, followed by AHP, PROMETHEE and MAUT, while ELECTRE is regarded as fairly difficult. Overall, the analysis has shown that most of the requirements are satisfied by the MCDA methods, although to different extents; the exceptions are the management of mixed data types and the adoption of a life cycle perspective, which are covered by all the considered approaches.
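
    The key distinction the review draws between compensatory methods (such as MAUT) and non-compensatory ones (such as ELECTRE, PROMETHEE and DRSA) can be illustrated with a small sketch: in a weighted additive aggregation a poor score on one criterion can be offset by good scores elsewhere (weak sustainability), whereas a non-compensatory screening rule requires every criterion to clear its own threshold (strong sustainability). The criteria, weights and thresholds below are invented for illustration and do not implement any of the reviewed methods in full.

```python
# Illustrative contrast between compensatory (MAUT-style weighted sum) and
# non-compensatory (threshold screening) aggregation. All values are invented.

def maut_score(scores, weights):
    """Weighted additive utility: weak criteria can be compensated by strong ones."""
    return sum(weights[c] * scores[c] for c in scores)

def non_compensatory_pass(scores, minimums):
    """Strong-sustainability style screening: every criterion must clear its
    own threshold, so no trade-off can compensate for a failure."""
    return all(scores[c] >= minimums[c] for c in scores)

scores   = {"environmental": 0.2, "economic": 0.9, "social": 0.8}   # normalised 0..1
weights  = {"environmental": 0.4, "economic": 0.3, "social": 0.3}
minimums = {"environmental": 0.5, "economic": 0.5, "social": 0.5}

print(f"{maut_score(scores, weights):.2f}")     # 0.59: looks acceptable despite the weak environmental score
print(non_compensatory_pass(scores, minimums))  # False: fails the environmental threshold
```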

    Banking on Nature's Assets: How Multilateral Development Banks Can Strengthen Development by Using Ecosystem Services

    Outlines the benefits of integrating the management of ecosystem services and trade-offs into strategies to improve economic development outcomes, mitigate climate change effects, and reduce economic and human costs. Recommends tools and policy options