    Methodologies to develop quantitative risk evaluation metrics

    The goal of this work is to advance a new methodology for measuring a severity cost for each host using the Common Vulnerability Scoring System (CVSS). By modeling the problem's parameters in a mathematical framework, the base, temporal, and environmental metrics and their related sub-scores are combined to produce a single severity cost. We build our own CVSS calculator using our equations, both to simplify the calculation of vulnerability scores and to benchmark against other models. We design and develop a new approach to represent the cost assigned to each host by dividing vulnerability scores into two main privilege levels, user and root, and we classify these levels into operational levels to identify and calculate the severity cost of multi-step vulnerabilities. Finally, we implement our framework on a simple network, using the Nessus scanner as a tool to discover known vulnerabilities and using the results to build our cost-centric attack graph.
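
    The abstract does not reproduce the paper's combination equations, so the following is only a minimal Python sketch of the kind of computation described: weighting the CVSS base, temporal, and environmental sub-scores into one severity cost per vulnerability, scaling by the user/root privilege level, and summing per host. The weights and privilege multipliers are illustrative assumptions, not the authors' values.

    # Minimal sketch of a per-host severity cost from CVSS sub-scores.
    # The weights and privilege multipliers below are illustrative
    # assumptions; the paper's actual equations are not reproduced here.

    WEIGHTS = {"base": 0.5, "temporal": 0.3, "environmental": 0.2}
    PRIVILEGE_FACTOR = {"user": 1.0, "root": 1.5}

    def vulnerability_cost(base, temporal, environmental, privilege):
        """Combine CVSS sub-scores (0-10 each) into one severity cost."""
        combined = (WEIGHTS["base"] * base
                    + WEIGHTS["temporal"] * temporal
                    + WEIGHTS["environmental"] * environmental)
        return combined * PRIVILEGE_FACTOR[privilege]

    def host_severity_cost(vulns):
        """Sum per-vulnerability costs into a single cost for the host."""
        return sum(vulnerability_cost(**v) for v in vulns)

    # Example: two findings on one host, e.g. as reported by a Nessus scan.
    host = [
        {"base": 7.5, "temporal": 6.4, "environmental": 5.9, "privilege": "user"},
        {"base": 9.8, "temporal": 9.1, "environmental": 8.7, "privilege": "root"},
    ]
    print(f"Host severity cost: {host_severity_cost(host):.2f}")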

    Service Level Agreement-based GDPR Compliance and Security assurance in (multi)Cloud-based systems

    Compliance with the new European General Data Protection Regulation (Regulation (EU) 2016/679) and security assurance are currently two major challenges of Cloud-based systems. GDPR compliance implies the definition, enforcement, and control of both privacy and security mechanisms, including evidence collection. This paper presents a novel DevOps framework aimed at supporting Cloud consumers in designing, deploying, and operating (multi)Cloud systems that include the privacy and security controls necessary to ensure transparency to end-users, to third parties in service provision (if any), and to law enforcement authorities. The framework relies on the risk-driven specification, at design time, of privacy and security level objectives in the system Service Level Agreement (SLA), and on their continuous monitoring and enforcement at runtime. The research leading to these results has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreements No 644429 (MUSA project) and No 780351 (ENACT project). We would also like to acknowledge all the members of the MUSA and ENACT Consortia for their valuable help.
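
    As a rough illustration of the runtime side of such a framework, the sketch below checks measured evidence against security and privacy level objectives declared in an SLA. The metric names, thresholds, and data structures are assumptions made for illustration and do not reflect the actual MUSA or ENACT implementations.

    # Illustrative sketch of runtime SLO checking against an SLA.
    # Metric names, thresholds, and the SLA structure are assumptions,
    # not the MUSA/ENACT framework's real interfaces.
    from dataclasses import dataclass

    @dataclass
    class SecuritySLO:
        metric: str        # e.g. "encryption_at_rest_pct"
        threshold: float   # minimum acceptable value

    def check_sla(slos, measurements):
        """Return the list of SLOs violated by the current measurements."""
        return [slo for slo in slos
                if measurements.get(slo.metric, 0.0) < slo.threshold]

    # Hypothetical SLA: two privacy/security level objectives.
    sla = [
        SecuritySLO("encryption_at_rest_pct", 100.0),
        SecuritySLO("access_log_coverage_pct", 99.0),
    ]

    # Hypothetical runtime evidence collected from the deployed services.
    measured = {"encryption_at_rest_pct": 100.0, "access_log_coverage_pct": 97.5}

    for violation in check_sla(sla, measured):
        print(f"SLO violated: {violation.metric} < {violation.threshold}")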

    Measuring Impact: The Art, Science and Mystery of Nonprofit News

    This report seeks to answer the two-pronged question, "What is 'impact,' and how can it be measured consistently across nonprofit newsrooms?" A review of recent, relevant literature and our informal conversations with experts in the field reveal growing ambition toward developing a common framework for assessing journalism's impact, yet few definitive conclusions about how exactly to reach that framework. This is especially the case when journalism's "impact" is defined by its ultimate social outcomes, not merely the familiar metrics of audience reach and website traffic. As with all journalism, the frame defines the story, and audience is all-important. Defining "impact" as a social outcome proves a complicated proposition that generally evolves according to the constituency attempting to define it. Because various stakeholders have their own reasons for wanting to measure the impact of news, understanding those interests is an essential step in crafting measurement tools and interpreting the metrics they produce. Limitations of impact assessment arise from several sources: the assumptions invariably made about the product and its outcomes; the divergent and overlapping categories into which nonprofit journalism falls in the digital age; and the intractable problem of attempting to quantify "quality." These formidable challenges, though, don't seem to deter people from posing the impact question and attempting to answer it. Various models for assessing impact are continually being refined, and lessons from similar efforts in other fields offer useful insight for this journalistic endeavor. Past research has also pointed to specific needs and suggestions for ways to advance the effort. From all of this collective wisdom, several principles emerge as the cornerstones upon which to build a common framework for impact assessment.

    A research review of quality assessment for software

    Measures were recommended to assess the quality of software submitted to the AdaNet program. The quality factors that are important to software reuse are explored, and methods of evaluating those factors are discussed. The quality factors important to software reuse are: correctness, reliability, verifiability, understandability, modifiability, and certifiability. Certifiability is included because documentation of many facts about a software component, such as its efficiency, portability, and development history, constitutes a class of factors that are important to some users, not important at all to others, and impossible for AdaNet to distinguish between a priori. The quality factors may be assessed in different ways. There are a few quantitative measures that have been shown to indicate software quality. However, many factors are believed to indicate quality but have not been empirically validated because of their subjective nature. These subjective factors are characterized by the way in which they support the software engineering principles of abstraction, information hiding, modularity, localization, confirmability, uniformity, and completeness.
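
    As an illustration of one possible aggregation, the sketch below combines per-factor ratings into a composite quality score. The 1-to-5 rating scale and the equal default weights are assumptions; the review does not prescribe a specific formula.

    # Sketch of a composite quality score over the factors named above.
    # The 1-5 scale and equal default weights are illustrative assumptions.

    FACTORS = ["correctness", "reliability", "verifiability",
               "understandability", "modifiability", "certifiability"]

    def quality_score(ratings, weights=None):
        """Weighted mean of per-factor ratings (each on a 1-5 scale)."""
        weights = weights or {f: 1.0 for f in FACTORS}
        total = sum(weights[f] for f in FACTORS)
        return sum(weights[f] * ratings[f] for f in FACTORS) / total

    # Example assessment of one reusable component.
    ratings = {"correctness": 5, "reliability": 4, "verifiability": 3,
               "understandability": 4, "modifiability": 3, "certifiability": 5}
    print(f"Composite quality: {quality_score(ratings):.2f} / 5")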

    Planning and managing the cost of compromise for AV retention and access

    Long-term retention of and access to audiovisual (AV) assets as part of a preservation strategy inevitably involve some form of compromise in order to achieve acceptable levels of cost, throughput, quality, and many other parameters. Examples include quality control and throughput in media transfer chains; data safety and accessibility in digital storage systems; and service levels for ingest and access in archive functions delivered as services. We present new software tools and frameworks, developed in the PrestoPRIME project, that allow these compromises to be quantitatively assessed, planned, and managed for file-based AV assets. Our focus is on how to give an archive assurance that a preservation strategy, designed and operated as a set of services, will function as expected and will cope with the inevitable and often unpredictable variations that occur in operation. This includes being able to make cost projections, perform sensitivity analysis, simulate "disaster scenarios," and govern preservation services using service-level agreements and policies.
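
    As a toy illustration of this kind of planning, the sketch below runs a Monte Carlo cost projection for file-based AV storage with random "disaster" events, giving a feel for cost projection and sensitivity analysis. All parameters are assumptions chosen for the example; the code is not part of the PrestoPRIME tools.

    # Toy Monte Carlo cost projection for long-term file-based AV storage.
    # Cost decline rate, disaster probability, and horizon are assumptions,
    # not outputs of the PrestoPRIME tools.
    import random

    def project_cost(tb_stored, years, cost_per_tb=20.0,
                     annual_decline=0.15, disaster_prob=0.02,
                     disaster_cost=50_000.0):
        """Total storage cost over a horizon, with random disaster events."""
        total = 0.0
        for year in range(years):
            # Per-TB storage cost assumed to fall each year.
            total += tb_stored * cost_per_tb * (1 - annual_decline) ** year
            # Occasionally add the cost of a major recovery or migration.
            if random.random() < disaster_prob:
                total += disaster_cost
        return total

    # Sensitivity analysis: distribution of 20-year costs for a 500 TB archive.
    runs = sorted(project_cost(500, 20) for _ in range(10_000))
    print(f"median: {runs[len(runs) // 2]:,.0f}  "
          f"95th pct: {runs[int(len(runs) * 0.95)]:,.0f}")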