
    Using Semantic Templates to Study Vulnerabilities Recorded in Large Software Repositories

    Software vulnerabilities allow an attacker to reduce a system's confidentiality, availability, and integrity by exposing information, executing malicious code, and undermining the functionality that serves the overall system purpose. With new vulnerabilities discovered every day in a variety of applications and user environments, a systematic study of their characteristics is an immediate need, for the following reasons: the high rate at which information about past and new vulnerabilities accumulates makes it difficult to absorb and comprehend; rather than learning from past mistakes, similar types of vulnerabilities are observed repeatedly; and as the scale and complexity of current software grow, developers will need better mental models to sense where vulnerabilities may occur. While the software development community has put significant effort into capturing the artifacts related to a discovered vulnerability in organized repositories, much of this information is not amenable to meaningful analysis and requires deep, manual inspection. In the software assurance community, a body of knowledge that enumerates common weaknesses has been developed, but it is complicated and not readily usable for the study of vulnerabilities in specific projects and user environments. Likewise, the vulnerabilities discovered in different projects are collected in various databases with general metadata such as dates, people, and natural-language descriptions, but without links to other relevant knowledge they are hard to use for understanding vulnerabilities. This research combines the information sources from these communities in a way that facilitates the study of vulnerabilities recorded in large software repositories. We introduce the notion of a Semantic Template to integrate the scattered information relevant to understanding and discovering vulnerabilities. We evaluate semantic templates by applying them to analyze and annotate vulnerabilities recorded in software repositories from the Apache Web Server project. We use "software repositories" in a general sense that includes source code, version control data, bug reports, developer mailing lists, and project development websites. We derive semantic templates from community standards such as the Common Weakness Enumeration (CWE) and Common Vulnerabilities and Exposures (CVE); we rely on standards in order to facilitate the adoption, sharing, and interoperability of semantic templates. This research contributes a novel theory and corresponding mechanisms for the study of vulnerabilities in large software projects. To support these claims, we discuss our experiences and present our findings from the Apache Web Server project.
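
    As a concrete, purely illustrative sketch of the idea, a semantic template can be thought of as a record that ties a weakness from CWE to the CVE entries and repository artifacts that exemplify it. The field names below are assumptions for illustration, not the schema defined in the paper:

        from dataclasses import dataclass, field

        # Hypothetical sketch of a semantic template record; field names and the
        # artifact placeholders are illustrative assumptions, not the paper's schema.
        @dataclass
        class SemanticTemplate:
            weakness: str                                        # CWE entry the template is derived from
            related_weaknesses: list[str] = field(default_factory=list)
            vulnerabilities: list[str] = field(default_factory=list)  # CVE identifiers annotated with this template
            artifacts: list[str] = field(default_factory=list)        # links into repositories: commits, bug reports, mail threads

        # Annotating a recorded vulnerability ties the scattered sources together:
        template = SemanticTemplate(
            weakness="CWE-119",                    # improper restriction of operations within a memory buffer
            related_weaknesses=["CWE-120", "CWE-131"],
            vulnerabilities=["CVE-2004-0940"],     # an Apache httpd buffer overflow, as an example
            artifacts=["svn:revision-id", "bugzilla:issue-id"],  # hypothetical placeholders
        )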

    A Model-Driven Approach for Business Process Management

    Business Process Management is a mechanism recommended by many standards for the management of companies and organizations. In software companies this practice is increasingly accepted, and companies have to adopt it if they want to remain competitive. However, effectively defining these processes, and especially maintaining and executing them, is not always easy. This paper presents an approach based on the model-driven paradigm for Business Process Management in software companies. The solution offers a suitable mechanism that has been implemented successfully in different companies with a tool named NDTQ-Framework.

    Ministerio de Educación y Ciencia TIN2010-20057-C03-02; Junta de Andalucía TIC-578

    A DISCUSSION ON ASSURING SOFTWARE QUALITY IN SMALL AND MEDIUM SOFTWARE ENTERPRISES: AN EMPIRICAL INVESTIGATION

    Based on a study of core quality activities, including software inspection, review, and testing, in small and medium-sized enterprises (SMEs), this paper presents a contemporary view of how such companies measure up against quality standards. It reports the results of a local empirical investigation of quality standards in the Turkish software industry. Around 150 software companies were approached, and the 17 detailed responses indicate that, to ensure software quality, internationally recognized standards such as those of the International Organization for Standardization (ISO) and Capability Maturity Model Integration (CMMI) are given credit. However, the substantial workload and resources required to obtain them are reported as a serious burden, and the downscaled frameworks of such large models proposed in the literature are not well known to the SMEs either. The paper also discusses "work-arounds" that bypass such standards to ease product delivery while keeping certificates merely as labels to win new business.

    Outsourcing and acquisition models comparison related to IT supplier selection decision analysis

    This paper compares acquisition models relevant to decision analysis for IT supplier selection. The main standards considered are Capability Maturity Model Integration for Acquisition (CMMI-ACQ), ISO/IEC 12207 (Information Technology / Software Life Cycle Processes), IEEE 1062 (Recommended Practice for Software Acquisition), the IT Infrastructure Library (ITIL), and the Project Management Body of Knowledge (PMBOK) guide. The objective is to identify the advantages and disadvantages of these models for the future development of a decision model for IT supplier selection.
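
    To illustrate the kind of decision analysis such a model would support, the sketch below applies a generic weighted-sum scoring of candidate suppliers against criteria loosely inspired by the compared standards. The criteria, weights, and scores are invented for illustration and are not taken from the paper:

        # Minimal weighted-sum supplier scoring; all names and numbers are invented.
        criteria_weights = {
            "process_maturity": 0.4,    # e.g. informed by a CMMI-ACQ appraisal
            "lifecycle_coverage": 0.3,  # e.g. ISO/IEC 12207 process coverage
            "service_management": 0.3,  # e.g. ITIL practice adoption
        }

        suppliers = {
            "supplier_a": {"process_maturity": 4, "lifecycle_coverage": 3, "service_management": 5},
            "supplier_b": {"process_maturity": 3, "lifecycle_coverage": 5, "service_management": 4},
        }

        def weighted_score(scores: dict[str, int]) -> float:
            """Weighted sum of per-criterion scores on a 1-5 scale."""
            return sum(criteria_weights[c] * s for c, s in scores.items())

        best = max(suppliers, key=lambda name: weighted_score(suppliers[name]))
        print(best, {name: round(weighted_score(s), 2) for name, s in suppliers.items()})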

    Taming Uncertainty in the Assurance Process of Self-Adaptive Systems: a Goal-Oriented Approach

    Goals are first-class entities in a self-adaptive system (SAS), as they guide the self-adaptation. A SAS often operates in dynamic and partially unknown environments, which cause uncertainty that the SAS has to address to achieve its goals. Beyond the environment, other classes of uncertainty have been identified, but these classes and their sources are not systematically addressed by current approaches throughout the life cycle of the SAS. Uncertainty typically makes providing assurances for SAS goals exclusively at design time unviable, which calls for an assurance process that spans the whole life cycle of the SAS. In this work, we propose a goal-oriented assurance process that supports taming different sources (within different classes) of uncertainty, from defining the goals at design time to performing self-adaptation at runtime. Based on a goal model augmented with uncertainty annotations, we automatically generate parametric symbolic formulae with parameterized uncertainties at design time using symbolic model checking. These formulae and the goal model guide engineers in synthesizing adaptation policies. At runtime, the generated formulae are evaluated to resolve the uncertainty and to steer the self-adaptation using the policies. In this paper, we focus on reliability and cost properties, for which we evaluate our approach on the Body Sensor Network (BSN) implemented in OpenDaVINCI. The results of the validation are promising: our approach systematically tames multiple classes of uncertainty, and it is effective and efficient in providing assurances for the goals of self-adaptive systems.
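
    The design-time/runtime split can be illustrated with a toy example: a parametric formula produced at design time (here a trivial serial-reliability product standing in for a formula synthesized by symbolic model checking) is re-evaluated at runtime once monitored values bind its parameters, and the result steers adaptation. The formula, values, and threshold below are invented for illustration:

        # Toy illustration of the design-time/runtime split described above.
        def reliability(r1: float, r2: float, r3: float) -> float:
            # Three tasks in sequence, so R = r1 * r2 * r3; a stand-in for a
            # parametric formula generated by symbolic model checking.
            return r1 * r2 * r3

        monitored = {"r1": 0.99, "r2": 0.95, "r3": 0.97}  # runtime observations resolve the parameters
        goal_threshold = 0.90                             # assumed to come from the goal model

        if reliability(**monitored) < goal_threshold:
            # trigger an adaptation policy, e.g. switch to a more reliable configuration
            print("goal violated: adapt")
        else:
            print("goal satisfied")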

    Integrated quality and enhancement review: summative review: Lowestoft College


    Development of quality standards for multi-center, longitudinal magnetic resonance imaging studies in clinical neuroscience

    Magnetic resonance imaging (MRI) data are generated by a complex procedure, and many possible sources of error can degrade the signal. For example, hidden defective components of an MRI scanner, changes in the static magnetic field caused by a person simply moving in the MRI scanner room, and changes in the measurement sequences can all reduce the signal-to-noise ratio (SNR). A comprehensive, reproducible quality assurance (QA) procedure is therefore necessary to ensure reproducible results from both the MRI equipment and its human operator. There are two ways to examine the quality of MRI data. On the one hand, water- or gel-filled objects, so-called "phantoms", are measured regularly; based on this signal, which in the best case should always be stable, the general performance of the MRI scanner can be tested. On the other hand, the data of actual interest, mostly human data, are checked directly for certain signal parameters (e.g., SNR, motion parameters). This thesis consists of two parts. In the first part, a study-specific QA protocol was developed for a large multicenter MRI study, FOR2107, which investigates the causes and course of affective disorders, unipolar depression and bipolar disorders, taking clinical and neurobiological effects into account. The main component of FOR2107 is the MRI measurement of more than 2,000 subjects in a longitudinal design (currently with repeated measurements after 2 years, and further measurements planned after 5 years). To relate MRI data to disease history, the MRI data must provide stable results over the course of the study, and ensuring this stability is the subject of this part of the work. An extensive QA regime, based on phantom measurements, human-data analysis, protocol-compliance testing, etc., was set up. In addition to developing parameters for characterizing MRI data, the QA protocols in use were improved during the study, and the differences between sites and the impact of these differences on human-data analyses were examined. The comprehensive quality assurance for the FOR2107 study revealed significant between-center differences in the MRI signal (for both human and phantom data). Problems that occurred could easily be recognized in time and corrected, and the differences must be accounted for in current and future analyses of the human data. For the second part of this thesis, a QA protocol (and the freely available associated software "LAB-QA2GO") was developed and tested, which can be used for individual studies or to monitor the quality of an MRI scanner. This routine was developed because many sites and studies perform no explicit QA, even though suitable, freely available QA software for MRI measurements exists. With LAB-QA2GO, a QA protocol for an MRI scanner or a study can be set up without much effort or IT knowledge. Both parts of the thesis deal with the implementation of QA procedures. High-quality data and study results can be achieved only with appropriate QA procedures, as presented in this work; QA measures should therefore be implemented at all levels of a project and embedded permanently in project and evaluation routines.
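
    As an illustration of one common phantom-based check, the sketch below estimates SNR as the mean of a central signal region divided by the standard deviation of a background region, computed on a synthetic image. The region choices and this particular SNR definition are assumptions for illustration; the actual FOR2107 and LAB-QA2GO procedures may differ:

        import numpy as np

        def snr(image: np.ndarray) -> float:
            # Mean signal in a central ROI divided by the standard deviation of a
            # corner ROI assumed to contain only background noise (one common,
            # but not the only, SNR definition).
            h, w = image.shape
            signal_roi = image[h // 2 - 8 : h // 2 + 8, w // 2 - 8 : w // 2 + 8]
            noise_roi = image[:16, :16]
            return float(signal_roi.mean() / noise_roi.std())

        # Synthetic "phantom" slice: a bright disc on a noisy background.
        rng = np.random.default_rng(0)
        img = rng.normal(10.0, 2.0, size=(128, 128))
        yy, xx = np.ogrid[:128, :128]
        img[(yy - 64) ** 2 + (xx - 64) ** 2 < 40 ** 2] += 200.0

        print(f"SNR = {snr(img):.1f}")  # tracked over time, a drop flags scanner problems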

    How are higher education institutions dealing with openness? A survey of practices, beliefs, and strategies in five European countries

    Open Education is on the agenda of half of the surveyed Higher Education Institutions (HEIs) in France, Germany, Poland, Spain, and the United Kingdom. For the other half, Open Education did not seem to be an issue, at least at the time of the survey's data collection (spring 2015). This report presents the results of a representative survey of higher education institutions in these five countries about their Open Education (OE) practices, beliefs, and strategies (e.g., MOOCs). It aims to provide evidence for the further development of OE in support of the Opening Up Communication (European Commission, 2013) and ET2020's renewed priority on Open Education enabled by digital technologies.