
    An Extensive Comparison of Static Application Security Testing Tools

    Context: Static Application Security Testing Tools (SASTTs) identify software vulnerabilities to support the security and reliability of software applications. Interestingly, several studies have suggested that alternative solutions may be more effective than SASTTs due to the SASTTs' tendency to generate false alarms, commonly referred to as low Precision. Aim: We aim to comprehensively evaluate SASTTs, setting a reliable benchmark for assessing and finding gaps in vulnerability identification mechanisms based on SASTTs or alternatives. Method: Our SASTTs evaluation is based on a controlled, though synthetic, Java codebase. It involves an assessment of 1.5 million test executions, and it features innovative methodological features such as effort-aware accuracy metrics and method-level analysis. Results: Our findings reveal that SASTTs detect only a narrow range of vulnerability types. In contrast to prevailing wisdom, SASTTs exhibit high Precision while falling short in Recall. Conclusions: The paper suggests that enhancing Recall, alongside expanding the spectrum of detected vulnerability types, should be the primary focus for improving SASTTs or alternative approaches, such as machine learning-based vulnerability identification solutions.
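    The Precision/Recall trade-off at the heart of this abstract can be illustrated with a minimal sketch (the counts below are hypothetical, not the study's data):

```python
def precision_recall(tp, fp, fn):
    """Compute Precision and Recall from confusion counts.

    tp: true positives (real vulnerabilities flagged)
    fp: false positives (false alarms)
    fn: false negatives (real vulnerabilities missed)
    """
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Hypothetical SASTT run: 20 findings, of which 18 are true
# vulnerabilities, while 82 real vulnerabilities go undetected --
# i.e. high Precision (few false alarms) but low Recall.
p, r = precision_recall(tp=18, fp=2, fn=82)
print(f"Precision={p:.2f}  Recall={r:.2f}")  # Precision=0.90  Recall=0.18
```

    A tool with this profile rarely raises false alarms, yet misses most vulnerabilities, which is exactly the pattern the paper reports.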

    Mixed diffusive-convective relaxation of a broad beam of energetic particles in cold plasma

    We revisit the applications of quasi-linear theory as a paradigmatic model for weak plasma turbulence and the associated bump-on-tail problem. The work presented here is built around the idea that large-amplitude or strongly shaped beams do not relax through diffusion only and that there exists an intermediate time scale on which the relaxation is convective (ballistic-like). We cast this novel idea in the rigorous form of a self-consistent nonlinear dynamical model, which generalizes the classic equations of quasi-linear theory to "broad" beams with internal structure. We also present numerical simulation results for the relaxation of a broad beam of energetic particles in cold plasma. These generally demonstrate the mixed diffusive-convective features of supra-thermal particle transport, which essentially depend on nonlinear wave-particle interactions and phase-space structures. Taking into account modes of the stable linear spectrum is crucial for the self-consistent evolution of the distribution function and the fluctuation intensity spectrum. Comment: 25 pages, 15 figures
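    For orientation, the classic one-dimensional quasi-linear system that the paper generalizes can be sketched in its standard textbook form (this is the purely diffusive baseline, not the paper's mixed diffusive-convective model): the distribution $F(u,t)$ diffuses in velocity space,

$$\frac{\partial F}{\partial t} = \frac{\partial}{\partial u}\!\left[D(u)\,\frac{\partial F}{\partial u}\right], \qquad D(u) \propto \sum_k |E_k|^2\,\delta(\omega_k - k u),$$

    coupled self-consistently to the evolution of the wave intensities,

$$\frac{\partial |E_k|^2}{\partial t} = 2\gamma_k\,|E_k|^2, \qquad \gamma_k \propto \left.\frac{\partial F}{\partial u}\right|_{u=\omega_k/k}.$$

    In this picture the beam flattens diffusively wherever resonant waves are unstable; the intermediate-time convective (ballistic-like) relaxation channel is precisely what lies beyond this baseline.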

    Can We Trust the Default Vulnerabilities Severity?

    As software systems become increasingly complex and interconnected, the risk of security debt has risen significantly, increasing the likelihood of cyber-attacks and data breaches. Vulnerability prioritization is a critical activity in software engineering as it helps identify and address security vulnerabilities in software systems promptly and effectively. With the increasing complexity of software systems and the growing number of potential threats, it is essential to have a systematic approach to vulnerability prioritization to ensure that the most critical vulnerabilities are addressed first. The present study aims to investigate the agreement between the default and the National Vulnerability Database (NVD) severity levels. We analyzed 1626 vulnerabilities encompassing 12 unique types of vulnerabilities associated with 125 Common Platform Enumeration identifiers belonging to 105 Apache projects. Our results show little correlation between the default and NVD severity levels. Thus, the default severity of vulnerabilities is not trustworthy. Moreover, we discovered that, surprisingly, the same type of vulnerability can carry several different NVD severity levels; therefore, no default prioritization based only on the type of vulnerability can be accurate. Future studies are needed to accurately estimate the priority of vulnerabilities by considering several aspects of vulnerabilities rather than only the type.
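    The kind of rater-agreement analysis described here can be sketched with Cohen's kappa, which measures how much two sets of categorical labels agree beyond chance. The severity labels below are hypothetical illustrations, not the study's data:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two label sequences beyond chance."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    # Observed agreement: fraction of items with identical labels.
    po = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement from each rater's marginal label frequencies.
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[c] * cb[c] for c in set(a) | set(b)) / n ** 2
    return (po - pe) / (1 - pe)

# Hypothetical default vs. NVD severity labels for six vulnerabilities:
default_sev = ["HIGH", "HIGH", "MEDIUM", "LOW", "HIGH", "MEDIUM"]
nvd_sev     = ["MEDIUM", "HIGH", "LOW", "LOW", "LOW", "HIGH"]
print(cohens_kappa(default_sev, nvd_sev))  # = 0.04, i.e. near-chance agreement
```

    A kappa near zero, as in this toy example, is the quantitative signature of the "little correlation" finding: default labels agree with NVD labels barely more often than random assignment would.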

    Trends and Emerging Areas of Agile Research: the Report on XP2014 PhD Symposium

    The PhD symposium of XP2014, the 15th International Conference on Agile Software Development, was organized as a half-day event prior to the main conference program. Seven PhD candidates came from different research institutes across the globe to present their research proposals at the symposium. The symposium was run in a lively and interactive manner, and the candidates received constructive feedback on their proposals from all the symposium participants. In this report we describe the presented proposals, focusing on their content and the feedback received. Through them we can take a peek at the trends and emerging areas of agile research in the coming years.

    Empirical software engineering experts on the use of students and professionals in experiments

    [Context] Controlled experiments are an important empirical method to generate and validate theories. Many software engineering experiments are conducted with students. It is often claimed that the use of students as participants in experiments comes at the cost of low external validity, while using professionals does not. [Objective] We believe a deeper understanding is needed of the external validity of software engineering experiments conducted with students or with professionals. We aim to gain insight into the pros and cons of using students and professionals in experiments. [Method] We adopted an unconventional approach: a focus group followed by a survey. First, during a session at ISERN 2014, 65 empirical researchers, including the seven authors, argued and discussed the use of students in experiments with an open mind. Afterwards, we revisited the topic and elicited experts' opinions to foster discussion. We then derived 14 statements and asked the ISERN attendees, excluding the authors, to provide their level of agreement with them. Finally, we analyzed the researchers' opinions and used the findings to further discuss the statements. [Results] Our survey results showed that, in general, the respondents disagreed with us about the drawbacks of professionals. We, on the contrary, strongly believe that no population (students, professionals, or others) can be deemed better than another in absolute terms. [Conclusion] Using students as participants remains a valid simplification of reality needed in laboratory contexts. It is an effective way to advance software engineering theories and technologies but, like any other aspect of study settings, should be carefully considered during the design, execution, interpretation, and reporting of an experiment. The key is to understand which portion of the developer population is being represented by the participants in an experiment. Thus, a proposal for describing experimental participants is put forward.

    Agile Development at Scale: The Next Frontier

    Agile methods have transformed the way software is developed, emphasizing active end-user involvement, tolerance to change, and evolutionary delivery of products. The first special issue on agile development described the methods as focusing on feedback and change. These methods have led to major changes in how software is developed. Scrum is now the most common framework for development in most countries, and other methods such as extreme programming (XP), elements of lean software development, and Kanban are widely used. What started as a bottom-up movement among software practitioners and consultants has been taken up by major international consulting companies who prescribe agile development, particularly for contexts where learning and innovation are key. Agile development methods have attracted interest primarily in software engineering, but also in a number of other disciplines including information systems and project management.