    The Federal Information Security Management Act of 2002: A Potemkin Village

    Due to the daunting possibilities of cyberwarfare, and the ease with which cyberattacks may be conducted, the United Nations has warned that the next world war could be initiated through worldwide cyberattacks between countries. In response to the growing threat of cyberwarfare and the increasing importance of information security, Congress passed the Federal Information Security Management Act of 2002 (FISMA). FISMA recognizes the importance of information security to the national economic and security interests of the United States. However, this Note argues that FISMA has failed to significantly bolster information security, primarily because FISMA treats information security as a technological problem rather than an economic one. This Note analyzes existing proposals to incentivize heightened software quality assurance, and proposes a new solution designed to strengthen federal information security in light of the failings of FISMA and the trappings of Congress's 2001 amendment to the Computer Fraud and Abuse Act.

    Investigating Automatic Static Analysis Results to Identify Quality Problems: an Inductive Study

    Background: Automatic static analysis (ASA) tools examine source code to discover "issues", i.e. code patterns that are symptoms of bad programming practices and that can lead to defective behavior. Studies in the literature have shown that these tools find defects earlier than other verification activities, but they produce a substantial number of false positive warnings. For this reason, an alternative approach is to use the set of ASA issues to identify defect-prone files and components rather than focusing on the individual issues.

    Aim: We conducted an exploratory study to investigate whether ASA issues can be used as early indicators of faulty files and components and, for the first time, whether they point to a decay of specific software quality attributes, such as maintainability or functionality. Our aim is to understand the critical parameters and feasibility of such an approach to feed into future research on more specific quality and defect prediction models.

    Method: We analyzed an industrial C# web application using the Resharper ASA tool and explored whether significant correlations exist in such a data set.

    Results: We found promising results when predicting defect-prone files. A set of specific Resharper categories are better indicators of faulty files than common software metrics or the collection of issues across all issue categories, and these categories correlate with different software quality attributes.

    Conclusions: Our advice for future research is to perform analysis at the file rather than the component level and to evaluate the generalizability of categories. We also recommend using larger datasets, as we learned that data sparseness can lead to challenges in the proposed analysis process.
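The correlation analysis described in the Method section can be sketched as follows. This is a minimal illustration only: the per-file issue and defect counts below are invented for the example and are not from the study's dataset, and the study's actual statistical procedure is not specified in the abstract. A rank-based measure such as Spearman's rho is a common choice for this kind of exploratory correlation between ASA issue counts and defect counts:

```python
from statistics import mean

def ranks(values):
    """Assign 1-based ranks, averaging ranks over tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over a run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: the Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-file data: ASA issue counts vs. post-release defects.
issues  = [12, 3, 25, 7, 18, 1]
defects = [ 4, 0,  6, 2,  5, 0]

print(round(spearman(issues, defects), 3))  # → 0.986
```

In practice one would compute this correlation separately per issue category (as the Results section suggests specific Resharper categories outperform the aggregate), and a rank-based statistic is robust to the skewed, sparse counts the Conclusions warn about.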