Washington University Record, November 13, 1997
Estimating Software Vulnerability Counts in the Context of Cyber Risk Assessments
Stakeholders often conduct cyber risk assessments as a first step towards understanding and managing the risks arising from their use of cyberspace. Many risk assessment methods in use today include some form of vulnerability analysis. Building on prior research and combining data from several sources, this paper develops a metric to estimate the proportion of latent vulnerabilities to total vulnerabilities in a software system and applies it to five scenarios involving software on the scale of operating systems. The findings suggest caution in interpreting the results of cyber risk methodologies that depend on enumerating known software vulnerabilities, because the number of unknown vulnerabilities in large-scale software tends to exceed the number of known vulnerabilities.
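The metric described in this abstract is a proportion of latent (undiscovered) vulnerabilities to total vulnerabilities. A minimal sketch of that arithmetic follows; the function name and the counts are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch: compute the latent share of total vulnerabilities.
# "known" would come from enumeration (e.g. CVE counts); "estimated_latent"
# would come from a discovery model, as the paper combines several sources.

def latent_proportion(known: int, estimated_latent: int) -> float:
    """Proportion of latent vulnerabilities out of the estimated total."""
    total = known + estimated_latent
    if total == 0:
        raise ValueError("no vulnerabilities to assess")
    return estimated_latent / total

# Invented example: 1,200 enumerated flaws and an estimated 1,800 latent
# ones imply 60% of the total is invisible to enumeration-based methods.
ratio = latent_proportion(known=1200, estimated_latent=1800)
print(f"latent share: {ratio:.0%}")  # latent share: 60%
```

When the latent share exceeds 50%, as in this invented example, a risk score built only from known vulnerabilities understates exposure by more than half, which is the paper's cautionary point.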
Research and Publishing: Relevance and Irreverence
The value, relevance, and efficacy of conducting and publishing research have been widely debated throughout the agricultural economics profession. On the one hand, some argue that the research process creates little value and directly competes with teaching/outreach output. On the other hand, others argue that research provides answers to important questions, improves human capital, and complements teaching/outreach activities. I argue that the research and publishing process develops human capital, improves the quality of teaching/outreach, reduces bias, generates new ideas, improves societal welfare, creates innovation, and is essential for public policy debate.
Keywords: publishing, research, Research and Development/Tech Change/Emerging Technologies
Developing Cyberspace Data Understanding: Using CRISP-DM for Host-based IDS Feature Mining
Current intrusion detection systems generate a large number of specific alerts but do not provide actionable information. These alerts must often be analyzed by a network defender, a time-consuming and tedious task that can occur hours or days after an attack. Improved understanding of the cyberspace domain can lead to great advancements in cyberspace situational awareness research and development. This thesis applies the Cross Industry Standard Process for Data Mining (CRISP-DM) to develop an understanding of a host system under attack. Data is generated by launching scans and exploits at a machine outfitted with a set of host-based data collectors. Through knowledge discovery, features are identified within the collected data that can be used to enhance host-based intrusion detection. Discovering relationships between the collected data and the attack events demonstrates human understanding of the activity. This method of searching for hidden relationships between sensors greatly enhances understanding of new attacks and vulnerabilities, bolstering our ability to defend the cyberspace domain.
The Chain-Link Fence Model: A Framework for Creating Security Procedures
A long-standing problem in information technology security is how to reduce the security footprint. Many specific proposals exist to address specific problems in information technology security, and most solutions need to be repeatable throughout an information system's lifecycle. The Chain-Link Fence Model is a new model for creating and implementing information technology procedures. The model was validated by two different methods: interviews with experts in the field of information technology, and four distinct case studies demonstrating the creation and implementation of information technology procedures. (169 pages)
Information Outlook, October 2004
Volume 8, Issue 10
Methods for Objective and Subjective Evaluation of Zero-Client Computing
Zero clients are hardware-based devices without a central processing unit (CPU) that deliver virtual desktops (VDs) from remote computing systems to users. We measured the performance of applications accessed through zero clients to study the feasibility of using this approach to provide a desktop-PC experience across a network. Performance evaluation is complicated because monitoring software cannot be downloaded to the zero clients. Therefore, we introduce a new methodology and metric, based on network-traffic analysis, to measure zero-client VD performance. We conducted objective and subjective studies to determine the sensitivity of application-specific metrics to different network conditions. The results show that the packet loss rate (PLR) impacts zero-client performance for some applications, such as video streaming. Subjective tests showed greater user sensitivity to the PLR for video streaming than for image viewing or Skype. A strong correlation was found between the objective and subjective measurements, but the rate at which these measurements changed with increasing PLR differed by application.
Funding: NSF CNS-1737453. Open access journal.