5 research outputs found
Software Reliability Issues: An Experimental Approach
In this thesis, we present methodologies built around a data structure called the debugging graph, whereby the predictive performance of software reliability models can be analyzed and improved under laboratory conditions. The procedure substitutes averages over large sample sets for the single-point samples normally used as inputs to these models, and thus supports scrutiny of their performance with less noisy input data.
Initially, we describe the construction of an extensive database of empirical reliability data, derived by testing each partially debugged version of the subject software as represented by complete or partial debugging graphs. We demonstrate how these data can be used to assign relative sizes to known bugs and to simulate multiple debugging sessions. We then present the results of a series of proof-of-concept experiments.
We show that controlling the fault recovery order represented by the data input to some well-known reliability models can enable them to produce more accurate predictions and can mitigate anomalous effects we attribute to manifestations of the fault interaction phenomenon. Since limited testing resources are common in the real world, we demonstrate the use of two approximation techniques, the surrogate oracle and path truncation, to render the application of our methodologies computationally feasible outside a laboratory setting. We report results supporting the assertion that reliability data collected from just a partial debugging graph and subject to these approximations agree qualitatively with data collected under ideal laboratory conditions, provided one accounts for the optimistic bias introduced by the surrogate in later prediction stages. We outline an algorithmic approach for using data derived from a partial debugging graph to improve software reliability predictions, and show its complexity to be no worse than O(n²). We conclude by summarizing outstanding questions as areas for future investigation of, and improvement to, the software reliability prediction process.
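The thesis does not publish its code, but the core idea can be illustrated with a minimal Python sketch (all names, such as DebuggingGraph, and the sample data below are hypothetical, not the thesis's implementation): nodes are partially debugged versions keyed by the set of repaired faults, each node stores many measured failure-rate samples, and every fault recovery order traces one path through the graph whose per-stage sample averages can feed a reliability model.

```python
from itertools import permutations

class DebuggingGraph:
    """Nodes are sets of repaired faults; an edge repairs one more fault.

    failure_samples maps frozenset-of-repaired-faults -> list of measured
    failure rates for that version, so path statistics use sample averages
    rather than the usual single-point observations.
    """

    def __init__(self, faults, failure_samples):
        self.faults = list(faults)
        self.failure_samples = failure_samples

    def mean_rate(self, repaired):
        samples = self.failure_samples[frozenset(repaired)]
        return sum(samples) / len(samples)

    def paths(self):
        """Every fault recovery order is one root-to-leaf debugging path."""
        for order in permutations(self.faults):
            repaired = set()
            rates = [self.mean_rate(repaired)]
            for fault in order:
                repaired.add(fault)
                rates.append(self.mean_rate(repaired))
            yield order, rates

# Two faults, with several measured failure rates per partially fixed version:
samples = {
    frozenset(): [0.31, 0.29, 0.30],
    frozenset({'A'}): [0.12, 0.10],
    frozenset({'B'}): [0.22, 0.20],
    frozenset({'A', 'B'}): [0.05, 0.04],
}
g = DebuggingGraph(['A', 'B'], samples)
for order, rates in g.paths():
    print(order, rates)   # e.g. ('A', 'B') [0.30, 0.11, 0.045]
```

Note that full enumeration is factorial in the number of faults, which is why the partial graphs, surrogate oracle, and path truncation described above matter in practice; the O(n²) bound cited above applies to the prediction-improvement algorithm operating on a partial graph, not to enumeration.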
Software reliability studies
There are many software reliability models that try to predict the future performance of software from data generated by the debugging process. Our research has shown that improving the quality of these data can greatly improve the predictions. We are working on methodologies that control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold: we describe an experimental methodology using a data structure called the debugging graph, and we apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, a model that 'fits' a given path's data well is not guaranteed to perform well on a different path. Further, we observed bug interactions and noted their potential effects on the predictive process. We saw not only that different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which they are evaluated. Based on our experiments, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
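As a concrete, hypothetical illustration of this path sensitivity (the model and numbers below are synthetic stand-ins, not the models or data from the study), fitting even a simple log-linear failure-rate trend to two paths over the same faults, repaired in different orders, yields different one-step-ahead prediction errors:

```python
import math

def fit_loglinear(rates):
    """Least-squares fit of log(rate) = a + b * stage; returns (a, b)."""
    n = len(rates)
    xs = range(n)
    ys = [math.log(r) for r in rates]
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def one_step_error(rates):
    """Fit on all but the last stage, then score the held-out prediction."""
    a, b = fit_loglinear(rates[:-1])
    predicted = math.exp(a + b * (len(rates) - 1))
    return abs(predicted - rates[-1]) / rates[-1]

# Same faults, two different recovery orders (synthetic per-stage rates):
path_a = [0.30, 0.22, 0.12, 0.05]
path_b = [0.30, 0.09, 0.08, 0.05]
print(one_step_error(path_a))  # ~0.59: large relative error on this path
print(one_step_error(path_b))  # ~0.31: same model, different order, better fit
```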
A Survey on Securing Personally Identifiable Information on Smartphones
With an ever-increasing footprint, already topping 3 billion devices, smartphones have become a huge cybersecurity concern. The portability of smartphones makes them convenient for users to access and store personally identifiable information (PII); it also makes them a popular target for hackers. This survey shares practical insights derived from analyzing 16 real-life case studies that exemplify: the vulnerabilities that leave smartphones open to cybersecurity attacks; the mechanisms and attack vectors typically used to steal PII from smartphones; the potential impact of PII breaches on all parties involved; and recommended defenses to help prevent future PII losses. The contribution of this research is a set of recommended proactive measures to dramatically decrease the frequency of PII loss involving smartphones.
A Study of Existing Cross-Site Scripting Detection and Prevention Techniques Using XAMPP and VirtualBox
Most operational websites experience a cyber-attack at some point. Cross-site scripting (XSS) attacks are cited as the top website risk: more than 60 percent of web applications are vulnerable to them, and they are ultimately responsible for over 30 percent of all web application attacks. XSS attacks are complicated, and they are often used in conjunction with social engineering techniques to cause even more damage. Although prevention techniques exist, hackers still find points of vulnerability through which to launch their attacks. This project explored what XSS attacks are, examples of popular attacks, and ways to detect and prevent them. Using knowledge gained and lessons learned from analyzing prior XSS incidents, a simulation environment was built using XAMPP and VirtualBox. Four typical XSS attacks were launched in this virtual environment, and their potential to cause significant damage was measured and compared using the Common Vulnerability Scoring System (CVSS) calculator. Recommendations are offered for impeding XSS attacks, including sanitizing data, whitelisting data, implementing a content security policy, and using statistical analysis tools.
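As a minimal sketch of two of the recommended defenses, output sanitization and a content security policy (Python standard library only; the function name and policy string are illustrative, and the project itself used a PHP/XAMPP environment):

```python
import html

def render_comment(user_input: str) -> str:
    """Escape user-controlled text before it reaches the page, so a
    payload like <script>...</script> is displayed as inert text."""
    return "<p>{}</p>".format(html.escape(user_input, quote=True))

# A restrictive Content-Security-Policy header blocks inline scripts even
# if an escaping bug lets markup through (defense in depth). This policy
# string is one common illustrative choice, not a universal recommendation:
CSP_HEADER = (
    "Content-Security-Policy",
    "default-src 'self'; script-src 'self'; object-src 'none'",
)

print(render_comment('<script>alert(document.cookie)</script>'))
# -> <p>&lt;script&gt;alert(document.cookie)&lt;/script&gt;</p>
```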
The James Webb Space Telescope Mission
Twenty-six years ago a small committee report, building on earlier studies, expounded a compelling and poetic vision for the future of astronomy, calling for an infrared-optimized space telescope with an aperture of at least 4 m. With the support of their governments in the US, Europe, and Canada, 20,000 people realized that vision as the 6.5 m James Webb Space Telescope. A generation of astronomers will celebrate their accomplishments for the life of the mission, potentially as long as 20 years, and beyond. This report and the scientific discoveries that follow are extended thank-you notes to the 20,000 team members. The telescope is working perfectly, with much better image quality than expected. In this and accompanying papers, we give a brief history, describe the observatory, outline its objectives and current observing program, and discuss the inventions and people who made it possible. We cite detailed reports on the design and the measured performance on orbit.