
    Evaluating and Securing Text-Based Java Code through Static Code Analysis

    As the cyber security landscape dynamically evolves and security professionals work to keep pace, modern-day educators face the challenge of equipping a new generation for this dynamic landscape. With cyber-attacks and vulnerabilities having increased substantially in frequency and severity over the past years, it is important to design and build secure software applications from the ground up. Therefore, defensive secure coding techniques covering security concepts must be taught beginning with introductory computer science programming courses, so that students practice building secure applications. Using static analysis, this study thoroughly analyzed the Java source code in two textbooks used at the collegiate level, with the goal of giving educators a reference of resources for teaching programming concepts from a security perspective. The resources include the methods of source code analysis and relevant tools, the categorized bugs detected in the code, and compliant code examples that fix the bugs. Overall, the first text revealed a moderate bug rate: approximately 44% of the files analyzed contained either regular or security bugs. About 13% of the total bugs found were security bugs, and the most common security bug was related to the Pseudo Random vulnerability. The second text produced a slightly higher bug rate of 53.80%, with security bugs accounting for approximately 8%. Averaged across the two texts, roughly 10% of the bugs found were security bugs. These encompass security bugs such as malicious code vulnerabilities and vulnerabilities related to exposing or manipulating data in these programs.
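
    The "Pseudo Random" finding mentioned above typically refers to code that derives security-relevant values from java.util.Random, whose output is predictable. The following sketch is not taken from either textbook (class and method names are illustrative); it contrasts the flagged pattern with the usual compliant fix based on java.security.SecureRandom:

        import java.security.SecureRandom;
        import java.util.Random;

        public class TokenGenerator {

            // Non-compliant: java.util.Random is a predictable linear congruential
            // generator, so a session token derived from it can be reconstructed.
            static String weakToken() {
                Random rng = new Random();
                return Long.toHexString(rng.nextLong());
            }

            // Compliant: SecureRandom draws from a cryptographically strong source,
            // the usual remedy recommended for this class of security bug.
            static String strongToken() {
                SecureRandom rng = new SecureRandom();
                byte[] bytes = new byte[16];
                rng.nextBytes(bytes);
                StringBuilder sb = new StringBuilder();
                for (byte b : bytes) {
                    sb.append(String.format("%02x", b));
                }
                return sb.toString();
            }

            public static void main(String[] args) {
                System.out.println("weak:   " + weakToken());
                System.out.println("strong: " + strongToken());
            }
        }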

    A survey of compiler development aids

    A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph-theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools that were developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention - SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, some of their shortcomings were noted, and some of the areas of research currently in progress were inspected.
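
    The survey itself is tool-oriented, but the phase structure it tutors is easy to make concrete. The sketch below is a toy example, not drawn from the survey; it shows the first phase, lexical analysis, turning an expression into the token stream the later phases would consume:

        import java.util.ArrayList;
        import java.util.List;

        // Minimal lexer for a toy expression language: identifiers, integer
        // literals, and the operators + - * / ( ). Later phases (parsing,
        // semantic analysis, optimization, code generation) would consume
        // the token stream this produces.
        public class Lexer {

            record Token(String kind, String text) {}

            static List<Token> tokenize(String src) {
                List<Token> tokens = new ArrayList<>();
                int i = 0;
                while (i < src.length()) {
                    char c = src.charAt(i);
                    if (Character.isWhitespace(c)) {
                        i++;
                    } else if (Character.isLetter(c)) {
                        int start = i;
                        while (i < src.length() && Character.isLetterOrDigit(src.charAt(i))) i++;
                        tokens.add(new Token("IDENT", src.substring(start, i)));
                    } else if (Character.isDigit(c)) {
                        int start = i;
                        while (i < src.length() && Character.isDigit(src.charAt(i))) i++;
                        tokens.add(new Token("INT", src.substring(start, i)));
                    } else if ("+-*/()".indexOf(c) >= 0) {
                        tokens.add(new Token("OP", String.valueOf(c)));
                        i++;
                    } else {
                        throw new IllegalArgumentException("unexpected character: " + c);
                    }
                }
                return tokens;
            }

            public static void main(String[] args) {
                System.out.println(tokenize("rate * (base + 42)"));
            }
        }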

    Annales Mathematicae et Informaticae (48.)


    Algorithmic pedagogy: Using code analysis to deliver targeted supplemental instruction

    Doctor of Philosophy, Curriculum and Instruction Programs, Jacqueline D. Spears. Learning to program has long been known to be a difficult task, requiring a student both to develop fluency in the syntax and grammar of a formal programming language and to learn the problem-solving approaches and techniques of computational thinking. The successful teaching strategies of the past have involved maintaining small teacher-student ratios and large amounts of supplemental instruction in lab courses. However, recent growth in the demand for programming courses from both computer science majors and nonmajors has drastically outpaced the expansion of computer science faculty and created a shortage of available lab space and time across American universities. This study involved creating a software tool for automatically delivering targeted supplemental instruction to students based on a real-time algorithmic analysis of the program code they were writing. This approach was piloted with students enrolled in a sophomore-level object-oriented software development course. The majority of students reported finding the detection and reporting of issues in their code helpful. Moreover, students who entered the course as less proficient programmers and who utilized the tool showed a statistically significant improvement in their final exam grade over those who did not. Thus, adopting the strategy piloted in this study could improve instruction in larger classes and relieve some of the strain on overburdened computer science departments while providing additional learning benefits for students.
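
    The dissertation's actual rule set and feedback text are not described in the abstract. The sketch below is a hypothetical illustration of the general strategy it reports: scan a student's source as it is written for a known novice mistake and attach a targeted hint to the offending line (the pattern, class name, and hint wording are invented for the example):

        import java.util.ArrayList;
        import java.util.List;
        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        // Hypothetical illustration of code-analysis-driven supplemental
        // instruction: flag string comparison with == and emit a targeted hint.
        // A production tool would analyze the AST rather than raw text.
        public class HintEngine {

            // Crude textual pattern: equality or inequality against a string literal.
            private static final Pattern STRING_EQ =
                    Pattern.compile("[!=]=\\s*\"|\"\\s*[!=]=");

            static List<String> review(List<String> lines) {
                List<String> hints = new ArrayList<>();
                for (int i = 0; i < lines.size(); i++) {
                    Matcher m = STRING_EQ.matcher(lines.get(i));
                    if (m.find()) {
                        hints.add("line " + (i + 1)
                                + ": == compares object references, not contents; "
                                + "consider using .equals() for strings.");
                    }
                }
                return hints;
            }

            public static void main(String[] args) {
                List<String> student = List.of(
                        "String answer = scanner.nextLine();",
                        "if (answer == \"yes\") { grantAccess(); }");
                review(student).forEach(System.out::println);
            }
        }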

    Ethical Hacking Using Penetration Testing

    This thesis provides details of the hardware architecture and the software scripting employed to demonstrate penetration testing in a laboratory setup. The architecture depicts an organizational computing asset or environment. With the increasing number of cyber-attacks throughout the world, network security is becoming an important issue. This has motivated a large number of “ethical hackers” to develop methodologies and scripts to defend against security attacks. As it is too onerous to maintain and monitor attacks on individual hardware and software in an organization, the demand for new ways to manage security systems gave rise to the idea of penetration testing. Many research groups have designed algorithms depending on the size, type and purpose of the application to secure networks [55]. In this thesis, we create a laboratory setup replicating an organizational infrastructure to study penetration testing in a real-time server-client environment. To make this possible, we use Border Gateway Protocol (BGP) as the routing protocol, as it is widely used in current networks. Moreover, BGP exhibits a few vulnerabilities of its own, which makes the security assessment more promising. Here, we propose (a) computer-based attacks and (b) actual network-based attacks, including defense mechanisms. The thesis thus describes the way penetration testing is accomplished over a desired BGP network. The procedural generation of the packets, exploits, and payloads involves internal and external network attacks. In this thesis, we start with the details of all sub-fields in the stream of penetration testing, including their requirements and outcomes. As an informative piece of research, this thesis discusses the types of attacks on routers, switches and physical client machines. Our work also deals with the limitations of implementing penetration testing, discussing the vulnerabilities of the current standards in the technology. Furthermore, we consider the possible methodologies that require attention in order to accomplish the most efficient outcomes with penetration testing. Overall, this work has provided a great learning opportunity in the area of ethical hacking using penetration testing.
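
    The thesis's actual attack scripts, exploits, and payloads are not reproduced in the abstract. As a stand-in for the reconnaissance step that precedes them, the sketch below (hypothetical host address and port list) performs a simple TCP connect check against machines in a lab environment one owns and reports which services accept connections:

        import java.io.IOException;
        import java.net.InetSocketAddress;
        import java.net.Socket;

        // Minimal reconnaissance sketch for a lab you own: attempt a TCP
        // connection to a few well-known ports and report which accept.
        // This illustrates only the information-gathering step of a
        // penetration test, not the BGP-level attacks described in the thesis.
        public class PortCheck {

            static boolean isOpen(String host, int port, int timeoutMs) {
                try (Socket socket = new Socket()) {
                    socket.connect(new InetSocketAddress(host, port), timeoutMs);
                    return true;
                } catch (IOException e) {
                    return false;
                }
            }

            public static void main(String[] args) {
                String host = args.length > 0 ? args[0] : "192.0.2.10"; // placeholder lab host
                int[] ports = {22, 80, 179, 443};                       // 179 = BGP
                for (int port : ports) {
                    System.out.printf("%s:%d %s%n", host, port,
                            isOpen(host, port, 500) ? "open" : "closed/filtered");
                }
            }
        }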

    Development of the D-Optimality-Based Coordinate-Exchange Algorithm for an Irregular Design Space and the Mixed-Integer Nonlinear Robust Parameter Design Optimization

    Robust parameter design (RPD), originally conceptualized by Taguchi, is an effective statistical design method for continuous quality improvement that incorporates product quality into the design of processes. The primary goal of RPD is to identify optimal input variable level settings with minimum process bias and variation. Because of its practicality in reducing the inherent uncertainties associated with system performance across key product and process dimensions, the widespread application of RPD techniques to many engineering and science fields has resulted in significant improvements in product quality and process enhancement. There is little disagreement among researchers about Taguchi's basic philosophy. In response to apparent mathematical flaws surrounding his original version of RPD, researchers have closely examined alternative approaches by incorporating well-established statistical methods, particularly the response surface methodology (RSM), while accepting the main philosophy of his RPD concepts. This RSM-based RPD method predominantly employs the central composite design technique under the assumption that input variables are quantitative on a continuous scale. There are, however, a large number of practical situations in which the input variables are a combination of real-valued quantitative variables on a continuous scale and qualitative variables such as integer- and binary-valued variables. Despite the practicality of such cases in real-world engineering problems, there have been few research attempts, if any, perhaps due to the mathematical hurdle of inconsistencies between the design space in the experimental phase and the solution space in the optimization phase. For instance, the design space associated with the central composite design, which is perhaps known as the most effective response surface design for a second-order prediction model, is typically a bounded convex feasible set involving real numbers due to its inherent real-valued axial design points; however, its solution space may consist of integer and real values. Along these lines, this dissertation proposes RPD optimization models under three different scenarios. Given integer-valued constraints, this dissertation discusses why the Box-Behnken design is preferred over the central composite design and other three-level designs while maintaining constant or nearly constant prediction variance (design rotatability) associated with a second-order model. Mixed-integer nonlinear programming models embedding the Box-Behnken design are then proposed. As a solution method, the Karush-Kuhn-Tucker conditions are developed, and the sequential quadratic integer programming technique is also used. Further, given binary-valued constraints, this dissertation investigates why neither the central composite design nor the Box-Behnken design is effective. To remedy this potential problem, several 0-1 mixed-integer nonlinear programming models are proposed by laying out the foundation of a three-level factorial design with pseudo center points. For these particular models, we use standard optimization methods such as the branch-and-bound technique, the outer approximation method, and the hybrid nonlinear branch-and-cut algorithm. Finally, there exist special situations during the experimental phase that may call for reducing the number of experimental runs or for using a reduced regression model in fitting the data. Furthermore, there are special situations where the experimental design space is constrained and optimal design points should therefore be generated. In these particular situations, traditional experimental designs may not be appropriate. D-optimal experimental designs are investigated and incorporated into nonlinear programming models, as the design region is typically irregular, which may end up being a convex problem. It is believed that the research work contained in this dissertation is the initial examination of these problems in the related literature and makes a considerable contribution to the existing body of knowledge by filling research gaps.
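
    The dissertation's exact formulations are not reproduced in the abstract. For context, a standard statement of the D-optimality criterion targeted by coordinate-exchange algorithms, together with a common dual-response form of the RPD problem over a mixed continuous/integer solution space, can be written as follows (generic notation, not taken from the dissertation):

        % D-optimality: choose the design points (the rows of the model matrix X)
        % within the constrained design region that maximize the information
        % determinant; a coordinate-exchange search improves one coordinate of
        % one design point at a time.
        \max_{X} \; \det\!\big( X^{\top} X \big)

        % Dual-response RPD: minimize the estimated process variance while holding
        % the estimated mean at the target \tau, with the solution space allowing
        % integer- or binary-valued input variables.
        \begin{aligned}
          \min_{\mathbf{x}} \quad & \hat{\sigma}^{2}(\mathbf{x}) \\
          \text{s.t.} \quad & \hat{\mu}(\mathbf{x}) = \tau, \\
          & \mathbf{x} \in R, \qquad x_{j} \in \mathbb{Z} \ \text{or} \ \{0,1\} \ \text{for } j \in J.
        \end{aligned}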

    UNIT-LEVEL ISOLATION AND TESTING OF BUGGY CODE

    In real-world software development, maintenance plays a major role, and developers spend 50-80% of their time in maintenance-related activities. During software maintenance, a significant amount of effort is spent on finding and fixing bugs. In some cases, the fix does not completely eliminate the buggy behavior; though it addresses the reported problem, it fails to account for conditions that could lead to similar failures. There could be many possible reasons: the conditions may have been overlooked or difficult to reproduce, e.g., when the components that invoke the code or the underlying components it interacts with cannot put it in a state where latent errors appear. We posit that such latent errors can be discovered sooner if the buggy section can be tested more thoroughly in a separate environment, a strategy that is loosely analogous to the medical procedure of performing a biopsy, where tissue is removed, examined and subjected to a battery of tests to determine the presence of a disease. In this thesis, we propose a process in which the buggy code is extracted and isolated in a test framework. Test drivers and stubs are added to exercise the code and observe its interactions with its dependencies. We lay the groundwork for the creation of an automated tool for isolating code by studying its feasibility and investigating existing testing technologies that can facilitate the creation of such drivers and stubs. We investigate mocking frameworks, symbolic execution and model checking tools and test their capabilities by examining real bugs from the Apache Tomcat project. We demonstrate the merits of performing unit-level symbolic execution and model checking to discover runtime exceptions and logical errors. The process is shown to achieve high coverage and to uncover latent errors due to insufficient fixes.
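
    The Tomcat case studies themselves are not reproduced in the abstract. The sketch below is a hypothetical, self-contained illustration of the isolation idea: the unit under test is extracted behind a small interface, its real dependency is replaced by a hand-written stub, and a driver steers it into a corner case that the surrounding system rarely reaches (all names are invented for the example):

        import java.util.Optional;

        // Hypothetical unit-level isolation: a stub stands in for the real
        // dependency so a driver can push the extracted code into states the
        // full application rarely produces.
        public class IsolationDemo {

            // Dependency the extracted code normally talks to (e.g., a request object).
            interface HeaderSource {
                Optional<String> header(String name);
            }

            // Extracted unit under test.
            static int contentLength(HeaderSource request) {
                String value = request.header("Content-Length").orElse("0");
                // Latent error: an empty header survives the missing-value check
                // but still throws NumberFormatException when parsed.
                return Integer.parseInt(value.trim());
            }

            public static void main(String[] args) {
                // Test driver plus stub exercising the overlooked condition.
                HeaderSource emptyHeader = name -> Optional.of("");
                try {
                    contentLength(emptyHeader);
                    System.out.println("no failure observed");
                } catch (NumberFormatException e) {
                    System.out.println("latent error exposed by the stub: " + e);
                }
            }
        }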

    A PC-based data acquisition system for sub-atomic physics measurements

    Modern particle physics measurements are heavily dependent upon automated data acquisition systems (DAQ) to collect and process experiment-generated information. One research group at the University of Saskatchewan utilizes a DAQ known as the Lucid data acquisition and analysis system. This thesis examines the project undertaken to upgrade the hardware and software components of Lucid. To establish the effectiveness of the system upgrades, several performance metrics were obtained, including the system's dead time and input/output bandwidth. Hardware upgrades to Lucid consisted of replacing its aging digitization equipment with modern, faster-converting Versa-Module Eurobus (VME) technology and replacing the instrumentation processing platform with common PC hardware. The new processor platform is coupled to the instrumentation modules via a fiber-optic bridging device, the sis1100/3100 from Struck Innovative Systems. The software systems of Lucid were also modified to follow suit with the new hardware. Originally constructed to utilize a proprietary real-time operating system, the data acquisition application was ported to run under the freely available Real-Time Executive for Multiprocessor Systems (RTEMS). The device driver software provided with the sis1100/3100 interface also had to be ported for use under the RTEMS-based system. Performance measurements of the upgraded DAQ indicate that the dead time has been reduced from the order of milliseconds to the order of several tens of microseconds. This increased capability means that Lucid's users may acquire significantly more data in a shorter period of time, thereby decreasing both the statistical uncertainties and the data collection duration associated with a given experiment.

    Annales Mathematicae et Informaticae 2018
