
    Software component testing: a standard and the effectiveness of techniques

    This portfolio comprises two projects linked by the theme of software component testing, which is also often referred to as module or unit testing. One project covers its standardisation, while the other considers the analysis and evaluation of the application of selected testing techniques to an existing avionics system. The evaluation is based on empirical data obtained from fault reports relating to the avionics system. The standardisation project is based on the development of the BCS/BSI Software Component Testing Standard and the BCS/BSI Glossary of terms used in software testing, which are both included in the portfolio. The papers included for this project consider both the issues concerned with the adopted development process and the resolution of technical matters concerning the definition of the testing techniques and their associated measures. The test effectiveness project documents a retrospective analysis of an operational avionics system to determine the relative effectiveness of several software component testing techniques. The methodology differs from that used in other test effectiveness experiments in that it considers every possible set of inputs required to satisfy a testing technique, rather than arbitrarily chosen values from within this set. The three papers present the experimental methodology used, intermediate results from a failure analysis of the studied system, and the test effectiveness results for ten testing techniques, definitions for which were taken from the BCS/BSI Software Component Testing Standard. The creation of the two standards has filled a gap in both the national and international software testing standards arenas. Their production required an in-depth knowledge of software component testing techniques, the identification and use of a development process, and the negotiation of the standardisation process at a national level. The knowledge gained during this process has been disseminated by the author in the papers included as part of this portfolio. The investigation of test effectiveness has introduced a new methodology for determining the test effectiveness of software component testing techniques by means of a retrospective analysis, and so provides a new set of data that can be added to the body of empirical data on software component testing effectiveness.
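
    The distinction between picking arbitrary values from a partition and considering every input a technique requires can be made concrete with a toy example. The sketch below is not taken from the portfolio: it assumes a hypothetical component with a single integer input whose valid range is 1 to 100, and enumerates the inputs demanded by equivalence partitioning and boundary value analysis.

```python
# Illustrative only: a hypothetical component with a single integer input whose
# valid range is 1..100. The point is the difference between picking arbitrary
# values from a partition and enumerating every input a technique requires.

LOW, HIGH = 1, 100

def equivalence_partitions():
    """Full input sets for each partition (below range, in range, above range)."""
    return {
        "invalid_low": range(LOW - 1, LOW),         # just 0 here
        "valid": range(LOW, HIGH + 1),              # 1..100
        "invalid_high": range(HIGH + 1, HIGH + 2),  # just 101 here
    }

def boundary_values():
    """Every input required to satisfy two-value boundary value analysis."""
    return {LOW - 1, LOW, HIGH, HIGH + 1}

if __name__ == "__main__":
    parts = equivalence_partitions()
    print({name: f"{len(vals)} candidate inputs" for name, vals in parts.items()})
    print("boundary value set:", sorted(boundary_values()))
```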

    A simple method for detecting chaos in nature

    Chaos, or exponential sensitivity to small perturbations, appears everywhere in nature. Moreover, chaos is predicted to play diverse functional roles in living systems. A method for detecting chaos from empirical measurements should therefore be a key component of the biologist's toolkit. But classic chaos-detection tools are highly sensitive to measurement noise and break down for common edge cases, making it difficult to detect chaos in domains, like biology, where measurements are noisy. However, newer tools promise to overcome these limitations. Here, we combine several such tools into an automated processing pipeline, and show that our pipeline can detect the presence (or absence) of chaos in noisy recordings, even for difficult edge cases. As a first-pass application of our pipeline, we show that heart rate variability is not chaotic as some have proposed, and instead reflects a stochastic process in both health and disease. Our tool is easy to use and freely available.
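
    The abstract does not name the individual tools combined in the pipeline. Purely as an illustration of the kind of modern chaos test it refers to, the sketch below implements the basic 0-1 test for chaos (Gottwald and Melbourne) and applies it to the logistic map; it is a minimal stand-in, not the authors' pipeline.

```python
import numpy as np

def zero_one_test(x, c=None, seed=None):
    """Basic 0-1 test for chaos (Gottwald & Melbourne) on a 1-D series x.
    Returns K roughly in [0, 1]: near 1 suggests chaos, near 0 a regular signal."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Random frequency away from resonances, as usually recommended
    c = rng.uniform(np.pi / 5, 4 * np.pi / 5) if c is None else c
    j = np.arange(1, n + 1)
    p = np.cumsum(x * np.cos(j * c))   # translation variables
    q = np.cumsum(x * np.sin(j * c))
    ncut = n // 10                     # mean-square displacement up to n/10
    ks = np.arange(1, ncut + 1)
    M = np.array([np.mean((p[k:] - p[:-k]) ** 2 + (q[k:] - q[:-k]) ** 2) for k in ks])
    return np.corrcoef(ks, M)[0, 1]    # growth rate of M(k) versus k

def logistic_map(r, n=5000, x0=0.4):
    xs = np.empty(n)
    xs[0] = x0
    for i in range(1, n):
        xs[i] = r * xs[i - 1] * (1 - xs[i - 1])
    return xs

print(zero_one_test(logistic_map(3.9), seed=0))  # typically close to 1 (chaotic regime)
print(zero_one_test(logistic_map(3.2), seed=0))  # typically close to 0 (periodic regime)
```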

    Hypermedia for language learning: The FREE model at Coventry University

    Coventry University is pioneering the integration of hypermedia into the curriculum for the teaching of Italian language and society with the creation of a package based on Nerino Rossi's novel La neve nel bicchiere. The novel was already in use as a basic course text, and developing a hypermedia package was felt to be the ideal way of creating a more stimulating means of access to it. The procedure used in creating the package is described, as are its contents, the ways in which the students use it and the tasks they are given to perform, the feedback from the students, and its impact on their performance. The testing of the prototype has helped in creating a new cognitive model: the FREE (Fluid Role‐Exchange Environment), which functions as a fluid and interactive ‘pool’ where the three main actors, or actants, i.e. the learner, the lecturer and the computer, exchange roles. Within the FREE, students were involved in the construction and evaluation of the courseware, as well as testing the various versions of the prototype. The development and use of hypermedia inside and outside the classroom has made it possible to change both the students’ and the lecturer's attitude towards the material being learnt. However, the courseware does not seem to equip students sufficiently for essay writing, and this problem needs further investigation.

    Fault Localization Models in Debugging

    Debugging is considered a rigorous but important part of the software engineering process. For more than a decade, the software engineering research community has been exploring different techniques for removing faults from programs, but it remains quite difficult to eliminate all the faults in software programs. Thus, debugging is still a real challenge for the software debugging and maintenance community. In this paper, we briefly introduce software anomalies and fault classification, and then explain different fault localization models using the theory of diagnosis. Furthermore, we compare and contrast value-based and dependency-based models with respect to different real misbehaviours and present some insights into the debugging process. Moreover, we discuss the results of both models and highlight the shortcomings as well as the advantages of these models in terms of debugging and maintenance.
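
    To make the diagnosis-based view concrete, the following minimal sketch illustrates dependency-based fault localization in the spirit of the theory of diagnosis: each wrongly valued output yields a conflict set of the statements it depends on, and candidate diagnoses are the minimal hitting sets of those conflicts. The program, its dependencies and the observations are hypothetical, and the sketch does not reproduce the paper's value-based or dependency-based models in detail.

```python
from itertools import combinations

# Hypothetical dependencies: each output variable depends on these statements.
DEPENDS_ON = {
    "out1": {"s1", "s2", "s4"},
    "out2": {"s2", "s3"},
    "out3": {"s3", "s5"},
}

def conflict_sets(wrong_outputs):
    """Each wrongly valued output implicates the statements it depends on."""
    return [DEPENDS_ON[o] for o in wrong_outputs]

def minimal_diagnoses(conflicts):
    """Smallest sets of statements that intersect every conflict set."""
    universe = sorted(set().union(*conflicts))
    for size in range(1, len(universe) + 1):
        hits = [set(combo) for combo in combinations(universe, size)
                if all(set(combo) & c for c in conflicts)]
        if hits:
            return hits
    return []

# Observation: out1 and out2 are wrong, out3 is correct.
print(minimal_diagnoses(conflict_sets(["out1", "out2"])))  # [{'s2'}] -> s2 is the prime suspect
```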

    Using the Co-design Process to Build Non-designer Ability in Making Visual Thinking Tools

    This research is a case study of using co-design as a way of assisting the capacity-building process for an Indianapolis-based community organizer. The community organizer seeks to develop a visual thinking tool for enhancing her engagement with community participants. Community organizers face a wide array of complicated challenges, and addressing these kinds of challenges and social issues calls for innovative and inclusive approaches to community problem solving. The author hopes this case study will serve as an example of leveraging design thinking and visual thinking to support and equip more front-line workers who are non-designers to do their community jobs with a more creative problem-solving approach.

    Hiding in Plain Sight: Identifying Computational Thinking in the Ontario Elementary School Curriculum

    Given a growing digital economy with complex problems, demands are being made for education to address computational thinking (CT) – an approach to problem solving that draws on the tenets of computer science. We conducted a comprehensive content analysis of the Ontario elementary school curriculum documents for 44 CT-related terms to examine the extent to which CT may already be considered within the curriculum. The quantitative analysis strategy provided frequencies of terms, and a qualitative analysis provided information about how and where terms were being used. As predicted, results showed that while CT terms appeared mostly in Mathematics, and concepts and perspectives were cited more frequently than practices, related terms appeared across almost all disciplines and grades. Findings suggest that CT is already a relevant consideration for educators in terms of concepts and perspectives; however, CT practices should be more widely incorporated to promote 21st-century skills across disciplines. Future research would benefit from continued examination of the implementation and assessment of CT and its related concepts, practices, and perspectives.
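
    The quantitative strategy amounts to counting term occurrences across curriculum documents. The sketch below shows that kind of frequency pass; the term list and document names are hypothetical stand-ins, not the study's 44 terms or the Ontario documents.

```python
import re
from collections import Counter

# Hypothetical term list and documents; the study's actual 44 CT-related terms
# and the Ontario curriculum documents are not reproduced here.
CT_TERMS = ["algorithm", "decompose", "pattern", "abstraction", "debug", "simulate"]

def term_frequencies(text, terms=CT_TERMS):
    """Count case-insensitive occurrences of each term (allowing suffixes)."""
    counts = Counter()
    for term in terms:
        counts[term] = len(re.findall(rf"\b{re.escape(term)}\w*\b", text, re.IGNORECASE))
    return counts

documents = {
    "grade1_mathematics.txt": "Students identify patterns and follow simple algorithms.",
    "grade4_science.txt": "Learners simulate ecosystems and debug their models.",
}

for name, text in documents.items():
    print(name, dict(term_frequencies(text)))
```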

    A Model-Driven Approach for Business Process Management

    Business Process Management is a common mechanism recommended by a large number of standards for the management of companies and organizations. In software companies this practice is increasingly accepted, and companies have to adopt it if they want to be competitive. However, the effective definition of these processes, and especially their maintenance and execution, are not always easy tasks. This paper presents an approach based on the Model-Driven paradigm for Business Process Management in software companies. This solution offers a suitable mechanism that was implemented successfully in different companies with a tool named NDTQ-Framework. Funding: Ministerio de Educación y Ciencia TIN2010-20057-C03-02; Junta de Andalucía TIC-578.
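
    As a rough illustration of the model-driven idea (a process captured once as a model and transformed automatically into other artefacts), the sketch below defines a toy process model and generates a checklist from it. It is a generic example under assumed names, not the transformations implemented in NDTQ-Framework.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical metamodel and transformation; NDTQ-Framework's actual models and
# generators are not reproduced here.

@dataclass
class Task:
    name: str
    role: str

@dataclass
class Process:
    name: str
    tasks: List[Task] = field(default_factory=list)

def to_checklist(process: Process) -> str:
    """A simple model-to-text transformation: generate a checklist from the model."""
    lines = [f"Process: {process.name}"]
    lines += [f"  [ ] {t.name} (responsible: {t.role})" for t in process.tasks]
    return "\n".join(lines)

review = Process("Code review", [
    Task("Prepare review package", "Author"),
    Task("Inspect artefacts", "Reviewer"),
    Task("Record findings", "Moderator"),
])
print(to_checklist(review))
```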

    Applications of Machine Learning to Threat Intelligence, Intrusion Detection and Malware

    Artificial Intelligence (AI) and Machine Learning (ML) are emerging technologies with applications to many fields. This paper is a survey of use cases of ML for threat intelligence, intrusion detection, and malware analysis and detection. Threat intelligence, especially attack attribution, can benefit from the use of ML classification. False positives from rule-based intrusion detection systems can be reduced with the use of ML models. Malware analysis and classification can be made easier by developing ML frameworks to distill similarities among malicious programs. Adversarial machine learning will also be discussed, because while ML can be used to solve problems or reduce analyst workload, it also introduces new attack surfaces.
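
    The false-positive-reduction use case can be pictured as a small supervised-learning exercise: train a classifier to decide whether an alert from a rule-based IDS is a true intrusion. The features, data and model in the sketch below are synthetic placeholders chosen for illustration, not drawn from the surveyed work.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic placeholder data: each row is an alert raised by a rule-based IDS,
# described by hypothetical features; label 1 means a true intrusion.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))             # [bytes_sent, distinct_ports, failed_logins, duration]
y = (X[:, 1] + X[:, 2] > 1.0).astype(int)  # made-up ground truth for the sketch

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Alerts predicted as 0 could be suppressed or deprioritised, cutting false positives.
print(classification_report(y_test, clf.predict(X_test)))
```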

    Development and implementation of a LabVIEW based SCADA system for a meshed multi-terminal VSC-HVDC grid scaled platform

    This project is oriented towards the development of Supervisory Control and Data Acquisition (SCADA) software to control and supervise electrical variables from a scaled platform that represents a meshed HVDC grid, employing National Instruments hardware and the LabVIEW environment. The objective is to obtain real-time visualization of DC and AC electrical variables and lossless data stream acquisition. The acquisition system hardware elements have been configured, tested and installed on the grid platform. The system is composed of three chassis with integrated Field-Programmable Gate Arrays (FPGAs), each inside a VSC terminal cabinet; one of them is connected to a local processor via PCI bus, and the others via Ethernet through a switch. Analogue acquisition modules, where A/D conversion takes place, are inserted into the chassis. A personal computer is used as host, display terminal and storage space. There are two main modes of access to the FPGAs through the real-time system: a Scan-mode VI has been implemented to monitor all the DC signals of the grid, and a faster FPGA access mode VI to monitor the AC and DC values of one converter. The FPGA application consists of two tasks running at different rates, and a FIFO has been implemented so that they can communicate without data loss. Multiple structures have been tested on the grid platform and evaluated, ensuring compliance with previously established specifications such as sampling and scanning rate, screen refresh rate and possible data loss. Additionally, a turbine emulator was implemented in LabVIEW for further testing.
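
    The two-rate task structure with a lossless FIFO is a standard producer-consumer pattern. The sketch below reproduces that pattern in Python purely for illustration; the actual implementation described above is LabVIEW code on the FPGA and real-time targets, and the rates, sample count and buffer size here are made up.

```python
import queue
import threading
import time

fifo = queue.Queue(maxsize=10_000)   # bounded FIFO between the two tasks
N_SAMPLES = 200

def acquisition_task():
    """Fast loop: pushes every sample into the FIFO (blocks only if it is full)."""
    for i in range(N_SAMPLES):
        fifo.put(("sample", i))
        time.sleep(0.001)            # stand-in for the acquisition period
    fifo.put(None)                   # sentinel: acquisition finished

def logging_task():
    """Slower loop: drains the FIFO so no sample is ever overwritten or dropped."""
    received = 0
    while True:
        item = fifo.get()
        if item is None:
            break
        received += 1
    print(f"received {received} of {N_SAMPLES} samples, none lost")

producer = threading.Thread(target=acquisition_task)
consumer = threading.Thread(target=logging_task)
producer.start(); consumer.start()
producer.join(); consumer.join()
```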