    A Survey on Economic-driven Evaluations of Information Technology

    The economic-driven evaluation of information technology (IT) has become an important instrument in the management of IT projects. Numerous approaches have been developed to quantify the costs of an IT investment and its assumed profit, to evaluate its impact on business process performance, and to analyze the role of IT in achieving enterprise objectives. This paper discusses approaches for evaluating IT from an economic-driven perspective. Our comparison is based on a framework distinguishing between classification criteria and evaluation criteria. The former allow for the categorization of evaluation approaches based on their similarities and differences. The latter, by contrast, represent attributes that allow the discussed approaches to be evaluated. Finally, we give an example of a typical economic-driven IT evaluation.
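    The abstract does not spell out which economic measures are used; as a minimal sketch, the net present value (NPV) of an IT investment is one typical way such approaches weigh upfront costs against assumed profit. All figures below are illustrative, not from the paper.

```python
# Minimal sketch (not from the paper): net present value of an IT investment,
# one typical measure for weighing upfront costs against assumed yearly profit.
def npv(initial_cost: float, yearly_cash_flows: list[float], discount_rate: float) -> float:
    """Discount each year's net cash flow and subtract the upfront investment."""
    discounted = sum(cf / (1 + discount_rate) ** year
                     for year, cf in enumerate(yearly_cash_flows, start=1))
    return discounted - initial_cost

# Illustrative figures: 100k upfront, 40k net benefit per year for 4 years, 10% rate.
print(round(npv(100_000, [40_000] * 4, 0.10), 2))  # 26794.62 -> the investment pays off
```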

    Software Measurement Activities in Small and Medium Enterprises: an Empirical Assessment

    An empirical study evaluating the proper implementation of measurement/metric programs in software companies in one region of Turkey is presented. The research questions are discussed and validated with the help of senior software managers (more than 15 years’ experience) and then used for interviewing a variety of small and medium-scale software companies in Ankara. Observations show a common reluctance and lack of interest in utilizing measurements/metrics, despite the fact that they are well known in the industry. A by-product of this research is the finding that internationally recognized standards such as ISO and CMMI are pursued only if they are part of project/job requirements; without such requirements, introducing these standards to the companies remains a long-term target for increasing quality.

    A Methodological Framework for Evaluating Software Testing Techniques and Tools

    There exists a real need in industry for guidelines on which testing techniques to use for different testing objectives, and on how usable (effective, efficient, satisfactory) these techniques are. To date, such guidelines do not exist. They could be obtained by conducting secondary studies on a body of evidence consisting of case studies evaluating and comparing testing techniques and tools. However, such a body of evidence is also lacking. In this paper, we make a first step towards creating such a body of evidence by defining a general methodological evaluation framework that can simplify the design of case studies for comparing software testing tools, and make the results more precise, reliable, and easy to compare. Using this framework, (1) software testing practitioners can more easily define case studies through an instantiation of the framework, (2) results can be better compared since they are all executed according to a similar design, (3) the gap in existing work on methodological evaluation frameworks will be narrowed, and (4) a body of evidence will be initiated. To validate the framework, we present successful applications of it to various case studies for evaluating testing tools in an industrial environment with real objects and real subjects. This work was funded by the European project FITTEST (ICT257574, 2010-2013) and the Spanish national project CaSA-Calidad (TIN2010-12312-E, Ministerio de Ciencia e Innovación). Vos, TE.; Marín, B.; Escalona, MJ.; Marchetto, A. (2012). A Methodological Framework for Evaluating Software Testing Techniques and Tools. IEEE. https://doi.org/10.1109/QSIC.2012.16
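    The abstract does not enumerate the framework's fields, so the dataclass below is a hypothetical sketch of how a case study comparing testing tools might be instantiated in a uniform way; every field name is an assumption for illustration.

```python
# Hypothetical sketch only: field names are assumptions, not the framework's
# actual terminology. It illustrates instantiating a uniform case-study design
# for comparing a testing tool against current practice on real objects/subjects.
from dataclasses import dataclass, field

@dataclass
class CaseStudy:
    tool_under_evaluation: str                          # e.g. an automated test generation tool
    baseline: str                                       # the current practice it is compared against
    objects: list[str]                                  # real software systems used as test objects
    subjects: list[str]                                 # practitioners who apply the techniques
    effectiveness: dict = field(default_factory=dict)   # e.g. faults detected, coverage reached
    efficiency: dict = field(default_factory=dict)      # e.g. effort spent in person-hours

study = CaseStudy(
    tool_under_evaluation="automated test generator",
    baseline="manual test design",
    objects=["billing subsystem"],
    subjects=["two industrial testers"],
    effectiveness={"faults_detected": 12, "branch_coverage": 0.78},
    efficiency={"person_hours": 16},
)
print(study.tool_under_evaluation, study.effectiveness)
```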

    AM-OER: An Agile Method for the Development of Open Educational Resources

    Open Educational Resources have emerged as important elements of education in contemporary society, promoting life-long and personalized learning that transcends social, economic and geographical barriers. To achieve the potential of OERs and have an impact on education, it is necessary to increase their development and supply. However, one of the current challenges is how to produce quality, relevant OERs that can be reused and adapted to different contexts and learning situations. In this paper we propose an agile method for the development of OERs – AM-OER – grounded in agile practices from Software Engineering. Learning Design practices from the OULDI project (UK Open University) are also embedded into AM-OER to improve quality and facilitate reuse and adaptation of OERs. To validate AM-OER, an experiment was conducted by applying it in the development of an OER on software testing. The results show preliminary evidence of the applicability, effectiveness and efficiency of the method in the development of OERs.

    Evaluating software testing techniques and tools

    Case studies can help companies evaluate the benefits of testing techniques and tools before their possible incorporation into the testing process. Although general guidelines and organizational frameworks exist describing what a case study should consist of, no general methodological framework exists that can be instantiated to easily design case studies for evaluating different testing techniques. In this paper we define a first version of a general methodological framework for evaluating software testing techniques, which focuses on the evaluation of effectiveness and efficiency. Using this framework, (1) software testing practitioners can more easily define case studies through an instantiation of the framework, (2) results can be better compared since they are all executed according to a similar design, and (3) the gap in existing work on methodological evaluation frameworks will be narrowed.
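    The abstract leaves the effectiveness and efficiency measures unspecified; the sketch below uses two commonly adopted definitions (fraction of known faults detected, and faults detected per person-hour) purely as an illustration, not as the paper's actual metrics.

```python
# Illustrative sketch: commonly used measures when comparing testing techniques,
# not necessarily the definitions adopted by the framework in the paper.
def effectiveness(faults_detected: int, known_faults: int) -> float:
    """Fraction of the known (seeded or previously reported) faults that were found."""
    return faults_detected / known_faults

def efficiency(faults_detected: int, effort_hours: float) -> float:
    """Faults detected per person-hour of testing effort."""
    return faults_detected / effort_hours

# Hypothetical comparison of two techniques applied to the same test object.
print(effectiveness(9, 12), efficiency(9, 6.0))   # technique A: 0.75, 1.5 faults/hour
print(effectiveness(7, 12), efficiency(7, 3.5))   # technique B: ~0.58, 2.0 faults/hour
```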

    FFT-Based Deep Learning Deployment in Embedded Systems

    Deep learning has demonstrated its power in many application domains, especially image and speech recognition. As the backbone of deep learning, deep neural networks (DNNs) consist of multiple layers of various types with hundreds to thousands of neurons. Embedded platforms are now becoming essential for deep learning deployment due to their portability, versatility, and energy efficiency. The large model size of DNNs, while providing excellent accuracy, also burdens embedded platforms with intensive computation and storage. Researchers have investigated reducing DNN model size with negligible accuracy loss. This work proposes a Fast Fourier Transform (FFT)-based DNN training and inference model suitable for embedded platforms, with reduced asymptotic complexity of both computation and storage, which distinguishes our approach from existing ones. We develop the training and inference algorithms with the FFT as the computing kernel and deploy the FFT-based inference model on embedded platforms, achieving extraordinary processing speed. Comment: Design, Automation, and Test in Europe (DATE). For source code, please contact Mahdi Nazemi at <[email protected]
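    As a minimal sketch of the general idea (not the paper's exact layer formulation), the snippet below shows how a matrix-vector product with a circulant weight matrix can be computed via FFT in O(n log n) time and O(n) storage instead of O(n^2), which is the kind of asymptotic saving FFT-based approaches exploit.

```python
# Minimal sketch, assuming a circulant weight matrix; the paper's exact
# formulation may differ. FFT turns the matrix-vector product into an
# elementwise product in the frequency domain: O(n log n) time, O(n) storage.
import numpy as np

def circulant_matvec_fft(kernel: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Multiply the circulant matrix whose first column is `kernel` by x via FFT."""
    return np.real(np.fft.ifft(np.fft.fft(kernel) * np.fft.fft(x)))

rng = np.random.default_rng(0)
n = 8
kernel = rng.standard_normal(n)
x = rng.standard_normal(n)

# Dense reference: build the full circulant matrix (column j is kernel rolled by j).
dense = np.stack([np.roll(kernel, j) for j in range(n)], axis=1)
print(np.allclose(dense @ x, circulant_matvec_fft(kernel, x)))  # True
```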