Fault-based regression testing in a reactive environment
Regression testing is the process of retesting software after modification. It is a major factor contributing to the high cost of software maintenance; to control this cost, regression testing must be accomplished efficiently through effective reuse of test cases and judicious generation of new ones. Fault-based testing focuses on the detection of particular classes of faults. RELAY is a fault-based testing technique that guarantees the detection of errors caused by any fault in a chosen fault classification. RELAY can be used as a regression testing technique to generate the test cases required to demonstrate that a modification is properly made. In addition, the information related to a test case chosen to detect a potential fault guides the selection of previously chosen test cases that should be reused for a given modification. This paper presents the concepts behind RELAY and discusses how RELAY could be used as a regression testing technique. It also describes a testing environment, based on integrating the RELAY model with other testing techniques, that supports reactive regression testing as well as testing throughout the development lifecycle.
Business Process Innovation using the Process Innovation Laboratory
Most organizations today are required not only to establish effective business processes but also to accommodate changing business conditions at an increasing rate. Many business processes extend beyond the boundary of the enterprise into the supply chain, and the information infrastructure is therefore critical. Today nearly every business relies on its Enterprise System (ES) for process integration, and future generations of enterprise systems will increasingly be driven by business process models. Consequently, process modeling and improvement will become vital for business process innovation (BPI) in future organizations. There is a significant body of knowledge on various aspects of process innovation, e.g. on conceptual modeling, business processes, supply chains and enterprise systems, yet an overall comprehensive and consistent theoretical framework with guidelines for practical application has not been identified. The aim of this paper is to establish a conceptual framework for business process innovation in the supply chain based on advanced enterprise systems. The main approach to business process innovation in this context is to create a new methodology for exploring process models and patterns of applications. The paper thus presents a new concept for business process innovation called the process innovation laboratory, a.k.a. the Π-Lab. The Π-Lab is a comprehensive framework for BPI using advanced enterprise systems: a collaborative workspace for experimenting with process models and an explorative approach to studying integrated modeling in a controlled environment. The Π-Lab facilitates innovation by using an integrated action learning approach to process modeling, including contemporary technological, organizational and business perspectives.
Qualitative Case Studies in Operations Management: Trends, Research Outcomes, and Future Research Implications
Our study examines the state of qualitative case studies in operations management. Five main operations management journals are included for their impact on the field. They are, in alphabetical order: Decision Sciences, International Journal of Operations and Production Management, Journal of Operations Management, Management Science, and Production and Operations Management. The qualitative case studies chosen were published between 1992 and 2007. With an increasing trend toward using more qualitative case studies, there have been meaningful and significant contributions to the field of operations management, especially in the area of theory building. However, in many of the qualitative case studies we reviewed, sufficient details in research design, data collection, and data analysis were missing. For instance, there are studies that do not offer sampling logic or a description of the analysis through which research outcomes are drawn. Further, research protocols for doing inductive case studies are much better developed than those for doing deductive case studies. Consequently, there is a lack of consistency in the way the case method has been applied. As qualitative researchers, we offer suggestions on how we can improve on what we have done and elevate the level of rigor and consistency.
Requirements traceability in model-driven development: Applying model and transformation conformance
The variety of design artifacts (models) produced in a model-driven design process results in an intricate relationship between requirements and the various models. This paper proposes a methodological framework that simplifies management of this relationship and helps in assessing the quality of models, realizations and transformation specifications. Our framework is a basis for understanding requirements traceability in model-driven development, as well as for the design of tools that support requirements traceability in model-driven development processes. We propose a notion of conformance between application models that reduces the effort needed for assessment activities, and we discuss how this notion of conformance can be integrated with model transformations.
J-PET Framework: Software platform for PET tomography data reconstruction and analysis
J-PET Framework is an open-source software platform for data analysis, written in C++ and based on the ROOT package. It provides a common environment for the implementation of reconstruction, calibration and filtering procedures, as well as for user-level analyses of Positron Emission Tomography data. The library contains a set of building blocks that can be combined, even by users with little programming experience, into chains of processing tasks through a convenient, simple and well-documented API. The generic input-output interface allows processing data from various sources: low-level data from the tomography acquisition system or from diagnostic setups such as digital oscilloscopes, as well as high-level tomography structures, e.g. sinograms or a list of lines-of-response. Moreover, the environment can be interfaced with Monte Carlo simulation packages such as GEANT and GATE, which are commonly used in the medical scientific community.