
    Cost of Quality in Software Products: An Empirical Analysis

    Computer software has emerged as a major worldwide industry, estimated at $450B for 1995, of which $225B is attributable to US firms [Boehm, 1987]. Yet, in many organizations, costs and schedules for software projects are largely unpredictable, and product quality is often poor [DeMarco and Lister, 1993]. This underscores the need to study both the quality of the software product and the life-cycle cost incurred in the development and maintenance of the product. Increasing expenditure on software has caught the attention of researchers. Identifying software productivity factors and estimating software costs continue to be important research topics [Mukhopadhyay and Kekre, 1992; Banker et al., 1993]. Researchers have adopted both empirical and theoretical approaches to better understand the process of software development and maintenance. Though software cost continues to be an important research question, competition in the software industry and the increased role of software in everyday life have also made development cycle time and quality important research issues. The quality of software has been studied mainly from defect analysis and software maintenance perspectives. Empirical research has analyzed tradeoffs between software quality and maintenance, and examined drivers of software maintenance costs [Banker et al., 1993]. Practitioners in the software industry still face the challenge of understanding the key tradeoffs in a software project in order to deliver quality products to customers on time and without cost overruns. This underscores the need to study the various factors that influence the life-cycle cost and quality of software products. Moreover, the effect of the process used in a software project on the outcome of that project, in terms of cost to the software developer and quality of the product, has not been examined rigorously.
    Thus, in this research, we propose to model the life-cycle cost and quality of software products based on the factors related to product, people, process and technology deployed in the software project.

    A Methodology for Variability Reduction in Manufacturing Cost Estimating in the Automotive Industry based on Design Features

    Organised by: Cranfield University
    Small to medium manufacturing companies are coming to realise the increasing importance of performing fast and accurate cost estimates at the early stages of projects to address customers' requests for quotation. However, they cannot afford the implementation of knowledge-based cost estimating software. This paper explains the development and validation of a consistent methodology for the cost estimating of manufactured parts (focused on pistons) based on design features. The research enabled the identification of the sources of variability in cost estimates, the main one being the lack of formal procedures for cost estimating in manufacturing SMEs. Finally, a software prototype was developed that reduces the variability in cost estimates by defining a formal procedure, following the most appropriate cost estimating techniques.
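
    The feature-based estimating idea above can be sketched in a few lines. A minimal illustration, assuming hypothetical feature names, rates and setup cost (the paper's actual piston features and formal procedure are not reproduced here):

```python
# Hypothetical per-feature machining rates (currency units per feature);
# in practice these would come from the company's formal estimating procedure.
FEATURE_RATES = {
    "turned_diameter": 1.8,
    "drilled_hole": 0.6,
    "groove": 1.1,
}

def estimate_part_cost(features, setup_cost=25.0):
    """Estimate a part's manufacturing cost as a fixed setup cost plus
    the sum of per-feature costs (rate x quantity)."""
    return setup_cost + sum(FEATURE_RATES[name] * qty
                            for name, qty in features.items())
```

Because the rates live in one table and the procedure is a single formula, two estimators given the same feature list produce the same number, which is exactly the variability reduction the methodology targets.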

    Short interval control for the cost estimate baseline of novel high value manufacturing products – a complexity based approach

    Novel high value manufacturing products by default lack the minimum a priori data needed for forecasting cost variance over time using regression-based techniques. Forecasts which attempt this therefore suffer from significant variance, which in turn places significant strain on budgetary assumptions and financial planning. The authors argue that for novel high value manufacturing products short interval control through continuous revision is necessary until the context of the baseline estimate stabilises sufficiently to extend the time intervals for revision. Case study data from the United States Department of Defense Scheduled Annual Summary Reports (1986-2013) is used to exemplify the approach. In this respect it must be remembered that the context of a baseline cost estimate is subject to a large number of assumptions regarding future plausible scenarios, the probability of such scenarios, and various requirements related to them. These assumptions change over time, and the degree of their change is indicated by the extent to which cost variance follows a forecast propagation curve defined in advance. The presented approach determines the stability of this context by calculating the effort required to identify a propagation pattern for cost variance, using the principles of Kolmogorov complexity. Only when that effort remains stable over a sufficient period of time can the revision periods for the cost estimate baseline be changed from continuous to discrete time intervals. The practical implication of the presented approach for novel high value manufacturing products is that attention shifts from the bottom-up or parametric estimation activity to the continuous management of the context for the cost estimate itself. This in turn enables a faster and more sustainable stabilisation of the estimating context, which then creates the conditions for reducing cost estimate uncertainty in an actionable and timely manner.
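
    Kolmogorov complexity is uncomputable, so in practice the "effort to identify a pattern" is approximated, compressed length being a standard proxy. A minimal sketch under that assumption (the paper's exact complexity measure and stability criterion are not reproduced here; the window and tolerance are illustrative):

```python
import zlib

def complexity_effort(cost_variances, precision=2):
    """Upper-bound proxy for the Kolmogorov complexity of a cost-variance
    series: the length of its zlib-compressed textual encoding."""
    encoded = ",".join(f"{v:.{precision}f}" for v in cost_variances).encode()
    return len(zlib.compress(encoded, 9))

def context_is_stable(series, window=4, tolerance=0.1):
    """Treat the estimating context as stable once the compression effort
    of growing prefixes stops changing by more than `tolerance`."""
    efforts = [complexity_effort(series[:i + window])
               for i in range(len(series) - window + 1)]
    if len(efforts) < 2:
        return False
    prev, last = efforts[-2], efforts[-1]
    return abs(last - prev) / prev <= tolerance
```

A repetitive variance series compresses to far fewer bytes than an erratic one; once the effort plateaus, the revision interval can be lengthened as the abstract describes.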

    From critical success factors to critical success processes

    After myriad studies into the main causes of project failure, almost every project manager can list the main factors that distinguish project failure from project success. These factors are usually called Critical Success Factors (CSF). However, despite the fact that CSF are well known, the rate of failed projects remains very high. This may be because current CSF are too general and do not contain specific enough know-how to better support project managers' decision-making. This paper analyses the impact of 16 specific planning processes on project success and identifies the Critical Success Processes (CSP) to which project success is most vulnerable. Results are based on a field study that involved 282 project managers. It was found that the most critical planning processes, those with the greatest impact on project success, are "definition of activities to be performed in the project", "schedule development", "organizational planning", "staff acquisition", "communications planning" and "developing a project plan". It was also found that project managers usually do not divide their time effectively among the different processes according to their influence on project success.

    Life cycle assessment (LCA) applied to the process industry: a review

    Purpose: Life cycle assessment (LCA) methodology is a well-established analytical method to quantify environmental impacts, which has mainly been applied to products. However, recent literature suggests that it also has potential as an analysis and design tool for processes, and stresses that one of the biggest challenges of this decade in the field of process systems engineering (PSE) is the development of tools for environmental considerations. Method: This article gives an overview of the integration of LCA methodology in the context of industrial ecology, and focuses on the use of this methodology for environmental considerations in process design and optimization. Results: The review identifies that LCA is often used within multi-objective optimization of processes: practitioners use LCA to obtain the inventory and inject the results into the optimization model. It also shows that most of the LCA studies undertaken on process analysis consider the unit processes as black boxes and build the inventory analysis on fixed operating conditions. Conclusions: The article highlights the value of better integrating PSE tools with LCA methodology in order to produce a more detailed analysis. This would allow optimizing the influence of process operating conditions on environmental impacts and incorporating detailed environmental results into process industry practice.
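
    The "inject LCA results into the optimization model" pattern amounts to scalarizing cost and environmental impact into one objective. A minimal sketch with made-up cost and impact functions (all names, curves and coefficients are assumptions for illustration, not from the reviewed studies):

```python
def production_cost(temp_c):
    # Hypothetical cost curve with an optimum near 180 degrees C
    return 100.0 + 0.5 * (temp_c - 180.0) ** 2 / 100.0

def lca_impact(temp_c):
    # Hypothetical LCA inventory result, e.g. kg CO2-eq per unit,
    # rising with operating temperature
    return 2.0 + 0.02 * temp_c

def best_operating_point(weight_env=0.5, temps=range(100, 301, 5)):
    """Grid search for the operating temperature minimizing a weighted
    sum of production cost and LCA-derived environmental impact."""
    return min(temps, key=lambda t: (1.0 - weight_env) * production_cost(t)
                                    + weight_env * lca_impact(t))
```

Treating the operating condition as a decision variable, rather than fixing it, is precisely the gap the review identifies in black-box LCA process studies.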

    Evaluating the quality of project planning: a model and field results

    Faulty planning will result in project failure, whereas high-quality project planning increases the project's chances of success. The paper reports on the successful development and implementation of a model aimed at evaluating the quality of project planning. The model is based on both the abilities required of the project manager and the organizational support required for a proper project management infrastructure. The model was validated and applied by 282 project managers in nine organizations, where strong and weak planning processes were identified and analysed.

    Criteria for the Diploma qualifications in information technology at levels 1, 2 and 3


    An approach for selecting cost estimation techniques for innovative high value manufacturing products

    This paper presents an approach for determining the most appropriate technique for cost estimation of innovative high value manufacturing products, depending on the amount of prior data available. Case study data from the United States Scheduled Annual Summary Reports for the Joint Strike Fighter (1997-2010) is used to exemplify how, depending on the attributes of a priori data, certain techniques for cost estimation are more suitable than others. The data attribute focused on is the computational complexity involved in identifying whether or not there are patterns suited for propagation. Computational complexity is calculated based upon established mathematical principles for pattern recognition, which argue that at least 42 data sets are required for the application of standard regression analysis techniques. The paper proposes that below this threshold a generic dependency model and starting conditions should be used and iteratively adapted to the context. In the special case of fewer than four datasets being available, it is suggested that no contemporary cost estimating techniques other than analogy or expert opinion are currently applicable, and alternate techniques must be explored if more quantitative results are desired. By applying the mathematical principles of complexity groups, the paper argues that when fewer than four consecutive datasets are available the principles of topological data analysis should be applied. The preconditions are that the cost variance of at least three cost variance types for one to three discrete time intervals is available, so that it can be quantified based upon its geometrical attributes, visualised as an n-dimensional point cloud, and then evaluated based upon the symmetrical properties of the evolving shape. Further work is suggested to validate the provided decision trees in cost estimating practice.
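
    The data-volume thresholds above translate directly into a selection rule. A sketch of that decision logic (technique labels are paraphrased from the abstract, not the paper's exact taxonomy):

```python
def select_estimation_technique(n_datasets):
    """Pick a cost-estimating technique from the amount of a priori data:
    42+ data sets for standard regression, 4-41 for a generic dependency
    model adapted iteratively, and fewer than 4 for analogy/expert opinion,
    with topological data analysis as the suggested quantitative alternative."""
    if n_datasets >= 42:
        return "standard regression analysis"
    if n_datasets >= 4:
        return "generic dependency model, iteratively adapted"
    return "analogy or expert opinion; consider topological data analysis"
```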

    Value stream mapping for software development process

    Includes bibliographical references