10,201 research outputs found

    Product Development Process Modeling Using Advanced Simulation

    This paper presents a product development process modeling and analysis technique using advanced simulation. The model computes the probability distribution of lead time in a resource-constrained project network where iterations take place among sequential, parallel and overlapped tasks. The model uses the design structure matrix representation to capture the information flows between tasks. In each simulation run, the expected durations of tasks are initially sampled using the Latin Hypercube Sampling method and decrease over time as the model simulates the progress of dynamic stochastic processes. It is assumed that the rework of a task occurs for the following reasons: (1) new information is obtained from overlapped tasks after starting to work with preliminary inputs, (2) inputs change when other tasks are reworked, and (3) outputs fail to meet established criteria. The model can be used for better project planning and control by identifying leverage points for process improvements and evaluating alternative planning and execution strategies. An industrial example is used to illustrate the utility of the model.
    Center for Innovation in Product Development
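The sampling step described above can be sketched in a few lines; note that the task durations, rework probabilities, and the single-pass rework rule below are hypothetical simplifications for illustration, not the paper's actual model of sequential, parallel and overlapped tasks:

```python
import random

def latin_hypercube(n_samples, n_vars, rng):
    """One stratified sample per interval [i/n, (i+1)/n) for each variable,
    with an independent shuffle per variable."""
    cols = []
    for _ in range(n_vars):
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        cols.append(col)
    # transpose: one row of task-duration quantiles per simulation run
    return list(zip(*cols))

def simulate_lead_time(durations, rework_prob, rng):
    """Serial pass through the tasks; a rework event repeats the task once
    at half duration (a crude stand-in for learning effects)."""
    total = 0.0
    for d, p in zip(durations, rework_prob):
        total += d
        if rng.random() < p:
            total += 0.5 * d
    return total

rng = random.Random(42)
base = [10.0, 6.0, 8.0]    # expected task durations (hypothetical)
rework = [0.2, 0.5, 0.3]   # per-task rework likelihood (hypothetical)
runs = latin_hypercube(200, len(base), rng)
lead_times = [
    simulate_lead_time([b * (0.8 + 0.4 * u) for b, u in zip(base, row)],
                       rework, rng)
    for row in runs
]
mean_lt = sum(lead_times) / len(lead_times)
```

Collecting `lead_times` over many runs yields the empirical lead-time distribution the paper analyzes; the stratification guarantees each duration range is sampled exactly once per batch.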

    Identifying and evaluating parallel design activities using the design structure matrix

    This paper describes an approach based upon the Design Structure Matrix (DSM) for identifying, evaluating and optimising one aspect of Concurrent Engineering (CE): activity parallelism. CE has placed emphasis on the management of the product development process, and one of its major benefits is the reduction in lead-time and product cost [1]. One approach that CE promotes for the reduction of lead-time is the simultaneous enactment of activities, otherwise known as Simultaneous Engineering. Whilst activity parallelism may contribute to the reduction in lead-time and product cost, the effect of iteration is also recognised as a contributing factor on lead-time, and hence was also included within the investigation. The paper describes how parallel activities may be identified within the DSM, before detailing how a process may be evaluated with respect to parallelism and iteration using the DSM. An optimisation algorithm is then utilised to establish a near-optimal sequence for the activities with respect to parallelism and iteration. DSM-based processes from previously published research are used to describe the development of the approach.
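How parallel activities might be read off a DSM can be shown with a small sketch; the greedy staging rule and the four-task matrix below are invented for illustration and are not the paper's actual identification algorithm:

```python
def parallel_stages(dsm, order):
    """Greedily group tasks (taken in the given sequence) into stages whose
    members are mutually independent in the DSM, so each stage can be
    enacted concurrently. dsm[i][j] == 1 means task i needs input from j."""
    stages = []
    for t in order:
        if stages and all(dsm[t][s] == 0 and dsm[s][t] == 0
                          for s in stages[-1]):
            stages[-1].append(t)   # independent of the whole current stage
        else:
            stages.append([t])     # dependency found: start a new stage
    return stages

# 4 tasks: B and C both depend only on A; D depends on B and C.
dsm = [
    [0, 0, 0, 0],  # A
    [1, 0, 0, 0],  # B needs A
    [1, 0, 0, 0],  # C needs A
    [0, 1, 1, 0],  # D needs B and C
]
stages = parallel_stages(dsm, [0, 1, 2, 3])
# → [[0], [1, 2], [3]]: B and C form a parallel stage
```

A sequence evaluation in the paper's spirit would then score orderings by the number of such stages (parallelism) and by entries above the diagonal (iteration/feedback).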

    Modeling and Managing Engineering Changes in a Complex Product Development Process

    Today's hyper-competitive worldwide market, turbulent environment, demanding customers, and diverse technological advancements force any corporation that develops new products to look into all the possible areas of improvement in the entire product lifecycle management process. One of the areas that both scholars and practitioners have overlooked in the past is Engineering Change Management (ECM). The vision behind this dissertation is to ultimately bridge this gap by identifying main characteristics of a New Product Development (NPD) process that are potentially associated with the occurrence and magnitude of iterations and Engineering Changes (ECs), developing means to quantify these characteristics as well as the interrelationships between them in a computer simulation model, testing the effects of different parameter settings and various coordination policies on project performance, and finally gaining operational insights considering all relevant EC impacts. The causes for four major ECM problems (occurrence of ECs, long EC lead time, high EC cost, and occurrence frequency of iterations and ECs) are first discussed diagrammatically and qualitatively. Factors that contribute to particular system behavior patterns and the causal links between them are identified through the exploratory construction of causal/causal-loop diagrams. To further understand the nature of NPD/ECM problems and verify the key assumptions made in the conceptual causal framework, three field survey studies were conducted in the summers of 2010 and 2011. Information and data were collected to assess the current practice in the automobile and information technology industries where EC problems are commonly encountered. Based upon the intuitive understanding gained from this preparatory work, a Discrete Event Simulation (DES) model is proposed.
In addition to combining essential project features, such as concurrent engineering, cross-functional integration, resource constraints, etc., it is distinct from existing research by introducing the capability of differentiating and characterizing various levels of uncertainties (activity uncertainty, solution uncertainty, and environmental uncertainty) that are dynamically associated with an NPD project and consequently result in stochastic occurrence of NPD iterations and ECs of two different types (emergent ECs and initiated ECs) as the project unfolds. Moreover, feedback-loop relationships among model variables are included in the DES model to enable more accurate prediction of dynamic work flow. Using a numerical example, different project-related model features (e.g., learning curve effects, rework likelihood, and level of dependency of product configuration) and coordination policies (e.g., overlapping strategy, rework review strategy, IEC batching policy, and resource allocation policy) are tested and analyzed in detail concerning three major performance indicators: lead time, cost, and quality, based on which decision-making suggestions regarding EC impacts are drawn from a systems perspective. Simulation results confirm that the nonlinear dynamics of interactions between NPD and ECM play a vital role in determining the final performance of development efforts.
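The core of such a model is an event loop over a time-ordered queue. The toy loop below is only in the spirit of the dissertation's far richer DES model; the activity times, EC probability, and single-rework rule are invented for illustration:

```python
import heapq
import random

def run_des(n_activities, act_time, ec_prob, ec_time, rng):
    """Minimal discrete-event loop: activities finish in sequence; each
    finish may spawn one emergent-EC rework event that extends the clock."""
    clock, events = 0.0, []
    for i in range(n_activities):
        heapq.heappush(events, (clock + (i + 1) * act_time, "finish", i))
    ecs = 0
    while events:
        clock, kind, i = heapq.heappop(events)   # advance to next event
        if kind == "finish" and rng.random() < ec_prob:
            ecs += 1                             # emergent EC detected
            heapq.heappush(events, (clock + ec_time, "ec_done", i))
    return clock, ecs                            # lead time, EC count

rng = random.Random(7)
lead_time, n_ecs = run_des(5, act_time=4.0, ec_prob=0.3, ec_time=2.0, rng=rng)
```

Coordination policies (EC batching, resource allocation, overlapping) would be expressed as different rules for when and at what cost the rework events are scheduled.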

    GHOST: Building blocks for high performance sparse linear algebra on heterogeneous systems

    While many of the architectural details of future exascale-class high performance computer systems are still a matter of intense research, there appears to be a general consensus that they will be strongly heterogeneous, featuring "standard" as well as "accelerated" resources. Today, such resources are available as multicore processors, graphics processing units (GPUs), and other accelerators such as the Intel Xeon Phi. Any software infrastructure that claims usefulness for such environments must be able to meet their inherent challenges: massive multi-level parallelism, topology, asynchronicity, and abstraction. The "General, Hybrid, and Optimized Sparse Toolkit" (GHOST) is a collection of building blocks that targets algorithms dealing with sparse matrix representations on current and future large-scale systems. It implements the "MPI+X" paradigm, has a pure C interface, and provides hybrid-parallel numerical kernels, intelligent resource management, and truly heterogeneous parallelism for multicore CPUs, Nvidia GPUs, and the Intel Xeon Phi. We describe the details of its design with respect to the challenges posed by modern heterogeneous supercomputers and recent algorithmic developments. Implementation details which are indispensable for achieving high efficiency are pointed out and their necessity is justified by performance measurements or predictions based on performance models. The library code and several applications are available as open source. We also provide instructions on how to make use of GHOST in existing software packages, together with a case study which demonstrates the applicability and performance of GHOST as a component within a larger software stack.
    Comment: 32 pages, 11 figures
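The kind of kernel such a toolkit optimizes can be illustrated with a plain sparse matrix-vector product over the Compressed Sparse Row (CSR) layout; this sketch only shows the data layout and is not GHOST's actual C interface:

```python
def csr_spmv(values, col_idx, row_ptr, x):
    """y = A @ x for a matrix stored in CSR form: values holds the
    nonzeros row by row, col_idx their column indices, and row_ptr[r]
    the offset where row r begins in values."""
    y = []
    for r in range(len(row_ptr) - 1):
        acc = 0.0
        for k in range(row_ptr[r], row_ptr[r + 1]):
            acc += values[k] * x[col_idx[k]]
        y.append(acc)
    return y

# 3x3 matrix [[2, 0, 1], [0, 3, 0], [4, 0, 5]] in CSR form
values  = [2.0, 1.0, 3.0, 4.0, 5.0]
col_idx = [0, 2, 1, 0, 2]
row_ptr = [0, 2, 3, 5]
y = csr_spmv(values, col_idx, row_ptr, [1.0, 1.0, 1.0])
# → [3.0, 3.0, 9.0]
```

In an MPI+X setting, rows are partitioned across processes and each inner loop is further vectorized and threaded, which is where data layout and resource management dominate performance.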

    Realising intelligent virtual design

    This paper presents a vision and focus for the CAD Centre research: the Intelligent Design Assistant (IDA). The vision is based upon the assumption that the human and computer can operate symbiotically, with the computer providing support for the human within the design process. Recently, however, the focus has been towards the development of integrated design platforms that provide general support irrespective of the domain, to a number of distributed collaborative designers. This is illustrated within the successfully completed Virtual Reality Ship (VRS) virtual platform, and the challenges are discussed further within the NECTISE, SAFEDOR and VIRTUE projects.


    Using Discrete Event Simulation for Evaluating Engineering Change Management Decisions


    Modeling and analyzing concurrent processes for project performance improvement

    Ph.D., NUS-TU/e Joint Ph.D. Programme

    A Method for Improving Overlapping of Testing and Design

    Testing is a critical activity in product development. The academic literature provides limited insight into overlapping between upstream testing and downstream design tasks, especially in considering the qualitative differences between the activities that are overlapped. In general, the existing literature treats two overlapped sequential activities as similar, and suggests optimal overlapping policies, techniques, and time-cost assessment. However, this case-study-based research identifies that the overlapping of upstream testing with downstream design activities has different characteristics than the overlapping of two design activities. This paper first analyzes the characteristics that affect the overlapping of upstream testing and downstream design activities, and then proposes a method to reduce the time of rework in cases where the upstream testing is overlapped with subsequent redesign phases.