Simulating intertwined design processes that have similar structures: A case study of a small company that creates made-to-order fashion products
The authors use simulation to analyse the resource-driven dependencies between concurrent processes used to create customised products in a company. Such processes are uncertain and unique according to the design changes required; however, they have similar structures. For simulation, a level of abstraction is chosen such that all possible processes are represented by the same activity network. Differences between processes are determined by the customisations that they implement. The approach is illustrated through application to a small business that creates customised fashion products. We suggest that similar techniques could be applied to study intertwined design processes in more complex domains. The case study was carried out as part of Considerate Design for Personalised Fashion, funded by the EPSRC/AHRC Design in the 21st Century programme. The context of a multi-project environment was analysed as part of the EU Framework 7 CONVERGE project CP-FP 228746-2.
Testing in the incremental design and development of complex products
Testing is an important aspect of design and development which consumes significant time and resources in many companies. However, it has received less research attention than many other activities in product development; in particular, very few publications report empirical studies of engineering testing. Such studies are needed to establish the importance of testing and to inform the development of pragmatic support methods. This paper combines insights from a literature study with findings from three empirical studies of testing. The case studies concern incrementally developed complex products in the automotive domain. A description of testing practice as observed in these studies is provided, confirming that testing activities are used for multiple purposes depending on the context, and are intertwined with design from start to finish of the development process, not performed after it as many models depict. Descriptive process models are developed to capture some of the key insights, and opportunities for further research are suggested.
Overlapping Boundaries of the Project Time Management and Project Risk Management
Based on a utility function, milestones during a project and/or the end of projects or programmes may be categorised as soft-deadline or hard-deadline. In contrast with soft-end projects, hard-end projects possess a utility function that decreases with a vertical-asymptote character around the deadline for project completion. In extreme situations, the utility function itself may fall below zero (projects may generate losses for both constructor and customer). Existing risk analysis methodologies observe risks in monetary terms; the typical risks are correlated with an increase in final project costs. To estimate hard-deadline milestones and/or the end of projects or programmes, it is critical to employ the time dimension rather than the typical cost-based risk analysis. Here, we comprehensively describe a structured methodology that focuses on minimising and mitigating project-specific delay risks. The method may supplement existing cost-based risk analysis in projects. We aim to combine moderation techniques to reveal the intrinsic risk of projects. In addition to the technical risks, the moderation techniques are able to bring evidence of risks such as team efficacy, diverse non-correlations, or misunderstandings about the roles of team members in the team, which constitute most of the project's soft risks. The described methodology encourages a common understanding of risks among participants, crystallising the essence of what can go wrong in complex situations and where opportunities can be unlocked.
Keywords: Project Management, Risk Management, Time Management, Deadline, Delays
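The soft- versus hard-deadline distinction above can be illustrated with simple utility curves. The following sketch is not from the paper; the functional forms and parameter values are assumptions chosen only to match the qualitative description: a soft deadline degrades utility gradually after the deadline, while a hard deadline shows an asymptote-like collapse near the deadline and negative utility (losses) once it is passed.

```python
def soft_deadline_utility(t, deadline, base=100.0, penalty_rate=5.0):
    """Soft deadline: utility declines gradually (here, linearly) after the deadline."""
    if t <= deadline:
        return base
    return base - penalty_rate * (t - deadline)

def hard_deadline_utility(t, deadline, base=100.0, scale=10.0):
    """Hard deadline: utility collapses steeply as t approaches the deadline
    from below, and is strongly negative afterwards (contractual losses)."""
    if t < deadline:
        # asymptote-like drop as t -> deadline
        return base - scale / (deadline - t)
    return -base  # past a hard deadline the project generates losses

if __name__ == "__main__":
    D = 12.0  # hypothetical deadline, in months
    for t in (6.0, 11.0, 11.9, 13.0):
        print(t, soft_deadline_utility(t, D), hard_deadline_utility(t, D))
```

The vertical-asymptote behaviour of `hard_deadline_utility` captures why, for hard-end projects, delay risk dominates the usual cost-based risk picture.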
Elicitation of structured engineering judgement to inform a focussed FMEA
The practical use of Failure Mode and Effects Analysis (FMEA) has been criticised because it is often implemented too late and in a manner that does not allow information to be fed back to inform the product design. Lessons learnt from the use of elicitation methods to gather structured expert judgement about engineering concerns for a new product design have led to an enhancement of the approach for implementing design and process FMEA. We refer to this variant as a focussed FMEA, since the goal is to enable relevant engineers to contribute to the analysis and to act upon the outcomes in such a way that all activities focus upon the design needs. The paper begins with a review of the proposed process to identify and quantify engineering concerns. The pros and cons of using elicitation methods, originally designed to support construction of a Bayesian prior, to inform a focussed FMEA are analysed, and a comparison of the proposed process in relation to the existing standards is made. An industrial example is presented to illustrate customisation of the process and to discuss the impact on the design process.
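For readers unfamiliar with how FMEA quantifies concerns, the standard scoring multiplies severity, occurrence, and detection ratings (each on a 1–10 scale) into a risk priority number (RPN) used to rank failure modes. The sketch below shows this conventional calculation only; the concern names and scores are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Concern:
    """One elicited engineering concern, scored on the usual FMEA scales."""
    name: str
    severity: int    # 1-10: impact of the failure effect
    occurrence: int  # 1-10: likelihood of the failure cause
    detection: int   # 1-10: 10 = hardest to detect before reaching the customer

    @property
    def rpn(self) -> int:
        """Risk priority number: severity x occurrence x detection."""
        return self.severity * self.occurrence * self.detection

# Hypothetical concerns elicited from engineers for a new design
concerns = [
    Concern("seal leakage", severity=7, occurrence=4, detection=6),
    Concern("connector fatigue", severity=9, occurrence=2, detection=3),
    Concern("sensor drift", severity=5, occurrence=6, detection=8),
]

# Rank so that design effort focuses on the highest-RPN concerns first
for c in sorted(concerns, key=lambda c: c.rpn, reverse=True):
    print(f"{c.name}: RPN = {c.rpn}")
```

In a focussed FMEA the point is that such scores come from structured expert elicitation early enough for the ranking to feed back into the design.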
Detecting the community structure and activity patterns of temporal networks: a non-negative tensor factorization approach
The increasing availability of temporal network data is calling for more research on extracting and characterizing mesoscopic structures in temporal networks and on relating such structures to specific functions or properties of the system. An outstanding challenge is the extension of the results achieved for static networks to time-varying networks, where the topological structure of the system and the temporal activity patterns of its components are intertwined. Here we investigate the use of a latent factor decomposition technique, non-negative tensor factorization, to extract the community-activity structure of temporal networks. The method is intrinsically temporal and allows communities to be identified and their activity to be tracked simultaneously over time. We represent the time-varying adjacency matrix of a temporal network as a three-way tensor and approximate this tensor as a sum of terms that can be interpreted as communities of nodes with an associated activity time series. We summarize known computational techniques for tensor decomposition and discuss some quality metrics that can be used to tune the complexity of the factorized representation. We subsequently apply tensor factorization to a temporal network for which a ground truth is available for both the community structure and the temporal activity patterns. The data we use describe the social interactions of students in a school, the associations between students and school classes, and the spatio-temporal trajectories of students over time. We show that non-negative tensor factorization is capable of recovering the class structure with high accuracy. In particular, the extracted tensor components can be validated either as known school classes or in terms of correlated activity patterns, i.e., of spatial and temporal coincidences that are determined by the known school activity schedule.
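The decomposition described above can be sketched as follows: a non-negative CP (PARAFAC) factorization of the node × node × time tensor via standard multiplicative updates, applied to a toy temporal network with two communities active in different time windows. This is a minimal sketch, not the paper's pipeline; the toy data, rank, and update scheme are assumptions for illustration.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product: (I*J) x R from I x R and J x R."""
    R = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, R)

def nntf(X, rank, n_iter=300, eps=1e-9, seed=0):
    """Non-negative CP decomposition X ~ sum_r a_r (outer) b_r (outer) c_r,
    fitted by multiplicative updates that preserve non-negativity."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.random((I, rank))
    B = rng.random((J, rank))
    C = rng.random((K, rank))
    # Mode unfoldings matching the Khatri-Rao column ordering used below
    X0 = X.transpose(0, 2, 1).reshape(I, K * J)
    X1 = X.transpose(1, 2, 0).reshape(J, K * I)
    X2 = X.transpose(2, 1, 0).reshape(K, J * I)
    for _ in range(n_iter):
        A *= (X0 @ khatri_rao(C, B)) / (A @ ((C.T @ C) * (B.T @ B)) + eps)
        B *= (X1 @ khatri_rao(C, A)) / (B @ ((C.T @ C) * (A.T @ A)) + eps)
        C *= (X2 @ khatri_rao(B, A)) / (C @ ((B.T @ B) * (A.T @ A)) + eps)
    return A, B, C

def reconstruct(A, B, C):
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Toy temporal network: 6 nodes, two 3-node communities, 10 time steps.
# Community 0 interacts in the first half, community 1 in the second.
X = np.zeros((6, 6, 10))
X[:3, :3, :5] = 1.0
X[3:, 3:, 5:] = 1.0

A, B, C = nntf(X, rank=2)
err = np.linalg.norm(X - reconstruct(A, B, C)) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.3f}")
```

The rows of `A` (and `B`) give non-negative node-community memberships, while the columns of `C` are the per-community activity time series that the abstract refers to.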
GTTC Future of Ground Testing Meta-Analysis of 20 Documents
National research, development, test, and evaluation ground testing capabilities in the United States are at risk. There is a lack of vision and consensus on what is and will be needed, contributing to a significant threat that ground test capabilities may not be able to meet the national security and industrial needs of the future. To support future decisions, the AIAA Ground Testing Technical Committee's (GTTC) Future of Ground Test (FoGT) Working Group selected and reviewed 20 seminal documents related to the application and direction of ground testing. Each document was reviewed, with the main points of its content collected and organized into sections in the form of a gap analysis: current state, future state, major challenges/gaps, and recommendations. This paper includes key findings and selected commentary by an editing team.
Liquid rocket booster integration study. Volume 1: Executive summary
The impacts of introducing liquid rocket booster engines (LRB) into the Space Transportation System (STS)/Kennedy Space Center (KSC) launch environment are identified and evaluated. Proposed ground systems configurations are presented along with a launch site requirements summary. Prelaunch processing scenarios are described, and the required facility modifications and new facility requirements are analyzed. Flight vehicle design recommendations to enhance launch processing are discussed. Processing approaches to integrate LRB with existing STS launch operations are evaluated. The key features and significance of launch site transition to a new STS configuration in parallel with ongoing launch activities are enumerated. This volume is the executive summary of the five-volume series.
Concurrent Engineering Implementation in Design-Build Railway Projects
Design-build as a procurement method is increasingly being used in the design and construction of greenfield rail networks, despite the complexities that characterise them: rail infrastructure projects involve significantly more complex systems, such as safety, telecommunications, signalling and electrification. One of the key drivers for this choice of procurement method for the delivery of rail networks is that the design-build contractor commits to an aggressive schedule and implements strategies to enable the works to be completed to time and cost. One such strategy is the application of concurrent engineering principles to the design and construction works. This chapter gives an overview of concurrent engineering as applicable to design-build rail projects, focusing mainly on design as an activity. It identifies factors that impact the application of concurrent engineering, as well as mitigations that can be applied for the successful application of concurrent engineering principles in design-build rail projects.
Performance of Coupled Product Development Activities with a Deadline
This paper explores the performance of coupled development tasks subject to a deadline constraint by proposing a performance generation model (PGM). The goal of the PGM is to develop insights about optimal strategies (i.e. sequential, concurrent, or overlapped) to manage coupled design tasks that share a fixed amount of engineering resources subject to performance and deadline constraints. Model analysis characterizes the solution space for the coupled development problem. The solution space is used to explore the generation of product performance and the associated dynamic forces affecting concurrent development practices. We use these forces to explain conditions under which concurrency is a desirable strategy.
Back to Keynes?
After a brief review of classical, Keynesian, New Classical and New Keynesian theories of macroeconomic policy, we assess whether New Keynesian Economics captures the quintessential features stressed by J.M. Keynes. Particular attention is paid to Keynesian features omitted in New Keynesian workhorses such as the micro-founded Keynesian multiplier and the New Keynesian Phillips curve. These theories capture wage and price sluggishness and aggregate demand externalities by departing from a competitive framework, and give a key role to expectations. The main deficiencies, however, are the inability to predict a pro-cyclical real wage in the face of demand shocks; the absence of inventories, credit constraints and bankruptcies in explaining the business cycle; and the lack of any effect of the nominal as well as the real interest rate on aggregate demand. Furthermore, these theories fail to allow for quantity rationing and to model unemployment as a catastrophic event. The macroeconomics based on the New Keynesian Phillips curve has quite a way to go before the quintessential Keynesian features are captured.
Keywords: Keynesian economics, New Keynesian Phillips curve, monopolistic competition, nominal wage rigidity, welfare, pro-cyclical real wage, inventories, liquidity, bankruptcy, unemployment, monetary policy
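For reference, the New Keynesian Phillips curve discussed above is standardly written in its basic forward-looking form as follows; the notation is the textbook convention, not taken from this paper:

```latex
% Basic forward-looking New Keynesian Phillips curve
% \pi_t : inflation at time t
% E_t   : rational-expectations operator conditional on time-t information
% x_t   : output gap (or real marginal cost)
% \beta : household discount factor, \kappa : slope, decreasing in price stickiness
\pi_t = \beta \, E_t[\pi_{t+1}] + \kappa \, x_t
```

Because inflation here depends only on expected future inflation and the current output gap, the curve has no role for inventories, credit constraints, or quantity rationing, which is the gap relative to Keynes that the paper emphasises.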