    A metric to represent the evolution of CAD/analysis models in collaborative design

    Computer Aided Design (CAD) and Computer Aided Engineering (CAE) models are often used during product design. The various interactions between the different models must be managed for the designed system to be robust and in accordance with the initially defined specifications. Research published to date has, for example, considered the link between digital mock-up and analysis models. However, design/analysis integration must take into account the large number of models (digital mock-up and simulation) that accumulate as models evolve over time, as well as system-engineering considerations. To effectively manage modifications made to the system, the dependencies between the different models must be known, and the nature of each modification must be characterised to estimate its impact throughout the dependent models. We propose a technique to describe the nature of a modification, which may be used to determine its consequences within other models, together with a way to qualify the modified information. To achieve this, a metric is proposed that allows the qualification and evaluation of data or information, based on the maturity and validity of the information and models.
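    The metric itself is not reproduced in this abstract. As a rough illustration of the idea only, the Python sketch below combines a maturity level with a validity score and walks a dependency graph to find the models impacted by a modification; all names, levels, and weights here are invented assumptions, not the authors' definitions.

```python
from dataclasses import dataclass

# Hypothetical illustration: qualify a piece of model data by combining
# a maturity level (how far the source model has progressed in its
# lifecycle) with a validity score (how thoroughly the data has been
# checked against specifications). Levels and weights are assumptions.
MATURITY_LEVELS = {"draft": 0.25, "in_review": 0.5, "released": 1.0}

@dataclass
class ModelDatum:
    name: str
    maturity: str    # one of MATURITY_LEVELS
    validity: float  # 0.0 (unchecked) .. 1.0 (fully validated)

def qualification_score(datum: ModelDatum, w_maturity: float = 0.5) -> float:
    """Weighted combination of maturity and validity, in [0, 1]."""
    m = MATURITY_LEVELS[datum.maturity]
    return w_maturity * m + (1.0 - w_maturity) * datum.validity

def impacted(dependencies: dict[str, list[str]], modified: str) -> set[str]:
    """Transitively collect the models that depend on the modified one."""
    out, stack = set(), [modified]
    while stack:
        node = stack.pop()
        for dependant, sources in dependencies.items():
            if node in sources and dependant not in out:
                out.add(dependant)
                stack.append(dependant)
    return out

# Example: a CAD change propagates to a mesh and then to an FE analysis.
deps = {"mesh": ["cad"], "fe_analysis": ["mesh"]}
print(impacted(deps, "cad"))                                     # {'mesh', 'fe_analysis'}
print(qualification_score(ModelDatum("cad", "in_review", 0.8)))  # 0.65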

    Identifying and evaluating parallel design activities using the design structure matrix

    Concurrent Engineering (CE) places emphasis on the management of the product development process, and one of its major benefits is the reduction in lead-time and product cost [1]. One approach that CE promotes for the reduction of lead-time is the simultaneous enactment of activities, otherwise known as Simultaneous Engineering. This paper describes an approach based upon the Design Structure Matrix (DSM) for identifying, evaluating and optimising one aspect of CE: activity parallelism. Whilst activity parallelism may contribute to the reduction in lead-time and product cost, iteration is also recognised as a contributing factor to lead-time, and hence it was considered alongside parallelism in the investigation. The paper describes how parallel activities may be identified within the DSM, before detailing how a process may be evaluated with respect to parallelism and iteration using the DSM. An optimisation algorithm is then utilised to establish a near-optimal sequence for the activities with respect to parallelism and iteration. DSM-based processes from previously published research are used to describe the development of the approach.
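    As a minimal sketch of the DSM mechanics (the matrix encoding, the toy process, and the random-restart search below are illustrative assumptions, not the paper's algorithm): two activities with no dependency in either direction may run in parallel, and any dependency pointing "forward" in a chosen sequence forces iteration, so sequencing aims to minimise such marks.

```python
import itertools
import random

# Assumed encoding: dsm[i][j] == 1 means activity i needs input from j.
dsm = [
    [0, 0, 0, 1],   # A depends on D
    [0, 0, 0, 0],   # B depends on nothing
    [1, 1, 0, 0],   # C depends on A and B
    [0, 1, 0, 0],   # D depends on B
]
names = ["A", "B", "C", "D"]

def parallel_pairs(m):
    """Activities with no dependency either way may be enacted in parallel."""
    n = len(m)
    return [(i, j) for i, j in itertools.combinations(range(n), 2)
            if m[i][j] == 0 and m[j][i] == 0]

def feedback_marks(m, seq):
    """Count dependencies on activities scheduled later (forcing iteration)."""
    pos = {a: k for k, a in enumerate(seq)}
    return sum(1 for i in range(len(m)) for j in range(len(m))
               if m[i][j] and pos[j] > pos[i])

def near_optimal_sequence(m, restarts=200):
    """Toy random-restart search for a low-iteration activity sequence."""
    return min((random.sample(range(len(m)), len(m)) for _ in range(restarts)),
               key=lambda s: feedback_marks(m, s))

print([(names[i], names[j]) for i, j in parallel_pairs(dsm)])  # [('A','B'), ('C','D')]
seq = near_optimal_sequence(dsm)
print([names[a] for a in seq], feedback_marks(dsm, seq))       # e.g. ['B','D','A','C'] 0
```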

    Space, Time and Color in Hadron Production Via e+e- -> Z0 and e+e- -> W+W-

    The time-evolution of jets in hadronic e+e- events at LEP is investigated in both position- and momentum-space, with emphasis on effects due to color flow and particle correlations. We address dynamical aspects of the four simultaneously evolving, cross-talking parton cascades that appear in the reaction e+e- -> gamma/Z0 -> W+W- -> q1 q~2 q3 q~4, and compare with the familiar two-parton cascades in e+e- -> Z0 -> q1 q~2. We use a QCD statistical transport approach, in which the multiparticle final state is treated as an evolving mixture of partons and hadrons, whose proportions are controlled by their local space-time geography via standard perturbative QCD parton-shower evolution and a phenomenological model for non-perturbative parton-cluster formation followed by cluster decays into hadrons. Our numerical simulations exhibit a characteristic 'inside-outside' evolution simultaneously in position and momentum space. We compare three different model treatments of color flow, and find large effects due to cluster formation by the combination of partons from different W parents. In particular, we find in our preferred model a shift of several hundred MeV in the apparent mass of the W, which is considerably larger than in previous model calculations. This suggests that the determination of the W mass at LEP2 may turn out to be a sensitive probe of spatial correlations and hadronization dynamics.
    Comment: 52 pages, LaTeX, 18 figures as uu-encoded PostScript file
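    For orientation, the 'inside-outside' pattern referred to above follows from time dilation; the estimate below is the standard textbook relation, not a formula quoted from this paper: a fragment that needs proper time tau_0 to form materializes at a lab-frame time stretched by its Lorentz factor, so faster hadrons form farther from the production point.

```latex
% Standard inside-outside cascade estimate (textbook relation, stated
% for orientation only; natural units):
\[
  t_{\mathrm{form}} \simeq \gamma\,\tau_0 = \frac{E}{m}\,\tau_0,
  \qquad
  z_{\mathrm{form}} \simeq \beta\, t_{\mathrm{form}} \approx \frac{p}{m}\,\tau_0 .
\]
```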

    MODES OF INNOVATION & UNCERTAINTIES IN THE CAPITAL GOODS INDUSTRY

    Product innovation is a subtle process, frequently leading to shifts in the competitiveness of firms. Developing products in an environment undergoing technological change is prone to frequent failure, even in well-established and sophisticated organizations. In order to remain competitive and to deal with innovation uncertainty, firms develop diverse innovation processes. Two modes of innovation are suggested in the recent literature: 1) the Science, Technology and Innovation (STI) mode, which is based on the production and use of codified scientific and technical knowledge; and 2) the Doing, Using and Interacting (DUI) mode, which relies on informal processes of learning and experience-based know-how. In this paper we analyse product innovation at the firm level. We perform an exploratory analysis of four leading equipment and machinery producers from the Aveiro region of Portugal. In doing so, we explore the main features of the capital goods industry that have implications for innovation, and analyse the dominant uncertainties associated with the innovation process and the modes of innovation. Key findings include the complete absence of the DUI mode in the cases studied, and even a low learning characteristic in one company. The paper concludes by considering the implications for firms' competitiveness and for innovation policy.
    Keywords: modes of innovation, uncertainties, R&D, capital goods, SME

    Dynamics of disentanglement, density matrix and coherence in neutrino oscillations

    In charged-current weak interaction processes, neutrinos are produced in a state entangled with the charged lepton. This correlated state is disentangled by the measurement of the charged lepton in a detector at the production site. We study the dynamical aspects of disentanglement, propagation and detection, in particular the conditions under which the disentangled state is a coherent superposition of mass eigenstates. The appearance and disappearance far-detection processes are described via the time evolution of this disentangled "collapsed" state. The familiar quantum mechanical interpretation and factorization of the detection rate emerge when the quantum state is disentangled on time scales much shorter than the inverse oscillation frequency, in which case the final detection rate factorizes in terms of the usual quantum mechanical transition probability, provided the final density of states is insensitive to the neutrino energy difference. We suggest possible corrections for short-baseline experiments. If the charged lepton is unobserved, neutrino oscillations and coherence are described in terms of a reduced density matrix obtained by tracing out the unobserved charged lepton. The diagonal elements in the mass basis describe the production of mass eigenstates, whereas the off-diagonal ones provide a measure of coherence. It is shown that the coherences are of the same order as the diagonal terms on time scales up to the inverse oscillation frequency, beyond which the coherences oscillate as a result of the interference between mass eigenstates.
    Comment: 19 pages; v2: discussions added
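    For reference, the "usual quantum mechanical transition probability" recovered in the factorized limit is, in the standard two-flavor case, the textbook result below (quoted for orientation, not taken from this paper):

```latex
% Two-flavor vacuum oscillation probability (textbook result; natural
% units, mixing angle theta, baseline L, neutrino energy E):
\[
  P_{\nu_\alpha \to \nu_\beta}(L)
    = \sin^2 2\theta \,\sin^2\!\left(\frac{\Delta m^2 L}{4E}\right),
  \qquad \alpha \neq \beta ,
\]
% with oscillation frequency of order \Delta m^2 / (2E); its inverse is
% the time scale against which disentanglement is compared above.
```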

    "Open Innovation" and "Triple Helix" Models of Innovation: Can Synergy in Innovation Systems Be Measured?

    The model of "Open Innovation" (OI) can be compared with the "Triple Helix of University-Industry-Government Relations" (TH) as attempts to find surplus value in bringing industrial innovation closer to public R&D. Whereas the firm is central in the model of OI, the TH adds multi-centeredness: in addition to firms, universities and (e.g., regional) governments can take leading roles in innovation eco-systems. In addition to the (transversal) technology transfer at each moment of time, one can focus on the dynamics in the feedback loops. Under specifiable conditions, feedback loops can be turned into feedforward ones that drive innovation eco-systems towards self-organization and the auto-catalytic generation of new options. The generation of options can be more important than historical realizations ("best practices") for the longer-term viability of knowledge-based innovation systems; a system without sufficient options, for example, is locked-in. The generation of redundancy -- the Triple Helix indicator -- can be used as a measure of unrealized but technologically feasible options given a historical configuration. Different coordination mechanisms (markets, policies, knowledge) provide different perspectives on the same information and thus generate redundancy. Increased redundancy not only stimulates innovation in an eco-system by reducing the prevailing uncertainty; it also enhances the synergy in and innovativeness of an innovation system.
    Comment: Journal of Open Innovation: Technology, Market, and Complexity, 2(1) (2016) 1-12; doi:10.1186/s40852-016-0039-
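    A common operationalization of this indicator is the signed mutual information among the three dimensions (interaction information), where negative values signal redundancy/synergy. The Python sketch below computes it from toy co-occurrence data; the data and category names are invented, and this is one standard form of the measure rather than a reproduction of the paper's calculations.

```python
from collections import Counter
from math import log2

# Toy records: each event is tagged with a university, industry, and
# government category (invented data for illustration).
events = [("u1", "i1", "g1"), ("u1", "i1", "g1"), ("u2", "i2", "g1"),
          ("u2", "i1", "g2"), ("u1", "i2", "g2"), ("u2", "i2", "g2")]

def H(counts: Counter) -> float:
    """Shannon entropy (bits) of a Counter of outcome frequencies."""
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())

# Marginal, bilateral, and trilateral entropies.
u   = H(Counter(e[0] for e in events))
i   = H(Counter(e[1] for e in events))
g   = H(Counter(e[2] for e in events))
ui  = H(Counter((e[0], e[1]) for e in events))
ug  = H(Counter((e[0], e[2]) for e in events))
ig  = H(Counter((e[1], e[2]) for e in events))
uig = H(Counter(events))

# Three-dimensional (signed) mutual information; T < 0 indicates
# redundancy/synergy in the configuration.
t_uig = u + i + g - ui - ug - ig + uig
print(f"T_uig = {t_uig:.3f} bits")
```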

    New perspectives on realism, tractability, and complexity in economics

    Fuzzy logic and genetic algorithms are used to rework more realistic (and more complex) models of competitive markets. The resulting equilibria are significantly different from the ones predicted by the usual static analysis; the methodology solves the Walrasian problem of how markets can reach equilibrium when firms start out trading at disparate prices. The modified equilibria found in these complex market models involve some mutual self-restraint on the part of the agents involved, relative to economically rational behaviour. Research (using similar techniques) into the evolution of collaborative behaviours in economics, and of altruism generally, is summarized, and the joint significance of these two bodies of work for public policy is reviewed. The possible extension of the fuzzy/genetic methodology to other technical aspects of economics (including international trade theory and development) is also discussed, as are the limitations on the usefulness of any type of theory in political domains. For the latter purpose, a more differentiated concept of rationality, appropriate to ill-structured choices, is developed. The philosophical case for laissez-faire policies is considered briefly, and the prospects for change in the way we 'do economics' are analysed.
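    To make the genetic-algorithm side of this concrete, the toy sketch below evolves a population of firm prices that start out disparate and converge towards a narrow band; the demand curve, constants, and fitness are invented for illustration, and the fuzzy-logic component is omitted entirely, so this is a sketch of the general technique, not the thesis's model.

```python
import random

# Assumed toy market: linear demand, penalised when a firm prices above
# the population mean; fitness is profit. All constants are invented.
COST, DEMAND_A, DEMAND_B = 2.0, 10.0, 1.0

def profit(price: float, mean_price: float) -> float:
    demand = max(DEMAND_A - DEMAND_B * price - 0.5 * (price - mean_price), 0.0)
    return (price - COST) * demand

def evolve(pop: list[float], generations: int = 200, mut: float = 0.1) -> list[float]:
    """Truncation selection plus Gaussian mutation over posted prices."""
    for _ in range(generations):
        mean_price = sum(pop) / len(pop)
        scored = sorted(pop, key=lambda p: profit(p, mean_price), reverse=True)
        parents = scored[: len(pop) // 2]
        pop = [max(p + random.gauss(0.0, mut), 0.0)   # two mutated offspring
               for p in parents for _ in (0, 1)]      # per surviving parent
    return pop

prices = [random.uniform(0.0, 10.0) for _ in range(20)]  # disparate start
final = evolve(prices)
print(min(final), max(final))  # prices cluster near a common level
```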