
    Multi-Platform Thinking: Knowledge Integration in the Lean Enterprise

    Student research poster description sheet.

    Knowledge Integration in Large-Scale Organizations and Networks - Conceptual Overview and Operational Definition

    Knowledge integration is an emerging discipline in organizational science. Its central proposition is that the increasing complexity of the products and services being developed and delivered makes the knowledge required for production increasingly specialized, varied (multi-disciplinary), and distributed across an organization’s internal boundaries. Organizations must therefore continuously gather their knowledge resources in order to maintain their ability to innovate and to sustain their competitive position in the market. In addition, the increasing scale and scope of organizational arrangements, such as the multinational partnerships and multi-tiered prime-supplier arrangements commonly encountered in aerospace, automotive, and other complex product development industries, give rise to environments of dispersed knowledge resources, necessitating the integration of this knowledge across external boundaries spanning large-scale organizational networks. Knowledge integration in this context is a process of transferring knowledge from multiple sources in the organizational network to where it is needed and combining it with existing knowledge before it can be applied to accomplish complex tasks and to solve major problems. The primary purpose of this paper is to define the concept of knowledge integration in large-scale organizational networks through an extensive review of the pertinent literature on knowledge in organizations. An operational definition of knowledge integration is also proposed, followed by a systematic identification and classification of the strategies, practices, channels and mechanisms for integrating different types of knowledge across a multitude of organizational boundaries and environments.

    The probabilistic k-center problem

    The k-Center problem on a graph is to find a set K of k vertices minimizing the radius, defined as the maximum distance between any vertex and K. We propose a probabilistic combinatorial optimization model for this problem, with uncertainty on vertices. This model is inspired by a wildfire management problem. The graph represents the adjacency of zones of a landscape, where each vertex represents a zone. We consider a finite set of fire scenarios with related probabilities. Given a k-center, its radius may change in some scenarios since some evacuation paths become impracticable. The objective is to find a robust k-center that minimizes the expected value of the radius over all scenarios. We study this new problem with scenarios limited to a single burning vertex. First results deal with explicit solutions on paths and cycles, and hardness on planar graphs.
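
    To make the objective concrete, the following Python sketch (an illustration of the model as described above, not the authors' code) evaluates the expected radius of a candidate k-center on an unweighted graph under single-burning-vertex fire scenarios, treating the burning vertex as impassable. All names are our own, and a vertex cut off from every center is assigned infinite distance, an assumption of this sketch.

```python
from collections import deque

def dist_to_centers(adj, centers, blocked=None):
    """Multi-source BFS: distance from every vertex to its nearest
    center, with the blocked (burning) vertex impassable."""
    dist = {v: float("inf") for v in adj}
    queue = deque()
    for c in centers:
        if c != blocked:
            dist[c] = 0
            queue.append(c)
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w != blocked and dist[w] == float("inf"):
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist

def expected_radius(adj, centers, scenarios):
    """Expected radius of a candidate k-center over fire scenarios.
    scenarios: iterable of (burning_vertex, probability) pairs; the
    radius in each scenario is the maximum distance from any intact
    vertex to the nearest center (infinite if a vertex is cut off)."""
    total = 0.0
    for burning, p in scenarios:
        dist = dist_to_centers(adj, centers, blocked=burning)
        radius = max(d for v, d in dist.items() if v != burning)
        total += p * radius
    return total

# Cycle 0-1-2-3-4-0 with a single center at vertex 0; each non-center
# vertex burns with equal probability (how a burning center is handled
# is not specified in the abstract, so such scenarios are omitted here).
adj = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
scenarios = [(v, 0.25) for v in (1, 2, 3, 4)]
print(expected_radius(adj, {0}, scenarios))  # 2.5
```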

    Multi-Platform Thinking: Integrating the Lean Enterprise

    Presentation on integrating the lean enterprise.

    Knowledge integration for problem solving in the development of complex aerospace systems

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Engineering Systems Division, 2008. Includes bibliographical references (p. 238-248). The development of complex products requires widespread knowledge interactions among a significant number of individuals and teams designing numerous interrelated components. Increasing product complexity typically leads to a corresponding increase in the types and sources of knowledge that need to be tapped during development, and a common strategy for managing product complexity is to outsource parts and components to external suppliers. As a result, the knowledge required for development is increasingly specialized and distributed across multiple boundaries spanning large-scale organizational networks, thus requiring the subsequent integration of this knowledge in order to accomplish the development task. A framework for knowledge integration in the development of complex systems in a large-scale organizational context is proposed in this thesis using an extensive review of the pertinent literature. The framework consists of the main channels, strategies, practices and mechanisms most commonly used to transfer, share and apply knowledge in the course of complex technical problem solving. The framework is progressively refined using empirical data collected through several rounds of interviews and a questionnaire instrument administered across three major aircraft programs in the defense aerospace industry. We find that knowledge integration in routine problem solving situations is most efficiently and effectively accomplished through extensive transfer and sharing of codified information using formal mechanisms such as information systems, while knowledge integration for major non-routine troubleshooting events requires extensive integration of individual expertise and know-how through both formal and informal advice sharing as well as direct assistance across internal and external organizational boundaries. A principal contribution of this research is in demonstrating how different characteristics of the engineering artifact, defined in terms of product complexity, architecture and technology newness, and different aspects of problem solving, including problem type and novelty, drive the knowledge integration process and the organizational system. We conclude that permeability of cross-program boundaries, direct relationships with functional groups and rich tacit knowledge flow from suppliers are critical for countering rampant firefighting in complex product development. By Marc George Haddad, Ph.D.

    Feasibility and performance of a device for automatic self-detection of symptomatic acute coronary artery occlusion in outpatients with coronary artery disease: a multicentre observational study

    Background: Time delay between onset of symptoms and seeking medical attention is a major determinant of mortality and morbidity in patients with acute coronary artery occlusion. Response time might be reduced by reliable self-detection. We aimed to formally assess the proof of concept and accuracy of self-detection of acute coronary artery occlusion by patients during daily life situations and during the very early stages of acute coronary artery occlusion.
    Methods: In this multicentre, observational study, we tested the operational feasibility, specificity, and sensitivity of our RELF method, a three-lead detection system with an automatic algorithm built into a mobile handheld device, for detection of acute coronary artery occlusion. Patients were recruited continuously by physician referral from three Belgian hospitals until the desired sample size was achieved; eligible patients had been discharged with planned elective percutaneous coronary intervention and were able to use a smartphone. They were asked to perform random ambulatory self-recordings for at least 1 week. A similar self-recording was made before percutaneous coronary intervention and at 60 s of balloon occlusion. Patients were followed up clinically until 1 month after discharge. We quantitatively assessed operational feasibility with an automated dichotomous quality check of the self-recordings. Performance was assessed by analysing the receiver operating characteristic of the ST difference vector magnitude. This trial is registered with ClinicalTrials.gov, number NCT02983396.
    Findings: From Nov 18, 2016, to April 25, 2018, we enrolled 64 patients, of whom 59 (92%) were eligible for self-application. 58 (91%) of 64 patients (95% CI 81.0-95.6) were able to perform ambulatory self-recordings. Of all 5011 self-recordings, 4567 (91%) were automatically classified as successful within 1 min. In 65 balloon occlusions, 63 index tests at 60 s of occlusion in 55 patients were available. The mean specificity of daily life recordings was 0.96 (0.95-0.97), and the mean false positive rate during daily life conditions was 4.19% (95% CI 3.29-5.10). Sensitivity for the target conditions was 0.87 (55 of 63; 95% CI 0.77-0.93) for acute coronary artery occlusion, 0.95 (54 of 57; 0.86-0.98) for acute coronary artery occlusion with electrocardiogram (ECG) changes, and 1.00 (35 of 35) for acute coronary artery occlusion with ECG changes meeting ST-segment elevation myocardial infarction (STEMI) criteria. The index test was more sensitive in detecting a 60 s balloon occlusion than the STEMI criteria on 12-lead ECG (87% vs 56%; p<0.0001). The proportion of total variation in study estimates due to heterogeneity between patients (I²) was low (12.6%). The area under the receiver operating characteristic curve for acute coronary artery occlusion was 0.973 (95% CI 0.956-0.990) across different cutoff values of the magnitude of the ST difference vector. No patients died during the study.
    Interpretation: Self-recording with our RELF device is feasible for most patients with coronary artery disease. The sensitivity and specificity for automatic detection of the earliest phase of acute coronary artery occlusion support the concept of our RELF device for patient empowerment, reducing delay and increasing survival without overloading emergency services.
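
    The accuracy figures above follow from thresholding the ST difference vector magnitude and from the ROC curve over all cutoffs. The following Python sketch shows those computations in minimal form; it is illustrative only, and the function names, toy data, and cutoff value are assumptions, not the study's implementation.

```python
def sensitivity_specificity(occlusion, daily_life, cutoff):
    """Fraction of occlusion recordings at or above the cutoff
    (sensitivity) and of daily-life recordings below it (specificity)."""
    sens = sum(m >= cutoff for m in occlusion) / len(occlusion)
    spec = sum(m < cutoff for m in daily_life) / len(daily_life)
    return sens, spec

def roc_auc(occlusion, daily_life):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a random occlusion recording has a larger ST
    difference vector magnitude than a random daily-life recording
    (ties counted as half)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in occlusion for n in daily_life)
    return wins / (len(occlusion) * len(daily_life))

# Toy magnitudes (arbitrary units); real values come from the device.
occ = [0.42, 0.35, 0.18, 0.51]
day = [0.05, 0.12, 0.09, 0.21, 0.08]
print(sensitivity_specificity(occ, day, cutoff=0.20))  # (0.75, 0.8)
print(roc_auc(occ, day))  # 0.95
```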

    Insights into functional mitral regurgitation using the average pixel intensity method

    Previously, we introduced and validated the average pixel intensity (API) method for grading mitral regurgitation (MR) in a heterogeneous MR population. Here, we investigated the feasibility and added value of the API method specifically in patients with functional MR (FMR). We consecutively enrolled 283 patients with pure FMR. Transthoracic echocardiography was performed and MR was assessed using the API method and guideline-recommended parameters, including color Doppler, vena contracta width (VCW) and proximal isovelocity surface area (PISA)-based methods. The API method was applicable in 98% of this FMR cohort, significantly more than VCW (84%) and PISA-based methods (75%). Overall, the API method correlated significantly with direct parameters of FMR severity, ejection fraction, atrial and ventricular dimensions, pulmonary pressures and New York Heart Association class. Analysis of API dynamics during MR revealed a typical pattern with early and late systolic peaks and a midsystolic nadir, matching the temporal changes of the effective regurgitant orifice (ERO) during FMR. Based on ROC curves of established FMR severity cut-offs, an API value of 125 au was considered the optimal cut-off to determine severe MR. Interestingly, this severity cut-off is similar to that for degenerative MR (DMR), despite the different EROA/RV cut-offs in current ESC guidelines for FMR and DMR. The API method is an easy, fast and feasible parameter for grading FMR and may complement the multiparametric assessment of FMR in daily clinical practice.
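
    The grading rule reduces to averaging the pixel intensities of the regurgitant jet on the color-Doppler image and comparing the result with the 125 au cutoff reported above. The following is a minimal sketch under simplifying assumptions: a single pre-extracted grayscale frame and a given region-of-interest mask, with frame selection, temporal averaging, and calibration omitted; the names are our own, and this is not the validated clinical implementation.

```python
import numpy as np

def average_pixel_intensity(frame: np.ndarray, roi_mask: np.ndarray) -> float:
    """Mean intensity (in arbitrary units, au) of the pixels inside the
    region of interest of a grayscale color-Doppler frame."""
    return float(frame[roi_mask].mean())

def grade_fmr(api_value: float, cutoff_au: float = 125.0) -> str:
    """Dichotomize MR severity at the 125 au cutoff reported above."""
    return "severe" if api_value >= cutoff_au else "non-severe"

# Toy 4x4 frame with a bright 2x2 region of interest.
frame = np.array([[10, 10, 10, 10],
                  [10, 200, 180, 10],
                  [10, 190, 170, 10],
                  [10, 10, 10, 10]], dtype=float)
roi = np.zeros((4, 4), dtype=bool)
roi[1:3, 1:3] = True
print(grade_fmr(average_pixel_intensity(frame, roi)))  # severe (API = 185.0)
```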