5 research outputs found

    Framework for collaborative knowledge management in organizations

    Organizations today are under pressure to accelerate the transformation of industry towards high-value products and services. The ability to respond agilely to new market demands has become a strategic pillar for innovation, and knowledge management can help organizations achieve that goal. However, current knowledge management approaches tend to be overly complex or too academic, with interfaces that are difficult to manage, all the more so when cooperative handling is required. Ideally, a framework should address both tacit and explicit knowledge so that knowledge can be handled with precise and semantically meaningful definitions. Moreover, as Internet usage grows, the amount of available information explodes, which has driven progress in mechanisms for retrieving useful knowledge from the enormous number of information sources. Yet the same knowledge representation of a thing can mean different things to different people and applications. Contributing in this direction, this thesis proposes a framework capable of gathering the knowledge held by domain experts and domain sources through a knowledge management system and transforming it into explicit ontologies. This makes it possible to build tools with advanced reasoning capabilities aimed at supporting enterprises' decision-making processes. The author also addresses the problem of knowledge transfer within and among organizations through a module of the proposed framework for establishing a domain lexicon, whose purpose is to represent and unify the understanding of the semantics used in the domain.
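    To make the idea of turning a domain lexicon into an explicit ontology more concrete, the following is a minimal sketch in Python using the rdflib library; the namespace, the Machine concept, and its labels are hypothetical illustrations, not part of the framework described in the thesis.

```python
# Minimal sketch: capturing one domain-lexicon term as an explicit
# ontology concept with rdflib (names below are hypothetical examples).
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS, SKOS

EX = Namespace("http://example.org/manufacturing#")  # hypothetical domain namespace

g = Graph()
g.bind("ex", EX)

# An explicit class for a term the domain experts agreed on ...
g.add((EX.Machine, RDF.type, OWL.Class))
g.add((EX.Machine, RDFS.label, Literal("machine", lang="en")))
# ... plus alternative labels that unify different wordings used in the domain.
g.add((EX.Machine, SKOS.altLabel, Literal("production equipment", lang="en")))
g.add((EX.Machine, SKOS.altLabel, Literal("asset", lang="en")))

print(g.serialize(format="turtle"))
```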

    Role of stage gates in effective knowledge sharing during the product development process

    Thesis (S.M.)--Massachusetts Institute of Technology, System Design & Management Program, February 2002. Includes bibliographical references (leaves 135-139). The premise of this thesis is that in today's knowledge economy, competitive advantage comes from the effective use of corporate knowledge. The thesis compares and contrasts current practices for knowledge sharing at Xerox with an idealized model of best practices for knowledge sharing. The study explores the hypothesis that stage gates in a product development process are important for sharing corporate knowledge across functions and organizations, and that the product development process itself serves as an infrastructure for knowledge sharing. The study analyzed knowledge-sharing practices during stage-gate reviews and how they evolved after those reviews. To develop an idealized model of best practices for knowledge sharing, experts in knowledge management from academia and industry were interviewed and an extensive literature review was completed. This served as a backdrop for the analysis in the Xerox case study. The case study used personal interviews complemented by an e-mail survey, and the findings were assessed against the idealized model of best practices for knowledge sharing. Twenty-six senior managers at Xerox were interviewed or surveyed, and Xerox's strengths in knowledge sharing and areas for improvement were identified. Using open-ended questions, a holistic view of the scope of Xerox's efforts, as well as the depth and quality of its best practices during the product development process, was compiled. Using Carlile's knowledge boundary framework and boundary objects, an attempt was made to transform engineering knowledge from one domain to another. This framework also served as the basis for suggestions for future improvements in knowledge sharing at Xerox in the areas identified through the interviews and surveys. Although no single company has discovered the mantra for knowledge management and sharing, several good practices that were consistent enablers of perceived success were identified. The effective enablers of knowledge sharing were a synergistic gathering of "common sense" items such as morale, trust, common goals, the value and criticality of knowledge, diversity, structure, rewards and recognition, support, and knowledge initiatives along multiple fronts. Product managers perceive that Xerox has had considerable success in promoting a knowledge culture and has an effective product development process. The knowledge boundary framework and boundary objects were also found to be a good vehicle for explaining the difficulty of knowledge sharing across functional and organizational boundaries. Engineering tools such as critical parameter management could benefit from a uniform, standardized approach to bringing together subject-matter experts from various domains and creating an environment for creating new knowledge and innovations. Systems processes such as the Xerox platform approach, in which the systems architecture is composed of common platform elements, and core competencies in the development of reusable components for those platform elements, form the basis of the Xerox product development process.
    Using knowledge acquired through practical experience and education, and taking a holistic view of the product development process as the boundary framework for knowledge transfer, we used the eCPM (Engineering Critical Parameter Management) tool to translate knowledge from a domain expert in mechanical engineering into a common semantic base for transformation into the domain of software engineering. Specific tacit knowledge about what makes a parameter critical, how it plays a role in the mechanical aspects of the design of Xerox devices (such as the system itself, the media and motion path, the marker path, and the control and image path), and how to control these designs is to be transformed into the software engineering domain. It was found that the eCPM tool can be used to develop analogous meanings of parameters for tuning software resources such as CPU speed, memory utilization, and performance. The attempt to create new knowledge in the software domain will proceed with a larger number of domain experts. Future work will establish which software parameters should be labeled as critical (versus design parameters allocated and controlled via input/output/constraint values), which parameters should be system control parameters (those that span multiple subsystems and have latitudes within which they can be tuned in the various subsystems), and the failure modes and the latitudes for those failure modes. This will form part of the knowledge sharing and management framework proposed in the thesis, building on the diagnostic analysis of the current state at Xerox. By Tulsi D. Ramchandani. S.M.
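    As an illustration of what a shared, cross-domain parameter record of this kind could look like, the sketch below uses a plain Python data structure; the field names and example parameters are hypothetical and do not reflect the actual eCPM tool or Xerox data.

```python
# Hypothetical sketch of a common schema for critical parameters that spans
# the mechanical and software domains (not the actual eCPM data model).
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class CriticalParameter:
    """One entry in a shared parameter vocabulary spanning domains."""
    name: str                      # e.g. "media path velocity" or "CPU utilisation"
    domain: str                    # "mechanical" or "software"
    classification: str            # "critical", "design", or "system control"
    nominal: float                 # target value expressed in `unit`
    unit: str
    latitude: Tuple[float, float]  # tolerated (min, max) range
    failure_modes: List[str] = field(default_factory=list)


# A mechanical parameter and a software-domain analogue expressed against
# the same schema, so both can be discussed in one semantic base.
media_speed = CriticalParameter(
    name="media path velocity", domain="mechanical", classification="critical",
    nominal=0.5, unit="m/s", latitude=(0.48, 0.52),
    failure_modes=["paper jam", "image mis-registration"],
)
cpu_load = CriticalParameter(
    name="CPU utilisation", domain="software", classification="system control",
    nominal=0.6, unit="fraction", latitude=(0.0, 0.85),
    failure_modes=["missed real-time deadline"],
)

print(media_speed)
print(cpu_load)
```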

    Investigating a deep learning approach to real-time air quality prediction and visualisation on UK highways

    The construction of intercity highways by the United Kingdom (UK) government has resulted in a progressive increase in vehicle emissions and in pollution from noise, dust, and vibration, amid growing concerns about air pollution. Existing roadside pollution monitoring devices are limited by their fixed locations, limited sensitivity, and inability to capture the full spatial variability, which can result in less accurate measurements of transient and fine-scale pollutants such as nitrogen oxides and particulate matter. Reports on regional highways across the country are based on a limited number of fixed monitoring stations that are sometimes located far from the highway. These periodic and coarse-grained measurements make highway air quality reporting inefficient, leading to inaccurate air quality forecasts. A multi-target neural network is a type of machine learning model that predicts multiple pollutants simultaneously, improving predictive accuracy and efficiency by capturing the complex interdependencies among air quality parameters. The potential of this and similar multi-target prediction techniques has yet to be fully exploited in the air quality domain because suitable datasets have not been available. To address these limitations, this doctoral thesis proposes and implements a framework that adopts cutting-edge digital technologies such as the Internet of Things, Big Data, and Deep Learning for a more efficient way of capturing and forecasting traffic-related air pollution (TRAP). The empirical component of the study involves a detailed comparative analysis of advanced predictive models, incorporating an enriched dataset that includes road elevation, vehicle emission factors, and background maps alongside traditional traffic flow, weather, and pollution data. The research adopts a multi-target regression approach to forecast concentrations of NO2, PM2.5, and PM10 across multiple time steps. Various models were tested, with Fastai's tabular model, Prophet's time-series model, and scikit-learn's multioutput regressor being central to the experimentation. The Fastai model demonstrated superior performance, evidenced by its root-mean-square error (RMSE) scores for each pollutant. Statistical analysis using the Friedman and Wilcoxon tests confirmed the significance of the Fastai model's results, further supported by an algorithmic audit that identified the key features contributing to the model's predictive power. This doctoral thesis not only advances the methodology for air quality monitoring and forecasting along highways but also lays the groundwork for future research aimed at refining air quality assessment practices and enhancing environmental health standards.
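    The multi-target regression setup described above, jointly forecasting NO2, PM2.5, and PM10, can be sketched with scikit-learn's MultiOutputRegressor, which the abstract mentions; the synthetic features and the gradient-boosting base estimator below are hypothetical placeholders rather than the thesis's actual data or configuration.

```python
# Minimal sketch of multi-target regression for three pollutants with
# scikit-learn; the data here is synthetic, not the thesis dataset.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(0)

# Hypothetical feature matrix: stand-ins for traffic flow, wind speed,
# temperature, road elevation, and background concentration.
X = rng.normal(size=(2000, 5))
# Three pollutant targets predicted jointly: NO2, PM2.5, PM10.
Y = X @ rng.normal(size=(5, 3)) + rng.normal(scale=0.1, size=(2000, 3))

X_train, X_test, Y_train, Y_test = train_test_split(X, Y, random_state=0)

# MultiOutputRegressor fits one gradient-boosting model per target,
# exposing a single estimator with a multi-output predict().
model = MultiOutputRegressor(GradientBoostingRegressor(random_state=0))
model.fit(X_train, Y_train)

# Per-pollutant RMSE, mirroring the evaluation metric named in the abstract.
rmse = np.sqrt(mean_squared_error(Y_test, model.predict(X_test),
                                  multioutput="raw_values"))
for name, score in zip(["NO2", "PM2.5", "PM10"], rmse):
    print(f"{name}: RMSE = {score:.3f}")
```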

    Law and Ecology

    Law and Ecology: New Environmental Foundations contains a series of theoretical and applied perspectives on the connection between law and ecology, which together offer a radical and socially responsive foundation for environmental law. While its legal corpus grows daily, environmental law has not enjoyed the kind of jurisprudential underpinning generally found in other branches of law. This book forges a new ecological jurisprudential foundation for environmental law – where 'ecological' is understood both in the narrow sense of a more ecosystemic perspective on law, and in the broad sense of critical self-reflection of the mechanisms of environmental law as they operate in a context where boundaries between the human and the non-human are collapsing, and where the traditional distinction between ecocentrism and anthropocentrism is recast. Addressing current debates, including the intellectual property of bioresources; the protection of biodiversity in view of tribal land demands; the ethics of genetically modified organisms; the redefinition of the 'human' through feminist and technological research; the spatial/geographical boundaries of environmental jurisdiction; and the postcolonial geographies of pollution – Law and Ecology redefines the way environmental law is perceived, theorised and applied. It also constitutes a radical challenge to the traditionally human-centred frameworks and concerns of legal theory.