
    Examining the critical interplay of knowledge acquisition and integration capabilities in project-oriented service firms

    While past knowledge-based approaches to service innovation have emphasized the role of knowledge integration in the delivery of customer-focused solutions, these approaches do not adequately address the complexities inherent in knowledge acquisition and integration in project-oriented firms. Adopting a dynamic capability framework and building on knowledge-based approaches to innovation, the current study examines how the interplay of learning capabilities and knowledge integration capability impacts service innovation and sustained competitive advantage. This two-stage multi-sample study finds that entrepreneurial project-oriented service firms, in their quest for competitive advantage through greater innovation, invest in knowledge acquisition and integration capabilities. Implications for theory and practice are discussed and directions for future research are provided.

    Automated ontology framework for service robots

    This paper presents an automated ontology framework for service robots. The framework is designed to automatically create an ontology and instances of concepts in dynamic environments. Ontology learning from text is applied to build a concept hierarchy using WordNet, which provides rich semantic processing for physical objects. The Automated Ontology is composed of four modules: Concept Creation, Property Creation, Relationship Creation and Instance of Concept Creation. The automated ontology algorithm was implemented in order to create the concept hierarchy in the Robot Ontology. The Semantic Knowledge Acquisition represents knowledge of physical objects in dynamic environments. In simulation experiments, lists of object names and property names were identified. The result shows the concept hierarchy, which represents explicit terms and the semantic knowledge of physical objects for performing everyday manipulation tasks.
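
    The "Concept Creation" step described above, building a concept hierarchy for object names from WordNet, can be illustrated with a short Python sketch. This is not the paper's implementation: the object names are invented, and taking the first sense and first hypernym path of each word is a simplifying assumption. It assumes NLTK is installed with the WordNet corpus downloaded.

    # Minimal sketch: derive a concept hierarchy for object names from WordNet hypernym chains.
    # Assumes: pip install nltk, then nltk.download("wordnet").
    from nltk.corpus import wordnet as wn

    def concept_hierarchy(object_names):
        """Map each object name to its chain of parent concepts (root -> object)."""
        hierarchy = {}
        for name in object_names:
            synsets = wn.synsets(name, pos=wn.NOUN)
            if not synsets:
                continue  # skip names WordNet does not know
            # Use the first (most common) sense and its first hypernym path.
            path = synsets[0].hypernym_paths()[0]
            hierarchy[name] = [s.name().split(".")[0] for s in path]
        return hierarchy

    if __name__ == "__main__":
        for obj, parents in concept_hierarchy(["cup", "table", "refrigerator"]).items():
            print(obj, "->", " > ".join(parents))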

    Knowledge Acquisition for a New Business Model Creation and Enabling Factors

    Title: Knowledge acquisition for a new business model creation and enabling factors
    Date of the seminar: 28-05-2015
    Course: Master Corporate Entrepreneurship and Innovation, Internship and Degree Project (Master Thesis, 15 ECTS)
    Authors: Dovilė Gedvilaitė and Shubhabrata Paul
    Supervisor: Dr. Joakim Winborg
    Keywords: business model innovation, change, creation, knowledge, knowledge acquisition, absorptive capacity
    Thesis purpose: To find out how an organisation performs knowledge acquisition for new business model creation and what factors enable it.
    Methodology: The research carried out is a case study based on a qualitative research design. The data collection procedure has been a series of semi-structured and unstructured interviews conducted at the internship company. The interviewees, whose positions within the company are relatively spread out and carry varied responsibilities, provided the empirical data, which, being qualitative in nature, are elaborative and exploratory. This enabled the discovery of enabling conditions for knowledge acquisition in new business model creation and the identification of the components of the process by which the organisation proceeded with the change and realized it.
    Theoretical perspectives: Business model innovation and change theory (Zott et al., 2011; Zott & Amit, 2007; Teece, 2010; Richardson, 2008; Cavalcante et al., 2011), organisational knowledge creation theory (Nonaka, 1994; Nonaka & Krogh, 2009), absorptive capacity (Cohen & Levinthal, 1990; Zahra & George, 2002)
    Conclusions: This thesis builds on the seminal work of Nonaka (1994) by contributing to the adaptation and/or expansion of the organisational knowledge creation framework to the knowledge acquisition process during an organisation’s new business model creation. Key components of this dynamic interplay between knowledge acquisition and business model creation are identified and a modified framework is proposed.

    Knowledge Acquisition Analytical Games: games for cognitive systems design

    Knowledge discovery from data and knowledge acquisition from experts are steps of paramount importance when designing cognitive systems. The literature discusses extensively the issues related to current knowledge acquisition techniques. In this doctoral work we explore the use of gaming approaches as knowledge acquisition tools, capitalising on aspects such as engagement, ease of use and the ability to access tacit knowledge. More specifically, we explore the use of analytical games for this purpose. Analytical games for decision making are not a new class of games, but rather a set of platform-independent simulation games, designed not for entertainment, whose main purpose is research on decision-making, either in its complete dynamic cycle or a portion of it (i.e. situational awareness). Moreover, the work focuses on the use of analytical games as knowledge acquisition tools. To this end, the Knowledge Acquisition Analytical Game (K2AG) method is introduced. K2AG is an innovative game framework for supporting the knowledge acquisition task. The framework introduced in this doctoral work was born as a generalisation of the Reliability Game, which in turn was inspired by the Risk Game. More specifically, K2AGs aim at collecting information and knowledge to be used in the design of cognitive systems and their algorithms. The two main aspects that characterise these games are the use of knowledge cards to render information and meta-information to the players, and the use of an innovative data-gathering method that takes advantage of geometrical features of simple shapes (e.g. a triangle) to easily collect players' beliefs. These beliefs can be mapped to subjective probabilities or masses (in the evidence theory framework) and used for algorithm design purposes. However, K2AGs might also use different means of conveying information to the players and of collecting data. Part of the work has been devoted to a detailed articulation of the design cycle of K2AGs. More specifically, van der Zee's simulation gaming design framework has been extended in order to account for the fact that the design cycle steps should be modified to include the different kinds of models that characterise the design of simulation games and simulations in general, namely a conceptual model (platform independent), a design model (platform independent) and one or more implementation models (platform dependent). In addition, the processes that lead from one model to the other have been mapped to design phases of analytical wargaming. Aspects of game validation and player experience evaluation have also been addressed: based on the literature, a set of validation criteria for K2AGs has been proposed and a player experience questionnaire for K2AGs has been developed. This questionnaire extends work proposed in the literature, but a validation has not been possible at the time of writing. Finally, two instantiations of the K2AG framework, namely the Reliability Game and the MARISA Game, have been designed and analysed in detail to validate the approach and show its potential.
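
    The data-gathering idea mentioned above, reading a player's belief from where they place a marker inside a simple shape such as a triangle, can be sketched in Python via barycentric coordinates, which sum to one and can be read as belief masses over three hypotheses. This is only one plausible reading of the mechanism, not the thesis's published procedure; the corner labels and coordinates are illustrative assumptions.

    # Hedged sketch: map a marker position inside a triangle to three belief masses.
    def barycentric_masses(p, a, b, c):
        """Return masses (w_a, w_b, w_c) for point p relative to triangle corners a, b, c."""
        (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
        denom = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
        w_a = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / denom
        w_b = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / denom
        w_c = 1.0 - w_a - w_b
        return w_a, w_b, w_c

    if __name__ == "__main__":
        # Corner meanings are invented, e.g. "reliable", "unreliable", "don't know".
        masses = barycentric_masses(p=(0.4, 0.3), a=(0.0, 0.0), b=(1.0, 0.0), c=(0.5, 1.0))
        print([round(m, 3) for m in masses])  # the three masses sum to 1 for points inside the triangle

    Masses obtained this way can be used directly as subjective probabilities, or as basic belief assignments in an evidence-theory model, matching the use the abstract describes.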

    Knowledge discovery for moderating collaborative projects

    In today's global market environment, enterprises are increasingly turning towards collaboration in projects to leverage their resources, skills and expertise, and simultaneously address the challenges posed in diverse and competitive markets. Moderators, which are knowledge-based systems, have successfully been used to support collaborative teams by raising awareness of problems or conflicts. However, the functioning of a moderator is limited to the knowledge it has about the team members. Knowledge acquisition, learning and updating of knowledge are the major challenges for a Moderator's implementation. To address these challenges, a Knowledge discOvery And daTa minINg inteGrated (KOATING) framework is presented for Moderators to enable them to continuously learn from the operational databases of the company and semi-automatically update the corresponding expert module. The architecture for the Universal Knowledge Moderator (UKM) shows how existing moderators can be extended to support global manufacturing. A method for designing and developing the knowledge acquisition module of the Moderator for manual and semi-automatic update of knowledge is documented using the Unified Modelling Language (UML). UML has been used to explore the static structure and dynamic behaviour, and to describe the system analysis, system design and system development aspects of the proposed KOATING framework. The proof of design has been presented using a case study for a collaborative project in the form of a construction project supply chain. It has been shown that Moderators can "learn" by extracting various kinds of knowledge from Post Project Reports (PPRs) using different types of text mining techniques. Furthermore, it is also proposed that knowledge discovery integrated moderators can be used to support and enhance collaboration by identifying appropriate business opportunities and corresponding partners for the creation of a virtual organization. A case study is presented in the context of a UK-based SME. Finally, this thesis concludes by summarizing the work, outlining its novelties and contributions, and recommending future research.
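
    The claim that Moderators can "learn" from Post Project Reports via text mining can be made concrete with a small sketch. It is not the KOATING implementation: the report snippets are invented, and TF-IDF term weighting stands in for whichever text-mining techniques the thesis actually applies. It assumes scikit-learn is available.

    # Illustrative sketch: surface the highest-weighted terms of each Post Project Report (PPR)
    # as candidate knowledge items for semi-automatic update of a Moderator's expert module.
    from sklearn.feature_extraction.text import TfidfVectorizer

    ppr_texts = [  # invented example snippets
        "Steel delivery delayed two weeks; crane hire costs exceeded budget.",
        "Subcontractor design changes caused rework on the facade package.",
        "Early supplier involvement reduced procurement lead time on mechanical works.",
    ]

    vectorizer = TfidfVectorizer(stop_words="english")
    tfidf = vectorizer.fit_transform(ppr_texts).toarray()
    terms = vectorizer.get_feature_names_out()

    for i, row in enumerate(tfidf):
        # Keep the three highest-weighted terms per report as candidate knowledge items.
        top = sorted(zip(terms, row), key=lambda t: t[1], reverse=True)[:3]
        print(f"PPR {i}: {[term for term, _ in top]}")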

    Ecological constraint mapping: understanding outcome-limiting bottlenecks for improved environmental decision-making in marine and coastal environments

    Despite genuine attempts, the history of marine and coastal ecosystem management is littered with examples of poor environmental, social and financial outcomes. Marine ecosystems are largely populated by species with open populations, and feature ecological processes that are driven by multiple, interwoven, dynamic causes and effects. This complexity limits the acquisition of relevant knowledge of habitat characteristics, species utilisation and ecosystem dynamics. The consequence of this lack of knowledge is uncertainty about the link between action taken and outcome achieved. Such uncertainty risks misdirected human and financial investment, and sometimes may even lead to perverse outcomes. Technological advances offer new data acquisition opportunities, but the diversity and complexity of the biological and ecological information needed to reduce uncertainty means the increase in knowledge will be slow unless it is undertaken in a structured and focussed way. We introduce “Ecological Constraint Mapping” – an approach that takes a “supply chain” point of view and focusses on identifying the principal factors that constrain life-history outcomes (success/productivity/resilience/fitness) for marine and coastal species, and ultimately the quality and resilience of the ecosystems they are components of, and the life-history supporting processes and values ecosystems provide. By providing a framework for the efficient development of actionable knowledge, Ecological Constraint Mapping can facilitate a move from paradigm-based to knowledge-informed decision-making on ecological issues. It is suitable for developing optimal solutions to a wide range of conservation and management problems, providing an organised framework that aligns with current perspectives on the complex nature of marine and coastal systems.

    Human-Centered Automation for Resilience in Acquiring Construction Field Information

    Resilient acquisition of timely, detailed job site information plays a pivotal role in maintaining the productivity and safety of construction projects that have busy schedules, dynamic workspaces, and unexpected events. In the field, construction information acquisition often involves three types of activities: sensor-based inspection, manual inspection, and communication. Human interventions play critical roles in these three types of field information acquisition activities. A resilient information acquisition system is needed for safer and more productive construction. The use of various automation technologies could help improve human performance by proactively providing the knowledge needed to use equipment, improving situation awareness in multi-person collaborations, and reducing the mental workload of operators and inspectors. Unfortunately, few studies consider human factors in automation techniques for construction field information acquisition. Full utilization of automation techniques requires a systematic synthesis of the interactions between humans, tasks, and the construction workspace to reduce the complexity of information acquisition tasks so that humans can complete them reliably. Overall, such a synthesis of human factors in field data collection and analysis paves the path towards “Human-Centered Automation” (HCA) in construction management. HCA could form a computational framework that supports resilient field data collection considering human factors and unexpected events on dynamic job sites. This dissertation presents an HCA framework for resilient construction field information acquisition and the results of examining three HCA approaches that support three use cases of construction field data collection and analysis. The first HCA approach is an automated data collection planning method that can assist the 3D laser scan planning of construction inspectors to achieve comprehensive and efficient data collection. The second HCA approach is a Bayesian model-based approach that automatically aggregates the common sense of people from the internet to identify job site risks from a large number of job site pictures. The third HCA approach is an automatic communication protocol optimization approach that maximizes the team situation awareness of construction workers and leads to the early detection of workflow delays and critical path changes. Data collection and simulation experiments extensively validate these three HCA approaches.
    Doctoral Dissertation: Civil, Environmental and Sustainable Engineering, 201
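
    The second HCA approach above, aggregating many people's judgements about job-site pictures with a Bayesian model, can be illustrated with a deliberately simplified Python sketch. The Beta-Binomial model, the prior, and the vote counts below are assumptions for illustration, not the dissertation's actual model or data.

    # Simplified sketch: Beta-Binomial aggregation of crowd votes on whether a picture shows a hazard.
    from dataclasses import dataclass

    @dataclass
    class RiskPosterior:
        alpha: float = 1.0  # prior pseudo-count of "hazard" votes
        beta: float = 1.0   # prior pseudo-count of "no hazard" votes

        def update(self, hazard_votes: int, safe_votes: int) -> None:
            """Fold a batch of crowd annotations into the posterior."""
            self.alpha += hazard_votes
            self.beta += safe_votes

        @property
        def mean(self) -> float:
            """Posterior probability that the picture shows a hazard."""
            return self.alpha / (self.alpha + self.beta)

    if __name__ == "__main__":
        picture = RiskPosterior()
        picture.update(hazard_votes=14, safe_votes=6)  # 20 made-up annotations for one picture
        print(f"P(hazard) = {picture.mean:.2f}")  # pictures above a chosen threshold get flagged for inspection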

    Braid: Weaving Symbolic and Neural Knowledge into Coherent Logical Explanations

    Traditional symbolic reasoning engines, while attractive for their precision and explicability, have a few major drawbacks: the use of brittle inference procedures that rely on exact matching (unification) of logical terms, an inability to deal with uncertainty, and the need for a precompiled rule-base of knowledge (the "knowledge acquisition" problem). To address these issues, we devise a novel logical reasoner called Braid that supports probabilistic rules and uses the notion of custom unification functions and dynamic rule generation to overcome the brittle matching and knowledge-gap problems prevalent in traditional reasoners. In this paper, we describe the reasoning algorithms used in Braid and their implementation in a distributed task-based framework that builds proof/explanation graphs for an input query. We use a simple QA example from a children's story to motivate Braid's design and explain how the various components work together to produce a coherent logical explanation. Finally, we evaluate Braid on the ROC Story Cloze test and achieve close to state-of-the-art results while providing frame-based explanations.
    Comment: Accepted at AAAI-202
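
    The two mechanisms named above, probabilistic rules and custom unification functions, can be shown with a toy backward-chaining sketch. This is not Braid's actual algorithm: the predicate names, the similarity table, and the min/product scoring are invented solely to illustrate how soft matching plus rule confidences can bridge knowledge gaps that defeat exact unification.

    # Toy sketch: backward chaining with rule confidences and "soft" (similarity-based) matching.
    from typing import Dict, List, Tuple

    SIMILAR: Dict[Tuple[str, str], float] = {("bought", "purchased"): 0.9, ("owns", "has"): 0.8}

    def soft_unify(a: str, b: str) -> float:
        """Exact match scores 1.0; known near-synonyms score less; otherwise 0."""
        if a == b:
            return 1.0
        return max(SIMILAR.get((a, b), 0.0), SIMILAR.get((b, a), 0.0))

    # Each rule: (head predicate, body predicates, confidence). Propositional for simplicity.
    RULES: List[Tuple[str, List[str], float]] = [("owns", ["purchased"], 0.95)]

    def prove(goal: str, facts: Dict[str, float]) -> float:
        """Score a goal against weighted facts, chaining through rules whose heads soft-unify."""
        best = max((soft_unify(goal, f) * conf for f, conf in facts.items()), default=0.0)
        for head, body, rule_conf in RULES:
            head_match = soft_unify(goal, head)
            if head_match > 0.0:
                body_score = min((prove(b, facts) for b in body), default=1.0)
                best = max(best, head_match * rule_conf * body_score)
        return best

    if __name__ == "__main__":
        facts = {"bought": 1.0}  # e.g. the story states that the character bought a book
        print(round(prove("owns", facts), 3))  # "bought" soft-unifies with "purchased", which supports "owns"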