
    An Infrastructure for acquiring high quality semantic metadata

    Because metadata that underlies semantic web applications is gathered from distributed and heterogeneous data sources, it is important to ensure its quality (i.e., reduce duplicates, spelling errors, ambiguities). However, current infrastructures that acquire and integrate semantic data have only marginally addressed the issue of metadata quality. In this paper we present our metadata acquisition infrastructure, ASDI, which pays special attention to ensuring that high quality metadata is derived. Central to the architecture of ASDI is a verification engine that relies on several semantic web tools to check the quality of the derived data. We tested our prototype in the context of building a semantic web portal for our lab, KMi. An experimental evaluation comparing the automatically extracted data against manual annotations indicates that the verification engine enhances the quality of the extracted semantic metadata.
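    As a rough illustration of one check such a verification engine might run (the function, labels and threshold below are assumptions for the sketch, not ASDI's actual code), near-duplicate labels extracted from different sources can be flagged with simple string similarity:

```python
from difflib import SequenceMatcher

def near_duplicates(labels, threshold=0.9):
    """Return pairs of labels whose case-insensitive similarity exceeds the threshold."""
    pairs = []
    for i, a in enumerate(labels):
        for b in labels[i + 1:]:
            if SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold:
                pairs.append((a, b))
    return pairs

# Labels as they might come out of several extraction sources:
labels = ["Knowledge Media Institute", "Knowledge Media institute", "Open University"]
print(near_duplicates(labels))  # flags the casing variant as a likely duplicate
```

    A real verification engine would combine several such checks (spelling, ambiguity, duplicates) rather than a single similarity pass.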

    What do business models do? Narratives, calculation and market exploration

    Building on a case study of an entrepreneurial venture, we investigate the role played by business models in the innovation process. Rather than debating their accuracy and efficiency, we adopt a pragmatic approach to business models -- we examine them as market devices, focusing on their materiality, use and dynamics. Taking into account the variety of its forms, which range from corporate presentations to business plans, we show that the business model is a narrative and calculative device that allows entrepreneurs to explore a market and plays a performative role by contributing to the construction of the techno-economic network of an innovation. (WP abstract: Analyzes the uses and functions of business models through original, qualitative case studies focused on research-based spin-offs.) Keywords: business models; spin-offs; innovation; commercialization; calculation; exploration; R&D; entrepreneurship.

    Beyond Bergson: the ontology of togetherness

    Bergson's views on communication can be deduced from his theory of selfhood, in which he identifies the human self as heterogeneous duration: a complex process that can only be adequately understood from within, when we intuit our own inner life. Another person, accessing us from outside, inevitably distorts and misunderstands our nature because duration is incommunicable. Does Bergsonism assert the failure of communication in principle? No, if we develop Bergson's theory further and identify the process of communication itself as heterogeneous duration. As such, it is intuited from within by its participants, who engage with each other in the process of dealing with the same object. They intuit the process of which they are part and thus intuit each other's involvement in it as well. To appreciate the importance of this implicit mutual communicative engagement we only need to imagine an empty airport with just one passenger, or a deserted pleasure beach. Bergson does not have a theory of communication per se, but his views on communication can be extracted from his ontology and epistemology. These views may account for some apparent failures of communication (conflicts, loneliness, hostility), and Bergson uses them to suggest a way out towards better and more harmonious intersubjective relations. Bergson claims that we misunderstand reality in general and each other in particular. Instead of trying to grasp human nature directly in intuition, we analyse its being and create a distorted view of one another. If we were able to conceive the human self as it is, we would see it as duration and might be able to reach the state of an open society, where people's love towards one another is ontologically backed up by their openness towards each other's being. However, the Bergsonian theory of duration and intuition, promising to resolve the difficulties of communication, reasserts these difficulties metaphysically.
    The idea of duration entails the impossibility of accessing it from outside, as a genuine view of it is only possible from within. This paper, instead of trying to salvage a model of communication where people strive to intuit each other's uniqueness, locates intuition in the very act of communication. Bergson himself finds intuition in artistic creation, where the artist and spectators communicate by intuiting a common object without learning any personal details about each other. We find that communication is itself duration and that the communicating participants are heterogeneous elements of that duration. As such they are subservient to the act of communication, which displays features of autonomous existence. Our model of communication, although accepting the impenetrability of one's person to complete cognitive penetration from outside, allows for the partial fusion of minds engaged in the same act of communication and negotiating the same subject matter.

    Advanced Knowledge Technologies at the Midterm: Tools and Methods for the Semantic Web

    The University of Edinburgh and research sponsors are authorised to reproduce and distribute reprints and on-line copies for their purposes notwithstanding any copyright annotation hereon. The views and conclusions contained herein are the author's and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of other parties.

    In a celebrated essay on the new electronic media, Marshall McLuhan wrote in 1962: "Our private senses are not closed systems but are endlessly translated into each other in that experience which we call consciousness. Our extended senses, tools, technologies, through the ages, have been closed systems incapable of interplay or collective awareness. Now, in the electric age, the very instantaneous nature of co-existence among our technological instruments has created a crisis quite new in human history. Our extended faculties and senses now constitute a single field of experience which demands that they become collectively conscious. Our technologies, like our private senses, now demand an interplay and ratio that makes rational co-existence possible. As long as our technologies were as slow as the wheel or the alphabet or money, the fact that they were separate, closed systems was socially and psychically supportable. This is not true now when sight and sound and movement are simultaneous and global in extent." (McLuhan 1962, p. 5, emphasis in original)

    Over forty years later, the seamless interplay that McLuhan demanded between our technologies is still barely visible. McLuhan's predictions of the spread, and increased importance, of electronic media have of course been borne out, and the worlds of business, science and knowledge storage and transfer have been revolutionised. Yet the integration of electronic systems as open systems remains in its infancy.

    Advanced Knowledge Technologies (AKT) aims to address this problem: to create a view of knowledge and its management across its lifecycle, and to research and create the services and technologies that such unification will require. Half way through its six-year span, the results are beginning to come through, and this paper will explore some of the services, technologies and methodologies that have been developed. We hope to give a sense in this paper of the potential for the next three years, to discuss the insights and lessons learnt in the first phase of the project, and to articulate the challenges and issues that remain.

    The WWW provided the original context that made the AKT approach to knowledge management (KM) possible. When AKT was initially proposed in 1999, it brought together an interdisciplinary consortium with the technological breadth and complementarity to create the conditions for a unified approach to knowledge across its lifecycle. The combination of this expertise, and the time and space afforded the consortium by the IRC structure, suggested the opportunity for a concerted effort to develop an approach to advanced knowledge technologies, based on the WWW as a basic infrastructure.

    The technological context of AKT altered for the better in the short period between the development of the proposal and the beginning of the project itself, with the development of the semantic web (SW), which foresaw much more intelligent manipulation and querying of knowledge. The opportunities that the SW provided for, e.g., more intelligent retrieval put AKT at the centre of information technology innovation and knowledge management services; the AKT skill set would clearly be central for the exploitation of those opportunities.

    The SW, as an extension of the WWW, provides an interesting set of constraints on the knowledge management services AKT tries to provide. As a medium for the semantically informed coordination of information, it has suggested a number of ways in which the objectives of AKT can be achieved, most obviously through the provision of knowledge management services delivered over the web, as opposed to the creation and provision of technologies to manage knowledge.

    AKT is working on the assumption that many web services will be developed and provided for users. The KM problem in the near future will be one of deciding which services are needed and of coordinating them. Many of these services will be largely or entirely legacies of the WWW, and so the capabilities of the services will vary. As well as providing useful KM services in their own right, AKT will be aiming to exploit this opportunity by reasoning over services, brokering between them, and providing essential meta-services for SW knowledge service management.

    Ontologies will be a crucial tool for the SW. The AKT consortium brings together a great deal of expertise on ontologies, and ontologies were always going to be a key part of the strategy. All kinds of knowledge sharing and transfer activities will be mediated by ontologies, and ontology management will be an important enabling task. Different applications will need to cope with inconsistent ontologies, or with the problems that will follow the automatic creation of ontologies (e.g. the merging of pre-existing ontologies to create a third). Ontology mapping, and the elimination of conflicts of reference, will be important tasks. All of these issues are discussed along with our proposed technologies.

    Similarly, specifications of tasks will be used for the deployment of knowledge services over the SW, but in general it cannot be expected that in the medium term there will be standards for task (or service) specifications. The brokering meta-services that are envisaged will have to deal with this heterogeneity.

    The emerging picture of the SW is one of great opportunity, but it will not be a well-ordered, certain or consistent environment. It will comprise many repositories of legacy data, outdated and inconsistent stores, and requirements for common understandings across divergent formalisms. There is clearly a role for standards to play in bringing much of this context together, and AKT is playing a significant role in these efforts. But standards take time to emerge, they take political power to enforce, and they have been known to stifle innovation (in the short term). AKT is keen to understand the balance between principled inference and statistical processing of web content. Logical inference on the Web is tough: complex queries using traditional AI inference methods bring most distributed computer systems to their knees. Do we set up semantically well-behaved areas of the Web? Is any part of the Web in which semantic hygiene prevails interesting enough to reason in? These and many other questions need to be addressed if we are to provide effective knowledge technologies for our content on the web.
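    As a toy illustration of the ontology-mapping task the abstract mentions (the label-based matching below is a deliberately naive sketch under assumed names, not AKT's actual tooling), correspondences between two ontologies' class labels can be proposed by string similarity:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Case-insensitive similarity ratio between two labels."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def map_concepts(onto_a, onto_b, threshold=0.8):
    """Propose (label_a, label_b, score) correspondences between two label lists."""
    mappings = []
    for a in onto_a:
        best = max(onto_b, key=lambda b: similarity(a, b))
        score = similarity(a, best)
        if score >= threshold:
            mappings.append((a, best, round(score, 2)))
    return mappings

# Spelling variants across ontologies are a classic source of mapping conflicts:
print(map_concepts(["Person", "Organisation"], ["person", "Organization", "Project"]))
```

    Real ontology-mapping systems add structural and logical evidence on top of lexical matching; the sketch only shows why heterogeneous labels make the task non-trivial.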

    Specification and implementation of mapping rule visualization and editing : MapVOWL and the RMLEditor

    Visual tools are implemented to help users define how to generate Linked Data from raw data. This is possible thanks to mapping languages, which enable detaching mapping rules from the implementation that executes them. However, no thorough research has been conducted so far on how to visualize such mapping rules, especially when they become large and require considering multiple heterogeneous raw data sources and transformed data values. In the past, we proposed the RMLEditor, a visual graph-based user interface which allows users to easily create mapping rules for generating Linked Data from raw data. In this paper, we build on top of our existing work: we (i) specify a visual notation for graph visualizations used to represent mapping rules, (ii) introduce an approach for manipulating rules when large visualizations emerge, and (iii) propose an approach to uniformly visualize data fractions of raw data sources, combined with an interactive interface for uniform data fraction transformations. We perform two additional comparative user studies. The first compares the use of the visual notation to present mapping rules with the use of a mapping language directly, and reveals that the visual notation is preferred. The second compares the use of the graph-based RMLEditor for creating mapping rules with the form-based RMLx Visual Editor, and reveals that graph-based visualizations are preferred for creating mapping rules, through the use of our proposed visual notation and the uniform representation of heterogeneous data sources and data values. (C) 2018 Elsevier B.V. All rights reserved.
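    The core idea behind a mapping rule, a declarative recipe turning raw records into (subject, predicate, object) triples, can be sketched as follows. The rule format, URIs and field names here are illustrative assumptions, not RML syntax:

```python
# A hypothetical declarative mapping rule: templates with {placeholders}
# filled from each raw record. Not RML, just the underlying idea.
rule = {
    "subject": "http://example.org/person/{id}",
    "predicates": {
        "http://xmlns.com/foaf/0.1/name": "{name}",
        "http://xmlns.com/foaf/0.1/mbox": "mailto:{email}",
    },
}

def apply_rule(rule, rows):
    """Execute the rule over raw rows, producing (subject, predicate, object) triples."""
    triples = []
    for row in rows:
        subject = rule["subject"].format(**row)
        for predicate, template in rule["predicates"].items():
            triples.append((subject, predicate, template.format(**row)))
    return triples

rows = [{"id": "1", "name": "Ada", "email": "ada@example.org"}]
for triple in apply_rule(rule, rows):
    print(triple)
```

    Because the rule is data, not code, a visual editor such as the one the paper studies can render and manipulate it as a graph instead of asking users to write it by hand.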

    Planting and harvesting innovation - an analysis of Samsung Electronics

    This study explores how firms manage the entire life cycle of innovation projects using the framework of harvesting and planting innovation. While harvesting innovation seeks new products in the expectation of financial performance in the short term, planting innovation pursues creating value over a long time period. Without proper management of the process of planting and harvesting innovation, firms with limited resources may not succeed in launching innovative new products to seize momentum in high-tech industries. To examine this issue, the case of Samsung Electronics (SE), now an electronics giant that originated in a former developing country, is analyzed. SE has been shown to effectively utilize co-innovation to maintain numerous planting and harvesting innovation projects. Both researchers and practitioners would be interested in learning how SE shared the risks of innovation investment with external partners at the early stage of innovation cycles.

    Understanding personal data as a space - learning from dataspaces to create linked personal data

    In this paper we argue that the space of personal data is a dataspace as defined by Franklin et al. We define a personal dataspace as the space of all personal data belonging to a user, and we describe the logical components of the dataspace. We describe a Personal Dataspace Support Platform (PDSP) as a set of services that provide a unified view over the user's data and enable new and more complex workflows over it. We show the differences between a DSSP and a PDSP, and how the latter can be realized using Web protocols and Linked APIs.
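    A minimal sketch of the "unified view" idea, in the dataspace spirit of wrapping sources rather than migrating them. All wrapper names, schemas and sample data below are invented for the illustration and do not come from the paper:

```python
# Hypothetical wrappers: each source keeps its own format, but is
# exposed through a common record schema (source, label, detail).
def from_contacts(entries):
    for e in entries:
        yield {"source": "contacts", "label": e["name"], "detail": e["phone"]}

def from_mail(messages):
    for m in messages:
        yield {"source": "mail", "label": m["from"], "detail": m["subject"]}

def unified_view(*sources):
    """Merge wrapped sources into one iterable with the common schema."""
    for source in sources:
        yield from source

contacts = [{"name": "Ada", "phone": "555-0100"}]
mail = [{"from": "Ada", "subject": "Draft paper"}]
items = list(unified_view(from_contacts(contacts), from_mail(mail)))
print(items)
```

    Workflows over the dataspace (search, cross-source linking) can then be written once against the common schema instead of once per source.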

    Multimedia search without visual analysis: the value of linguistic and contextual information

    This paper addresses the focus of this special issue by analyzing the potential contribution of linguistic content and other non-image aspects to the processing of audiovisual data. It summarizes the various ways in which linguistic content analysis contributes to enhancing the semantic annotation of multimedia content and, as a consequence, to improving the effectiveness of conceptual media access tools. A number of techniques are presented, including the time-alignment of textual resources, audio and speech processing, content reduction and reasoning tools, and the exploitation of surface features.
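    As a small illustration of why time-alignment matters (the transcript and function below are invented for the sketch, not taken from the paper), a transcript aligned to the audio timeline lets purely textual search return playback offsets into the media itself, with no visual analysis:

```python
# A time-aligned transcript: (start time in seconds, segment text).
transcript = [
    (0.0, "welcome to the seminar"),
    (4.2, "today we discuss semantic annotation"),
    (9.8, "multimedia retrieval benefits from linguistic content"),
]

def find_offsets(query, transcript):
    """Return start times of segments whose text contains the query."""
    return [start for start, text in transcript if query in text]

print(find_offsets("annotation", transcript))  # jump points for playback
```

    Richer variants weight matches by speech-recognition confidence or expand the query with reasoning tools, but the access pattern stays the same: text in, timestamps out.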