
    Human and Artificial Intelligence

    Although tremendous advances have been made in recent years, many real-world problems still cannot be solved by machines alone; hence, integration between Human Intelligence and Artificial Intelligence is needed. However, several challenges make this integration complex. The aim of this Special Issue was to provide a large and varied collection of high-level contributions presenting novel approaches and solutions to address these issues. The Special Issue contains 14 papers (13 research papers and 1 review paper) that deal with various topics related to human–machine interaction and cooperation. Most of these works concern different aspects of recommender systems, which are among the most widespread decision support systems. The domains covered range from healthcare to movies and from biometrics to cultural heritage, and there are also contributions on vocal assistants and smart interactive technologies. In summary, each paper included in this Special Issue represents a step towards a future of human–machine interaction and cooperation. We hope readers enjoy these articles and find inspiration for their own research activities.

    A Comprehensive Classification of Business Activities in the Market of Intellectual Property Rights-related Services

    Technology and intellectual property markets have witnessed great developments in the last few decades. As intellectual property rights have gained importance and technology companies have opened up their innovation processes, a wide range of intellectual property rights-related services has emerged over the last two decades. The goal of this research is to develop a comprehensive classification system of intellectual property rights-related services (IPSC). The classification is created by applying an ontology engineering process. The IPSC consists of 72 IPR services divided into six main categories (100 Legal Service; 200 IP Consulting; 300 Matchmaking and Trading; 400 IP Portfolio Processing; 500 IPR-related Financial Service; 600 IPR-related Communication Service). The implications of the thesis are directed to policy makers, technology transfer managers, C-level executives and innovation researchers. The IPSC enables practitioners and researchers to organize industry data that can then be analyzed for better strategy and policy making. In addition, it contributes towards a more transparent, single intellectual property market.
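    To make the structure of the classification concrete, the sketch below shows one way the six top-level IPSC categories could be represented programmatically. The category codes and names are taken from the abstract above; the data structure, the helper function and the idea of filing individual services under a 100-block code are illustrative assumptions, not part of the IPSC itself.

```python
from dataclasses import dataclass, field

@dataclass
class ServiceCategory:
    """One top-level IPSC category: numeric code block, name, and its services."""
    code: int                  # numeric block, e.g. 100, 200, ...
    name: str
    services: list[str] = field(default_factory=list)  # individual IPR services (not listed here)

# The six main IPSC categories as named in the abstract; the 72 individual
# services are not reproduced here and would be filled in from the thesis.
IPSC = [
    ServiceCategory(100, "Legal Service"),
    ServiceCategory(200, "IP Consulting"),
    ServiceCategory(300, "Matchmaking and Trading"),
    ServiceCategory(400, "IP Portfolio Processing"),
    ServiceCategory(500, "IPR-related Financial Service"),
    ServiceCategory(600, "IPR-related Communication Service"),
]

def category_for(service_code: int) -> ServiceCategory:
    """Map a hypothetical service code (e.g. 310) to its 100-block category."""
    return next(c for c in IPSC if c.code == (service_code // 100) * 100)

if __name__ == "__main__":
    print(category_for(310).name)  # -> Matchmaking and Trading
```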

    Scalable Quality Assessment of Linked Data

    In a world where the information economy is booming, poor data quality can lead to adverse consequences, including social and economic problems such as loss of revenue. Furthermore, data-driven industries do not rely solely on their own (proprietary) data silos, but continuously aggregate data from different sources; this aggregated data may then be re-distributed to "data lakes". However, such data (including Linked Data) is not necessarily checked for quality prior to use. Large volumes of data are exchanged between organisations in a standard, interoperable format and published as Linked Data to facilitate re-use. Some organisations, such as government institutions, go a step further and open their data; the Linked Open Data Cloud is a witness to this. However, as with data in data lakes, it is challenging to determine the quality of this heterogeneous data, and subsequently to make this information explicit to data consumers. Despite the availability of a number of tools and frameworks for assessing Linked Data quality, current solutions do not offer a holistic approach that both enables the assessment of datasets and provides consumers with quality results they can use to find, compare and rank datasets by fitness for use. In this thesis we investigate methods to assess the quality of (possibly large) linked datasets so that data consumers can use the assessment results to find datasets that are fit for use, that is, to find the right dataset for the task at hand. The benefits of quality assessment are two-fold: (1) data consumers do not need to rely blindly on subjective measures to choose a dataset, but can base their choice on multiple factors such as the intrinsic structure of the dataset, fostering trust and reputation between publishers and consumers on more objective foundations; and (2) data publishers are encouraged to improve their datasets so that they can be re-used more. Furthermore, our approach scales to large datasets. In this regard, we also look into improving the efficiency of quality metrics using various approximation techniques; the trade-off is that consumers do not get an exact quality value but a close estimate, which still provides the required guidance towards fitness for use. The focus of this thesis is not data quality improvement; nonetheless, we still need to understand what data quality means to the consumers who are searching for potential datasets. This thesis examines the challenges of detecting quality problems in linked datasets and of presenting quality results in a standardised, machine-readable and interoperable format that agents can interpret to help human consumers identify datasets fit for use. Our proposed approach is consumer-centric: it aims to (1) make the assessment of quality as easy as possible, allowing stakeholders, possibly non-experts, to identify and easily define quality metrics and to initiate the assessment; and (2) make the results (quality metadata and quality reports) easy for stakeholders to understand, or at least interoperable with other systems to facilitate a possible data quality pipeline. Finally, our framework is used to assess the quality of a number of heterogeneous (large) linked datasets, where each assessment returns a quality metadata graph that can be consumed by agents as Linked Data. In turn, these agents can intelligently interpret a dataset's quality with regard to multiple dimensions and observations, and thus provide further insight to consumers regarding its fitness for use.
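    As an illustration of what a machine-readable quality metadata graph can look like, the sketch below computes a toy intrinsic metric over an RDF dataset with rdflib and records the result using the W3C Data Quality Vocabulary (DQV). The metric, the example namespace and the resource names are illustrative assumptions; they are not the metrics or vocabulary defined in the thesis.

```python
from rdflib import Graph, Namespace, URIRef, Literal
from rdflib.namespace import RDF, XSD

DQV = Namespace("http://www.w3.org/ns/dqv#")
EX = Namespace("http://example.org/quality/")   # illustrative namespace

def literal_ratio(data: Graph) -> float:
    """Toy intrinsic metric: share of triples whose object is a literal."""
    triples = list(data)
    if not triples:
        return 0.0
    return sum(1 for _, _, o in triples if isinstance(o, Literal)) / len(triples)

def assess(dataset_uri: str, data: Graph) -> Graph:
    """Return a quality metadata graph describing one measurement on the dataset."""
    meta = Graph()
    meta.bind("dqv", DQV)
    measurement = EX["measurement/literal-ratio"]
    meta.add((measurement, RDF.type, DQV.QualityMeasurement))
    meta.add((measurement, DQV.computedOn, URIRef(dataset_uri)))
    meta.add((measurement, DQV.isMeasurementOf, EX["metric/literalRatio"]))
    meta.add((measurement, DQV.value,
              Literal(literal_ratio(data), datatype=XSD.double)))
    return meta
```

    The returned graph can itself be published as Linked Data, so an agent can read the measurement alongside the dataset it describes.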

    Mapping and the Citizen Sensor

    Maps are a fundamental resource in a diverse array of applications, ranging from everyday activities such as route planning, through the legal demarcation of space, to scientific studies such as those seeking to understand biodiversity and inform the design of nature reserves for species conservation. For a map to have value, it should provide an accurate and timely representation of the phenomenon depicted, and this can be a challenge in a dynamic world. Fortunately, mapping activities have benefited greatly from recent advances in geoinformation technologies. Satellite remote sensing, for example, now offers unparalleled data acquisition capabilities, and authoritative mapping agencies have developed systems for the routine production of maps in accordance with strict standards. Until recently, much mapping activity was the exclusive realm of authoritative agencies, but technological development has also allowed the rise of the amateur mapping community. The proliferation of inexpensive, highly mobile, location-aware devices, together with Web 2.0 technology, has fostered the emergence of the citizen as a source of data. Mapping presently benefits from vast amounts of spatial data as well as people able to provide observations of geographic phenomena, which can inform map production, revision and evaluation. The great potential of these developments is, however, often limited by concerns spanning issues from the nature of the citizens, through the way data are collected and shared, to the quality and trustworthiness of the data. This book reports on some of the key issues connected with the use of citizen sensors in mapping. It arises from a European Co-operation in Science and Technology (COST) Action, which explored topics ranging from citizen motivation, data acquisition and data quality to the use of citizen-derived data in the production of maps that rival, and sometimes surpass, maps produced by authoritative agencies.

    Enhancing Usability, Security, and Performance in Mobile Computing

    We have witnessed the prevalence of smart devices in every aspect of human life. However, the ever-growing number of smart devices presents significant challenges in terms of usability, security, and performance. First, we need to design new interfaces to improve device usability, which has been neglected during the rapid shift from hand-held mobile devices to wearables. Second, we need to protect smart devices, which hold abundant private data, against unauthorized users. Last, new applications with compute-intensive tasks demand the integration of emerging mobile backend infrastructure. This dissertation focuses on addressing these challenges. First, we present GlassGesture, a system that improves the usability of Google Glass through a head-gesture user interface with gesture recognition and authentication. We accelerate recognition by employing a novel similarity search scheme, and improve authentication performance by applying new features of head movements in an ensemble learning method. As a result, GlassGesture achieves 96% gesture recognition accuracy. Furthermore, GlassGesture accepts authorized users in nearly 92% of trials and rejects attackers in nearly 99% of trials. Next, we investigate authentication between a smartphone and a paired smartwatch. We design and implement WearLock, a system that uses one's smartwatch to unlock one's smartphone via acoustic tones. We build an acoustic modem with sub-channel selection and adaptive modulation, which generates modulated acoustic signals to maximize the unlocking success rate against ambient noise. We leverage the motion similarities of the devices to eliminate unnecessary unlocking. We also offload heavy computation tasks from the smartwatch to the smartphone to shorten response time and save energy. The acoustic modem achieves a low bit error rate (BER) of 8%. Compared to traditional manual personal identification number (PIN) entry, WearLock not only automates unlocking but also speeds it up by at least 18%. Last, we consider low-latency video analytics on mobile devices, leveraging emerging mobile backend infrastructure. We design and implement LAVEA, a system that offloads computation from mobile clients to edge nodes in order to accomplish computation-intensive tasks closer to users in a timely manner. We formulate an optimization problem for offloading task selection and prioritize offloading requests received at the edge node to minimize response time. We design and compare various task placement schemes for inter-edge collaboration to further improve the overall response time. Our results show that the client-edge configuration achieves a speedup of 1.3x to 4x over running solely on the client, and of 1.2x to 1.7x over the client-cloud configuration.
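    The offloading decisions described above hinge on comparing local execution time against the end-to-end time of shipping a task to an edge node. The sketch below captures that comparison with a toy latency model; the parameters, cost model and example numbers are illustrative assumptions, not LAVEA's actual optimization formulation.

```python
from dataclasses import dataclass

@dataclass
class Task:
    cycles: float        # estimated CPU cycles required
    input_bytes: float   # data that must be uploaded if offloaded

def should_offload(task: Task,
                   local_hz: float,      # client CPU speed (cycles/s)
                   edge_hz: float,       # edge node CPU speed (cycles/s)
                   uplink_bps: float,    # available uplink bandwidth (bytes/s)
                   edge_queue_s: float) -> bool:
    """Offload iff the estimated edge response time beats local execution.

    Toy model: upload time + queueing delay + edge compute time versus
    local compute time.
    """
    local_time = task.cycles / local_hz
    edge_time = (task.input_bytes / uplink_bps
                 + edge_queue_s
                 + task.cycles / edge_hz)
    return edge_time < local_time

# Example: a frame-analysis task of 2e9 cycles with a 200 kB input.
task = Task(cycles=2e9, input_bytes=200_000)
print(should_offload(task, local_hz=1.5e9, edge_hz=8e9,
                     uplink_bps=2_500_000, edge_queue_s=0.05))  # -> True
```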

    The effect of corporate venture capital investments on the investor´s eco-innovation performance

    This paper investigates the relationship between firms' corporate venture capital (CVC) investments and their propensity to increase corporate eco-innovation performance. CVC investments may be instrumental in accessing innovative knowledge as well as harvesting eco-innovations from entrepreneurial ventures and are, therefore, an essential part of a firm's overall innovation strategy. Using panel data from 71 CVC investors during 2010-2018, this study investigates under which CVC investment conditions firms increase their eco-innovation performance. The empirical analysis suggests that CVC investments are a mechanism for sourcing external knowledge from ventures, allowing investors to improve their eco-innovation performance. Furthermore, corporations' eco-innovation performance benefits the most when CVC investments target ventures in early investment stages. Additionally, corporations should focus their CVC investments on ventures that have a moderate or even low proximity to their own technological knowledge base. These findings contribute to the corporate entrepreneurship, real options, and eco-innovation literature by showing how CVC investments improve incumbents' eco-innovation performance through external knowledge sourcing.
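    For readers unfamiliar with this type of study, the sketch below shows the general shape of a firm-year panel regression with fixed effects and interaction terms for the moderating conditions mentioned above (investment stage and technological proximity). The variable names and synthetic data are placeholders and do not reproduce the paper's model, measures, or dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic firm-year panel; all variables are placeholders for illustration.
rng = np.random.default_rng(0)
rows = [
    {"firm": f"F{i}", "year": y,
     "cvc_deals": rng.poisson(2),          # CVC deals made that year
     "early_stage_share": rng.uniform(),   # share of deals in early investment stages
     "tech_proximity": rng.uniform()}      # proximity to the investor's knowledge base
    for i in range(20) for y in range(2010, 2019)
]
panel = pd.DataFrame(rows)
# Toy outcome: eco-innovation performance proxied by an eco-patent count.
panel["eco_patents"] = rng.poisson(1 + panel["cvc_deals"])

# OLS with firm and year fixed effects; interactions capture the moderating
# conditions (early-stage focus, technological proximity).
model = smf.ols(
    "eco_patents ~ cvc_deals + cvc_deals:early_stage_share"
    " + cvc_deals:tech_proximity + C(firm) + C(year)",
    data=panel,
).fit()
print(model.params.filter(like="cvc_deals"))
```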