
    Time-Aware Probabilistic Knowledge Graphs

    The emergence of open information extraction as a tool for constructing and expanding knowledge graphs has aided the growth of temporal data, as found in YAGO, NELL, and Wikidata. While YAGO and Wikidata maintain the valid time of facts, NELL records the time point at which a fact was retrieved from some Web corpus. Collectively, these knowledge graphs (KGs) store facts extracted from Wikipedia and other sources. Due to the imprecise nature of the extraction tools used to build and expand KGs such as NELL, the facts in the KG are weighted with a confidence value representing the likelihood that the fact is correct. Additionally, NELL can be considered a transaction-time KG because every fact is associated with its extraction date. YAGO and Wikidata, on the other hand, follow the valid-time model because they maintain facts together with their validity time (temporal scope). In this paper, we propose a bitemporal model, combining the transaction-time and valid-time models, for maintaining and querying bitemporal probabilistic knowledge graphs. We study coalescing and the scalability of marginal and MAP inference. Moreover, we show that the complexity of reasoning tasks in atemporal probabilistic KGs carries over to the bitemporal setting. Finally, we report the evaluation results of the proposed model.
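    The bitemporal, weighted facts described above can be pictured with a minimal sketch. The schema and the sample facts below are illustrative assumptions, not the paper's actual data model: each fact carries a confidence value, a valid-time interval, and a transaction time, and a query filters on both time dimensions.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Fact:
        subject: str
        predicate: str
        obj: str
        confidence: float   # extraction confidence in [0, 1]
        valid_from: int     # valid-time interval (years, inclusive)
        valid_to: int
        tx_time: int        # transaction time: when the fact was recorded

    def query(kg, t_valid, t_tx):
        """Facts whose valid interval covers t_valid and that were
        already recorded in the KG at transaction time t_tx."""
        return [f for f in kg
                if f.valid_from <= t_valid <= f.valid_to and f.tx_time <= t_tx]

    kg = [
        Fact("Obama", "presidentOf", "USA", 0.98, 2009, 2017, 2015),
        Fact("Merkel", "chancellorOf", "Germany", 0.95, 2005, 2021, 2019),
    ]

    # Only the first fact was recorded by 2016 and valid in 2010.
    hits = query(kg, t_valid=2010, t_tx=2016)
    ```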

    Toward a unified TreeTalker data curation process

    The Internet of Things (IoT) development is revolutionizing environmental monitoring and research in macroecology. This technology allows for the deployment of sizeable diffuse sensing networks capable of continuous monitoring. Because of this property, the data collected from IoT networks can provide a testbed for scientific hypotheses across large spatial and temporal scales. Nevertheless, data curation is a necessary step to make large and heterogeneous datasets exploitable for synthesis analyses. This process includes data retrieval, quality assurance, standardized formatting, storage, and documentation. TreeTalkers are an excellent example of IoT applied to ecology. These are smart devices for synchronously measuring trees’ physiological and environmental parameters. A set of devices can be organized in a mesh and permit data collection from a single tree to the plot or transect scale. The deployment of such devices over large-scale networks needs a standardized approach for data curation. For this reason, we developed a unified processing workflow based on the device's user manual. In this paper, we first introduce the concept of a unified TreeTalker data curation process. The idea was formalized into an R package, which is freely available as open software. Secondly, we present the different functions available in “ttalkR”, and, lastly, we illustrate the application with a demonstration dataset. With such a unified processing approach, we propose a necessary data curation step to establish a new environmental cyberinfrastructure and allow for synthesis activities across environmental monitoring networks. Our data curation concept is the first step toward supporting the TreeTalker data life cycle by improving accessibility and thus creating unprecedented opportunities for TreeTalker-based macroecological analyses.
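    The curation steps named above (quality assurance and standardized formatting) can be sketched generically. The field names, error code, and plausibility range below are hypothetical illustrations, not the actual TreeTalker/ttalkR schema, and the sketch is in Python rather than the package's R:

    ```python
    from datetime import datetime, timezone

    # Hypothetical raw records from an IoT sensing device.
    raw = [
        {"ts": "2021-06-01 12:00:00", "stem_temp_c": 18.4},
        {"ts": "2021-06-01 12:15:00", "stem_temp_c": -999.0},  # sensor error code
        {"ts": "2021-06-01 12:30:00", "stem_temp_c": 19.1},
    ]

    def curate(records, lo=-40.0, hi=60.0):
        """Quality assurance + standardized formatting:
        drop values outside a plausible physical range and
        normalize timestamps to UTC ISO 8601."""
        clean = []
        for r in records:
            if not (lo <= r["stem_temp_c"] <= hi):
                continue  # out-of-range value: drop (or flag) it
            ts = datetime.strptime(r["ts"], "%Y-%m-%d %H:%M:%S")
            ts = ts.replace(tzinfo=timezone.utc)
            clean.append({"timestamp_utc": ts.isoformat(),
                          "stem_temp_c": r["stem_temp_c"]})
        return clean

    curated = curate(raw)   # the -999.0 error code is removed
    ```

    In a real network the same routine would run identically on every device's output, which is what makes the curated data comparable across sites.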

    Microsimulation - A Survey of Methods and Applications for Analyzing Economic and Social Policy

    The essential dimensions of microsimulation as an instrument to analyze and forecast the individual impacts of alternative economic and social policy measures are surveyed in this study. The basic principles of microsimulation, which is a tool for practical policy advising as well as for research and teaching, are pointed out, and the static and dynamic (cross-section and life-cycle) approaches are compared to one another. Present and past developments of microsimulation models and their areas of application are reviewed, focusing on the US, Europe and Australia. Based on the general requirements and components of microsimulation models, a microsimulation model's actual working mechanism is discussed by way of a concrete example: the concept and realization of MICSIM, a PC microsimulation model based on a relational database system and an offspring of the Sfb 3 Static Microsimulation Model. Common issues of microsimulation modeling are considered: the micro/macro link, behavioural response, and the important question of evaluating microsimulation results. The concluding remarks accentuate the increasing use of microcomputers for microsimulation models, also for teaching purposes.
    Keywords: Microsimulation, Microanalytic Simulation Models, Microanalysis, Economic and Social Policy Analysis
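    The static approach described above boils down to applying alternative policy rules to the same micro data and aggregating with survey weights (the micro/macro link). The sketch below is a toy illustration of that principle, not MICSIM itself; the tax rules and household records are invented:

    ```python
    # Each micro unit: (gross income, survey weight).
    households = [
        (20_000, 1.5),
        (50_000, 1.0),
        (120_000, 0.5),
    ]

    def tax_current(y):
        """Status quo: flat 20 % tax."""
        return 0.20 * y

    def tax_reform(y):
        """Reform scenario: 10 % up to 30k, 30 % above."""
        return 0.10 * min(y, 30_000) + 0.30 * max(y - 30_000, 0)

    def aggregate(rule):
        """Weighted sum over micro units -- the micro/macro link."""
        return sum(w * rule(y) for y, w in households)

    # Revenue effect of the reform relative to the status quo.
    delta = aggregate(tax_reform) - aggregate(tax_current)
    ```

    A dynamic model would additionally age the micro units over time (births, deaths, labour-market transitions) before applying the policy rules.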

    Evolution and Ethics of Digital Technology in Marketing

    The purpose of this paper is to identify the implications of ethics in the evolution of technology. This article aims to define and analyze concepts such as Big Data, Artificial Intelligence, and Bioinformatics. It also presents the applicability of Artificial Intelligence, explores the future trend of jobs in the coming years, and highlights the importance of adapting to change and learning additional skills that will help support future jobs.

    Strategies for reducing risk in patent applications' analysis

    Patents are a unique and exhaustive source of technological knowledge. The technical information they contain cannot be obtained in other ways, such as market and economic analysis or voice-of-customer studies. Patent databases also offer high accessibility (since they are freely available on the web) and a high level of format uniformity, which lets patent data be searched electronically, individually or in bulk. These special features make patents a strategic source for supporting CEOs in decision-making activities. At present, the worldwide patent database contains over 100 million documents, and over the last decades the number of patent applications filed per year has risen globally. This growth in patent activity can be understood as an effect of the economy's shift towards the knowledge-based economy paradigm, in which the outcomes generated by knowledge, such as patents, are business products or productive assets that can be exploited as economic goods. In such a framework, it is crucial for patent owners to know the value of the patents they hold in order to adopt the best exploitation strategy. For those who work in the patent environment, the main difficulty relates to the proceeding to which a patent application is subjected, which is generally long and complex. Filing the application is only the first step in an ‘obstacle course’: dozens of events and scenarios can affect the likelihood that the application reaches grant, some of which may cause the application to fail outright. The first effect of this contest is a lack of certainty and the need to adopt working strategies and assessment criteria that take the risk into account. The surge of patent filings has drastically increased the uncertainty of the patent literature, and the tools and methods currently available to patent experts are not designed to manage the risk arising from this uncertain scenario.
Corporate IP offices, patent valuation experts at banks, and other experts in the field must take on this risk and manage it through their own professional expertise: a difficult job, and the one this work addresses. Despite the high relevance and practical consequences of the uncertainty and risk related to the procedural aspects of patent applications, only a few works have paid attention to them, and they offer no tools or methods to prevent or assess the level of uncertainty in patent proceedings, nor to support applicants carrying out patent analyses in the presence of a high share of pending applications. This thesis is a full immersion in the uncertainty of the patent application environment: from the coarsest errors anyone might make, to suggestions about the most up-to-date sources of information, tools, and strategies available to limit the uncertainty risk, up to an analytical system to compute the impact of procedural events on the success likelihood of a patent application. It is a journey into the complex world of patents from an uncommon point of view that can give useful insight to anyone working in the field. Chapter 1 presents an overview of the currently available valuation methods for patents and their limitations in dealing with the uncertainty due to patent applications. Chapter 2 is an in-depth discussion of the transformations the text of a patent application may undergo during the PCT and EPC proceedings. Chapter 3 expounds a wide analysis carried out on the EP patent register to build an infographic of the success rate of EP applications in grant and post-grant proceedings. Chapter 4 gives operative indications for building business intelligence to assess the background in which a patent application is positioned. Finally, Chapter 5 deals with the extraction of information about the market structure from patent data.
In the presence of a patent thicket, dominant positions of the main incumbent competitors might hinder new entrants' access to the market.
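    The idea of computing the impact of procedural events on an application's success likelihood can be sketched by treating the proceeding as a chain of hurdles whose per-stage success rates combine into an overall grant probability. The stage names and rates below are hypothetical illustrations, not figures from the EP register analysis, and the independence assumption is a simplification:

    ```python
    # Hypothetical per-stage success rates along a patent proceeding.
    stages = {
        "search_report":      0.97,
        "examination":        0.80,
        "applicant_response": 0.90,
        "grant_decision":     0.85,
    }

    def grant_likelihood(rates):
        """Probability the application survives every stage,
        assuming independent stage outcomes."""
        p = 1.0
        for r in rates.values():
            p *= r
        return p

    p = grant_likelihood(stages)
    ```

    Under this simple model, updating a single stage's rate after a procedural event (e.g. an adverse examination report) immediately propagates to the overall likelihood, which is the kind of risk-aware assessment the thesis argues current tools lack.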
