
    Deep Learning Services for Patents

    Most word embedding techniques provide only one vector representation for each word in a text corpus, even though a single word can have multiple meanings. In this paper, we developed a domain-specific word and phrase embedding model for the patent domain that treats patent phrases as single information units. Natural language processing techniques are used to extract meaningful terms from five million patent documents, and a word embedding algorithm generates semantic representations of those terms. The model can be used for a wide range of tasks, such as search query expansion, patent semantic similarity search, and enrichment, and it can support other patent text mining tasks such as patent technology categorization and knowledge discovery.
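    The abstract does not name the toolkit used; the sketch below assumes gensim's Phrases and Word2Vec as stand-ins for the two steps it describes: detecting multi-word patent phrases so they are treated as single information units, then training embeddings over the phrased corpus. The corpus, phrase "lithium_ion", and all parameter values are illustrative assumptions, not details from the paper.

    ```python
    # Minimal sketch of a phrase-aware patent embedding pipeline (gensim 4.x),
    # assuming gensim's Phrases + Word2Vec in place of the unnamed tools in the paper.

    from gensim.models import Word2Vec
    from gensim.models.phrases import Phrases, Phraser
    from gensim.utils import simple_preprocess

    # Hypothetical corpus: each line stands in for one tokenized patent sentence.
    patent_sentences = [
        simple_preprocess(line)
        for line in [
            "a lithium ion battery comprising a positive electrode",
            "the neural network is trained on patent documents",
            "a lithium ion battery with improved energy density",
        ]
    ]

    # Step 1: detect frequent multi-word phrases (e.g. "lithium_ion") so they are
    # handled as single information units, as the abstract describes.
    phrase_model = Phraser(Phrases(patent_sentences, min_count=1, threshold=1))
    phrased_sentences = [phrase_model[sent] for sent in patent_sentences]

    # Step 2: train word/phrase embeddings over the phrased corpus.
    embeddings = Word2Vec(
        phrased_sentences,
        vector_size=100,   # gensim 4.x keyword (was `size` in 3.x)
        window=5,
        min_count=1,
        workers=4,
    )

    # Example use: nearest neighbours of a phrase support query expansion
    # and semantic similarity search over patents.
    if "lithium_ion" in embeddings.wv:
        print(embeddings.wv.most_similar("lithium_ion", topn=3))
    ```

    At realistic scale the phrase detector and embedding model would be trained over the full patent corpus with much higher min_count and threshold values; the toy settings here only keep the example self-contained.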