Abstract

Word embeddings are substantially successful in capturing semantic relations among words. However, the lexical semantics they encode are difficult to interpret. Definition modeling provides a more intuitive way to evaluate embeddings by using them to generate natural-language definitions of the corresponding words. This task matters both for practical applications and for a deeper understanding of word representations. We propose a novel framework for definition modeling that generates reasonable and understandable context-dependent definitions. Moreover, we introduce usage modeling and study whether embeddings can be used to generate example sentences for words. Both tasks express an embedding's semantics more directly and explicitly, improving interpretability. We extend the single-task model to a multi-task setting and investigate several joint multi-task models that combine usage modeling and definition modeling. Experimental results on the existing Oxford dataset and a newly collected Oxford-2019 dataset show that our single-task model achieves state-of-the-art results in definition modeling, and that multi-task learning improves performance on both tasks.
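
To make the multi-task idea concrete, below is a minimal sketch (in PyTorch) of one plausible setup, assuming a shared embedding layer and context encoder with separate decoders for the two generation tasks. The class name MultiTaskDefUsageModel, the GRU choice, and all dimensions are illustrative assumptions; the abstract does not specify the authors' architecture.

    # Illustrative sketch only -- not the paper's architecture. A shared embedding
    # layer and context encoder feed two task-specific GRU decoders: one generates
    # definitions, the other usage examples. All names and sizes are assumptions.
    import torch
    import torch.nn as nn

    class MultiTaskDefUsageModel(nn.Module):
        def __init__(self, vocab_size, emb_dim=300, hid_dim=512):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, emb_dim)       # shared word embeddings
            self.context_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)
            self.word_to_hid = nn.Linear(emb_dim, hid_dim)
            self.def_dec = nn.GRU(emb_dim, hid_dim, batch_first=True)
            self.usage_dec = nn.GRU(emb_dim, hid_dim, batch_first=True)
            self.def_out = nn.Linear(hid_dim, vocab_size)
            self.usage_out = nn.Linear(hid_dim, vocab_size)

        def forward(self, word_ids, context_ids, target_ids, task):
            # Initial decoder state: target-word embedding fused with a context encoding.
            word_emb = self.embedding(word_ids)                       # (B, emb_dim)
            _, ctx_h = self.context_enc(self.embedding(context_ids))  # (1, B, hid_dim)
            h0 = torch.tanh(self.word_to_hid(word_emb)).unsqueeze(0) + ctx_h
            dec = self.def_dec if task == "definition" else self.usage_dec
            proj = self.def_out if task == "definition" else self.usage_out
            out, _ = dec(self.embedding(target_ids), h0)              # teacher forcing
            return proj(out)                                          # (B, T, vocab_size)

    # Toy usage: batches for the two tasks share the encoder and embeddings.
    model = MultiTaskDefUsageModel(vocab_size=10000)
    loss_fn = nn.CrossEntropyLoss()
    word = torch.randint(0, 10000, (4,))
    context = torch.randint(0, 10000, (4, 12))
    definition = torch.randint(0, 10000, (4, 15))
    logits = model(word, context, definition, task="definition")
    loss = loss_fn(logits.reshape(-1, 10000), definition.reshape(-1))

In a setup like this, multi-task learning amounts to alternating definition and usage batches while the embedding layer and context encoder receive gradients from both losses.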
