CaltechAUTHORS: A Caltech Library Service

Multi Sense Embeddings from Topic Models

Jain, Shobhit and Bodapati, Sravan Babu and Nallapati, Ramesh and Anandkumar, Anima (2019) Multi Sense Embeddings from Topic Models. . (Unpublished)

PDF (3 Feb 2020) - Accepted Version. Creative Commons Attribution.


Distributed word embeddings have yielded state-of-the-art performance on many NLP tasks, largely because they capture useful semantic information. However, these representations assign only a single vector to each word, even though many words are polysemous (i.e., have multiple meanings). In this work, we address this critical problem in lexical semantics: representing the various senses of polysemous words in vector spaces. We propose a topic-modeling-based skip-gram approach for learning multi-prototype word embeddings. We also introduce a method to prune the embeddings, guided by the probabilistic representation of each word in each topic. We show that our embeddings capture context and word similarity strongly and outperform various state-of-the-art implementations.
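The record does not include code, but the two ideas the abstract names can be sketched. In a minimal, hypothetical reading, each token is annotated with a topic ID so that a standard skip-gram trainer learns one vector per (word, topic) pair, and sense vectors are then pruned by the empirical probability p(topic | word). All function names, the `topic_of` callback, and the threshold below are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of topic-based multi-sense embeddings with pruning.
# None of these names come from the paper; they only illustrate the idea.
from collections import defaultdict

def annotate_corpus(corpus, topic_of):
    """Replace each token w with a sense token "w#k", where k = topic_of(doc_idx, w).
    The annotated corpus can be fed to any standard skip-gram trainer, which
    then learns one vector per (word, topic) pair instead of one per word."""
    return [[f"{w}#{topic_of(d, w)}" for w in doc] for d, doc in enumerate(corpus)]

def sense_probs(annotated):
    """Empirical p(topic | word), estimated from sense-token counts."""
    counts = defaultdict(lambda: defaultdict(int))
    for doc in annotated:
        for tok in doc:
            w, k = tok.rsplit("#", 1)
            counts[w][k] += 1
    return {w: {k: c / sum(by_topic.values()) for k, c in by_topic.items()}
            for w, by_topic in counts.items()}

def prune_senses(embeddings, probs, threshold=0.1):
    """Drop sense vectors whose topic probability for that word falls below
    the threshold, keeping only the well-supported senses."""
    kept = {}
    for tok, vec in embeddings.items():
        w, k = tok.rsplit("#", 1)
        if probs.get(w, {}).get(k, 0.0) >= threshold:
            kept[tok] = vec
    return kept
```

For example, with a two-document toy corpus where the topic is just the document index, "bank" receives two sense tokens (`bank#0`, `bank#1`), each with probability 0.5, while "river" and "money" each keep a single sense with probability 1.0; a pruning threshold of 0.6 would then discard both low-confidence "bank" senses.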

Item Type: Report or Paper (Discussion Paper)
Additional Information: Accepted at an ACL-supported conference for Natural Language & Speech Processing (this https URL), 2019.
Record Number: CaltechAUTHORS:20200109-091458616
Persistent URL:
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 100581
Deposited By: Tony Diaz
Deposited On: 09 Jan 2020 18:16
Last Modified: 10 Dec 2020 22:20
