How to Represent Meaning in Natural Language Processing? Word, Sense and Contextualized Embeddings

Figure: An illustration of the meaning conflation deficiency in a 2D semantic space around the ambiguous word mouse (dimensionality reduced with PCA; visualized with the TensorFlow embedding projector).
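The conflation is easy to reproduce with any static word embedding model, which assigns a single vector per word form regardless of sense. Below is a minimal sketch assuming gensim's downloader API; "glove-wiki-gigaword-50" is an illustrative small model, not the one behind the figure:

```python
# Minimal sketch of the meaning conflation deficiency: one static
# vector per word form, so all senses of "mouse" are merged.
# Assumes gensim and scikit-learn are installed; the model name is
# an illustrative choice.
import gensim.downloader as api
from sklearn.decomposition import PCA

vectors = api.load("glove-wiki-gigaword-50")  # static word embeddings

# Nearest neighbours of the single "mouse" vector typically mix
# rodent-related and computer-device-related words.
for word, score in vectors.most_similar("mouse", topn=8):
    print(f"{word:12s} {score:.2f}")

# A 2D projection like the figure's can be computed the same way.
words = ["mouse", "rat", "hamster", "keyboard", "cursor", "screen"]
coords = PCA(n_components=2).fit_transform([vectors[w] for w in words])
```

Sense representation techniques address this deficiency by modelling individual senses rather than word forms, and they fall into two broad families: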
  • Unsupervised models [5,8,9] induce word senses directly from text corpora, without relying on external supervision.
  • Knowledge-based techniques [6,10,11] exploit the sense inventories of lexical knowledge resources (e.g., WordNet or BabelNet) as their main source for representing meanings; see the sketch after this list.
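As a concrete example of such a sense inventory, WordNet enumerates the discrete senses of mouse that a knowledge-based model can attach a separate embedding to. A minimal sketch, assuming NLTK is installed and the WordNet corpus has been fetched with nltk.download("wordnet"):

```python
# Minimal sketch of consulting a sense inventory.
# Assumes NLTK with the WordNet corpus already downloaded.
from nltk.corpus import wordnet as wn

# Each synset is one discrete sense to which a knowledge-based
# model can attach its own embedding.
for synset in wn.synsets("mouse"):
    print(synset.name(), "-", synset.definition())
```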
Figure: A general illustration of contextualized word embeddings and how they are integrated into NLP models. A language modelling component analyzes the context of the target word (cell in the figure) and generates its dynamic embedding.
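In code, this pipeline amounts to feeding the full sentence through a pretrained language model and reading off the hidden state at the target word's position. A minimal sketch with the Hugging Face transformers library, where "bert-base-uncased" is an illustrative model choice rather than one prescribed here:

```python
# Minimal sketch of extracting a contextualized embedding.
# Assumes transformers and PyTorch are installed.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed_word(sentence: str, word: str) -> torch.Tensor:
    """Return the hidden state at the target word's (first) position."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, dim)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

bio = embed_word("the cell divides during mitosis", "cell")
jail = embed_word("the prisoner paced around his cell", "cell")
print(torch.cosine_similarity(bio, jail, dim=0).item())
```

A static embedding would return the identical vector for cell in both sentences; here the two vectors differ because the language model has read the surrounding context before producing the embedding.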

Jose Camacho Collados: Mathematician, AI/NLP researcher and chess International Master. http://www.josecamachocollados.com