Campus Event Calendar

Event Entry

What and Who

Approaches for Word Representations

Vinh Thinh Ho
International Max Planck Research School for Computer Science - IMPRS
IMPRS Research Seminar

IMPRS-CS Master student
AG 1, AG 2, AG 3, AG 4, AG 5, SWS, RG1, MMCI  
Public Audience
English

Date, Time and Location

Monday, 23 October 2017
12:00
45 Minutes
E1 4
022
Saarbrücken

Abstract

Understanding the meaning of words is considered an important task in natural language processing. It can then be applied in a large number of
fields such as speech recognition, information retrieval, machine translation, or question answering. In this talk, we will go through several
approaches to word representation, including:

• Learning distributed representations of words. The main idea is to represent each word as a vector of features that makes it possible to group
similar words, distinguish different words, and support further downstream tasks. These vectors are typically learned using neural networks. We will
go through some common neural network models used to learn these feature vectors.
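The core idea above can be illustrated with a minimal sketch: once words are mapped to feature vectors, similar words end up close together under a similarity measure such as cosine similarity. The vectors below are hand-picked toy values for illustration only; in practice they would be learned by a neural network.

```python
import math

# Hypothetical distributed representations: each word is a dense feature
# vector (hand-picked here; in practice learned from data).
vectors = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.82, 0.15],
    "apple": [0.10, 0.20, 0.90],
}

def cosine(u, v):
    """Cosine similarity: near 1.0 for vectors pointing the same way."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

sim_royal = cosine(vectors["king"], vectors["queen"])
sim_fruit = cosine(vectors["king"], vectors["apple"])
# Similar words ("king", "queen") score higher than unrelated ones.
assert sim_royal > sim_fruit
```

This grouping property is what makes the vectors useful for the downstream tasks mentioned above.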

• We will also focus on the usage of the Skip-gram model for learning distributed representations of words: how the model works and some methods
used to improve its performance. We then discuss two extensions: speeding up the learning phase of the Skip-gram model by sub-sampling frequent
words, and helping the model learn phrase vectors by merging words based on a simple data-driven approach.
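The sub-sampling extension can be sketched as follows. Each occurrence of a word w is discarded with probability 1 - sqrt(t / f(w)), where f(w) is the word's relative frequency and t is a threshold; the corpus, threshold, and seed below are toy values chosen for illustration.

```python
import math
import random

def subsample(tokens, t=1e-3, seed=0):
    """Randomly drop occurrences of frequent words, as in Skip-gram
    sub-sampling: P(discard w) = 1 - sqrt(t / f(w))."""
    rng = random.Random(seed)
    total = len(tokens)
    freq = {}
    for w in tokens:
        freq[w] = freq.get(w, 0) + 1
    kept = []
    for w in tokens:
        f = freq[w] / total
        p_discard = max(0.0, 1.0 - math.sqrt(t / f))
        if rng.random() >= p_discard:
            kept.append(w)
    return kept

# Toy corpus where "the" makes up 90% of all tokens.
corpus = ["the"] * 900 + ["cat"] * 50 + ["sat"] * 50
kept = subsample(corpus)
# After sub-sampling, "the" occupies a much smaller share of the corpus,
# so training spends less time on uninformative frequent words.
```

Because frequent words contribute many redundant training pairs, pruning them this way speeds up learning while barely affecting the quality of the remaining context windows.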

• A brief discussion of a frequency-based approach to representing words, called distributional representation. This representation is efficient for
measuring the similarity between individual words.
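A minimal sketch of such a frequency-based representation, assuming a simple fixed-size context window: each word is represented by the counts of words co-occurring with it, and words appearing in similar contexts get similar count vectors. The corpus and window size are toy choices for illustration.

```python
import math
from collections import Counter

def cooccurrence(tokens, window=2):
    """Build count vectors: for each word, count the words appearing
    within `window` positions of it anywhere in the corpus."""
    vecs = {}
    for i, w in enumerate(tokens):
        ctx = vecs.setdefault(w, Counter())
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                ctx[tokens[j]] += 1
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in set(u) | set(v))
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

corpus = "the cat sat on the mat the dog sat on the rug".split()
vecs = cooccurrence(corpus)
# "cat" and "dog" occur in near-identical contexts, so their count
# vectors are more similar to each other than "cat" is to "on".
assert cosine(vecs["cat"], vecs["dog"]) > cosine(vecs["cat"], vecs["on"])
```

Because similarity reduces to comparing count vectors, this representation is cheap to build and query, which is why it works well for measuring similarity between individual words.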

Contact

IMPRS-CS office
0681 93251800

Stephanie Jörg, 10/20/2017 12:00 -- Created document.