Campus Event Calendar

Event Entry

What and Who

Latent Variable Models in Dialogue Generation

Xiaoyu Shen
MMCI
PhD Application Talk

Graduate Student Informatics, UdS
AG 1, AG 2, AG 3, AG 4, AG 5, SWS, RG1, MMCI  
Public Audience
English

Date, Time and Location

Tuesday, 10 October 2017
09:30
60 Minutes
E1 4
R024
Saarbrücken

Abstract

Open-domain dialogue generation is an important research area that has drawn increasing attention in recent years. The availability of large conversational corpora and the popularity of seq2seq models now make it possible to build data-driven, end-to-end trainable dialogue systems without verbose handcrafted rules. However, vanilla seq2seq models capture stochastic variation only at the token level, tempting the system to chase immediate short-term rewards and neglect long-term structure. One way of attenuating this problem is to introduce “latent variables”, high-level sentence representations that help guide the generation process. This talk explains the application of latent variable models to dialogue generation and two main challenges: the uninterpretability of latent variables and the difficulty of training. We propose strategies to address each of these challenges and validate their effectiveness.
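
As a rough illustration of the idea described above (not the speaker's actual model), the sketch below shows one common way a sentence-level latent variable can be added to a seq2seq dialogue model: the encoder summary is mapped to a Gaussian posterior, a latent z is sampled with the reparameterization trick, and the decoder is conditioned on z. All class names and hyperparameters here are assumptions chosen for the example.

    # Minimal sketch of a latent-variable seq2seq step (illustrative only).
    import torch
    import torch.nn as nn

    class LatentSeq2Seq(nn.Module):
        def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256, z_dim=64):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
            self.to_mu = nn.Linear(hid_dim, z_dim)        # posterior mean
            self.to_logvar = nn.Linear(hid_dim, z_dim)    # posterior log-variance
            self.z_to_hidden = nn.Linear(z_dim, hid_dim)  # initialize decoder from z
            self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
            self.out = nn.Linear(hid_dim, vocab_size)

        def forward(self, src, tgt_in):
            _, h = self.encoder(self.embed(src))          # h: (1, batch, hid_dim)
            mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
            # Reparameterization trick: sample a sentence-level latent z.
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
            h0 = torch.tanh(self.z_to_hidden(z)).unsqueeze(0)
            dec_out, _ = self.decoder(self.embed(tgt_in), h0)
            logits = self.out(dec_out)
            # KL(q(z|x) || N(0, I)), added to the reconstruction loss during training.
            kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1).mean()
            return logits, kl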

Contact

IMPRS Office Team
0681 93251800
Email hidden

Tags, Category, Keywords and additional notes

Stephanie Jörg, 10/09/2017 13:29 -- Created document.