Campus Event Calendar

Event Entry

What and Who

Learning with Memory Embeddings

Volker Tresp
Siemens Research and LMU Munich
Talk
AG 1, AG 2, AG 3, AG 4, AG 5, SWS, RG1, MMCI  
Expert Audience
English

Date, Time and Location

Thursday, 14 April 2016
14:30
60 Minutes
E1 4
024
Saarbrücken

Abstract

Embedding learning, also known as representation learning, has been shown to model large-scale semantic knowledge graphs. A key concept is the mapping of the knowledge graph to a tensor representation whose entries are predicted by models using latent representations of generalized entities. Latent variable models are well suited to the high dimensionality and sparsity of typical knowledge graphs and have successfully been employed in knowledge-graph completion and in fact extraction from the Web. We have extended the approach to also consider temporal evolution, temporal patterns, and subsymbolic representations, which permits us to model medical decision processes. In addition, we consider embedding approaches a possible basis for modeling cognitive memory functions, in particular semantic and concept memory, episodic memory, sensory memory, short-term memory, and working memory.
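To illustrate the core idea of the abstract — scoring knowledge-graph triples from latent entity and relation embeddings — here is a minimal sketch of a DistMult-style bilinear model. This is one common embedding model chosen for illustration, not necessarily the speaker's exact method; the entity names, relation names, and dimensionality are hypothetical.

```python
import numpy as np

# Hypothetical toy setup: random latent vectors for a few entities
# and relations (in practice these would be learned from the graph).
rng = np.random.default_rng(0)
dim = 4  # latent dimensionality (illustrative)

entities = {name: rng.normal(size=dim)
            for name in ["Munich", "Germany", "Siemens"]}
relations = {name: rng.normal(size=dim)
             for name in ["locatedIn", "headquarteredIn"]}

def score(subject: str, relation: str, obj: str) -> float:
    """Bilinear (DistMult-style) score: sum_k e_s[k] * w_r[k] * e_o[k].
    Each tensor entry (s, r, o) of the knowledge graph is predicted
    from the latent representations; higher scores mean the triple
    is considered more plausible."""
    return float(np.sum(entities[subject]
                        * relations[relation]
                        * entities[obj]))

# Score a candidate triple, as one would in knowledge-graph completion:
print(score("Munich", "locatedIn", "Germany"))
```

Because the elementwise product commutes, this particular scoring function is symmetric in subject and object; richer models in the literature (e.g., with full relation matrices) remove that restriction.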

Contact

Petra Schaaf
5000

Petra Schaaf, 04/13/2016 10:00
Petra Schaaf, 04/13/2016 09:58 -- Created document.