SNL2019 Coming Up Soon

SNL2019 is finally just two days away.

Preparations are going smoothly. The venue looks like it will fill up nicely,
but same-day registration is still possible.

Maximilian has slightly revised his talk title and content, and it seems he will
tell us something new about the relationship between associative memory and
embeddings of hierarchical structures.

Title: Representation Learning in Symbolic Domains

Abstract: Many domains such as natural language understanding,
information networks, bioinformatics, and the Web are characterized by
problems involving complex relational structures and large amounts
of uncertainty. Representation learning has become an invaluable approach
for making statistical inferences in this setting by allowing us to
learn high-quality models on a large scale. However, while complex
relational data often exhibits latent hierarchical structures, current
embedding methods do not account for this property. This leads not only
to inefficient representations but also to a reduced interpretability of
the embeddings.

In the first part of this talk, I will discuss methods for learning distributed
representations of relational data such as graphs and text. I will show how
these models are related to classic models of associative memory and that a simple
change in training procedure allows them to capture rule-like patterns on
relational data. In the second part of the talk, I will then introduce a novel
approach for learning hierarchical representations by embedding relations into
hyperbolic space. I will discuss how the underlying hyperbolic geometry allows
us to learn parsimonious representations which simultaneously capture hierarchy
and similarity. Furthermore, I will show that hyperbolic embeddings can
outperform Euclidean embeddings significantly on data with latent hierarchies,
both in terms of representation capacity and in terms of generalization ability.
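To give a feel for why hyperbolic geometry suits latent hierarchies, below is a minimal sketch of the distance function on the Poincaré ball, one common model of hyperbolic space (the specific formulation used in the talk may differ). Distances are nearly Euclidean near the origin but grow rapidly toward the boundary, which gives trees room to spread out: a node near the origin can act as a "root" that stays close to everything, while "sibling" points near the boundary end up far apart even when their Euclidean separation is tiny.

```python
import math

def poincare_distance(u, v):
    """Distance between two points u, v strictly inside the unit ball:

        d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))

    The denominator shrinks as points approach the boundary, so
    distances blow up there; this is what lets hyperbolic space embed
    tree-like data with low distortion.
    """
    sq = lambda x: sum(xi * xi for xi in x)
    diff = [ui - vi for ui, vi in zip(u, v)]
    return math.acosh(1.0 + 2.0 * sq(diff) / ((1.0 - sq(u)) * (1.0 - sq(v))))

# Illustrative (hypothetical) points: a root near the origin and two
# nearby "leaves" close to the boundary of the ball.
root = (0.0, 0.0)
leaf_a = (0.0, 0.95)
leaf_b = (0.05, 0.95)

# The two leaves are only 0.05 apart in Euclidean terms, yet their
# hyperbolic distance is amplified roughly twentyfold.
print(poincare_distance(root, leaf_a))
print(poincare_distance(leaf_a, leaf_b))
```

In an actual embedding method, these coordinates would be learned by Riemannian gradient descent so that related items end up close under this metric; the sketch only shows the geometry itself.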