
The Hierarchical Paradigm for Graph Embedding

Graph embedding is the problem of learning a mapping from each node in a graph to a low-dimensional latent representation. These representations can then be used as features for common tasks on graphs such as multi-label node classification, clustering, and link prediction. Most recent graph embedding methods are based on neural networks and have proven both highly scalable and performant. However, they all rely on a non-convex optimization solved with stochastic gradient descent, which can become stuck in a poor local minimum.
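
To make the feature-based use of embeddings concrete, here is a purely illustrative sketch (assuming Python with NumPy and scikit-learn): the embedding matrix is a random stand-in for the vectors a graph embedding method would actually produce, and the node labels are hypothetical.

    # Illustrative only: treating learned node embeddings as feature vectors
    # for node classification. The embeddings here are random stand-ins; in
    # practice they would come from a graph embedding method.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    num_nodes, dim = 100, 16
    embeddings = np.random.default_rng(0).normal(size=(num_nodes, dim))  # one vector per node
    labels = np.random.default_rng(1).integers(0, 2, size=num_nodes)     # hypothetical binary labels

    clf = LogisticRegression().fit(embeddings[:80], labels[:80])  # train on 80 labeled nodes
    print(clf.score(embeddings[80:], labels[80:]))                # accuracy on the remaining nodes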

In this talk, we will present a new method for learning low-dimensional embeddings that preserve higher-order structural features. Simplify and Embed achieves this by recursively compressing the input graph prior to embedding it, thereby avoiding troublesome embedding configurations (i.e., poor local minima) encountered during non-convex optimization. Experiments on various real-world graphs demonstrate that Simplify and Embed is a general strategy for improving all of the popular neural algorithms for embedding graphs.
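
As a rough illustration of the general coarsen-embed-refine idea only, and not the speaker's actual algorithm, the following Python sketch (assuming NetworkX and NumPy) repeatedly collapses a maximal matching of edges, embeds the coarsest graph with a toy neighborhood-averaging embedder, and propagates each coarse vector back to the nodes it represents as an initialization at the finer level.

    # Minimal sketch of a coarsen-embed-refine pipeline (illustrative, not the
    # method presented in the talk).
    import networkx as nx
    import numpy as np

    def coarsen(graph):
        # Collapse a maximal matching of edges; return the coarse graph and a
        # mapping from fine nodes to their coarse representatives.
        matching = nx.maximal_matching(graph)
        mapping = {}
        for u, v in matching:
            mapping[u] = u      # keep u as the representative
            mapping[v] = u      # merge v into u
        for node in graph.nodes():
            mapping.setdefault(node, node)
        coarse = nx.Graph()
        coarse.add_nodes_from(set(mapping.values()))
        for u, v in graph.edges():
            cu, cv = mapping[u], mapping[v]
            if cu != cv:
                coarse.add_edge(cu, cv)
        return coarse, mapping

    def embed(graph, dim=8, init=None, iters=50):
        # Toy embedder: start from `init` (or random vectors) and smooth each
        # vector toward the mean of its neighbors; a stand-in for any neural
        # embedding algorithm.
        rng = np.random.default_rng(0)
        emb = {n: (init[n] if init and n in init else rng.normal(size=dim))
               for n in graph.nodes()}
        for _ in range(iters):
            emb = {n: 0.5 * emb[n] + 0.5 * np.mean([emb[m] for m in graph[n]], axis=0)
                   if graph.degree(n) > 0 else emb[n]
                   for n in graph.nodes()}
        return emb

    def hierarchical_embed(graph, levels=3, dim=8):
        graphs, mappings = [graph], []
        for _ in range(levels):
            coarse, mapping = coarsen(graphs[-1])
            graphs.append(coarse)
            mappings.append(mapping)
        emb = embed(graphs[-1], dim=dim)             # embed the coarsest graph
        for g, mapping in zip(reversed(graphs[:-1]), reversed(mappings)):
            init = {node: emb[mapping[node]] for node in g.nodes()}
            emb = embed(g, dim=dim, init=init)       # refine on the finer graph
        return emb

    emb = hierarchical_embed(nx.karate_club_graph())
    print(len(emb), emb[0].shape)

The key design point the sketch tries to capture is that the coarse-level solution serves as an initialization for the finer level, so the optimization at full scale starts from a configuration that already reflects the graph's higher-order structure.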

Bio

Haochen is a PhD candidate in the Department of Computer Science at Stony Brook University, advised by Prof. Steven Skiena. Haochen's research interests lie at the intersection of Data Mining, Graph Theory, and Representation Learning. He is particularly interested in representation learning on graphs. Visit http://www3.cs.stonybrook.edu/~haocchen for more details.

Speaker

Haochen Chen

Date

Wednesday, February 22, 2017

Time

1:15 pm - 2:15 pm

Location

IACS Seminar Room