
Heterogeneous Graph Attention Network

Xiao Wang1, Houye Ji1, Chuan Shi1, Bai Wang1, Peng Cui2, P. Yu2, Yanfang Ye3
1Beijing University of Posts and Telecommunications

2Tsinghua University

3West Virginia University


CONTENTS

1 Background

2 HAN
3 Experiments

4 Conclusions
1 Background: Graph-structured Data

• Graph-structured data are ubiquitous.
• Graph-structured data flexibly model complex interactions.
1 Background: Graph Neural Network

• Neural networks for processing graph-structured inputs.
• Flexible enough to characterize non-Euclidean data.
• Examples: graph convolutional network (GCN) and graph attention network (GAT).

[Figure: message passing in a GNN. The target node receives its neighbors' messages from the input graph, then a neural network is applied to the aggregated messages.]
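The "receive neighbors' messages, then apply a neural network" idea can be sketched in a few lines of NumPy. This is a hypothetical toy layer for illustration, not the actual GCN or GAT implementation: each node averages its neighbors' features, then a shared linear projection and ReLU are applied.

```python
import numpy as np

def gnn_layer(adj, feats, weight):
    """One toy propagation step: average neighbor messages, then transform.

    adj:    (N, N) adjacency matrix with self-loops
    feats:  (N, F) node features
    weight: (F, F2) shared linear projection
    """
    deg = adj.sum(axis=1, keepdims=True)   # node degrees (nonzero: self-loops)
    agg = (adj @ feats) / deg              # "receive neighbors' messages"
    return np.maximum(agg @ weight, 0.0)   # "apply neural network" (ReLU)

# Toy triangle graph with 2-d features (assumed data).
adj = np.ones((3, 3))                      # fully connected, with self-loops
feats = np.eye(3, 2)
weight = np.ones((2, 2))
out = gnn_layer(adj, feats, weight)
print(out.shape)  # (3, 2)
```

Stacking several such layers lets messages propagate over multi-hop neighborhoods, which is the behavior the figure above depicts.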
1 Background: Heterogeneous Graph

• Multiple types of nodes or links.
• Rich semantic information.
• Meta-path: a relation sequence connecting two objects (e.g., Movie-Actor-Movie).

Movie-Director-Movie: two movies directed by the same director. Movie-Actor-Movie: two movies starred by the same actor.
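As a concrete illustration of meta-path based neighbors (with a toy incidence matrix assumed here, not data from the paper), multiplying type-wise adjacency matrices yields the neighbors under a meta-path: for Movie-Actor-Movie, two movies are neighbors exactly when they share at least one actor.

```python
import numpy as np

# Toy movie-actor incidence matrix (assumed for illustration):
# 3 movies, 2 actors.
A_MA = np.array([
    [1, 0],     # movie 0 stars actor 0
    [1, 1],     # movie 1 stars actors 0 and 1
    [0, 1],     # movie 2 stars actor 1
])

# Movie-Actor-Movie neighbors: movies connected through a shared actor.
N_MAM = (A_MA @ A_MA.T) > 0
print(N_MAM.astype(int))
```

Here movies 0 and 1 share actor 0 and movies 1 and 2 share actor 1, so they are meta-path based neighbors, while movies 0 and 2 are not.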
1 Background: Motivation

Existing graph neural networks focus on homogeneous graphs:

• Heterogeneous structure: they cannot handle multiple types of nodes and links.
• Rich semantics: they cannot capture the various semantic relations.
1 Background: Challenges

• How to handle the heterogeneity of the graph?
• How to discover the differences among meta-path based neighbors?
• How to find meaningful meta-paths?
2 HAN: Overall Framework

Heterogeneous Graph Attention Network (HAN):

• Model the heterogeneous structure.
• Capture rich semantics.
• Optimize a task-specific loss.
2 HAN: Node-Level Attention and Aggregation

• Type-specific transformation: project each node's features with a type-specific transformation matrix $M_{\phi_i}$: $h'_i = M_{\phi_i} \cdot h_i$.

• Importance of neighbors: for a meta-path $\Phi$, a node-level attention vector $a_{\Phi}$ scores each meta-path based neighbor $j$, and softmax normalization gives the node weight: $\alpha^{\Phi}_{ij} = \mathrm{softmax}_j\left(\sigma\left(a_{\Phi}^{\top} \cdot [h'_i \,\|\, h'_j]\right)\right)$.

• Node-level aggregation: the meta-path specific embedding of node $i$ aggregates its neighbors' projected features: $z^{\Phi}_i = \sigma\left(\sum_{j \in \mathcal{N}^{\Phi}_i} \alpha^{\Phi}_{ij} \cdot h'_j\right)$.
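The node-level step can be sketched in NumPy for a single node under a single meta-path. This is a minimal illustration with random toy tensors and assumed shapes; the actual HAN uses learned parameters and multi-head attention.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())      # shift for numerical stability
    return e / e.sum()

def node_level_attention(h_i, neighbors, M, a):
    """h_i: (F,) target features; neighbors: (K, F) meta-path based
    neighbors; M: (F2, F) type-specific projection; a: (2*F2,) attention vector."""
    hp_i = M @ h_i                                     # type-specific projection
    hp_n = neighbors @ M.T                             # project the neighbors
    scores = np.array([np.tanh(a @ np.concatenate([hp_i, hp_j]))
                       for hp_j in hp_n])              # importance of each neighbor
    alpha = softmax(scores)                            # node weights
    return alpha, np.tanh(alpha @ hp_n)                # aggregated embedding z_i

rng = np.random.default_rng(0)                         # toy random inputs
alpha, z = node_level_attention(rng.normal(size=4),
                                rng.normal(size=(3, 4)),
                                rng.normal(size=(4, 4)),
                                rng.normal(size=8))
print(alpha)   # one weight per neighbor, summing to 1
```

Running this per meta-path produces one meta-path specific embedding per node, which the semantic-level attention then fuses.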
2 HAN: Semantic-Level Attention and Aggregation

• Importance of meta-paths: each meta-path $\Phi_p$ is scored with a semantic-level attention vector $q$: $w_{\Phi_p} = \frac{1}{|\mathcal{V}|} \sum_{i \in \mathcal{V}} q^{\top} \cdot \tanh\left(W \cdot z^{\Phi_p}_i + b\right)$.

• The semantic weight of each meta-path is its softmax-normalized score: $\beta_{\Phi_p} = \frac{\exp(w_{\Phi_p})}{\sum_{p=1}^{P} \exp(w_{\Phi_p})}$.

• Semantic-level aggregation: the final embedding fuses the meta-path specific embeddings: $Z = \sum_{p=1}^{P} \beta_{\Phi_p} \cdot Z_{\Phi_p}$.
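The semantic-level step can be sketched the same way: score each meta-path by averaging a transformed embedding over all nodes, softmax the scores into semantic weights, then take the weighted sum of the per-meta-path embeddings. All tensors below are random toy inputs with assumed shapes.

```python
import numpy as np

def semantic_level_attention(Z_list, W, b, q):
    """Z_list: per-meta-path embeddings, each (N, F); W: (F, F); b, q: (F,)."""
    # Score each meta-path: mean over nodes of q^T tanh(W z + b).
    w = np.array([np.mean(np.tanh(Z @ W.T + b) @ q) for Z in Z_list])
    beta = np.exp(w - w.max())
    beta = beta / beta.sum()                       # semantic weights (softmax)
    Z_final = sum(b_p * Z_p for b_p, Z_p in zip(beta, Z_list))
    return beta, Z_final

rng = np.random.default_rng(1)                     # toy random inputs
Z_list = [rng.normal(size=(5, 4)) for _ in range(3)]   # 3 meta-paths, 5 nodes
beta, Z_final = semantic_level_attention(
    Z_list, rng.normal(size=(4, 4)), rng.normal(size=4), rng.normal(size=4))
print(beta)   # one weight per meta-path, summing to 1
```

Because the weights are learned jointly with the task loss, more informative meta-paths end up with larger semantic weights.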
2 HAN: Prediction

• Semi-supervised loss: with $C$ the parameter of the classifier and $\mathcal{Y}_L$ the set of labeled data, optimize cross-entropy over the labeled nodes for the specific task (e.g., node classification): $L = -\sum_{l \in \mathcal{Y}_L} Y^l \ln\left(C \cdot Z^l\right)$.
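The objective amounts to a standard cross-entropy computed only on the labeled nodes, with a linear classifier on top of the fused embeddings. A hypothetical NumPy version with toy random data (the paper trains this end to end with backpropagation):

```python
import numpy as np

def semi_supervised_loss(Z, C, labels, labeled_idx):
    """Z: (N, F) final embeddings; C: (F, n_classes); labels: (N,) ints."""
    logits = Z[labeled_idx] @ C                           # classifier on Z
    logits = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    picked = probs[np.arange(len(labeled_idx)), labels[labeled_idx]]
    return -np.mean(np.log(picked))            # cross-entropy on labeled data

rng = np.random.default_rng(2)                 # toy random inputs
Z = rng.normal(size=(6, 4))                    # 6 nodes, 4-d embeddings
C = rng.normal(size=(4, 3))                    # 3 classes
labels = rng.integers(0, 3, size=6)
loss = semi_supervised_loss(Z, C, labels, np.arange(4))   # only 4 labeled nodes
print(loss)   # a positive scalar
```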
3 Experiments: Datasets, Baselines & Tasks

Baselines: DeepWalk, ESim, metapath2vec, HERec, GCN, GAT, HAN_nd, HAN_sem.
Tasks: node classification, node clustering, analysis of the attention mechanism, visualization.
3 Experiments: Node Classification
3 Experiments: Node Clustering & Visualization
3 Experiments: Attention Analysis

• Node-level attention (e.g., paper P831¹): P831 > P699 > … > P2328 > P1973. Important neighbors have larger attention values.

• Semantic-level attention: APCPA > APA > APTPA and PAP > PSP. Important meta-paths have larger attention values.

¹ Xintao Wu et al. "Screening and Interpreting Multi-item Associations Based on Log-linear Modeling." KDD 2003.
4 Conclusions

• The first attempt to study heterogeneous graph neural networks based on the attention mechanism.

• A novel heterogeneous graph attention network (HAN) that includes both node-level and semantic-level attention.

• State-of-the-art performance and good interpretability.
Thank you!
Q&A

More materials: http://shichuan.org
Code: https://github.com/Jhy1993/HAN
