
Graph-Based Dependency Parsing

Deniz Hacısüleyman, Ahmet Reşat Soysal


Advisor: Deniz Yüret
Department of Computer Engineering
Koç University

The main goal is to build a model that extracts the dependency grammar of a given sentence. Dependency grammar can be used in complex applications such as machine translation and spell checking.

• Learning: Train a working parsing model.
• Parsing: Construct the optimal dependency tree of a given sentence.

• Programming Language: Julia
• Deep Learning Library for Julia: Knet
• Working Environment:
• GPU Machine for Training:

Language Model (Bi-LSTM): Extracts hidden vectors from the sentence sequentially.
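
A minimal sketch of such an encoder in Julia with Knet; the sizes, the Encoder struct, and the encode function below are illustrative assumptions rather than the exact poster model.

    using Knet

    # Illustrative sizes; the actual dimensions are not given on the poster.
    vocabsize, embedsize, hiddensize = 10000, 100, 200

    # Learnable word embeddings plus a bidirectional LSTM encoder.
    struct Encoder; embed; lstm; end
    Encoder() = Encoder(param(embedsize, vocabsize),
                        RNN(embedsize, hiddensize; rnnType=:lstm, bidirectional=true))

    # words: vector of word ids for one sentence (length T).
    # Returns a (2*hiddensize, T) matrix of contextual hidden vectors.
    function encode(enc::Encoder, words::Vector{Int})
        x = enc.embed[:, words]                    # (embedsize, T) embedding lookup
        x = reshape(x, size(x, 1), 1, size(x, 2))  # (embedsize, batch=1, T) for the RNN
        h = enc.lstm(x)                            # (2*hiddensize, 1, T) forward + backward states
        return reshape(h, size(h, 1), size(h, 3))  # drop the batch dimension
    end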

Biaffine Decoder: Gives a score to every potential head of every word in the sentence, using MLPs and a biaffine attention mechanism.
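
Continuing the same sketch, one possible form of the arc scorer (names such as ArcScorer and arc_scores, and all sizes, are illustrative): two small MLPs produce head and dependent representations from the encoder states, and a biaffine form gives one score per (head, dependent) pair.

    # Scores every candidate head for every word from the Bi-LSTM states.
    mlpsize = 100
    struct ArcScorer; Wh; Wd; U; u; end
    ArcScorer() = ArcScorer(param(mlpsize, 2hiddensize),  # head MLP weights
                            param(mlpsize, 2hiddensize),  # dependent MLP weights
                            param(mlpsize, mlpsize),      # biaffine weight matrix
                            param(mlpsize, 1))            # head-only bias term

    # H: (2*hiddensize, T) encoder output. Returns a (T, T) matrix S where
    # S[i, j] is the score of word i being the head of word j.
    function arc_scores(a::ArcScorer, H)
        Hh = relu.(a.Wh * H)                 # (mlpsize, T) head representations
        Hd = relu.(a.Wd * H)                 # (mlpsize, T) dependent representations
        return Hh' * a.U * Hd .+ Hh' * a.u   # biaffine term plus per-head bias
    end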

Parser: Eisner’s algorithm to find the best spanning tree from the graph of arc scores.
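
A plain-Julia sketch of this decoding step; the eisner function below, the convention that position 1 is the artificial ROOT, and the backtracking helpers are one way to implement the dynamic program, not necessarily the poster's exact code. It fills complete and incomplete span tables in O(n^3) and follows backpointers to read off the head of every word.

    # S: (n x n) arc-score matrix where position 1 is the artificial ROOT and
    # S[h, m] is the score of an arc from head h to modifier m.
    # Returns heads[m] for every position m (heads[1] stays 0 for ROOT).
    function eisner(S::AbstractMatrix{<:Real})
        n = size(S, 1)
        cl = fill(-Inf, n, n); cr = fill(-Inf, n, n)    # complete spans, head at right / left end
        il = fill(-Inf, n, n); ir = fill(-Inf, n, n)    # incomplete spans, arc t -> s / s -> t
        bcl = zeros(Int, n, n); bcr = zeros(Int, n, n)  # backpointers (best split points)
        bil = zeros(Int, n, n); bir = zeros(Int, n, n)
        for s in 1:n
            cl[s, s] = 0.0; cr[s, s] = 0.0
        end
        for k in 1:n-1, s in 1:n-k
            t = s + k
            # Incomplete spans: choose a split point and add a single arc.
            best, arg = -Inf, s
            for r in s:t-1
                v = cr[s, r] + cl[r+1, t]
                if v > best; best = v; arg = r; end
            end
            il[s, t] = best + S[t, s]; bil[s, t] = arg   # arc t -> s
            ir[s, t] = best + S[s, t]; bir[s, t] = arg   # arc s -> t
            # Complete spans: merge a complete half with an incomplete half.
            for r in s:t-1
                v = cl[s, r] + il[r, t]
                if v > cl[s, t]; cl[s, t] = v; bcl[s, t] = r; end
            end
            for r in s+1:t
                v = ir[s, r] + cr[r, t]
                if v > cr[s, t]; cr[s, t] = v; bcr[s, t] = r; end
            end
        end
        # Follow the backpointers from the full span rooted at ROOT.
        heads = zeros(Int, n)
        function walk_cr(s, t); s == t && return; r = bcr[s, t]; walk_ir(s, r); walk_cr(r, t); end
        function walk_cl(s, t); s == t && return; r = bcl[s, t]; walk_cl(s, r); walk_il(r, t); end
        function walk_ir(s, t); heads[t] = s; r = bir[s, t]; walk_cr(s, r); walk_cl(r+1, t); end
        function walk_il(s, t); heads[s] = t; r = bil[s, t]; walk_cr(s, r); walk_cl(r+1, t); end
        walk_cr(1, n)
        return heads
    end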

• We got around 22% error on the English-EWT dataset from the CoNLL 2018 Shared Task when training word embeddings together with the model.
• We got around 29% error on the English-EWT dataset from the CoNLL 2018 Shared Task when using pre-trained word embeddings as is, without training them further.
Acknowledgement
Special Thanks to Deniz Yüret, Berkay Furkan Önder, Öznur Özkasap and Karim Sonbol
