
Bangla Question Answering Agent using Deep Learning

Abstract: We would like to develop a deep learning model that focuses on question answering and can interact with humans in Bangla. Models of this type need to understand the structure of the language and have a semantic understanding of both the context and the questions. To develop such a model, we want to apply transfer learning to a well-known language model called BERT (Bidirectional Encoder Representations from Transformers). BERT can learn to understand any language if it is pre-trained on a sufficiently large dataset. Traditional RNNs were previously used to solve such problems, but because large RNN structures train slowly and suffer from vanishing gradients, the current state-of-the-art transformer models are better suited to these tasks. Transformer-based language models achieve impressive performance thanks to the self-attention mechanism and their ability to process inputs in parallel. Different transformer-based language models, with small changes in architecture and pre-training objective, perform differently on different types of tasks. To train our model, we will use a Bengali dataset created in the same format as the SQuAD dataset from Stanford.
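As a concrete illustration of the transfer-learning step described above, the following is a minimal sketch (not our actual implementation) of fine-tuning a pretrained BERT checkpoint for extractive question answering, using the Hugging Face transformers library as one possible toolchain. The checkpoint name, the English toy example, and the single training step are assumptions made only for readability; a real run would iterate over the SQuAD-format Bengali dataset with an optimizer.

import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

# Assumption: a multilingual BERT checkpoint; any Bangla-capable BERT would do.
checkpoint = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForQuestionAnswering.from_pretrained(checkpoint)
model.train()

# One SQuAD-style (question, context, answer span) example. It is written in
# English only to keep the sketch readable; the real data would be the
# SQuAD-format Bengali dataset mentioned in the abstract.
question = "What mechanism lets transformers process inputs in parallel?"
context = ("Transformer models rely on self-attention, which lets them "
           "process all input tokens in parallel.")
answer_text = "self-attention"
answer_start_char = context.index(answer_text)
answer_end_char = answer_start_char + len(answer_text)

# Tokenize the question/context pair and map the answer's character span to
# token start/end indices, which is what the BERT QA head learns to predict.
enc = tokenizer(question, context, return_offsets_mapping=True, return_tensors="pt")
sequence_ids = enc.sequence_ids(0)            # 0 = question tokens, 1 = context tokens
offsets = enc.pop("offset_mapping")[0].tolist()
start_tok = end_tok = 0
for i, (s, e) in enumerate(offsets):
    if sequence_ids[i] != 1:                  # only search inside the context
        continue
    if s <= answer_start_char < e:
        start_tok = i
    if s < answer_end_char <= e:
        end_tok = i

# One fine-tuning step: the model returns the QA loss when the gold start/end
# positions are supplied (optimizer update omitted for brevity).
outputs = model(**enc,
                start_positions=torch.tensor([start_tok]),
                end_positions=torch.tensor([end_tok]))
outputs.loss.backward()
print("extractive QA loss:", outputs.loss.item())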
