AAAI2019 Poster Modified
Introduction
Neural Question Generation (NQG)
- A task of generating a question from a given passage with deep neural
networks.
Motivation
- Previous NQG models suffer from the problem that a significant
proportion of the generated questions include words from the target
answer.
Approach
- Separately encode the passage and the target answer
- Extract key information from the target answer
Outcome
- Significantly reduces the number of improper questions that include
the target answer.
- Quantitatively outperforms previous state-of-the-art NQG models.
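The "masked passage" input mentioned above can be illustrated with a small sketch: the target-answer span in the passage is collapsed into a single placeholder token so the generator cannot simply copy answer words into the question. The mask token name `<a>` and the helper name are assumptions for illustration, not from the poster.

```python
def mask_answer(passage_tokens, answer_tokens, mask_token="<a>"):
    """Replace the target-answer span in the passage with a mask token.

    Minimal sketch of answer masking: the first occurrence of the answer
    span is collapsed into one placeholder token. The token "<a>" is an
    assumed name, not specified in the poster.
    """
    n = len(answer_tokens)
    for i in range(len(passage_tokens) - n + 1):
        if passage_tokens[i:i + n] == answer_tokens:
            return passage_tokens[:i] + [mask_token] + passage_tokens[i + n:]
    # Answer span not found verbatim; leave the passage unchanged.
    return passage_tokens
```

For example, masking the answer "john f kennedy" in "john f kennedy was the 35th president" yields "<a> was the 35th president", which is then fed to the passage encoder while the answer goes to its own encoder.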
Answer-Separated Seq2Seq
Input / Output
- A masked passage and a target answer / A question
Answer-Separated Passage Encoder & Answer Encoder
- Two separate Bi-LSTMs
Answer-Separated Decoder
- LSTM with attention mechanism
- Initialization with the last hidden state of the answer encoder
- Keyword-Net: attention-based key information extractor
  $o_t^0 = c_t$
  $p_{tj}^l = \mathrm{Softmax}\big((o_t^{l-1})^\top h_j\big)$
  $o_t^l = \sum_j p_{tj}^l \, h_j$
  $s_t = \mathrm{LSTM}(y_{t-1}, s_{t-1}, c_t, o_t^L)$
- Retrieval-style word generator: adopts the idea from (Ma et al., 2018)
Notation
• $h_j$: j-th hidden state of the answer encoder
• $s_t$: t-th hidden state of the decoder
• $c_t$: context vector (attention mechanism)
• $o_t^l$: l-th hop Keyword-Net feature vector for the t-th decoder timestep
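The Keyword-Net equations above can be sketched in a few lines of NumPy, assuming the formulation shown (hop 0 is seeded by the context vector, and each hop attends over the answer-encoder hidden states to refine a keyword feature vector). This is an illustrative sketch, not the authors' implementation; the function name and hop count are assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def keyword_net(answer_states, context_vec, num_hops=2):
    """Multi-hop Keyword-Net feature extraction (sketch).

    answer_states: (J, d) array, rows are answer-encoder hidden states h_j
    context_vec:   (d,) attention context vector c_t (seeds hop 0)
    Returns the final hop feature o_t^L.
    """
    o = context_vec                  # o^0 = c_t
    for _ in range(num_hops):
        scores = answer_states @ o   # (o^{l-1})^T h_j for each j
        p = softmax(scores)          # p^l_j: attention over answer states
        o = p @ answer_states        # o^l = sum_j p^l_j h_j
    return o
```

The resulting feature would then be passed, together with the previous word, previous decoder state, and context vector, into the decoder LSTM as in the last equation above.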