
PyTorch

Seq2Seq

pytorch-chatbot
github
https://github.com/ywk991112/pytorch-chatbot

Requirements

1. linux
2. python
3. pytorch
4. tqdm

$ git clone https://github.com/ywk991112/pytorch-chatbot

$ cd pytorch-chatbot

$ mkdir data

$ cd data

$ wget https://www.dropbox.com/s/euu7w7peogicxti/jacoxu

1.
a. Preprocessing
b. Feature extraction
2. Training
a. Module
b. Graph
c. Gradient descent
3. Testing
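The training and testing stages listed above can be sketched with a toy PyTorch loop. This is a hedged illustration of the general pattern (module, computation graph, gradient descent, then evaluation), not the actual chatbot training code in main.py; the model, data, and hyperparameters here are invented for the example.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy module and data: fit y = sum(x) with a linear layer.
model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x = torch.randn(64, 4)
y = x.sum(dim=1, keepdim=True)

# Training: each forward pass builds a computation graph,
# backward() computes gradients, step() applies gradient descent.
for step in range(200):
    pred = model(x)
    loss = loss_fn(pred, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Testing: switch to eval mode and disable gradient tracking.
model.eval()
with torch.no_grad():
    final_loss = loss_fn(model(x), y).item()
print(final_loss)  # close to 0 after training
```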
Module
When building neural networks, we frequently arrange the computation
into layers, some of which have learnable parameters that are optimized
during training.

In PyTorch, the nn package serves this same purpose. The nn package defines
a set of Modules, which are roughly equivalent to neural network layers.

A Module receives input Variables and computes output Variables, but may also
hold internal state such as Variables containing learnable parameters.
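To make this concrete, here is a minimal custom Module. The class name and sizes are invented for illustration; the point is that a Module holds learnable parameters (the two Linear layers) as internal state and maps input tensors to output tensors in forward.

```python
import torch
import torch.nn as nn

class TwoLayerNet(nn.Module):
    """A minimal Module: two Linear layers with a ReLU in between."""

    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        # These sub-modules carry the learnable parameters.
        self.fc1 = nn.Linear(in_dim, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, out_dim)

    def forward(self, x):
        # Receives an input tensor, returns an output tensor.
        return self.fc2(torch.relu(self.fc1(x)))

net = TwoLayerNet(8, 16, 4)
x = torch.randn(3, 8)       # a batch of 3 inputs
y = net(x)                  # forward pass
print(y.shape)              # torch.Size([3, 4])
print(sum(p.numel() for p in net.parameters()))  # 212 learnable parameters
```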
Sequence to Sequence with Attention
EncoderRNN
( Encoder )
Attention ( forward )
Attention ( score )
Decoder
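The attention "score" step above can be sketched as follows. This is a hedged sketch of Luong-style dot-product attention (the repo's Attention class may compute the score differently): the decoder hidden state is scored against every encoder output, the scores are softmax-normalized into weights, and the weights form a context vector.

```python
import torch
import torch.nn.functional as F

def dot_attention(decoder_hidden, encoder_outputs):
    """decoder_hidden: (batch, hidden); encoder_outputs: (batch, seq_len, hidden)."""
    # Score: dot product between the decoder state and each encoder output.
    scores = torch.bmm(encoder_outputs, decoder_hidden.unsqueeze(2)).squeeze(2)
    # Normalize scores over the source sequence.
    weights = F.softmax(scores, dim=1)                     # (batch, seq_len)
    # Weighted sum of encoder outputs -> context vector.
    context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)
    return context, weights

B, T, H = 2, 5, 512  # batch, source length, hidden size (512 matches -hi 512 below)
ctx, w = dot_attention(torch.randn(B, H), torch.randn(B, T, H))
print(ctx.shape, w.shape)  # torch.Size([2, 512]) torch.Size([2, 5])
```

The attention weights in each row sum to 1, so the context is a convex combination of the encoder outputs.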

$ python3 main.py -tr ./data/jacoxu -la 1 -hi 512 -lr 0.0001 -it 1000 -b 64 -p 50 -s 100
