Spring 2021
http://cs.brown.edu/courses/csci1850
• Class Activity
DeepChrome: Class-based optimization
[Figure: DeepChrome trained model — histone mark signals HM1–HM5 form the input X; the pipeline 1. Convolution, 2. Max Pooling, 3. Dropout, 4. Multi-layer Perceptron, 5. Softmax computes f(X) and predicts gene expression Y = 0/1]
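The five-stage DeepChrome pipeline can be sketched in plain NumPy. All shapes, filter counts, and weights below are illustrative placeholders, not the actual trained model's hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """Valid 1-D convolution over bins: x is (marks, bins), kernels is (filters, marks, width)."""
    n_filters, n_marks, width = kernels.shape
    n_out = x.shape[1] - width + 1
    out = np.empty((n_filters, n_out))
    for f in range(n_filters):
        for i in range(n_out):
            out[f, i] = np.sum(x[:, i:i + width] * kernels[f])
    return np.maximum(out, 0.0)  # ReLU nonlinearity

def max_pool(x, size):
    """Non-overlapping max pooling along the bin axis."""
    n_filters, n_bins = x.shape
    n_out = n_bins // size
    return x[:, :n_out * size].reshape(n_filters, n_out, size).max(axis=2)

def dropout(x, p, train=True):
    """Inverted dropout; identity at inference time."""
    if not train:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Toy input X: 5 histone marks (HM1..HM5) x 100 genomic bins around a gene.
x = rng.standard_normal((5, 100))

kernels = rng.standard_normal((8, 5, 10)) * 0.1    # 8 filters of width 10 (illustrative)
h = conv1d(x, kernels)                             # 1. Convolution
h = max_pool(h, 5)                                 # 2. Max pooling
h = dropout(h, p=0.5, train=False)                 # 3. Dropout (disabled at inference)
h = h.ravel()

W1 = rng.standard_normal((32, h.size)) * 0.1       # 4. Multi-layer perceptron
b1 = np.zeros(32)
W2 = rng.standard_normal((2, 32)) * 0.1
b2 = np.zeros(2)
hidden = np.maximum(W1 @ h + b1, 0.0)

probs = softmax(W2 @ hidden + b2)                  # 5. Softmax over Y in {0, 1}
print(probs)  # f(X): probability of gene off / gene on
```

With random weights the output is meaningless; training would fit the kernels and MLP weights to the expression labels.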
[Figure: feature visualization for cell-type E057, genes with Y = 0 — heatmap (color scale 0.0 to 1.0) of the frequency of active bins for H3K27me3, H3K36me3, H3K4me1, H3K4me3, and H3K9me3, with marks grouped as promoter, distal promoter, and repressor]
Questions?
Class activity [10 mins]
• Think: compare and contrast the following visualization methods.

Method                 | Advantages | Disadvantages
-----------------------|------------|--------------
Perturbation analysis  |            |
Saliency maps          |            |
Temporal output        |            |
Attention mechanism    |            |
[Figure: visual attention example — the model attends to the relevant image regions when producing the word "park"]
Courtesy: https://lilianweng.github.io/lil-log/2018/06/24/attention-attention.html
Formulation of attention in neural networks (NLP)
Courtesy: https://lilianweng.github.io/lil-log/2018/06/24/attention-attention.html
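As a sketch of the formulation these slides walk through (following the cited post, in the Bahdanau style): attention computes an alignment score between the decoder state and each encoder hidden state, normalizes the scores with a softmax, and takes the weighted sum as a context vector.

```latex
\begin{align*}
% Alignment score between decoder state s_{t-1} and encoder state h_i
e_{t,i} &= v_a^{\top} \tanh\!\left(W_a\,[s_{t-1}; h_i]\right) \\
% Attention weights: softmax over input positions
\alpha_{t,i} &= \frac{\exp(e_{t,i})}{\sum_{j} \exp(e_{t,j})} \\
% Context vector: weighted sum of encoder states
c_t &= \sum_{i} \alpha_{t,i}\, h_i
\end{align*}
```

Other score functions (dot product, general bilinear) slot into the same template; only the definition of $e_{t,i}$ changes.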
AttentiveChrome: Setup
[Figure: input — binned read-count signals for histone marks HM1, HM2, HM3 around a gene]
AttentiveChrome: Recurrent Neural Network
[Figure: a recurrent neural network encodes each histone mark's bin sequence (HM1 shown)]
AttentiveChrome: Attention
[Figure: attention over the encoded positions — scores against a learned weight W pass through a softmax, and the weighted sum forms a context vector (HM1 shown)]
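The attention step in the diagram can be sketched in NumPy. The names and shapes here are hypothetical (a matrix `H` of per-position encodings and a learned vector `w` standing in for the slide's W); the real model applies this per histone mark over RNN outputs:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

rng = np.random.default_rng(0)

# Hypothetical encodings: 100 bin positions, each a 16-dim RNN output.
H = rng.standard_normal((100, 16))
w = rng.standard_normal(16)        # learned context weight (the slide's W)

scores = H @ w                     # one relevance score per position
alpha = softmax(scores)            # attention weights, sum to 1
context = alpha @ H                # context vector: weighted sum of encodings

print(context.shape)
```

The weights `alpha` are exactly what the later bin-level visualization plots: how much each position contributed to the context vector.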
AttentiveChrome: Performance
[Figure: bar chart of prediction performance (y-axis 0.5 to 0.95) across 56 cell-types for RFC, SVC, DeepChrome, and AttentiveChrome — AttentiveChrome improves on 49/56 cell-types]
AttentiveChrome: Bin-level attention
(1) What positions are important?
[Figure: bin-level attention weights for gene PAX5 in cell type GM12878 (blood cell) — rows for H3K27me3, H3K36me3, H3K4me1, H3K4me3, and H3K9me3, with marks grouped as promoter, distal promoter, and repressor]
Questions?
Upcoming
Course website: http://cs.brown.edu/courses/csci1850