
I. EXPERIMENTAL RESULTS

In this study, the aim is to diagnose retinal diseases using AlexNet, VGG16 and ResNet on this dataset. Each model was trained for 8 epochs and successful results were obtained. Table II and Fig. 3 show the training accuracy results of these models: in the study, VGG16 reached 95.76 percent and AlexNet reached 95.28 percent performance.

TABLE II. RESULTS OF ACCURACY.

Method    Epochs   Last Accuracy   Average Accuracy   Best Accuracy
AlexNet   8        94.73           88.48              94.73
VGG16     8        84.21           87.23              84.21
ResNet    8        89.47           84.62              89.47

Table IV and Fig. 5 also examine the dropout layer: the AlexNet (Dropout) model prevented memorization, but dropout did not have a large effect on performance. This shows that the models did not simply memorize the data, although it can be observed that the variants which do not use the Dropout layer achieve lower training loss.

TABLE IV. RESULTS OF TRAINING LOSS.

Method    Epochs   Last Loss   Average Loss   Lowest Loss
AlexNet   8        0.8057      0.3554         0.7941
VGG16     8        0.7838      0.4087         0.7838
ResNet    8
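The AlexNet and AlexNet (Dropout) variants compared here differ only in whether dropout is inserted before the fully connected layers. As a minimal sketch of that comparison (the paper does not state the framework; Keras/TensorFlow, the layer sizes and the 0.5 dropout rate are assumptions, with four output classes as in the dataset), both variants could be built from a single function:

    # Sketch: an AlexNet-style classifier with an optional Dropout layer, for
    # comparing the "AlexNet" and "AlexNet (Dropout)" variants. Framework and
    # layer sizes are assumptions, not taken from the paper.
    from tensorflow.keras import layers, models

    def build_alexnet_like(input_shape=(227, 227, 3), num_classes=4, use_dropout=False):
        model = models.Sequential()
        model.add(layers.Conv2D(96, 11, strides=4, activation="relu", input_shape=input_shape))
        model.add(layers.MaxPooling2D(3, strides=2))
        model.add(layers.Conv2D(256, 5, padding="same", activation="relu"))
        model.add(layers.MaxPooling2D(3, strides=2))
        model.add(layers.Conv2D(384, 3, padding="same", activation="relu"))
        model.add(layers.Conv2D(384, 3, padding="same", activation="relu"))
        model.add(layers.Conv2D(256, 3, padding="same", activation="relu"))
        model.add(layers.MaxPooling2D(3, strides=2))
        model.add(layers.Flatten())
        model.add(layers.Dense(4096, activation="relu"))
        if use_dropout:
            model.add(layers.Dropout(0.5))  # the only difference in the Dropout variant
        model.add(layers.Dense(4096, activation="relu"))
        if use_dropout:
            model.add(layers.Dropout(0.5))
        model.add(layers.Dense(num_classes, activation="softmax"))
        model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
        return model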
Fig. 3. LeNet, AlexNet, AlexNet (Dropout Layer) and Vgg16 Train Accuracy Graphics.

Fig. 5. LeNet, AlexNet, AlexNet (Dropout Layer) and Vgg16 Train Loss Graphics.
In parallel with the training accuracies, the validation results of these models were also high, as shown in Table III and Fig. 4. VGG16 achieved the highest validation score, with 94.02 percent accuracy.

TABLE III. RESULTS OF VALIDATION ACCURACY.

Method              Epochs   Last Accuracy   Average Accuracy   Best Accuracy
LeNet               200      82.66           78.72              83.76
AlexNet (Dropout)   200      89.68           79.91              92.28
AlexNet             200      92.58           86.58              93.1
VGG16               200      93.78           84.31              94.02

In Table V and Fig. 6, the validation loss values show that VGG16 produced the lowest loss, at about 0.187. The validation loss results also indicate that the Dropout layer performs well on the validation side: although its effect is not equal across the data coming from each model, it was observed to be more beneficial against memorization during the training of the model.

TABLE V. RESULTS OF VALIDATION LOSS.

Method              Epochs   Last Loss   Average Loss   Lowest Loss
LeNet               200      0.5479      0.6179         0.4806
AlexNet (Dropout)   200      0.3149      0.5821         0.2312
AlexNet             200      0.2325      0.3818         0.2158
VGG16               200      0.1979      0.4207         0.187
Fig. 4. LeNet, AlexNet, AlexNet (Dropout Layer) and Vgg16 Validation Accuracy Graphics.

Fig. 6. LeNet, AlexNet, AlexNet (Dropout Layer) and Vgg16 Validation Loss Graphics.
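The last, average and best values reported in Tables II-V can be read directly from the per-epoch history that most deep learning frameworks record during training. A minimal sketch, assuming a Keras-style history object; the model, data arrays and framework are assumptions rather than details taken from the paper:

    # Sketch: collapsing a per-epoch training history into the last / average /
    # best columns used in Tables II-V (accuracy in percent, loss as raw values).
    import numpy as np

    def summarise_history(history):
        acc = np.asarray(history.history["val_accuracy"]) * 100.0
        loss = np.asarray(history.history["val_loss"])
        return {
            "last_accuracy": acc[-1], "average_accuracy": acc.mean(), "best_accuracy": acc.max(),
            "last_loss": loss[-1], "average_loss": loss.mean(), "lowest_loss": loss.min(),
        }

    # Hypothetical usage with the model sketch above, 200 epochs as in Tables III and V:
    # history = model.fit(train_x, train_y, validation_data=(val_x, val_y), epochs=200)
    # print(summarise_history(history))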
When all the graphs are examined, VGG16 achieved the highest validation performance, while the AlexNet (no Dropout layer) model reached a validation score of 93.01. In general, the accuracy curves increase steadily for all models, but because the AlexNet (Dropout) model randomly drops 50 percent of its units in each iteration, its training and validation curves show fluctuations as the model tries to balance itself.
Similarly, in terms of loss, the lowest validation loss was 0.1187 for VGG16 and 0.2158 for AlexNet (no Dropout layer). The best-weighted versions of these models were then given 4000 uniformly distributed test images that were never seen during training. Tables VI-IX show the precision, recall and F-score values calculated from the confusion matrices of these test results (Figs. 7-10). These results show that the highest scores were again obtained by VGG16, while the AlexNet (Dropout) model also produced high scores.
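For reference, the per-class accuracy, precision, recall and F-score values reported in Tables VI-IX can be derived from a 4x4 confusion matrix over the classes CNV, DME, DRUSEN and NORMAL. The sketch below is generic: the matrix values are placeholders, not the counts from the paper's confusion matrices.

    # Sketch: per-class metrics from a confusion matrix (rows: true class,
    # columns: predicted class). Placeholder counts, not the paper's data.
    import numpy as np

    classes = ["CNV", "DME", "DRUSEN", "NORMAL"]
    cm = np.array([[900,  40,  40,  20],
                   [ 30, 730, 180,  60],
                   [ 20,  90, 830,  60],
                   [ 40, 120, 150, 690]])

    total = cm.sum()
    for i, name in enumerate(classes):
        tp = cm[i, i]
        fp = cm[:, i].sum() - tp
        fn = cm[i, :].sum() - tp
        tn = total - tp - fp - fn
        accuracy = (tp + tn) / total                 # one-vs-rest accuracy per class
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        f_score = 2 * precision * recall / (precision + recall)
        print(f"{name:7s} acc={accuracy:.2%} precision={precision:.2f} "
              f"recall={recall:.2f} f1={f_score:.2f}")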

Fig. 7. LeNet Confusion Matrix.

TABLE VI. LENET RESULTS OF TEST.


Class Accuracy Precision Recall F Score
CNV 90.58 % 0.94 0.75 0.83
DME 84.83 % 0.63 0.73 0.67
DRUSEN 85.13 % 0.51 0.83 0.63
NORMAL 86.96 % 0.88 0.69 0.77

Fig. 8. AlexNet Confusion Matrix.

TABLE VII. ALEXNET RESULTS OF TEST.


Class Accuracy Precision Recall F Score
CNV 96.6 % 0.99 0.89 0.94
DME 95.5 % 0.86 0.96 0.90
DRUSEN 92.6 % 0.73 0.96 0.83
NORMAL 93.25 % 0.98 0.80 0.88
Fig. 9. AlexNet(Dropout) Confusion Matrix.

TABLE VIII. ALEXNET (DROPOUT) RESULTS OF TEST.


Class Accuracy Precision Recall F Score
CNV 97.53 % 0.99 0.92 0.95
DME 96.1 % 0.89 0.95 0.92
DRUSEN 93.85 % 0.79 0.96 0.87
NORMAL 94.38 % 0.97 0.83 0.90

Fig. 10. VGG16 Confusion Matrix.

TABLE IX. VGG16 RESULTS OF TEST.


Class Accuracy Precision Recall F Score
CNV 96.28 % 0.99 0.88 0.93
DME 96.6 % 0.91 0.96 0.93
DRUSEN 94.43 % 0.80 0.98 0.88
NORMAL 95.65 % 0.97 0.87 0.92

II. CONCLUSIONS
In this study, retinal diseases were diagnosed from optical coherence tomography images using convolutional neural networks. The proposed method classified the existing dataset using the AlexNet, VGG-16 and ResNet architectures. Across all processing steps, the AlexNet architecture yielded a good classification result with 94.7 % accuracy. In future studies, it is aimed to locate the deformed region by extracting heat maps with deep learning methods.
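One possible way to obtain such a heat map is a Grad-CAM style visualisation. The sketch below assumes a Keras model (for example the build_alexnet_like() sketch above) and a named final convolutional layer; neither the framework nor the visualisation method is specified by the paper, so this is only an illustrative assumption.

    # Sketch: a Grad-CAM style heat map that highlights the image regions
    # driving a class prediction. Model, layer name and framework are assumed.
    import numpy as np
    import tensorflow as tf

    def grad_cam(model, image, last_conv_layer, class_index=None):
        # Expose both the chosen conv layer's feature maps and the predictions.
        grad_model = tf.keras.Model(model.inputs,
                                    [model.get_layer(last_conv_layer).output, model.output])
        with tf.GradientTape() as tape:
            conv_out, preds = grad_model(image[np.newaxis, ...])
            if class_index is None:
                class_index = int(tf.argmax(preds[0]))
            class_score = preds[:, class_index]
        grads = tape.gradient(class_score, conv_out)      # gradient of the class score
        weights = tf.reduce_mean(grads, axis=(1, 2))      # average over spatial dimensions
        cam = tf.reduce_sum(weights[:, tf.newaxis, tf.newaxis, :] * conv_out, axis=-1)[0]
        cam = tf.nn.relu(cam)                             # keep only positive influence
        return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()  # normalised H x W heat map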
