ABSTRACT
The diagnosis of benign and malignant breast tumors is a challenging issue today. Breast cancer is the most common cancer that women suffer from. The sooner the cancer is detected, the easier and more successful its treatment. The most common diagnostic method is mammography, a simple radiographic picture of the breast. Using image processing and pattern recognition techniques to detect breast cancer in mammographic images reduces human error in detecting tumors and speeds up diagnosis. Artificial Neural Networks (ANNs) have been widely used to detect breast cancer and have significantly reduced the error rate. Therefore, in this paper the Convolutional Neural Network (CNN), which is the most effective method, is used for the detection of various types of cancer. This study presents a Multiscale Convolutional Neural Network (MCNN) approach for the classification of tumors. Because the MCNN presents the mammography picture to several deep CNNs at different sizes and resolutions, the classical handcrafted feature extraction step is avoided. The proposed approach gives better classification rates than the classical state-of-the-art methods, allowing a safer computer-aided diagnosis of breast cancer. This study reaches a diagnosis accuracy of 97 ± 0.3% using the multiscale convolution technique, which shows the efficiency of the proposed method.
§Corresponding author: Homayoon Yektaei, Department of Biomedical Engineering, Islamic Azad University, Tehran North Branch, Tehran, Iran. E-mail: Hyektai1994@gmail.com
H. Yektaei, M. Manthouri & F. Farivar
Fig. 1 New cases of breast cancer for women and the number of women dying in the last twelve years.4
Biomed. Eng. Appl. Basis Commun. 2019.31. Downloaded from www.worldscientific.com
diseases, and more specifically the unwanted and abnormal growth of the cells of the human body, is known as cancer. Cancer can attack any part of the body and can then spread to any other part. Among all types of cancer, breast cancer (BC) is the most common one among women. Statistics show an increasing rate of BC each year. Figure 1 shows the number of females newly facing BC, as well as the number of females who have died of it in Australia since 2007. This figure shows that more and more females are newly facing BC, and the number of females dying of it has also increased each year. This is the situation of Australia (population 20–25 million), but it can be taken as representative of the BC situation of the whole world.5

Four crucial and remarkable indexes include: (a) Speed and automation: This results in a fast method requiring no user input. (b) Independence: The method does not depend on other techniques succeeding, such as segmentation or detection of other landmarks. (c) No handcrafted features: Since features do not need to be manually defined, we avoid the difficulty encountered by conventional machine learning algorithms in identifying the best feature set that represents the data. It also removes the requirement for a skilled technician to identify such features manually, which takes a considerable amount of time and can produce subjective results, particularly with a large dataset. (d) Accurate simultaneous detection: We detect more than one position simultaneously, retaining high accuracy for each. (e) Robustness: The method is robust in the sense that it continues to work well even on poor-quality images. We develop a multiscale approach to convolutional neural networks (CNNs) to focus on the region of interest. (f) Improved accuracy: This approach allows the method to focus on the region of interest, removing redundant background data from consideration and facilitating refinement of the localization. This results in significantly increased accuracy in the cases of breast tumor, with inter-dataset training and evaluation using multiple datasets. (g) Generalization: This demonstrates generalization of the method to new data, from separate datasets and graders, and captured from different devices. CNNs are known for robustness to small changes in their inputs because of their flexibility. Also, they do not require any specific feature extraction step.5,6 The proposed MCNN architecture relies on several deep neural networks that alternate convolutional and pooling layers. These deep neural networks belong to a broad class of models generally termed multi-stage architectures.7 This architecture of convolutions interlaced with pooling layers is dedicated to automatic feature extraction. The final classification is performed by classical fully connected layers stacked on top.3

This research presents a multiscale CNN for the diagnosis of breast cancer. Due to the abilities of CNNs and the creation of images in the desired sizes and magnitudes with the help of a multiscale CNN, the accuracy of the diagnosis is increased. To show the efficiency of the proposed method, a comparison is made with other methods.40,41 The results reveal the significant achievement of the suggested technique.

The remainder of this paper is organized as follows: The second section describes the proposed methodology. The results of this research are described in the third section. The fourth section discusses the analytical results of the paper. The paper is concluded in the fifth section.

MATERIALS AND METHODS

Materials

Database
The study is based on MIAS,8 referring to the patient's imaging center. The database contains 1024 × 1024 pixel images of 322 patients,
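The MIAS/mini-MIAS images are distributed as 1024 × 1024 grayscale PGM files. As a hedged illustration (assuming the binary P5 variant of the format; the helper name `read_pgm` is ours, not the paper's), such a file can be parsed with the standard library alone:

```python
def read_pgm(path):
    """Parse a binary (P5) PGM file into a list of pixel rows."""
    with open(path, "rb") as f:
        data = f.read()
    # Header tokens in order: magic, width, height, maxval.
    # Comment lines start with '#'.
    tokens, i = [], 0
    while len(tokens) < 4:
        while i < len(data) and data[i:i + 1].isspace():
            i += 1
        if data[i:i + 1] == b"#":           # skip a comment line
            while data[i:i + 1] not in (b"\n", b""):
                i += 1
            continue
        j = i
        while j < len(data) and not data[j:j + 1].isspace():
            j += 1
        tokens.append(data[i:j])
        i = j
    assert tokens[0] == b"P5", "expected binary PGM"
    width, height = int(tokens[1]), int(tokens[2])
    i += 1  # exactly one whitespace byte separates maxval from the raster
    pixels = data[i:i + width * height]     # one byte per pixel (maxval <= 255)
    return [list(pixels[r * width:(r + 1) * width]) for r in range(height)]
```

For a mini-MIAS file the returned structure would be 1024 rows of 1024 intensity values, ready for cropping and rescaling.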
Diagnosis of Breast Cancer using Multiscale Convolutional Neural Network
Pooling layers
A pooling layer is usually located after a convolutional layer and is used to reduce the size of the feature maps and the number of network parameters. Its overall goal is to reduce the size of the data and the volume of the calculations. Like the convolutional layers, the pooling layers are invariant to small shifts of the neighborhoods over which they compute. The pooling layer is placed after the
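The pooling operation described above can be sketched in a few lines. This is a minimal illustration of 2 × 2 max pooling, not the paper's implementation (the function name and defaults are ours):

```python
def max_pool2d(fmap, size=2, stride=2):
    """Slide a size x size window with the given stride over a 2-D
    feature map and keep the maximum of each window."""
    rows, cols = len(fmap), len(fmap[0])
    return [
        [max(fmap[r + dr][c + dc]
             for dr in range(size) for dc in range(size))
         for c in range(0, cols - size + 1, stride)]
        for r in range(0, rows - size + 1, stride)
    ]
```

For example, `max_pool2d([[1, 3, 2, 4], [5, 6, 7, 8], [3, 2, 1, 0], [1, 2, 3, 4]])` returns `[[6, 8], [3, 4]]`, halving each spatial dimension while keeping the strongest activation of each neighborhood.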
Sahiner et al. used a convolutional neural network classifier to classify masses and normal breast tissue. First, they extracted the region of interest (ROI) and averaged and subsampled it. Second, Gray-Level Difference Statistics (GLDS) and Spatial Gray-Level Dependence (SGLD) features were calculated from different regions, and finally, the calculated features were fed as inputs to the convolutional classifier.26

Pak et al. also utilized the Non-Subsampled Contourlet Transform (NSCT) for breast-image (MIAS dataset) classification and obtained a 91.43% mean accuracy and a 6.42% mean False Positive Rate (FPR).32

This research proposes a new approach to CNNs for detecting breast cancer using the mini-MIAS data to maximize accuracy. The proposed method is multiscale convolution. Also, the proposed approach has yielded
Networks Architectures
We construct five different CNNs that act at different resolutions. The size of the images at full resolution is 80 × 80. This size is successively divided by a factor
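Assuming the division factor is 2 (the sentence above is truncated in the source), the five input scales fed to the CNNs can be generated as a simple average pyramid; the helper names below are ours:

```python
def downsample2x(img):
    """Halve each dimension by averaging disjoint 2 x 2 blocks."""
    rows, cols = len(img) // 2, len(img[0]) // 2
    return [[(img[2 * r][2 * c] + img[2 * r][2 * c + 1]
              + img[2 * r + 1][2 * c] + img[2 * r + 1][2 * c + 1]) / 4.0
             for c in range(cols)]
            for r in range(rows)]

def build_pyramid(img, levels=5):
    """Return the full-resolution image plus successively halved
    copies: one input per CNN in the multiscale ensemble."""
    scales = [img]
    for _ in range(levels - 1):
        scales.append(downsample2x(scales[-1]))
    return scales
```

An 80 × 80 input then yields images of side 80, 40, 20, 10 and 5, one per network.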
DISCUSSION

Fig. 9 A histogram of mammogram images.

As the literature shows, different methods and techniques for categorizing images in different types of breast data have been used. However, the state-of-the-art image classification technique of the CNN has put its strong footprint in the image-analysis field, especially the image-classification field. Though the "AlexNet" model proposed by Krizhevsky has given new momentum to the CNN research field, a CNN model was first utilized by Fukushima et al.,33 who proposed the "Neocognitron" model that recognizes stimulus patterns. For mammogram image classification, Wu et al. first utilized the CNN model.34 Little work had been done on the CNN model until the end of the 20th century; its current momentum dates only from the AlexNet model. Advanced engineering techniques have been used by research groups such as the Visual Geometry Group and Google, which have produced the VGG-16, VGG-19 and GoogleNet models. Arevalo et al.35 classified benign and malignant lesions using the CNN model; this experiment was performed on 766 mammogram images, of which 426 contain benign and 310 malignant lesions. Before classifying the data, they utilized preprocessing techniques for image enhancement and obtained a 0.82 ± 0.03 Receiver Operating Characteristic (ROC) value. GoogleNet and AlexNet methods have been utilized by Żejmo et al.36 for the classification of cytological specimens into benign and malignant classes. The best accuracy, obtained when they utilized the GoogleNet model, was 83.00%. Qiu et al.37 used the CNN method to extract global features for mammogram image classification and obtained an average accuracy of 71.40%. Fotin et al. also utilized the CNN method for tomosynthesis image classification and obtained an Area Under the Curve (AUC) value of 0.93. Transfer learning is another important concept of the CNN method, which allows the model to derive attributes from the beginning and not use a weight-sharing concept to teach a model. This method is helpful when the database contains fewer images. Jiang et al.38 utilized a transfer learning method for mammogram image classification and obtained an AUC of 0.88. Before utilizing it in a CNN model, they performed a preprocessing operation to enhance the images. Suzuki et al.39 also used the benefit of transfer learning techniques to train their model to classify mammogram images and obtained a sensitivity of 89.9%. They performed their experiment with just 198 images. Most image categorization based on the CNN method relies on global feature extraction techniques. Recently, researchers have also shown an interest in how local features can be utilized with the CNN model for data classification. Both global and local features have been used by Rezaeilouyeh et al.28 for histopathological image classification. For local feature extraction, the authors used the Shearlet transform and obtained an accuracy of 86 ± 3.00%. For local feature extraction, Sharma et al.29 used the GLCM and GLDM methods and then fed the local features to a CNN model for mammogram image classification.
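The accuracy, sensitivity and specificity figures compared throughout this section are standard confusion-matrix quantities. As a reference sketch (the counts below are hypothetical, chosen only for illustration):

```python
def classification_metrics(tp, fp, tn, fn):
    """Compute standard diagnostic metrics from confusion-matrix counts."""
    total = tp + fp + tn + fn
    return {
        "accuracy": (tp + tn) / total,   # fraction of correct calls
        "sensitivity": tp / (tp + fn),   # recall on malignant cases
        "specificity": tn / (tn + fp),   # recall on benign cases
        "error": (fp + fn) / total,      # misclassification rate
    }

# Hypothetical counts, for illustration only.
metrics = classification_metrics(tp=95, fp=5, tn=94, fn=6)
print(metrics["accuracy"])  # 0.945
```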
A new approach based on a multiscale deep learning technique has been suggested for the diagnosis of benign lumps in breast mammography imaging. Because the CNNs discover sophisticated representations of the breast images without human supervision, the performance of our proposed system can surpass competing approaches and reach a much higher degree of precision. In this section, we reviewed several examples of convolutional work in this field. Among all the works on masses that used the MIAS data, the multiscale convolution method has had the best performance.

CONCLUSION

This paper describes a breast cancer diagnostic system using image processing and neural network techniques. The work is based on an MCNN. In the proposed method, before classification, the mammography images are preprocessed: the surplus areas, the pectoral muscle and the labels (if any) are removed, and since we work on both breasts, we mirror the images to pair them. This research improves the quality of the mammogram images using the CLAHE algorithm. Depending on the breasts, we split each image into five different sizes. Using the multiscale method, each image is convolved at a different size, and the convolution outputs are combined to get the result. Using the multiscale convolution method, the mean accuracy of the proposed method was 97 ± 0.3%, with a detection sensitivity of 95.9%, a specificity of 94.8% and a detection error of 3%. The results indicate that preprocessing operations on mammographic images significantly increase the speed and precision of the system. At the end of the process, the MCNN is able to detect lesions and, as far as possible, to improve the accuracy, sensitivity and specificity. To conclude, the larger our patches (as in a pyramid), the higher the precision conveyed. If we increase the number of convolution scales from 5 to 7 or even more, our precision in the multiscale convolution method will also increase. In further work, the purpose of this study is to increase the number of convolutional layers; as the investigation shows, as the depth of a multiscale CNN increases, the accuracy also increases.

REFERENCES

3. …volutional neural networks for vision-based classification of cells, in Asian Conf Computer Vision, Springer, Vol. 342, p. 352, 2012.
4. Nahid A-A, Kong Y, Histopathological breast-image classification using local and frequency domains by convolutional neural network, Information 9:19, 2018.
5. Nebauer C, Evaluation of convolutional neural networks for visual recognition, IEEE Trans Neur Networks 685:696, 1998.
6. LeCun Y, Bottou L, Bengio Y, Haffner P, Gradient-based learning applied to document recognition, Proc IEEE 2278:2324, 1998.
7. Hubel DH, Wiesel TN, Receptive fields, binocular interaction and functional architecture in the cat's visual cortex, J Physiol 106:154, 1962.
8. Suckling J, Parker J, Dance D, Astley S, Hutt I, Boggis C, Ricketts I, Stamatakis E, Cerneaz N, Kok S, The mammographic image analysis society digital mammogram database, in Int Congress Series Excerpta Medica, Vol. 375, p. 378, 1994.
9. http://peipa.essex.ac.uk/info/mias.html, 5 Oct 2018.
10. Krizhevsky A, Sutskever I, Hinton GE, Imagenet classification with deep convolutional neural networks, in Adv Neur Inf Process Syst 1097:1105, 2012.
11. Girshick R, Donahue J, Darrell T, Malik J, Rich feature hierarchies for accurate object detection and semantic segmentation, in Proc IEEE Conf Computer Vision and Pattern Recognition 580:587, 2014.
12. Long JL, Zhang N, Darrell T, Do convnets learn correspondence?, in Adv Neur Inf Process Syst 1601:1609, 2014.
13. Ciresan D, Giusti A, Gambardella LM, Schmidhuber J, Deep neural networks segment neuronal membranes in electron microscopy images, Adv Neur Inf Process Syst 2843:2851, 2012.
14. Cernazanu-Glavan C, Holban S, Segmentation of bone structure in X-ray images using convolutional neural network, Adv Electr Comput Eng 87:94, 2013.
15. Li S, Chan AB, 3D human pose estimation from monocular images with deep convolutional neural network, in Asian Conf Computer Vision, Springer, Vol. 332, p. 347, 2014.
16. Levi G, Hassner T, Age and gender classification using convolutional neural networks, in Proc IEEE Conf Computer Vision and Pattern Recognition Workshops, Vol. 34, p. 42, 2015.
17. Meier U, Ciresan DC, Gambardella LM, Schmidhuber J, Better digit recognition with a committee of simple neural nets, in Int Conf IEEE Document Analysis and Recognition (ICDAR), Vol. 1250, p. 1254, 2011.
18. Ueda N, Optimal linear combination of neural networks for improving classification performance, IEEE Trans Pattern Anal Mach Intell 207:215, 2000.
19. Zeiler MD, Hierarchical Convolutional Deep Learning in Computer Vision, New York University, 2013.
20. Szegedy C, Liu W, Jia Y, Sermanet P, Reed S, Anguelov D, Erhan D, Vanhoucke V, Rabinovich A, Going deeper with convolutions, in Proc IEEE Conf Computer Vision and Pattern Recognition, Vol. 1, p. 9, 2015.
21. Oquab M, Bottou L, Laptev I, Sivic J, Is object localization for free? Weakly-supervised learning with convolutional neural networks, in Proc IEEE Conf Computer Vision and Pattern Recognition, Vol. 685, p. 694, 2015.
22. Scherer D, Müller A, Behnke S, Evaluation of pooling operations in convolutional architectures for object recognition, in Artificial Neural Networks–ICANN 2010, Springer, Vol. 92, p. 101, 2010.
23. Cireşan DC, Meier U, Masci J, Gambardella LM, Schmidhuber J, High-performance neural networks for visual object classification, arXiv:1102.0183.
24. Huang FJ, Boureau Y-L, LeCun Y, Unsupervised learning of invariant feature hierarchies with applications to object recognition, in IEEE Conf Computer Vision and Pattern Recognition, CVPR'07, Vol. 1, p. 8, 2007.
25. Al-Bander B, Al-Nuaimy W, Williams BM, Zheng Y, Multiscale sequential convolutional neural networks for simultaneous detection of fovea and optic disc, Biomed Signal Process Control 91:101, 2018.
26. Satapathy SC, Udgata SK, Biswal BN, in Proc Int Conf Frontiers of Intelligent Computing: Theory and Applications (FICTA), Springer Science & Business Media, 2012.
27. Rasti R, Teshnehlab M, Phung SL, Breast cancer diagnosis in DCE-MRI using mixture ensemble of convolutional neural networks, Pattern Recog 381:390, 2017.
28. Rezaeilouyeh H, Mollahosseini A, Mahoor MH, Microscopic medical image classification framework via deep learning and shearlet transform, J Med Imag 44:501, 2016.
29. Sharma K, Preet B, Classification of mammogram images by using CNN classifier, in 2016 Int Conf IEEE Advances in Computing, Communications and Informatics (ICACCI), Vol. 2743, p. 2749, 2016.
30. Jiao Z, Gao X, Wang Y, Li J, A deep feature based framework for breast masses classification, Neurocomputing 221:231, 2016.
31. Kooi T, Litjens G, van Ginneken B, Gubern-Merida A, Sanchez CI, Mann R, den Heeten A, Karssemeijer N, Large scale deep learning for computer aided detection of mammographic lesions, Med Image Anal 303:312, 2017.
32. Pak F, Kanan HR, Alikhassi A, Breast cancer detection and classification in digital mammography based on Non-Subsampled Contourlet Transform (NSCT) and Super Resolution, Comput Methods Programs Biomed 89:107, 2015.
33. Fukushima K, Miyake S, Ito T, Neocognitron: A neural network model for a mechanism of visual pattern recognition, IEEE Trans Syst Man Cybernetics 826:834, 1983.
34. Wu CY, Lo S-CB, Freedman MT, Hasegawa A, Zuurbier RA, Mun SK, Classification of microcalcifications in radiographs of pathological specimen for the diagnosis of breast cancer, in Medical Imaging: Image Process Int Soc Opt Photonics 630:642, 1994.
35. Arevalo J, González FA, Ramos-Pollan R, Oliveira JL, Lopez MAG, Representation learning for mammography mass lesion classification with convolutional neural networks, Comput Methods Programs Biomed 248:257, 2016.
36. Żejmo M, Kowal M, Korbicz J, Monczak R, Classification of breast cancer cytological specimen using convolutional neural network, in J Phys Conf Ser 12:60, 2017.
37. Qiu Y, Wang Y, Yan S, Tan M, Cheng S, Liu H, Zheng B, An initial investigation on developing a new method to predict short-term breast cancer risk based on deep learning technology, in Med Imag Comput-Aided Diagnosis: Int Soc Opt Photonics 97:521, 2016.
38. Jiang F, Liu H, Yu S, Xie Y, Breast mass lesion classification in mammograms by transfer learning, in Proc 5th Int Conf Bioinformatics and Computational Biology, ACM, Vol. 59, p. 62, 2017.
39. Suzuki S, Zhang X, Homma N, Ichiji K, Sugita N, Kawasumi Y, Ishibashi T, Yoshizawa M, Mass detection using deep convolutional neural network for mammographic computer-aided diagnosis, in 55th Annual Conf IEEE Society of Instrument and Control Engineers of Japan (SICE), pp. 1382–1386, 2016.
40. Huai L, Ray Liu KJ, Multi-modular neural network for breast cancer detection, Adv Signal Process Technol Soft Comput 140:158, 2000.
41. Fok SC, Ng EYK, Tai K, Early detection and visualization of breast tumor with thermogram and neural network, J Mech Med Biol 185–195, 2002.