
A Low-Cost Radar-based Domain Adaptive Breast Cancer Screening System

Samuel Claflin                            Mohammad Arif Ul Alam
CUBICS Lab                                CUBICS Lab
Dept. of Computer Science                 Dept. of Computer Science
University of Massachusetts Lowell        University of Massachusetts Lowell
samuel_claflin@student.uml.edu            mohammadariful_alam@uml.edu

Abstract— Over the past three decades, advancements in breast cancer screening technologies such as mammography, ultrasound, and magnetic resonance imaging (MRI) have saved countless lives. The introduction of mammography screening for breast cancer in the 1990s led to a technological revolution; nowadays, mammography is coupled with MRI and/or ultrasound to achieve diagnoses of much greater accuracy than previously attainable. However, these technologies (mammography, ultrasound, MRI) are not as widely available to patients as one might assume: less fortunate countries (such as Bangladesh) often cannot afford the potentially enormous price tag that several ultrasound machines of sufficient quality for accurate diagnosis incur. In this paper, we present a breast cancer screening system based on a low-cost (<100 USD) millimeter-wave (mmWave) radar sensor array (3-10 GHz) imaging technology and a deep learning domain adaptation model. More specifically, (i) we develop a mmWave radar sensor array (18 sensor antennas) based 2D imaging system; (ii) we develop a deep learning based domain adaptation model that can learn breast segmentation and cancer detection from expensive source data (mammography, ultrasound) and transfer that knowledge to less expensive target data (radar images); and (iii) we validate our system and methods using our existing mammography and ultrasound breast cancer screening data as well as 14 patients' radar images collected from a third world country (Bangladesh).

Keywords— domain adaptation, deep learning, breast cancer detection, radar imaging, mmWave

I. INTRODUCTION

It is no secret that breast cancer is one of the greatest threats to women's lives worldwide: it is the second leading cause of cancer death in women, trailing only lung cancer, and a woman's lifetime risk of dying from it is about 2.6% [10]. As with all types of cancer, early detection and early treatment are absolutely essential to preventing mortality. Since 1990, breast cancer mortality rates have dropped by 30%, largely due to the great strides made in mammography-based screening methods [5]. While this is undeniably a great success for the medical and scientific communities, it is crucial to maintain the momentum and build upon these achievements in order to prevent further fatalities. It is this very mindset that led to the development of, and significant advancements in, both MRI-based [12] and ultrasound-based [11] screening methods, which are commonly coupled with mammography [5]. MRI screening has been shown to provide a greater detection rate when performed in addition to mammography, but because of its considerable expense it is typically recommended only for patients whose hereditary risk of developing breast cancer exceeds 20% [13]; adding MRI screening to annual mammogram screenings increases medical expenses for a given patient by $50,000+ on average [5]. Ultrasound offers a fair middle ground between the two former methods, proving significantly less expensive for patients than MRI screening and a decent competitor to the mammogram-MRI duo in detection accuracy [13]. However, ultrasound machines suitable for accurate breast cancer screening are often unaffordable for hospitals in less fortunate countries (like Bangladesh), with an average cost of $20,000-$40,000 for a single, small unit; actual costs frequently exceed this average by a great margin [3].

In this paper, we investigate an alternative to both ultrasound and MRI as supplemental breast cancer screening methods, one capable of achieving equivalent or better accuracy while being significantly more affordable and accessible. To accomplish this, we utilize an inexpensive mmWave radar sensor array-based imaging technology (commonly found in stud finders) coupled with deep learning and transfer learning.

Figure 1: Overall System Architectural Diagram

II. OVERALL SYSTEM ARCHITECTURE

Fig 1 illustrates a schematic diagram of our developed system architecture, which consists of four components:

A. Radar Sensing System

We use the Walabot [9] mmWave imaging radar sensor. The Walabot is a low-cost (79 USD) radar sensor unit that operates by using an antenna array (15 antennas operating at frequencies ranging from 3-10 GHz) to transmit radar signals, receive any returned signals, and use the received signals to construct a matrix of raw image data that can be accessed through a USB interface and the API of a selected programming language.



The property of sensors of this nature that proves most crucial to the possible detection of breast cancer is that the transmitted waves pass through dielectric materials (such as a wall) and bounce off of conductive materials (such as a nail within a stud) [9]. In theory, signals transmitted by the Walabot sensor would pass through skin and flesh but bounce off of a potentially cancerous region within the breast, generating raw image data similar to that of mammography, ultrasound, and MRI. Fig 2 shows our Walabot mmWave sensor unit (Fig 2(a)) and sample 2D images generated from the radar signal return with cancer (Fig 2(b)) and without cancer (Fig 2(c)).
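As a concrete illustration, the following minimal sketch captures one raw 2D image slice with the Vayyar WalabotAPI Python bindings; the profile, arena bounds, and threshold shown here are illustrative assumptions rather than the exact parameters used in our system.

```python
import WalabotAPI as wlbt

wlbt.Init()        # load the Walabot SDK library
wlbt.Initialize()
wlbt.ConnectAny()  # connect to the first Walabot found over USB

# Short-range imaging profile with an illustrative Cartesian arena (in cm).
wlbt.SetProfile(wlbt.PROF_SHORT_RANGE_IMAGING)
wlbt.SetArenaX(-5, 5, 0.5)
wlbt.SetArenaY(-5, 5, 0.5)
wlbt.SetArenaZ(1, 10, 0.5)
wlbt.SetThreshold(15)  # reflections weaker than this are discarded

wlbt.Start()
wlbt.Trigger()  # perform one scan cycle

# rawImage is a 2D intensity matrix that can be saved as an image.
rawImage, sizeX, sizeY, sliceDepth, power = wlbt.GetRawImageSlice()

wlbt.Stop()
wlbt.Disconnect()
```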

Figure 2: (a) mmWave radar sensor device with axis indication; sample 2D images generated from the mmWave radar sensor for (b) a cancerous breast and (c) a non-cancerous breast.

B. Data Preprocessing

Our radar sensing system saves the raw 2D images as well as the raw radar sensor data. We annotate the images with "No Cancer" and "Cancer" labels, creating a binary classification problem. We apply typical standardization, normalization, and transformation steps prior to training the deep learning model.
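The sketch below shows what this standardization and normalization can look like for a single raw slice; the specific steps and constants are illustrative assumptions, not parameters prescribed above.

```python
import numpy as np

LABELS = {"No Cancer": 0, "Cancer": 1}  # binary classification targets

def preprocess(image: np.ndarray) -> np.ndarray:
    """Standardize and normalize one raw 2D radar image slice."""
    img = image.astype(np.float32)
    img = (img - img.mean()) / (img.std() + 1e-8)             # zero mean, unit variance
    img = (img - img.min()) / (img.max() - img.min() + 1e-8)  # rescale to [0, 1]
    # Geometric transformations (resizing, cropping, etc.) would follow here,
    # e.g., via PIL or OpenCV, before the image is fed to the network.
    return img
```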
Figure 3: U-NET Model Architecture

C. U-NET Training

The convolutional neural network architecture chosen for segmenting regions of interest within mammogram, ultrasound, and Walabot images is the U-Net model, an advanced CNN developed specifically for the segmentation of biomedical images. In Fig 3, each blue arrow represents a passage of the image data through a convolutional layer of the network with a rectified linear unit (ReLU) activation function. Each red arrow represents a passage of the image data through a max pooling layer; these layers are responsible for reducing the size of the input feature maps. Because the first half of the U-Net model progressively reduces the x and y dimensions of the input image until it has been reduced to a single vector of data, it is referred to as the contraction side. The second half of the U-Net model is referred to as the expansion side because its up-convolutional layers, in conjunction with additional standard convolutional layers, convert that single-dimensional vector into a two-channel, high-resolution segmentation map. Each channel in the output segmentation map corresponds to one of two classes: background or region of interest (ROI) [6]. We use the U-Net model as the feature-extraction layers for domain adaptation (Feature Layers in Fig 4).
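To make the contraction and expansion sides concrete, here is a minimal PyTorch sketch of the pattern described above; the depth and channel widths are illustrative assumptions (the full U-Net of [6] is considerably deeper).

```python
import torch
import torch.nn as nn

def double_conv(in_ch: int, out_ch: int) -> nn.Sequential:
    """Two 3x3 convolutions with ReLU activations (blue arrows in Fig 3)."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.down1 = double_conv(1, 64)
        self.down2 = double_conv(64, 128)
        self.pool = nn.MaxPool2d(2)  # red arrows: halve spatial size
        self.bottleneck = double_conv(128, 256)
        self.up2 = nn.ConvTranspose2d(256, 128, 2, stride=2)  # up-convolution
        self.conv2 = double_conv(256, 128)
        self.up1 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.conv1 = double_conv(128, 64)
        self.out = nn.Conv2d(64, n_classes, 1)  # two-channel segmentation map

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        d1 = self.down1(x)                                    # contraction side
        d2 = self.down2(self.pool(d1))
        b = self.bottleneck(self.pool(d2))
        u2 = self.conv2(torch.cat([self.up2(b), d2], dim=1))  # expansion side
        u1 = self.conv1(torch.cat([self.up1(u2), d1], dim=1))
        return self.out(u1)  # per-pixel logits: background vs. ROI
```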
Figure 4: Domain Adaptation Model Schematic Diagram

D. Domain Adaptation Model

While the performance of convolutional neural networks on computer vision tasks is undeniable, these networks often perform exceptionally poorly in the absence of exceedingly large datasets. As previously mentioned, datasets of this size and quality are especially hard to come by in biomedical applications. Strategies such as data augmentation, in which a predetermined series of transformations is applied to each provided image in order to generate several additional images, have been used with profound success [7]. To overcome the inevitable data shortage we would face during experimentation, since the entirety of the Walabot image dataset would need to be generated by our research team, we decided to implement data augmentation in addition to transfer learning. Transfer learning, or domain adaptation, refers to a method of machine learning in which a model is first trained on one or more general tasks, known as source tasks, that are typically closely related to the target task. After the initial source training is complete, the model is adapted to the target task and, in theory, achieves greater performance due to its prior knowledge of the related source task(s). In addition to improving performance, transfer learning allows machine learning models to more closely mimic human learning by emulating the way in which new information is related to prior concepts in order to develop a more intuitive understanding [8].
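The augmentation step mentioned above can be assembled from standard image transformations. A minimal torchvision sketch follows; the specific transforms and their ranges are illustrative assumptions.

```python
import torchvision.transforms as T

# Each pass of an image through this pipeline yields a new, slightly
# different training sample, multiplying the effective dataset size.
augment = T.Compose([
    T.RandomHorizontalFlip(p=0.5),
    T.RandomRotation(degrees=10),
    T.RandomResizedCrop(size=128, scale=(0.9, 1.0)),
])
```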
Training Domain Adaptation Model: We develop a transfer learning framework as shown in Fig 4. The framework contains two CNN-based models, a Source model and a Target model, and each model has two component layers: (i) Feature Layers and (ii) Classification Layers. At first, we train our baseline model (U-Net) using the source dataset and consider it the source model. While training the source model, we designate the final two layers of the U-Net model as the "Classification Layers" and the rest of the U-Net model as the "Feature Layers". After training the source model, we separate out the feature layers, freeze their weight updates, add classification layers consisting of a fully connected dense layer (softmax activation) with a two-class categorical output, and enable weight updates on the classification layers. The newly assembled model is called the "Target Model", which we then train using the target dataset. The final loss function is an energy function computed by a pixel-wise soft-max over the final feature map combined with the cross-entropy loss function.
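For reference, this energy function follows the formulation of the original U-Net paper [6]; here a_k(x) is the activation in feature channel k at pixel position x, l(x) is the true class label of that pixel, and w(x) is an optional pixel-weight map. It is written with a leading minus sign so that minimizing E corresponds to the cross-entropy.

```latex
p_k(\mathbf{x}) = \frac{\exp\big(a_k(\mathbf{x})\big)}{\sum_{k'=1}^{K} \exp\big(a_{k'}(\mathbf{x})\big)},
\qquad
E = -\sum_{\mathbf{x} \in \Omega} w(\mathbf{x}) \, \log p_{\ell(\mathbf{x})}(\mathbf{x})
```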
III. EXPERIMENTAL EVALUATION

To evaluate our proposed framework, we used several existing datasets and also collected a small exploratory dataset under an IRB exception. We then evaluated our proposed framework's efficiency in terms of breast cancer segmentation and screening accuracies.
A. Radar System Sensitivity Test in Lab Environment

Because the strength of the signal received by the Walabot sensor depends on the conductivity of the material it encounters, we conducted an experiment to confirm that the sensor's sensitivity was suitable for the breast screening task before we started collecting data from patients. The procedure was as follows: inject several pieces of thick, fatty pork with a dense fluid; calibrate the Walabot on a portion of pork known not to contain fluid; then, following the calibration period, slowly move the sensor from the area without fluid to the area containing it, noting any visible differences that appear in the scan while saving all of the raw image data obtained from the sensor. All visualization and recording of the raw image data was performed with a proprietary desktop application written in C++ against the Walabot API provided by Vayyar. The application allowed all of the Walabot sensor's parameters to be adjusted procedurally to achieve optimal results for the target scenario, and it automatically converted the unformatted raw image data into several raw image slices that can be cycled through to visualize the entire raw image efficiently and intuitively. The experiment was performed several times on 5 unique pieces of pork, with data collected from several different locations and angles. Ferrofluid was chosen as the dense injection medium that the sensor would attempt to detect throughout the experiment iterations. This experiment confirmed that a standard Walabot sensor is indeed capable of differentiating between regions of similar moisture/density within a single object.
B. Datasets

We utilized the following datasets:

1. Breast Ultrasound Dataset (BUD) includes breast ultrasound images of women between 25 and 75 years old, collected in 2018 [2]. It covers 600 female patients with 750 images in PNG format. The images are categorized into three classes: normal, benign, and malignant. We relabel the normal and benign ultrasound images as "No Cancer" and the malignant images as "Cancer".

2. Digital Database for Screening Mammography (DDSM) includes decompressed images, data selection and curation by trained mammographers, updated mass segmentations and bounding boxes, and pathologic diagnoses for the training data, formatted similarly to modern computer vision datasets. The dataset contains 753 calcification cases and 891 mass cases, a size capable of supporting the analysis of decision support systems in mammography.

3. Our Data was collected in a rural area (Khulna) of a third world country (Bangladesh). This dataset is extremely small compared to the others: it includes 14 Walabot radar images with accompanying diagnosis reports (cancer or no cancer), of which 4 have cancer and 10 do not. Fig 2 shows cancerous (Fig 2(b)) and non-cancerous (Fig 2(c)) breast screening images captured with the Walabot mmWave radar sensor.

C. Training Domain Adaptation Models

We use the BUD, DDSM, and our own datasets to evaluate our proposed domain adaptation framework. In this regard, we ran the following transfer learning experiments (a sketch of the shared recipe follows the list):

BUD → DDSM: We downsized the high-resolution ultrasound images to dimensions similar to those of the mammography images, trained the source model on the ultrasound images until convergence, froze the feature layers, added classification layers to form the target model, and trained the target model on the mammography images.

DDSM → BUD: We downsized the high-resolution ultrasound images to dimensions similar to those of the mammography images, trained the source model on the mammography images until convergence, froze the feature layers, added classification layers to form the target model, and trained the target model on the ultrasound images.

DDSM → Our Data: We downsized the high-resolution mammography images to dimensions similar to those of our radar images, trained the source model on the mammography images until convergence, froze the feature layers, added classification layers to form the target model, and trained the target model on our radar images.

BUD → Our Data: We downsized the high-resolution ultrasound images to dimensions similar to those of our radar images, trained the source model on the ultrasound images until convergence, froze the feature layers, added classification layers to form the target model, and trained the target model on our radar images.
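All four experiments share the freeze-and-fine-tune recipe described in Section II-D. The PyTorch sketch below illustrates that recipe, reusing the TinyUNet sketch above as the frozen feature extractor; the pooled two-unit head is an illustrative simplification of the dense softmax classification layer described earlier.

```python
import torch
import torch.nn as nn

def build_target_model(source_model: nn.Module, n_classes: int = 2) -> nn.Module:
    """Freeze the trained source feature layers and attach a trainable head."""
    for p in source_model.parameters():
        p.requires_grad = False  # feature layers no longer receive updates

    head = nn.Sequential(
        nn.AdaptiveAvgPool2d(1),          # collapse the segmentation map
        nn.Flatten(),
        nn.Linear(n_classes, n_classes),  # fully connected classification layer
        nn.Softmax(dim=1),                # two-class categorical output
    )
    return nn.Sequential(source_model, head)

# Usage sketch: only the new head's parameters are optimized on the
# (small) target dataset, e.g., our 14 radar images.
# target_model = build_target_model(trained_source_unet)
# optimizer = torch.optim.Adam(
#     (p for p in target_model.parameters() if p.requires_grad), lr=1e-4)
```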
D. Results

We first created training and testing datasets such that each split preserved the class distribution of the entire dataset. We then trained each model using these training and testing samples. Table 1 shows the transfer learning accuracy, specificity, sensitivity, and final losses. First, for the U-Net model trained on target data alone ("Target Only"), mammography-based classification accuracy outperforms every other modality, while radar-based classification alone provides the lowest breast cancer detection accuracy. We can explain this result as follows: mammography has the lowest noise in human body imaging, while radar has the highest noise when imaging human flesh. After applying the transfer learning framework, all of the target-only baselines improved significantly.

Figure 5: Validation accuracy and losses of target transfer learning in different use cases: BUD → DDSM, DDSM → BUD, DDSM → Our Data, and BUD → Our Data

Table 1: Summary of Results (each column is a source → target experiment)

              BUD → DDSM    DDSM → BUD    DDSM → Our Data   BUD → Our Data
Target Only   90.33 ± 0.2   91.36 ± 0.1   69.45 ± 0.8       69.45 ± 0.8
Accuracy      93.56 ± 0.2   95.35 ± 0.2   85.39 ± 0.4       83.75 ± 0.5
Specificity   89.45 ± 0.1   90.50 ± 0.3   86.80 ± 0.5       85.79 ± 0.4
Sensitivity   92.81 ± 0.2   93.53 ± 0.1   83.53 ± 0.3       82.85 ± 0.2
Loss          0.12 ± 0.05   0.08 ± 0.03   0.17 ± 0.04       0.19 ± 0.065

("Target Only" reports the accuracy of the baseline U-Net trained on the target dataset alone; the remaining rows describe the transfer-learned target model.)
Since mammography-based breast cancer detection is more accurate than ultrasound-based detection, we can also see that the transfer learning task from mammography to our radar images (DDSM → Our Data) provides a larger improvement (85.39%) than the corresponding transfer from ultrasound (BUD → Our Data, 83.75%). We also observe satisfactory specificity and sensitivity in detecting breast cancer from radar images, with promising final losses for the trained transfer learning models.
IV. LIMITATIONS AND FUTURE WORKS

We have appropriate datasets of ultrasound and mammography images for breast cancer screening, but we do not have enough data (only 14 samples) for radar images. Although we achieved promising results using only 14 radar image samples in our U-Net transfer model, the baseline U-Net model (Target Only in Table 1) failed significantly on radar images. We aim to collect more data from our study site, Khulna, Bangladesh, in the future. At the same time, we aim to collect mammography and ultrasound breast cancer screening images alongside the radar images in order to align and validate more efficient transfer learning models.

V. CONCLUSION

In this paper, the prospect of significantly reducing the expense, while increasing the accessibility, of conducting breast cancer screenings of quality equivalent to modern standards is proposed. Through the use of radar sensing technology, deep learning with convolutional neural networks, and transfer learning, we believe that this goal is achievable in the very near future. Our transfer learning trials with an ultrasound dataset as the source and a mammogram dataset as the target confirmed not only that the U-Net model performs exceedingly well in this particular biomedical application, but also that models implemented using transfer learning can, at the very least, achieve an accuracy equivalent to that of a model trained entirely on a target dataset, with significantly less target data required. It is our hope that the impending construction of a radar image dataset will allow satisfactory results to be obtained with these modern architectures and transfer learning techniques.

ACKNOWLEDGMENT

This project has been funded by the University of Massachusetts Lowell Immersive Scholars Program.

REFERENCES

[1] G. Eason, B. Noble, and I. N. Sneddon, "On certain integrals of Lipschitz-Hankel type involving products of Bessel functions," Phil. Trans. Roy. Soc. London, vol. A247, pp. 529-551, April 1955.
[2] W. Al-Dhabyani, M. Gomaa, H. Khaled, and A. Fahmy, "Dataset of breast ultrasound images," Data in Brief, vol. 28, 2020.
[3] A. Escobar, "Buying a Breast Ultrasound Machine: Prices, Features, and Advice," January 2020. https://www.kompareit.com/business/medical-equipment-buying-an-ultrasound-machine.html
[4] J. Gu et al., "Recent advances in convolutional neural networks," Pattern Recognit., vol. 77, pp. 354-377, 2018.
[5] C. H. Lee, D. D. Dershaw, D. Kopans, et al., "Breast cancer screening with imaging: recommendations from the Society of Breast Imaging and the ACR on the use of mammography, breast MRI, breast ultrasound, and other technologies for the detection of clinically occult breast cancer," J Am Coll Radiol, vol. 7, no. 1, pp. 18-27, 2010. doi:10.1016/j.jacr.2009.09.022
[6] O. Ronneberger, P. Fischer, and T. Brox, "U-Net: Convolutional Networks for Biomedical Image Segmentation," in Medical Image Computing and Computer-Assisted Intervention (MICCAI 2015), Lecture Notes in Computer Science, vol. 9351, Springer, Cham, 2015.
[7] D. Tellez, G. Litjens, P. Bándi, et al., "Quantifying the effects of data augmentation and stain color normalization in convolutional neural networks for computational pathology," Med Image Anal.
[8] L. Torrey and J. Shavlik, "Transfer Learning," in Handbook of Research on Machine Learning Applications, E. Soria, J. Martin, R. Magdalena, M. Martinez, and A. Serrano, Eds. IGI Global, 2009.
[9] Vayyar, "Walabot Technical Brief," 2020.
[10] T. Wyant, "How Common Is Breast Cancer?: Breast Cancer Statistics," January 8, 2020.
[11] D. Thigpen, A. Kappler, and R. Brem, "The Role of Ultrasound in Screening Dense Breasts: A Review of the Literature and Practical Solutions for Implementation," Diagnostics (Basel), vol. 8, no. 1, p. 20, March 2018. doi:10.3390/diagnostics8010020
[12] S. Radhakrishna, S. Agarwal, P. M. Parikh, et al., "Role of magnetic resonance imaging in breast cancer management," South Asian J Cancer, vol. 7, no. 2, pp. 69-71, 2018. doi:10.4103/sajc.sajc_104_18
[13] S. Malur, S. Wurdinger, A. Moritz, W. Michels, and A. Schneider, "Comparison of written reports of mammography, sonography and magnetic resonance mammography for preoperative evaluation of breast lesions, with special emphasis on magnetic resonance mammography," Breast Cancer Res, 2001.
