
Submission Cover Sheet

Please use block capitals

Student Name Harry Beggy

Student ID No.
18378271
Module Code EE425

Degree Mechatronic Engineering

Programme ME4

Year Final

Date 03/11/2021

For use by examiners only (students should not write below this line)

------------------------------------------------------------------------------------------------------------------------------

Abstract.
This report details how image processing techniques were implemented to solve specific problems using Google Colaboratory with the support of the scikit-image documentation. Part one of the assignment involved resizing an image, converting it to greyscale and finding the image's minimum, maximum and mean greyscale values, which were approximately 0.07, 0.99 and 0.53. Gaussian noise was then applied to the image, which was filtered using 10x10 mean and median rank filters; both slightly blurred the image. Gaussian noise reduction was then applied, which produced the clearest image with a sigma value of 0.5.
Part two involved taking an image containing different objects and picking out the smallest pair of scissors. The two pairs of scissors were identified using their Euler numbers, the smaller area was then found using ‘while’ loops and ‘if’ statements, the remaining labelled objects were blackened out, and the selected scissors were highlighted in the original image. The smallest scissors were located closest to the bottom of the image, with a centroid of (818, 593) and an area of 8555 pixels; after noise was added and the noise reduction techniques were applied, the area increased slightly depending on the level of noise applied to the image. The program was able to detect the smallest scissors with Gaussian noise applied up to a variance of 2 and failed at 2.5.
Part three took the resized image from part one and used histogram matching to adjust its colour profile to match that of a reference image, and the corresponding histograms of the input, reference and matched output were plotted. These results were displayed and plotted successfully. The matched output was much darker than the input image.

Contents
Abstract.
Section 1 - Part 1: Filters.
Section 1.1 Introduction.
Section 1.1.1 Part 1(a) Introduction.
Section 1.1.2 Part 1(b) Introduction.
Section 1.1.3 Part 1(c) Introduction.
Section 1.1.4 Part 1(d) Introduction.
Section 1.2 Pseudo Code.
Section 1.2.1 Part(a) Pseudo Code.
Section 1.2.2 Part(b) Pseudo Code.
Section 1.2.3 Part(c) Pseudo Code.
Section 1.2.4 Part(d) Pseudo Code.
Section 1.3 Code Implementation.
Section 1.3.1 Part(a) Code implementation.
Section 1.3.2 Part(b) Code implementation.
Section 1.3.3 Part(c) Code implementation.
Section 1.3.4 Part(d) Code implementation.
Section 1.4 Results.
Section 1.4.1 Part(a) Results.
Section 1.4.2 Part(b) Results.
Section 1.4.3 Part(c) Results.
Section 1.4.4 Part(d) Results.
Section 1.5 Discussion and Conclusion.
Section 1.5.1 Part(a) Discussion and Conclusion.
Section 1.5.2 Part(b) Discussion and Conclusion.
Section 1.5.3 Part(c) Discussion and Conclusion.
Section 1.5.4 Part(d) Discussion and Conclusion.
Section 2 - Part 2: Region Properties.
Section 2.1 Introduction.
Section 2.1.1 Part(a) Introduction.
Section 2.1.2 Part(b) Introduction.
Section 2.2 Techniques and Rationale.
Section 2.2.1 Part(a) Techniques and Rationale.
Section 2.2.2 Part(b) Techniques and Rationale.
Section 2.3 Pseudo Code.
Section 2.3.1 Part(a) Pseudo Code.
Section 2.3.2 Part(b) Pseudo Code.
Section 2.4 Code Implementation.
Section 2.4.1 Part(a) Code Implementation.
Section 2.4.2 Part(b) Code Implementation.
Section 2.5 Testing and Results.
Section 2.5.1 Part(a) Testing and Results.
Section 2.5.2 Part(b) Testing and Results.
Section 2.6 Discussion and Conclusion.
Section 2.6.1 Part(a) Discussion and Conclusion.
Section 2.6.2 Part(b) Discussion and Conclusion.
Section 3 - Part 3: Colour Matching.
Section 3.1 Introduction.
Section 3.2 Pseudo Code.
Section 3.3 Code Implementation.
Section 3.4 Results and Analysis.
Section 3.5 Discussion and Conclusion.
REFERENCES
APPENDIX

Section 1 - Part 1: Filters.
Section 1.1 Introduction.
This part of the assignment was about learning how to apply filters and how they work. There
are four sub-parts to this section: (a), (b), (c) and (d).
Section 1.1.1 Part 1(a) Introduction.
Part (a) required taking an image of my face and resizing it to 512x512 pixels. The resized
image had to be converted to greyscale and displayed. The minimum, maximum and mean
greyscale values in the image had to be displayed.
Section 1.1.2 Part 1(b) Introduction.
Part (b) involved taking the output image from part (a) and applying noise to it. Gaussian
noise was applied at a mean value of zero and a variance of 0.01. The resultant image had to
be displayed.
Section 1.1.3 Part 1(c) Introduction.
Part (c) used the image generated in part (b) to examine noise reduction techniques. 10x10
mean and median rank filter-based noise reduction techniques were used. The results needed
to be displayed and discussed comparing the theoretical and practical outcome.
Section 1.1.4 Part 1(d) Introduction.
The final part applied a different noise reduction technique to the image from part (b).
Gaussian noise reduction was used, varying the standard deviation. The standard deviation
which produced the clearest image had to be found, and a comparison with the filtering
techniques from part (c) needed to be made.

Section 1.2 Pseudo Code.


• Mount local drive[1].
• Load appropriate ‘skimage’ packages.
• Load image file (‘my_face.jpg’).
Section 1.2.1 Part(a) Pseudo Code.
• Resize the image file using resize[2] (Resizes the image to match the pixel dimensions
specified in its parameters).
• Convert to grayscale using rgb2gray[3] (Computes the luminance of an RGB image).
• Display the minimum, maximum and mean greyscale values.
Section 1.2.2 Part(b) Pseudo Code.
• Apply Gaussian noise to the image from part (a) using random_noise[4] (Adds random
noise to an image. Uses ‘gaussian’ mode, which has a default variance of 0.01 and mean
of 0).
Section 1.2.3 Part(c) Pseudo Code.
• Apply a 10x10 mean rank filter to the image in part (b) using mean[5] (Returns the
local mean of an image with a function as a parameter called square[6] [generates a
flat square-shaped footprint] to use as the neighbourhood).

• Apply a 10x10 median rank filter to the image in part (b) using median[7] (Returns
the local median of an image with a function as a parameter called square[6]).
Section 1.2.4 Part(d) Pseudo Code.
• Apply a Gaussian filter to the image in part (b) using gaussian[8] (multi-dimensional
Gaussian filter), varying the standard deviation by changing the ‘sigma’ parameter.

Section 1.3 Code Implementation.


The image being used was stored in a Google Drive folder. The folder containing the image
had to be accessed.

from google.colab import drive, files
import os

drive.mount('/content/gdrive', force_remount=True)
working_dir = "/content/gdrive/My Drive/images"

# directory of image data
image_dir = os.path.join(working_dir, '/content/gdrive/My Drive/images')  # [1]

Figure 1. Accessing the folder containing the image.


As seen in figure one, the image was contained in an ‘images’ folder on Google Drive and a
working directory was set up pointing to that folder. All the functions required then needed to
be imported from scikit-image.

from skimage import data, io, filters


from skimage.transform import resize
from skimage.color import rgb2gray
from skimage.util import random_noise
from skimage.filters.rank import mean, median
from skimage.morphology import square
from skimage.filters import gaussian
import matplotlib.pyplot as plt
Figure 2. Loading the required packages.

As seen in figure two, all the functions used are imported from ‘skimage’. These functions
will be called by name when needed. The image was then extracted from the folder.
image_file = io.imread(os.path.join(image_dir, 'my_face.jpg'))  # [1]

Figure 3. Extracting the image from the image directory.


As seen in figure three, the image being taken is ‘my_face.jpg’ and is being loaded from the
image directory set up in figure one and assigned the variable name ‘image_file’.

Section 1.3.1 Part(a) Code implementation.
To resize the image, ‘resize’ was used to change the image's dimensions in pixels by passing
‘image_file’ and the target dimensions as arguments to the function. The resized image was
then converted to greyscale by passing it through the ‘rgb2gray’ function.
resized = resize(image_file, [512,512])
grayscale = rgb2gray(resized)
Figure 4. Resizing the image and converting it to greyscale.
As seen in figure four, the resized image is changed to the pixel dimensions of 512x512 and
assigned the variable name ‘resized’. ‘Resized’ is then passed through ‘rgb2gray’ and given
the variable name ‘grayscale’. The minimum, maximum and mean greyscale values were found
by calling the ‘min()’, ‘max()’, and ‘mean()’ functions on ‘grayscale’ and printing the results
using the ‘print()’ function.
print("Min/Max/Mean:", grayscale.min(), grayscale.max(), grayscale.mean
(),)[1]
Figure 5. Finding and displaying the minimum, maximum and mean greyscale values.
As seen in figure five, the functions are called on ‘grayscale’ and the results are printed after
the corresponding label.

Section 1.3.2 Part(b) Code implementation.


Gaussian noise was added to ‘grayscale’ by passing it through the ‘random_noise’ function
with the ‘mode’ parameter set to ‘gaussian’.
noisy = random_noise(grayscale, mode = 'gaussian')
Figure 7. Applying Gaussian noise to ‘grayscale’.
As seen in figure seven, ‘grayscale’ has Gaussian noise applied to it and has been assigned to
a new variable, ‘noisy’. The noise had to have a variance of 0.01 and a mean of 0.0, but as
these are the default values in ‘random_noise’, nothing further needs to be specified.
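For clarity, the defaults could also be written out explicitly; a minimal sketch using the documented parameter names:

# Equivalent call with the default Gaussian parameters made explicit
# ('mean' and 'var' are only used in 'gaussian' mode).
noisy = random_noise(grayscale, mode='gaussian', mean=0.0, var=0.01)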
Section 1.3.3 Part(c) Code implementation
10x10 mean and median rank filters were applied using the ‘mean’ and ‘median’ rank filter
functions respectively. The ‘noisy’ image was passed through both functions. To get a 10x10
neighbourhood, the ‘square’ function was used with the value 10 passed to it.
mean_rank = mean(noisy, square(10))
median_rank = median(noisy, square(10))
Figure 8. Mean and median rank filters being applied to ‘noisy’.
As seen in figure eight, ‘noisy’ is modified using its local mean and median values, and the
results are assigned the names ‘mean_rank’ and ‘median_rank’.
Section 1.3.4 Part(d) Code implementation
A Gaussian filter was applied to ‘noisy’ for varying values of the parameter ‘sigma’, the
standard deviation, which is equal for all axes. The ‘mode’ parameter was set to ‘reflect’,
which determines how the array borders are handled; this had to be changed as the function's
default ‘mode’ is ‘nearest’[8].
gauss = gaussian(noisy, sigma=0.5, mode='reflect')
Figure 9. Gaussian filter being applied to ‘noisy’.

As seen in figure nine, ‘sigma’ was given a starting value of 0.5. This value was then increased
in increments of one, four times.
Section 1.4 Results.
Section 1.4.1 Part(a) Results.

Figure 10. Original image resized and converted to greyscale.


As seen in figure 10, the image was resized and converted to greyscale. The minimum,
maximum and mean greyscale values were displayed.

Section 1.4.2 Part(b) Results.

Figure 11. Gaussian noise applied to the greyscale image.


As seen in figure 11, noise has been added to the image.

Section 1.4.3 Part(c) Results.

Figure 12. Mean and median rank filters applied to ‘noisy’.


As seen in figure 12, both mean and median rank filters have been applied to the image.

Section 1.4.4 Part(d) Results.

Figure 13. Gaussian filter applied for various standard deviations.


The effects of changing standard deviation can be seen in figure 13.

Section 1.5 Discussion and Conclusion.
Section 1.5.1 Part(a) Discussion and Conclusion.
The resized image looked a lot squarer than the original image, as seen in figure 10. It looks
as if the image has been stretched horizontally and squeezed vertically to give the requested
512x512 pixel dimensions. The greyscale image, also seen in figure 10, looks as expected: the
image with RGB channels has been converted to an image with a single greyscale channel.
This means the value of each greyscale pixel is based on the weighted sum of the
corresponding red, green and blue pixels[9]. The greyscale pixel is created based on the
following equation:
Greyscale pixel = 0.2125R + 0.7154G + 0.0721B   (1)[9]

Equation one was successfully implemented, as can be seen in figure 10. The minimum,
maximum and mean greyscale values correspond to the minimum, maximum and mean values
returned using equation one.
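As an illustrative check (a sketch, not part of the submitted program), the weighted sum of equation one can be applied manually with NumPy and compared against the output of ‘rgb2gray’:

import numpy as np
from skimage.color import rgb2gray

# 'resized' is assumed to be the 512x512 RGB image from figure four.
manual_gray = (0.2125 * resized[..., 0]
               + 0.7154 * resized[..., 1]
               + 0.0721 * resized[..., 2])

# The manual weighted sum should agree with rgb2gray to within floating-point error.
print(np.allclose(manual_gray, rgb2gray(resized)))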

Section 1.5.2 Part(b) Discussion and Conclusion.


It is clear when comparing figure 11 to figure 10 that Gaussian noise has been added: the
image is fuzzy and obscured. The noise is independent of the image content, as it has simply
been added to the image[10]. The degraded image seen in figure 11 is the sum of the original
image and the additive noise[10].
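The additive model can be reproduced directly with NumPy; a sketch assuming the variance of 0.01 used above (‘random_noise’ performs essentially the same operation internally, plus clipping):

import numpy as np

rng = np.random.default_rng()
# Zero-mean Gaussian noise with variance 0.01 (standard deviation = sqrt(0.01)).
noise = rng.normal(loc=0.0, scale=np.sqrt(0.01), size=grayscale.shape)

# Degraded image = original image + additive noise, clipped to the valid range.
noisy_manual = np.clip(grayscale + noise, 0.0, 1.0)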

Section 1.5.3 Part(c) Discussion and Conclusion.


It is clear from figure 12 that the median rank filter produced a less blurred image than the
mean rank filter, although both produced a very blurred image. Using the mean and median
filters, a significant amount of noise was expected to be removed, as a 10x10 neighbourhood
is relatively large. It was expected that the median filter would be the more effective of the
two at noise reduction, as it evaluates a large image neighbourhood[10]. A median filter sorts
the pixels in the surrounding neighbourhood into numerical order and replaces the centre pixel
with the median of those values, while the mean filter simply replaces the centre pixel with the
mean of its surrounding pixels[11]. What was unexpected was the significant blurring of both
images in figure 12.
The blurring was caused by the functions ‘mean’ and ‘median’, as they return the local mean
or median of an image[5],[7]. Since local averaging introduces a significant level of image
blur, it is clear why the mean rank filter produced a more blurred image than the median rank
filter: the mean is a local average[12].
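The difference can be seen on a single neighbourhood; the following sketch uses a hypothetical 3x3 patch rather than the 10x10 footprint used above:

import numpy as np

# Hypothetical 3x3 neighbourhood containing one noise spike (200).
patch = np.array([[10, 12, 11],
                  [13, 200, 12],
                  [11, 12, 10]])

# Mean filter: the centre pixel becomes the average, pulled up by the spike.
print(patch.mean())       # approximately 32.3
# Median filter: the centre pixel becomes the middle value, so the spike is rejected.
print(np.median(patch))   # 12.0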

Section 1.5.4 Part(d) Discussion and Conclusion.
From figure 13, a standard deviation of 0.5 produced the clearest image. It is evident that as
the standard deviation increased, the noise reduction became worse and even appeared to add
noise to the image. It filtered significantly better than the mean and median rank filters in
figure 12.
The apparent addition of noise was Gaussian blurring, a result of increased smoothing. This
type of filter implements “data smoothing by convolving the image data with a mask that
approximates the distribution of a 2D-Gaussian function”[11]. The reason the image becomes
more distorted at higher standard deviations is that the smoothing intensifies as the standard
deviation increases[10]. The image becomes extremely blurred past a standard deviation of
three because approximately 99% of the distribution falls within three standard deviations, so
the weights are spread over values far from the mean[12]. This is even more evident when the
Gaussian function itself is examined.
G(x, y) = (1 / (2πσ²)) e^(−(x² + y²) / (2σ²))   (2)[12]

Examining equation two, as the standard deviation increases for a fixed convolution kernel,
the value of the function at the centre decreases.
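Evaluating equation two at the centre of the kernel illustrates this; a short sketch:

import numpy as np

def gaussian_2d(x, y, sigma):
    # 2D Gaussian of equation two.
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)

# Peak value of the kernel for increasing standard deviations.
for sigma in (0.5, 1.5, 2.5, 3.5):
    print(sigma, gaussian_2d(0.0, 0.0, sigma))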
The Gaussian filter is better than the mean or median rank filters as it has improved feature
preservation when compared to the other filters[10].
The full program for this section can be found in the appendix.

Section 2 - Part 2: Region Properties.
Section 2.1 Introduction.
The goal of this part of the assignment was to learn how to isolate an object in an image and
to find information about the object in question. The methods used must be robust and
tolerant to noise.
Section 2.1.1 Part(a) Introduction.
This part involves highlighting and displaying the smallest scissors in an image which
contains other scissors, keys, nuts, and other objects. The area and centroid of this object
must be displayed. The code must be robust, fully automated and data driven. There can be no
manual inputs into the program.
Before identifying the smallest scissors, any objects touching the edge of the image must be
removed from the image.
Section 2.1.2 Part(b) Introduction.
This part will test the robustness of the program developed in part (a). An experiment needed
to be conducted to see whether the smallest scissors could still be found over an appropriate
range of image noise.
Section 2.2 Techniques and Rationale.
Section 2.2.1 Part(a) Techniques and Rationale.
The distinguishing feature of a pair of scissors is that it consists of one object containing two
holes. This information was key to finding the scissors in the image. The Euler number of a
region is the number of objects minus the number of holes, so the Euler number of a pair of
scissors is -1 (1 object − 2 holes). This means the scissors can be found based on their Euler
number.
The smallest scissors can then be found by examining the areas of these candidates: the
smaller area corresponds to the smallest scissors, which will be highlighted in the original image.
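As a sketch of this selection (assuming a labelled image named ‘labeled’, as set up later in Section 2.4.1):

from skimage.measure import regionprops

# 'labeled' is assumed to be the labelled binary image built in Section 2.4.1.
regions = regionprops(labeled)

# A pair of scissors is one connected object containing two holes,
# so its Euler number is 1 - 2 = -1.
scissors = [r for r in regions if r.euler_number == -1]
print(len(scissors), "candidate scissors found")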
To make the program robust, the image is deconvolved after loading using the Wiener-Hunt
approach. After researching noise reduction and image restoration, this technique appeared to
be the most suitable. It works well because it produces a statistical estimate of an unknown
signal by filtering a related, known input signal[13]. Dilation was then performed on the image
to strengthen the objects, and erosion was then used to return the objects to close to their
original size.
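Dilation followed by erosion with the same structuring element is the morphological closing operation, so the two steps could equivalently be written as the following sketch (‘binary_image’ stands in for the thresholded image produced later, in figure 18):

from skimage import morphology

# 'binary_image' is a placeholder for the thresholded binary image of figure 18.
selem = morphology.disk(12)

# Closing = dilation followed by erosion with the same disk: small gaps inside
# the objects are filled while their overall size is approximately preserved.
closed = morphology.binary_closing(binary_image, selem)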
Section 2.2.2 Part(b) Techniques and Rationale.
Gaussian noise was added to test the robustness of the program from part (a). It was added to
the image after it was loaded into the program. The ‘var’ parameter was increased until the
program failed to identify the smallest scissors.

Section 2.3 Pseudo Code.
• Mount local drive[1]
• Load appropriate ‘skimage’ packages
• Load image file(‘scissors.jpg’)
Section 2.3.1 Part(a) Pseudo Code.
• Create functions to display and save images
• Convert image to gray scale using rgb2gray[3]. This allows Otsu thresholding to be
applied to the image to separate the background from foreground.
• The image is then deconvolved using restoration.wiener[14] (Returns the
deconvolved image using a Wiener-Hunt approach. The image is passed through it along
with a ‘psf’ [point spread function] parameter for the assumed impulse response and a
‘balance’ parameter to avoid noise artefacts).
• Otsu thresholding is then performed on the image using threshold_otsu[15] (Returns a
threshold value based on Otsu’s method. Takes a greyscale image as an input and
gives the upper threshold value with pixels with a higher intensity as foreground).
This value is used to create a binary image. The binary image also allows for faster
dilation and erosion.
• Dilation is then used to strengthen the image signal using
morphology.binary_dilation[16] (Returns fast binary dilation. It enlarges bright
regions and shrinks dark regions). It takes the binary image as a parameter and a disk
footprint which was created using morphology.disk[17] (Generates a flat, disk-shaped
structuring element).
• Erosion of the image is done to restore it to its original size using
morphology.binary_erosion[18] (Returns fast binary erosion of the image. It shrinks
bright regions and enlarges dark regions). The size of the erosion was determined
using morphology.disk[17].
• Objects touching the border of the image were removed using clear_border[19]
(Clears objects connected to the label image border) which just takes the binary image
as a parameter.
• Labels were assigned to all the objects using label[20] (Label connected regions by
connecting pixels of the same value). These labels will be used to obtain the
properties of all the objects in the image.
• The properties of each object were placed in a list using regionprops[21] (Obtains
various properties of the labelled image regions). This list can be used to find
information regarding the Euler number and area.
• Sort the list of properties in ascending order by Euler number. This will place the
scissors at the top of the list.
• An algorithm finds the smallest area and places it at the top of the list.
• The smallest scissors is isolated by turning all the other objects black.
• The smallest scissors is highlighted on the original image using label2rgb[22]
(Returns an RGB image where color-coded labels are painted over an image). It takes
the isolated object as a parameter, the original image, and a parameter ‘bg_label’
which was the background label which was set to zero.
• The area and centroid are displayed using the table of label properties and only
printing the smallest scissors information by using regionprops_table[23] (Finds the
region properties and returns them in a pandas-compatible table). It takes the labelled
image as a parameter and the properties that are specified.
Section 2.3.2 Part(b) Pseudo Code.
• Apply Gaussian noise to the image after it has been converted to greyscale using
random_noise[4]. The noise variance was varied by changing the ‘var’ parameter
until the program failed to detect the smallest scissors.

Section 2.4 Code Implementation.


Section 2.4.1 Part(a) Code Implementation.
Loading the image folder was programmed as seen in figure one.

from skimage import data, io, filters, morphology, measure, restoration
from skimage.color import rgb2gray as r2g
from skimage.color import label2rgb as lr2g
from skimage.filters import threshold_otsu as to
from skimage.segmentation import clear_border as cb
from skimage.measure import label, regionprops_table, regionprops
from skimage.filters import gaussian
from skimage.util import random_noise
from skimage.filters.rank import otsu
from skimage.morphology import disk
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
Figure 14. Accessing required functions.

As seen in figure 14, all functions needed were imported from ‘skimage’. Note some function
names have been abbreviated for quicker programming.

def display(name):
    plt.imshow(name, cmap=plt.cm.gray)

def save(title):
    plt.savefig(os.path.join(image_dir, title))
Figure 15. Functions to display and save an image.

As seen in figure 15, the function called ‘display’ takes a greyscale image and displays it using
‘imshow’. The function called ‘save’ saves the current figure to the drive using ‘savefig’,
storing it under the title passed to the function. This avoids repetitive code.

image_file = io.imread(os.path.join(image_dir, 'scissors.jpg'))
grayscale = r2g(image_file)
Figure 16. Loading the image and converting to greyscale.

The image containing all the objects is called ‘scissors.jpg’ and is assigned the variable name
‘image_file’. It is converted to greyscale using ‘r2g’ and assigned the variable name
‘grayscale’.

psf = np.ones((5,5))/25
decon = restoration.wiener(grayscale, psf, balance=300)  # [24]
Figure 17. Deconvolution of the image.

As seen in figure 17, the image is passed through the function along with the ‘psf’ (point
spread function) parameter. The ‘balance’ parameter is set to 300, which should remove a lot
of the noise artefacts that can occur when filtering a noisy image. The restored image was
given the variable name ‘decon’.

otsu = to(decon)
bin = otsu <= decon
Figure 18. Applying Otsu thresholding to create a binary image.

As seen in figure 18, Otsu thresholding is applied to ‘decon’ and the resulting threshold value
is given the variable name ‘otsu’. Any pixels above this value are made foreground and the
binary image is assigned the variable name ‘bin’.

disk = morphology.disk(12)
dilation = morphology.binary_dilation(bin, disk)  # [1]
cdisk = morphology.disk(12)
eroded = morphology.binary_erosion(dilation, cdisk)
clb = cb(eroded)
Figure 19. Strengthening the image by dilation, returning it to its original size using erosion
and clearing the borders.

As seen in figure 19, a ‘disk’ of size 12 is passed to the dilation function along with ‘bin’, and
the result is assigned the variable name ‘dilation’. To reverse this, ‘cdisk’ is given the same
size as ‘disk’ and passed to the erosion function along with ‘dilation’, and the result is given
the variable name ‘eroded’. The borders are cleared by passing ‘eroded’ through ‘cb’ and the
result is assigned the variable name ‘clb’.

labeled = label(clb)
im_regions = regionprops(labeled)  # [1]
im_regions.sort(key=lambda x: x.euler_number, reverse=False)  # [1]
Figure 20. Labelling ‘clb’, getting the properties and sorting the properties by Euler number.

As seen in figure 20, ‘clb’ is passed through ‘label’ and assigned the name ‘labeled’. The
properties of the objects are calculated when ‘labeled’ is passed through ‘regionprops’ and the
resulting list is given the name ‘im_regions’. The list is sorted by Euler number in ascending
order using the Python ‘sort’ method with the ‘euler_number’ attribute as the key.

i = 1
a = im_regions[0].area
while im_regions[i].euler_number == -1:
    if im_regions[i].area < a:
        a = im_regions[i].area
        # Swap the smaller pair of scissors to the top of the list.
        im_regions[0], im_regions[i] = im_regions[i], im_regions[0]
    i += 1
Figure 21. Sorting algorithm to find the scissor with the smallest area.

Figure 21 shows the method used to place the smallest scissors at the top of ‘im_regions’.
First, two variables are created: ‘i’ is used for indexing through ‘im_regions’, and ‘a’ stores
the area of the first object in ‘im_regions’. The ‘while’ loop runs for as long as the Euler
number of the object at index ‘i’ is -1, which ensures that the areas of objects other than
scissors are not taken into account. The ‘if’ statement compares the area of the current element
with the smallest area found so far. If it is smaller, the element is swapped to the top of the list
and its area is stored in ‘a’, so that it can be compared against the remaining, unknown number
of scissors in ‘im_regions’. The variable ‘i’ is increased by one on each iteration to step
through ‘im_regions’.
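Because ‘im_regions’ has been sorted so that the regions with an Euler number of -1 come first, the same result could also be obtained with Python's built-in ‘min’; the following is an alternative sketch rather than the submitted implementation:

# Candidate scissors: all regions whose Euler number is -1.
candidates = [r for r in im_regions if r.euler_number == -1]

# The smallest scissors is simply the candidate with the smallest area.
smallest = min(candidates, key=lambda r: r.area)
print(smallest.area, smallest.centroid)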

lm = labeled
for k in im_regions[1:]:
    lm[k.coords[:, 0], k.coords[:, 1]] = 0
lm[lm != 0] = 1
smallest_scissors = lm  # [1]
Figure 22. Displaying the smallest scissor.

In figure 22, ‘labeled’ is assigned the name ‘lm’, which is used to iterate through all the
objects in ‘labeled’. The ‘for’ loop slices through ‘im_regions’, ignoring the first object. It
takes the coordinates of the objects in the rest of the list and turns them black by setting them
to zero. Any pixel that is not zero is then set to one, leaving a single white object, which is
assigned the variable name ‘smallest_scissors’.

highlighted = lr2g(smallest_scissors, image=image_file, bg_label=0)  # [1]
Figure 23. Highlighting the smallest scissors in the original image.

As seen in figure 23, ‘smallest_scissors’ is passed through ‘lr2g’ along with ‘image_file’. The
parameter ‘bg_label’ is set to zero so that the background is not coloured over the original
image. The image with the highlighted smallest scissors is assigned the variable name
‘highlighted’.

properties = regionprops_table(smallest_scissors, properties=['area', 'centroid'])
list = pd.DataFrame(properties)
print(list)
Figure 24. Finding and displaying the area and centroid of the smallest scissors.

As seen in figure 24, ‘regionprops_table’ is used to display the area and centroid of
‘smallest_scissors’. The object is passed through the function along with the properties
‘area’ and ‘centroid’. This information is assigned to the variable name ‘properties’. A table
is created from ‘properties’ using ‘pd.DataFrame’ and is stored as ‘list’. The information is
displayed on the screen using ‘print’.

Section 2.4.2 Part(b) Code Implementation

noisy = random_noise(grayscale, mode ='gaussian', var=0.5)


Figure 25. Adding Gaussian noise to the image.

As seen in figure 25, Gaussian noise is added to the image. The greyscale version of the input
image is passed through ‘random_noise’ with the ‘mode’ parameter set to ‘gaussian’ to select
the noise type, and the distorted image is given the variable name ‘noisy’. The variable
‘noisy’ then replaces ‘grayscale’ in figure 17. The ‘var’ (variance) parameter started at a value
of 0.5 and was increased in steps of 0.5 until the smallest scissors could no longer be detected.
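Rather than editing the value by hand, the sweep could be scripted; the following sketch assumes the detection steps of Section 2.4.1 have been wrapped in a hypothetical helper function ‘find_smallest_scissors’ that returns the selected region, or None when detection fails:

import numpy as np
from skimage.util import random_noise

# 'find_smallest_scissors' is a hypothetical wrapper around the pipeline of
# Section 2.4.1 (deconvolution, thresholding, morphology, labelling).
for var in np.arange(0.5, 3.0, 0.5):
    noisy = random_noise(grayscale, mode='gaussian', var=var)
    region = find_smallest_scissors(noisy)
    if region is None:
        print("Detection failed at var =", var)
        break
    print("var =", var, "area =", region.area, "centroid =", region.centroid)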

Section 2.5 Testing and Results.
Section 2.5.1 Part(a) Testing and Results.

Figure 26. Displaying the smallest scissor.

As seen in figure 26, the smallest scissors is located in the lower middle part of the image.

Figure 27. The highlighted smallest scissors.

Figure 27 confirms that what was found was actually a pair of scissors and not a noise
artifact. It also confirms the location of the scissors in the image.

Figure 27. Displaying the area and centroid of the smallest scissors.

As seen in figure 27, the area of the smallest scissors is 8555 pixels and its centroid is
(818,593).
Section 2.5.2 Part(b) Testing and Results.

Figure 28. Gaussian noise with a variance of 0.5 applied to the image.

As seen in figure 28, the program was able to detect the smallest scissors with a noise variance
of 0.5. The images from left to right are: noise applied, borders cleared, and the smallest
scissors displayed.

Figure 29. Area and centroid of the smallest scissors with noise applied.
As seen in figure 29, the area and centroid have been altered slightly due to the addition of
the noise and the filtering techniques used.

Figure 30. Gaussian noise with a variance of 2.5 applied to the image.
As seen in figure 30, the program fails to detect the smallest scissors with a variance of 2.5.
The figure has the same format as described for figure 28.

Section 2.6 Discussion and Conclusion.


Section 2.6.1 Part(a) Discussion and Conclusion.
The smallest scissors was displayed and highlighted, and its area and centroid were displayed.
All the functions worked as described in Section 2.3.1 Part(a) Pseudo Code.
Section 2.6.2 Part(b) Discussion and Conclusion.
The program was relatively robust and did not fail until the variance of the Gaussian noise
reached 2.5. Since the values were increased in steps of 0.5, it can be concluded that the
program fails for a variance between 2.0 and 2.5.

In figure 30, noise artefacts are being introduced into the image; these are the small white dots
in the middle image. Such artefacts can be mistaken for objects that are not really there. This
is different from the noise in the noisy image, which obscures features that are there[25].

The program still works to the point where it is able to detect which objects are pairs of
scissors. It did detect the smallest scissors by area, but it was not the ‘right’ pair of scissors,
because the top part of the left handle had become detached from the object, making its area
smaller. That is why the program picked the top pair of scissors as the smallest. This is evident
in figure 30, as the scissors displayed is missing the top part of its left handle.

The reason the image is broken apart is that the noise is so severe that, when the image is
deconvolved using the Wiener method, the estimated output signal is too far from the original,
so parts of the image are not accounted for.

The full program for this section can be found in the appendix.

Section 3 - Part 3: Colour Matching.
Section 3.1 Introduction.
This part of the assignment has only one section. The aim of this section is to examine the
effects of histogram matching by matching the resized colour face image from Section 1.4.1
Part(a) Results, figure 10, to a reference image. After the images have been matched, the
histograms of the input, reference, and matched output must be plotted.
Section 3.2 Pseudo Code.
• Mount local drive[1]
• Load appropriate ‘skimage’ packages
• Load the input image (‘my_face_resize.jpg’). This is in colour because the reference
image is in colour and the histogram matching function will not work unless it has the
same number of channels as the reference image[26].
• Load the reference image(‘reference.jpg’)
• Resize the reference image to match the size of the input image using resize[2]. This
ensured all colours in the reference image are being matched to the input image.
• The histograms are matched using match_histograms [26] (Adjusts an image so its
cumulative histogram matches that of another image. Returns the transformed image)
• Plot the red, green, and blue channels of the reference, input, and matched output
histograms.
Section 3.3 Code Implementation.
Loading the image folder was programmed as seen in figure one.

from skimage import data, io, filters


from skimage import exposure
from skimage.exposure import match_histograms as mh
from skimage.transform import resize
import matplotlib.pyplot as plt
Figure 31. Required functions loaded from ‘skimage’.

As seen in figure 31, the functions have been imported from ‘skimage’. Note that
‘match_histograms’ has been abbreviated to ‘mh’.

resized = io.imread(os.path.join(image_dir, 'my_face_resize.jpg'))


reference = io.imread(os.path.join(image_dir, 'reference.jpg'))
Figure 32. Loading the input and reference images.

As seen in figure 32, the input image is ‘my_face_resize.jpg’ and has been assigned the name
‘resized’. The reference image being loaded is ‘reference.jpg’ and has been given the variable
name ‘reference’.

resizeref = resize(reference, [512,512])
Figure 33. Resizing the reference image.

As seen in figure 33, ‘reference’ has been passed through ‘resize’ and given a pixel size of
[512,512]. It has been reassigned to the name ‘resizeref’. These are the same dimensions
applied to the input image in Section 1.3.1 Part(a), Code Implementation, figure four, of this
report.

matched_histograms = mh(resized, resizeref)


Figure 34. Matching the histograms of the input image and the resized reference image.

As seen in figure 34, ‘resized’ is matched with ‘resizeref’ using ‘mh’ to produce a matched
output image. The matching is applied separately for each channel.

fig, axes = plt.subplots(nrows=3, ncols=3, figsize=(24, 9))

for i, img in enumerate((resizeref, resized, matched_histograms)):
    for c, c_color in enumerate(('red', 'green', 'blue')):
        img_hist, bins = exposure.histogram(img[..., c], source_range='dtype')
        axes[c, i].plot(bins, img_hist / img_hist.max())
        img_cdf, bins = exposure.cumulative_distribution(img[..., c])
        axes[c, i].plot(bins, img_cdf)
        axes[c, 0].set_ylabel(c_color)

axes[0, 0].set_title('Reference')
axes[0, 1].set_title('Colour Input')
axes[0, 2].set_title('Matched Output')  # [27]
Figure 35. Plotting the histograms of the reference, input, and matched output.

The code illustrated in figure 35 uses nested ‘for’ loops to plot the histogram graphs. The outer
loop starts with ‘resizeref’; the inner loop then plots the red, green, and blue histograms for
that image. This process is repeated for ‘resized’ and ‘matched_histograms’. All the red,
green, and blue graphs are plotted on the same respective rows, and the graphs for each image
appear in the same respective column.

Section 3.4 Results and Analysis.

Figure 36. The matched histogram output image.

As seen in figure 36, the colour input has been changed as a result of matching it to the
histogram of the reference image.

Figure 37. Plotted histograms of each channel for the images.

As seen in the ‘Matched Output’ graphs, the histograms of the input image have been matched
to those of the reference image.

Section 3.5 Discussion and Conclusion.
The histograms were matched which is evident in figure 36. This is further evident when the
graphs are examined in figure 37 as the ‘Colour Input’ plots are altered to be closer to the
‘Reference’ plots. The matched output is a lot darker than the input image which can be seen
graphically in the ‘Matched Output’ column in figure 37. The histogram looks at red, blue,
and green as these are the three channels in the images as they are RGB images. The plots
look at the three channels because the ‘match_histograms’ function applies the adjustment
separately for each channel.
The bright light in the background of the input image in figure 36 appears to have an adverse
effect on what the expected matched output would be. The matched output was expected to
have warmer tones in it but instead has what looks like noise and darken features. The
histogram plots from the input image are all nearly identical, indicating there is the same
levels of red, green, and blue in the image. This may be a result of the bright light in the
background. White light consists of red, green, and blue and other secondary colours formed
from mixing these prime colours. The white light may be overpowering the other colour
signals in the image creating a slightly monotoned matched output.
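The near-identical channel levels can be checked numerically; a short sketch using the input image ‘resized’ from Section 3.3:

import numpy as np

# Mean of each channel of the colour input; near-identical values support the
# observation that the red, green and blue histograms are almost the same.
for c, name in enumerate(('red', 'green', 'blue')):
    print(name, float(np.mean(resized[..., c])))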
The full program for this section can be found in the appendix.

REFERENCES
[1] Paul F. Whelan 2021, “Python for Image Processing & Analysis”, Python for Image processing,
Dublin City University. [pdf].
[Accessed: November 01 2021].
[2] resize, Module: transform, scikit-image. [Online]. Available: https://scikit-
image.org/docs/dev/api/skimage.transform.html?highlight=resize#skimage.transform.resize
[Accessed: November 01 2021].
[3] rgb2gray, Module: color, scikit-image. [Online]. Available: https://scikit-
image.org/docs/dev/api/skimage.color.html?highlight=rgb2gray#skimage.color.rgb2gray
[Accessed: November 01 2021].
[4] random_noise, Module: util, scikit-image. [Online]. Available: https://scikit-
image.org/docs/dev/api/skimage.util.html?highlight=random_noise#skimage.util.random_noise
[Accessed: November 01 2021].
[5] mean, Module: filters.rank, scikit-image. [Online]. Available: https://scikit-
image.org/docs/dev/api/skimage.filters.rank.html?highlight=rank%20mean#skimage.filters.rank.mean
[Accessed: November 01 2021].
[6] square, Module: morphology, scikit-image. [Online]. Available: https://scikit-
image.org/docs/dev/api/skimage.morphology.html?highlight=square#skimage.morphology.square
[Accessed: November 01 2021].
[7] median, Module: filters.rank, scikit-image. [Online]. Available: https://scikit-
image.org/docs/dev/api/skimage.filters.rank.html?highlight=median%20filter#skimage.filters.rank.me
dian
[Accessed: November 01 2021].
[8] gaussian, Module: filters, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.filters.html?highlight=gaussian#skimage.filters.gaussian
[Accessed: November 01 2021].
[9] RGB to grayscale, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/auto_examples/color_exposure/plot_rgb_to_gray.html?highlight=grayscale
[Accessed: November 01 2021].
[10]. Paul F. Whelan 2021, “Image Processing & Analysis Notes” Noise in Images, Dublin City
University. [pdf].
[Accessed: November 01 2021].
[11] N. Rajesh Kumar, J. Uday Kumar, A Spatial Mean and Median Filter for Noise Removal
in Digital Images. [Online]. Available: https://www.rroij.com/open-access/a-spatial-
mean-and-median-filter-for-noiseremoval-in-digital-
images.php?aid=43137#:~:text=Median%20Filter%3A%20The%20median%20filter%20is%20norma
lly%20used,filter%20of%20preserving%20useful%20detail%20in%20the%20image.
[Accessed: November 01 2021].

[12] Gaussian Filtering, The University of Auckland, New Zealand, May 25 2010. [PowerPoint]. Available:
https://www.cs.auckland.ac.nz/courses/compsci373s1c/PatricesLectures/Gaussian%20Filtering_1up.p
df
[Accessed: November 01 2021].

[13] Wiener filter, Wikipedia, March 12 2021. [Online]. Available:


https://en.wikipedia.org/wiki/Wiener_filter
[Accessed: November 02 2021].
[14] wiener, Module: restoration, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.restoration.html?highlight=restoration%20wiener#skimag
e.restoration.wiener
[Accessed: November 02 2021].
[15] threshold_otsu, Module: filters, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.filters.html?highlight=threshold_otsu#skimage.filters.thre
shold_otsu
[Accessed: November 02 2021].
[16] binary_dilation, Module: morphology, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.morphology.html?highlight=morphology#skimage.morph
ology.binary_dilation
[Accessed: November 02 2021].
[17] morphology.disk, Module: morphology, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.morphology.html?highlight=morphology%20disk#skimag
e.morphology.disk
[Accessed: November 02 2021].
[18] binary_erosion, Module: morphology, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.morphology.html?highlight=morphology#skimage.morph
ology.binary_erosion
[Accessed: November 02 2021].
[19] clear_border, Module: segmentation, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.segmentation.html?highlight=clear_border#skimage.segm
entation.clear_border
[Accessed: November 02 2021].
[20] label, Module: measure, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.measure.html?highlight=label#skimage.measure.label
[Accessed: November 02 2021].
[21] regionprops, Module: measure, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.measure.html?highlight=regionprops#skimage.measure.re
gionprops

[Accessed: November 02 2021].
[22] label2rgb, Module: color, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.color.html?highlight=label2rgb#skimage.color.label2rgb
[Accessed: November 02 2021].
[23] regionprops_table, Module: measure, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.measure.html?highlight=regionprops_table#skimage.meas
ure.regionprops_table
[Accessed: November 02 2021].
[24] Examples, wiener, Module: restoration, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.restoration.html?highlight=restoration%20wiener#skimag
e.restoration.wiener
[Accessed: November 02 2021].
[25] In image processing, what is the difference or relationship between noise
and artifact?, StackExchange, April 2013. [Online]. Available:
https://dsp.stackexchange.com/questions/8149/in-image-processing-what-is-the-difference-
or-relationship-between-noise-and-ar
[Accessed: November 02 2021].
[26] match_histograms, Module: exposure, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/api/skimage.exposure.html?highlight=match_histogram#skimage.expo
sure.match_histograms
[Accessed: November 02 2021].
[27] Histogram matching, scikit-image. [Online]. Available: https://scikit-
image.org/docs/stable/auto_examples/color_exposure/plot_histogram_matching.html?highlig
ht=plot%20histograms
[Accessed: November 02 2021].

APPENDIX
from google.colab import drive, files
import os

drive.mount('/content/gdrive', force_remount=True)
working_dir = "/content/gdrive/My Drive/images"

# directory of image data


image_dir = os.path.join(working_dir, '/content/gdrive/My Drive/images')
!pwd
%cd '/content/gdrive/My Drive/images'
!ls
#load files
from skimage import data, io, filters
from skimage.transform import resize
from skimage.color import rgb2gray
from skimage.util import random_noise
from skimage.filters.rank import mean, median
from skimage.morphology import square
from skimage.filters import gaussian
import matplotlib.pyplot as plt
#load image,greyscale, greyscale values
image_file= io.imread(os.path.join(image_dir, 'my_face.jpg'))
resized = resize(image_file, [512,512])
grayscale = rgb2gray(resized)
print("Min/Max/Mean:", grayscale.min(), grayscale.max(),
grayscale.mean(),)
#add noise, display result
noisy = random_noise(grayscale, mode = 'gaussian')
plt.imshow(noisy, cmap=plt.cm.gray)
plt.axis('Off')
plt.title("Gaussian noise")
plt.savefig(os.path.join(image_dir, 'gaussian_noise.jpg'))
#applying mean,median rank filters
mean_rank = mean(noisy, square(10))
median_rank = median(noisy, square(10))
#applying Gaussian filtering
gauss1 = gaussian(noisy, sigma=0.5, mode='reflect')
gauss2 = gaussian(noisy, sigma=2.5, mode='reflect')
gauss3 = gaussian(noisy, sigma=4.5, mode='reflect')
#displaying results
fig, ax = plt.subplots(ncols=3, figsize=(24, 9))

ax[0].imshow(gauss1, cmap=plt.cm.gray)
ax[0].set_title("Standard deviation = 0.5")
ax[0].axis("Off")

ax[1].imshow(gauss2, cmap=plt.cm.gray)
ax[1].set_title("Standard deviation = 2.5")
ax[1].axis("Off")

ax[2].imshow(gauss3, cmap=plt.cm.gray)
ax[2].set_title("Standard deviation = 4.5")
ax[2].axis("Off")

plt.tight_layout()
plt.savefig(os.path.join(image_dir, '0.5_2.5_5.5.jpg'))
plt.show()
Figure 38. Program for Part 1: Filters.

from google.colab import drive, files


import os

drive.mount('/content/gdrive', force_remount=True)
working_dir = "/content/gdrive/My Drive/images"

# directory of image data


image_dir = os.path.join(working_dir, '/content/gdrive/My Drive/images')
!pwd
%cd '/content/gdrive/My Drive/images'
!ls
#loading functions
from skimage import data, io, filters, morphology, measure, restoration
from skimage.color import rgb2gray as r2g
from skimage.color import label2rgb as lr2g
from skimage.filters import threshold_otsu as to
from skimage.segmentation import clear_border as cb
from skimage.measure import label, regionprops_table, regionprops
from skimage.filters import gaussian
from skimage.util import random_noise
from skimage.filters.rank import otsu
from skimage.morphology import disk
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
#display and save functions
def display(name):
    plt.imshow(name, cmap=plt.cm.gray)

def save(title):
    plt.savefig(os.path.join(image_dir, title))
#loading image,greyscale,noise filtering
image_file = io.imread(os.path.join(image_dir, 'scissors.jpg'))
grayscale = r2g(image_file)
noisy = random_noise(grayscale, mode ='gaussian', var=2.5)
display(noisy)
save('noisy image_2')
#deconvolution, Otsu thresholding, binary,dilation
psf = np.ones((5,5))/25
decon = restoration.wiener(noisy, psf, balance=300)
otsu = to(decon)
bin = otsu <= decon

disk = morphology.disk(12)
dilation = morphology.binary_dilation(bin, disk)
display(bin)
save('otsu and filtered image')
#eroding,clear border
cdisk = morphology.disk(12)
eroded = morphology.binary_erosion(dilation, cdisk)
clb = cb(eroded)
display(clb)
save('cleared border_2.5')
#finding and displaying smallest scissors
labeled = label(clb)
im_regions = regionprops(labeled)
im_regions.sort(key=lambda x: x.euler_number, reverse=False)

i = 1
a = im_regions[0].area
while im_regions[i].euler_number == -1:
    if im_regions[i].area < a:
        a = im_regions[i].area
        # Swap the smaller pair of scissors to the top of the list.
        im_regions[0], im_regions[i] = im_regions[i], im_regions[0]
    i += 1

lm = labeled
for k in im_regions[1:]:
    lm[k.coords[:,0], k.coords[:,1]] = 0
lm[lm != 0] = 1
smallest_scissor = lm
display(smallest_scissor)
save('smallest scissor_2.5')
#highlighting the smallest scissors
highlighted = lr2g(smallest_scissor, image=image_file, bg_label=0)
display(highlighted)
save('highlighted smallest scissor')
#displaying the area and centroid of the smallest scissors
properties = regionprops_table(smallest_scissor,
properties=['area','centroid'])
list = pd.DataFrame(properties)
print(list)
Figure 39. Program for Part 2: Region Properties.

from google.colab import drive, files


import os

drive.mount('/content/gdrive', force_remount=True)
working_dir = "/content/gdrive/My Drive/images"

# directory of image data


image_dir = os.path.join(working_dir, '/content/gdrive/My Drive/images')
!pwd
%cd '/content/gdrive/My Drive/images'
!pip show scikit-image
#loading functions

from skimage import data, io, filters
from skimage import exposure
from skimage.exposure import match_histograms as mh
from skimage.transform import resize
import matplotlib.pyplot as plt
#loading input and reference images and resizing reference image
resized = io.imread(os.path.join(image_dir, 'my_face_resize.jpg'))
reference = io.imread(os.path.join(image_dir, 'reference.jpg'))

resizeref = resize(reference, [512,512])


#matching the histograms
matched_histograms = mh(resized, resizeref)
#displaying the images
fig, (ax1, ax2, ax3) = plt.subplots(nrows=1, ncols=3,
figsize=(24,9), sharex=True, sharey=True)

for aa in (ax1, ax2, ax3):
    aa.set_axis_off()

ax1.imshow(resizeref)
ax1.set_title('Reference')
ax2.imshow(resized)
ax2.set_title('Colour Input')
ax3.imshow(matched_histograms)
ax3.set_title('Matched Output')

plt.tight_layout()
plt.savefig(os.path.join(image_dir,
'Matched_histogram_picture.jpg'))

plt.show()
#dispalying the plots
fig, axes = plt.subplots(nrows=3, ncols=3, figsize=(24,9))

for i, img in enumerate((resizeref, resized, matched_histograms)):
    for c, c_color in enumerate(('red', 'green', 'blue')):
        img_hist, bins = exposure.histogram(img[..., c], source_range='dtype')
        axes[c, i].plot(bins, img_hist / img_hist.max())
        img_cdf, bins = exposure.cumulative_distribution(img[..., c])
        axes[c, i].plot(bins, img_cdf)
        axes[c, 0].set_ylabel(c_color)

axes[0,0].set_title('Reference')
axes[0,1].set_title('Colour Input')
axes[0,2].set_title('Matched Output')

plt.savefig(os.path.join(image_dir, 'Matched_histogram_graphs.jpg'))
Figure 40. Program for Part 3: Colour Matching.
