
Assignment 1a. and 1b.

Project by Aniruddha Paturkar
ID: 2017H1240115P

Assignment 1a.

Aim: This experiment compares the amount of information in various types of images, and checks whether any information is lost when the format of the original image is changed.

Note: In this experiment, I calculated the entropy of each image using both the direct MATLAB command and the standard entropy formula. Since the two values are identical, I have not listed them separately.

Formula: Entropy is given by

H(X) = -Σi p(xi) * log2 p(xi)

where H(X) is the entropy, i.e. the average information per source symbol,
p(xi) is the probability of occurrence of the symbol xi, and
X is the set of possible output symbols of the source.
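The report uses MATLAB's built-in entropy command, whose code is not shown. As an illustration only, the formula above can be sketched in Python (the function name `entropy_bits` and the toy image are my own, not from the report), assuming an 8-bit greyscale image stored as a NumPy array:

```python
import numpy as np

def entropy_bits(image):
    """Shannon entropy H(X) = -sum p(xi) * log2 p(xi) over the grey levels."""
    # Histogram over the 256 grey levels of an 8-bit image.
    counts = np.bincount(image.ravel(), minlength=256)
    p = counts / counts.sum()
    p = p[p > 0]                       # 0 * log2(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))

# Toy 8-bit "image": half the pixels are 0, half are 255 -> H = 1 bit.
img = np.array([[0, 255], [255, 0]], dtype=np.uint8)
print(entropy_bits(img))  # 1.0
```

A constant image gives 0 bits, since a single symbol with probability 1 carries no information.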

Result: The entropy values obtained for the various types of images are summarized in the table below.

Sr. no. | Image   | Entropy, original .tiff | Entropy, converted .jpg
--------|---------|-------------------------|------------------------
1.      | Natural | 7.6462                  | 7.6461
2.      | Medical | 7.3568                  | 7.3711
3.      | Remote  | 6.6296                  | 6.6258
4.      | Binary  | 1.1776                  | 1.2700

Conclusion: The comparison shows that the histograms (see next page) and entropies of the original .tiff images and the converted .jpg images are almost identical. Hence, there is no significant loss of information when the .tiff images are converted to .jpg format.

Further, the natural image has the highest entropy, followed by the medical image. The remote-sensing image has somewhat lower entropy, while the binary image has the least. Hence, as the amount of information in an image decreases, i.e. as the pixel values concentrate into fewer grey levels, the entropy decreases.
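The stated trend follows directly from the formula: a source with N equally likely grey levels has entropy log2(N), so fewer levels mean lower entropy. A small illustrative check (not taken from the report):

```python
import numpy as np

# Entropy of a uniform source with n equally likely grey levels is log2(n).
for n in (256, 16, 2):
    p = np.full(n, 1.0 / n)           # uniform probabilities
    h = -np.sum(p * np.log2(p))       # H(X) = -sum p * log2 p
    print(n, h)                       # 256 -> 8.0, 16 -> 4.0, 2 -> 1.0
```

Real images are not uniform, which is why the natural image's entropy (7.6462) stays below the 8-bit ceiling of log2(256) = 8.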

The figure on the next page shows the histogram plots for the various types of images.
Assignment 1b.

Aim: This experiment compares a synthetic greyscale image with its binary counterpart obtained by thresholding; the normalized threshold value used was 0.8. We compare the entropies of the two images to see how thresholding affects the average information carried per symbol.
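The thresholding step can be sketched as follows. This is a hypothetical Python reconstruction (the report used MATLAB), and the ramp image below is a stand-in for the synthetic greyscale image, whose actual content is not given in the report:

```python
import numpy as np

def entropy_bits(image):
    """H(X) = -sum p(xi) * log2 p(xi) over the grey levels present."""
    counts = np.bincount(image.ravel(), minlength=256)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def to_binary(image, level=0.8):
    """Binarize an 8-bit image at a normalized threshold in [0, 1]."""
    return np.where(image > level * 255, 255, 0).astype(np.uint8)

# Stand-in synthetic image: a ramp covering all 256 grey levels.
grey = np.arange(256, dtype=np.uint8).reshape(16, 16)
binary = to_binary(grey, level=0.8)

print(entropy_bits(grey))    # 8.0 (uniform over 256 levels)
print(entropy_bits(binary))  # below 1 bit: mostly black pixels
```

With threshold 0.8 most of the ramp falls below 0.8 * 255, so the binary image is dominated by black pixels and its entropy drops well below the 1-bit maximum of a two-symbol source.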

Note: As in Assignment 1a, the entropy values obtained from the direct MATLAB command and from the entropy formula are identical, so they are not listed separately.

Formula: The entropy formula is the same as in Assignment 1a: H(X) = -Σi p(xi) * log2 p(xi).

Result: The entropy values obtained for the two images are summarized in the table below.

Sr. no. | Entropy, original greyscale image | Entropy, binary image (normalized threshold = 0.8)
--------|-----------------------------------|---------------------------------------------------
1.      | 1.5289                            | 0.8742

Conclusion: When the greyscale image is converted to a binary image, its entropy decreases significantly, since the number of grey levels is reduced to only two, viz. black and white.

The figure on the next page shows the histogram plots for the two images.