
LCE/7.5.1/RC 01
TEACHING NOTES
Department: ELECTRONICS & COMMUNICATION ENGINEERING
Unit: I Date:
Topic name: Fundamental Steps in Image Processing
No. of marks allotted by JNTUK:
Books referred: 01. Digital Image Processing by R C Gonzalez and R E Woods
02. www.wikipedia.org
03. www.google.com
Fundamental Steps in Image Processing:
In general, image processing methods can be classified into (i) methods
whose input and output are images, and (ii) methods whose inputs may be
images but whose outputs are attributes extracted from those images.
The fundamental steps in image processing are explained below:

Image acquisition:
The first process is to acquire a digital image. To do this we need an image
sensor and equipment capable of digitizing the signal produced by the
sensor. The sensor could be a TV camera, a line-scan camera, etc.
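As an illustrative sketch (not from the text), acquisition can be simulated in Python with NumPy: a continuous scene function is sampled on a sensor grid and each sample is digitized to an integer gray level. The function name `acquire` is hypothetical.

```python
import numpy as np

def acquire(scene, rows, cols, levels=256):
    """Simulate a digital camera: sample a continuous scene function
    scene(x, y) -> brightness in [0, 1] on a rows x cols grid, then
    digitize each sample to an integer gray level."""
    ys = np.linspace(0.0, 1.0, rows)
    xs = np.linspace(0.0, 1.0, cols)
    samples = np.array([[scene(x, y) for x in xs] for y in ys])
    return np.clip(samples * (levels - 1), 0, levels - 1).astype(np.uint8)

# A horizontal brightness ramp as the "scene": black on the left,
# white on the right.
img = acquire(lambda x, y: x, rows=4, cols=4)
```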

Image Enhancement:
It is a subjective area of image processing used to bring out detail that is
obscured, or to highlight certain features of interest in an image; for
example, increasing the contrast of an image for better visibility.
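A minimal contrast-stretching sketch in NumPy, one common enhancement technique; the helper name `stretch_contrast` is illustrative:

```python
import numpy as np

def stretch_contrast(img):
    """Linearly map the image's gray-level range [min, max] onto the
    full [0, 255] scale, increasing contrast in dull images."""
    lo, hi = int(img.min()), int(img.max())
    if hi == lo:                      # flat image: nothing to stretch
        return img.copy()
    out = (img.astype(np.float64) - lo) * 255.0 / (hi - lo)
    return out.round().astype(np.uint8)

# A low-contrast image occupying only gray levels 100..130.
dull = np.array([[100, 110], [120, 130]], dtype=np.uint8)
vivid = stretch_contrast(dull)
```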

Image Restoration:
It also deals with improving the appearance of an image, but unlike
enhancement it is objective: restoration is based on mathematical or
probabilistic models of image degradation.
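One classic probabilistic restoration method, averaging many noisy observations of the same scene, can be sketched as follows (NumPy assumed; the degradation model here is additive zero-mean Gaussian noise, chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
truth = np.full((8, 8), 120.0)               # the uncorrupted image

# Degradation model: each observation is truth plus zero-mean
# Gaussian noise of standard deviation 20.
frames = [truth + rng.normal(0.0, 20.0, truth.shape) for _ in range(200)]

# Restoration: averaging K noisy frames reduces the noise standard
# deviation by a factor of sqrt(K).
restored = np.mean(frames, axis=0)

err_one   = np.abs(frames[0] - truth).mean()   # error of one frame
err_final = np.abs(restored - truth).mean()    # error after averaging
```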

Faculty/Date: HOD/Date:
Color Image Processing:
To restore the natural characteristics of a scene, it is necessary to
preserve the color information associated with an image; color image
processing provides the techniques for doing so.

Wavelets and Multi-resolution:
This is the foundation for representing images in various degrees of
resolution. It is employed particularly for image data compression and for
pyramidal representation, in which images are subdivided successively into
smaller regions.
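A crude pyramidal representation can be sketched by repeatedly averaging 2x2 blocks, a stand-in for a true wavelet decomposition (NumPy assumed; the function name is illustrative):

```python
import numpy as np

def pyramid(img, n_levels):
    """Build an image pyramid: each level halves the resolution by
    averaging non-overlapping 2x2 blocks."""
    out = [img]
    for _ in range(n_levels - 1):
        a = out[-1]
        h, w = a.shape[0] // 2 * 2, a.shape[1] // 2 * 2   # even crop
        a = a[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        out.append(a)
    return out

base = np.arange(64, dtype=np.float64).reshape(8, 8)
levels = pyramid(base, 3)      # resolutions 8x8, 4x4, 2x2
```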

Compression:
This technique reduces the storage required to save an image, or the
bandwidth required to transmit it, which is most important in Internet
applications.
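A toy lossless scheme, run-length encoding, shows how compression reduces storage for images with long uniform runs (helper names are illustrative):

```python
def rle_encode(pixels):
    """Run-length encode a 1-D list of gray levels as [value, count]
    pairs, so long uniform runs cost only two numbers each."""
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1
        else:
            runs.append([p, 1])
    return runs

def rle_decode(runs):
    """Expand [value, count] pairs back into the original pixel row."""
    return [v for v, n in runs for _ in range(n)]

row = [0, 0, 0, 0, 255, 255, 0, 0]
packed = rle_encode(row)
```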

Morphological Processing:
It deals with tools for extracting image components that are useful in the
representation and description of shape.
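Binary dilation with a 3x3 structuring element is a representative morphological tool; a NumPy sketch (the function name is illustrative):

```python
import numpy as np

def dilate(binary):
    """Binary dilation with a 3x3 square structuring element: a pixel
    becomes 1 if any pixel in its 8-neighborhood (or itself) is 1."""
    padded = np.pad(binary, 1)            # zero border for the edges
    out = np.zeros_like(binary)
    h, w = binary.shape
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= padded[1 + dy : 1 + dy + h, 1 + dx : 1 + dx + w]
    return out

dot = np.zeros((5, 5), dtype=np.uint8)
dot[2, 2] = 1
grown = dilate(dot)     # the single pixel grows into a 3x3 square
```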

Segmentation:
It may be defined as partitioning an input image into its constituent parts or
objects. It is very important to distinguish between different objects in an
image, as in systems employed for traffic control or crowd control.
In character recognition, the key role of segmentation is to extract
individual characters and words from the background.
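Global thresholding is the simplest segmentation method; a NumPy sketch (the threshold value 128 is arbitrary here; in practice it is often chosen automatically, e.g. by Otsu's method):

```python
import numpy as np

def threshold(img, t):
    """Segment an image into object (1) and background (0) pixels by
    comparing every gray level against a single global threshold t."""
    return (img > t).astype(np.uint8)

# A tiny made-up "page": dark background, bright character pixels.
page = np.array([[ 30,  40, 200],
                 [ 35, 210, 220],
                 [ 25,  45, 215]], dtype=np.uint8)
mask = threshold(page, 128)
```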

Representation and Description:
It is a process which transforms raw data into a form suitable for
subsequent computer processing. The first decision is whether to use
boundary representation or regional representation. Boundary representation
is used when the details of external shape characteristics are important,
whereas regional representation is used when internal properties are
important.

Object Recognition:
It is a process that assigns a label to an object based on the information
provided by its descriptors; the recognized object is then interpreted by
assigning a meaning to it.

Knowledge Base:
The function of knowledge base is to guide the operation of each
processing module and control the interaction between them. A feedback
request through the knowledge base to the segmentation stage for another
‘look’ is an example of knowledge utilization in performing image processing
tasks.

Topic name: Gray Level, Sampling & Quantization
Concept of Gray Level:
It is the brightness of a pixel: the value associated with a pixel,
representing its lightness from black to white. It is usually defined as a
value from 0 to 255, with 0 being black and 255 being white.
Concept in Image Sampling and Quantization:
Formation of digital image from a continuous image basically involves two
steps. They are Sampling and Quantization.
An image may be continuous with respect to x and y coordinates and also
in amplitude. Digitizing the coordinate values is called sampling whereas
digitizing the amplitude values is called quantization.
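The two steps can be sketched for a 1-D signal (NumPy assumed; the function name and the example signal are illustrative):

```python
import numpy as np

def digitize(f, n_samples, n_levels):
    """Digitize a continuous 1-D signal f(x), x in [0, 1]:
    sampling picks n_samples equally spaced coordinates, and
    quantization rounds each amplitude to one of n_levels values."""
    xs = np.linspace(0.0, 1.0, n_samples)        # sampling
    amplitudes = np.array([f(x) for x in xs])    # still continuous-valued
    q = np.round(amplitudes * (n_levels - 1))    # quantization
    return q.astype(int)

# Digitize f(x) = x^2 with 5 samples and 8 gray levels.
signal = digitize(lambda x: x * x, n_samples=5, n_levels=8)
```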

Figure-1 shows a continuous image f(x, y) which we want to convert to
digital form. Let us take the gray-level values of the continuous image along
the line segment AB.

The random variations that are seen are due to image noise. This function
is sampled and quantized to get digital output.

Sampling is done by taking equally spaced samples along the line AB. The
location of each sample is indicated by a vertical tick mark in the bottom part
of figure-3.
The set of these discrete locations gives the sampled function. However,
the values of the samples still span a continuous range of gray-level values. In
order to form a digital function, the gray-level values also must be converted into
discrete quantities. The right side of figure-3 shows the gray-level scale
divided into eight discrete levels ranging from black to white. The vertical
tick marks indicate the specific value assigned to each of the eight gray
levels. The continuous gray levels are quantized simply by assigning one of
the eight discrete gray levels to each sample, depending on the vertical
proximity of the sample to a tick mark.
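This nearest-level assignment can be sketched in NumPy (the sample values along AB are made up for illustration; the function name is hypothetical):

```python
import numpy as np

def quantize8(samples):
    """Assign each continuous gray value in [0, 1] to the nearest of
    eight equally spaced discrete levels, returning the level index."""
    levels = np.linspace(0.0, 1.0, 8)                     # 8 gray levels
    idx = np.abs(samples[:, None] - levels[None, :]).argmin(axis=1)
    return idx                                            # indices 0..7

# Three continuous samples taken along the line AB.
line_ab = np.array([0.02, 0.55, 0.93])
quantized = quantize8(line_ab)
```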

Carrying out this procedure line by line, starting from the top of the image,
produces a two-dimensional digital image of quantized samples.
Relationship between Pixels:
A pixel p at coordinates (x, y) has four horizontal and vertical neighbors
whose coordinates are given by (x+1, y), (x-1, y), (x, y+1), (x, y-1). This set
of pixels, called the 4-neighbors of p, is denoted by N4(p). Each of these
pixels is a unit distance from (x, y), and some of the neighbors of p lie
outside the digital image if (x, y) is on the border of the image.
The four diagonal neighbors of p have coordinates (x+1, y+1), (x+1, y-1),
(x-1, y+1), (x-1, y-1) and are denoted by ND(p). These points, together with the
4-neighbors, are called the 8- neighbors of p, denoted by N8(p). As before, some
of the points in ND(p) and N8(p) fall outside the image if (x, y) is on the border of
the image.
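These neighborhood definitions translate directly into code; a small Python sketch (the helper names are illustrative):

```python
def n4(x, y):
    """4-neighbors of pixel p at (x, y)."""
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(x, y):
    """Diagonal neighbors of p."""
    return {(x + 1, y + 1), (x + 1, y - 1),
            (x - 1, y + 1), (x - 1, y - 1)}

def n8(x, y):
    """8-neighbors: the union of N4(p) and ND(p)."""
    return n4(x, y) | nd(x, y)

def inside(neighbors, w, h):
    """Keep only neighbors that fall inside a w x h image; border
    pixels have fewer valid neighbors."""
    return {(x, y) for x, y in neighbors if 0 <= x < w and 0 <= y < h}
```

For an interior pixel all 8 neighbors are valid, while a corner pixel such as (0, 0) keeps only 3 of them.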
Imaging Geometry:
The basic transformations in imaging geometry are scaling, translation and
rotation. All transformations here are expressed in a 3-D coordinate system
(x, y, z).
1. Translation:
The point (x, y, z) is translated to a new location by a displacement
(x0, y0, z0):
x* = x0 + x
y* = y0 + y
z* = z0 + z

Therefore, x*, y* and z* are the coordinates of the new location. In
homogeneous coordinates, the above equations can be written in matrix
form as:

    [x*]   [1 0 0 x0] [x]
    [y*] = [0 1 0 y0] [y]
    [z*]   [0 0 1 z0] [z]
    [1 ]   [0 0 0 1 ] [1]
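A NumPy sketch of translation in homogeneous coordinates (the function name is illustrative):

```python
import numpy as np

def translation_matrix(x0, y0, z0):
    """4x4 homogeneous translation matrix: T @ [x, y, z, 1] yields
    [x + x0, y + y0, z + z0, 1]."""
    T = np.eye(4)
    T[:3, 3] = [x0, y0, z0]
    return T

p = np.array([2.0, 3.0, 4.0, 1.0])      # point in homogeneous form
p_star = translation_matrix(10, 20, 30) @ p
```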

2. Scaling:
The scaling factors along the x, y and z axes are denoted by Sx, Sy and Sz.
The scaling transformation matrix is:

    S = [Sx 0  0  0]
        [0  Sy 0  0]
        [0  0  Sz 0]
        [0  0  0  1]

3. Rotation:
Rotation about an arbitrary point requires three steps: the arbitrary point is
translated to the origin, the rotation is performed, and the point is translated
back to its original position.
Rotation of a point about the x-axis by an angle α is denoted by Rα:

    Rα = [1   0     0    0]
         [0  cosα  sinα  0]
         [0 -sinα  cosα  0]
         [0   0     0    1]

Similar transformation matrices Rβ and Rθ perform rotations about the y-
and z-axes by angles β and θ respectively.
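The three-step rotation about an arbitrary point can be sketched in NumPy, using the sign convention written above (the example and function names are illustrative: a point is rotated 90 degrees about an axis parallel to x passing through (1, 1, 0)):

```python
import numpy as np

def rotation_x(alpha):
    """4x4 homogeneous rotation about the x-axis by angle alpha;
    the x coordinate is unchanged."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[1.0, 0.0, 0.0, 0.0],
                     [0.0,   c,   s, 0.0],
                     [0.0,  -s,   c, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

def translation(x0, y0, z0):
    """4x4 homogeneous translation matrix."""
    T = np.eye(4)
    T[:3, 3] = [x0, y0, z0]
    return T

# Step 1: translate the pivot (1, 1, 0) to the origin.
# Step 2: rotate by 90 degrees about the x-axis.
# Step 3: translate back to the original position.
pivot = (1.0, 1.0, 0.0)
R = (translation(*pivot)
     @ rotation_x(np.pi / 2)
     @ translation(-pivot[0], -pivot[1], -pivot[2]))

p = np.array([1.0, 2.0, 0.0, 1.0])      # the point to rotate
p_star = R @ p
```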
