
DIGITAL IMAGE PROCESSING

Chapter 1

What is Digital Image


An image is a two-dimensional function f(x, y), where
x and y are the spatial (plane) coordinates, and
the amplitude of f is the intensity or gray level of the image at that point.
When x, y, and the amplitude values of f are all finite, discrete quantities, the image is called a digital image.
A digital image consists of a finite number of elements, each of which has a particular location
and value. These elements are called pixels.
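The definition above can be made concrete with a toy example. A minimal sketch (the pixel values below are invented for illustration):

```python
# A tiny grayscale "digital image": a 2D grid of finite, discrete intensity values.
image = [
    [0,   50, 100],
    [150, 200, 255],
]

height = len(image)     # number of rows (the y direction)
width = len(image[0])   # number of columns (the x direction)

# f(x, y): the intensity (gray level) at column x, row y
def f(x, y):
    return image[y][x]

print(height, width)  # 2 3
print(f(2, 1))        # 255
```

Each entry of the grid is one pixel: a location (x, y) together with an intensity value.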

Image processing

Two broad areas of application:

1) Improvement of pictorial information for human interpretation

2) Processing of image data for storage, transmission, and representation.

In the first case the output is an improved image; in the second case the output is data suitable for
transmission and analysis.

There are three levels of image processing:

Low level: primitive processes such as noise reduction, contrast enhancement, and image
sharpening. In low-level image processing, both the inputs and outputs are images.

Mid level: tasks such as segmentation, description of objects to reduce them to a form
suitable for computer processing, and classification of individual objects. In mid-level processing,
inputs are images but outputs are attributes extracted from those images (e.g., edges, contours, and
the identity of individual objects).

High level: inputs are attributes and the output is an understanding of the image obtained by analysis.

Thus image processing encompasses processes whose inputs and outputs are images, processes
that extract attributes from images, and the recognition of objects.

Example of low-, mid-, and high-level image processing: automated analysis of text

Low-level processing: preprocessing of the image

Mid-level processing: segmenting and extracting individual characters

High-level processing: recognizing the individual characters by computer

How is image processing done?

Energy is radiated from a source toward the object whose image is required. The
radiation either passes through or is reflected from the object, and is then captured by a detector.
The detected signal is processed and data is extracted from the resulting image.

Energy sources used for image processing:

Electromagnetic spectrum: gamma rays, X-rays, ultraviolet, visible, infrared, microwaves, radio waves

Other sources: acoustic, ultrasonic, and electronic (electron beams)

Diagram: EM spectrum

Gamma-ray imaging: used in nuclear medicine and astronomical observations.

Approach:

Inject the patient with a radioactive isotope that emits gamma rays as it decays.

Images are produced from the emissions collected by gamma-ray detectors.

Example: PET (positron emission tomography). The isotope emits positrons; when a positron meets an
electron, the two annihilate and two gamma rays are given off. These are detected and an image is
created using tomographic principles.

X-ray imaging: one of the oldest sources of energy used for imaging.

The cathode is heated and emits electrons, which strike a nucleus in the anode; energy is released in
the form of X-rays. The penetrating power of the X-rays is controlled by the voltage applied across
the anode, and the number of X-rays by the current applied to the filament in the cathode. An example
is medical imaging, such as chest X-rays; another example is angiography.

Imaging in the ultraviolet band

Applications of ultraviolet imaging: lithography, industrial inspection, microscopy, lasers, biological
imaging, and astronomical observations.

Fluorescence: ultraviolet light itself is not visible. When it hits a fluorescent material, electrons
are excited to a higher energy level; on returning to the lower level, they emit energy in the form of
visible light.

Visible and infrared bands

Imaging in microwave band


The dominant application is radar. The unique feature of radar is its ability to collect data over
virtually any region at any time, regardless of weather and lighting conditions. In some cases radar
is the only way to explore inaccessible regions of the Earth's surface.

Working: the radar illuminates the object with its own radiation, receives the returned signal with an
antenna, and then performs computer processing.

Imaging in radio band

Used in medicine and astronomy. In medicine the radio band is used in MRI (magnetic resonance imaging).

MRI: the patient is placed in a strong magnetic field, and radio waves are passed through the body.
The resulting signals are detected by an array of detectors and processed by computer into an image.

Other sources

Ultrasound: used routinely in manufacturing, and in the medical field, for example to determine the
health of an unborn baby.

Fundamental steps in Digital Image Processing

Image acquisition:

It is the first process. Image acquisition is the creation of an image, such as of a
physical scene or of the interior structure of an object. Acquisition could be as simple as being
given an image that is already in digital form; it generally also includes preprocessing such as scaling.

Image enhancement:

The goal is to bring out detail that is obscured, or simply to highlight certain features of interest
in an image. For example, increasing the contrast of an image because "it looks better". Enhancement
is a subjective area of image processing.

Image restoration

Restoration also deals with improving the appearance of an image, but unlike enhancement it is
objective rather than subjective: it removes degradation using techniques based on mathematical and
probabilistic models of image degradation.

Color image processing

It involves the acquisition of color images using color sensors; another part of the field is
assigning colors to objects of different intensities (pseudocolor processing).

Wavelet transforms:
Wavelets are small waves of varying frequency and limited duration. They are used to represent
images at various degrees of resolution (multiresolution): features that are not visible at one
resolution can become apparent at another. For this purpose, images are subdivided into successively
smaller regions.
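The idea of a coarse view plus detail can be sketched with one level of the Haar wavelet transform on a 1-D signal. The Haar transform is a standard introductory example, and the signal values below are invented:

```python
def haar_1level(signal):
    """One level of the Haar wavelet transform on a 1-D signal of even length.
    The averages give a half-resolution view of the signal; the details
    record exactly what that coarser view loses. Illustrative sketch only."""
    averages = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    details = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return averages, details

avg, det = haar_1level([9, 7, 3, 5])
print(avg, det)  # [8.0, 4.0] [1.0, -1.0]
```

Repeating the step on the averages yields ever-coarser resolutions; keeping the details at each level allows perfect reconstruction, which is the essence of multiresolution representation.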

Image compression

Techniques that reduce the storage required to save an image or the bandwidth required to
transmit it. There are two families of methods: lossless and lossy. In a lossless method no data is
lost; in a lossy method some data is permanently discarded.
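The lossless case can be illustrated with run-length encoding, one of the simplest lossless schemes (chosen here as an example; the notes do not name a specific method):

```python
def rle_encode(pixels):
    """Run-length encoding: store each run of equal pixels as (value, count)."""
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([p, 1])       # start a new run
    return [tuple(r) for r in runs]

def rle_decode(runs):
    """Expand (value, count) pairs back into the original pixel sequence."""
    return [v for v, n in runs for _ in range(n)]

data = [0, 0, 0, 255, 255, 0]
enc = rle_encode(data)
print(enc)                      # [(0, 3), (255, 2), (0, 1)]
assert rle_decode(enc) == data  # lossless: the original is recovered exactly
```

Decoding recovers the input bit for bit, which is what "no data is lost" means; a lossy scheme would instead trade some fidelity for a smaller representation.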

Morphological processing

It deals with extracting image components that are useful in the representation and description of
region shape. The input is an image and the output consists of attributes extracted from it.
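As one concrete morphological tool, here is a sketch of binary erosion with a square structuring element. Erosion is a standard morphological operation; the notes do not single out any particular one, and the input image below is invented:

```python
def erode(image, size=3):
    """Binary erosion with a size x size square structuring element:
    an output pixel is 1 only if every pixel in its neighborhood is 1.
    Erosion shrinks foreground regions and removes thin protrusions."""
    h, w = len(image), len(image[0])
    k = size // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = 1 if all(
                0 <= y + dy < h and 0 <= x + dx < w and image[y + dy][x + dx] == 1
                for dy in range(-k, k + 1) for dx in range(-k, k + 1)
            ) else 0
    return out

print(erode([[1, 1, 1], [1, 1, 1], [1, 1, 1]]))  # [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
```

Only the center pixel survives, because every border pixel has part of its neighborhood outside the image; this shrinking behavior is what makes erosion useful for describing region shape.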

Image segmentation

Partitioning of an image into its constituent parts or objects. The level to which the subdivision is
carried depends on the problem being solved; segmentation should stop once the objects of interest
have been isolated.
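The simplest segmentation approach, global thresholding, can be sketched as follows (the threshold value and pixel data are invented for illustration):

```python
def threshold_segment(image, t):
    """Partition pixels into object (1) and background (0) by comparing
    each intensity against a single global threshold t: the simplest
    form of segmentation. Illustrative sketch only."""
    return [[1 if p > t else 0 for p in row] for row in image]

img = [[10, 200], [180, 30]]
print(threshold_segment(img, 128))  # [[0, 1], [1, 0]]
```

Here the bright pixels are labeled as object and the dark ones as background; real segmentation problems typically need more sophisticated criteria, but the input/output relationship is the same.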

Representation and description

It follows the output of the segmentation stage, which is raw pixel data constituting either the
boundary of a region or all the points inside the region, depending on the requirements of the
application.

Recognition

It is the process that assigns a label to an object based on its descriptors; this is the domain of
pattern recognition.

Knowledge base

It is the prior knowledge available about a problem domain. It may be as simple as detailing
regions of an image where the information of interest is known to be located. There is continuous
interaction between the processing modules and the knowledge base; this is why, in the usual block
diagram of these steps, the arrows connecting them are double-sided.

Elements of visual perception

The eye is nearly a sphere, with an average diameter of approximately 20 mm. It is enclosed by three
membranes: the cornea, the sclera, and the choroid.

A simple image formation model

The value of f(x, y) is a positive scalar quantity; it must be nonzero and finite, that is,

0 < f(x, y) < ∞

f(x, y) is characterized by two components: the illumination incident on the scene, i(x, y), and the
reflectance of the objects in the scene, r(x, y), combined as the product f(x, y) = i(x, y) r(x, y).
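A tiny numerical sketch of the illumination-reflectance model. The bounds checked below follow the standard formulation, in which illumination is positive and finite and reflectance lies strictly between 0 (total absorption) and 1 (total reflection); those bounds are an assumption beyond what these notes state:

```python
def formed_intensity(i_xy, r_xy):
    """f(x,y) = i(x,y) * r(x,y): intensity as illumination times reflectance.
    Assumes 0 < i(x,y) < inf and 0 < r(x,y) < 1 (standard model bounds)."""
    assert 0 < i_xy and 0 < r_xy < 1
    return i_xy * r_xy

print(formed_intensity(100.0, 0.5))  # 50.0
```

Since i is positive and finite and r is a fraction between 0 and 1, the product f is automatically nonzero and finite, matching the constraint 0 < f(x, y) < ∞ above.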