
Welcome to the Wonderful World

Of Digital Image Processing

Digital Image Processing:


Introduction
S.Deepa

Introduction to Digital Image Processing- Dr.S.Deepa, Mrs.B.Sathyabhama,Mrs.V.Subashree PEC


EC8093 – Digital Image Processing - Syllabus

OBJECTIVES:
• To become familiar with digital image fundamentals
• To get exposed to simple image enhancement techniques in Spatial and
Frequency domain.
• To learn concepts of degradation function and restoration techniques.
• To study the image segmentation and representation techniques.
• To become familiar with image compression and recognition methods



EC8093 – Digital Image Processing-Syllabus
UNIT I DIGITAL IMAGE FUNDAMENTALS
Steps in Digital Image Processing – Components – Elements of Visual Perception – Image
Sensing and Acquisition – Image Sampling and Quantization – Relationships between pixels -
Color image fundamentals – RGB, HSI models, Two-dimensional mathematical preliminaries, 2D
transforms – DFT, DCT.
UNIT II IMAGE ENHANCEMENT
Spatial Domain: Gray level transformations – Histogram processing – Basics of Spatial Filtering–
Smoothing and Sharpening Spatial Filtering, Frequency Domain: Introduction to Fourier
Transform– Smoothing and Sharpening frequency domain filters – Ideal, Butterworth and
Gaussian filters, Homomorphic filtering, Color image enhancement
UNIT III IMAGE RESTORATION
Image Restoration – degradation model, Properties, Noise models – Mean Filters – Order
Statistics – Adaptive filters – Band reject Filters – Band pass Filters – Notch Filters – Optimum
Notch Filtering – Inverse Filtering – Wiener filtering
UNIT IV IMAGE SEGMENTATION
Edge detection, Edge linking via Hough transform – Thresholding – Region based segmentation –
Region growing – Region splitting and merging – Morphological processing- erosion and dilation,
Segmentation by morphological watersheds – basic concepts – Dam construction – Watershed
segmentation algorithm.
UNIT V IMAGE COMPRESSION AND RECOGNITION
Need for data compression, Huffman, Run Length Encoding, Shift codes, Arithmetic coding, JPEG standard,
MPEG. Boundary representation, Boundary description, Fourier Descriptor, Regional Descriptors – Topological
feature, Texture – Patterns and Pattern classes – Recognition based on matching.
EC8093 – Digital Image Processing - Syllabus

 TEXT BOOKS:
• 1. Rafael C. Gonzalez, Richard E. Woods, 'Digital Image Processing', Pearson, Third Edition, 2010.
• 2. Anil K. Jain, 'Fundamentals of Digital Image Processing', Pearson, 2002.
 REFERENCES:
• 1. Kenneth R. Castleman, 'Digital Image Processing', Pearson, 2006.
• 2. Rafael C. Gonzalez, Richard E. Woods, Steven Eddins, 'Digital Image Processing using MATLAB', Pearson Education, Inc., 2011.
• 3. D. E. Dudgeon and R. M. Mersereau, 'Multidimensional Digital Signal Processing', Prentice Hall Professional Technical Reference, 1990.
• 4. William K. Pratt, 'Digital Image Processing', John Wiley, New York, 2002.
• 5. Milan Sonka et al., 'Image Processing, Analysis and Machine Vision', Brooks/Cole, Vikas Publishing House, 2nd edition, 1999.



Introduction
“One picture is worth more than ten thousand words”



Contents
This lecture will cover:
 What is a digital image?
 What is digital image processing?
 History of digital image processing
 Examples of digital image processing
 Key stages in digital image processing



Images - Examples



What is a Digital Image?
An image is said to be digital if it is in a computer-readable format.
Definition: An image f(x,y) is a digital image if its spatial coordinates (x,y) and its amplitude values are all finite, discrete quantities.
A digital image is a representation of a two-dimensional image as a finite set of digital values, called picture elements or pixels.
A pixel is the smallest sample of an image. Each pixel has a specific location and a value.
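The definition above can be made concrete with a small Python sketch (using NumPy; the pixel values below are made up for illustration): a digital image is just a finite 2-D grid of discrete values, and each pixel is addressed by its row and column.

```python
import numpy as np

# A hypothetical 4x4 grayscale image with 8-bit pixel values (0-255).
# Both the coordinates and the amplitudes are finite, discrete quantities.
image = np.array([
    [ 12,  50, 200, 255],
    [  0,  90, 180, 240],
    [ 30, 120, 160, 220],
    [ 60, 100, 140, 200],
], dtype=np.uint8)

rows, cols = image.shape      # finite spatial extent
print(rows, cols)             # 4 4
print(image[1, 2])            # pixel at row 1, column 2 -> 180
```

Each pixel thus has a specific location (its indices) and a value (its stored intensity).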





What is a Digital Image? (cont…)

Pixel values typically represent the gray level, i.e., the intensity or brightness of the pixel.
Remember that digitization implies a digital image is an approximation of a real scene.

1 pixel


Types of Digital Images
Three types:
Binary Image - each pixel value is 0 or 1
Gray Scale Image - each pixel value ranges from 0 to 255
Color Image - each pixel has three values, one each for the R, G and B components

Binary Image Grayscale Image Color Image
What is Digital Image Processing?

Digital Image Processing refers to processing digital images by means of a digital computer.

Input Image → Digital Computer → Output Image

Digital image processing focuses on two major tasks
 Improvement of pictorial information for human interpretation
 Processing of image data for storage, transmission and machine perception


Types of Images
Binary Image Grayscale Image Color Image

A binary image is a matrix of 0s and 1s, e.g.

0 1 0 1
0 0 1 1
1 0 1 1
0 0 0 1

A grayscale image is a matrix of intensities from 0 to 255, e.g.

200 245  56  55
 60  90 100  67
 85 150  43  92
 32  65  87  99

A color image stores three such matrices, one each for the R, G and B components.
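The three image types above can be sketched in Python. The pixel values here are hypothetical, and the luminance weights (0.299, 0.587, 0.114) are one common convention for converting a color image to grayscale; thresholding the grayscale result then yields a binary image.

```python
import numpy as np

# Hypothetical 2x2 color image: each pixel holds three values (R, G, B).
color = np.array([
    [[255, 0, 0], [0, 255, 0]],
    [[0, 0, 255], [200, 200, 200]],
], dtype=np.uint8)

# Grayscale: one value per pixel, via a common luminance weighting.
gray = np.rint(0.299 * color[..., 0]
               + 0.587 * color[..., 1]
               + 0.114 * color[..., 2]).astype(np.uint8)

# Binary: threshold the grayscale values down to 0 and 1.
binary = (gray > 127).astype(np.uint8)
print(gray)
print(binary)    # [[0 1] [0 1]]
```

Note how each step discards information: three channels collapse to one, then 256 levels collapse to two.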


Image Processing Algorithms - Types
Low Level Processing
Input: Image; Output: Image
Examples: Noise removal, Image sharpening, Image restoration
Mid Level Processing
Input: Image; Output: Attributes
Examples: Object recognition, Segmentation, Character recognition
High Level Processing
Input: Attributes; Output: Understanding
Examples: Scene understanding, Autonomous navigation
Origin
of
Digital Image Processing



The first photograph in the world



History of Digital Image Processing

Early digital image


Early 1920s: One of the first applications of digital imaging was in the newspaper industry
 The Bartlane cable picture transmission service
 Images were transferred by submarine cable between London and
New York
 Pictures were coded for cable transfer and reconstructed at the
receiving end on a telegraph printer



History of DIP (contd…)
Mid to late 1920s: Improvements to the Bartlane system resulted in higher quality images
 New reproduction processes based on photographic techniques
 Increased number of tones in reproduced images

Early 15-tone digital image Improved digital image


History of DIP (contd…)
1960s: Improvements in computing technology and the onset
of the space race led to a surge of work in digital image
processing
 1964: Computers used to
improve the quality of
images of the moon taken
by the Ranger 7 probe
 Such techniques were used
in other space missions
including the Apollo landings
A picture of the moon taken
by the Ranger 7 probe
minutes before landing



History of DIP (cont…)
1970s: Digital image processing begins to be used in medical
applications
 1979: Sir Godfrey N. Hounsfield & Prof. Allan M. Cormack share the Nobel Prize in Medicine for the development of computer-assisted tomography, the technology behind Computerised Axial Tomography (CAT) scans

Typical head slice CAT image



History of DIP (cont…)
1980s - Today: The use of digital image processing techniques
has exploded and they are now used for all kinds of tasks in all
kinds of areas
 Image enhancement/restoration
 Artistic effects
 Medical visualisation
 Industrial inspection
 Law enforcement
 Human computer interfaces
 Remote sensing
 Robotics, etc.



The EM Spectrum & Image Processing

Band Applications
Gamma Rays Nuclear medicine & astronomy
X-rays Medical diagnostics, industry
UV band Lithography, industrial inspection, lasers, biological imaging
Visible & IR band Light microscopy, astronomy, remote sensing, industry
Microwave RADAR
Radio waves MRI & astronomy
1.Gamma Rays: PET & Astronomy

Cygnus Loop in the


constellation of Cygnus
Positron Emission Tomography



2.X Rays: Medicine



3. UV Band - Fluorescence Microscopy



4. Visible Band: Photography



4.Visible: Motion picture



4.Visible: Light Microscopy



4.Visible: Remote Sensing



4.IR Images



5. RADAR Images : Microwave Frequency



6.Radio Waves - MRI and Astronomy



Other non EM imaging Modalities
 Acoustic imaging
 Translate “sound waves” into image signals
 Electron microscopy
 Shine a beam of electrons through a specimen
 Synthetic images in Computer Graphics
 Computer generated (non-existent in the real world)



1.Acoustic Imaging

Acoustic Image Ultrasound Scan



2.Electron Microscopy



3.Synthetic image using Computer Graphics



Steps
in
Digital Image Processing



Fundamental steps in Digital Image Processing

Problem Domain → Image Acquisition → Image Enhancement → Image Restoration → Color Image Processing → Wavelets & Multiresolution Processing → Compression → Morphological Processing → Segmentation → Representation & Description → Object Recognition

All stages draw on a common Knowledge Base.


1. Image Acquisition

(Block diagram of the fundamental steps, with the Image Acquisition stage highlighted)


2. Image Enhancement

(Block diagram of the fundamental steps, with the Image Enhancement stage highlighted)
3. Image Restoration

(Block diagram of the fundamental steps, with the Image Restoration stage highlighted)
4. Morphological Processing

(Block diagram of the fundamental steps, with the Morphological Processing stage highlighted)
5. Image Segmentation

(Block diagram of the fundamental steps, with the Segmentation stage highlighted)
6. Object Recognition

(Block diagram of the fundamental steps, with the Object Recognition stage highlighted)
7. Representation & Description

(Block diagram of the fundamental steps, with the Representation & Description stage highlighted)
8. Image Compression

(Block diagram of the fundamental steps, with the Compression stage highlighted)
9. Colour Image Processing

(Block diagram of the fundamental steps, with the Colour Image Processing stage highlighted)


Examples of
Digital Image Processing



Examples: Image Enhancement
One of the most common uses of DIP techniques: improve
quality, remove noise etc



Examples: The Hubble Telescope [Image Restoration]
Launched in 1990, the Hubble telescope can take images of very distant objects.
However, a flaw in its mirror made many of Hubble's images useless.
Image processing techniques were used to fix this.
Examples: Artistic Effects
Artistic effects are used to make
images more visually appealing,
to add special effects and to
make composite images



Examples: GIS
Geographic Information Systems
 Digital image processing techniques are used extensively to
manipulate satellite imagery
 Terrain classification
 Meteorology



Examples: Medicine [Image Segmentation]
Take a slice from an MRI scan of a canine heart and find the boundaries between types of tissue
 Image with gray levels representing tissue density
 Use a suitable filter to highlight edges

Original MRI Image of a Dog Heart Edge Detection Image



Examples: GIS (cont…)

Night-Time Lights of the World data set
 Global inventory of human settlement
 Not hard to imagine the kind of analysis that might be done using this data
Examples: Industrial Inspection
Human operators are expensive, slow and unreliable
Make machines do the job instead
Industrial vision systems are used in all kinds of industries
Can we trust them?

Figure captions: Circuit Board Controller, Packaged Pills, Bottles, Air Bubbles in a clear plastic product, Cereal, Image of Intraocular Implant
Examples: PCB Inspection
Printed Circuit Board (PCB) inspection
 Machine inspection is used to determine that all components are
present and that all solder joints are acceptable
 Both conventional imaging and x-ray imaging are used



Examples: Law Enforcement
Image processing techniques are
used extensively by law enforcers
 Number plate recognition for
speed cameras/automated toll
systems
 Fingerprint recognition
 Enhancement of CCTV images



Examples: HCI
Try to make human computer interfaces
more natural
 Face recognition
 Gesture recognition
 These tasks can be extremely difficult without an image processing system



Summary
We have looked at:
 What is a digital image?
 What is digital image processing?
 History of digital image processing
 State of the art examples of digital image processing
 Key stages in digital image processing



Unit 1
Digital Image Fundamentals

PART 2

Dr.S.Deepa
Mrs.V.Subashree
Mrs.B.Sathyabhama
Panimalar Engineering College
1.4
Components of an Image Processing System
1. Image Sensors
Image sensors are the physical devices used to acquire digital images.
Two elements are required to acquire digital images:
The Sensor
The Digitiser
THE SENSOR
The physical device that is sensitive to the energy radiated by the object we wish to image.

THE DIGITISER
A device for converting the output of the sensor into digital form.
2. Specialized Image Processing Hardware

Specialized image processing hardware is called the Front End Subsystem.

Its most important characteristic is its speed.

Consists of a digitizer and an ALU.

Used to average images.

This unit performs functions that require fast data throughputs that the main computer cannot handle.
3. Computer

A general-purpose computer, which can range from a PC to a supercomputer.
4. Image Processing Software

Software for image processing consists of specialized modules that perform specific tasks.
A well-designed software package provides specialized modules which can be used to perform specific image processing operations.

Example: MATLAB, Python, C, C++


5. Mass Storage
When dealing with thousands of images, providing adequate storage in an image processing system is a big challenge.

Types:
1. Short Term Storage
2. Online Storage
3. Archival Storage
Short term storage devices are used to access images at the time of processing. Example: Computer memory, frame buffers

Frame buffers are specialised boards that can store one or more images that can be accessed rapidly at video rates (30 images/sec).
Frame buffers allow image zoom, image scroll and image pan.
Online Storage: To store images that require frequent and fast recall.
5. Mass Storage
Archival Storage Used for massive storage requirements which are
accessed rarely.
Example: Magnetic tapes, optical disks housed in 'Jukeboxes'
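A rough back-of-the-envelope sketch (assuming uncompressed storage; real systems use compression) shows why mass storage is a challenge:

```python
# Uncompressed image size: pixels x bits per pixel, converted to bytes.
def image_size_bytes(rows, cols, bits_per_pixel):
    return rows * cols * bits_per_pixel // 8

# One 1024x1024, 8-bit grayscale image is 1 MiB.
print(image_size_bytes(1024, 1024, 8))          # 1048576

# A thousand 1024x1024, 24-bit color images: about 3 GB.
print(1000 * image_size_bytes(1024, 1024, 24))  # 3145728000
```

At these sizes, a collection of a few thousand uncompressed images already demands online or archival storage rather than main memory.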

Archival Storage Juke Box


6. Image Displays
Physical devices used to display the final output of
the image processing system.
Example:
Computer Monitors
Television Monitors
Stereo displays

Stereo displays are special display devices that are implemented


in the form of headgear containing two small displays
embedded in goggles

Stereo Displays
7. Hard Copy

Devices used for recording images.

Example:
Printers
Film Cameras
Heat Sensitive Devices
Inkjet Devices
Digital Units
Optical and CDROM disks
8. Networking
Used to share the large amounts of data involved in image processing applications.

In dedicated networks, transmission bandwidth is not a problem, but for remote communication via the Internet, transmission bandwidth is the key consideration.

The evolution of optical fiber and other broadband technologies has greatly influenced the transmission process.
1.5
Elements
of
Visual Perception
Elements of Visual Perception
Human intuition and analysis play a central role in the
choice of digital image processing techniques for a
specific application.

The choice of image processing algorithms is made based on


subjective visual judgments.

Hence understanding the Human Visual System (HVS) and human visual perception is important for a better knowledge of image processing concepts.
Crucial Factors
The following factors play a crucial role in
understanding the image processing concepts

• Image formation in the eye and human perception

• Differences between human and electronic imaging in terms of resolution

• Ability of the imaging system to adapt to changes in illumination
Structure of Human Eye
Structure of Human Eye (Contd…)
The eye is nearly a sphere with an average diameter of approximately 20 mm.

Three membranes enclose the eye:
The cornea and sclera (outer cover)
The choroid
The retina

Iris Pupil Sclera

Cornea
1. Cornea and Sclera
The cornea is a tough, transparent tissue that covers the anterior surface of the eye.

The sclera is continuous with the cornea; it is an opaque membrane that encloses the remainder of the optic globe.
2. Choroid
Lies directly below the sclera

This membrane contains a network of blood vessels


that serve as the major source of nutrition to the
eye
The choroid coat is heavily pigmented and helps
to reduce backscatter

At its anterior choroid is divided into


▪ Ciliary body
▪ Iris
2. Choroid (Contd…)
Iris – contracts or expands to control the amount of light that enters the eye.
Pupil – the central opening of the eye – varies in diameter from 2 mm to 8 mm.
The lens is made up of concentric layers of fibrous cells and is suspended by fibers that attach to the ciliary body.
▪ It contains 60 to 70 percent water
▪ About 6 percent fat, and more protein than any other tissue in the eye
▪ Colored by a slightly yellow pigmentation that increases with age
3. Retina
The innermost membrane of the eye

Discrete light receptors over the surface of the


retina afford the pattern vision

Receptors – two types
 Cones (photopic or bright-light vision)
 Rods (scotopic or dim-light vision)


Distribution of rods & cones

▪Except for the blind spot, the distribution of receptors is radially symmetric about the fovea.
▪Receptor density is measured in degrees from visual axis
▪Cones are most dense in the center area of the fovea which is the reason for
photopic vision.
▪Rods increase in density from the center out to approximately 20° off axis and
then decrease in density out to the extreme periphery of the retina.
Rods and Cones
Rods
Rods are responsible for scotopic or dim light vision.
About 75 to 150 million rods are distributed over the retinal surface. Several rods are connected to a single nerve end, which reduces the amount of detail discernible by these receptors.
Rods are not involved in color vision and are sensitive to low levels of illumination.
That is why brightly colored objects in daylight appear colorless when seen by moonlight.
Rods and Cones
Cones
Cone vision is called photopic or bright-light vision.

There are about 6 to 7 million cones present in


each eye that are located primarily in the fovea.
Cones are highly sensitive to color and are
responsible for color vision.
Each cone is connected to its own nerve
which helps us to resolve fine details.

Blind Spot
The photoreceptors are absent at the spot from which the optic nerve emerges; this is known as the blind spot.
Image Formation

In the figure the observer is looking at a tree 15 m high at a distance of 100 m. If h is the height in mm of that object in the retinal image, then

15/100 = h/17

h = 2.55 mm
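The calculation above can be wrapped in a small helper. The 17 mm figure is the approximate distance from the lens centre to the retina used in the example; the geometry is just similar triangles.

```python
# Similar triangles: object_height / object_distance = h / 17 (mm),
# where 17 mm approximates the distance from lens centre to retina.
def retinal_image_height_mm(object_height_m, object_distance_m):
    return (object_height_m / object_distance_m) * 17

h = retinal_image_height_mm(15, 100)   # the 15 m tree viewed from 100 m
print(round(h, 2))                     # 2.55
```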
Perceived Brightness
Perceived Brightness is not a simple
function of intensity

Two phenomena clearly demonstrate this


▪ Mach Band Effect
▪ Simultaneous Contrast
Machband Effect
▪Mach band effect describes
an effect where the human
mind subconsciously
increases the contrast
between two surfaces with
different luminance.

▪It is based on the fact that


Human Visual System tends
to overshoot or undershoot
around the boundary of
regions of different
intensities

.
Although the intensity of the stripes shown in the figure is constant, the HVS perceives a brightness pattern that is strongly scalloped, especially near the boundaries. These seemingly scalloped bands are called Mach bands, after Ernst Mach (1865).
Simultaneous Contrast

This phenomenon is related to the fact that a region's perceived brightness does not depend simply on its intensity. All the squares have exactly the same intensity. However, they appear to become darker as the background gets lighter.
Optical Illusion
Optical Illusion

Optical Illusion
The eye fills in non-existent information or wrongly perceives geometrical properties of objects.
Brightness Adaptation and Discrimination

▪Shows a plot between subjective


brightness and logarithmic intensity

▪The long solid line shows the


entire range of intensities from
scotopic threshold to glare limit

▪HVS cannot operate at such a large


range simultaneously

▪Rather the transition from scotopic


to photopic takes place gradually
Weber Ratio

The Weber ratio is a measure of the ability of the eye to discriminate changes in light intensity at any specific adaptation level.

Weber Ratio = ΔIc / I

▪ I → the background illumination
▪ ΔIc → the increment of illumination
▪ A small Weber ratio indicates good discrimination
▪ A large Weber ratio indicates poor discrimination
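A tiny illustration of the ratio, using made-up illumination readings:

```python
# Weber ratio = delta_Ic / I: smaller means better intensity discrimination.
def weber_ratio(delta_ic, background_i):
    return delta_ic / background_i

# Hypothetical readings against the same background illumination of 10:
print(weber_ratio(0.5, 10))   # 0.05 -> a small just-noticeable increment: good
print(weber_ratio(5.0, 10))   # 0.5  -> a large increment is needed: poor
```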
1.6
Image Sensing
And
Acquisition
Image Acquisition Process
Types of Sensors
▪ Motionless imaging
▪ Sensor is kept still during the acquisition (e.g., CCD cameras)
▪ Motion-aided imaging
▪ Sensor moves along a line or rotates around a center during
the acquisition (e.g., document scanning and MRI scanning)

Types of Sensors
▪ Single Sensor
▪ Line Sensor
▪ Array Sensor
Single Sensor
▪The sensing material transforms the
illumination energy into an electrical
voltage
▪Filters are used to increase the
selectivity of the sensor element.
▪Example: Photodiodes are single
element sensors that convert light
energy into electrical energy.

▪A film negative is mounted onto


a drum
▪Mechanical rotation provides
displacement in one dimension.
▪The single sensor is mounted on
a lead screw that provides motion
in the perpendicular direction.
▪Produces high-resolution
images.
Line Sensor (Sensor Strips)
Line Sensor

▪ A sensor strip consists of an in-line arrangement of sensors


▪ Both 2-D and 3-D images can be acquired using sensor strips.

Acquisition of 2-D Image

▪ The strip provides imaging


elements in one direction.
▪ Motion perpendicular to the
strip provides imaging in the
other direction resulting in 2-D
image.
▪ This is the type of arrangement used in most flatbed scanners.

Image Acquisition using Linear Sensor Strip

Application
Used in airborne imaging applications, in which the imaging system is mounted on an aircraft
that flies at a constant altitude and speed over the geographical area to be imaged.
Image Acquisition using Circular Sensor Strip
▪To obtain 3D images
▪Sensor strips mounted in a ring configuration
▪A rotating X-ray source provides illumination
▪Portion of the sensors opposite to the source
collect the X-ray energy that pass through the
object
▪The output of the sensors is processed by
reconstruction algorithms to transform the sensed
data into meaningful cross-sectional images.
▪3-D digital volume consisting of stacked images
is generated

Applications
•Medical and industrial Computerized Axial Tomography (CAT)
•Magnetic Resonance Imaging (MRI) and
•Positron Emission Tomography (PET) imaging.
Array Sensor
▪Image sensor arrays are used in
electromagnetic, ultrasonic sensing devices and
in digital cameras.
▪CCD and CMOS sensors are used to form the
sensor array.
▪The response of each sensor is proportional to
the integral of the light energy projected onto the
surface of the sensor.
▪Since the sensor array is two dimensional, the
complete image can be obtained by focusing the
energy pattern onto the surface of the array.
Digitised Image using Sensor Array
Digital Camera
1.7
Image Sampling
and
Quantisation
Image Sampling and Quantisation
Conversion of analog to digital images involves two steps:
▪Sampling
▪Quantisation

Sampling:
Digitizing the coordinate values is called sampling.

Quantisation:
Digitizing the amplitude values is called quantisation.

Analog Image → Sampling → Quantisation → Digital Image
Sampling

•The one-dimensional function shown in Fig (b) is a plot of amplitude


(gray level) values of the continuous image along the line segment AB in
(a).
•The random variations are due to image noise.
•To sample this function, we take equally spaced samples along line AB,
as shown in Fig. 2.16(c).
•The location of each sample is given by a vertical tick mark in the bottom
part of the figure.
•The samples are shown as small white squares superimposed on the
function. The set of these discrete locations gives the sampled function.
Quantisation

•In order to form a digital function, the gray-level values also must be
converted (quantized) into discrete quantities.
•The right side of Fig. 2.16(c) shows the gray-level scale divided into eight
discrete levels, ranging from black to white.
•The vertical tick marks indicate the specific value assigned to each of the
eight gray levels.
•The continuous gray levels are quantized simply by assigning one of the
eight discrete gray levels to each sample. The assignment is made
depending on the vertical proximity of a sample to a vertical tick mark.
•The digital samples resulting from both sampling and quantization are
shown in Fig. 2.16(d).
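The two steps described above can be sketched on a 1-D signal. The intensity profile below is hypothetical (a smooth sinusoid standing in for the gray levels along line AB); sampling takes equally spaced values of it, and quantisation snaps each sampled amplitude to one of L discrete levels.

```python
import math

# Sampling: take equally spaced samples of a continuous function f(t).
# Quantisation: snap each sampled amplitude to one of `levels` values.
def sample_and_quantise(f, t_max, num_samples, levels, amp_max):
    samples = [f(t_max * n / (num_samples - 1)) for n in range(num_samples)]
    step = amp_max / (levels - 1)
    return [round(s / step) * step for s in samples]

# A smooth 0..1 intensity profile, sampled 9 times, quantised to 8 levels.
profile = lambda t: 0.5 + 0.5 * math.sin(2 * math.pi * t)
digital = sample_and_quantise(profile, 1.0, 9, 8, 1.0)
print(digital)
```

The result is a finite list of discrete amplitudes: a digital function, as in Fig. 2.16(d).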
Using Sensor Array

When a sensing array is used for image acquisition, there is no motion and the
number of sensors in the array establishes the limits of sampling in both
directions.
Figure (a) shows a continuous image projected onto the plane of an array
sensor.
Figure 2.17(b) shows the image after sampling and quantization.
Clearly, the quality of a digital image is determined to a large degree by the
number of samples and discrete gray levels used in sampling and quantization.
Resolution

Resolution of a digital image depends on two factors


i)The number of picture elements in the image (M x N)
which in turn depends on the sampling rate
ii)The number of quantisation levels (L)

Two types of Resolution


▪ i)Spatial Resolution
▪ ii)Intensity Resolution
Spatial Resolution
Definition : Spatial resolution is the smallest discernible detail in an
image.
unit: line pairs per unit distance, dots per unit distance, or dpi (dots per inch)
Sampling is the principal factor determining the spatial resolution of an
image.
Aliasing in Images (Moiré Patterns)
If the function is undersampled, a phenomenon called aliasing corrupts the
sampled image.
The corruption takes the form of additional frequency components being
introduced into the sampled function.
These are called aliased frequencies.
The effect of aliased frequencies can be seen in the form of 'Moiré patterns'.
Effect of resolution on image interpretation
(a) 8x8 image (b) 32x32 image (c) 256x256 image
Image Resampling (Checkerboard Pattern)
The checkerboard pattern is a degradation that results from reducing spatial resolution.
(The same image shown at 1024 x 1024, 512 x 512, 256 x 256, 128 x 128, 64 x 64, and 32 x 32.)
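The resolution pyramid above can be reproduced by subsampling with array slicing. A minimal sketch, with a hypothetical gradient image standing in for the photograph:

```python
import numpy as np

# Hypothetical 1024x1024 gradient image used to illustrate subsampling.
img = np.fromfunction(lambda y, x: (x + y) % 256, (1024, 1024))

# Halve the spatial resolution repeatedly by keeping every second pixel
# in each direction: 1024 -> 512 -> 256 -> 128 -> 64 -> 32.
pyramid = [img]
for _ in range(5):
    pyramid.append(pyramid[-1][::2, ::2])

print([level.shape for level in pyramid])
```

Displaying each level scaled back to the original size makes the checkerboard degradation visible at the coarser resolutions.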
Intensity Resolution
▪It refers to the smallest discernible change in gray level and
it depends on the number of gray levels.
▪Quantization is the principal factor determining the Intensity
resolution of an image.
▪The most commonly used is 2^8, that is, 256 gray levels.
False Contouring
Using an insufficient number of gray levels in smooth areas of a digital
image causes false contouring.
(The same image quantized to 256, 16, and 4 levels.)
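Requantizing a smooth image to fewer gray levels makes the effect measurable. A sketch, using a hypothetical ramp image and a hypothetical `reduce_gray_levels` helper:

```python
import numpy as np

# Hypothetical smooth ramp image; coarse quantization of such smooth
# regions is exactly what produces visible false contours.
img = np.linspace(0, 255, 256 * 256).reshape(256, 256).astype(np.uint8)

def reduce_gray_levels(image, levels):
    """Requantize an 8-bit image to the given number of gray levels."""
    step = 256 // levels
    return (image // step) * step

img16 = reduce_gray_levels(img, 16)   # 16 levels: banding appears
img4 = reduce_gray_levels(img, 4)     # 4 levels: strong false contours
print(len(np.unique(img16)), len(np.unique(img4)))
```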
Basic Relationship between Pixels

Pixels are basic elements which form the image and their
proper manipulation gives rise to different appearances.

▪Neighbors of a pixel
▪Adjacency
▪Connectivity
▪Path or Curve
▪Regions and Boundaries
▪Edge
▪Distance Measures
Neighbors of Pixels – Conventional Indexing Method

The neighborhood relation is used for analyzing regions in an image.

Three types
▪4 –Neighbors
▪D-Neighbors
▪8-Neighbors
4 Neighbors of a Pixel

▪Any pixel p(x, y) has two vertical and two horizontal neighbors, given by
(x+1, y), (x-1, y), (x, y+1), (x, y-1)
▪This set of pixels is called the 4-neighbors of p, and is denoted by N4(p).
▪Each of them is at a unit distance from p(x, y)

           (x,y-1)
  (x-1,y)     p     (x+1,y)        N4(p) = { (x-1,y), (x+1,y), (x,y-1), (x,y+1) }
           (x,y+1)

The 4-neighborhood relation considers only vertical and horizontal neighbors.

Note: q ∈ N4(p) implies p ∈ N4(q)
Diagonal Neighbors of a Pixel

• The four diagonal neighbors of p(x,y) are given by
(x+1, y+1), (x+1, y-1), (x-1, y+1), (x-1, y-1)
• This set is denoted by ND(p).

  (x-1,y-1)         (x+1,y-1)
              p                    ND(p) = { (x-1,y-1), (x+1,y-1), (x-1,y+1), (x+1,y+1) }
  (x-1,y+1)         (x+1,y+1)

▪The diagonal-neighborhood relation considers only diagonal neighbor pixels.
▪Each of them is at a Euclidean distance of √2 ≈ 1.414 from p.
8 Neighbors of a Pixel

The 8-neighborhood relation considers all neighbor pixels.

The 4-neighbors and the diagonal neighbors together form the 8-neighbors of the
pixel.

  (x-1,y-1)  (x,y-1)  (x+1,y-1)
  (x-1,y)       p     (x+1,y)      N8(p) = N4(p) ∪ ND(p)
  (x-1,y+1)  (x,y+1)  (x+1,y+1)
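The three neighborhoods can be sketched as small Python helpers (coordinates only; bounds checking at the image border is left out of this sketch):

```python
def n4(p):
    """4-neighbors of pixel p = (x, y): horizontal and vertical only."""
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(p):
    """Diagonal (D-) neighbors of pixel p = (x, y)."""
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def n8(p):
    """8-neighbors: union of the 4-neighbors and the diagonal neighbors."""
    return n4(p) | nd(p)

print(sorted(n8((1, 1))))
```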
Connectivity

Connectivity is adapted from the neighborhood relation.


Two pixels are connected if they are in the same class (i.e. the
same color or the same range of intensity) and they are
neighbors of one another.

For p and q from the same class:

▪4-connectivity: p and q are 4-connected if q ∈ N4(p)
▪8-connectivity: p and q are 8-connected if q ∈ N8(p)
▪mixed connectivity (m-connectivity): p and q are m-connected if
   q ∈ N4(p), or
   q ∈ ND(p) and N4(p) ∩ N4(q) = ∅
Adjacency

A pixel p is adjacent to pixel q if they are connected.


Two image subsets S1 and S2 are adjacent if some pixel in S1 is adjacent to some
pixel in S2

We can define three types of adjacency:


▪4-adjacency,
▪8-adjacency
▪m-adjacency
Path

A path from pixel p at (x,y) to pixel q at (s,t) is a sequence


of distinct pixels:
(x0,y0), (x1,y1), (x2,y2),…, (xn,yn)
such that
(x0,y0) = (x,y) and (xn,yn) = (s,t)
and
(xi,yi) is adjacent to (xi-1,yi-1), i = 1,…,n

We can define 3 types of path


▪4-path
▪8-path
▪m-path
depending on type of adjacency.
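The path definition above can be sketched as a breadth-first search for a 4-path over a hypothetical binary image (the dict representation and `find_4_path` helper are assumptions for illustration):

```python
from collections import deque

def find_4_path(img, p, q):
    """Breadth-first search for a 4-path of 1-pixels from p to q.
    img: dict mapping (x, y) -> 0/1. Returns the path as a list, or None."""
    if img.get(p, 0) != 1 or img.get(q, 0) != 1:
        return None
    parent = {p: None}
    frontier = deque([p])
    while frontier:
        x, y = frontier.popleft()
        if (x, y) == q:
            # Reconstruct the path by walking parents back to p.
            path, node = [], (x, y)
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for r in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if img.get(r, 0) == 1 and r not in parent:
                parent[r] = (x, y)
                frontier.append(r)
    return None

# Tiny example: an L-shaped run of 1-pixels.
img = {(0, 0): 1, (1, 0): 1, (1, 1): 1, (2, 1): 0}
print(find_4_path(img, (0, 0), (1, 1)))   # [(0, 0), (1, 0), (1, 1)]
```

Swapping the neighbor tuple for the 8- or m-neighborhood yields 8-paths and m-paths.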
Distance

For pixel p, q, and z with coordinates (x,y), (s,t) and (u,v),

D is a distance function or metric if

▪D(p,q) ≥ 0, with D(p,q) = 0 if and only if p = q

▪D(p,q) = D(q,p)

▪D(p,z) ≤ D(p,q) + D(q,z)

Types
▪Euclidean Distance
▪Cityblock Distance
▪Chessboard Distance

Euclidean distance

De(p, q) = √[(x − s)² + (y − t)²]
City-block distance

D4-distance (city-block distance) is defined as

D4(p, q) = |x − s| + |y − t|

Pixels at D4 distance ≤ 2 from the center pixel p:

        2
    2   1   2
2   1   0   1   2
    2   1   2
        2

Pixels with D4(p) = 1 are the 4-neighbors of p.
Chessboard Distance

D8-distance (chessboard distance) is defined as

D8(p, q) = max(|x − s|, |y − t|)

Pixels at D8 distance ≤ 2 from the center pixel p:

2   2   2   2   2
2   1   1   1   2
2   1   0   1   2
2   1   1   1   2
2   2   2   2   2

Pixels with D8(p) = 1 are the 8-neighbors of p.
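The three distance measures follow directly from the definitions above; a sketch with hypothetical helper names:

```python
import math

def d_euclidean(p, q):
    """Euclidean distance between pixels p = (x, y) and q = (s, t)."""
    (x, y), (s, t) = p, q
    return math.hypot(x - s, y - t)

def d4(p, q):
    """City-block (D4) distance: |x - s| + |y - t|."""
    (x, y), (s, t) = p, q
    return abs(x - s) + abs(y - t)

def d8(p, q):
    """Chessboard (D8) distance: max(|x - s|, |y - t|)."""
    (x, y), (s, t) = p, q
    return max(abs(x - s), abs(y - t))

p, q = (0, 0), (3, 4)
print(d_euclidean(p, q), d4(p, q), d8(p, q))   # 5.0 7 4
```

Note that D4 = 1 picks out exactly the 4-neighbors and D8 = 1 exactly the 8-neighbors, matching the diagrams above.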
1.9 Color Models
 Why use color in image processing?
• Color is a powerful descriptor
 Object identification and extraction
 eg. Face detection using skin colors
 Two categories of color image processing
• Full color processing
 Images are acquired with a full-color sensor or equipment
• Pseudo-color processing
 Used historically, when full-color sensors and processing
hardware were not available
 Colors are assigned to ranges of monochrome
intensities
 In 1666, Isaac Newton showed that white light splits into a spectrum of colors
 Chromatic light spans the electromagnetic (EM) spectrum from 400 to 700 nm
 The color that human perceive in an
object = the light reflected from the
object
(Figure: light from an illumination source reflects off the scene into the eye.)
 Radiance: total amount of energy that flow from the
light source, measured in watts (W)
 Luminance: amount of energy an observer
perceives from a light source, measured in lumens
(lm)
 Brightness: subjective descriptor that is hard to
measure, similar to the achromatic notion of
intensity
 The eye contains 6–7 million cones, its color sensors
 Three principal sensing categories in the eye:
• red light ~65%, green light ~33%, blue light ~2%
 In 1931, the CIE (International Commission on
Illumination) defined specific wavelength values for
the primary colors:
• B = 435.8 nm, G = 546.1 nm, R = 700 nm
• However, no single wavelength may be called
red, green, or blue
 Secondary colors: G+B = Cyan, R+G = Yellow, R+B = Magenta
Primary and secondary colors
 Color TV
Chromaticity diagram
Color model, color space, color system
• Specify colors in a standard way
• A coordinate system that each color is
represented by a single point

Types:
▪RGB model - Used in hardware systems like monitors, Television
▪CMY model – Color Printing
▪CMYK model – Color Printing
▪HSI model – Human perception & Image Processing Algorithm
 Pixel depth: the number of bits used to
represent each pixel in RGB space
 Full-color image: 24-bit RGB color image
• (R, G, B) = (8 bits, 8 bits, 8 bits)
Total number of colors that can be represented: 2^24 = 16,777,216
 A subset of colors is enough for some
applications
 Safe RGB colors (safe Web colors, safe
browser colors) → colors that can be
faithfully reproduced by any system

6^3 = 216 colors are called safe colors, the de facto standard for Internet applications
Full color cube Safe color cube
 CMY: secondary colors of light, or
primary colors of pigments
 Used to generate hardcopy output

    [C]   [1]   [R]
    [M] = [1] - [G]
    [Y]   [1]   [B]
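A minimal sketch of the CMY conversion above, assuming RGB values normalized to [0, 1] (the `rgb_to_cmy` helper name is an assumption for illustration):

```python
import numpy as np

def rgb_to_cmy(rgb):
    """Convert normalized RGB (values in [0, 1]) to CMY: C = 1 - R, etc."""
    return 1.0 - np.asarray(rgb, dtype=float)

print(rgb_to_cmy([1.0, 0.0, 0.0]))   # pure red -> [0. 1. 1.]
```

Applying the same subtraction again recovers RGB, since the transform is its own inverse.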
 Will you describe a color using its R, G, B
components?
 Humans describe a color by its hue,
saturation, and brightness
• Hue : color attribute
• Saturation: purity of color (white->0, primary
color->1)
• Brightness: achromatic notion of intensity
 RGB -> HSI model
(Figure: colors on this triangle have the same hue; the intensity line runs through the cube from black to white.)
HSI model
