
Machine Vision & its Applications

Vision systems configuration


Illumination
Lighting: three acceptance criteria
Maximize the contrast on the features of interest
Minimize the contrast elsewhere
Provide for a measure of robustness

4 cornerstones of vision illumination
Geometry - the 3-D spatial relationship among sample, light and camera.
Structure, or Pattern - the shape of the light projected onto the sample.
Wavelength, or Color - how the light is differentially reflected or absorbed by the sample and its immediate background.
Filters - differentially blocking and passing wavelengths and/or light directions.
Camera signal chain
Photodiodes in the camera produce an analog voltage output; an ADC converts it into digital values.
ADC - 8 bit: 2^8 = 256 intensity levels
ADC - 16 bit: 2^16 = 65,536 gray levels


Contrast
Contrast = Imax - Imin
For an 8-bit image, intensities span 0-255: Imax = 255 (white), Imin = 0 (dark).
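As a small illustration (a minimal sketch in Python with NumPy; the function and variable names are ours, not from the slides), this contrast measure can be computed directly:

    import numpy as np

    def simple_contrast(image):
        # Contrast = Imax - Imin: spread between brightest and darkest pixels
        return int(image.max()) - int(image.min())

    img = np.array([[0, 64], [128, 255]], dtype=np.uint8)  # synthetic 8-bit image
    print(simple_contrast(img))  # 255: full contrast (Imax = 255, Imin = 0)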
Factors to be considered for lighting selection
Is it a monochrome or colour application?

Is it a high speed or low speed application?

What physical size (area) needs to be illuminated?

What is the nature of the object - geometry, reflectivity?

What is the feature of particular interest?

What does the service life of the light need to be?

What are the mechanical constraints/environmental considerations (background, physical size)?
Basic Lighting Strategy
If the material is opaque, front lighting may be needed. If it is translucent or transparent, backlighting tends to be the better option.
A transparent material generally will require a diffuse lighting source; otherwise, any unevenness in the lighting could begin to look like a component feature or a defect to the inspection system.

When inspecting translucent material such as paper, backlighting is still probably the best option: translucent material transmits light but causes sufficient diffusion to prevent perception of distinct images.
Machine Vision Lighting Techniques
Dark field Illumination
Bright field Illumination
On-Axis (Coaxial) Lighting
Structured Lighting
Back Lighting
Few Examples of practical Applications
Camera
A camera can be thought of as a device to convert an optical image into
an electronic or film rendition of that optical image.

Camera types: line scan, area scan, and intelligent cameras.
Area Scan Camera
Random access method for transferring the pixel information rather than using a shift register (CMOS sensors).
Light sensitive photodiodes positioned accurately in a matrix, with an associated shift register used to transfer the charge (CCD sensors).
CCD Concepts
Willard Boyle and George E. Smith were awarded the Nobel Prize in Physics for the invention of the CCD (October 6, 2009).
CCD VS CMOS
Interline transfer CCD
Fast shuttering/Electronic shuttering
Low fill factor and dynamic range
Low image quality
Frame Transfer CCD
Full Frame CCD

CCD vs CMOS charge transfer
Timing of the charge transfer to the vertical CCD depends on the frame or field rate of the camera.
Color Imaging with a Single CCD Array
To achieve a color image, we must measure light at three different wavelength ranges known as colors: red, green, and blue.

There are two basic methods of color imaging using CCD arrays. One uses a single
CCD, the other requires three CCDs, one for each color.

Green is chosen to have twice the number of pixels as red or blue because our eyes are most sensitive to green light.
Because four pixels must be combined to form one color dot, the resolution of the image is less than that of a monochrome image.

Bayer Pattern for Color CCD Imaging Arrays.
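A hedged sketch of separating the Bayer mosaic into color planes (Python with NumPy; an RGGB cell layout is assumed here, since the exact layout is not fixed above):

    import numpy as np

    def split_bayer_rggb(mosaic):
        # Each 2x2 cell carries one red, two green, and one blue sample
        r = mosaic[0::2, 0::2]                      # red sites
        g = (mosaic[0::2, 1::2].astype(np.float32)
             + mosaic[1::2, 0::2]) / 2.0            # average the two green sites
        b = mosaic[1::2, 1::2]                      # blue sites
        return r, g, b

    mosaic = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
    r, g, b = split_bayer_rggb(mosaic)
    # Four mosaic pixels form one color dot, so each plane is quarter resolution:
    print(r.shape, g.shape, b.shape)                # (2, 2) (2, 2) (2, 2)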


Color Imaging
A 3-CCD camera captures each color on a separate CCD array.
For digital image processing, hue refers to the wavelength of the color as seen by the camera. Hue is the parameter of color that allows us to distinguish between colors.
Lens
The camera will require a lens to project the correct image onto the camera sensor.

Basic points to be considered for lens selection

The illumination in use

The specifications of the camera

The object size & geometry

Physical constraints, i.e. working distance and available space
Aperture
A small aperture results in a small amount of background blur, which typically is ideal for things like landscape and architectural photography.
Aperture
Aperture is a fraction: the f-number is the focal length divided by the diameter of the opening, so f/8 denotes an opening one eighth of the focal length.
Shutter speed
Quick shutter speeds freeze action, while long shutter speeds create an effect of motion when you photograph moving objects. Shutter speeds typically range from 1/8000th of a second to 30 seconds.
(Example image: shutter speed 1/1600th second, a fast shutter speed.)
(Example image: shutter speed 1/2000 second, quite fast.)
Focal Length

Focal length, usually represented in millimeters (mm), is the basic description of a


photographic lens. It is not a measurement of the actual length of a lens, but a
calculation of an optical distance from the point where light rays converge to form a
sharp image of an object to the digital sensor or 35mm film at the focal plane in the
camera. The focal length of a lens is determined when the lens is focused at infinity.
The focal length tells us the angle of view—how much of the scene will be captured—
and the magnification—how large individual elements will be. The longer the focal
length, the narrower the angle of view and the higher the magnification. The shorter
the focal length, the wider the angle of view and the lower the magnification.
Focal Length
Observe and infer?
LENS ELEMENTS
Lens Transfer Function
Lenses act as low pass filters.
The amount of attenuation of a given frequency (or detail) is classified in terms of the Modulation Transfer Function (MTF), which gives an indication of the lens's transfer efficiency.
The MTF reflects the resolving power of the lens and is usually quoted in line-pairs per millimeter (lp/mm).
Ideally the lens is capable of resolving down to the individual pixels on the camera sensor.
A Modulation Transfer Function (MTF) quantifies how well a subject's regional brightness variations are preserved when they pass through a camera lens.

Rayleigh limit (line pairs per mm) = 1 / (1.22 · N · ω), where N is the f-number and ω the wavelength of light (in mm).

(Figure: diffraction pattern.)
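A short worked example of this limit (Python; green light with wavelength 0.55 µm = 0.00055 mm is assumed for illustration):

    def rayleigh_limit_lp_per_mm(f_number, wavelength_mm=0.00055):
        # Diffraction-limited resolving power in line pairs per millimeter
        return 1.0 / (1.22 * wavelength_mm * f_number)

    print(round(rayleigh_limit_lp_per_mm(2.8)))  # ~532 lp/mm at f/2.8
    print(round(rayleigh_limit_lp_per_mm(8)))    # ~186 lp/mm at f/8; stopping
                                                 # down increases diffraction blur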
Spherical Aberration
Spherical lens elements do not focus the light into a single point, but spread it out, creating an image that appears de-focused, or blurred.
Chromatic Aberration
Chromatic aberration is the result of
different wavelengths of light (colour)
being refracted by different amounts as the
light passes through glass elements within
the lens.
LENS TYPES
Standard Resolution C-Mount
Fixed focal lengths from 4.8-200 mm. Very good depth of field and high MTF; an MTF of 70-90 lp/mm and negligible distortion make them ideal for many machine vision applications.
Macro C-Mount
High Resolution C-Mount
50 mm, with an MTF in excess of 120 lp/mm and very low distortion.
LENS TYPES
Telecentric Lenses
Large Format Lenses - F-Mount, 42 mm and 72 mm mounts
Micro Head - 12 mm and 17 mm mounts, for remote head cameras
Telecentric Lenses
When a thick object (thickness > 1/10
FOV diagonal) must be measured
When different measurements on
different object planes must be carried
out
When the object-to-lens distance is
not exactly known or cannot be
predicted
When holes must be inspected or
measured
When the profile of a piece must be
extracted
When the image brightness must be
very even
When a directional illumination and a
directional “point of view” are
required.
A conventional machine-vision lens passes light through an aperture stop, through the glass lens itself, and projects an image onto a camera detector.
In an object-sided telecentric lens, the principal ray remains parallel to the optical axis for objects within the telecentric depth. An image-sided telecentric lens maintains the dimensional information in an image within the telecentric depth. A bilateral telecentric lens includes elements of both object-sided and image-sided telecentric lenses.
FRAME GRABBER
The functional block that synchronizes image acquisition and transfers
data from the camera to the computer.

The peripheral component interconnect (PCI) bus provided high-speed data transfer and reduced or eliminated the need for local storage for some applications.
(Figure: block diagram of a Camera Link frame grabber.)
FRAME GRABBER
Frame grabbers performed digitization, synchronization, data formatting, local storage, and transfer to the computer.
(Figure: block diagram of an analog video frame grabber.)
Interface Signal Types
Parallel Digital
Cameras that use a parallel interface have a signal for each bit of data and also for each control signal, carried as differential signals (either LVDS or RS-422). This interface is a 'point-to-point' structure rather than a bus system, so the timing between the camera and the framegrabber is deterministic, i.e. the time between image capture and that image arriving in the host PC is predictable.
USB (Universal Serial Bus)
USB does away with the need to install an interface card, as the data I/O is handled by the host PC. USB 2.0 increases the speed of the peripheral-to-PC connection from 12 Mb/s to 480 Mb/s (60 MB/s).
FRAME GRABBER
IEEE-1394 FireWire
DCAM stands for '1394-based Digital Camera Specification'.
One asynchronous channel for camera register control, and an isochronous (asynchronous data transfer over a synchronous link) channel which handles video data.

Camera Link™
Camera Link™ is a new concept for digital camera interfacing that enables higher data transfer speeds using fewer wires, with standardized connectors, cabling and data protocols.
Other interfaces: Gigabit Ethernet, analogue video.
Image Transfer Methods
Direct transfer
Framegrabber
Framegrabber with pre-processing
Frame grabber components: formatting the image data, ADC
Image Formation
Continuous tone image
A typical black and white photograph is composed of shades of gray spanning from black to white and is known as a continuous tone image.
Pixel
A sample is often referred to as a pixel because it represents a discrete element of the digital image.
The row coordinate represents the line number and the column coordinate represents the pixel number in that line.
Image formation
• Optical parameters of the lens
– lens type
– focal length
– field of view
• Photometric parameters
– type, intensity, and direction of illumination
– reflectance properties of the viewed surfaces
• Geometric parameters
– type of projections
– position and orientation of camera in space
– perspective distortions introduced by the imaging process
Spatial Resolution
How many pixels is our digital image divided into?
Establishing the number of samples required in a digital image depends on the spatial frequency.
Spatial frequency
The rate at which the brightness of an image changes from dark to light.
Spatial resolution takes into account the rate of change of brightness in an image going from left to right as well as from top to bottom.
Sampling rate (Nyquist criterion)
Twice the highest spatial frequency of the detail in the image.
How are images represented in the computer?
(Example: color image.)
Image sampling (example): original image; sampled by a factor of 2; sampled by a factor of 4; sampled by a factor of 8.
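A minimal sketch of the subsampling used in this example (Python with NumPy; plain pixel skipping without anti-alias filtering is assumed):

    import numpy as np

    def sample(image, factor):
        # Keep every factor-th pixel in both directions; detail above half the
        # new sampling rate violates the Nyquist criterion and aliases
        return image[::factor, ::factor]

    img = np.random.randint(0, 256, (512, 512), dtype=np.uint8)
    for f in (2, 4, 8):
        print(f, sample(img, f).shape)  # (256, 256), (128, 128), (64, 64)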


CCD Camera and Image Transfer
Tiny solid state cells convert light energy into electrical charge.
The image plane acts as a digital memory that can be read row by row by a computer.
Usually, a CCD camera plugs into a computer board (frame grabber).
The frame grabber digitizes the signal and stores it in its memory (frame buffer).
Image digitization
Sampling means measuring the
value of an image at a finite
number of points.
Quantization is the representation of the measured value at the sampled point by an integer.
Digital image
• An image is represented by a rectangular array of integers.
• An integer represents the brightness or darkness of the image at that point.
• N: # of rows, M: # of columns, Q: # of gray levels
– N = 2^n, M = 2^m, Q = 2^q (q is the # of bits/pixel)
– Storage requirement: N × M × q bits (e.g., N = M = 1024, q = 8 gives 1 MB)

             | f(0,0)     f(0,1)     ...  f(0,M-1)   |
    f(x,y) = | f(1,0)     f(1,1)     ...  f(1,M-1)   |
             | ...        ...        ...  ...        |
             | f(N-1,0)   f(N-1,1)   ...  f(N-1,M-1) |
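The storage arithmetic can be checked with a few lines (Python with NumPy assumed):

    import numpy as np

    N, M, q = 1024, 1024, 8               # rows, columns, bits per pixel
    f = np.zeros((N, M), dtype=np.uint8)  # f(x, y), gray levels 0 .. 2**q - 1

    storage_bytes = N * M * q // 8        # N x M x q bits, converted to bytes
    print(storage_bytes)                  # 1048576 bytes = 1 MB, as above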
Image Quantization
Image quantization: discretize continuous pixel values into discrete numbers.
Color resolution / color depth / levels:
- No. of bits representing each pixel value
- No. of colors or gray levels, Nc, given by Nc = 2^b, where b = no. of bits
Intensity Level Resolution
• Intensity level resolution refers to the number of intensity levels used to represent the image.
– The more intensity levels used, the finer the level of detail discernible in an image.
– Intensity level resolution is usually given in terms of the number of bits used to store each intensity level.

Number of Bits   Number of Intensity Levels   Examples
1                2                            0, 1
2                4                            00, 01, 10, 11
4                16                           0000, 0101, 1111
8                256                          00110011, 01010101
16               65,536                       1010101010101010
Quantization function
(Figure: light intensity from darkest to brightest is mapped to quantization levels 0, 1, ..., Nc-2, Nc-1.)
Intensity Level Resolution (examples)
256 grey levels (8 bits per pixel); 128 grey levels (7 bpp); 64 grey levels (6 bpp); 32 grey levels (5 bpp); 16 grey levels (4 bpp); 8 grey levels (3 bpp); 4 grey levels (2 bpp); 2 grey levels (1 bpp)
Image quantization (example)
256 gray levels (8 bits/pixel); 32 gray levels (5 bits/pixel); 16 gray levels (4 bits/pixel); 8 gray levels (3 bits/pixel); 4 gray levels (2 bits/pixel); 2 gray levels (1 bit/pixel)
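A sketch of the requantization behind these examples (Python with NumPy; uniform quantization steps are assumed):

    import numpy as np

    def requantize(image, bits):
        # Reduce an 8-bit image to Nc = 2**bits gray levels
        step = 256 // (2 ** bits)
        return (image // step) * step     # snap each pixel to its level

    img = np.arange(256, dtype=np.uint8).reshape(16, 16)
    print(len(np.unique(requantize(img, 5))))  # 32 gray levels (5 bpp)
    print(len(np.unique(requantize(img, 1))))  # 2 gray levels (1 bpp)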


Image Preprocessing
History of image processing
1964 - NASA's Jet Propulsion Laboratory began working on computer algorithms to improve images of the moon.
- Images were transmitted by the Ranger 7 probe.
- Corrections were desired for distortions inherent in the on-board camera.
Function of eyeglasses
Corrective eyeglasses serve to alter observed pictorial scenes in such a way that aberrations created by the eye are compensated for by correcting the image before its contact with the eye.
Brightness and contrast controls in an image or in a television set
Water in a pond alters the form of an image


Objective of image processing
To visually enhance or statistically evaluate some aspect of an image not readily apparent in its original form.
Techniques for image processing
Range from brightness and contrast control in TV to the flexibility and power of digital image processing.
Image Processing - Basics
Image operation
An image operation is any action upon an image that is defined from an application's standpoint.
Image process
Defines how a given operation is to be implemented. A process may:
- modify the appearance or quality of the image
- produce numeric information based on the image
- code an image into a new image form
Image Quality Enhancement
Subjective
To make an image more visually appealing; may be applied until an image achieves this goal.
Objective
To correct an image for known degradations; does not necessarily attempt to make the image more appealing.
Enhancements either deal with alterations of brightness within an image or modify the content of detail within an image.
Human Eye's response to light intensity
If the illumination intensity of the viewed object is changed, the viewer will not perceive an equal change in brightness.
A simple darkening of bright regions can bring out previously undetectable details.
Visual perception of brightness
Simultaneous contrast
An illusion where the perceived brightness of a region is dependent on the intensity of the surrounding area.
The two inner rectangles are exactly the same shade of grey, but the upper one appears to be a lighter grey than the lower one due to the background provided by the outer rectangles: when the average intensity of the surround is darker, perceived brightness is increased; when the average intensity of the surround is lighter, perceived brightness is decreased.
Mach Band Effect
(Figure: intensity step with positions A and B; intensity vs. position.)
In area A the perceived brightness is darker, while in area B it is brighter. This phenomenon is called the Mach Band Effect.
Contrast Enhancements
Contrast deals with the distribution of brightness within an image.
A high contrast image is one of intense boldness; a low contrast image is characterized by a washed-out look.
A well balanced image of 'good' contrast is composed of gray tones stretching from the dark blacks through the grays to the bright whites.
Contrast enhancement is used to alter the overall brightness of an image.
Spatial degradations and enhancements
Associated with the presentation of image details; 'spatial' deals with the two-dimensional nature of an image scene.
Image noise and geometrical distortions also fall into this category.
Extraction of object features not visible in the original.

Image histogram
The histogram relates the brightness distribution present in the image.
The brightness and contrast information provided by the histogram is invaluable when attempting to correct an image for these degradations.
Image coding
Operations serve to reduce the amount of information necessary to describe
an image.

Image compression reduces irrelevance and redundancy in the image data in order to store or transmit the data in an efficient form.
Human eye
The human eye focuses light onto a light sensitive membrane called the retina.
The cornea is a transparent structure found in the very front of the eye that helps to focus incoming light. Behind the cornea is a colored ring-shaped membrane called the iris. The iris has an adjustable circular opening called the pupil, which can expand or contract depending on the amount of light entering the eye. Situated behind the pupil is a colorless, transparent structure called the crystalline lens.
The retina is the innermost of three tissue layers that make up the eye. The outermost layer, called the sclera, is what gives most of the eyeball its white color. The middle layer between the retina and sclera is called the choroid. The choroid contains blood vessels that supply the retina with nutrients and oxygen and removes its waste products.
Embedded in the retina are millions of light sensitive cells, which come in two main varieties: rods and cones. Rods are good for monochrome vision in poor light, while cones are used for color and for the detection of fine detail.
Image frame processing
'Image frame' is used to denote an image in its entirety.
Frame processing includes contrast and spatial enhancements.
Histogram of an Image
An image histogram is a graphical representation of the tonal distribution in a digital
image.
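A minimal histogram computation (Python with NumPy assumed):

    import numpy as np

    def gray_histogram(image, levels=256):
        # Count how many pixels take each gray level (the tonal distribution)
        return np.bincount(image.ravel(), minlength=levels)

    img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    hist = gray_histogram(img)
    print(hist.sum())  # 4096, the total number of pixels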
Histogram sliding
Shift a complete histogram rightwards or leftwards. Due to this shifting of the histogram towards the right or left, a clear change can be seen in the image.
(Figures: original image; after histogram sliding by adding 50, the histogram moves right and the image brightens.)
Decreasing brightness using histogram sliding
All the pixel values have been shifted towards the left, and it can be validated from the image that the new image is darker; the original image now looks brighter compared to this new image.
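A sketch of histogram sliding (Python with NumPy; clipping to the valid 0-255 range is assumed):

    import numpy as np

    def slide_histogram(image, offset):
        # Positive offset slides the histogram right (brighter image),
        # negative offset slides it left (darker image)
        shifted = image.astype(np.int16) + offset
        return np.clip(shifted, 0, 255).astype(np.uint8)

    img = np.random.randint(0, 200, (8, 8), dtype=np.uint8)
    brighter = slide_histogram(img, 50)   # as in the 'adding 50' example above
    darker = slide_histogram(img, -50)    # decreasing brightness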
Histogram Stretching
(Figures: stretching function; example images with contrast = 225 and contrast = 240.)
In histogram stretching, the overall shape of the histogram remains the same.
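One common form of the stretching function is the min-max linear stretch, sketched below (Python with NumPy; the slide does not give the exact formula, so this form is an assumption):

    import numpy as np

    def stretch_histogram(image):
        # Map [Imin, Imax] linearly onto [0, 255]; the histogram's shape is
        # preserved, only its spread changes
        i_min, i_max = float(image.min()), float(image.max())
        out = (image - i_min) / (i_max - i_min) * 255.0
        return out.astype(np.uint8)

    img = np.random.randint(40, 200, (8, 8), dtype=np.uint8)  # low contrast
    out = stretch_histogram(img)
    print(img.max() - img.min(), out.max() - out.min())       # spread grows to 255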
Examples
Probability Mass Function
PMF stands for probability mass function. It gives the probability of each number in the data set; equivalently, it is derived from the count or frequency of each element.

Example 5 x 5 image (gray levels 0-7):
1 2 7 5 6
7 2 3 4 5
0 1 5 7 3
1 2 5 6 7
6 1 0 3 4

Gray level   Count   PMF
0            2       2/25
1            4       4/25
2            3       3/25
3            3       3/25
4            2       2/25
5            4       4/25
6            3       3/25
7            4       4/25
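The counts, PMF, and CDF for this example can be verified in a few lines (Python with NumPy assumed):

    import numpy as np

    img = np.array([[1, 2, 7, 5, 6],
                    [7, 2, 3, 4, 5],
                    [0, 1, 5, 7, 3],
                    [1, 2, 5, 6, 7],
                    [6, 1, 0, 3, 4]])

    counts = np.bincount(img.ravel(), minlength=8)  # frequency of each level
    pmf = counts / img.size                         # each count divided by 25
    cdf = np.cumsum(pmf)                            # running sum of the PMF
    print(counts)   # [2 4 3 3 2 4 3 4]
    print(cdf[-1])  # 1.0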
Probability Mass Function & CDF
(Figures: histogram, PMF, and CDF.)
Histogram equalization
Histogram equalization is used for enhancing the contrast of images.

Gray Level Value   CDF
0                  0.11
1                  0.22
2                  0.55
3                  0.66
4                  0.77
5                  0.88
6                  0.99
7                  1

Each gray level is mapped to floor(CDF × (Levels - 1)), here with Levels = 8:

Gray Level Value   CDF    CDF × (Levels-1)
0                  0.11   0
1                  0.22   1
2                  0.55   3
3                  0.66   4
4                  0.77   5
5                  0.88   6
6                  0.99   6
7                  1      7

Gray Level Value   New Gray Level Value   Frequency
0                  0                      2
1                  1                      4
2                  3                      6
3                  4                      8
4                  5                      10
5                  6                      12
6                  6                      14
7                  7                      16
In histogram equalization, the overall shape of the histogram changes.
Simple Matlab Implementation
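The Matlab code itself is not reproduced in these notes; a minimal equivalent sketch in Python (NumPy assumed), following the floor(CDF × (Levels - 1)) mapping worked out above:

    import numpy as np

    def equalize_histogram(image, levels=256):
        # Map each gray level through the scaled cumulative distribution
        counts = np.bincount(image.ravel(), minlength=levels)
        cdf = np.cumsum(counts) / image.size
        mapping = np.floor(cdf * (levels - 1)).astype(np.uint8)
        return mapping[image]

    img = np.random.randint(80, 160, (32, 32), dtype=np.uint8)  # low contrast
    eq = equalize_histogram(img)
    print(img.min(), img.max(), eq.min(), eq.max())  # values spread toward 0-255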
Texture Analysis

An image obeying some statistical properties


Similar structures repeated over and over again
Often has some degree of randomness
Texture Analysis
• Structural approach
• Statistical approach
• Fourier approach
A texture is a set of texture elements, or texels, occurring in some regular or repeated pattern, varying in:
Size/granularity (sand versus pebbles versus boulders)
Directionality/orientation
Random or regular arrangement (stucco versus bricks)
Problem with the Structural Approach
What/where are the texels?
Extracting texels in real images may be difficult or impossible.
Process monitoring in grinding
E. Jannone da Silva, M. Biffi, J. F. G. de Oliveira [2004]
Automated inspection of ground surfaces.
Machine settings (speed, feed, depth of cut) determine the grinding process.
(Block diagram: scheme for automated assessment of surface texture - grinding process with diamond dresser, CCD imaging and preprocessing, texture feature extraction, quantitative assessment and learning models, and a controller comparing against desired values for the end product to drive dressing/regrinding decisions.)
Why surface inspection?

Providing 100% quality inspection 365 days a year, 24 hours a day

Improving real-time process control

Reducing costly reject production and customer claims

Identifying need for preventive maintenance

Facilitating quality grading of products

Making sure only quality products are shipped to customers

Machined surfaces
A manufactured surface deviates from the ideal shape due to
 Manufacturing process variations
 Imperfections in machine tools
 Errors in component setting and alignment
Shape deviations can be classified as
 Roughness
 Waviness
 Form
(Figure: errors in a manufactured surface.)
Machined surfaces
Roughness: irregularities in the surface texture, caused by
1. Tool feed rate
2. Tool chatter
Waviness: slow undulation in the surface texture, caused by
1. Workpiece or tool deflection
2. Non-linear feed motion
3. Vibration or relative motion in the machine
Form: inaccuracies in machine tool guideways
Contact-type measurement
Stylus → transducer (electrical signal) → A/D converter → computer (digitized points)
(Figure: principle of the stylus profilometer for surface roughness measurement.)
Non-contact measurement
Collimated light illuminates the test piece; a viewing camera captures the surface image.
(Figure: surface texture inspection by machine vision - a surface image and corresponding intensity distributions. Graham et al. (1981), Precision Engineering.)
Non-contact: machine vision system
1. Camera & lens  2. Lighting  3. Proximity sensor  4. Frame grabber  5. PC  6. Software  7. Digital I/O
Lighting configuration selection
Four lighting configurations were compared (figures): (1) a single point light source positioned at angles θ and φ to the object; (2) an LED ring light around the lens; (3) a fluorescent ring light; (4) co-axial illumination, where LED white light is folded onto the work piece through a beam splitter.

Lighting system          Inhomogeneity Indicator (QI)   Image sharpness
Point light source       1.32                           5.45
LED ring light           2.45                           4.32
Fluorescent ring light   1.24                           2.51
Co-axial illumination    1.21                           2.34

(Figure: captured images of the ground surface using the different lighting configurations.)
Illumination compensation
The mean intensity image and the contrast image are computed; i1(x, y) and i2(x, y) denote the distortions of illumination that are compensated.
Evaluation of optical parameters
Variance
The variance is the mean of the squared deviations of the pixel values from the mean value:

    V = (1/(M·N)) · Σ_{i=1..M} Σ_{j=1..N} ( I(i, j) - Ī )²

where

    Ī = (1/(M·N)) · Σ_{i=1..M} Σ_{j=1..N} I(i, j)

Gray level average (Ga)
The gray level average of an image of resolution M × N is:

    Ga = (1/(M·N)) · Σ_{i=1..M} Σ_{j=1..N} I(i, j) = Ī
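Both parameters in code (Python with NumPy assumed; the function name is illustrative):

    import numpy as np

    def optical_parameters(image):
        I = image.astype(np.float64)
        ga = I.mean()               # Ga: gray level average over the M x N image
        v = ((I - ga) ** 2).mean()  # V: mean squared deviation from the mean
        return ga, v

    img = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
    ga, v = optical_parameters(img)
    print(ga, v)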
Results
Correlation coefficient between the image parameters and the stylus roughness values (Ra, Rq, Rz in µm):

                         Variance               Gray level average
Surface roughness        Ra     Rq     Rz       Ra     Rq     Rz
Co-axial illumination    0.776  0.775  0.835    0.758  0.755  0.819
After correction         0.849  0.837  0.866    0.804  0.790  0.836
Point light source       0.702  0.765  0.721    0.712  0.736  0.734
After correction         0.814  0.822  0.816    0.802  0.745  0.826
Fluorescent light        0.682  0.693  0.652    0.704  0.713  0.676
After correction         0.745  0.752  0.728    0.792  0.760  0.754
Texture Analysis for Grinding Wheel Wear Assessment
Direct imaging method
(Figure: experimental setup for capturing grinding wheel images - CCD camera, halogen lamps, LED ring illuminator.)

Grinding wheel specification and cutting parameters:
Wheel specification       AA60 K5 V8
Wheel size                Diameter 300 mm, width 25 mm
Table traverse            15 mm/min
Cutting speed             32 m/sec
Cutting condition         Dry
Work material             Mild steel (0.25 C)
Depth of cut per stroke   20 µm
Grinding wheel images
(Figures: (a) fresh wheel, (b) after 30 minutes, (c) after 60 minutes, (d) after 90 minutes, (e) worn-out wheel after 120 minutes.)
Power spectral density plots
(Figures: (a) fresh wheel, (b) after 30 minutes, (c) after 60 minutes, (d) after 90 minutes, (e) worn-out wheel.)

Texture measures with grinding time, computed over ring regions of the spectrum:
a) 4 ≤ r ≤ 8   b) 8 ≤ r ≤ 16   c) 16 ≤ r ≤ 32   d) 32 ≤ r ≤ 64
Texture parameter selection & classification
Variance Normalized Class Separation Distance (Kerr et al. (2004)):

    D_xjk = |μ_xj - μ_xk| / sqrt(σ_xj² + σ_xk²)

where μ_xj, σ_xj² are the mean and variance of texture parameter x for wheel-condition classes j and k.

S.No   Texture parameter          D_xjk
1      Angular second moment      6.26
2      Diagonal moment            6.32
3      Variance of amplitude      6.03
4      Energy of ring region 1    5.67
5      Energy of ring region 2    5.78
6      Energy of ring region 3    5.54
7      Horizontal detail energy   5.66

(Figure: neural network architecture used for classification - inputs are the selected texture parameters (ASM, energies of ring regions 1-3, diagonal moment, variance of amplitude, horizontal detail energy); outputs are the classes fresh wheel, used wheel, worn-out wheel.)

Classification Results
S.No   Grinding wheel condition   Classification %
1      0 (Fresh wheel)            100
2      1 (Used wheel)             91.67
3      2 (Worn-out wheel)         100
4      Mean                       97.223
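To illustrate the kind of texture parameters listed above, a hedged sketch computing the angular second moment from a gray level co-occurrence matrix, together with the class separation distance (Python with NumPy; the single horizontal offset and 8-level quantization are our assumptions, not the papers' exact settings):

    import numpy as np

    def glcm_asm(image, levels=8):
        # Co-occurrence matrix for horizontal neighbors at distance 1
        glcm = np.zeros((levels, levels))
        for a, b in zip(image[:, :-1].ravel(), image[:, 1:].ravel()):
            glcm[a, b] += 1
        glcm /= glcm.sum()            # normalize to joint probabilities
        return (glcm ** 2).sum()      # ASM: sum of squared probabilities

    def class_separation(mu_j, mu_k, var_j, var_k):
        # Variance normalized class separation distance D_xjk
        return abs(mu_j - mu_k) / np.sqrt(var_j + var_k)

    img = np.random.randint(0, 8, (64, 64))
    print(glcm_asm(img))              # near 1/64 for a uniform random texture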
References
1. N. Arunachalam, B. Ramamoorthy, "Texture analysis for grinding wheel wear assessment using machine vision", Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture, Vol. 221, No. 3, pp. 419-430, 2007.
2. N. Arunachalam, B. Ramamoorthy, "Fourier transform based texture measures for grinding wheel condition monitoring using machine vision", International Journal of Manufacturing Technology and Management, Vol. 21, Nos. 1/2, pp. 112-121, 2010.
3. N. Arunachalam, B. Ramamoorthy, "Vision based surface roughness evaluation of ground components using machine vision", XVIII IMEKO World Congress, Rio de Janeiro, Brazil, Sep. 17-22 (Paper No. 631).
4. N. Arunachalam, B. Ramamoorthy, "Characterisation of ground components quality using machine vision for grinding wheel condition monitoring", Proceedings of the International Conference on Global Manufacturing and Innovation, organized by CIT and the University of Massachusetts-Dartmouth, USA, p. 19, 2006.
5. N. Arunachalam, B. Ramamoorthy, "Fourier transform based texture measures for grinding wheel wear assessment using machine vision", Proceedings of the All India Machine Tool Design and Research Conference, AIMTDR-2006, IIT Roorkee, Dec. 22-25, pp. 659-664.
6. N. Arunachalam, B. Ramamoorthy, "Studies on the surface condition of grinding wheel using machine vision", COPEN-2005, pp. 419-424.
