
MACHINE VISION SYSTEM

Definition
• A machine vision system (MVS) is a type of
technology that enables a computing device to
inspect, evaluate and identify still or moving
images.
• It is a field within computer vision and is broadly similar to
surveillance camera technology, but it provides automatic image
capturing, evaluation, and processing capabilities.
Definition
• A machine vision system primarily enables a computer to
recognize and evaluate images. It is similar to voice
recognition technology, but uses images instead.
• A machine vision system typically consists of digital cameras
and back-end image processing hardware and software. The
camera at the front end captures images from the
environment or from a focused object and then sends them to
the processing system. Depending on the design or need of
the MVS, the captured images are either stored or processed
accordingly.
Benefits of Machine Vision
Components of Machine Vision
Lighting
• Lighting is one key to successful machine vision results. Machine
vision systems create images by analyzing the reflected light from
an object, not by analyzing the object itself. A lighting technique
involves a light source and its placement with respect to the part
and the camera. A particular lighting technique can enhance an
image so that it suppresses some features and emphasizes others;
for example, silhouetting a part obscures its surface details while
allowing its edges to be measured.
Lenses
• The lens captures the image and delivers it to the image sensor in
the camera. Lenses vary in optical quality and price, and the lens used
determines the quality and resolution of the captured image. Most
vision system cameras offer two main types of lenses:
interchangeable lenses and fixed lenses. Interchangeable lenses
are typically C-mounts or CS-mounts. The right combination of lens
and extension will acquire the best possible image. A fixed lens as
part of a standalone vision system typically uses autofocus, which
could be either a mechanically adjusted lens or a liquid lens that
can automatically focus on the part. Autofocus lenses usually have
a fixed field of view at a given distance.
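To make the lens discussion concrete, below is a minimal Python sketch of the common thin-lens rule of thumb for estimating the focal length needed for a given sensor size, working distance, and field of view. All numeric values are illustrative assumptions, not taken from this text.

```python
# Rough focal-length estimate for shortlisting a machine vision lens.
# Uses the common thin-lens approximation:
#   focal_length ~= sensor_size * working_distance / field_of_view

def estimate_focal_length(sensor_mm: float, working_distance_mm: float,
                          field_of_view_mm: float) -> float:
    """Approximate focal length (mm) for a given sensor, distance, and FOV."""
    return sensor_mm * working_distance_mm / field_of_view_mm

if __name__ == "__main__":
    # Assumed example: ~7.1 mm wide sensor, part 300 mm away, 100 mm wide FOV
    f = estimate_focal_length(sensor_mm=7.1, working_distance_mm=300,
                              field_of_view_mm=100)
    print(f"Suggested focal length: about {f:.0f} mm")  # ~21 mm, so a 25 mm C-mount might fit
```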
Image Sensor
• The camera’s ability to capture a correctly-illuminated image of the
inspected object depends not only on the lens, but also on the image sensor
within the camera. Image sensors typically use a charge coupled device
(CCD) or complementary metal oxide semiconductor (CMOS) technology to
convert light (photons) to electrical signals (electrons). Essentially the job of
the image sensor is to capture light and convert it to a digital image
balancing noise, sensitivity and dynamic range. The image is a collection of
pixels. Low light produces dark pixels, while bright light creates brighter
pixels. It’s important to ensure the camera has the right sensor resolution
for the application. The higher the resolution, the more detail an image will
have, and the more accurate measurements will be. Part size, inspection
tolerances, and other parameters will dictate the required resolution.
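As a rough illustration of matching sensor resolution to an application, the sketch below applies a rule of thumb (assumed here) of allowing several pixels across the smallest feature that must be resolved; the field of view, feature size, and pixels-per-feature values are hypothetical.

```python
# Back-of-the-envelope check that a sensor has enough resolution.

def required_pixels(field_of_view_mm: float, smallest_feature_mm: float,
                    pixels_per_feature: int = 4) -> int:
    """Minimum pixels needed along one axis of the field of view."""
    features_across_fov = field_of_view_mm / smallest_feature_mm
    return int(features_across_fov * pixels_per_feature)

if __name__ == "__main__":
    # Assumed example: 100 mm field of view, 0.2 mm defects, 4 pixels per defect
    px = required_pixels(100, 0.2)
    print(f"Need at least {px} pixels across the field of view")  # 2000 px
```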
Vision Processing
• Processing is the mechanism for extracting information from a
digital image and may take place externally in a PC-based system,
or internally in a standalone vision system. Processing is performed
by software and consists of several steps. First, an image is
acquired from the sensor. In some cases, pre-processing may be
required to optimize the image and ensure that all the necessary
features stand out. Next, the software locates the specific features,
runs measurements, and compares these to the specification.
Finally, a decision is made and the results are communicated.
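The steps above can be sketched with a general-purpose library such as OpenCV. The snippet below is only an illustrative outline, not a specific vendor's pipeline; the file name, threshold method, and specification limits are assumptions.

```python
import cv2  # OpenCV; install with `pip install opencv-python`

# Minimal sketch of the processing steps described above.
image = cv2.imread("part.png", cv2.IMREAD_GRAYSCALE)        # 1. acquire the image
if image is None:
    raise FileNotFoundError("part.png not found")

blurred = cv2.GaussianBlur(image, (5, 5), 0)                 # 2. pre-process
_, binary = cv2.threshold(blurred, 0, 255,
                          cv2.THRESH_BINARY + cv2.THRESH_OTSU)

contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,    # 3. locate features
                               cv2.CHAIN_APPROX_SIMPLE)
if not contours:
    raise RuntimeError("no features found")
largest = max(contours, key=cv2.contourArea)

area_px = cv2.contourArea(largest)                           # 4. run a measurement
MIN_AREA, MAX_AREA = 5_000, 20_000                           # 5. compare to the spec (assumed limits)

result = "PASS" if MIN_AREA <= area_px <= MAX_AREA else "FAIL"
print(f"Measured area: {area_px:.0f} px^2 -> {result}")      # 6. decide and communicate
```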
Communication
• Since vision systems often use a variety of off-the-shelf
components, these items must coordinate and connect to other
machine elements quickly and easily. Typically this is done by either a
discrete I/O signal or data sent over a serial connection to a device
that is logging information or using it. Discrete I/O points may be
connected to a programmable logic controller (PLC), which uses that
information to control a work cell or an indicator such as a stack light,
or they may be wired directly to a solenoid used to trigger a reject
mechanism.
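As a simple illustration of reporting results to downstream equipment, the sketch below sends a pass/fail string over a plain TCP connection to a hypothetical logging device. Real installations typically use discrete I/O or an industrial protocol (for example EtherNet/IP, or a serial link); the IP address, port, and message format here are assumptions.

```python
import socket

LOGGER_IP = "192.168.0.50"   # hypothetical logging device
LOGGER_PORT = 5000           # hypothetical port

def send_result(part_id: str, passed: bool) -> None:
    """Send a minimal CSV-style inspection result to the logging device."""
    message = f"{part_id},{'PASS' if passed else 'FAIL'}\n".encode("ascii")
    with socket.create_connection((LOGGER_IP, LOGGER_PORT), timeout=2.0) as conn:
        conn.sendall(message)

if __name__ == "__main__":
    send_result("PART-0001", passed=True)
```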
Types of Machine Vision System
• 1D Vision Systems
1D vision analyzes a digital signal one line at a time instead of
looking at a whole picture at once, such as assessing the variance
between the most recent group of ten acquired lines and an earlier
group. This technique commonly detects and classifies defects on
materials manufactured in a continuous process, such as paper,
metals, plastics, and other non-woven sheet or roll goods.
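A minimal NumPy sketch of this line-by-line idea follows: a rolling buffer of scanned lines is kept, and a defect is flagged when the most recent group of ten lines deviates too much from the preceding group. The group size, threshold, and simulated data are assumptions.

```python
import numpy as np

GROUP = 10
THRESHOLD = 12.0   # assumed gray-level difference that counts as a defect

def defect_in_latest_group(lines: np.ndarray) -> bool:
    """lines: 2-D array, one scanned line per row, newest line last."""
    recent = lines[-GROUP:].mean(axis=0)               # average of the newest 10 lines
    earlier = lines[-2 * GROUP:-GROUP].mean(axis=0)    # the 10 lines before that
    return float(np.abs(recent - earlier).max()) > THRESHOLD

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    web = rng.normal(128, 2, size=(200, 512))          # simulated defect-free material
    web[195:, 250:260] += 40                           # inject a streak defect in the newest lines
    print(defect_in_latest_group(web))                 # True
```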
• 2D Vision Systems
• Most common inspection cameras perform area scans that involve
capturing 2D snapshots of the scene in various resolutions. Another
type of 2D machine vision, line scan, builds a 2D image line by line.
Area Scan vs. Line Scan
• In certain applications, line scan systems have specific advantages
over area scan systems. For example, inspecting round or
cylindrical parts may require multiple area scan cameras to cover
the entire part surface. However, rotating the part in front of a
single line scan camera captures the entire surface by unwrapping
the image. Line scan systems fit more easily into tight spaces, for
instance when the camera must peek through rollers on a conveyor to
view the bottom of a part. Line scan systems can also generally provide
much higher resolution than traditional cameras. Since line scan
systems require parts in motion to build the image, they are often
well suited for products in continuous motion.
• 3D Systems
3D machine vision systems typically comprise multiple cameras or
one or more laser displacement sensors. Multi-camera 3D vision in
robotic guidance applications provides the robot with part
orientation information. These systems involve multiple cameras
mounted at different locations that "triangulate" the object's
position in 3D space.
In contrast, 3D laser-displacement sensor applications typically
include surface inspection and volume measurement, producing 3D
results with as few as a single camera. A height map is generated
from the displacement of the reflected laser line's location on the
object. The object or camera must be moved to scan the entire product,
similar to line scanning. With a calibrated offset laser, displacement
sensors can measure parameters such as surface height and
planarity with accuracy within 20 µm.
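As a simplified illustration of laser-displacement profiling, the sketch below converts the laser line's shift on the sensor (in pixels) into a height profile using a calibration factor. The factor and data are assumed values, not a vendor calibration procedure.

```python
import numpy as np

MM_PER_PIXEL = 0.02   # assumed calibration: 1 px of line shift = 0.02 mm of height

def height_profile(line_shift_px: np.ndarray) -> np.ndarray:
    """Convert per-column laser line displacement (pixels) to height (mm)."""
    return line_shift_px * MM_PER_PIXEL

if __name__ == "__main__":
    shifts = np.array([0, 0, 5, 5, 5, 0, 0], dtype=float)  # a 0.1 mm ridge across the part
    print(height_profile(shifts))  # [0. 0. 0.1 0.1 0.1 0. 0.]
```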
Machine Vision Platforms
• Machine vision implementation comes on several physical
platforms, including PC-based systems, vision controllers designed
for 3D and multi-camera 2D applications, standalone vision
systems, simple vision sensors, and image-based barcode readers.
Choosing the right machine vision platform generally depends on
the application’s requirements, including development
environment, capability, architecture, and cost.
PC-BASED MACHINE VISION
• PC-based systems easily interface with direct-connect cameras or
image acquisition boards and are well supported with configurable
machine vision application software. In addition, PCs provide a
wealth of custom code development options using familiar and
well-supported languages such as Visual C/C++, Visual Basic, and
Java, plus graphical programming environments. However,
development tends to be long and complicated, so PC-based systems
are usually limited to large installations and appeal mostly to
advanced machine vision users and programmers.
VISION CONTROLLERS
• Vision controllers offer all of the power and flexibility of a PC-based
system, but are better able to withstand the rigors of harsh factory
environments. Vision controllers allow for easier configuration of
3D and multi-camera 2D applications, perhaps for one-off tasks
where a reasonable amount of time and money is available for
development. This allows for more sophisticated applications to be
configured in a very cost-effective way.
STANDALONE VISION SYSTEMS
• Standalone vision systems are cost effective and can be quickly and
easily configured. These systems come complete with the camera
sensor, processor, and communications. Some also integrate
lighting and autofocus optics. In many cases these systems are
compact and affordable enough to be installed throughout the
factory. By using standalone vision systems at key process points,
defects can be caught earlier in the manufacturing process and
equipment problems can be identified more quickly. Some
standalone vision systems provide both development environments,
allowing for easy setup along with the added power and flexibility of
programming and scripting for greater control of system
configuration and handling of vision-application data.
VISION SENSORS AND IMAGE-BASED
BARCODE READERS
• Vision sensors and image-based barcode readers generally require
no programming, and provide user-friendly interfaces. Most are
easily integrated with any machine to provide single-point
inspections with dedicated processing, and offer built-in Ethernet
communications for factory-wide networkability.
IMAGE ACQUISITION
Definition
• In image processing, image acquisition is defined as the action of
retrieving an image from some source, usually a hardware-based
source, for processing. It is the first step in the workflow sequence
because, without an image, no processing is possible. The image
that is acquired is completely unprocessed.

• Now the incoming energy is transformed into a voltage by the
combination of input electrical power and sensor material that is
responsive to the particular type of energy being detected. The output
voltage waveform is the response of the sensor, and a digital
quantity is obtained from each sensor by digitizing its response.
• Image Acquisition using a single sensor:
• An example of a single sensor is a photodiode. To obtain a two-
dimensional image using a single sensor, there must be relative motion
in both the x and y directions.
• Rotation provides motion in one direction.
• Linear motion provides motion in the perpendicular direction.
• This is an inexpensive method and we can obtain high-resolution
images with high precision control. But the downside of this
method is that it is slow.
• Image Acquisition using a line sensor (sensor strips):
• The sensor strip provides imaging in one direction.
• Motion perpendicular to the strip provides imaging in other
direction.
• Image Acquisition using an array sensor:
• In this, individual sensors are arranged in the form of a 2-D array.
This type of arrangement is found in digital cameras. e.g. CCD array
• In this, the response of each sensor is proportional to the integral of
the light energy projected onto the surface of the sensor. Noise
reduction is achieved by letting the sensor integrate the input light
signal over minutes or even hours.
• Advantage: Since sensor array is 2D, a complete image can be
obtained by focusing the energy pattern onto the surface of the
array.
• Because the sensor array is coincident with the focal plane, it produces
an output proportional to the integral of the light received at each sensor.
• Digital and analog circuitry sweep these outputs and convert them
to a video signal which is then digitized by another section of the
imaging system. The output is a digital image.
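A toy NumPy example of that final sweep-and-digitize step is shown below: simulated per-sensor voltages are quantized to 8-bit pixel values. The full-scale voltage and the data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
voltages = rng.uniform(0.0, 1.0, size=(4, 4))     # simulated analog sensor outputs (0-1 V)

full_scale_v = 1.0                                # assumed ADC reference voltage
pixels = np.round(voltages / full_scale_v * 255).astype(np.uint8)  # 8-bit quantization

print(pixels)   # low voltages become dark pixels, high voltages become bright pixels
```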
Concept of Illumination
• When objects reflect light they are said to be illuminated. The moon
is a perfect example of an object that is illuminated.
• The moon does not emit visible light. It reflects light from the sun.
This is why it is hard to see the moon during the day.
• The brightness of an illuminated object depends on two things.
1. The brightness of the luminous source shining light on the
illuminated object
2. The distance between the illuminated object and the luminous
source
• If a light source is brighter, it will illuminate objects more brightly.
This should be common sense for all of us who use light bulbs in our
homes. The brighter the bulb above your desk, the more brightly your
workspace will be lit. The relationship between the brightness of the
lighting source and the illumination is direct: if a light gets twice as
bright, the illumination will double; if a lighting source is three times
as bright, the illumination will triple.
The common unit of luminous intensity is the candela (cd). It is the luminous intensity,
in the perpendicular direction, of a 1/600,000 m² surface of a black body at the temperature
of solidification (freezing) of platinum under standard atmospheric pressure. It is indicative
of the light-radiating capacity of a lamp or source.
• The luminous flux emitted by a source of 1 cd into a solid angle of
1 steradian is called 1 lumen (lm), the basic unit of luminous flux.
For a uniform point source, the total flux emitted over the whole
sphere is therefore 4π lumens. If the solid angle is dω and the
luminous intensity at the centre is I cd, the luminous flux within
dω is dφ = I dω lm.
• Another important quantity is the MSLI, the Mean Spherical
Luminous Intensity: the average value of the luminous intensity in
all directions. For the uniform source above, the total flux is
• φ = 4πI lumens.
• Finally, the luminous flux falling on a surface per unit area is called
the illuminance: lumens per square metre, lm/m² = lux (lx).
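A short worked example (with assumed values) ties these units together for a uniform point source:

```python
import math

I = 100.0                       # mean spherical luminous intensity, in candela (assumed)
total_flux = 4 * math.pi * I    # total flux = 4*pi*I lumens over the whole sphere

area = 2.0                      # m^2 of illuminated surface (assumed)
flux_on_area = 50.0             # lumens falling on that surface (assumed)
illuminance = flux_on_area / area   # illuminance = flux per unit area, in lux

print(f"Total flux  : {total_flux:.0f} lm")   # about 1257 lm
print(f"Illuminance : {illuminance:.0f} lx")  # 25 lx
```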
Law of Illumination
• The Inverse Square Law of Illuminance
• This law states that the illuminance (E) at any point on a plane
perpendicular to the line joining the point and the source is inversely
proportional to the square of the distance between the source and
the plane:
E = I / d²
where I is the luminous intensity in the given direction and d is the
distance from the source to the plane.
• Suppose a source is present with luminous intensity I in a given
direction. From this source two distances, r1 and r2, are taken as
radii with the source as centre. At distance r1 an elementary surface
area dA1 is taken; in the same direction, dA2 is considered at
distance r2.
• dA1 and dA2 lie within the same solid angle ω, so the same luminous
flux Φ is distributed over each: area dA1 at r1 receives the same
amount of luminous flux as area dA2 at r2 because the solid angles
are the same.
• Since dA1 = ω r1² and dA2 = ω r2², the illuminances are
E1 = Φ / (ω r1²) and E2 = Φ / (ω r2²), so that E1 / E2 = r2² / r1².
• This indicates the well-known inverse square law for a point source:
the illuminance varies inversely as the square of the distance of the
illuminated point from the source.
• If the light source is not a point source, we can treat the large
source as the summation of many point sources, so this
relationship can be applied to all light sources.
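A quick numeric check of the inverse square law, with an assumed source intensity:

```python
I = 200.0            # luminous intensity of the point source, in candela (assumed)
for d in (1.0, 2.0, 4.0):
    E = I / d**2     # E = I / d^2, in lux, for light arriving normal to the surface
    print(f"d = {d:.0f} m -> E = {E:.1f} lx")
# d = 1 m -> 200.0 lx, d = 2 m -> 50.0 lx, d = 4 m -> 12.5 lx
```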
The Cosine Law of Illuminance
• The law states that Illuminance at a point on a plane is proportional
to the cosine of the angle of light incident (the angle between the
direction of the incident light and the normal to the plane).

• The point-source illuminance equation is
E = Iθ cos θ / d²
where Iθ is the luminous intensity of the source in the direction of the
illuminated point, θ is the angle between the normal to the plane
containing the illuminated point and the line joining the source to the
illuminated point, and d is the distance to the illuminated point.
• For a non-point source, however, the cosine law of illuminance can be
analyzed in terms of luminous flux instead of luminous intensity.
The Illuminance or the surface density of the light flux received by
an elementary area varies with the distance from the light source
and the angle of the elementary area with respect to the direction
of the light flux.
The maximum Illuminance occurs when the element of area
receives the light flux normal to its surface.
When the element of area is tilted with respect to the direction of
the light flux, the Illuminance or flux density on the elementary
surface is reduced. This can be thought of in two ways.
• The tilted elementary area (δA) cannot intercept all the light flux it
previously received and so the Illuminance falls.
• If the elementary area (δA) increases, the illuminance falls.
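A small numeric sketch of the point-source cosine-law equation, with assumed intensity, distance, and angles:

```python
import math

# E = I_theta * cos(theta) / d^2, evaluated for a few incidence angles.
I_theta = 150.0      # cd, intensity toward the illuminated point (assumed)
d = 2.0              # m, distance from source to the illuminated point (assumed)
for theta_deg in (0, 30, 60):
    E = I_theta * math.cos(math.radians(theta_deg)) / d**2
    print(f"theta = {theta_deg:2d} deg -> E = {E:.1f} lx")
# 0 deg -> 37.5 lx, 30 deg -> ~32.5 lx, 60 deg -> ~18.8 lx
```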
Fechner’s Law
• Weber in 1830 found that for a stimulus of intensity I, the least
perceptible increment dI that affects the sense organs is such that the
ratio dI / I is constant. Integrating this relation gives the sensation
S = k log(I / I0), where I0 is the threshold intensity. This is known as
Fechner’s law: the same percentage change in stimulus, calculated
from the least perceptible amount, gives the same change in
sensation. The sensation produced by the optic nerves therefore has a
logarithmic dependence on the light radiation producing it.
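To illustrate the logarithmic relationship, the sketch below evaluates an assumed sensation function S = k·log(I/I0) for a few stimulus values; the constant k and the threshold intensity I0 are assumptions.

```python
import math

k = 1.0       # assumed sensitivity constant
I0 = 0.01     # assumed threshold (just-perceptible) intensity

def sensation(I: float) -> float:
    """Fechner-style sensation for stimulus intensity I."""
    return k * math.log10(I / I0)

for I in (0.01, 0.1, 1.0, 10.0):
    print(f"I = {I:>5} -> S = {sensation(I):.1f}")
# Each tenfold increase in stimulus adds the same amount to the sensation.
```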
• Unit of luminous intensity is the candela (cd): the luminous
intensity of a 1/600,000 m² surface of a blackbody at the
solidification temperature of platinum (1773 °C) under standard
atmospheric pressure.
• Luminous flux emitted into a solid angle of 1 steradian by a source
of 1 cd is called 1 lumen (lm).
• MSLI (mean spherical luminous intensity) is the average luminous
intensity over all directions, i.e. total flux / 4π.
• Luminous flux = luminous intensity × solid angle.
• Illuminance is luminous flux per unit area.
• Fechner’s Law – the same percentage change in stimulus, calculated
from the least amount perceptible, gives the same change in
sensation.
• Inverse Square Law – the illuminance produced by a point source
varies inversely as the square of the distance from the source.
• Lambert’s Cosine Law of Incidence – E = I cos α / D².
• Lambert’s Cosine Law of Emission – Iα = Im cos α, where Im is the
intensity along the normal.
THANK YOU!
By:De Castro, Jerricho Joshua L. ME 5-3
1400846
