
TRƯỜNG ĐẠI HỌC BÁCH KHOA HÀ NỘI

IMAGE PROCESSING IN MECHATRONICS


Machine Vision

Lecturer: Dr. Nguyễn Thành Hùng


Unit: Department of Mechatronics, School of Mechanical Engineering

Hà Nội, 2021 1
Chapter 1. Introduction to machine vision

1. Introduction
2. Basic elements of machine vision system
3. Classification

4. Technical specifications
5. Designing a Machine Vision System

2
1. Introduction

❖Definition
➢ Machine vision (MV) is the technology and methods used to provide imaging-
based automatic inspection and analysis for such applications as automatic
inspection, process control, and robot guidance, usually in industry.
➢ Machine vision is a term encompassing a large number of technologies,
software and hardware products, integrated systems, actions, methods and
expertise.
➢ Machine vision as a systems engineering discipline can be considered distinct
from computer vision, a form of computer science.
➢ It attempts to integrate existing technologies in new ways and apply them to
solve real world problems.
3
1. Introduction

❖Definition
➢ The overall machine vision process includes planning the details of the
requirements and project, and then creating a solution. During run-time, the
process starts with imaging, followed by automated analysis of the image and
extraction of the required information.

4
1. Introduction

❖Definition

https://en.wikipedia.org/wiki/Glossary_of_machine_vision 5
1. Introduction

❖Application: Locate
➢ To find the object and report its position and orientation.

6
1. Introduction

❖Application: Measure
➢ To measure physical dimensions of the object

7
1. Introduction

❖Application: Inspect
➢ To validate certain features

8
1. Introduction

❖Application: Inspect

9
1. Introduction

❖Application: Identify

10
Chapter 1. Introduction to machine vision

1. Introduction
2. Basic elements of machine vision system
3. Classification
4. Technical specifications
5. Designing a Machine Vision System

11
2. Basic elements of machine vision system

2.1. Overview
2.2. Illumination
2.3. Imaging
2.4. Image processing and analysis

12
2. Basic elements of machine vision system

❑ 2.1. Overview

http://www.digikey.com/en/articles/techzone/2012/jan/versatile-leds-drive-machine-vision-in-automated-manufacture 13
2. Basic elements of machine vision system

❑ 2.1. Overview

OK NG

How to automatically detect the defect?


14
2. Basic elements of machine vision system

❑ 2.1. Overview
❖Illumination
➢ Illumination is the way an object is lit up; lighting is the actual lamp that
generates the illumination.

❖Imaging (Camera and lens)


➢ The term imaging defines the act of creating an image.

15
2. Basic elements of machine vision system

❑ 2.1. Overview
❖Image processing and analysis
➢ This is where the desired features are extracted automatically by algorithms and
conclusions are drawn.
➢ A feature is the general term for information in an image, for example a
dimension or a pattern.
➢ Algorithms are also referred to as tools or functions.

16
2. Basic elements of machine vision system

2.1. Overview
2.2. Illumination
2.3. Imaging
2.4. Image processing and analysis

17
2. Basic elements of machine vision system

❑ 2.2. Illumination
❖The goal of lighting in machine vision is to obtain a robust application by:
➢ Enhancing the features to be inspected.
➢ Assuring high repeatability in image quality.

SICK IVP, “Machine Vision Introduction,” 2006. 18


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Illumination Principles
Light can be described as waves with three properties:
➢ Wavelength or color, measured in nm (nanometers)
➢ Intensity
➢ Polarization.

SICK IVP, “Machine Vision Introduction,” 2006. 19


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Illumination Principles
➢ The spectral response of a sensor is the sensitivity curve for different
wavelengths.

Spectral response of a
gray scale CCD sensor.
Maximum sensitivity
is for green (500 nm).

SICK IVP, “Machine Vision Introduction,” 2006. 20


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Illumination Principles
➢ The optical axis is an imaginary line through the center of the lens, i.e. the direction
the camera is looking.

SICK IVP, “Machine Vision Introduction,” 2006. 21


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Illumination Principles

SICK IVP, “Machine Vision Introduction,” 2006. 22


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Ring Light
➢ A ring light is mounted around the optical axis of the lens, either on the camera or
somewhere in between the camera and the object.

SICK IVP, “Machine Vision Introduction,” 2006. 23


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Ring Light

Pros:
• Easy to use
• High intensity and short exposure time possible
Cons:
• Direct reflections, called hot spots, on reflective surfaces

Ambient light. Ring light: the printed matte surface is evenly illuminated; hot spots appear
on shiny surfaces (center), one for each of the 12 LEDs of the ring light.
SICK IVP, “Machine Vision Introduction,” 2006. 24
2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Spot Light
➢ A spot light has all the light emanating from one direction that is different from the
optical axis. For flat objects, only diffuse reflections reach the camera.

Mainly diffuse reflections


reach the camera

Object

Spot light

SICK IVP, “Machine Vision Introduction,” 2006. 25


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Spot Light

Pros:
• No hot spots
Cons:
• Uneven illumination
• Requires intense light since it is dependent on diffuse reflections

SICK IVP, “Machine Vision Introduction,” 2006. 26


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Backlight
➢ The backlight principle has the object being illuminated from behind to produce a
contour or silhouette.

SICK IVP, “Machine Vision Introduction,” 2006. 27


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Backlight

Pros:
• Very good contrast
• Robust to texture, color, and ambient light
Cons:
• Dimension must be larger than object

Ambient light. Backlight: enhances contours by creating a silhouette.
SICK IVP, “Machine Vision Introduction,” 2006. 28
2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Darkfield
➢ Darkfield means that the object is illuminated at a large angle of incidence.

SICK IVP, “Machine Vision Introduction,” 2006. 29


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Darkfield

Pros:
• Good enhancement of scratches, protruding edges, and dirt on surfaces
Cons:
• Mainly works on flat surfaces with small features
• Requires small distance to object
• The object needs to be somewhat reflective

Ambient light. Darkfield: enhances relief contours, i.e., lights up edges.
SICK IVP, “Machine Vision Introduction,” 2006. 30
2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> On-Axis Light
➢ When an object needs to be illuminated parallel to the optical axis, a semi-transparent
mirror is used to create an on-axis light source.

SICK IVP, “Machine Vision Introduction,” 2006. 31


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> On-Axis Light
Pros:
• Very even illumination, no hot spots
• High contrast on materials with different reflectivity
Cons:
• Low intensity requires long exposure times
• Cleaning of the semi-transparent mirror (beam splitter) often needed

Inside of a can as seen with ambient light. Inside of the same can as seen with a coaxial
(on-axis) light.

SICK IVP, “Machine Vision Introduction,” 2006. 32


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Dome Light
➢ The dome light produces the needed uniform light intensity inside of the dome
walls.

SICK IVP, “Machine Vision Introduction,” 2006. 33


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Dome Light
Pros:
• Works well on highly reflective materials
• Uniform illumination, except for the darker middle of the image. No hot spots
Cons:
• Low intensity requires long exposure times
• Dimensions must be larger than object
• Dark area in the middle of the image

SICK IVP, “Machine Vision Introduction,” 2006. 34


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Dome Light

Ambient light: on top of the key numbers is a curved, transparent material causing direct
reflections. The direct reflections are eliminated by the dome light's even illumination.

SICK IVP, “Machine Vision Introduction,” 2006. 35


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Laser Light
➢ A 2D camera with a laser line can provide a cost efficient solution for low-contrast
and 3D inspections.
Pros:
• Robust against ambient light
• Allows height measurements (z parallel to the optical axis)
• Low-cost 3D for simpler applications
Cons:
• Laser safety issues
• Data along y is lost in favor of z (height) data
• Lower accuracy than 3D cameras

SICK IVP, “Machine Vision Introduction,” 2006. 36


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Types >> Laser Light

Ambient light. Contact lens containers: the left is facing up (5 mm high at the cross) and the
right is facing down (1 mm high at the minus sign). The laser line clearly shows the height
difference.

SICK IVP, “Machine Vision Introduction,” 2006. 37


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Strobe or Constant light
➢ A strobe light is a flashing light.
➢ Strobing allows the LED to emit higher light intensity than what is achieved with a
constant light by turbo charging.

SICK IVP, “Machine Vision Introduction,” 2006. 38


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Diffusor Plate
➢ The diffusor plate converts direct light into diffuse.
➢ The purpose of a diffusor plate is to avoid bright spots in the image, caused by
the direct light's reflections in glossy surfaces.

Two identical white bar lights, with diffusor


plate (top) and without (bottom).

SICK IVP, “Machine Vision Introduction,” 2006. 39


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Variants and Accessories >> LED Color
➢ LED lighting comes in several colors. The most common are red and green. There are
also LEDs in blue, white, UV, and IR.
➢ Different objects reflect different colors. A blue object appears blue because it
reflects the color blue.
➢ Therefore, if blue light is used to illuminate a blue object, it will appear bright in a
gray scale image.

SICK IVP, “Machine Vision Introduction,” 2006. 40


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Variants and Accessories >> LED Color

SICK IVP, “Machine Vision Introduction,” 2006. 41


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Optical Filters
➢ An optical filter is a layer in front of the sensor or lens that absorbs certain
wavelengths (colors) or polarizations.
➢ Two main optical filter types are used for machine vision:
1. Band-pass filter: Only transmits light of a certain color, i.e. within a certain wavelength
interval. For example, a red filter only lets red light through.
2. Polarization filter: Only transmits light with a certain polarization. Light changes its
polarization when it is reflected, which allows us to filter out unwanted reflections.

SICK IVP, “Machine Vision Introduction,” 2006. 42


2. Basic elements of machine vision system

❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Optical Filters

Original image. Image seen by gray scale camera with ambient light and without filter.
Red light and a red band-pass filter. Green light and a green band-pass filter.
SICK IVP, “Machine Vision Introduction,” 2006. 43


2. Basic elements of machine vision system

2.1. Overview
2.2. Illumination
2.3. Imaging
2.4. Image processing and analysis

44
2. Basic elements of machine vision system

❑ 2.3. Imaging
➢ The term imaging defines the act of creating an image.
➢ Imaging has several technical names: Acquiring, capturing, or grabbing
➢ Grabbing a high-quality image is the number one goal for a successful vision
application.

SICK IVP, “Machine Vision Introduction,” 2006. 45


2. Basic elements of machine vision system

❑ 2.3. Imaging
❖Basic Camera Concepts
➢ A simplified camera setup consists of camera, lens, lighting, and object.

SICK IVP, “Machine Vision Introduction,” 2006. 46


2. Basic elements of machine vision system

❑ 2.3. Imaging
❖Basic Camera Concepts: Digital Imaging
➢ A sensor chip is used to grab a digital image.
➢ On the sensor there is an array of light-sensitive pixels.

Sensor chip with an array


of light-sensitive pixels.

SICK IVP, “Machine Vision Introduction,” 2006. 47


2. Basic elements of machine vision system

❑ 2.3. Imaging
❖Basic Camera Concepts: Digital Imaging
There are two technologies used for digital image sensors:
➢ CCD (Charge-Coupled Device)
➢ CMOS (Complementary Metal Oxide Semiconductor).

SICK IVP, “Machine Vision Introduction,” 2006. 48


2. Basic elements of machine vision system

❑ 2.3. Imaging
❖Basic Camera Concepts: Digital Imaging

http://www.f4news.com/2016/05/09/ccd-vs-cmos-infographic/ 49
2. Basic elements of machine vision system

❑ 2.3. Imaging
❖Basic Camera Concepts: Lenses and Focal Length
➢ The lens (Objective) focuses the light that enters the camera in a way that
creates a sharp image.

Focused or sharp image. Unfocused or blurred image.

SICK IVP, “Machine Vision Introduction,” 2006. 50


2. Basic elements of machine vision system

❑ 2.3. Imaging
❖Basic Camera Concepts: Lenses and Focal Length
➢ The angle of view determines how much of the visual scene the camera sees.

SICK IVP, “Machine Vision Introduction,” 2006. 51


2. Basic elements of machine vision system

❑ 2.3. Imaging
❖Basic Camera Concepts: Lenses and Focal Length
➢ The focal length is the distance between the lens and the focal point.
➢ When the focal point is on the sensor, the image is in focus.

SICK IVP, “Machine Vision Introduction,” 2006. 52


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Camera Concepts

Lenses and Focal Length


▪ Focal length is related to angle of view in that a long focal length corresponds to a
small angle of view, and vice versa.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Camera Concepts

Field of View in 2D
▪ The FOV (Field of View) in 2D systems is the full area that a camera sees. The FOV
is specified by its width and height.
▪ The object distance is the distance between the lens and the object.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Camera Concepts

Aperture and F-stop


▪ The aperture is the opening in the lens that controls the amount of light that is let
onto the sensor. In quality lenses, the aperture is adjustable.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Camera Concepts

Aperture and F-stop


▪ The size of the aperture is measured by its F-stop value. A large F-stop value means
a small aperture opening, and vice versa.
▪ For standard CCTV lenses, the F-stop value is adjustable in the range between F1.4
and F16.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Camera Concepts

Depth of Field
▪ The minimum object distance (sometimes abbreviated MOD) is the closest
distance at which the camera lens can focus, and the maximum object distance is the
farthest distance.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Camera Concepts

Depth of Field
▪ The focal plane is found at the distance where the focus is as sharp as possible.
▪ Objects closer or farther away than the focal plane can also be considered to be in
focus. This distance interval where good-enough focus is obtained is called depth of
field (DOF).

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Camera Concepts

Depth of Field


SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Camera Concepts

Depth of Field
▪ The depth of field depends on both the focal length and the aperture adjustment.


SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Camera Concepts

Depth of Field
▪ By adding a distance ring between the camera and the lens, the focal plane (and
thus the MOD) can be moved closer to the camera. A distance ring is also referred to
as shim, spacer, or extension ring.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Camera Concepts

Depth of Field
▪ A side-effect of using a distance ring is that a maximum object distance is
introduced and that the depth of field range decreases.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Pixels and Resolution


▪ A pixel is the smallest element in a digital image. Normally, the pixel in the image
corresponds directly to the physical pixel on the sensor.
▪ To the right is an example of a very small image with dimension 8x8 pixels. The
dimensions are called x and y, where x corresponds to the image columns and y to
the rows.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Pixels and Resolution


▪ Typical values of sensor resolution in 2D machine
vision are:
➢ VGA (Video Graphics Array): 640x480 pixels
➢ XGA (Extended Graphics Array): 1024x768 pixels
➢ SXGA (Super Extended Graphics Array):
1280x1024 pixels

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Pixels and Resolution


▪ The object resolution is the physical dimension on the object that corresponds to
one pixel on the sensor. Common units for object resolution are μm (microns) per
pixel and mm per pixel.
▪ Example: Object Resolution Calculation: FOV width = 50 mm, Sensor resolution =
640x480 pixels, Calculation of object resolution in x:
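Completing the calculation with the given numbers (the result itself appears only in the slide graphic, which is not reproduced here):

$R_{s,x} = \dfrac{FOV_{width}}{\text{sensor resolution in } x} = \dfrac{50\ \text{mm}}{640\ \text{pixels}} \approx 0.078\ \text{mm/pixel}$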

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Intensity
▪ The brightness of a pixel is called intensity. The intensity information is stored for
each pixel in the image and can be of different types. Examples:
➢ Binary: One bit per pixel.

➢ Gray scale: Typically one byte per pixel.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Intensity
➢ Color: Typically one byte per pixel and color. Three bytes are needed to obtain full
color information. One pixel thus contains three components (R, G, B).

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Intensity
▪ When the intensity of a pixel is digitized and described by a byte, the information is
quantized into discrete levels. The number of bits used to describe a pixel is called the bit-depth.
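For example, with the common bit-depth of 8 bits per pixel, the number of discrete intensity levels is $2^{8} = 256$ (values 0 to 255).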

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Exposure
▪ Exposure is how much light is detected by the photographic film or sensor. The
exposure amount is determined by two factors:
➢ Exposure time: Duration of the exposure, measured in milliseconds (ms). Also
called shutter time from traditional photography.
➢ Aperture size: Controls the amount of light that passes through the lens.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Exposure
▪ If the exposure time is too short for the sensor to capture enough light, the image is
said to be underexposed. If there is too much light and the sensor is saturated, the
image is said to be overexposed.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Gain
▪ Gain amplifies the intensity values after the sensor has already been exposed, very
much like the volume control of a radio (which doesn’t actually make the artist sing
louder). The tradeoff of compensating insufficient exposure with a high gain is
amplified noise.
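As a minimal numeric sketch of what gain does (assuming OpenCV and NumPy are available; the file name and gain factor are hypothetical):

```python
import cv2
import numpy as np

# Load an 8-bit gray scale image (hypothetical file name).
img = cv2.imread("underexposed.png", cv2.IMREAD_GRAYSCALE)

gain = 2.0  # amplification applied after the sensor has been exposed
# Multiply every pixel by the gain and clip to the 8-bit range.
# Any sensor noise present in the image is amplified by the same factor.
amplified = np.clip(img.astype(np.float32) * gain, 0, 255).astype(np.uint8)
cv2.imwrite("amplified.png", amplified)
```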

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Contrast and Histogram


▪ Contrast is the relative difference between bright and dark areas in an image.
Contrast is necessary to see anything at all in an image.
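One common way to quantify this (not stated on the slide, added here as a reference point) is the Michelson contrast between the brightest and darkest intensities:

$C = \dfrac{I_{max} - I_{min}}{I_{max} + I_{min}}$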

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Contrast and Histogram


▪ A histogram is a diagram where the pixels are sorted in order of increasing intensity
values.
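A minimal sketch of computing such a histogram for an 8-bit gray scale image, assuming NumPy and OpenCV are available (the file name is hypothetical):

```python
import cv2
import numpy as np

gray = cv2.imread("part.png", cv2.IMREAD_GRAYSCALE)

# Count how many pixels fall into each of the 256 intensity levels.
counts, bin_edges = np.histogram(gray.ravel(), bins=256, range=(0, 256))

# A peak near 0 means many dark pixels; a peak near 255 means many bright ones.
print("most frequent intensity:", counts.argmax())
```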

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.3. Imaging
Basic Image Concepts

Contrast and Histogram


SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

2.1. Overview
2.2. Illumination
2.3. Imaging
2.4. Image processing and analysis

75
2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis

➢ The basic stages in image processing include: preprocessing, image


segmentation, feature extraction, and recognition and analysis.
2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Preprocessing

▪ The captured image may have low contrast, noise, or contain some unnecessary
information.
▪ The main function of the preprocessing is to filter noise and increase contrast to make
images clearer and sharper.
▪ Other functions include converting color images to grayscale images and extracting
areas of interest (ROI - Region of Interest). A minimal sketch of these steps is shown below.
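A minimal preprocessing sketch along these lines, assuming OpenCV is available (the file name and ROI coordinates are hypothetical):

```python
import cv2

img = cv2.imread("capture.png")                # color image from the camera
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)   # convert color to gray scale
denoised = cv2.medianBlur(gray, 5)             # filter noise with a 5x5 median filter
equalized = cv2.equalizeHist(denoised)         # increase contrast

# Extract a region of interest (ROI): rows 100..299, columns 200..499.
roi = equalized[100:300, 200:500]
```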

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Preprocessing

ROI Extraction


One ROI is created to verify the logotype (blue) and another is created for barcode reading
(green). A ROI is placed around each pill in the blister pack and the pass/fail analysis is
performed once per ROI.
SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Preprocessing

Pixel Counting


Automotive part with crack. The crack is found using a darkfield illumination
and by counting the dark pixels inside the ROI.
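A pixel-counting check of this kind could be sketched as follows, assuming OpenCV and NumPy (the threshold, ROI, and pass/fail limit are hypothetical values):

```python
import cv2
import numpy as np

gray = cv2.imread("automotive_part.png", cv2.IMREAD_GRAYSCALE)
roi = gray[50:200, 80:300]       # region of interest around the suspected crack

dark = roi < 40                  # pixels darker than the chosen threshold
n_dark = int(np.count_nonzero(dark))

# Too many dark pixels inside the ROI indicate a crack.
print("FAIL" if n_dark > 150 else "PASS", n_dark)
```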

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Preprocessing

Digital Filters


Noisy version of original image. Image (left) after noise reduction.


SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Image segmentation

▪ Image segmentation splits an input image into component regions for further
analysis and image recognition.
▪ This is the most difficult part of image processing and is also error-prone,
reducing the accuracy of the image processing. The result of object
identification depends very much on this stage.
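As a minimal segmentation sketch, a binarization with an automatically chosen (Otsu) threshold followed by region labeling, assuming OpenCV is available:

```python
import cv2

gray = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)

# Split the image into object (white) and background (black) pixels.
threshold_value, binary = cv2.threshold(
    gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU
)

# Label the connected regions of the binary image.
num_labels, labels = cv2.connectedComponents(binary)
print("threshold:", threshold_value, "regions:", num_labels - 1)
```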

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Image segmentation


Original intensity-coded 3D image. Image after a binarization operation. Image after edge enhancement.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Image segmentation


SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Feature Extraction

▪ The output of the segmentation contains the pixels of the segmented image region
plus the code associated with its neighborhood.
▪ Feature selection is concerned with representing image properties in the form of
quantitative information.

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Feature Extraction


Digital boundary with resampling grid superimposed. Result of resampling. 8-directional
chain-coded boundary.

R. C. Gonzalez and R. E. Woods, “Digital Image Processing,” 4th edition, Prentice Hall, 2018.
2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Feature Extraction

The smallest axis-parallel enclosing rectangle of a region. The smallest enclosing rectangle
of arbitrary orientation. The smallest enclosing circle.

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.


2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Image recognition and analysis

▪ Image recognition is the process of identifying images.


▪ This process is usually obtained by comparing with the standard sample
that has been learned (or saved) before.
▪ Interpretation is a judgment based on the meaning of the identification:
➢ identification of parameters

➢ identification of structure
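One common way to implement such a comparison against a previously taught sample is normalized template matching; a minimal sketch assuming OpenCV (file names and the acceptance score are hypothetical):

```python
import cv2

image = cv2.imread("new_image.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)  # taught sample

# Slide the template over the image and score the similarity at every position.
scores = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(scores)

# Accept the match only if the best score exceeds the chosen limit.
if max_val > 0.8:
    print("match found at", max_loc, "score", max_val)
else:
    print("no match")
```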

SICK IVP, “Machine Vision Introduction,” 2006.


2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Image recognition and analysis


Reference image for teaching. Matching in new image.


SICK IVP, “Machine Vision Introduction,” 2006.
2. Basic elements of machine vision system

❑ 2.4. Image processing and analysis


Image recognition and analysis


SICK IVP, “Machine Vision Introduction,” 2006.


Chapter 1. Introduction to machine vision

1. Introduction
2. Basic elements of machine vision system
3. Classification
4. Technical specifications
5. Designing a Machine Vision System

90
3. Classification

3.1. 1D vision systems


3.2. 2D vision systems
3.3. 3D vision systems

91
3. Classification

❑ 3.1. 1D vision systems


➢ 1D vision analyzes a digital signal one line at a time instead of looking at a whole
picture at once.
➢ This technique commonly detects and classifies defects on materials
manufactured in a continuous process, such as paper, metals, plastics, and other
non-woven sheet or roll goods.

COGNEX, “Introduction to Machine Vision,” 2016. 92


3. Classification

❑ 3.1. 1D vision systems

1D vision systems scan one


line at a time while the
process moves. In the
above example, a defect in
the sheet is detected.

COGNEX, “Introduction to Machine Vision,” 2016. 93


3. Classification

❑ 3.2. 2D vision systems


➢ Most common inspection cameras perform area scans that involve capturing 2D
snapshots in various resolutions.

2D vision systems can


produce images with
different resolutions

COGNEX, “Introduction to Machine Vision,” 2016. 94


3. Classification

❑ 3.2. 2D vision systems


➢ Another type of 2D machine vision–line scan–builds a 2D image line by line.

Line scan techniques build the 2D image one line at a time.


COGNEX, “Introduction to Machine Vision,” 2016. 95
3. Classification

❑ 3.3. 3D vision systems


➢ 3D machine vision systems typically comprise multiple cameras or one or more
laser displacement sensors.
➢ Multi-camera 3D vision in robotic guidance applications provides the robot with
part orientation information.
➢ These systems involve multiple cameras mounted at different locations and rely on
“triangulation” of the object's position in 3-D space.

COGNEX, “Introduction to Machine Vision,” 2016. 96


3. Classification

❑ 3.3. 3D vision systems

3D vision systems typically employ multiple cameras. 3D inspection system using a single
camera.
COGNEX, “Introduction to Machine Vision,” 2016. 97
3. Classification

❑ 3.3. 3D vision systems


➢ 3D laser-displacement sensor applications typically include surface inspection
and volume measurement, producing 3D results with as few as a single camera.
➢ A height map is generated from the displacement of the reflected lasers’ location
on an object.
➢ The object or camera must be moved to scan the entire product similar to line
scanning.

COGNEX, “Introduction to Machine Vision,” 2016. 98


Chapter 1. Introduction to machine vision

1. Introduction
2. Basic elements of machine vision system
3. Classification
4. Technical specifications
5. Designing a Machine Vision System

99
4. Technical Specifications

4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment

100
4. Technical Specifications

❑ 4.1. Parts
➢ Discrete parts or endless material (e.g., paper or woven goods), minimum and
maximum dimensions
➢ Changes in shape
➢ Description of the features that have to be extracted
➢ Changes of these features concerning error parts and common product variation
➢ Surface finish
➢ Color
➢ Corrosion, oil films, or adhesives
➢ Changes due to part handling, i.e., labels, fingerprints

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 101


4. Technical Specifications

4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment

102
4. Technical Specifications

❑ 4.2. Parts Presentation


➢ Regarding part motion, the following options are possible:
▪ indexed positioning
▪ continuous movement
➢ If there is more than one part in view, the following topics are important:
▪ number of parts in view
▪ overlapping parts
▪ touching parts

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 103


4. Technical Specifications

4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment

104
4. Technical Specifications

❑ 4.3. Performance Requirements


➢ The performance requirements can be seen in the aspects of:
▪ accuracy and
▪ time performance

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 105


4. Technical Specifications

❑ 4.3. Performance Requirements


➢ Time performance:
▪ cycle time
▪ start of acquisition
▪ maximum processing time
▪ number of production cycles from inspection to result use (for result
buffering)

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 106


4. Technical Specifications

4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment

107
4. Technical Specifications

❑ 4.4. Information Interfaces


➢ User interface for handling and visualizing results
➢ Declaration of the current part type
➢ Start of the inspection
➢ Setting results
➢ Storage of results or inspection data in log files or databases
➢ Generation of inspection protocols for storage or printout

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 108


4. Technical Specifications

4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment

109
4. Technical Specifications

❑ 4.5. Installation Space


➢ The possibility of aligning the illumination and the camera
➢ Is an insight into the inspection scene possible?
➢ What variations are possible for minimum and maximum distances between the
part and the camera?
➢ The distance between the camera and the processing unit

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 110


4. Technical Specifications

4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment

111
4. Technical Specifications

❑ 4.6. Environment
➢ Ambient light
➢ Dirt or dust that the equipment needs to be protected from
➢ Shock or vibration that affects parts of the equipment
➢ Heat or cold
➢ Necessity of a certain protection class
➢ Availability of power supply

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 112


Chapter 1. Introduction to machine vision

1. Introduction
2. Basic elements of machine vision system
3. Classification
4. Technical specifications
5. Designing a Machine Vision System

113
5. Designing a Machine Vision System

5.1. Camera Type


5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 114
5. Designing a Machine Vision System

❑ 5.1. Camera Type


➢ Line scan camera
➢ Area scan camera
➢ 3D camera

Directions for a line scan camera.

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 115


5. Designing a Machine Vision System

5.1. Camera Type


5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 116
5. Designing a Machine Vision System

❑ 5.2. Field of View


The field of view is determined by the following factors:
➢ maximum part size
➢ maximum variation of part presentation in
translation and orientation
➢ margin as an offset to part size
➢ aspect ratio of the camera sensor

Field of view.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 117
5. Designing a Machine Vision System

❑ 5.2. Field of View


Example:
maximum part size: 10 mm (horizontal), 6 mm (vertical)
tolerance in positioning: 1 mm
margin: 2 mm
aspect ratio: 4:3

$FOV_{hor\_cal} = 10\,\text{mm} + 1\,\text{mm} + 2\,\text{mm} = 13\,\text{mm} \;\rightarrow\; FOV_{ver\_est} = \tfrac{3}{4}\,FOV_{hor\_cal} = 9.75\,\text{mm}$

$FOV_{ver\_cal} = 6\,\text{mm} + 1\,\text{mm} + 2\,\text{mm} = 9\,\text{mm} < FOV_{ver\_est}$

$FOV_{hor} = FOV_{hor\_cal}$ and $FOV_{ver} = FOV_{ver\_est} \;\Rightarrow\; FOV = 13\,\text{mm} \times 9.75\,\text{mm}$
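The same calculation can be scripted as a small helper; a sketch in Python (the function is illustrative, not taken from the handbook):

```python
def field_of_view(part_w, part_h, tolerance, margin, aspect=(4, 3)):
    """Return (FOV_hor, FOV_ver) in mm for an area scan camera."""
    fov_hor_cal = part_w + tolerance + margin
    fov_ver_cal = part_h + tolerance + margin
    # Estimate the vertical FOV that matches the sensor aspect ratio.
    fov_ver_est = fov_hor_cal * aspect[1] / aspect[0]
    if fov_ver_cal <= fov_ver_est:
        return fov_hor_cal, fov_ver_est
    # Otherwise the vertical requirement dominates: scale the horizontal FOV up.
    return fov_ver_cal * aspect[0] / aspect[1], fov_ver_cal

print(field_of_view(10, 6, 1, 2))  # -> (13, 9.75), as in the example above
```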


Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 118
5. Designing a Machine Vision System

5.1. Camera Type


5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 119
5. Designing a Machine Vision System

❑ 5.3. Resolution
➢ camera sensor resolution
➢ spatial resolution
➢ measurement accuracy

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 120


5. Designing a Machine Vision System

❑ 5.3. Resolution
➢ Calculation of Resolution

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 121


5. Designing a Machine Vision System

❑ 5.3. Resolution
➢ Example: Measure the dimensions of the 10 mm x 6 mm object above with an accuracy
of 0.01 mm.
▪ Using edge detection for dimension measurement → Nf = 1/3 pixel. But since there is a
tolerance in positioning, the number of pixels for the smallest feature is set to 1
pixel (Nf = 1 pixel).
▪ Size of the smallest feature Sf = 0.01 mm

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 122


5. Designing a Machine Vision System

❑ 5.3. Resolution
▪ Camera resolution:
$R_{C\_hor} = FOV_{hor} \cdot \dfrac{N_f}{S_f} = 13\,\text{mm} \cdot \dfrac{1\,\text{pixel}}{0.01\,\text{mm}} = 1300\,\text{pixels}$

$R_{C\_ver} = FOV_{ver} \cdot \dfrac{N_f}{S_f} = 9.75\,\text{mm} \cdot \dfrac{1\,\text{pixel}}{0.01\,\text{mm}} = 975\,\text{pixels}$
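The same formula as a small Python helper (illustrative only, values taken from the example above):

```python
def camera_resolution(fov_mm, n_f_pixels, s_f_mm):
    """Required number of sensor pixels along one axis."""
    return fov_mm * n_f_pixels / s_f_mm

print(camera_resolution(13.0, 1, 0.01))   # ~1300 pixels (horizontal)
print(camera_resolution(9.75, 1, 0.01))   # ~975 pixels (vertical)
```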

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 123


5. Designing a Machine Vision System

❑ 5.3. Resolution
▪ Object resolution (spatial resolution): assuming that a lens with a field of view of
14 mm (> 13 mm) and a camera with a resolution of 1440x1080 were chosen:

$R_s = \dfrac{FOV}{R_C} = \dfrac{14\,\text{mm}}{1440\,\text{pixels}} \approx 0.01\,\text{mm/pixel}$

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 124


5. Designing a Machine Vision System

❑ 5.3. Resolution
➢ Calculation of Resolution: Resolution for a Line Scan Camera

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 125


5. Designing a Machine Vision System

5.1. Camera Type


5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 126
5. Designing a Machine Vision System

❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform


➢ Camera Model
▪ color sensor
▪ interface technology
▪ progressive scan for area cameras
▪ packaging size
▪ price and availability

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 127


5. Designing a Machine Vision System

❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform


➢ Frame Grabber
▪ compatibility with the pixel rate
▪ compatibility with the software library
▪ number of cameras that can be addressed
▪ utilities to control the camera via the frame grabber
▪ timing and triggering of the camera
▪ availability of on-board processing
▪ availability of general purpose I/O
▪ price and availability
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 128
5. Designing a Machine Vision System

❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform


➢ Pixel Rate
▪ This is the speed of imaging in terms of pixels per second.

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 129


5. Designing a Machine Vision System

❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform


➢ Pixel Rate
▪ For an area camera:

An overhead of 10% to 20% should be considered due to additional bus transfer.
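The formula itself is in the slide graphic, which is not reproduced here; a commonly used relation (stated here as an assumption, not as the handbook's exact wording) is

$\text{pixel rate} \approx R_{C\_hor} \cdot R_{C\_ver} \cdot \text{frame rate}$

with the 10% to 20% overhead mentioned above added on top.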

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 130


5. Designing a Machine Vision System

❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform


➢ Pixel Rate
▪ For a line scan camera:
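Again the formula is in the missing graphic; a commonly used relation (an assumption, not the handbook's exact wording) is

$\text{pixel rate} \approx (\text{pixels per line}) \cdot (\text{line frequency})$

where the line frequency follows from the transport speed divided by the object resolution in the transport direction.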

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 131


5. Designing a Machine Vision System

❑ 5.4. Choice of Camera, Frame Grabber, and Hardware Platform


➢ Hardware Platform
Platform types:
▪ smart cameras
▪ compact vision systems
▪ PC-based systems
Selection criteria:
• Compatibility with frame grabber
• Operating system
• Development process
• Means for a user-friendly human machine interface
• Processing load
• Miscellaneous points: available interfaces, memory, packaging size, price, and availability
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 132
5. Designing a Machine Vision System

❑ Example about a camera

https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/ 133
5. Designing a Machine Vision System

❑ Example about a camera

https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/ 134
5. Designing a Machine Vision System

❑ Example about a camera

https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/ 135
5. Designing a Machine Vision System

5.1. Camera Type


5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 136
5. Designing a Machine Vision System

❑ 5.5. Lens Design

➢ Focal Length

(Figure: lens geometry showing standoff distance, focus distance, lens extension,
magnification, and focal length.)
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 137
5. Designing a Machine Vision System

❑ 5.5. Lens Design


➢ Focal Length
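The relation shown in the slide graphic is not reproduced here; under a thin-lens assumption the quantities in the previous figure are commonly related (stated as an assumption) by

$m = \dfrac{\text{sensor size}}{FOV}, \qquad f \approx \dfrac{m}{1+m}\cdot a$

where $m$ is the magnification, $a$ the standoff (object) distance, and $f$ the focal length.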

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 138


5. Designing a Machine Vision System

❑ 5.5. Lens Design


➢ Lens Flange Focal Distance
▪ This is the distance between the lens mount face and the image plane.

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 139


5. Designing a Machine Vision System

❑ 5.5. Lens Design


➢ Extension Tubes
▪ The lens extension l can be increased using the focus adjustment of the lens.
▪ If the distance cannot be increased further, extension tubes can be used to focus on close
objects. As a result, the depth of field is decreased.
▪ For higher magnifications, such as from 0.4 to 4, macro lenses offer better image
quality.

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 140


5. Designing a Machine Vision System

❑ 5.5. Lens Design


➢ Lens Diameter and Sensor Size

Areas illuminated by the lens and camera;


the left side displays an appropriate choice.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 141
5. Designing a Machine Vision System

❑ 5.5. Lens Design


➢ Sensor Resolution and Lens Quality
▪ As for high resolution cameras, the requirements on the lens are higher than
those for standard cameras.
▪ Using a low-budget lens might lead to poor image quality for high resolution
sensors, whereas the quality is acceptable for lower resolutions.

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 142


5. Designing a Machine Vision System

❑ Example about lens

https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/ 143
5. Designing a Machine Vision System

❑ Example about lens

https://www.baslerweb.com/en/products/vision-components/lenses/basler-lens-c23-1620-5m-p-f16mm/ 144
5. Designing a Machine Vision System

❑ Example about lens

https://www.baslerweb.com/en/products/vision-components/lenses/basler-lens-c23-1620-5m-p-f16mm/ 145
5. Designing a Machine Vision System

5.1. Camera Type


5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 146
5. Designing a Machine Vision System

❑ 5.6. Choice of Illumination


➢ Concept: Maximize Contrast
▪ Direction of light: diffuse from all directions or directed from a range of angles
▪ Light spectrum
▪ Polarization: effect on surfaces, such as metal or glass

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 147


5. Designing a Machine Vision System

❑ 5.6. Choice of Illumination


➢ Illumination Setups
▪ Backlight and
▪ Frontlight
• Diffused light

• Directed light

• Confocal frontlight

• Bright field

• Dark field
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 148
5. Designing a Machine Vision System

❑ 5.6. Choice of Illumination


➢ Light Sources
▪ Fluorescent tubes
▪ Halogen and xenon lamps
▪ LED
▪ Laser

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 149


5. Designing a Machine Vision System

❑ 5.6. Choice of Illumination


➢ Approach to the Optimum Setup
▪ A confirmation of the setup based on experiments with sample parts is
mandatory.
▪ The alignment of light, the part and the camera needs to be documented.
▪ To choose between similar setups, images have to be captured and compared
for maximum contrast.

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 150


5. Designing a Machine Vision System

❑ 5.6. Choice of Illumination


➢ Interfering Lighting
▪ The influences of different lamps on the images have to be checked.
▪ To avoid interference, spatial separation can be achieved by using different
camera stations.
▪ The part is imaged with different sets of cameras and illuminations.

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 151


5. Designing a Machine Vision System

5.1. Camera Type


5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 152
5. Designing a Machine Vision System

❑ 5.7. Mechanical Design


➢ As the cameras, lenses, standoff distances, and
illumination devices are determined, the mechanical
conditions can be defined.
➢ As for mounting of cameras and lights the
adjustment is important for installation, operation,
and maintenance.
➢ The devices have to be protected against vibration
or shock.
➢ The position of cameras and lights should be
changed easily.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 153
5. Designing a Machine Vision System

5.1. Camera Type


5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 154
5. Designing a Machine Vision System

❑ 5.8. Electrical Design


➢ The power supply
➢ The housing of cameras and illumination
➢ The length of cables as well as their laying

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 155


5. Designing a Machine Vision System

5.1. Camera Type


5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 156
5. Designing a Machine Vision System

❑ 5.9. Software
➢ selection of a software library
➢ design and implementation of the application-specific software

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 157


5. Designing a Machine Vision System

❑ 5.9. Software
➢ Software Library
➢ Software Structure
▪ Image acquisition
▪ Preprocessing
▪ Feature localization
▪ Feature extraction
▪ Feature interpretation
▪ Generation of results
▪ Handling interfaces
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 158
5. Designing a Machine Vision System

❑ 5.9. Software
➢ General Topics
▪ Visualization of live images for all cameras
▪ Possibility of image saving
▪ Maintenance modus
▪ Log files for the system state
▪ Detailed visualization of the image processing
▪ Crucial processing parameters

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 159


5. Designing a Machine Vision System

5.1. Camera Type


5.2. Field of View
5.3. Resolution
5.4. Choice of Camera, Frame Grabber, and Hardware Platform
5.5. Lens Design
5.6. Choice of Illumination
5.7. Mechanical Design
5.8. Electrical Design
5.9. Software
5.10. Costs
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 160
5. Designing a Machine Vision System

❑ 5.10. Costs
➢ The development costs
▪ project management
▪ base design
▪ hardware components
▪ software licenses
▪ software development
▪ installation
▪ test runs, feasibility tests, and acceptance test
▪ training
▪ documentation

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 161


5. Designing a Machine Vision System

❑ 5.10. Costs
➢ The operating costs
▪ maintenance, such as cleaning of the optical equipment

▪ change of equipment, such as lamps

▪ utility, for instance electrical power or compressed air if needed

▪ costs for system modification due to product changes

Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006. 162


Quiz 1
Quiz Number: 1    Quiz Type: Select
Question: Choose the lighting for measuring the radii R and r of the following object:
A. Dome Light    B. On-Axis Light    C. Darkfield    D. Backlight
Answer:
Feedback:
Quiz 2
Quiz Number: 2    Quiz Type: Select
Question: Assume that the object size = 10 cm x 20 cm and the sensor resolution = 640x480 pixels. Calculate the object resolution.
A. 0.31 mm/pixel    B. 0.21 mm/pixel    C. 0.17 mm/pixel    D. 0.42 mm/pixel
Answer:
Feedback:
Quiz 3
Quiz Number: 3    Quiz Type: Select
Question: Splitting an input image into component areas is called:
A. Image preprocessing    B. Image segmentation    C. Image recognition    D. Image representation
Answer:
Feedback:
Quiz 4
Quiz Number: 4    Quiz Type: Select
Question: The performance requirements of a machine vision system are:
A. Accuracy    B. Time performance    C. Both A and B    D. None of the above
Answer:
Feedback:
Quiz 5
Quiz Number: 5    Quiz Type: Select
Question: Diameter Inspection of Rivets:
+ The nominal size of the rivets lies in a range of 3 mm to 4 mm
+ The required accuracy is 0.1 mm
+ The tolerance of part positioning is less than ±1 mm across the optical axis and ±0.1 mm in the direction of the optical axis. The belt stops for 1.5 s.
+ The maximum processing time is 2 s; the cycle time is 2.5 s.
+ The maximum space for installing equipment is 500 mm.
Quiz 5 (continued)
Question (figure): Bearing with rivet and disk