Hà Nội, 2021
Chapter 1. Introduction to machine vision
1. Introduction
2. Basic elements of machine vision system
3. Classification
4. Technical specifications
5. Designing a Machine Vision System
1. Introduction
❖Definition
➢ Machine vision (MV) is the technology and methods used to provide imaging-
based automatic inspection and analysis for such applications as automatic
inspection, process control, and robot guidance, usually in industry.
➢ Machine vision is a term encompassing a large number of technologies,
software and hardware products, integrated systems, actions, methods and
expertise.
➢ Machine vision as a systems engineering discipline can be considered distinct
from computer vision, a form of computer science.
➢ It attempts to integrate existing technologies in new ways and apply them to
solve real world problems.
➢ The overall machine vision process includes planning the details of the
requirements and project, and then creating a solution. During run-time, the
process starts with imaging, followed by automated analysis of the image and
extraction of the required information.
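The run-time flow described above (imaging, automated analysis, extraction of the required information) can be sketched in a few lines. This is an illustrative toy pipeline; the function names (`grab_image`, `analyze`, `extract_result`) and the tiny hard-coded image are made up for the example.

```python
# Hypothetical sketch of the run-time machine vision flow:
# imaging -> automated analysis -> extraction of the required information.

def grab_image():
    # Stand-in for camera acquisition: a tiny 4x4 grayscale image (0-255).
    return [
        [10, 10, 10, 10],
        [10, 200, 200, 10],
        [10, 200, 200, 10],
        [10, 10, 10, 10],
    ]

def analyze(image, threshold=128):
    # Segment bright pixels (the "object") from the dark background.
    return [[1 if px > threshold else 0 for px in row] for row in image]

def extract_result(mask):
    # The "required information" here is simply the object's pixel area.
    return sum(sum(row) for row in mask)

image = grab_image()
mask = analyze(image)
area = extract_result(mask)  # 4 bright pixels
```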
https://en.wikipedia.org/wiki/Glossary_of_machine_vision
❖Application: Locate
➢ To find the object and report its position and orientation.
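A minimal sketch of a locate task: the object's position is reported as the centroid of its pixels, and its orientation as the principal-axis angle from central second moments (standard blob analysis). The binary mask here is made up for illustration.

```python
import math

# Locate: report position (centroid) and orientation (principal axis angle)
# of the bright pixels in a binary mask. Illustrative data.
mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
n = len(pts)
cx = sum(x for x, _ in pts) / n
cy = sum(y for _, y in pts) / n

# Orientation from central second moments.
mu20 = sum((x - cx) ** 2 for x, _ in pts) / n
mu02 = sum((y - cy) ** 2 for _, y in pts) / n
mu11 = sum((x - cx) * (y - cy) for x, y in pts) / n
angle = 0.5 * math.atan2(2 * mu11, mu20 - mu02)  # radians; 0 = horizontal
```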
❖Application: Measure
➢ To measure physical dimensions of the object
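A measure task reduces to counting object pixels and converting pixels to physical units with a known spatial resolution. The mask and the 0.1 mm/pixel scale below are hypothetical.

```python
# Measure: physical dimensions from a binary mask, assuming a known
# (hypothetical) spatial resolution of 0.1 mm per pixel.
MM_PER_PIXEL = 0.1

mask = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0, 0],
]

# Bounding box of the object pixels, then pixels -> millimeters.
xs = [x for row in mask for x, v in enumerate(row) if v]
ys = [y for y, row in enumerate(mask) for v in row if v]
width_mm = (max(xs) - min(xs) + 1) * MM_PER_PIXEL   # 4 px wide
height_mm = (max(ys) - min(ys) + 1) * MM_PER_PIXEL  # 3 px tall
```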
❖Application: Inspect
➢ To validate certain features
❖Application: Identify
2. Basic elements of machine vision system
2.1. Overview
2.2. Illumination
2.3. Imaging
2.4. Image processing and analysis
❑ 2.1. Overview
http://www.digikey.com/en/articles/techzone/2012/jan/versatile-leds-drive-machine-vision-in-automated-manufacture
Example inspection images: OK (pass) vs. NG (no good).
❑ 2.1. Overview
❖Illumination
➢ Illumination: is the way an object is lit up and lighting is the actual lamp that
generates the illumination.
❑ 2.1. Overview
❖Image processing and analysis
➢ This is where the desired features are extracted automatically by algorithms and
conclusions are drawn.
➢ A feature is the general term for information in an image, for example a
dimension or a pattern.
➢ Algorithms are also referred to as tools or functions.
❑ 2.2. Illumination
❖The goal of lighting in machine vision is to obtain a robust application by:
➢ Enhancing the features to be inspected.
➢ Assuring high repeatability in image quality.
❑ 2.2. Illumination
❖Illumination Principles
Light can be described as waves with three properties:
➢ Wavelength or color, measured in nm (nanometers)
➢ Intensity
➢ Polarization.
❑ 2.2. Illumination
❖Illumination Principles
➢ The spectral response of a sensor is the sensitivity curve for different
wavelengths.
Figure: spectral response of a gray scale CCD sensor; maximum sensitivity is for green (500 nm).
❑ 2.2. Illumination
❖Illumination Principles
➢ The optical axis is an imaginary line through the center of the lens, i.e. the direction the camera is looking.
❑ 2.2. Illumination
❖Lighting Types >> Ring Light
➢ A ring light is mounted around the optical axis of the lens, either on the camera or
somewhere in between the camera and the object.
Pros:
• Easy to use
• High intensity and short exposure time possible
Cons:
• Direct reflections, called hot spots, on reflective surfaces
SICK IVP, “Machine Vision Introduction,” 2006.
❑ 2.2. Illumination
❖Lighting Types >> Spot Light
➢ A spot light has all the light emanating from one direction that is different from the
optical axis. For flat objects, only diffuse reflections reach the camera.
Object
Spot light
Pros:
• No hot spots
Cons:
• Uneven illumination
• Requires intense light since it is dependent on diffuse reflections
❑ 2.2. Illumination
❖Lighting Types >> Backlight
➢ The backlight principle has the object being illuminated from behind to produce a
contour or silhouette.
Pros:
• Very good contrast
• Robust to texture, color, and ambient light
Cons:
• Dimensions must be larger than the object
❑ 2.2. Illumination
❖Lighting Types >> Darkfield
➢ Darkfield means that the object is illuminated at a large angle of incidence.
Pros:
• Good enhancement of scratches, protruding edges, and dirt on surfaces
Cons:
• Mainly works on flat surfaces with small features
• Requires small distance to object
• The object needs to be somewhat reflective
❑ 2.2. Illumination
❖Lighting Types >> On-Axis Light
➢ When an object needs to be illuminated parallel to the optical axis, a semi-transparent mirror is used to create an on-axis light source.
Pros:
• Very even illumination, no hot spots
• High contrast on materials with different reflectivity
Cons:
• Low intensity requires long exposure times
• Cleaning of the semi-transparent mirror (beam-splitter) often needed
❑ 2.2. Illumination
❖Lighting Types >> Dome Light
➢ The dome light produces the needed uniform light intensity inside of the dome
walls.
Pros:
• Works well on highly reflective materials
• Uniform illumination, no hot spots (except for the darker middle of the image)
Cons:
• Low intensity requires long exposure times
• Dimensions must be larger than the object
• Dark area in the middle of the image
❑ 2.2. Illumination
❖Lighting Types >> Laser Light
➢ A 2D camera with a laser line can provide a cost efficient solution for low-contrast
and 3D inspections.
Pros:
• Robust against ambient light
• Allows height measurements (z parallel to the optical axis)
• Low-cost 3D for simpler applications
Cons:
• Laser safety issues
• Data along y is lost in favor of z (height) data
• Lower accuracy than 3D cameras
Contact lens containers under ambient light: the left is facing up (5 mm high at the cross) and the right is facing down (1 mm high at the minus sign). The laser line clearly shows the height difference.
❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Strobe or Constant light
➢ A strobe light is a flashing light.
➢ Strobing allows the LED to emit a higher light intensity than is achieved with a constant light, by overdriving (“turbo charging”) the LED.
❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Diffusor Plate
➢ The diffusor plate converts direct light into diffuse light.
➢ The purpose of a diffusor plate is to avoid bright spots in the image, caused by
the direct light's reflections in glossy surfaces.
❑ 2.2. Illumination
❖Lighting Variants and Accessories >> LED Color
➢ LED lightings come in several colors. Most common are red and green. There are
also LEDs in blue, white, UV, and IR.
➢ Different objects reflect different colors. A blue object appears blue because it
reflects the color blue.
➢ Therefore, if blue light is used to illuminate a blue object, it will appear bright in a
gray scale image.
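The point above can be modeled numerically: the intensity a gray scale camera sees is, to a first approximation, the product of the illumination spectrum and the object's reflectance per channel, summed up. The (R, G, B) reflectance values below are made-up illustrative numbers.

```python
# Illustrative model: a gray scale camera sees the dot product of the
# illumination spectrum and the object's reflectance. Channels are (R, G, B).

def perceived_intensity(light_rgb, reflectance_rgb):
    return sum(l * r for l, r in zip(light_rgb, reflectance_rgb))

blue_object = (0.05, 0.10, 0.90)  # reflects mostly blue (hypothetical values)
blue_light = (0.0, 0.0, 1.0)
red_light = (1.0, 0.0, 0.0)

bright = perceived_intensity(blue_light, blue_object)  # blue on blue: bright
dark = perceived_intensity(red_light, blue_object)     # red on blue: dark
```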
❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Optical Filters
➢ An optical filter is a layer in front of the sensor or lens that absorbs certain
wavelengths (colors) or polarizations.
➢ Two main optical filter types are used for machine vision:
1. Band-pass filter: Only transmits light of a certain color, i.e. within a certain wavelength interval. For example, a red filter only lets red light through.
2. Polarization filter: Only transmits light with a certain polarization. Light changes its polarization when it is reflected, which allows us to filter out unwanted reflections.
❑ 2.2. Illumination
❖Lighting Variants and Accessories >> Optical Filters
From left to right: original image; image seen by a gray scale camera with ambient light and without a filter; red light and a red band-pass filter; green light and a green band-pass filter.
❑ 2.3. Imaging
➢ The term imaging defines the act of creating an image.
➢ Imaging has several technical names: Acquiring, capturing, or grabbing
➢ Grabbing a high-quality image is the number one goal for a successful vision application.
❑ 2.3. Imaging
❖Basic Camera Concepts
➢ A simplified camera setup consists of camera, lens, lighting, and object.
❑ 2.3. Imaging
❖Basic Camera Concepts: Digital Imaging
➢ A sensor chip is used to grab a digital image.
➢ On the sensor there is an array of light-sensitive pixels.
❑ 2.3. Imaging
❖Basic Camera Concepts: Digital Imaging
There are two technologies used for digital image sensors:
➢ CCD (Charge-Coupled Device)
➢ CMOS (Complementary Metal Oxide Semiconductor).
❑ 2.3. Imaging
❖Basic Camera Concepts: Digital Imaging
http://www.f4news.com/2016/05/09/ccd-vs-cmos-infographic/
❑ 2.3. Imaging
❖Basic Camera Concepts: Lenses and Focal Length
➢ The lens (Objective) focuses the light that enters the camera in a way that
creates a sharp image.
❑ 2.3. Imaging
❖Basic Camera Concepts: Lenses and Focal Length
➢ The angle of view determines how much of the visual scene the camera sees.
❑ 2.3. Imaging
❖Basic Camera Concepts: Lenses and Focal Length
➢ The focal length is the distance between the lens and the focal point.
➢ When the focal point is on the sensor, the image is in focus.
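The relation between focal length, object distance, and the in-focus image distance can be illustrated with the thin lens equation, 1/f = 1/d_o + 1/d_i. This assumes an ideal thin lens; the 16 mm focal length and 200 mm object distance are illustrative values.

```python
# Thin lens model (assumed): 1/f = 1/d_o + 1/d_i.
# When the lens-to-sensor distance equals d_i, the image is in focus.

def image_distance(f_mm, object_distance_mm):
    # Solve 1/f = 1/d_o + 1/d_i for d_i.
    return 1.0 / (1.0 / f_mm - 1.0 / object_distance_mm)

d_i = image_distance(f_mm=16.0, object_distance_mm=200.0)  # a bit over 16 mm
```

Note that d_i is always slightly larger than f for a finite object distance, which is why focusing closer requires extending the lens away from the sensor.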
❑ 2.3. Imaging
Basic Camera Concepts
Field of View in 2D
▪ The FOV (Field of View) in 2D systems is the full area that a camera sees. The FOV
is specified by its width and height.
▪ The object distance is the distance between the lens and the object.
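The FOV can be estimated from the sensor size, the focal length, and the object distance using the first-order pinhole approximation FOV ≈ sensor_size × distance / focal_length. This is an assumed simplification (it ignores lens distortion and the exact principal-plane positions), and the numbers are illustrative.

```python
# Pinhole approximation (assumed): FOV ≈ sensor_size * object_distance / f.

def fov_mm(sensor_mm, focal_mm, object_distance_mm):
    return sensor_mm * object_distance_mm / focal_mm

# Hypothetical setup: 6.4 mm wide sensor, 16 mm lens, 200 mm object distance.
width = fov_mm(sensor_mm=6.4, focal_mm=16.0, object_distance_mm=200.0)  # 80 mm
```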
❑ 2.3. Imaging
Basic Camera Concepts
Depth of Field
▪ The minimum object distance (sometimes abbreviated MOD) is the closest distance at which the camera lens can focus, and the maximum object distance is the farthest distance.
❑ 2.3. Imaging
Basic Camera Concepts
Depth of Field
▪ The focal plane is found at the distance where the focus is as sharp as possible.
▪ Objects closer or farther away than the focal plane can also be considered to be in focus. The distance interval where good-enough focus is obtained is called depth of field (DOF).
❑ 2.3. Imaging
Basic Camera Concepts
Depth of Field
▪ The depth of field depends on both the focal length and the aperture adjustment.
❑ 2.3. Imaging
Basic Camera Concepts
Depth of Field
▪ By adding a distance ring between the camera and the lens, the focal plane (and thus the MOD) can be moved closer to the camera. A distance ring is also referred to as a shim, spacer, or extension ring.
❑ 2.3. Imaging
Basic Camera Concepts
Depth of Field
▪ A side-effect of using a distance ring is that a maximum object distance is
introduced and that the depth of field range decreases.
❑ 2.3. Imaging
Basic Image Concepts
Intensity
▪ The brightness of a pixel is called intensity. The intensity information is stored for
each pixel in the image and can be of different types. Examples:
➢ Binary: One bit per pixel.
❑ 2.3. Imaging
Basic Image Concepts
Intensity
➢ Color: Typically one byte per pixel and color. Three bytes are needed to obtain full
color information. One pixel thus contains three components (R, G, B).
❑ 2.3. Imaging
Basic Image Concepts
Intensity
▪ When the intensity of a pixel is digitized, the information is quantized into discrete levels. The number of bits used to describe each pixel is called the bit depth.
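Bit depth determines the number of discrete intensity levels: 2 raised to the bit depth. A small sketch of quantizing a normalized intensity, with illustrative values:

```python
# Quantization: a bit depth of b gives 2**b discrete intensity levels
# (1 bit -> 2 levels for binary, 8 bits -> 256 gray levels).

def num_levels(bit_depth):
    return 2 ** bit_depth

def quantize(value, bit_depth, max_value=1.0):
    # Map a normalized intensity in [0, max_value] to a discrete level.
    levels = num_levels(bit_depth)
    return min(int(value / max_value * levels), levels - 1)

levels_8bit = num_levels(8)         # 256 gray levels
level = quantize(0.5, bit_depth=8)  # mid gray
```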
❑ 2.3. Imaging
Basic Image Concepts
Exposure
▪ Exposure is how much light is detected by the photographic film or sensor. The
exposure amount is determined by two factors:
➢ Exposure time: Duration of the exposure, measured in milliseconds (ms). Also called shutter time, a term from traditional photography.
➢ Aperture size: Controls the amount of light that passes through the lens.
❑ 2.3. Imaging
Basic Image Concepts
Exposure
▪ If the exposure time is too short for the sensor to capture enough light, the image is said to be underexposed. If there is too much light and the sensor is saturated, the image is said to be overexposed.
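Under- and overexposure can be detected from the intensity histogram of an 8-bit image: a large fraction of pixels near 0 suggests underexposure, a large fraction near 255 suggests saturation. This is a hedged sketch; the threshold values are arbitrary illustrative choices, not a standard.

```python
# Judge exposure from pixel intensities of an 8-bit image.
# The low/high/frac thresholds are illustrative, not normative.

def exposure_status(pixels, low=10, high=245, frac=0.5):
    n = len(pixels)
    if sum(p <= low for p in pixels) / n > frac:
        return "underexposed"   # too many near-black pixels
    if sum(p >= high for p in pixels) / n > frac:
        return "overexposed"    # too many saturated pixels
    return "ok"

dark_image = [5] * 90 + [200] * 10
status = exposure_status(dark_image)  # mostly near-black
```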
❑ 2.3. Imaging
Basic Image Concepts
Gain
▪ Gain amplifies the intensity values after the sensor has already been exposed, very much like the volume control of a radio (which doesn’t actually make the artist sing louder). The tradeoff of compensating insufficient exposure with a high gain is amplified noise.
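Gain as multiplication after exposure can be shown in a few lines: both the signal and the noise are scaled, and values clip at the 8-bit maximum. The signal and noise samples are made up for illustration.

```python
# Gain applied after exposure: signal AND noise are amplified together,
# and values clip at the sensor's 8-bit maximum (255).

def apply_gain(pixels, gain):
    return [min(int(p * gain), 255) for p in pixels]

signal = [40, 50, 60]
noise = [2, -3, 1]                               # +/- a few counts of noise
noisy = [s + n for s, n in zip(signal, noise)]   # [42, 47, 61]
amplified = apply_gain(noisy, gain=4.0)          # noise spread also x4
```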
❑ 2.4. Image processing and analysis
▪ The captured image may have low contrast, noise, or contain some unnecessary
information.
▪ The main function of preprocessing is to filter noise and increase contrast, to make images clearer and sharper.
▪ Other functions include converting color images to grayscale and extracting areas of interest (ROI - Region of Interest).
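Two of the preprocessing steps above can be sketched directly: a 3×3 mean filter for noise reduction, and ROI extraction by slicing. The image data is an illustrative toy array.

```python
# Preprocessing sketch: 3x3 mean filter (noise reduction) + ROI extraction.

def mean_filter3(img):
    # Average each interior pixel over its 3x3 neighborhood; borders kept.
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[j][i]
                            for j in range(y - 1, y + 2)
                            for i in range(x - 1, x + 2)) // 9
    return out

def extract_roi(img, x, y, w, h):
    # Crop a rectangular region of interest by slicing.
    return [row[x:x + w] for row in img[y:y + h]]

img = [
    [10, 10, 10, 10],
    [10, 100, 10, 10],  # one noisy spike
    [10, 10, 10, 10],
    [10, 10, 10, 10],
]
smoothed = mean_filter3(img)        # spike averaged down
roi = extract_roi(img, 0, 0, 2, 2)  # top-left 2x2 region
```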
ROI Extraction
One ROI is created to verify the logotype (blue) and another is created for barcode reading (green). An ROI is placed around each pill in the blister pack, and the pass/fail analysis is performed once per ROI.
SICK IVP, “Machine Vision Introduction,” 2006.
Pixel Counting
Automotive part with crack. The crack is found using a darkfield illumination
and by counting the dark pixels inside the ROI.
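The crack example reduces to a pass/fail decision by pixel counting: count the dark pixels inside the ROI and compare against a tolerance. Threshold and tolerance values below are illustrative, not from the source.

```python
# Pass/fail by pixel counting: dark pixels inside the ROI indicate a crack.
# Threshold (50) and tolerance (3 pixels) are illustrative choices.

def count_dark(roi, threshold=50):
    return sum(1 for row in roi for px in row if px < threshold)

def inspect(roi, max_dark=3):
    return "pass" if count_dark(roi) <= max_dark else "fail"

good_roi = [[200, 210], [205, 198]]
cracked_roi = [[200, 10], [12, 8], [15, 201]]  # four dark "crack" pixels
result_good = inspect(good_roi)
result_bad = inspect(cracked_roi)
```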
Digital Filters
Original intensity-coded 3D image. Image after a binarization operation. Image after edge enhancement.
R. C. Gonzalez and R. E. Woods, “Digital Image Processing,” 4th edition, Prentice Hall, 2018.
➢ Identification of structure
3. Classification
4. Technical Specifications
4.1. Parts
4.2. Part Presentation
4.3. Performance Requirements
4.4. Information Interfaces
4.5. Installation space
4.6. Environment
4. Technical Specifications
❑ 4.1. Parts
➢ Discrete parts or endless material (e.g., paper or woven goods); minimum and maximum dimensions
➢ Changes in shape
➢ Description of the features that have to be extracted
➢ Changes of these features concerning error parts and common product variation
➢ Surface finish
➢ Color
➢ Corrosion, oil films, or adhesives
➢ Changes due to part handling, e.g., labels, fingerprints
4. Technical Specifications
❑ 4.6. Environment
➢ Ambient light
➢ Dirt or dust that the equipment needs to be protected from
➢ Shock or vibration that affects the part or the equipment
➢ Heat or cold
➢ Necessity of a certain protection class
➢ Availability of power supply
5. Designing a Machine Vision System
Field of view.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
5. Designing a Machine Vision System
❑ 5.3. Resolution
➢ camera sensor resolution
➢ spatial resolution
➢ measurement accuracy
❑ 5.3. Resolution
➢ Calculation of Resolution
❑ 5.3. Resolution
➢ Example: Measure the dimensions of a 10 mm × 6 mm object with an accuracy of 0.01 mm.
▪ Using edge detection for dimension measurement → N_f = 1/3 pixel. But since there is a tolerance in positioning, the number of pixels for the smallest feature is set to 1 pixel (N_f = 1 pixel).
▪ Size of the smallest feature: S_f = 0.01 mm
❑ 5.3. Resolution
▪ Camera resolution:
R_C,hor = FOV_hor × (N_f / S_f) = 13 mm × (1 pixel / 0.01 mm) = 1300 pixels
R_C,ver = FOV_ver × (N_f / S_f) = 9.75 mm × (1 pixel / 0.01 mm) = 975 pixels
❑ 5.3. Resolution
▪ Object resolution (spatial resolution): assuming that a lens giving a field of view of 14 mm (> 13 mm) and a camera with a resolution of 1440 × 1080 pixels were chosen:
R_s = FOV / R_C = 14 mm / 1440 pixels ≈ 0.01 mm/pixel
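The worked resolution example above can be checked in code: required camera resolution from the FOV and the smallest feature, then the spatial resolution of the chosen sensor. The formulas and numbers are the ones from the example.

```python
# Resolution calculation from the worked example:
# R_C = FOV * N_f / S_f  (required camera resolution, in pixels)
# R_s = FOV / R_C        (spatial resolution, mm per pixel)

def camera_resolution(fov_mm, n_f_pixels, s_f_mm):
    return fov_mm * n_f_pixels / s_f_mm

def spatial_resolution(fov_mm, sensor_pixels):
    return fov_mm / sensor_pixels

rc_hor = camera_resolution(13.0, 1, 0.01)  # 1300 pixels required
rc_ver = camera_resolution(9.75, 1, 0.01)  # 975 pixels required
rs = spatial_resolution(14.0, 1440)        # about 0.01 mm/pixel
```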
❑ 5.3. Resolution
➢ Calculation of Resolution: Resolution for a Line Scan Camera
▪ Smart cameras
▪ Compact vision systems
▪ PC-based systems
• Operating system
• Development process
• Means for a user-friendly human-machine interface
• Processing load
https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2440-75uc/
➢ Focal Length
Figure: magnification as a function of focal length and lens extension.
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
https://www.baslerweb.com/en/products/vision-components/lenses/basler-lens-c23-1620-5m-p-f16mm/
5. Designing a Machine Vision System
• Directed light
• Confocal frontlight
• Bright field
• Dark field
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
5. Designing a Machine Vision System
❑ 5.9. Software
➢ selection of a software library
➢ design and implementation of the application-specific software
❑ 5.9. Software
➢ Software Library
➢ Software Structure
▪ Image acquisition
▪ Preprocessing
▪ Feature localization
▪ Feature extraction
▪ Feature interpretation
▪ Generation of results
▪ Handling interfaces
Alexander Hornberg, “Handbook of Machine Vision,” WILEY-VCH, Weinheim, 2006.
5. Designing a Machine Vision System
❑ 5.9. Software
➢ General Topics
▪ Visualization of live images for all cameras
▪ Possibility of image saving
▪ Maintenance modus
▪ Log files for the system state
▪ Detailed visualization of the image processing
▪ Crucial processing parameters
❑ 5.10. Costs
➢ The development costs
▪ project management
▪ base design
▪ hardware components
▪ software licenses
▪ software development
▪ installation
▪ test runs, feasibility tests, and acceptance test
▪ training
▪ documentation
❑ 5.10. Costs
➢ The operating costs
▪ maintenance, such as cleaning of the optical equipment