
Vision Systems
Spring Edition
Contents

Discrete Sensors 101: Sensor types and best practices

Magnets, sensors create paths for automated guided vehicles

Artificial intelligence tools can aid sensor systems

Vision and drone autonomy add to industrial process, safety, inspection applications

AI vision for monitoring manufacturing and industrial environments
Discrete Sensors 101: Sensor types and best practices

Measuring success: Understand which types of discrete sensors to use for which applications, plus terminology and tips.

Discrete or digital sensing is ubiquitous in automation. It has been used since the days of relay logic, before programmable logic controllers (PLCs) even existed, and its use today continues to simplify logic in the PLC. A discrete sensor sends an on/off (yes/no) signal, often allowing the PLC to ignore analog thresholds, deadbands, detection speed, and other complexities.

That signal could mean "I see a part," "machine air pressure is above 80 psi," "actuator has reached position," "heater at temperature," or any number of other conditions. Robust machine function is highly dependent on using the right sensors in the right ways. Each of these conditions is likely to use a different type of sensor.

Common types of sensors in automation


Below are many common types of sensors used for automation.

Limit switches
Limit switches, which are still in use today, have a mechanical switch that's turned on or off when it's in contact with a part. They can be found in various shapes and sizes and offer options like redundant contacts. Despite their simplicity and availability, many applications have transitioned to non-contact, solid-state sensors for their flexibility and long life. It can also be an inconvenience that limit switches require contact with the part they sense.

Reed switches
Reed switches, which are mostly used in pneumatics, have a mechanical switch that's turned on/off by a magnet. These are typically mounted on the cylinder, where the piston has a magnet in it. Note that it's not always best practice to sense the cylinder position. For example, consider a cylinder that drives a linkage, which drives a plate that pushes a part into position. What if the pin comes out of the linkage? What if the linkage has some "slop" or backlash in its motion? It's better to sense the plate that touches the part, rather than sensing the position of the cylinder. Since these are mechanical devices, there's the same question of longevity as with limit switches. There are solid-state versions of cylinder switches that can be used instead.

Proximity switches
Proximity switches are another common sensor. They usually operate on an inductive principle, which requires metal, preferably containing iron, to function. Non-ferrous metals such as aluminum and copper can also be detected, but not as well as iron: the sensing range is shorter, and a larger target is needed for detection at all (sometimes to the point where the sensor isn't very useful). There are two ways to improve detection in this case:

1. Put a steel screw in the non-ferrous target for the prox to see.

2. Use a “long-range” or “unshielded” prox. These are two names for a prox that is
more sensitive because it has less metal shrouding on the tip of the sensor.

Other varieties exist that function on non-inductive principles (capacitive and ultrasonic) and can sense non-metallic parts. However, this is unusual enough that when a proximity switch comes up in conversation, it's usually assumed to be inductive.

Photoelectric or photo eye sensors
Photo eye (PE) sensors have a light "emitter" and "receiver." Sometimes they're in the same package, and sometimes they're separate. These are usually an inexpensive way to track parts in a system. Sometimes the light is guided through fiber optic lines, and sometimes it's used directly from the emitter/receiver. Parts can be detected either by reflecting light back to the receiver (a reflective application) or by blocking the light beam from reaching the receiver (a through-beam application).

Choosing a type of sensor
When purchasing a sensor, there are many options. Once a sensor is chosen that fits mechanically and is of the general type needed, there are other factors to weigh during the selection process:

• PNP versus NPN: This is a required option for all solid-state devices. It describes the direction of current flow. PNP is typical in the U.S., but if there's equipment from other origins, it's important to know what the PLC input is expecting. If the PLC manual says "sinking input," use PNP; if it says "sourcing input," use NPN. Some input modules can be configured as either. In that case, look at what's connected to the "common" terminal. If the common terminal is 0 V dc, use PNP; if it's 24 V dc, use NPN.

• 2-wire versus 3-wire: This is mostly a choice between a mechanical contact (2-wire) and a solid-state contact (3-wire).

• Quick disconnect versus integrated cable: Many sensors offer the option to
have a permanently connected cable or a quick disconnect. For a slightly higher
cost, the quick disconnect option usually makes maintenance a lot easier. If the sensor breaks, a new cable isn't necessary.

There was a time when discrete sensors were truly digital in nature, such as a mechanical pressure switch using a spring-loaded diaphragm or a mercury-based thermostat, but the line is blurring. Modern discrete sensors often measure quantities such as pressure, temperature, inductance, and brightness in analog form and convert them to a digital yes or no using a tiny computer. Remarkably, many very simple sensors are now able to pass that analog information back to the PLC using technologies like IO-Link. If the data exists, and there's a computer in there already, why not take advantage of it? This is a relatively new trend and hasn't yet found a strong foothold in the market. The PLC and the ladder logic programming language were founded on the concept of discrete signals.
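As a rough illustration of what that built-in computer does, the short Python sketch below turns an analog pressure reading into a discrete on/off signal with a deadband, so the output does not chatter around the switching point. The 80 psi switch point and 5 psi deadband are illustrative values, not a specification.

class DiscretePressureSwitch:
    # Toy model of a modern discrete sensor: analog measurement in, on/off out.
    # Threshold values are illustrative only.
    def __init__(self, on_threshold_psi=80.0, deadband_psi=5.0):
        self.on_threshold = on_threshold_psi
        self.off_threshold = on_threshold_psi - deadband_psi
        self.state = False  # current discrete output

    def update(self, pressure_psi):
        # Turn on above the switch point; only turn off again once the reading
        # drops below the lower edge of the deadband.
        if pressure_psi >= self.on_threshold:
            self.state = True
        elif pressure_psi <= self.off_threshold:
            self.state = False
        return self.state

switch = DiscretePressureSwitch()
for reading in (70, 79, 81, 78, 76, 74, 82):
    print(reading, switch.update(reading))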
A basic overview of where, when, and how discrete sensing simplifies programming:

• Proxes

• PEs

• Lasers

• More advanced
• Color recognition
• Pattern matching
• Vision
• Temperature controller.

Key terms and application notes for PEs

• Visible versus infrared light: Users will usually have to aim the light, and that's a lot easier if it can be seen, so consider visible light unless there's a reason to use infrared (IR).

• Crosstalk: The light from these sensors can interfere with other sensors and light curtains. Consider that the light won't be a line; it'll be a cone and may affect other devices that use the same wavelength (Figure 2). Several strategies can help combat crosstalk:
  • Alternate light direction when PEs are close to each other (Figure 3). For example, where two PEs are measuring across a conveyor 6 in. apart, the first one has its emitter on the left side, the second on the right.
  • Use different light wavelengths. For example, light curtains typically use infrared light. If PEs are used near a light curtain, use visible light PEs.
  • Put apertures on the emitters to narrow the cone of light (Figure 4).

• Light-on versus dark-on: Should the sensor be on when it sees light, or when it
doesn’t see light? This is usually adjustable with a screwdriver or button.

• Precision: Simple PE applications don’t give very precise locations of the object
being sensed. For example, sensing boxes on a conveyor with a reflective PE may
only be accurate within an inch or two. Using fiber optic lines or apertures to make
the emitted light and receiving area both as small as possible can help improve
accuracy.

• Lasers: These cost a little more, but they can do things other sensors can't, such as detect distance rather than only light blocking/reflecting. They also can sense clear parts, like plastic or glass.

Lessons learned
• Proximity switches
  • Work best with magnetic materials (steel/iron)
  • Use a bolt head in other materials as a flag
  • "Long range" or "unshielded" proxes work better on non-magnetic materials (the target still has to be metallic)

• Photo eyes
  • Can interfere with light curtains
  • Can look through material, even if it's not transparent; it can then watch for a "step" change in opacity (as in a label strip)
  • Light-on vs. dark-on
  • Lasers are better at seeing transparent objects
  • Visible spectrum is easier to set up (aim)
  • Apertures: different shapes for different applications

• Limit switches have moving parts, so they wear out faster

Jon Breen
Jon Breen is the owner at Breen Machine Automation Services LLC.

Magnets, sensors create paths for automated guided vehicles

Application Update: An automated guided vehicle (AGV) can be built with as few as two components: a magnetic guide sensor and a dual-channel motor controller.

An automated guided vehicle (AGV) can be built with as few as two components: a magnetic guide sensor and a dual-channel motor controller. The AGV will follow a track made of adhesive magnetic tape affixed to the floor. The magnetic sensor will measure how far it is from the center of the tape and provide that information to the motor controller, which will then adjust the steering so the vehicle remains at the center of the track.

Magnetic markers positioned on the left and right side of the track give the AGV location information that will be used to make stop and fork left/right decisions.

Magnetic track guiding
Magnetic tape is one of several line-following techniques. The other two main techniques are induction wire guide and optical. A table compares each of these techniques.

Automated guided vehicles table: Tracking comparison


Chassis design
When designing the vehicle, there are four basic ways of providing drive and steering, shown in the diagrams. Some types are easier to build; others have better steering characteristics. Two of these designs are fully symmetrical and may be operated in both directions. The chassis design table lists the characteristics of each design.
Automated guided vehicles table: Chassis design
Sensor mount
The sensor should be placed as shown in the above diagrams for each chassis design. For the first two chassis types, the sensor must be placed near the front edge of the chassis. On long AGVs, this means that a little steering will cause a wide swing at the front and will make the steering control more difficult.

On the steerable drive wheel design, the sensor can be placed on the chassis, or it may be made part of the wheel assembly and turn with it.

For best results, place the sensor 30 mm above the floor and ensure that its height fluctuates by no more than +/-10 mm as the AGV moves along the track.

Sensor-motor controller interface
The MGS1600C has several output types. Features and typical uses are shown in the sensor attributes table.

Automated guided vehicles: Sensor attributes table

In the MultiPWM mode, the sensor data is output on one wire in a series of variable-width pulses containing the track detect signal, track position, and left and right marker detect signals. This pulse can be connected to any of the Roboteq motor controller's pulse inputs. Once the pulse input is configured as "Magsensor," the sensor information is transferred transparently and continuously to the motor controller, from where it can be processed using the MicroBasic scripting language or accessed by an external computer or PLC via the controller's serial or USB port.
Electrical wiring
The wiring diagram shows the magnetic guide sensor and motor controller in a typical four-wheel-drive chassis. This diagram is applicable to all Roboteq dual-channel brushed motor controllers.

A figure shows the controller's connector details. This wiring is compatible with all Roboteq controllers equipped with a 15-pin DSub connector. The sensor and button can be connected to any other pulse and digital inputs. Refer to the product datasheet for the list of available signals and pin-outs. The pulse output is on the blue wire of the sensor cable.
Configuration, testing
The sensor and controller must be configured so that they will each function as desired and communicate with each other.

The sensor is configured by default to output MultiPWM pulses and therefore can be used without further configuration if the track is made of Roboteq-supplied 25 mm magnetic tape. Once powered, the Tape Detect LED will flash at a low rate if no tape is present. When tape is in range, the LED will be on steady, and its color will change when the tape is at the right or left.

For configuration, monitoring, and troubleshooting, connect the sensor to the PC via the USB connector located under the screw plug. Run the Magsensor PC utility to change the tape width if using a 50 mm tape, or use the waveform display view to monitor the shape of the magnetic field.
With no tape present, the PC utility should show a nearly flat line. For best results, always perform a zero calibration when operating the sensor in a new environment.

Moving a tape under the sensor will cause a "bell" curve to appear on the chart. The curve must be in the positive (up) direction. If the curve is going down, change the Tape Polarity setting in the configuration menu. Markers will also cause a bell curve, but that curve should be going down. A screenshot shows the resulting curve when placing a left marker and a centered track.
Motor controller configuration
To receive and recognize data from the sensor, the controller must first be connected to a PC running the Roborun+ PC utility. In the configuration menu, the pulse input that is connected to the sensor must be enabled and configured as "Magsensor."

Next, the controller must be configured to operate in mixed mode so that the steering command will apply a different amount of power to the left and right motors for making turns.

Magsensor to controller
When the sensor and the motor controller are connected to each other using the single wire and MultiPWM mode, the sensor data are transferred periodically, and in the background, into the motor controller, from which they can be accessed and used.

An additional set of queries is available for reading this information from the motor controller. The queries can be sent either from the motor controller's serial port or from within a MicroBasic script running in the motor controller. (See the query set table.)

Automated guided vehicle table: Query set
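As an aside, the Python sketch below (using the pyserial package) shows roughly how an external computer might poll that data over the controller's serial/USB port. The port name and the exact query strings are assumptions: the ?MGD, ?MGT, and ?MGM mnemonics simply mirror the _MGD, _MGT, and _MGM operands used in the MicroBasic script, so check the query set table and the controller datasheet for the actual syntax.

import serial  # pyserial

# Port name, baud rate, and query strings are assumptions for illustration.
with serial.Serial("/dev/ttyACM0", 115200, timeout=0.1) as port:
    for query in (b"?MGD\r", b"?MGT 1\r", b"?MGM 1\r", b"?MGM 2\r"):
        port.write(query)                      # send one query per loop pass
        reply = port.read_until(b"\r").decode(errors="ignore").strip()
        print(query.decode().strip(), "->", reply)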
Implementing the AGV controls
Once the sensor and motor controller are verified to work, we can proceed to the automatic mode. In this article, all the computation is done in the motor controller using the MicroBasic scripting language.

Steering control
The sensor outputs a value that is the tape's distance from the center of the track. This information is then used to correct the steering. If the tape is centered, the value is 0, and no steering correction is needed. The farther the track is from the center, in one or the other direction, the stronger the steering change. In this example, a proportional control is implemented, as sketched below. For best precision and response time, the control algorithm may be improved to a full PID (proportional-integral-derivative).
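For illustration only, the proportional correction amounts to the few lines of Python below; the article's actual implementation is the MicroBasic script at the end, and the gain and command range here are placeholder values.

GAIN = -7            # sign flips the correction direction, as in the script
MAX_COMMAND = 1000   # assumed full-scale steering command

def steering_command(tape_position):
    # tape_position is the sensor's reported distance from the track center;
    # 0 means centered, so no correction is applied.
    correction = GAIN * tape_position
    return max(-MAX_COMMAND, min(MAX_COMMAND, correction))

print(steering_command(0), steering_command(40), steering_command(-40))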

Throttle control
How the throttle power is controlled (when to start, stop, accelerate, slow down) is
application dependent. In this example, the AGV will be made to move when a tape
is detected, take left or right forks, and stop at precise locations. The AGV will then
resume moving after a set time, or when a user button is pressed. The AGV will stop
when the track is no longer present.

In a practical implementation, the AGV throttle will be controlled by an external device, such as a PLC. The PLC must then be connected to one of the motor controller's inputs. The throttle information can be an analog voltage or a variable duty cycle PWM signal.
Fork, merge management
The sensor has an algorithm for detecting and managing up to two-way forks and merges along the track. Internally, the controller always assumes that two tracks are present: a left track and a right track. When following one track, the sensor considers that the two tracks are superimposed. When entering forks, the track widens, and so does the distance between the left and right tracks. When approaching merges, the sensor will report a sudden spread of the left and right tracks but will otherwise operate the same way as at forks.
Localization using markers
Magnetic markers are pieces of magnetic tape of opposite polarity located left and/or right of the center track. Markers provide a very simple and cost-effective method to identify specific locations along the track.

This application uses markers on the left or right side to indicate which track to follow at a fork. Markers located on both the left and right sides indicate a stop location.

More elaborate marker arrangements can be made to carry more information about a location on the track. An example of multi-level markers is provided in a diagram.

Manual steering override
It is common to require that the AGV be driven manually, to place it in position or to move it along an untracked path. Buttons, a joystick, a PLC, or an RC radio can be connected directly to the motor controller's free inputs. The program running inside the motor controller can easily be made to switch from automatic to manual command. Manual override is not described here.
Test track description
The test track figure shows a simple AGV track with several loading stations and one stop station. For simplicity, the AGV here will stop for 30 seconds at every station, or until the operator presses the push button.

The flow chart shows the structure of the MicroBasic program that will run inside the motor controller to move and steer the AGV along the track. The full source code is provided at the bottom of the article.
Manual AGV test drive
Before the sensor can be used for automatic steering, it is a good idea to test drive
the chassis manually, either by attaching a joystick to the PC that is connected to the
motor controller, or by using an RC radio. If the vehicle is difficult to drive manually, in
automatic mode it will be equally challenging. Modify the design so that it drives and
steers as smoothly and accurately as possible.

Testing the automatic steering program
When running the program for the first time, it is recommended to lift the AGV's wheels off the ground. Then place a piece of magnetic tape below the sensor. Verify that the left and right wheels start rotating when the tape is detected. Verify that the left and right rotation speeds change as the tape is moved away from the sensor's center, in a manner that would cause the AGV to rotate so that the sensor becomes centered over the tape. If the AGV rotates away, invert the polarity of the Gain value in the script.
With the AGV wheels on the floor, verify that the steering correction is such that the sensor never moves far from the track. Increasing the Gain value will cause a stronger correction when the sensor moves away from the tape, but it can make the AGV oscillate if the gain is too high. Find the optimal Gain value for stable and accurate tracking.
Testing fork, stop markers
Markers are best tested with the AGV on the track. Verify that the AGV follows the expected track at a fork. When entering a merge, ensure that the AGV is following the correct track and that it will not jump to the opposite track when it enters the sensor's range.

Verify that the AGV stops when a left and a right marker are detected at the same time. Check that the AGV resumes motion after 30 seconds, or when the button is pressed. Beware that as the AGV moves away from the marker pair, one of the two markers will disappear from the sensor's range before the other. The other marker will remain active for a short duration and will therefore be considered a left or right fork marker. Ensure that a marker is present before the next fork or merge following a stop location.

Improving the AGV
Using a more complex steering algorithm: The sample script uses a simple proportional control, where the amount of steering correction is the distance from the center track multiplied by a gain factor. For better results, the script may need to be enhanced so that a proportional-integral or full proportional-integral-derivative control is used instead.

The amount of correction can also be capped to avoid oversteering. In a variable-speed system, it may also be desirable to have a different correction gain at slow and high speeds.
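A rough Python sketch of those three enhancements (PID correction, a cap on the correction, and a gain that softens at higher speed) is shown below. The constants are illustrative, not tuned values from the article.

class SteeringPID:
    def __init__(self, kp=7.0, ki=0.2, kd=1.0, max_correction=300.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.max_correction = max_correction
        self.integral = 0.0
        self.previous_error = 0.0

    def update(self, tape_position, speed, dt=0.01):
        error = -tape_position                   # steer back toward the track center
        self.integral += error * dt
        derivative = (error - self.previous_error) / dt
        self.previous_error = error
        correction = self.kp * error + self.ki * self.integral + self.kd * derivative
        correction *= 1.0 / (1.0 + 0.002 * abs(speed))   # softer correction at speed
        return max(-self.max_correction, min(self.max_correction, correction))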

Using multi-level markers
In this application example, we used a simple left and right marker pair to identify a stop location. In typical applications, more information is needed about a location so that the AGV can change its behavior: for example, identifying segments of track where the AGV must move at high speed and others at low speed, identifying load stations requiring a longer pause time than others, or identifying charging stations where the AGV will stop only when its battery level is low and resume when the battery is charged.

One simple and free technique is to count markers in a track segment delimited by a marker segment on the opposite side. A figure shows such a marker configuration. When a left marker first appears, the counter is reset. The counter is then incremented at every appearance of a right marker while the left marker is still present. When the left marker disappears, the counter is evaluated and the AGV can alter its operation accordingly.
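In pseudocode-style Python (names invented for illustration), the counting logic might look like this:

def count_markers(samples):
    # samples: sequence of (left_marker, right_marker) booleans, one pair per scan
    counting = False
    previous_right = False
    count = 0
    for left, right in samples:
        if left and not counting:          # left marker appears: reset and start counting
            counting, count = True, 0
        if counting and right and not previous_right:
            count += 1                     # count each new right-marker appearance
        if counting and not left:          # left marker gone: evaluate the count
            return count
        previous_right = right
    return None

# Example: the right marker appears twice while one long left marker is present
print(count_markers([(1, 0), (1, 1), (1, 0), (1, 1), (1, 0), (0, 0)]))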

Improving stop position accuracy
In applications requiring the AGV to stop at a very precise location, a secondary sensor, oriented at 90 degrees from the main sensor, can be added. This sensor can then be used to locate another magnetic guide with 1 mm position accuracy. Sensor and guide arrangements are shown in the figure.
AGV localization, safety
For safety reasons, it is typically necessary to fit the AGV with an infrared or laser range finder so that it will stop if a person or obstacle is detected along the track. Range finders typically provide a digital signal which can easily be connected to an input on the motor controller, or to a PLC if one is present.

If more information is needed by the AGV about its location along the track, RFID tags positioned at key locations are a good solution. However, RFID tags typically imply the presence of a microcomputer or a PLC on the AGV to process the data and make navigation decisions.

Script source
The source code below is written in Roboteq's MicroBasic language and runs inside the motor controller to perform the AGV functionality described in this article.

option explicit

' This script provides basic control for an AGV.
' Motor will turn on upon the presence of a track and stop when the track disappears.
' The track position information is used to provide left/right steering.
' At forks, the AGV will follow the left or right track depending on whether the last
' marker detected was on the left or right side of the track.
' Make sure you precede merges with a marker so that the AGV remains on the main track.
' The presence of a left and right marker simultaneously will cause the AGV to stop for
' 30 seconds or until the operator presses the button.

' declare variables
dim Gain as integer
dim DefaultThrottle as integer
dim TapeDetect as boolean
dim MarkerLeft as boolean
dim MarkerRight as boolean
dim Throttle as integer
dim Tape_Position as integer
dim LineSelect as integer
dim Steering as integer
dim GoButton as boolean
dim RunState as boolean
dim NotOnStopMarker as boolean
dim PauseTime as integer

' initialize constants
Gain = -7 ' Use negative value to invert steering command
DefaultThrottle = 250 ' Motor power level while the AGV runs
LineSelect = 1 ' Use left track by default
PauseTime = 30000 ' in milliseconds

' main loop to repeat every 10 ms
top:
wait(10)

' read sensor data
TapeDetect = getvalue(_MGD)
MarkerLeft = getvalue(_MGM, 1)
MarkerRight = getvalue(_MGM, 2)

' Read button state
GoButton = getvalue(_DI, 2)
if (GoButton) then SetTimerCount(1, 0) ' Pressing the button will clear the pause timer
if GetTimerState(1) then RunState = true ' When pause timer is cleared, AGV is allowed to run

' Use TapeDetect and Pause Timer to apply throttle or not
if (TapeDetect and GetTimerState(1))
    Throttle = DefaultThrottle
else
    Throttle = 0
end if

' Check Marker presence to select Left or Right track
if (MarkerLeft) then LineSelect = 1
if (MarkerRight) then LineSelect = 2

' Detect when transitioning onto stop markers
if (NotOnStopMarker and MarkerLeft and MarkerRight)
    NotOnStopMarker = false ' Mark stop marker detection so that it is not detected again until AGV moved away
    SetTimerCount(1, PauseTime) ' Load stop timer timeout value
    RunState = false
else
    NotOnStopMarker = true
end if

Tape_Position = getvalue(_MGT, LineSelect)

' use tape position multiplied with gain as steering
Steering = Tape_Position * Gain

' Send throttle and steering to controller configured in Mixed mode
setcommand(_G, 1, Throttle)
setcommand(_G, 2, Steering)

' Log output. Useful for troubleshooting. Comment out when done.
print("\r", TapeDetect, "\t", Tape_Position, "\t", MarkerLeft, "\t", MarkerRight, "\t", Throttle, "\t", Steering, "\t", RunState, "\t", LineSelect)

goto top ' Loop forever

Cosma Pabouctsidis
Cosma Pabouctsidis is senior technologist, Roboteq Inc.
Artificial intelligence tools can aid sensor systems

At least seven artificial intelligence (AI) tools can be useful when applied to sensor systems: knowledge-based systems, fuzzy logic, automatic knowledge acquisition, neural networks, genetic algorithms, case-based reasoning, and ambient intelligence.
Seven artificial intelligence (AI) tools that have proved to be useful with sensor systems are reviewed here: knowledge-based systems, fuzzy logic, automatic knowledge acquisition, neural networks, genetic algorithms, case-based reasoning, and ambient intelligence. Each AI tool is outlined, together with some examples of its use with sensor systems. Applications of these tools within sensor systems have become more widespread due to the power and affordability of present-day computers. Many new sensor applications may emerge, and greater use may be made of hybrid tools that combine the strengths of two or more of the tools reviewed.

The tools and methods reviewed here have minimal computational complexity and can be implemented with small sensor systems, single sensors, or system arrays with low-capability microcontrollers. The appropriate deployment of the new AI tools will contribute to the creation of more competitive sensor systems and applications. Other technological developments in AI that will impact sensor systems include data mining techniques, multi-agent systems, and distributed self-organizing systems. Ambient sensing involves integrating many microelectronic processors and sensors into everyday objects to make them "smart." They can explore their environment, communicate with other smart things, and interact with humans. The advice provided aims to help users cope with their tasks in intuitive ways, but the repercussions of such integration into our lives are difficult to predict. Using ambient intelligence and a mix of AI tools is an effort to use the best of each technology. The concepts are generically applicable across industrial processes, and this research is intended to show that the concepts work in practice.
Creating smarter sensor systems
Sensor systems can be improved using artificial intelligence (AI). AI emerged as a computer science discipline in the mid-1950s, and it has produced a number of powerful tools that are useful in sensor systems for automatically solving problems that would normally require human intelligence. Seven such tools are covered here: knowledge-based systems, fuzzy logic, inductive learning, neural networks, genetic algorithms, case-based reasoning, and ambient intelligence.

AI systems have been improving, and new advances in machine intelligence are creating seamless interactions between people and digital sensor systems. Although the introduction of AI into industry has been slow, it promises to bring improvements in flexibility, reconfigurability, and reliability. New machine systems are exceeding human performance in an increasing number of tasks. As they merge with us more intimately, and we combine our brain power with computer capacity to deliberate, analyze, deduce, communicate, and invent, we may be on the threshold of a new age of machine intelligence.

AI (or machine intelligence) combines a wide variety of advanced technologies to give machines the ability to learn, adapt, make decisions, and display new behaviors. This is achieved using technologies such as neural networks, expert systems, self-organizing maps, fuzzy logic, and genetic algorithms. That machine intelligence technology has been developed through its application to many areas where sensor information has needed to be interpreted and processed, for example:

• Assembly
• Biosensors
• Building modeling
• Computer vision
• Cutting tool diagnosis
• Environmental engineering
• Force sensing
• Health monitoring
• Human-computer interaction
• Internet use
• Laser milling
• Maintenance and inspection
• Powered assistance
• Robotics
• Sensor networks
• Teleoperation.

These developments in machine intelligence are being introduced into ever more complex sensor systems. The click of a mouse, the flick of a switch, or the thought of a brain might convert almost any sensor data to information and transport it to you. Recent examples of this research work are provided, which include work at the University of Portsmouth. Seven areas where AI can help sensor systems follow.

1. Knowledge-based systems
Knowledge-based (or expert) systems are computer programs embodying knowledge about a domain for solving problems related to that domain. An expert system usually has two main elements, a knowledge base and an inference mechanism. The knowledge base contains domain knowledge, which may be expressed as a combination of 'IF-THEN' rules, factual statements, frames, objects, procedures, and cases. An inference mechanism manipulates the stored knowledge to produce solutions to problems. Knowledge manipulation methods include using inheritance and constraints (in a frame-based or object-oriented expert system), retrieval and adaptation of case examples (in case-based systems), and the application of inference rules (in rule-based systems), according to some control procedure (forward or backward chaining) and search strategy (depth or breadth first).

A rule-based system describes knowledge of a system in terms of IF... THEN... ELSE. Specific knowledge can be used to make decisions. These systems are good at representing knowledge and decisions in a way that is understandable to humans. Due to their rigid rule-base structure, they are less good at handling uncertainty and are poor at handling imprecision. A typical rule-based system has four basic components: a list of rules or rule base, which is a specific type of knowledge base; an inference engine or semantic reasoner, which infers information or takes action based on the interaction of input and the rule base; temporary working memory; and a user interface or other connection to the outside world through which input and output signals are received and sent.
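A minimal Python sketch of those four components is shown below: a rule base, working memory holding facts derived from sensor inputs, a forward-chaining inference engine, and a trivial print statement standing in for the interface. The rules and facts are invented for illustration.

rules = [
    ({"pressure_low"}, "open_valve"),
    ({"temperature_high", "fan_off"}, "start_fan"),
    ({"open_valve", "start_fan"}, "raise_alarm"),
]

def forward_chain(facts):
    derived = set(facts)                 # working memory
    changed = True
    while changed:                       # keep firing rules until nothing new is added
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain({"pressure_low", "temperature_high", "fan_off"}))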

The concept in case-based reasoning is to adapt solutions from previous problems to current problems. These solutions are stored in a database and can represent the experience of human specialists. When a problem occurs that a system has not experienced, it compares it with previous cases and selects the one that is closest to the current problem. It then acts upon the solution given and updates the database depending upon the success or failure of the action. Case-based reasoning systems are often considered to be an extension of rule-based systems. They are good at representing knowledge in a way that is clear to humans, but they also have the ability to learn from past examples by generating additional new cases.
2. Case-based reasoning
Case-based reasoning has been formalized for purposes of computer reasoning as a four-step process:

1) Retrieve: Given a target problem, retrieve cases from memory that are relevant to solving it. A case consists of a problem, its solution, and, typically, annotations about how the solution was derived.

2) Reuse: Map the solution from the previous case to the target problem. This may
involve adapting the solution as needed to fit the new situation.

3) Revise: Having mapped the previous solution to the target situation, test the new
solution in the real world (or a simulation) and, if necessary, revise.

4) Retain: After the solution has been successfully adapted to the target problem,
store the resulting experience as a new case in memory.
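As a toy Python sketch of that cycle (with invented cases, features, and a crude similarity measure), retrieval here is a nearest-neighbor lookup and retention simply appends the solved problem to the case base:

cases = [
    # (problem features, solution) - invented examples
    ({"range_mm": 10, "target": "steel"}, "standard shielded prox"),
    ({"range_mm": 25, "target": "aluminum"}, "unshielded prox with larger target"),
]

def distance(a, b):
    # crude similarity: range difference plus a penalty for a different target material
    return abs(a["range_mm"] - b["range_mm"]) + (0 if a["target"] == b["target"] else 20)

def solve(problem, revise=None):
    _, solution = min(cases, key=lambda case: distance(problem, case[0]))  # 1) retrieve
    proposal = solution                                                    # 2) reuse
    if revise:
        proposal = revise(proposal)                                        # 3) revise after testing
    cases.append((problem, proposal))                                      # 4) retain
    return proposal

print(solve({"range_mm": 22, "target": "aluminum"}))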

Critics argue that it is an approach that accepts anecdotal evidence as its main operating principle. Without statistically relevant data for backing and implicit generalization, there is no guarantee that the generalization is correct. However, all inductive reasoning where data is too scarce for statistical relevance is inherently based on anecdotal evidence.

Many expert systems are developed using programs known as "shells," which are ready-made expert systems complete with inferencing and knowledge storage facilities but without the domain knowledge. Some sophisticated expert systems are constructed with the help of "development environments." The latter are more flexible than shells in that they also provide means for users to implement their own inferencing and knowledge representation methods.

Expert systems are probably the most mature among the tools mentioned here, with many commercial shells and development tools available to facilitate their construction. Consequently, once the domain knowledge to be incorporated in an expert system has been extracted, the process of building the system is relatively simple. The ease with which expert systems can be developed has led to a large number of applications of the tool. In sensor systems, applications can be found for a variety of tasks, including selection of sensor inputs, interpreting signals, condition monitoring, fault diagnosis, machine and process control, machine design, process planning, production scheduling, and system configuring. Some examples of specific tasks undertaken by expert systems are:

• Assembly
• Automatic programming
• Controlling intelligent complex vehicles
• Planning inspection
• Predicting risk of disease
• Selecting tools and machining strategies
• Sequence planning
• Controlling plant growth.

3. Fuzzy logic
A disadvantage of ordinary rule-based expert systems is that they cannot handle new situations not covered explicitly in their knowledge bases (that is, situations not fitting exactly those described in the "IF" parts of the rules). These rule-based systems are unable to produce conclusions when such situations are encountered. They are therefore regarded as shallow systems which fail in a "brittle" manner, rather than exhibit a gradual reduction in performance when faced with increasingly unfamiliar problems, as human experts would.

The use of fuzzy logic, which reflects the qualitative and inexact nature of human reasoning, can enable expert systems to be more resilient. With fuzzy logic, the precise value of a variable is replaced by a linguistic description, the meaning of which is represented by a fuzzy set, and inferencing is carried out based on this representation. For example, an input from a sensor system of 20 can be replaced by "normal" as the linguistic description of the variable "sensor input." A fuzzy set defining the term "normal sensor input" might be:

normal sensor input = 0.0/below 10 widgets per minute + 0.5/10-15 widgets per minute + 1.0/15-25 widgets per minute + 0.5/25-30 widgets per minute + 0.0/above 30 widgets per minute.

The values 0.0, 0.5, and 1.0 are the degrees or grades of membership of the sensor ranges below 10 (above 30), 10-15 (25-30), and 15-25 to the given fuzzy set. A grade of membership equal to 1 indicates full membership, and a null grade of membership corresponds to total non-membership.
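A small Python sketch of that fuzzy set, plus the single rule "if the sensor input is normal, then set the heat input to normal," is shown below. The membership breakpoints follow the definition above; treating the rule's output as a nominal heat setting scaled by the rule strength is a simplification standing in for full defuzzification.

def normal_sensor_input(widgets_per_minute):
    # grades of membership from the fuzzy set defined above
    x = widgets_per_minute
    if x < 10 or x > 30:
        return 0.0
    if 15 <= x <= 25:
        return 1.0
    return 0.5                      # the 10-15 and 25-30 bands

def heat_input_setting(sensor_reading, normal_heat=50.0):
    strength = normal_sensor_input(sensor_reading)   # how well the rule's IF part matches
    return strength * normal_heat                    # simplified stand-in for defuzzification

for reading in (8, 12, 20, 27, 33):
    print(reading, normal_sensor_input(reading), heat_input_setting(reading))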

Knowledge in an expert system employing fuzzy logic can be expressed as qualitative statements (or fuzzy rules), such as, "If the input from the room temperature sensor is normal, then set the heat input to normal." A reasoning procedure known as the compositional rule of inference, which is the equivalent of the modus ponens rule in rule-based expert systems, enables conclusions to be drawn by generalization (extrapolation or interpolation) from the qualitative information stored in the knowledge base. For instance, when a sensor input is detected to be "slightly below normal," a controlling fuzzy expert system might deduce that the heat input should be set to "slightly above normal." Note that this conclusion might not have been contained in any of the fuzzy rules stored in the system.

31
Artificial intelligence tools can aid sensor systems

Fuzzy expert systems (FES) use fuzzy logic to handle the uncertainties generated by Discrete Sensors 101:
Sensor types and best
incomplete or partially corrupt data. The technique uses the mathematical theory of practices
fuzzy sets to simulate human reasoning. Humans can easily deal with ambiguity (areas
Magnets, sensors create
of grey) in terms of decision making, yet machines find it difficult. paths for automated guided
vehicles
Fuzzy logic has many applications in sensor systems where the domain knowledge can
Artificial intelligence
be imprecise. Fuzzy logic is well suited where imprecision is inherent due to imprecise tools can aid sensor
limits between structures or objects, limited resolution, numerical reconstruction meth- systems
ods, and image filtering. For example, applications in structural object recognition Vision and drone autonomy
and scene interpretation have been developed using fuzzy sets within expert systems. add to industrial process,
Fuzzy expert systems are suitable for applications that require an ability to handle safety, inspection
applications
uncertain and imprecise situations. They do not have the ability to learn as the values
within the system are preset and cannot be changed. AI vision for monitoring
manufacturing and industrial
environments
Notable successes have been achieved in the areas of:

• Cooperative robots
• Mobile robots
• Prediction of sensory properties
• Supply chain management
• Welding.

4. Automatic knowledge acquisition
Getting domain knowledge to build into a knowledge base can be complicated and time consuming. It can be a bottleneck in constructing an expert system. Automatic knowledge acquisition techniques were developed to address this, for example, in the form of IF-THEN rules (or an equivalent decision tree). This sort of learning program usually requires a set of examples as a learning input. Each example is characterized by the values of a number of attributes and the class to which it belongs.

One approach, for example, is through a process of "dividing and conquering," where attributes are selected according to some strategy (for example, to maximize the information gain) to divide the original example set into subsets, and the inductive learning program builds a decision tree that correctly classifies the given example set. The tree represents the knowledge generalized from the specific examples in the set. This can subsequently be used to handle situations not explicitly covered by the example set.
In another approach, known as the "covering approach," the inductive learning program attempts to find groups of attributes uniquely shared by examples in given classes and forms rules with the IF part as conjunctions of those attributes and the THEN part as the classes. The program removes correctly classified examples from consideration and stops when rules have been formed to classify all examples in the given set.

Another approach is to use logic programming instead of propositional logic to describe examples and represent new concepts. That approach employs the more powerful predicate logic to represent training examples and background knowledge and to express new concepts. Predicate logic permits the use of different forms of training examples and background knowledge. It enables the results of the induction process (the induced concepts) to be described as general first-order clauses with variables, and not just as zero-order propositional clauses made up of attribute-value pairs. There are two main types of these systems, the first based on the top-down generalization/specialization method, and the second on the principle of inverse resolution.

A number of learning programs have been developed: for example, ID3, which is a divide-and-conquer program; the AQ program, which follows the covering approach; the FOIL program, which is an inductive logic programming (ILP) system adopting the generalization/specialization method; and the GOLEM program, which is an ILP system based on inverse resolution. Although most programs only generate crisp decision rules, algorithms have also been developed to produce fuzzy rules.
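As a hedged illustration of divide-and-conquer induction, the Python sketch below uses an off-the-shelf decision tree (scikit-learn, assumed to be available) on a tiny invented example set: two discrete attributes of a target part and a detect/miss class. The printed tree is the IF-THEN style knowledge generalized from the examples.

from sklearn.tree import DecisionTreeClassifier, export_text

# attributes: material (0 = steel, 1 = aluminum), size (0 = small, 1 = large)
X = [[0, 0], [0, 1], [1, 0], [1, 1], [0, 1], [1, 0]]
y = ["detect", "detect", "miss", "detect", "detect", "miss"]

tree = DecisionTreeClassifier(criterion="entropy").fit(X, y)   # entropy = information gain
print(export_text(tree, feature_names=["material", "size"]))   # decision tree as IF-THEN text
print(tree.predict([[1, 1]]))                                  # classify a new, unseen example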
The requirement for a set of examples in a rigid format (with known attributes and of known classes) has been easily satisfied in sensor systems and networks, so automatic learning has been widely used in sensor systems. This sort of learning is most suitable for problems where attributes have discrete or symbolic values, rather than the continuous-valued attributes found in many sensor system problems.
Some examples of applications of inductive learning are:

• Laser cutting
• Mine detection
• Robotics.

5. Neural networks
Neural networks can also capture domain knowledge from examples. However, they do
not archive the acquired knowledge in an explicit form such as rules or decision trees,
and they can readily handle both continuous and discrete data. They also have a good
generalization capability as with fuzzy expert systems.

A neural network is a computational model of the brain. Neural network models usually assume that computation is distributed over several simple units called neurons, which are interconnected and operate in parallel. (Hence, neural networks are also called parallel-distributed-processing systems or connectionist systems.)

The most popular neural network is the multi-layer perceptron, which is a feedforward network: all signals flow in one direction from the input to the output of the network. Feedforward networks can perform static mapping between an input space and an output space: the output at a given instant is a function only of the input at that instant. Recurrent networks, where the outputs of some neurons are fed back to the same neurons or to neurons in layers before them, are said to have a dynamic memory: the output of such networks at a given instant reflects the current input as well as previous inputs and outputs.
Implicit "knowledge" is built into a neural network by training it. Some neural networks can be trained by being presented with typical input patterns and the corresponding expected output patterns. The error between the actual and expected outputs is used to modify the strengths, or weights, of the connections between the neurons. This method of training is known as supervised training. In a multi-layer perceptron, the back-propagation algorithm for supervised training is often adopted to propagate the error from the output neurons and compute the weight modifications for the neurons in the hidden layers.
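A minimal numpy sketch of such a network is shown below: one hidden layer, sigmoid neurons, and supervised training by back-propagation on a tiny invented mapping from two binary sensor inputs to one output. The architecture, learning rate, and data are illustrative only.

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # input patterns
T = np.array([[0], [1], [1], [0]], dtype=float)               # expected output patterns

def add_bias(a):
    # append a constant 1 input to each row so the neurons have bias weights
    return np.hstack([a, np.ones((a.shape[0], 1))])

W1 = rng.normal(0, 1, (3, 4))        # (2 inputs + bias) -> 4 hidden neurons
W2 = rng.normal(0, 1, (5, 1))        # (4 hidden + bias) -> 1 output neuron
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):               # supervised training loop
    H = sigmoid(add_bias(X) @ W1)    # hidden-layer activations
    Y = sigmoid(add_bias(H) @ W2)    # network output
    dY = (T - Y) * Y * (1 - Y)       # error term at the output neurons
    dH = (dY @ W2[:-1].T) * H * (1 - H)   # error propagated back to the hidden layer
    W2 += 0.5 * add_bias(H).T @ dY   # weight modifications
    W1 += 0.5 * add_bias(X).T @ dH

print(np.round(Y, 2))                # should approach [[0], [1], [1], [0]]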

Some neural networks are trained in an unsupervised mode, where only the input patterns
are provided during training and the networks learn automatically to cluster them in groups
with similar features.


Artificial neural networks (ANNs) typically have inputs and outputs, with processing within hidden layers in between. Inputs are independent variables and outputs are dependent. ANNs are flexible mathematical functions with configurable internal parameters. To accurately represent complicated relationships, these parameters are adjusted through a learning algorithm. In “supervised” learning, examples of inputs and corresponding desired outputs are simultaneously presented to networks, which iteratively self-adjust to accurately represent as many examples as possible.

Once trained, ANNs can accept new inputs and attempt to predict accurate outputs. To produce an output, the network simply performs function evaluation. The only assumption is that there exists some continuous functional relationship between input and output data. Neural networks can be employed as mapping devices, pattern classifiers, or pattern completers (auto-associative content addressable memories and pattern associators). Like expert systems, they have found a wide spectrum of applications in almost all areas of sensor systems, addressing problems ranging from modeling, prediction, control, classification, and pattern recognition, to data association, clustering, signal processing, and optimization. Some recent application examples are:

• Feature recognition
• Heat exchangers
• Inspection of soldering joints
• Optimizing spot welding parameters
• Power
• Tactile displays
• Vehicle sensor systems.

6. Genetic algorithms
A genetic algorithm is a stochastic optimization procedure inspired by natural evolution. A genetic algorithm can yield the global optimum solution in a complex multi-modal search space without requiring specific knowledge about the problem to be solved. However, for a genetic algorithm to be applicable, potential solutions to a given problem must be representable as strings of numbers (usually binary) known as chromosomes, and there must be a means of determining the goodness, or fitness, of each chromosome. A genetic algorithm operates on a group or population of chromosomes at a time, iteratively applying genetically based operators such as cross-over and mutation to produce fitter populations containing better solution chromosomes.

The algorithm normally starts by creating an initial population of chromosomes using a random number generator. It then evaluates each chromosome. The fitness values of the chromosomes are used in the selection of chromosomes for subsequent operations. After the cross-over and mutation operations, a new population is obtained and the cycle is repeated with the evaluation of that population.
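
A minimal Python sketch of that generate-evaluate-select-vary cycle is given below. The chromosome length, population size, crossover and mutation rates, and the toy bit-counting fitness function are illustrative assumptions; a real application would substitute its own chromosome encoding and fitness evaluation.

import random

random.seed(1)
CHROM_LEN, POP_SIZE, GENERATIONS = 20, 30, 40
CROSSOVER_RATE, MUTATION_RATE = 0.8, 0.02

def fitness(chrom):
    # toy fitness: count the 1-bits (stands in for any real evaluation of a candidate solution)
    return sum(chrom)

def select(population):
    # tournament selection: keep the fitter of two randomly chosen chromosomes
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    if random.random() < CROSSOVER_RATE:
        cut = random.randint(1, CHROM_LEN - 1)
        return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
    return p1[:], p2[:]

def mutate(chrom):
    return [1 - gene if random.random() < MUTATION_RATE else gene for gene in chrom]

# initial population of random binary chromosomes
population = [[random.randint(0, 1) for _ in range(CHROM_LEN)] for _ in range(POP_SIZE)]

for generation in range(GENERATIONS):
    next_population = []
    while len(next_population) < POP_SIZE:
        child1, child2 = crossover(select(population), select(population))
        next_population += [mutate(child1), mutate(child2)]
    population = next_population[:POP_SIZE]   # the cycle repeats with the new population

best = max(population, key=fitness)
print(fitness(best), best)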

Genetic algorithms have found applications in sensor systems problems involving com-
plex combinatorial or multi-parameter optimization. Some recent examples of those
applications are:

• Assembly
• Assembly line balancing
• Fault diagnosis
• Health monitoring
• Power steering.
7. Ambient intelligence
Ambient intelligence has been promoted for the last decade as a vision of people working easily in digitally controlled environments in which the electronics can anticipate their behavior and respond to their presence. The concept of ambient intelligence is for seamless interaction between people and sensor systems to meet actual and anticipated needs.

Use in industry has been limited, but new, more intelligent and more interactive systems are at the research stage. From the perspective of sensor systems, a less human- and more system-centered definition of ambient intelligence needs to be considered. Modern sensor concepts tend to be human-centered approaches, so the application of ambient intelligence technologies in combination with knowledge management may be a promising approach. Many research issues still have to be resolved to bring ambient intelligence technology to industrial sectors, such as robust, reliable (wireless) sensors, context-sensitivity, intelligent user interfaces, safety, security, and so forth.

Information and knowledge gathered from ambient intelligence sensors within an environment represent an untapped resource for optimizing processes and providing more efficient services. The introduction of ambient intelligence technologies is still at an early stage, but it promises advantages in flexibility, reconfigurability, and reliability. At the same time, prices of sensors and tags are decreasing. Development and implementation of new concepts based on ambient intelligence systems are likely in the mid- and long-term, and a large number of industrial companies will probably introduce different ambient intelligence technologies to the shop floor.

On the other hand, vendors of sensors will need to equip their products with additional
ambient intelligence features and exploit the advantages of ambient-intelligence-integrated sensors within the environment to provide new functionalities (for example, self-configuration and context-sensitivity) and improve the performance of their products.

AI applications, University of Portsmouth
AI tools are being applied at the University of Portsmouth to assist industry in the adoption of artificial intelligence for use with sensor systems.

Monitoring and controlling machinery: Simple rules are being investigated that modify pre-planned paths and improve gross robot motions associated with pick-and-place assembly tasks, and rules to predict terrain contours are being developed using a feed-forward neural network. Case-based reasoning is being applied to reuse programs (or parts of programs) to automatically program sensor arrays. The combined work is already showing that automatic programming and re-programming may help to introduce environmental sensors into small and medium enterprises. Other projects are using simple expert systems to improve the use of sensor data in tele-operation applications.

Process monitoring and control: An expert system is being developed to assist in process control and to enhance the implementation of statistical process control. A bespoke expert system uses a hybrid rule-based and pseudo object-oriented method of representing standard statistical process control knowledge and process-specific diagnostic knowledge. The amount of knowledge from sensor arrays and sensor systems can be large, which justifies the use of a knowledge-based systems approach. The system is being enhanced by integrating a neural network module with the expert system modules to detect any abnormal patterns.
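
As a simplified illustration of the kind of statistical process control knowledge such a rule base encodes, the Python sketch below applies two classic control-chart rules to a stream of readings. The rules, limits and sample data are generic textbook choices, not the actual knowledge base of the system described above.

from statistics import mean, stdev

def spc_alarms(samples, baseline):
    # flag simple statistical process control violations against a baseline run
    centre, sigma = mean(baseline), stdev(baseline)
    alarms = []
    for i, x in enumerate(samples):
        if abs(x - centre) > 3 * sigma:
            alarms.append((i, x, "beyond 3-sigma control limit"))
    # run rule: eight consecutive points on the same side of the centre line
    for i in range(len(samples) - 7):
        window = samples[i:i + 8]
        if all(v > centre for v in window) or all(v < centre for v in window):
            alarms.append((i, window[-1], "run of 8 points on one side of the centre line"))
    return alarms

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9, 10.0, 10.1]
readings = [10.0, 10.9, 10.8, 10.7, 10.9, 10.8, 10.9, 10.7, 10.8, 11.2]
print(spc_alarms(readings, baseline))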
Monitoring sensor arrays: A system has been created to monitor sensors in a high-recirculation airlift reactor (a process to produce clean water). Reactors can be at the edge of stability, which requires accurate interpretation of real-time data from sensors such as flow rate, air input, and pressure. A second system is interpreting data from ultrasonic sensor arrays on tele-operated mobile robots and on wheelchairs.

Fuzzy monitoring and control: A robotic welding system is being created that uses image processing techniques and a computer-aided design (CAD) model to provide information to a multi-intelligent decision module. The system uses a combination of techniques to suggest weld requirements. These suggestions are evaluated, decisions are made, and then weld parameters are sent to a program generator. The status of the welding process is difficult to monitor because of the intense disturbance during the process. Other work is using multiple sensors to obtain information about the process. Fuzzy measurement and fuzzy integral methods are being investigated to fuse extracted signal features in order to predict the penetration status of the welding process.
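
The Python sketch below shows one simplified form such a fusion could take: a Sugeno fuzzy integral combines several normalized signal features, with each cumulative feature subset weighted by an assumed importance (fuzzy measure). The feature names and importance values are hypothetical placeholders, not parameters of the welding system described here.

def sugeno_integral(features, measure):
    # features: dict of feature name -> normalized confidence in [0, 1]
    # measure:  mapping from a frozenset of feature names to an importance value in [0, 1]
    ordered = sorted(features.items(), key=lambda kv: kv[1], reverse=True)
    best, subset = 0.0, frozenset()
    for name, value in ordered:
        subset = subset | {name}
        best = max(best, min(value, measure[subset]))
    return best

# illustrative importance of feature subsets (would come from expert knowledge or training)
importance = {
    frozenset({"arc_current"}): 0.5,
    frozenset({"pool_width"}): 0.6,
    frozenset({"acoustic_energy"}): 0.3,
    frozenset({"arc_current", "pool_width"}): 0.8,
    frozenset({"arc_current", "acoustic_energy"}): 0.6,
    frozenset({"pool_width", "acoustic_energy"}): 0.7,
    frozenset({"arc_current", "pool_width", "acoustic_energy"}): 1.0,
}

signal_features = {"arc_current": 0.7, "pool_width": 0.9, "acoustic_energy": 0.4}
print(sugeno_integral(signal_features, importance))   # fused penetration score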

Neural-network-based product inspection: Two projects are using neural networks for product inspection: one is recognizing shipbuilding parts and a second is using cameras to detect and classify defects. Neural networks are useful for these types of applications because of the common difficulty in precisely describing the various types of defects and differences. The neural networks can learn the classification task automatically from examples.

The first system is managing to recognize shipbuilding parts using artificial neural networks and Fourier descriptors. Improvements have been made to a pattern recognition system for recognizing shipbuilding parts. This has been achieved by using a new, simple and accurate corner-finder. The new system initially finds corners in an edge-detected image of a part and uses that information to extract Fourier descriptors to feed into a neural network to make decisions about shapes. Using an all-or-nothing accuracy measure, the new system has achieved an improvement over other systems.
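
For readers unfamiliar with Fourier descriptors, the Python sketch below computes a small, normalized descriptor vector from an ordered set of boundary or corner points; a vector of this kind is what would be fed to the classifying neural network. The example contour and the number of descriptors kept are assumptions for illustration only.

import numpy as np

def fourier_descriptors(contour_xy, n_descriptors=8):
    # translation-, scale- and rotation-invariant descriptors from ordered (x, y) points
    pts = np.asarray(contour_xy, dtype=float)
    z = pts[:, 0] + 1j * pts[:, 1]           # treat each point as a complex number
    spectrum = np.fft.fft(z)
    spectrum[0] = 0.0                        # discard the DC term -> translation invariance
    mags = np.abs(spectrum)                  # discard phase -> rotation/start-point invariance
    mags = mags / (mags[1] + 1e-12)          # normalize by the first harmonic -> scale invariance
    return mags[1:n_descriptors + 1]

# a coarse square outline standing in for corner points found in an edge-detected image
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
print(np.round(fourier_descriptors(square, 6), 3))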
A second intelligent inspection system has been built that consists of cameras connected to a computer that implements neural-network-based algorithms for detecting and classifying defects. Outputs from the network indicate the type of defect. Initial investigation suggests that the accuracy of defect classification is good (in excess of 85%) and faster than manual inspection. The system is also used to detect defective parts with a high accuracy (almost 100%).

Genetic algorithms to create an ergonomic workplace layout: A genetic algorithm for deciding where to place sensors in a work cell is being developed. The layout produced by the program will be such that the most frequently needed sensors are prioritized. A genetic algorithm is suitable for this optimization problem because it can readily accommodate multiple constraints.

Ambient intelligence to improve energy efficiency: Ambient intelligence and knowledge management technologies are being used to optimize the energy efficiency of manufacturing units. This benefits both the company and the environment as the carbon footprint is reduced. Different measuring systems are being applied to monitor energy use. Ambient data provide the opportunity to have detailed information on the performance of a manufacturing unit. Knowledge management facilitates processing this information and advises on actions to minimize energy usage while maintaining production. Existing energy consumption data from standard measurements is being complemented by ambient-intelligence-related measurements (from interactions of human operators with machines/processes and smart tags) as well as process-related measurements (manufacturing line temperatures, line pressure, production rate) and knowledge gathered within the manufacturing assembly unit. This is fed to a service-oriented architecture system.

Combining different systems
The purpose of a hybrid system is to combine the desirable elements of different AI techniques into one system. Each method of implementing AI has its own strengths and weaknesses. Some effort has been made in combining different methods to produce hybrid techniques with more strengths and fewer weaknesses. An example is the neuro-fuzzy system, which seeks to combine the uncertainty handling of fuzzy systems with the learning strength of artificial neural networks.
The problems associated with weld programming are being addressed in this way. An existing system consists of two software systems working in series to con-
struct viable robot programs. The first system, the CAD model interpreter, accepts a
CAD model and determines the welds required. This data is fed to the program gener-
ator, which re-orientates the weld requirements in line with the actual real-world orien-
tation of the panel. The program generator then sends any programs sequentially to
the robot (normally one program per weld line). Additional software systems could be
incorporated into the existing system at the point where the robot programs are sent
to the robot system. This is because the transmission protocol at this point is standard
TCP/IP [transmission control protocol/Internet protocol] and any programs to be sent
can be viewed as text files.
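
Because the interface is plain TCP/IP and the programs are text, inserting an additional software stage at that point is straightforward. The Python sketch below shows the general idea of sending a program held as plain text to a controller socket; the host address, port and command names are hypothetical and would differ for any real robot.

import socket

def send_robot_program(program_text, host="192.168.0.50", port=9100):
    # host and port are placeholders for the robot controller's actual TCP endpoint
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(program_text.encode("ascii"))

weld_program = "\n".join([
    "MOVE P1",    # approach point (illustrative command names)
    "ARC_ON",
    "MOVE P2",    # end of the weld line
    "ARC_OFF",
])
send_robot_program(weld_program)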

A newly proposed system will gather that data from an image sensor. The visual data and CAD model data will be used in conjunction to determine an object list, and that object list will be passed to a weld identifier module that will use AI techniques to determine weld requirements.
The proposed system uses a combination of AI techniques working in parallel to suggest weld requirements. These suggestions are then evaluated and decisions made regarding the weld required. These parameters are then sent to a new program generator, which produces a custom robot program for use on the shop floor. Image capture methods are being combined with a decision-making system that uses multiple AI techniques to decide on weld requirements for a job.

The system will combine real-world visual sensor data with data provided by the CAD model. It will then use this combined data to present differing AI systems with the same information. These systems will then make weld requirement suggestions to a weld identifier module. This module will evaluate the suggestions and determine the optimum weld path. The suggestions will then be passed to the existing robot program generator.

The robot program generation systems have been created, tested, and used to pro-
duce consistent straight line welds. A simple edge detection system has been creat-
ed. Work surrounding the AI systems is in the early stages and will be advanced. The
multi-intelligent decision module framework will be further developed and combina-
tions of AI techniques tested. The AI techniques to be tested will include rule-based,
case-based, and fuzzy systems. Any created system needs to be able to handle the
uncertainty of unidentified objects within the image; however, when all objects are pos-
itively identified there should be little doubt as to the weld path.
Another example of combining different artificial intelligence tools is the fuzzy network. The nodes of this type of network are fuzzy rule bases, and the connections between the nodes are interactions in the form of outputs from nodes that are fed as inputs to the same or other nodes. The fuzzy network is a hybrid tool combining fuzzy systems and neural networks due to its underlying grid structure with horizontal levels and vertical layers. This tool is quite suitable for modeling the assembly automation process because the separate assembly stages can be described as modular fuzzy rule bases interacting in sequential/parallel fashion and in a feedforward/feedback context.
Vision and drone autonomy
The main advantages of applying this hybrid modeling tool are better accuracy, due to the single fuzzification-inference-defuzzification pass, and higher transparency, due to the modular approach used. These advantages are crucial bearing in mind the uncertainties in the data and the interconnected structure of some sensor systems.
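
A highly simplified Python sketch of the fuzzy-network idea follows: two small rule-base nodes are chained so that the output of the first is fed as the input of the second. The membership functions, rule consequents and the example reading are illustrative assumptions, not values from an actual assembly model.

import numpy as np

def tri(x, a, b, c):
    # triangular membership function over [a, c] peaking at b
    return float(max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0))

def fuzzy_node(x, rules):
    # zero-order Takagi-Sugeno rule base: each rule pairs a membership function with a crisp consequent
    strengths = np.array([tri(x, *mf) for mf, _ in rules])
    consequents = np.array([out for _, out in rules])
    total = strengths.sum()
    return float((strengths * consequents).sum() / total) if total > 0 else 0.0

# node 1 maps a normalized sensor reading to an intermediate stage-quality score
node1 = [((0.0, 0.0, 0.5), 0.2), ((0.0, 0.5, 1.0), 0.6), ((0.5, 1.0, 1.0), 0.9)]
# node 2 maps that score to a decision value for the next assembly stage
node2 = [((0.0, 0.0, 0.6), 0.1), ((0.4, 1.0, 1.0), 1.0)]

reading = 0.72
intermediate = fuzzy_node(reading, node1)
decision = fuzzy_node(intermediate, node2)   # output of node 1 is the input of node 2
print(round(intermediate, 3), round(decision, 3))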
Mix of sensor and logic systems
Portsmouth University researchers are mixing sensor systems with some powerful new technologies, and longer use is yielding better results. Over time, results include less use of energy, space, and time, along with more output for less cost. In the Regional Centre for Manufacturing Industry at Portsmouth University, machines read in data from real objects and lay down successive layers to build up a model of the object from a series of cross sections. AI is becoming important everywhere in reducing costs and time.

AI can increase effective communication, reduce mistakes, minimize errors, and extend
sensor life.

Over the past 40 years, artificial intelligence has produced a number of powerful tools, including those reviewed here: knowledge-based systems, fuzzy logic, automatic learning, neural networks, ambient intelligence, and genetic algorithms. Applications of these tools in sensor systems have become more widespread due to the power and affordability of present-day computers. Many new sensor systems applications may emerge, and greater use may be made of hybrid tools that combine the strengths of two or more of the tools mentioned. Other technological developments in AI that will impact sensor systems include data mining, multi-agent systems, and distributed self-organizing systems. The appropriate deployment of the new AI tools will contribute to the creation of more competitive sensor systems.
It may take another decade for engineers to recognize the benefits, given the current lack of familiarity and the technical barriers associated with using these tools, but this field of study is expanding.

The tools and methods described have minimal computational complexity and can be implemented on small assembly lines, single robots, or systems with low-capability microcontrollers. The novel approaches proposed here use ambient intelligence and the mixing of different AI tools in an effort to use the best of each technology. The concepts are generically applicable across many processes.

David Sanders
David Sanders is research coordinator, School of Engineering, and Reader in Systems & Knowledge Engineering, University of Portsmouth, U.K.

Vision and drone autonomy add to industrial process, safety, inspection applications

Technology advances such as light detection and ranging (LiDAR) are allowing drones to go to dangerous environments they couldn’t before, without the need for GPS.

Drone inspection and mapping technology has expanded in the past decade with the advancement of lightweight batteries, drone flight controls, autonomous navigation and high-resolution digital cameras. Inspection applications have grown rapidly in agriculture, oil & gas, building and bridge construction, electric utility vegetation management, fire and rescue, surveying and many other inspection tasks. GPS-guided drones using digital cameras have created new service markets for many inspection and surveying tasks that were impossible to accomplish with other methods.
and surveying tasks that were impossible to accomplish with other methods.

Empowering drones with autonomous navigation has greatly improved the ability to
collect useful data and reduce the need for, and costs associated with, skilled pilots.
Most of us take GPS for granted today when traveling by car or even hiking in the
mountains. GPS is an integral part of many drone surveying and inspection activities.
However, there are some environments like mines, warehouses, tanks, underneath
bridges and other covered spaces where GPS is not available or reliable. To address
this challenge, Emesent has developed proprietary LiDAR-based drone autonomy
technology that can survey and inspect previously inaccessible spaces or assets.

A traditional drone survey refers to the use of a drone, or unmanned aerial vehicle (UAV), to capture aerial data with downward-facing sensors, such as RGB or multispectral cameras and light detection and ranging (LiDAR) payloads. During a drone survey with an RGB camera, the ground is photographed several times from different angles, and each image is tagged with coordinates. 2D and 3D maps can be created using GPS and photogrammetric techniques.

Figure 1: Hovermap drone and light detection and ranging (LiDAR). Courtesy: ARC Advisory Group/Emesent

The Emesent Hovermap solution is LiDAR-based and does not require GPS. Instead, it uses a simultaneous localization and mapping (SLAM) algorithm to estimate the drone’s position and orientation from the LiDAR data itself and to create high-detail, accurate 3D point clouds; it also provides advanced autonomy capabilities. It allows the drone to fly autonomously, self-navigating around obstacles even in GPS-denied environments and beyond line-of-sight and communication range.

This lets the drone be used for mapping in underground mines, underneath bridges or
platforms, tanks, and other covered spaces; the modular camera can be removed from
the drone and mounted to a ground robot or used for walking scans.
Measurement technology adds data that can’t be gathered from a camera
The basic idea is to measure and regulate key variables when controlling a process. When new technology allows you to measure important variables, there is an opportunity to improve the control of that process. With accurate 3D point clouds from the new high-precision lightweight LiDAR, models can be produced to provide great insights into both surface and subsurface mining, structures like industrial processes, warehouses, buildings, bridges, tunnels and subways.

Hovermap’s rotating LiDAR provides an omni-directional field of view, ensuring 3D data is collected in all directions. These maps are useful for review of erosion, tree measurements, and stockpile volumes. The acquired data can be combined with data from additional sensors, such as cameras or hyperspectral imaging, for further data analytics.
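
As a small illustration of what such point clouds enable, the Python sketch below estimates a stockpile volume from a cloud of 3D points using a convex hull (via SciPy). The synthetic points stand in for real LiDAR data, and a production workflow would typically difference the scanned surface against a base surface rather than rely on a simple convex hull.

import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)

# stand-in for a LiDAR point cloud of a stockpile: points scattered over a rough dome
xy = rng.uniform(-10, 10, size=(2000, 2))
height = np.clip(6.0 - 0.05 * (xy ** 2).sum(axis=1), 0.0, None) + rng.normal(0, 0.05, 2000)
points = np.column_stack([xy, height])

hull = ConvexHull(points)                 # convex hull of the cloud
print(f"estimated volume: {hull.volume:.1f} cubic units")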
One of the possibilities for autonomy is flying drones below the forest canopy for ac-
curate growth assessment. This is useful to cover large areas of new plantings without
trudging miles or fear of GPS interruptions.

Partnering with mine planning software providers


Emesent’s ambition to help mining customers boost productivity and improve outputs
has led to alliances with major mine planning software vendors. These alliances make
it easier for mining companies to adopt new data capture technology by leveraging
existing surveying capability. Through seamless import of Hovermap scans into these
platforms, users are able to use the data for geotechnical analysis and condition moni-
toring, as-built surveys, volume reporting and design updates.

Data from previously inaccessible areas is bringing new insights and an additional level of detail to mine operations. Partnering with key technology providers in the vanguard is helping make a digitally-driven mining ecosystem a reality.

Figure 2: Hovermap creates mine scan with LiDAR. Courtesy: ARC Advisory Group/Emesent

Keeping workers away from danger
Underground mines contain large open voids that are critical to the efficient and safe operation of the mine. These unsupported voids
are a source of ground falls that can endanger personnel, underground infrastructure,
and equipment. The traditional method of scanning an open void is for a surveyor to
operate a boom-mounted LiDAR sensor near the edge of the void, which is time-con-
suming, produces poor quality, shadowed, incomplete scans and endangers personnel.

Hovermap pilots can fly an entire mission, from take-off to landing, using only a tablet.
Data is processed on-board in real-time to stream a 3D map back to the operator. Tap-
ping on the map will set smart waypoints, and Hovermap takes care of the rest, navigat-
ing to the waypoints while avoiding obstacles and completing the mission autonomously.

Drone technology advances make humans safer


New market segments have developed as drones coupled with a variety of sensors
have fed a seemingly insatiable demand for fast, easy access to detailed asset information. Drones can provide quality images and improve the operation of agriculture, building and bridge construction, offshore platforms, pipelines, transmission lines, disaster response and many other industries.
Drone technology continues to improve the capture of useful data with the advancement of LiDAR. SLAM-based autonomy now provides the ability to operate in GPS-denied environments using lightweight precision LiDAR carried by a drone. This autonomous capability means LiDAR can operate in previously unmapped dark, confined spaces like mines with minimal human involvement. This ability can improve mine operations and gather valuable data without putting humans into hostile environments or onto dangerous tasks.
Rick Rys
Rick Rys is a senior consultant at ARC Advisory Group.

AI vision for monitoring manufacturing and industrial environments

Manufacturers can benefit from AI machine vision technologies by increasing uptime, leveraging preventive maintenance and more.

In traditional industrial and manufacturing environments, monitoring worker safety, enhancing operator efficiency and improving quality inspection were physical tasks. Today, AI-enabled machine vision technologies replace many of these inefficient, labor-intensive operations for greater reliability, safety, and efficiency. By deploying AI smart cameras, further performance improvements are possible since the data used to empower AI machine vision comes from the camera itself.

AI-enabled machine vision


In 2020, the AI machine vision market size for manufacturing and industrial environ-
ments was $4.1 billion, which, according to an IoT Analytics report, is forecast to grow
to $15.2 billion by 2025. That is a compound annual growth rate (CAGR) of 30% com-
pared to the 6.5% CAGR of traditional machine vision deployments. This high CAGR
is because the next generation of real-time edge AI machine vision is not limited to
quality assurance and product inspection applications.
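
The quoted growth rate can be checked directly from the two market figures; the short calculation below reproduces the roughly 30% CAGR.

start, end, years = 4.1, 15.2, 5              # $ billions in 2020 and forecast for 2025
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")                          # prints approximately 30.0%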

Worker safety is a top priority in manufacturing and industrial settings, and AI-enabled
smart cameras help automate monitoring and inspection in these environments. It is
essential to ensure protection for employees, contractors, and other third-party op-
erators who work in potentially unsafe environments, such as dangerous mechanical
equipment or hazardous materials.

Figure 1: Safety light curtains take up floor space, are difficult to deploy, and lack flexibility; some are reaching limitations regarding responsiveness. All-in-one artificial intelligence (AI) smart cameras, such as the Adlink Neon-2000 series, decrease latency by capturing images and performing AI-related operations before sending results and instructions to related equipment, such as the robotic arm. This minimizes delays, reduces space and bandwidth requirements, and the cameras are easy to deploy and maintain. Courtesy: Adlink

Behavior and position (POSE) detection generates information that indicates whether machine operators are in danger, following standard operating procedures (SOP), or working in ways that enhance productivity and efficiency. Automated optical inspection (AOI) also increases the speed and accuracy of quality control.

AI for smart worker safety
Fatalities in industrial settings are not unheard of. When evaluating worker safety, facilities also must consider non-fatal work-related injuries. In addition to the emotional trauma, there is often a financial consideration to take into account.

Industrial and manufacturing sites traditionally use human supervision and light curtains for ensuring worker safety. However, human supervisors, who cannot be everywhere and see everything, are fallible, and safety light curtains have their inherent limitations.

Geo-fencing
In modern smart factories, people often work in potentially hazardous areas with dan-
gerous equipment, such as robotic arms. Safety light curtains protect personnel from
injury by creating a sensing screen that guards machine access points and perimeters.
However, they occupy lots of floor space, are difficult to deploy and lack flexibility. In
some instances, the safety light curtain’s limited response time may create additional
issues.

Conventional machine vision solutions use IP cameras and AI modules that are flexible
and easy to deploy but come with considerable latency and are, therefore, unsuitable
for situations requiring an immediate response.

An all-in-one AI smart camera can address this latency problem. It captures images and performs all AI-related operations before sending results and instructions to related equipment, such as the robotic arm [Figure 1]. Compared to light curtains and conventional machine vision implementations, using an all-in-one smart camera minimizes delays, reduces space and bandwidth requirements, and is easy to install and maintain.

Real-time machine vision AI offers additional benefits to augment worker safety by alerting users if they enter an unsafe zone and logging that information for retraining purposes. Logging data from past events can be helpful in the future. For example, if a worker approaches a hazardous area, instead of the robotic arm shutting down completely, it could go into a functional safety process loop. These measures can help improve worker safety as well as increase the factory’s operating efficiency.
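
A minimal sketch of the geo-fencing logic described above might look like the following Python fragment: a detected person's bounding box is tested against a predefined hazard zone, the breach is logged, and the equipment is asked to drop into a reduced-speed state rather than shut down. The coordinates, zone and mode names are illustrative assumptions, not part of any vendor's API.

from dataclasses import dataclass

@dataclass
class Box:
    x1: float    # axis-aligned box in image coordinates
    y1: float
    x2: float
    y2: float

def overlaps(a, b):
    return a.x1 < b.x2 and b.x1 < a.x2 and a.y1 < b.y2 and b.y1 < a.y2

HAZARD_ZONE = Box(400, 200, 640, 480)             # region around the robotic arm (illustrative)

def check_frame(person_boxes, log):
    # person_boxes would come from the smart camera's on-board person detector
    for person in person_boxes:
        if overlaps(person, HAZARD_ZONE):
            log.append(("zone breach", person))   # keep the event for retraining and review
            return "reduced_speed_mode"           # e.g. a functional safety process loop
    return "normal_operation"

event_log = []
print(check_frame([Box(350, 250, 450, 470)], event_log), len(event_log))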
Smart refueling with AI vision
When a fuel truck arrives at a manufacturing facility, it introduces the potential for mul-
tiple safety issues smart AI vision can resolve. First of all, the truck may roll if the brake
is not applied correctly or fails. Training the AI machine vision system to monitor the
truck for movement enables it to raise an immediate alarm should the status change.

Facilities also must consider the operator’s location during the refueling process be-
cause there are different types of zoning breaches. It is critical to ensure that all work-
ers on-site understand that there are safety risks. For example, it is necessary to place
traffic cones at the four corners of the truck and ensure that the operative refueling the
truck is wearing the appropriate personal protective equipment (PPE) – AI smart vision
can perform all safety checks to confirm procedures are met correctly [Figure 2].

Immediate alerts from the AI machine vision system can warn operators of a safety breach and prevent injury. It also creates accountability; if someone enters an unsafe zone without the appropriate PPE, the logged images can flag errors and educate employees to prevent future mistakes.

Figure 2: Although supervisors can be present to reinforce safety procedures, it is not always possible. If a person breaches a hazardous area, AI smart machine vision can sound an immediate alarm. Courtesy: Adlink

POSE detection teaches repetitive injury avoidance


For the manufacturing industry, “cycle time” is a critical performance index for produc-
tion efficiency. It represents the amount of time a team spends producing an item until
the product is ready for shipment. Monitoring employee behavior and position with
AI smart camera technology helps enforce SOP and improve worker efficiency.

POSE detection from live video plays a critical role, enabling the overlay of digital content and information on top of the analog world. POSE describes the body’s position and movement with a set of skeletal landmark points such as a hand, elbow or shoulder.

Figure 3: POSE detection on an electronics manufacturing line can help increase productivity, as well as improve order, quantity, and line balance. Courtesy: Adlink
AI machine vision enables factory operators and workers to focus on how physical po-
sitions affect their work. POSE data can help train operators to learn where they should
place their arms and hands to work more ergonomically and efficiently [Figure 3].
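
As a simple illustration of how POSE landmarks translate into ergonomic feedback, the Python sketch below computes the angle at an elbow from three landmark points and flags postures outside an assumed working range. The coordinates and the 60-degree threshold are placeholders for whatever a real deployment would define.

import math

def joint_angle(a, b, c):
    # angle in degrees at landmark b, formed by the segments b->a and b->c (image coordinates)
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# shoulder, elbow and wrist coordinates as a POSE model might report them (illustrative values)
shoulder, elbow, wrist = (320, 180), (360, 260), (430, 300)
angle = joint_angle(shoulder, elbow, wrist)
if angle < 60:
    print(f"elbow angle {angle:.0f} deg: flag posture for ergonomic review")
else:
    print(f"elbow angle {angle:.0f} deg: within the assumed working range")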

Tracking whether an operator is present at their workstation on the production line also
automates and verifies timesheets. Making sure they are following SOP helps ensure
quality control and line balancing.

AI smart AOI for contact lens inspections


Manual product quality inspection is time-consuming, inconsistent and can create
production line bottlenecks. Conventional AOI machine vision can detect easy-to-find
defects faster than professional quality control staff due to its exceptional accuracy and efficiency. However, when a fault is difficult to detect, such as a flaw on a contact lens, these machine vision systems reach their limits for accuracy and consistency.

While most manufacturers randomly sample test products for flaws, this is not possible on contact lens production lines, as every lens needs inspecting. Quality control staff can only view up to 4,000 lenses per shift, creating production bottlenecks. False discovery rates and missed detections also are inevitable.

As contact lenses are transparent, implementing machine vision-based detection has been a challenge for manufacturers. Conventional AOI relies on fixed geometric algorithms to discover defects, but acquiring quality images from transparent objects is challenging, which results in unacceptable detection performance.
Collecting data using AI smart cameras to train the AI algorithms and iterate on in-
spection performance gains provides a more favorable solution. The AI smart system
can identify the most common defects, including burrs, bubbles, edges, particles,
scratches, and more [Figure 4], and maintain inspection logs for customer reference.

Each AI smart camera can inspect 50x more lenses than manual visual inspection, with
accuracy improvements from 30 to 95%.

Machine vision with AI increases uptime, safety


Manufacturers that leverage robust, real-time data from AI machine vision technologies can increase uptime, apply preventive maintenance, enhance productivity and worker safety, and realize many other workplace benefits.

The AI machine vision applications require AI algorithms for deep learning. The software experts that develop AI algorithms need a smart, reliable platform for executing AI model inferencing. AI smart cameras with pre-installed edge vision analytics (EVA) software can address many issues common to conventional AI vision systems, improve compatibility, speed up installation, and minimize maintenance issues.

To successfully deploy an AI vision project, it may take engineers as long as 12 weeks to conduct a proof of concept (PoC). It takes considerable time to overcome the learning curve of choosing optimized cameras and the AI inference engine, retraining AI models and optimizing video streams. EVA software simplifies these steps with its pipeline structure and shortens the PoC time to up to two weeks, serving as an excellent starting point to kick-start the AI vision project.
Chia-Wei Yang
Chia-Wei Yang is director, business and product center, IoT Solution and Technology Business Unit, ADLINK.

Vision Systems

Thank you for visiting the Vision Systems eBook!


Content Archive
2021 Winter Edition
2021 Fall Edition

If you have any questions or feedback about the contents in this eBook, please contact CFE Media at customerservice@cfemedia.com

We would love to hear from you!
