
FANUC Robotics

R-J3 and R-J3iB Controller


VISION V-500iA/3DL 3D Sensor
Operator's Manual
MARO3V50006001E REV. B
B-81444EN/04
This publication contains proprietary information of FANUC Robotics
America, Inc. furnished for customer use only. No other uses are
authorized without the express written permission of FANUC
Robotics America, Inc.
FANUC Robotics America, Inc.
3900 W. Hamlin Road
Rochester Hills, Michigan 48309–3253
The descriptions and specifications contained in this manual were in
effect at the time this manual was approved for printing. FANUC
Robotics America, Inc., hereinafter referred to as FANUC Robotics,
reserves the right to discontinue models at any time or to change
specifications or design without notice and without incurring
obligations.
FANUC Robotics manuals present descriptions, specifications,
drawings, schematics, bills of material, parts, connections and/or
procedures for installing, disassembling, connecting, operating and
programming FANUC Robotics’ products and/or systems. Such
systems consist of robots, extended axes, robot controllers,
application software, the KAREL programming language,
INSIGHT vision equipment, and special tools.
FANUC Robotics recommends that only persons who have been
trained in one or more approved FANUC Robotics Training
Course(s) be permitted to install, operate, use, perform procedures
on, repair, and/or maintain FANUC Robotics’ products and/or
systems and their respective components. Approved training
necessitates that the courses selected be relevant to the type of
system installed and application performed at the customer site.

! WARNING
This equipment generates, uses, and can radiate radio
frequency energy and, if not installed and used in accordance
with the instruction manual, may cause interference to radio
communications. As temporarily permitted by regulation, it
has not been tested for compliance with the limits for Class A
computing devices pursuant to subpart J of Part 15 of FCC
Rules, which are designed to provide reasonable protection
against such interference. Operation of the equipment in a
residential area is likely to cause interference, in which case
the user, at his own expense, will be required to take
whatever measure may be required to correct the
interference.

FANUC Robotics conducts courses on its systems and products on a
regularly scheduled basis at its headquarters in Rochester Hills,
Michigan. For additional information contact:
FANUC Robotics America, Inc.
Training Department
3900 W. Hamlin Road
Rochester Hills, Michigan 48309-3253
www.fanucrobotics.com
Send your comments and suggestions about this manual to:
product.documentation@fanucrobotics.com
Copyright 2004 by FANUC Robotics America, Inc.
All Rights Reserved
The information illustrated or contained herein is not to be
reproduced, copied, downloaded, translated into another language,
published in any physical or electronic format, including internet, or
transmitted in whole or in part in any way without the prior written
consent of FANUC Robotics America, Inc.
AccuStat, ArcTool, DispenseTool, FANUC LASER DRILL,
KAREL, INSIGHT, INSIGHT II, PaintTool, PaintWorks,
PalletTool, SOCKETS, SOFT PARTS SpotTool,
TorchMate, and YagTool are Registered Trademarks of FANUC
Robotics.
FANUC Robotics reserves all proprietary rights, including but not
limited to trademark and trade name rights, in the following names:
AccuAir AccuCal AccuChop AccuFlow AccuPath
AccuSeal ARC Mate ARC Mate Sr.  ARC Mate System 1
ARC Mate System 2 ARC Mate System 3 ARC Mate System
4 ARC Mate System 5 ARCWorks Pro AssistTool
AutoNormal AutoTCP BellTool BODYWorks Cal Mate Cell
Finder Center Finder Clean Wall CollisionGuard
DispenseTool F-100 F-200i FabTool FANUC LASER
DRILL Flexibell FlexTool HandlingTool HandlingWorks
INSIGHT INSIGHT II IntelliTrak Integrated Process Solution
Intelligent Assist Device IPC -Integrated Pump Control IPD
Integral Pneumatic Dispenser ISA Integral Servo Applicator ISD
Integral Servo Dispenser Laser Mate System 3 Laser Mate
System 4 LaserPro LaserTool LR Tool MIG Eye
MotionParts NoBots Paint Stick PaintPro PaintTool 100
PAINTWorks PAINTWorks II PAINTWorks III PalletMate
PalletMate PC PalletTool PC PayloadID RecipTool
RemovalTool Robo Chop Robo Spray S-420i S-430i
ShapeGen SoftFloat SOF PARTS SpotTool+ SR Mate
SR ShotTool SureWeld SYSTEM R-J2 Controller SYSTEM R-
J3 Controller SYSTEM R-J3iB Controller TCP Mate
TurboMove TorchMate visLOC visPRO-3D visTRAC
WebServer WebTP YagTool

© FANUC LTD 2004

• No part of this manual may be reproduced in any form.
• All specifications and designs are subject to change without notice.
Conventions
This manual includes information essential to the safety of
personnel, equipment, software, and data. This information is
indicated by headings and boxes in the text.

! WARNING
Information appearing under WARNING concerns the
protection of personnel. It is boxed and in bold type to set it
apart from other text.

! CAUTION
Information appearing under CAUTION concerns the protection of
equipment, software, and data. It is boxed to set it apart from
other text.

NOTE Information appearing next to NOTE concerns related
information or useful hints.

In this manual we have tried to describe as many matters as possible.
However, we cannot describe everything that must not be done, or that
cannot be done, because the possibilities are too numerous.
Therefore, anything that is not explicitly described as possible in
this manual should be regarded as impossible.
Laser Safety

FANUC Robotics North America, Inc. is not, and does not represent itself as an expert in laser safety
systems, equipment, or the specific safety aspects of your company and/or its workforce. It is your
responsibility as the owner, employer, or user to take such steps as may be necessary to ensure the
safety of all personnel in the workplace.

You as the owner, employer, Laser Safety Officer (LSO), or user of laser systems, are obligated to
monitor current safety standards and be sure your safety procedures are in conformance with current
standards.

The appropriate level of safety for an installation can best be determined by safety professionals most
familiar with the particular application or installation. FANUC Robotics therefore recommends that
each customer consult with such professionals in order to provide a workplace that allows for the safe
application, use, and operation of FANUC Robotics laser systems.

Safety References

The following laser safety considerations touch briefly on the reasonable and adequate use of laser
systems. Refer to the American National Standards Institute, Inc. (ANSI) Standard For the Safe Use
of Lasers, ANSI Z136.1-2000 and Specifications for Accident Prevention Signs, ANSI Z35.1-1972 for
additional information on laser safety.

Note American National Standards are subject to periodic review and users are cautioned to obtain the
latest editions.

LASER SAFETY OFFICER


ANSI Z136.1, Section 1.3

The Laser Safety Officer (LSO) is an individual with the authority and responsibility to monitor and
enforce the control of laser hazards and to effect the knowledgeable evaluation and control of laser
hazards. The conditions under which the laser is used, the level of safety training of individuals using
the laser, and other environmental and personnel factors are important considerations in determining
the full extent of safety control measures. Since such situations require informed judgments by
responsible persons, it is recommended that you appoint and thereafter consult a Laser Safety Officer.

Recommended duties of the LSO are detailed in the For the Safe Use of Lasers manual, ANSI
Z136.1-2000, Section 1, plus additional information located in Sections 3, 4, and 5. See item in
the References section.


Laser Users’ Responsibility

All personnel who intend to operate, program, repair, or otherwise use the laser system should be
familiar with the safeguarding devices identified in this section. The intent of this section is to help
make you aware that the listed components exist in the laser system. It is not intended as an exhaustive
list or as a thorough explanation of the devices.

FANUC Robotics recommends that all individuals associated with the laser system be trained in
an approved FANUC Robotics training course and become familiar with the proper operation of
the laser system.

This chapter describes the laser system as it is originally equipped by FANUC Robotics and does not
cover modifications or reconstructions that might be performed by other individuals or companies.

Warning

CAUTION - Use of controls or adjustments or performance of procedures
other than those specified herein may result in hazardous radiation
exposure.

Note Refer to ANSI specifications and Occupational Safety and Health Administration (OSHA)
guidelines for further information.

Classification of Safety Systems

ANSI Z136.1, Section 3.3

Laser systems are classified according to their relative hazards:

• Class 1
• Class 2
• Class 3 (a and b)
• Class 4

LASER SYSTEM SAFEGUARDING DEVICES


ANSI Z136.1, Section 4.3 “Engineering Controls”

It is your responsibility as the owner, employer, LSO, or user to take such steps as might be necessary
to ensure that the laser system is installed in accordance with installation specifications. The means
and degree of safeguarding your laser system should correspond directly to the type and level of
potential hazards presented by your specific installation and application.


Note The safeguarding measures described in this section are intended to reflect current industry
standards, and therefore your LSO should monitor such standards for further safeguarding.

Safeguards include, but are not limited to the following:

• Access restriction
• Eye protection (refer to ANSI Z136.1, Section 4.6)
• Area controls
• Barriers, shrouds, beam stops, and so forth
• Administrative and procedural controls
• Education and training

Protective Housings

ANSI Z136.1, Section 4.3.1

A protective housing shall be provided for all classes of lasers or laser systems. The protective
housing may require interlocks and labels.

Since service personnel may remove protective housings, e.g., for alignment, special safety procedures
may be required and the use of appropriate eyewear is recommended.

Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.1. Refer to item 1 in
the References section.

Service Access Panels

ANSI Z136.1, Section 4.3.3

Portions of the protective housing that are intended to be removed from a laser or laser
system only by service personnel, and whose removal permits direct access to laser radiation
associated with a Class 3b or Class 4 laser or laser system, shall either:

• Be interlocked (fail-safe interlock not required), or
• Require a tool for removal and have an appropriate warning label on the panel.

If the interlock can be bypassed or defeated, a warning label with the appropriate indications shall be
located on the protective housing near the interlock. The label shall include language appropriate to
the laser hazard. The interlock design shall not permit the service access panel to be replaced with
the interlock bypassed or defeated.

Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.3. Refer to item 1 in
the References section.


Master Switch (Class 4)

ANSI Z136.1, Section 4.3.4

A Class 4 laser or laser system shall be provided with a master switch. This master switch shall effect
beam termination and/or system shutoff and shall be operated by a key, or by a coded access (such as
a computer code). The authority for access to the master switch shall be vested in the appropriate
supervisory personnel.

During periods of prolonged non-use (e.g. laser storage), the master switch shall be left in a disabled
condition (key removed or equivalent).

A single master switch on a main control unit shall be acceptable for multiple laser installations where
the operational controls have been integrated.

All energy sources associated with Class 3b or Class 4 lasers or laser systems shall be designed to
permit lockout/tagout procedures required by the Occupational Safety and Health Administration of
the U.S. Department of Labor.

Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.4. Refer to item 1 in
the References section.

Fully Open Beam Paths (Class 3b or Class 4)

ANSI Z136.1, Section 4.3.6.1

In applications of Class 3b or Class 4 lasers or laser systems where a beam path is unenclosed, a laser
hazard analysis shall be effected by the LSO to establish the NHZ if not furnished by the manufacturer.

The LSO will define the area where laser radiation is accessible at levels above the appropriate MPE,
or at levels that could impair visual performance (applicable only to visible radiation, i.e., 400-700
nm).

If variable beam divergences are possible, an NHZ determination may be required for more than one
beam divergence (i.e., minimum divergence/maximum divergence). The LSO shall ensure correct
control measures are in place for planned operating divergences prior to laser operation.

In some cases, the total hazard assessment may be dependent upon the nature of the environment,
the geometry of the application or the spatial limitations of other hazards associated with the laser
use. This may include, for example, localized fume or radiant exposure hazards produced during
laser material processing or surgery, robotic working envelopes, location of walls, barriers, or other
equipment in the laser environment.

Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.6. Refer to item 1 in
the References section.


Limited Open Beam Path (Class 3b or Class 4)

ANSI Z136.1, Section 4.3.6.2

In applications of Class 3b or Class 4 lasers or laser systems where the beam path is confined by
design to significantly limit the degree of accessibility of the open beam, a hazard analysis shall be
effected by the LSO to establish the NHZ if not furnished by the manufacturer. The analysis will
define the area where laser radiation is accessible at levels above the appropriate MPE and will define
the zone requiring control measures. The LSO shall establish controls appropriate to the magnitude
and extent of the accessible radiation.

Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.6. Refer to item 1 in
the References section.

Remote Interlock Connector (Class 3b or Class 4)

ANSI Z136.1, Section 4.3.7

A Class 3b laser or laser system should, and a Class 4 laser or laser system shall, be provided with a
remote interlock connector. The interlock connector facilitates electrical connections to an emergency
master disconnect interlock, or to a room, entryway, floor, or area interlock, as may be required
for a Class 4 controlled area.

When the terminals of the connector are open circuited, the accessible radiation shall not exceed
the appropriate MPE levels.

Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.7. Refer to item 1 in
the References section.

Beam Stop or Attenuator (Class 3b or Class 4)

ANSI Z136.1, Section 4.3.8

A Class 4 laser or laser system shall be provided with a permanently attached beam stop or attenuator.

The beam stop or attenuator shall be capable of preventing access to laser radiation in excess of the
appropriate MPE level when the laser or laser system output is not required, as in warm up procedures.

There are a few instances, such as during service, when a temporary beam attenuator placed over the
beam aperture can reduce the level of accessible laser radiation to levels at or below the applicable
MPE level. In this case, the LSO may deem that laser eye protection is not required.

Note For those lasers or laser systems that do not require a warm-up time, the main power switch
may be substituted for the requirement of a beam stop or attenuator.

Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.8. Refer to item 1 in
the References section.


Activation Warning Systems (Class 3b or Class 4)

ANSI Z136.1, Section 4.3.9.4

An alarm (for example, an audible sound such as a bell or chime), a warning light (visible through
protective eyewear), or a verbal “countdown” command for single pulse or intermittent operations
should be used with Class 3b, and shall be used with Class 4 lasers or laser systems during activation
or startup.

Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.3.9.4. Refer to item 1 in
the References section.

CLASS 4 LASER CONTROLLED AREA (Class 4)


ANSI Z136.1, Section 4.3.10

ANSI Z136.1, Section 4.3.10.2

A laser hazard analysis, including determination of the NHZ, shall be effected by the LSO. If the
analysis determines that the classification associated with the maximum level of accessible radiation
is Class 3b or Class 4, a laser controlled area shall be established and adequate control measures
instituted.

Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.3.10. Refer to item 1 in
the References section.

All Class 4 area or entryway safety controls shall be designed to allow both rapid egress by laser
personnel at all times and admittance to the laser controlled area under emergency conditions.

All personnel who require entry into a laser controlled area shall be appropriately trained, provided
with appropriate protective equipment, and shall follow all applicable administrative and procedural
controls.

Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.3.10.2. Refer to item 1 in
the References section.

Note The guidelines set forth below are intended to reflect current industry standards, and therefore
your LSO should monitor such standards for further safeguarding developments.

The Class 4 laser controlled area shall:

• Be controlled to permit lasers and laser systems to be operated only by personnel who have been
trained in the operation of the laser, laser system and laser safety.
• Be posted with the appropriate warning sign(s). An appropriate warning sign shall be posted at the
entryway(s) and, if deemed necessary by the LSO, should be posted within the laser controlled
area.


• Be operated in a manner such that the beam path is well defined and projects into a controlled
airspace when the laser beam must extend beyond an indoor controlled area, particularly to the
outdoors under adverse atmospheric conditions, e.g., rain, fog, or snow.
• Be under the direct supervision of an individual knowledgeable in laser safety.
• Be located so that access to the area by spectators is limited and requires approval.
• Have any potentially hazardous beam terminated in a beamstop of an appropriate material.
• Have only diffusely reflecting materials in or near the beam path, where feasible.
• Provide personnel within the laser controlled area with the appropriate eye protection.
• Have the laser secured such that the exposed beam path is above or below eye level of a person in
any standing or seated position, except as required for medical use.
• Have all windows, doorways, open portals, etc. from an indoor facility be either covered or
restricted in such a manner as to reduce the transmitted laser radiation to levels at or below
the applicable ocular MPE.
• Require storage or disabling (for example, removal of the key) of the laser or laser system when
not in use to prevent unauthorized use.

Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.3.10.1. Refer to item 1 in
the References section.

Temporary Laser Controlled Area (All Classes)

ANSI Z136.1, Section 4.3.12

In those conditions where removal of panels or protective housings, over-riding of protective housing
interlocks, or entry into the NHZ becomes necessary (such as for service), and the accessible laser
radiation exceeds the applicable MPE, a temporary laser controlled area shall be devised for the
laser or laser system.

Such an area, which by its nature will not have the built-in protective features as defined for a laser
controlled area, shall provide all safety requirements for all personnel, both within and outside the area.

A notice sign shall be posted outside the temporary laser controlled area to warn of the potential
hazard.

Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.3.12 for Class 4 laser
systems. Refer to item 1 in the References section.

Alignment Procedures

ANSI Z136.1, Section 4.4.5

Mechanical alignments are to be completed by your FANUC Robotics representative. Any alignments
performed by anyone other than FANUC Robotics personnel could void the warranty.


Warning

A significant ocular hazard could exist during the alignment procedure.
Be sure to take the appropriate precautions.

SAFEGUARDING PERSONNEL
FANUC Robotics recommends that all personnel potentially associated with the laser systems be
trained in an approved FANUC Robotics training course and become familiar with the proper
operation of the system.

Note Additional guidelines for general robotic safety are set forth in the robotic safety section of the
manual. You should consult these guidelines before operating a robot and robotic systems.

In addition to the robot safety guidelines set forth in this section, the following list of precautions
should be considered to help safeguard the teacher and operator of a laser system:

• Wear the necessary personal protective equipment. Refer to the Personal Protective Equipment
section for specifics.
• Do not watch laser operation without the proper protective eyewear as described in the Personal
Protective Equipment section. The robot warning indicator will turn ON just prior to laser
activation.
• Check that all laser safeguards are in place and functioning as intended.
• Make sure the work envelope is secured prior to initiating production operation. Securing the
area requires that all safety barriers are in place and all safety signals are active. Follow the
standard guidelines set forth in ANSI Z136.1-2000, for all safety barriers. Refer to item 1 in the
References section.

Your eyes could be exposed directly to the laser beam when the Maximum Permissible Exposure
(MPE) has been exceeded.

Protective Eyewear (Class 3b or Class 4)

ANSI Z136.1, Section 4.6.2

Eye protection devices which are specifically designed for protection against radiation from Class 4
lasers or laser systems shall be administratively required and their use enforced when engineering
or other procedural and administrative controls are inadequate to eliminate potential exposure in
excess of the applicable MPE level.

Laser protective eyewear may include goggles, face shields, spectacles, or prescription eyewear
using special filter materials or reflective coatings (or combination of both) to reduce the potential
ocular exposure below the applicable MPE level.


Laser protective eyewear shall be specifically selected to withstand either direct or diffusely scattered
beams. In this case, the protective filter shall exhibit a damage threshold for a specified exposure
time, typically 10 seconds. The eyewear shall be used in a manner so that the damage threshold is
not exceeded in the “worst case” exposure scenario. Flammability is an important factor in the
selection of laser protective eyewear.

Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.6.2. Refer to item 1 in
the References section.

Safeguarding Maintenance and Repair Personnel

Maintenance of the laser should follow procedures in the laser system documentation.

Where possible, perform maintenance without power to the robot and the laser. Use lockout and
tagout procedures as defined by your plant safety procedures, and release or block all stored energy,
such as air.

FANUC Robotics recommends that all repair operations be performed with the controller power
turned OFF and the attenuator bolted over the laser aperture.

Routine maintenance, such as cleaning, lens replacement, greasing gears, or changing air filters,
can be performed with the attenuator in place.

Wear the necessary personal protective equipment. Refer to the Personal Protective Equipment section
for specifics. Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.6. Refer
to item 1 in the References section.

PERSONAL PROTECTIVE EQUIPMENT


ANSI Z136.1, Section 4.6

It is your responsibility as the owner, employer, LSO, or user to take such steps as might be necessary
to ensure that potentially affected personnel are aware of and use available protective equipment.

Protective Eyewear

ANSI Z49.1, Section 4.2 and ANSI Z136.1, Section 4.6.2.7

All laser protective eyewear shall be clearly labeled with the optical density and wavelength for which
protection is afforded. Color coding or other distinctive identification of laser protective eyewear is
recommended in multi-laser environments. Refer to the Safeguarding Personnel section set forth in
ANSI Z49.1-1988, item 6 and ANSI Z136.1-2000, item 1 in the References section for specifics.

Physical and chemical hazards to the eye can be reduced by the use of face shields, goggles, and
similar protective devices.


Protective eyewear should be inspected periodically to ensure it is in satisfactory condition.

WARNING SIGNS AND LABELS


ANSI Z136.1, Section 4.

All warning signs and labels should be displayed and contain appropriate warning and cautionary
statements. A label should be provided and placed on the laser housing or control panel. However,
if the housing and control panel are separated by more than two meters, labels should be placed on
both the housing and control panel.

• The warning labels affixed to the laser housing and control panels should not under any
circumstances be removed.
• In the event either label becomes detached, it should be immediately reattached to the housing
or the control panel.

Follow the standard guidelines set forth in ANSI Z35.1-1972 (or latest revision thereof) and Federal
Register 21CFR, section 1040.10 (g); Warning Labels; certification 1010.2 and 1010.3. Refer to item
1 in the References section.

Design of Signs

ANSI Z136.1, Section 4.7.1

Sign dimensions, letter size and color, etc., shall be in accordance with American National Standard
Specification for Accident Prevention Signs, ANSI Z535 series (latest revision thereof).

Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.7.1. Refer to item 1 in
the References section.

Note Refer to the applicable maintenance or operations manual for a description of the signs and
labels used in your specific system.

Symbols

Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.7. Refer to item 1 in the
References section.

Note Classification labeling in accordance with the Federal Laser Product Performance Standard as
described in item 3 of the References section, can be used as a guideline for the labeling discussed
in this section.


Figure 1. Sample Warning Sign for Certain Class 3b and Class 4 lasers

Signal Words

ANSI Z136.1, Section 4.7.3

The following signal words are used with the ANSI Z535 design laser signs and labels:

• The signal word “Danger” shall be used with all signs and labels associated with all Class 3a
lasers and laser systems that exceed the appropriate MPE for irradiance and all Class 3b and Class
4 lasers and laser systems.
• The signal word “Caution” shall be used with all signs and labels associated with Class 2 laser
and laser systems and all Class 3a lasers and laser systems that do not exceed the appropriate
MPE for irradiance.
• The signal word “Notice” shall be used on signs posted outside a temporary laser controlled
area, for example, during periods of service.

Note When a temporary laser controlled area is created, the area outside the temporary area remains
Class 1, while the area within is either Class 3b or Class 4 and the appropriate danger warning is also
required within the temporary laser controlled area.


Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.7.3. Refer to item 1 in
the References section.

Location of Signs

ANSI Z136.1, Section 4.7.4.3

All signs shall be conspicuously displayed in locations where they best will serve to warn onlookers.

Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.7.4.3. Refer to item 1 in
the References section.

Figure 2. Sample Warning Sign for Temporary Controlled Area


NON-BEAM HAZARDS
ANSI Z136.1, Section 7

You as the owner, employer, LSO, or user of laser systems, are obligated to monitor current safety
standards and be sure your safety procedures are in conformance with current standards. It is your
responsibility to take such steps as might be necessary to inform personnel of possible hazards within
a laser system.

Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 7. Refer to item 1 in the
References section.

Electrical Hazards

ANSI Z136.1, Section 7.2

The use of lasers or laser systems can present an electric shock hazard. This may occur from contact
with exposed utility power utilization, device control, and power supply conductors operating at
potentials of 50 volts and above. These exposures can occur during laser set up or installation,
maintenance, modification, and service where equipment protective covers are often removed to
allow access to active components as required for those activities. Those exposed can be equipment
installers, users, technicians, and uninformed members of the public, such as passersby.

The effect upon those who accidentally come into contact with energized conductors at or above 50
volts can range from a minor “tingle,” to startle reaction, to serious personal injury, or death. Because
the pathways of current are all-pervasive, such as ground, it is not possible to characterize all the
parameters in any situation to predict the occurrence or outcome of an electric shock accident. Electric
shock is a very serious opportunistic hazard, and loss of life has occurred during electrical servicing
and testing of laser equipment incorporating high-voltage power supplies.

Protection against accidental contact with energized conductors by means of a barrier system is the
primary methodology to prevent electric shock accidents with laser equipment. Hazard warnings
and safety instructions extend the safety system to embody exposures caused by conditions of use,
maintenance, and service, and provide protection against the hazards of possible equipment misuse.

Additional electrical safety requirements are imposed upon laser devices, systems, and those
who work with them, by the United States Department of Labor, Occupational Safety and Health
Administration (OSHA), the National Electrical Code (NFPA 70), and related state and local laws
and regulations. These requirements govern equipment connection to the electrical utilization system,
electrical protection parameters, and specific safety training. These requirements must be observed
with all laser installations. The following potential electrical problems have frequently been identified
during laser facility audits:

• Uncovered electrical terminals
• Improperly insulated electrical terminals
• Hidden “power-up” warning lights


• Lack of personnel trained in current cardiopulmonary resuscitation practices, or lack of refresher
training
• “Buddy system” or equivalent safety measure not being practiced during maintenance and service
• Failure to properly discharge and ground capacitors
• Non earth-grounded or improperly grounded laser equipment
• Non-adherence to the OSHA lock-out standard (29 CFR 1910.147)
• Excessive wires and cables on floor that create fall or slip hazards

Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 7.2. Refer to item 1 in the
References section.

Fire Hazards

ANSI Z136.1, Section 7.5

Class 4 laser beams represent a fire hazard. Enclosure of Class 4 laser beams can result in potential
fire hazards if enclosure materials are likely to be exposed to irradiances exceeding 10 W/cm² or
beam powers exceeding 0.5 W. In some situations where flammable compounds or substances
exist, fires can also be initiated by Class 3 lasers. The LSO should encourage the use of
flame-retardant materials wherever applicable in all laser applications.

Opaque laser barriers, e.g., curtains, can be used to block the laser beam from exiting the work area
during certain operations. While these barriers can be designed to offer a range of protection, they
normally cannot withstand high irradiance levels for more than a few seconds without some damage,
e.g., production of smoke, open fire, or penetration. Users of commercially available laser barriers
should obtain appropriate fire prevention information from the manufacturer. Users can also refer to
the National Fire Protection Association standard NFPA 115 for further information on controlling
laser-induced fires.

Operators of Class 4 lasers should be aware that unprotected wire insulation and plastic
tubing can catch fire from intense reflected or scattered beams, particularly from lasers operating at
invisible wavelengths.

REFERENCES
The following list contains the safety guideline references that you may need to consult:

• 1 American National Standards Institute, Inc. (ANSI) Standard For the Safe Use of Lasers,
ANSI Z136.1-2000 and Specifications for Accident Prevention Signs, ANSI Z35.1-1972 (or
latest revisions thereof)
• 2 Connections and Maintenance Manual for FYSL


• 3 Federal Laser Product Performance Standard, Federal Register 21CFR, Part III; Vol 50, #161
- 8/20/85
• 4 American Conference of Governmental Industrial Hygienists
• 5 Polymeric Materials for use in Electric Equipment, Underwriters Laboratories Standard, UL
746C
• 6 American National Standards Institute, Inc. (ANSI) Standard Safety in Welding and Cutting,
ANSI Z49.1-1988; Chapter 4, Sections 4.2 and 4.3.
• 7 ANSI/RIA R15.06-1999 American National Standard for Industrial Robots and Robot Systems -
Safety Requirements
• 8 ANSI Z535 Safety Sign and Color Standards

Additional Safety Guidelines Reference Sources

Additional safety guidelines reference sources are as follows:

• Guidelines for Laser Safety and Hazard Assessment, Occupational Safety and Health
Administration (OSHA) Pub 8–1.7 8/19/91 (or latest revisions thereof).
• Compliance Guide for Laser Products, US Department of Health and Human Services, Food and
Drug Administration (FDA) 86-8260 9/85 (or latest revisions thereof).
• National Electrical Codes/Joint Industrial Council (NEC/JIC).
• Federal Register 21CFR, Part III; Vol 50, #161 - 8/20/85 Laser Products; Amendments to
Performance Standard; Final Rule, along with Labeling rules 1010.2 and 1010.3 (or latest
revisions thereof) and the FDA Department of Health and Human Services; Parts 1000 & 1040.
• Public Law 90-602, 90th Congress, H.R. 10790-10/18/68 (or latest revisions thereof).
• Reviewing the Federal Standard for Laser Products, Lasers & Optronics-3/88.
• JIC Electrical Standards for General Purpose Machine Tools; EGP-1-67
• Electrical Standards for Mass Production Equipment; EMP-1-67.
• NEC/NFPA ‘79 Electrical Code - These are the minimum electrical requirements with any
supplements from your local area.

B-81444EN/03 CONTENTS

CONTENTS
1 PREFACE ......................................................................................................................1
1.1 OVERVIEW OF THE MANUAL........................................................................................................................ 2
1.2 OUTLINE OF 3D LASER VISION SENSOR .................................................................................................... 4
1.2.1 3D Laser Vision Sensor System Configuration ................................................................................... 4
1.2.2 Safety of Laser Sensor.......................................................................................................................... 5
1.2.3 Laser Beam ........................................................................................................................................... 5
1.2.4 Warning Label ...................................................................................................................................... 5
1.3 3D LASER VISION SENSOR SETUP PROCEDURE....................................................................................... 8
2 SETUP .........................................................................................................................10
2.1 CONFIGURATION ........................................................................................................................................... 11
2.1.1 Personal Computer............................................................................................................................. 12
2.1.2 Frame Grabber Board ........................................................................................................................ 12
2.1.3 3D Laser Vision Sensor ...................................................................................................................... 12
2.1.4 Camera Adapter and Adapter Cable.................................................................................................. 12
2.1.5 Camera Cable ..................................................................................................................................... 13
2.1.6 I/O Board............................................................................................................................................. 13
2.1.7 PC I/O Cable ....................................................................................................................................... 13
2.1.8 Ethernet Cable.................................................................................................................................... 13
2.2 INSTALLATION SEQUENCE.......................................................................................................................... 14
2.3 DISPLAY ........................................................................................................................................................... 14
2.4 FRAME GRABBER BOARD ........................................................................................................................... 15
2.4.1 Installing Hardware ........................................................................................................................... 15
2.4.2 Installing the Driver Software........................................................................................................... 17
2.4.3 Uninstalling the Driver Software ...................................................................................................... 18
2.5 I/O BOARD ....................................................................................................................................................... 19
2.5.1 Installing Hardware ........................................................................................................................... 19
2.5.2 Installing the Driver Software........................................................................................................... 21
2.5.3 Uninstalling the Driver Software ...................................................................................................... 23
2.6 ETHERNET COMMUNICATION SOFTWARE .............................................................................................. 24
2.7 3D LASER VISION SENSOR........................................................................................................................... 25
2.7.1 Installing the 3D Laser Vision Sensor Software ............................................................................... 25
2.7.2 Uninstalling the 3D Laser Vision Sensor Software .......................................................................... 25
2.8 ETHERNET ....................................................................................................................................................... 27
2.8.1 Connecting Cables .............................................................................................................................. 27
2.8.2 Determining the IP Address............................................................................................................... 27
2.8.3 Making Settings for the Robot Side................................................................................................... 27
2.8.4 Making Settings for the 3D Laser Vision Sensor Side...................................................................... 28
2.8.5 Communication Test........................................................................................................................... 29
2.9 CONNECTING THE CAMERA ....................................................................................................................... 30
2.9.1 Setting the Camera ............................................................................................................................ 30
2.9.2 APC-3322A.......................................................................................................................................... 31
2.10 PROTECTOR..................................................................................................................................................... 35
3 3D LASER VISION SENSOR BASIC OPERATIONS ..................................................36
3.1 START 3D LASER VISION SENSOR.............................................................................................................. 37
3.2 EXIT 3D LASER VISION SENSOR................................................................................................................. 37
3.3 SELECT A CAMERA........................................................................................................................................ 37
3.4 SNAP A NEW IMAGE FROM A CAMERA .................................................................................................... 37
3.5 DISPLAY LIVE IMAGES ................................................................................................................................. 38
3.6 CHANGING THE SHUTTER SPEED MODE FOR SNAP AND LIVE .......................................................... 38
3.7 TURN ON THE LASER.................................................................................................................................... 39
3.8 ENLARGING AND REDUCING IMAGES ..................................................................................................... 39
3.9 MOVING IMAGES ........................................................................................................................................... 39
3.10 ERASING ALARMS ......................................................................................................................................... 40
3.11 PAINTING AN IMAGE WITH A RED PEN .................................................................................................... 40


3.12 STATUS OF FINDING ...................................................................................................................................... 40


4 VISION DATA...............................................................................................................41
4.1 VISION DATA TYPES ...................................................................................................................................... 42
4.2 3D MEASUREMENT AND 2D MEASUREMENT ......................................................................................... 43
4.3 LIST OF VISION DATA.................................................................................................................................... 44
4.4 BACKING UP VISION DATA.......................................................................................................................... 45
4.5 CREATING A VISION DATA LIST FILE ........................................................................................................ 45
5 ROBOT COMMUNICATION SETTING ........................................................................47
5.1 COMMUNICATION CONDITION .................................................................................................................. 48
5.2 CONNECTION STATUS................................................................................................................................... 49
5.3 COMMUNICATION TEST ............................................................................................................................... 49
5.4 AUTOMATIC CONNECTION ......................................................................................................................... 49
6 CALIBRATION .............................................................................................................51
6.1 CALIBRATION TYPES .................................................................................................................................... 52
6.2 3D MEASUREMENT CALIBRATION ............................................................................................................ 53
6.2.1 Setting of 3D Measurement Calibration ........................................................................................... 53
6.2.2 Detecting a Grid Pattern and a Laser ............................................................................................... 54
6.2.3 Calibration Data Detail Display ........................................................................................................ 56
6.2.4 Laser Slit Detail Display.................................................................................................................... 58
6.2.5 Setting Laser Slit Detection Parameters .......................................................................................... 58
6.2.6 Checking the Precision of Calibration ............................................................................................... 59
6.3 SIMPLE 2-POINT CALIBRATION .................................................................................................................. 62
6.4 GRID PATTERN CALIBRATION (2D MEASUREMENT)............................................................................. 65
6.4.1 Find Grid Pattern ............................................................................................................................... 66
6.4.2 Grid Pattern Frame............................................................................................................................ 67
6.4.3 Example of Setting a Grid Pattern Frame ........................................................................................ 69
6.5 SHOW DETAILS OF CAMERA CALIBRATION DATA ................................................................................ 70
6.6 SENSOR COORDINATE SYSTEM ................................................................................................................. 71
6.6.1 Sensor Coordinate System and Measurement Results..................................................................... 71
6.6.2 Setting a Sensor Coordinate System ................................................................................................. 72
6.7 SIMPLE RE-CALIBRATION............................................................................................................................ 79
6.7.1 Execution of Simple Re-calibration ................................................................................................... 79
6.7.2 Differences from 3D Measurement Calibration ................................................................................ 83

7 TEACHING AND TESTING THE LOCATION TOOL ...................................................84


7.1 TRAIN MODEL ................................................................................................................................................ 85
7.2 MODIFYING LOCATION TOOL MODEL...................................................................................................... 88
7.2.1 Model Display ..................................................................................................................................... 89
7.2.2 Setting a Reference Point................................................................................................................... 90
7.3 ADJUST LOCATION TOOL PARAMETER .................................................................................................... 91
7.4 OPTIONAL FUNCTION OF LOCATION TOOL ............................................................................................ 93
7.4.1 Extra care region ................................................................................................................................ 94
7.4.2 Statistic Function ............................................................................................................................... 95
7.5 TESTING LOCATION TOOL........................................................................................................................... 97
8 TEACHING AND EXECUTION THE 2D VISION PROCESS .....................................100
8.1 SETUP VISION PROCESS ............................................................................................................................. 101
8.2 TEST VISION PROCESS................................................................................................................................ 104
8.2.1 Use BMP Files .................................................................................................................................. 105
8.3 EXECUTION BY ROBOT CONTROLLER................................................................................................... 106
9 TEACHING AND TESTING THE 3D VISION PROCESS ..........................................108
9.1 TYPES OF 3D MEASUREMENT FUNCTIONS ........................................................................................... 109
9.2 DISK MEASUREMENT ................................................................................................................................. 110


9.2.1 Information Obtained....................................................................................................................... 110


9.2.2 Settings ..............................................................................................................................................111
9.2.3 Teaching Procedure .......................................................................................................................... 122
9.3 DISPLACEMENT MEASUREMENT ............................................................................................................ 126
9.3.1 Information Obtained....................................................................................................................... 126
9.3.2 Settings ............................................................................................................................................. 127
9.3.3 Teaching Procedure .......................................................................................................................... 137
9.4 EDGE MEASUREMENT................................................................................................................................ 140
9.4.1 Information Obtained....................................................................................................................... 140
9.4.2 Settings ............................................................................................................................................. 141
9.4.3 Teaching Procedure .......................................................................................................................... 152
9.5 CROSS SECTION MEASUREMENT ............................................................................................................ 156
9.5.1 Information Obtained....................................................................................................................... 156
9.5.2 Settings ............................................................................................................................................. 157
9.5.3 Teaching Procedure .......................................................................................................................... 166
9.6 VISION PROCESS TEST EXECUTION........................................................................................................ 171
9.6.1 Continuous Run................................................................................................................................ 173
9.6.2 Use BMP Files .................................................................................................................................. 173
9.7 EXECUTION BY ROBOT CONTROLLER................................................................................................... 174
9.8 SHOWING BRIGHTNESS DISTRIBUTION ................................................................................................ 176
10 CREATION AND EXECUTION OF A ROBOT PROGRAM........................................178
10.1 CORRECTING THE POSITION AND POSTURE FOR HOLDING A WORKPIECE................................. 179
10.1.1 Creating a Robot Program ............................................................................................................... 180
10.1.2 Executing a Robot Program ............................................................................................................. 182
10.2 OTHER APPLICATION EXAMPLES ............................................................................................................ 184
11 OPTIONS ...................................................................................................................185
11.1 OPTION SETTING.......................................................................................................................................... 186
11.2 ETHERNET SETTING.................................................................................................................................... 188
11.3 3DV OPTION SETTING................................................................................................................................. 189
APPENDIX

A ERROR CODES.........................................................................................................193

B CALIBRATION JIG ....................................................................................................195

C ROBOT POSITION COMPENSATION ......................................................................197

D RESTRICTIONS WITH A RAIL AXIS.........................................................................198


1 PREFACE
This chapter provides an overview of this manual, an outline of the
FANUC 3D Laser Vision Sensor, and safety precautions that should
be noted before operating the robot.


1.1 OVERVIEW OF THE MANUAL

Overview
The "FANUC VISION V-500iA/3DL (3D Laser Vision Sensor)
Operator's Manual" describes how to operate the 3D Laser Vision
Sensor controlled by the R-J3 or R-J3iB control unit.
This manual explains only the operation and programming techniques
for the dedicated sensor functions, assuming that the installation
and setup of the robots equipped with the 3D Laser Vision Sensor
have been completed. Refer to the "HANDLING TOOL Operations
Manual" for other operations of FANUC robots.

CAUTION
This manual is based on 3D Laser Vision Sensor
software version 1.02.0118. Note that, depending on
the version, functions and setting items not
described in this manual may be available, or some
notation may differ.

Contents of this manual


Chapter 1: How to use this manual. An outline of the 3D Laser Vision Sensor and its safety
precautions. Be sure to read the safety precautions.
Chapter 2: Configuration of the 3D Laser Vision Sensor. The method and procedure for setup:
connecting the hardware and installing the software.
Chapters 3 to 11: Basic operation of the 3D Laser Vision Sensor, teaching the measurement
functions, and all kinds of setup.
Appendix: List of error codes. Calibration jig. Robot position compensation methods.
Restrictions imposed when the 3D Laser Vision Sensor is installed on a robot with a rail axis.


Related manuals
The following manuals are available for the 3D Laser Vision Sensor.

• HANDLING TOOL Operations Manual (R-J3iB Controller), B-81464EN-2
  Topics: robot functions, operations, programming, interfaces, alarms
  Use: application design, robot installation, teaching, adjustment
• FANUC VISION V-500iA/3DL Operator's Manual (this manual), B-81444EN
  Topics: 3D Laser Vision Sensor function, operation, programming, alarms
  Use: teaching, adjustment
• R-J3iB Maintenance Manual, B-81465EN
  Topics: installation and set-up, connection to peripheral equipment, maintenance of the system
  Use: installation, start-up, connection, maintenance
• Mechanical Unit Force Sensor / 3D Laser Vision Sensor Maintenance Manual, B-81155EN
  Topics: connection of the sensors, robot, and control devices; maintenance of the sensors
  Use: connection and maintenance of the sensors


1.2 OUTLINE OF 3D LASER VISION SENSOR

This section describes the system configuration of the 3D Laser
Vision Sensor and precautions to be taken before using it.

1.2.1 3D Laser Vision Sensor System Configuration

System configuration
The 3D Laser Vision Sensor system consists of a 3D Laser Vision
Sensor, the robot mechanical unit, and the robot controller.
Peripheral equipment and external control equipment can also be
added.

[Figure: a 3D Laser Vision Sensor mounted on the robot mechanical unit, which is connected to the robot controller]

Fig. 1.2.1 3D Laser Vision Sensor system configuration

Sensors and functions

Sensor: 3D Laser Vision Sensor
Function: By irradiating a visible laser beam onto a workpiece, the sensor detects the position
and posture of the workpiece.
Function as a robot system: The sensor is installed, for instance, around the wrist of a robot
mechanical unit. It detects the position and posture of a workpiece in a coordinate system
related to the mechanical unit. The motion of the robot is compensated based on the difference
between the present position and posture of the workpiece (actual) and the reference values
measured beforehand (nominal).


1.2.2 Safety of Laser Sensor


The 3D Laser Vision Sensor is a visual sensor that detects the position
and posture of an object using a semiconductor laser.

CAUTION
When using this sensor, implement safety and fire
precautions in accordance with the safety standards
and regulations of your country and region.
When these standards and regulations are changed or
newly enacted, follow the updated versions.

The laser classification used in the sensor


Semiconductor laser: Class IIIa laser (cf. FDA 1040.10) /
Class 3R laser (cf. IEC 60825 / JIS C6802)

1.2.3 Laser Beam


The semiconductor laser beam is a visible optical laser with a
wavelength of 650 nm. Although the maximum output power is at most
4.5 mW × 2, care is still required during operation. Do not direct
the output beam from the sensor into your eyes.
Moreover, do not look straight at scattered light for a long time.

1.2.4 Warning Label


A warning label indicating the danger of laser beam irradiation is
affixed to this laser sensor. In addition, a warning label
conforming to the United States FDA standard is available as an
option.
Figs. 1.2 and 1.3 show the warning labels used. Prepare and affix
the warning labels to safety fences or other surrounding equipment.


Fig. 1.2 Warning labels (1)


Fig. 1.3 Warning labels (2)


1.3 3D LASER VISION SENSOR SETUP PROCEDURE

The final purpose of an application that uses the 3D Laser Vision
Sensor is to have the sensor measure the position and posture of a
workpiece so that the robot can adapt its operation as the position
and posture of the workpiece change. This section briefly describes
the procedure for setting up the 3D Laser Vision Sensor.

Installing and connecting 3D vision sensor

Install and connect the 3D Laser Vision Sensor on the floor.
For details, see Chapter 2, "SETUP".

Teaching 3D vision sensor

Four types of 3D Laser Vision Sensor teaching data are available:
robot data, location tool data, camera setup data, and vision process
data. Setting these types of data means teaching the 3D Laser Vision
Sensor.
For details of each type of data, see Chapter 4, "VISION DATA".
For setup from the beginning, set these types of data in the order
described below.

Robot setting
Set a robot controller to be connected with the 3D Laser Vision
Sensor.
For details, see Chapter 5, "ROBOT COMMUNICATION
SETTING".

Camera setup

Perform calibration of the main unit of the 3D Laser Vision Sensor,
or of a camera connected at the same time. Calibration sets the
optical parameters of the 3D Laser Vision Sensor and camera, and the
positional relationships of the 3D Laser Vision Sensor and camera
with the robot.
For details, see Chapter 6, "CALIBRATION".

Location tool creation

Set the 2D features of a workpiece. The 3D Laser Vision Sensor can
also use the camera, which serves as the laser beam receiver, for 2D
image processing. By combining 3D information obtained from the
laser beam receiver with 2D information obtained from the camera,
measurement with high precision and robustness is achieved.
This location tool is created for measurements that require 2D image
processing.
For details, see Chapter 7, "TEACHING AND TESTING THE
LOCATION TOOL".


Vision process creation

Based on the camera setup and location tool settings mentioned above,
a vision process is created for an actual measurement. A vision
process for 2D measurement and a vision process for 3D measurement
are created separately.
For details, see Chapter 8, "TEACHING AND EXECUTION THE 2D
VISION PROCESS", and Chapter 9, "TEACHING AND TESTING
THE 3D VISION PROCESS".

Robot program creation

At a minimum, the following need to be taught to the robot: an
instruction for starting the 3D Laser Vision Sensor, a position for
the robot to measure a workpiece placed at the reference position,
and positions for work such as retrieving the workpiece.
For details, see Subsection 10.1.1, "Creating a Robot Program",
where a description is provided using a typical example.

Robot program execution

A created robot program is executed in one of two modes: nominal execution mode and actual execution mode. In nominal execution, a robot program is executed with a workpiece placed at a reference position. In actual execution, a robot program is executed in the usual state, where a workpiece is placed at an arbitrary position.
In nominal execution, a measurement is made with a workpiece placed at the reference position, and the result of the measurement is not erased but is preserved. This preserved measurement result is referred to as "nominal data".
In actual execution, compensation data is created from the result of measuring a workpiece placed at an arbitrary position and the "nominal data". By using this compensation data, the robot position taught with a workpiece at the reference position is converted to data that matches the current position of the workpiece. As a result, the robot can make a movement that matches a workpiece placed at an arbitrary position.
For details, see Subsection 0, "Executing a Robot Program". A description is provided using a typical example.
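As a simplified illustration of how compensation data works, the sketch below computes a planar (2D) offset from a nominal and an actual measurement and applies it to a taught position. The values and function names are hypothetical; the actual sensor computes full 3D position and posture.

```python
import math

def compensation(nominal, actual):
    """Offset (dx, dy, dtheta) that maps the nominal workpiece pose
    (x, y, angle) onto the actually measured pose."""
    return (actual[0] - nominal[0],
            actual[1] - nominal[1],
            actual[2] - nominal[2])

def apply_offset(taught, nominal, offset):
    """Convert a taught robot position so that it matches the current
    workpiece: rotate the taught point about the nominal workpiece
    position by dtheta, then translate it by (dx, dy)."""
    dx, dy, dth = offset
    px, py = taught[0] - nominal[0], taught[1] - nominal[1]
    c, s = math.cos(dth), math.sin(dth)
    return (nominal[0] + c * px - s * py + dx,
            nominal[1] + s * px + c * py + dy)

nominal_pose = (100.0, 50.0, 0.0)          # workpiece at the reference position
actual_pose = (110.0, 45.0, math.pi / 2)   # workpiece at an arbitrary position
taught_point = (120.0, 50.0)               # robot position taught at the reference

off = compensation(nominal_pose, actual_pose)
print(apply_offset(taught_point, nominal_pose, off))
```

The taught point here ends up shifted and rotated along with the workpiece, which is the effect the compensation data produces in the robot program.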

-9-
B-81444EN/03 2.SETUP

2 SETUP
This chapter describes the following operating procedures, which are
necessary to set up the 3D Laser Vision Sensor system.

- Installing the hardware and software in a personal computer (PC)
- Connecting the 3D Laser Vision Sensor to peripheral equipment, such as the PC and cables
- Setting up parameters for both the robot and the 3D Laser Vision Sensor


2.1 CONFIGURATION

The 3D Laser Vision Sensor system consists of the devices shown below.

[Figure: system configuration. The R-J3iB robot controller is connected to the PC by a 100 BaseT Ethernet cable, and to the I/O board in the PC by a PC I/O cable. The 3D Laser Vision Sensor is connected through the sensor cable to the camera adapter, and the adapter cable connects the camera adapter to the frame grabber board in the PC. A camera may also be connected.]

The following table lists the necessary devices and their quantities.

 No. Device                   Quantity
  1  Personal computer        1 set
  2  Frame grabber board      1 set
  3  3D Laser Vision Sensor   1 each
  4  Camera adapter           1 each
  5  Camera cable             1 set
  6  Adapter cable            1 each
  7  Sensor cable             1 set
  8  I/O board                1 set
  9  PC I/O cable             1 each
 10  Ethernet cable           1 each or more
                              (depending on the network configuration)

The following subsections describe the role of each device, its requirements, and recommended models in detail. We have confirmed that each of the recommended models works normally, but we cannot guarantee them as products. For details of their specifications and price quotations, contact the respective vendors or manufacturers.


2.1.1 Personal Computer


The personal computer (PC) controls the 3D Laser Vision Sensor,
receives video images from it, processes them, detects an object
previously taught, and notifies the robot controller of the position of
the object and other detection results.
CPU : Pentium III / Pentium 4
(Neither Celeron nor AMD is usable.)
Memory : 128 Mbytes or more
OS : Windows 2000 ( Japanese version or U.S. version )
Interface : PCI bus × 2 (for installing the frame grabber board and I/O
board)
100 BASE-T Ethernet port

2.1.2 Frame Grabber Board


The frame grabber board receives video images from the 3D Laser
Vision Sensor and transfers them to the PC. It is installed on the PCI
bus in the PC.
Recommended model : APC-3322A from AVAL DATA
Quantity : 1 set

2.1.3 3D Laser Vision Sensor


The 3D Laser Vision Sensor measures the three-dimensional position and posture of an object. When the 3D Laser Vision Sensor is mounted on the wrist of the robot, as is usually done, the cables for connecting the sensor head of the 3D Laser Vision Sensor to the robot connection section are provided as a set.
Quantity : 1 each

2.1.4 Camera Adapter and Adapter Cable


The camera adapter feeds power to the camera built into the 3D Laser
Vision Sensor.
The adapter cable is used to make a connection between the camera
adapter and frame grabber board. An image signal is sent from the
camera adapter, and a synchronization signal is sent from the frame
grabber board. The usable adapter cable depends on the type of
camera adapter.
Recommendation : Camera adapter: DC-700 from SONY
Adapter cable : CBL-Z015-005 from AVAL DATA
or,
Camera adapter : APU-332 from AVAL DATA
Adapter cable : CBL-Z036 from AVAL DATA
Quantity : One camera adapter, one adapter cable


2.1.5 Camera Cable


The camera cable is used to make a connection between the robot
connection section (on the backward portion of the J1 base) of the 3D
Laser Vision Sensor and the camera adapter. When the 3D Laser
Vision Sensor is used on a fixed-installation basis, a direct connection
can be made between the sensor head of the 3D Laser Vision Sensor
and the camera adapter. The camera sends video signals. The camera
adapter sends power and synchronization signals.
Recommended model : CCXC-12P10N from SONY or equivalent
Quantity : 1 set

2.1.6 I/O Board


The I/O board outputs control signals from the PC to the 3D Laser
Vision Sensor and inputs status signals from the 3D Laser Vision
Sensor to the PC.
Recommended model : PIO-16/16L(PCI)H or PIO-16/16L(PCI) from
CONTEC
Quantity : 1 set

2.1.7 PC I/O Cable


The I/O cable connects the robot controller to the PC. It transfers 3D
Laser Vision Sensor control and status signals.
Quantity : 1 each

2.1.8 Ethernet Cable


The Ethernet cable connects the robot controller to the PC to enable
Ethernet communication.
Recommended model : NW05H-484RA-10M-BK from MISUMI or
equivalent
(This product is a max. 200-Mbps cross-type, twisted-pair, shielded
cable. If a hub unit intervenes between the robot controller and the
PC, use a straight-type cable instead of a cross type.)
Quantity : 1 each


2.2 INSTALLATION SEQUENCE

The frame grabber board, I/O board, Ethernet communication software, and the 3D Laser Vision Sensor software must be installed in the predetermined sequence.
First install the frame grabber board, I/O board, and Ethernet communication software, then install the 3D Laser Vision Sensor software.
The frame grabber board, I/O board, and Ethernet communication software can be installed in any sequence.
If the frame grabber board and the I/O board are installed at the same time, however, it becomes difficult to identify which board is being installed. So, install one board at a time.

2.3 DISPLAY

To display video images received from the 3D Laser Vision Sensor normally, the color palette of the PC must be set to 65536 colors or more and the desktop area to 1024 × 768 pixels or more.

(1) Click the right button of the mouse (right-click the mouse) on the
desktop at a portion where nothing is displayed.
(2) Select [Properties]. The [Display Properties] dialog box
appears.
(3) Select the [Settings] tab.
(4) Click the [Colors] button and [Screen area] button. A list of
modes appears. Select, from the list, a mode in which the color
palette supports 65536 colors or more, the desktop area supports
1024 × 768 pixels or more, and the refresh rate that matches the
specification of the monitor used with the PC.
(5) Click the OK button to apply the setting and close the dialog
box.


2.4 FRAME GRABBER BOARD

2.4.1 Installing Hardware


The frame grabber board APC-3322A from AVAL DATA
is used for the 3D Laser Vision Sensor. Attach the frame grabber
board to the PCI bus of the PC according to the following procedure.
(1) Do not change the settings of the jumper pins and status switches
on the frame grabber board because their factory defaults are
appropriate.
(2) Make sure that the PC is powered off, insert the frame grabber
board into the PCI bus slot of the PC and then fasten it with the
panel retaining screws.
(3) Start the personal computer after installing the frame grabber board. When Windows starts up, new hardware detection is performed automatically (Note).
(4) Insert the CD-ROM for the 3D Laser Vision Sensor software
installation into the CD-ROM drive of the personal computer.
(5) On the screen below, click Next >.

(6) On the screen below, choose “Search for a suitable driver for my
device [recommended]” then click Next >.


(7) On the screen below, check “CD-ROM drives” then click Next
>.

(8) On the screen below, click Next >.

(9) On the screen below, click Finish. This completes the


installation.

For details such as the settings of the status switches and jumper pins on the frame grabber board, and insertion into the PCI bus slot, refer to the operator's manuals of the frame grabber board and personal computer.

NOTE
If new hardware detection is not performed
automatically, choose [Start] → [Settings] → [Control
Panel] → [Add/Remove Hardware].

2.4.2 Installing the Driver Software


Using the frame grabber board requires installing its driver. The
installation procedure follows:
(If an earlier version of the driver you are going to install is already in
the PC, uninstall it.)

Preparatory work for installation


Before starting to install the driver software, make the following
preparations.
(1) Before installing the driver, make sure that the frame grabber board is already installed on the PCI bus of the PC. If no frame grabber board is installed, install one according to Subsection 2.4.1, "Installing Hardware".
(2) Start the PC.
(3) If "Plug & Play OS" is available as a BIOS setting, make sure that it is set to "NO/Disable". If you installed the driver with "Plug & Play OS" set to "YES/Enable", uninstall it according to Subsection 2.4.3, "Uninstalling the Driver Software", re-set "Plug & Play OS" to "NO/Disable", then install the driver again. If the driver is installed with "Plug & Play OS" set to "YES/Enable", it may become impossible to allocate memory to the frame grabber board.
(4) If "PCI Bus Master" is available as a BIOS setting, make sure that it is set to "YES/Enable". If it is set to "NO/Disable", re-set it to "YES/Enable"; otherwise, the DMA transfer (used to transfer video images from the frame grabber board to the PC) may not work.

Installing
To install the driver software, follow this procedure:
(1) Get a 3D Laser Vision Sensor software installation CD-ROM and
insert it into the CD-ROM drive of the PC.
(2) Execute SETUP.EXE in the folder D:\AvalData*** from "Run…" (D:\ indicates the name of the CD-ROM drive and *** indicates the version of the driver software.)
(3) Install the software as directed in the menu that appears.

Checking driver operation


To check whether the driver is operating normally, use the method described below.


(1) Choose [Control Panel] → [System] → [Hardware] → [Device Manager], then find the following location ("Aip" ~ "AVALDATA APC-332 Series (Image Capture Module)"):

(2) Right-click then choose [Properties]. The screen below appears.


Check that the driver is operating normally.

2.4.3 Uninstalling the Driver Software


If you need to uninstall the driver software, follow this procedure:
(1) To remove the driver from the system, select [Control Panel] →
[Add/Remove Programs] → [AVALDATA APC-332 Series
Driver / Library] and click the Add/Remove button.


2.5 I/O BOARD

2.5.1 Installing Hardware


The I/O board PIO-16/16L(PCI)H or PIO-16/16L(PCI) from
CONTEC is used for the 3D Laser Vision Sensor. Attach the I/O
board to the PCI bus of the PC and install the device driver according
to the following procedure.
(1) Do not change the settings of the jumper pins on the I/O board
because their factory defaults are appropriate.
(2) Make sure that the PC is powered off, then insert the I/O board
into the PCI bus slot of the PC, and fasten it with the panel
retaining screws.
(3) Start the personal computer after installing the I/O board. When
Windows starts up, new hardware detection is automatically
started (Note).
(4) Insert the CD-ROM for 3D Laser Vision Sensor software
installation into the CD-ROM drive of the personal computer.
(5) On the screen below, click Next >.

(6) On the screen below, choose "Search for a suitable driver for my
device [recommended]" then click Next >.


(7) On the screen below, check “CD-ROM drives” then click Next
>.

(8) On the screen below, click Next >.

(9) On the screen below, click Finish. This completes the


installation.


Refer to the applicable I/O board and PC operator's manuals for details
of each step of the setup procedure, such as setting the jumper pins on
the I/O board and attaching the I/O board to the PCI bus.

NOTE
If new hardware detection is not performed
automatically, choose [Start] → [Settings] → [Control
Panel] → [Add/Remove Hardware].

2.5.2 Installing the Driver Software


Using the I/O board requires installing the I/O board driver. The
installation procedure follows:
(If an earlier version of the driver you are going to install is already in
the PC, uninstall it.)

Preparatory work for installation


Before starting to install the driver software, make the following
preparations.
(1) Before installing the driver, make sure that the I/O board is already installed on the PCI bus of the PC. If no I/O board is installed, install one according to Subsection 2.5.1, "Installing Hardware".
(2) Start the PC.
(3) If "Plug & Play OS" is available as a BIOS setting, make sure that it is set to "NO/Disable". If you installed the driver with "Plug & Play OS" set to "YES/Enable", uninstall it according to Subsection 2.5.3, "Uninstalling the Driver Software", re-set "Plug & Play OS" to "NO/Disable", then install the driver again. If the driver is installed with "Plug & Play OS" set to "YES/Enable", it may become impossible to allocate memory to the I/O board.
(4) If "PCI Bus Master" is available as a BIOS setting, make sure that it is set to "YES/Enable". If it is set to "NO/Disable", re-set it to "YES/Enable"; otherwise, the DMA transfer (used to transfer data from the I/O board to the PC) may not work.

Installing
To install the driver, follow this procedure:
(1) Get a 3D Laser Vision Sensor software installation CD-ROM and
insert it into the CD-ROM drive of the PC.
(2) Execute SETUP.EXE in the folder D:\CONTEC API-DIO *****\Disk1 from "Run…" (D:\ indicates the name of the CD-ROM drive and ***** indicates the version of the driver software.)
(3) Install the software as directed in the menu that appears.
(4) Upon completion of driver installation, choose [Start] →
[Programs] → [CONTEC API-PIC(W32)] → [API-TOOL
Configuration]. The board is automatically recognized. The
screen below shows an example of how the board is recognized.


If the board is not automatically recognized, choose [Edit] → [Add Board…]. Next, set "PIO-16/16L(PCI)H" (or "PIO-16/16L(PCI)") in [Board Name] then click the OK button.

(5) Choose [File] → [Exit].


(6) When the dialog box [Save current settings to the registry?] is
displayed, choose Yes. This completes the installation.


Checking driver operation


To check whether the driver is operating normally, use the method described below.
(1) Choose [Control Panel] → [System] → [Hardware] → [Device Manager], then find the following location ("Multifunction adapters" ~ "CONTEC Co.Ltd-PIO-16/16L(PCI)H" (or "CONTEC Co.Ltd-PIO-16/16L(PCI)")):

(2) Right-click then choose [Properties]. The screen below appears.


Check that the driver is operating normally.

2.5.3 Uninstalling the Driver Software


If you need to uninstall the driver, follow this procedure:
To remove the driver from the system, select [Control Panel] → [Add/Remove Programs] → [API-DIO (98/PC) NT***] and click the Add/Remove button. (*** indicates the version of the driver.)


2.6 ETHERNET COMMUNICATION SOFTWARE

The Ethernet communication software is necessary to use Ethernet communication between the robot controller and the 3D Laser Vision Sensor. To install the software, follow this procedure.
(If an earlier version of the software you are going to install is already
in the PC, uninstall it.)
(1) Choose [Settings] → [Control Panel] → [Network and Dial-up
Connections] → [Local Area Connection] → [Properties], then
check that [Internet Protocol(TCP/IP)] is displayed. If it is not
displayed, install the Internet protocol. Refer to the applicable
PC operator's manual for explanations about how to install.
(2) Get a 3D Laser Vision Sensor software installation CD-ROM and
insert it into the CD-ROM drive of the PC.
(3) Execute SETUP.EXE in the folder D:\EthernetComm *** from "Run…" (D:\ indicates the name of the CD-ROM drive and *** indicates the version of the Ethernet communication software.)
(4) Install the software as directed in the menu that appears.


2.7 3D LASER VISION SENSOR

This section explains how to install the 3D Laser Vision Sensor software. If you are going to install it in your PC for the first time, follow Subsection 2.7.1, "Installing the 3D Laser Vision Sensor Software".
When you are going to upgrade the 3D Laser Vision Sensor software
in the PC, first uninstall the current version from the PC according to
Subsection 2.7.2, “Uninstalling the 3D Laser Vision Sensor Software”,
then install the new version according to Subsection 2.7.1, “Installing
the 3D Laser Vision Sensor Software”.

2.7.1 Installing the 3D Laser Vision Sensor Software


The procedure for installing the 3D Laser Vision Sensor software
follows:
(If an earlier version of the software you are going to install is already
in the PC, uninstall it.)
(1) Get a 3D Laser Vision Sensor software installation CD-ROM and
insert it into the CD-ROM drive of the PC.
(2) Execute SETUP.EXE in the folder D:\3DV*** from "Run…" (D:\ indicates the name of the CD-ROM drive and *** indicates the version of the 3D Laser Vision Sensor software.)
(3) Install the software as directed in the menu that appears.

2.7.2 Uninstalling the 3D Laser Vision Sensor Software


The procedure for uninstalling the 3D Laser Vision Sensor software
follows:
(1) Select [Control Panel] → [Add/Remove Programs]. The [Add/Remove Programs] menu appears.
(2) Choose the Change or Remove Programs button. Next, choose [FANUC 3D VISION] from the list of installed applications, then click the Change/Remove button.
(3) Uninstall the software as directed by the displayed menus. If the question "Are you sure you want to remove the shared file?" appears as indicated below, choose Yes To All.


Taught data and preserved images are not deleted by the uninstallation operation. However, it is recommended that you back up these data and images for safety before starting the uninstallation.


2.8 ETHERNET

2.8.1 Connecting Cables


To use Ethernet communication between the robot controller and the 3D Laser Vision Sensor, connect the robot controller and the PC with an Ethernet cable.
- On the robot controller, attach one end of the Ethernet cable to the connector labeled CD38 on the front of the MAIN board.
- On the PC, attach the other end of the Ethernet cable to the connector indicated with:

2.8.2 Determining the IP Address


Using Ethernet communication requires setting up an IP address for
each of the robot controller and the PC to which the 3D Laser Vision
Sensor is connected.
Basically, the network administrator determines the IP addresses.
To connect the robot controller and 3D Laser Vision Sensor on a
one-to-one basis, for example, set the IP addresses as stated below.
- Controller   172.16.0.1
- Router       172.16.0.2 (No need to set if no router is in use.)
- PC           172.16.0.3
- Subnet mask  255.255.0.0
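As a quick sanity check, the example addresses above can be verified to lie on the same subnet with a few lines of Python. This is an illustrative aid only, not part of the FANUC setup procedure.

```python
import ipaddress

# Example one-to-one configuration from this subsection.
network = ipaddress.ip_network("172.16.0.0/255.255.0.0")
robot = ipaddress.ip_address("172.16.0.1")
pc = ipaddress.ip_address("172.16.0.3")

# Both hosts must lie inside the same subnet to communicate directly.
print(robot in network, pc in network)
```

If either address fell outside the network defined by the subnet mask, direct communication between the controller and the PC would fail even with correct cabling.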

2.8.3 Making Settings for the Robot Side


To make settings for the robot controller, follow this procedure.
(1) Click the Menu button, and select "SETUP".
(2) Click "TYPE" (F1), and select "Host Comm".
(3) Select "TCP/IP”.

(4) Set a name for the controller in "Robot name". (Any name, for
example "a," is acceptable.)
(5) Set the IP address of the controller in "Robot IP addr".
(6) Set up a subnet mask in "Subnet Mask".
(7) Set the name and IP address of the controller, the name and IP address of the router (if a router is used), and the name and IP address of the PC to which the 3D Laser Vision Sensor is connected in "Host Name (LOCAL)" and "Internet Address".
(8) Turn the power for the robot controller off and on again.


2.8.4 Making Settings for the 3D Laser Vision Sensor Side


To make settings for the 3D Laser Vision Sensor, follow this
procedure.
(1) Choose [Control Panel] → [Network and Dial-up Connections].
(2) Right-click on [Local Area Connection] in the opened window
then choose [Properties].

(3) Choose [Internet Protocol (TCP/IP)] → [Properties].


(4) Check [Use the following IP address].
(5) Enter the IP address and subnet mask.
(6) Enter a default gateway if necessary.
(7) Click the OK button.

(8) Turn the PC off and on again.



2.8.5 Communication Test


After attaching the Ethernet cable and making settings for both the
robot and the 3D Laser Vision Sensor, check to see whether it is
possible to make connections based on Ethernet according to the
following procedure.
(1) At the command prompt (started by choosing [Start] → [Programs] → [Accessories] → [Command Prompt]), enter the following command:
    ping <IP-address-of-the-robot>
    Example: ping 172.16.0.1
(2) When a connection is made successfully, the following message appears:
    "Reply from 172.16.0.1: bytes=32 time<10ms TTL=255"
(3) If an attempt to connect fails, one of the following messages appears:
    "Request timed out"
    "Destination host unreachable"

The probable causes of a failure in this communication test include:


- The robot controller is not supplied with power.
- The Ethernet cable is not inserted correctly or has come off.
- An incorrect IP address is set for the robot controller.
- The robot controller has not been turned off and on again after its
IP address is set up.
- The PC has not been turned off and on again after its IP address
is set up.
- The type of the cable used is incorrect (use a straight type for
connection via a hub and a cross type for direct connection).
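If this test is run often, checking the ping output can be automated. The helper below is a hypothetical sketch (not part of the FANUC-supplied software) that classifies the Windows ping messages quoted above:

```python
def classify_ping_output(line: str) -> str:
    """Classify one line of Windows ping output.
    Hypothetical helper for repeated communication tests."""
    if line.startswith("Reply from"):
        return "connected"
    if "Request timed out" in line or "Destination host unreachable" in line:
        return "failed"
    return "unknown"

print(classify_ping_output("Reply from 172.16.0.1: bytes=32 time<10ms TTL=255"))
print(classify_ping_output("Request timed out."))
```

A "failed" result should prompt a check of the probable causes listed above (power, cabling, IP address settings, and cable type).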

The "connection" confirmed here means that an environment for using the network is set up between the personal computer and the robot; it does not mean that communication between the 3D Laser Vision Sensor software and the robot controller is established. Communication between the 3D Laser Vision Sensor software and the robot controller is established after completion of Chapter 5, "ROBOT COMMUNICATION SETTING".


2.9 CONNECTING THE CAMERA

The models that can be used in this system include the following
seven: XC-ST70, XC-ST50, XC-ST30, XC-ES50, XC-ES30,
XC-EI50, and XC-EI30 from SONY.

2.9.1 Setting the Camera


(1) XC-ST70, XC-ST50, and XC-ST30 from SONY
For these models, it is necessary to change the setting of the
switches on the rear of the camera and the DIP switches from the
factory setting.
Switch HD/VD: Check that this switch is set to EXT (factory setting).
Switch 75Ω: Re-set this switch from ON (factory setting) to OFF.
DIP switch 1 (top): Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 2: Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 3: Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 4: Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 5: Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 6: Set this switch to ON (right side).
DIP switch 7: Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 8: Set this switch to ON (right side).

(2) XC-ES50, XC-ES30, XC-EI50, and XC-EI30 from SONY


For these models, it is necessary to change the setting of the
switches on the rear of the camera and the DIP switches from the
factory setting.
Switch HD/VD: Check that this switch is set to EXT (factory setting).
Switch 75Ω: Re-set this switch from ON (factory setting) to OFF.
DIP switch 1 (top): Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 2: Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 3: Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 4: Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 5: Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 6: Set this switch to ON (right side).
DIP switch 7: Check that this switch is set to OFF (left side); keep the factory setting.
DIP switch 8: Set this switch to ON (right side).
DIP switch 9: Check that this switch is set to OFF (factory setting).
DIP switch 0 (bottom): Check that this switch is set to OFF (factory setting).

2.9.2 APC-3322A
When a camera adapter and adapter cable recommended in Subsection
2.1.4, "Camera Adapter and Adapter Cable" are used, up to three
cameras for 2D image processing can be connected in addition to one
3D Laser Vision Sensor.
However, the method of connection depends on the model selected.

(1) Camera adapter: DC-700 from SONY


Adapter cable: CBL-Z015-005 from AVAL DATA
Connect the video signal lines (ocher, red, orange, and yellow),
synchronization signal lines HZ (blue) and VD (purple), and trigger
signal line (green) of the adapter cable to the camera adapter.
The trigger signal line (green) must be connected as follows:

CAUTION
If the trigger signal line is not connected correctly, the shutter speed switching function of the camera does not operate normally.

- When the DC-700 is connected to the sensor head of the 3D Laser Vision Sensor: Connect the trigger signal line to the VIDEO 2 terminal.
- When the DC-700 is connected to a camera for 2D image processing: Connect the trigger signal line to the TRIG terminal.

TIP
The reason why the trigger signal line is connected to
the VIDEO2 terminal when the DC-700 is connected to
the sensor head of the 3D Laser Vision Sensor is that a
camera cable dedicated to the 3D Laser Vision Sensor
is used for the camera cable (ultimately connected to
the sensor head) of the wrist section of the robot.
When a commercially available camera cable (see
Subsection 2.1.5 "Camera Cable") is used, for
example, to install the 3D Laser Vision Sensor on a
fixed-installation basis or to check for a broken cable
temporarily, make a connection to the TRIG terminal.


[Figure: DC-700 connections. Each camera adaptor is connected between the frame grabber board and a camera.
- Camera adaptor #1 (3D Laser Vision Sensor, camera #1): ocher line to VIDEO 1, green (trigger) line to VIDEO 2, blue line to HD, purple line to VD.
- Camera adaptor #2 (camera #2): red line to VIDEO 1, blue line to HD, purple line to VD, green (trigger) line to TRIG.
- Camera adaptor #3 (camera #3): orange line to VIDEO 1, blue line to HD, purple line to VD, green (trigger) line to TRIG.
- Camera adaptor #4 (camera #4): yellow line to VIDEO 1, blue line to HD, purple line to VD, green (trigger) line to TRIG.]

(2) Camera adapter: APU-332 from AVAL DATA
    Adapter cable: CBL-Z036 from AVAL DATA
When camera 4 is used, an adapter cable is connected, and a
commercially available coaxial cable with a BNC connector attached
to each end is also used for connection.
For each camera connection, the setting of the trigger signal can be
switched using a jumper pin of the APU-332. This jumper pin is
placed on the internal board of the APU-332, so that the upper case of
APU-332 needs to be removed for setting. For details of setting,
refer to "APU-332 Operator's Manual" delivered with the APU-332.
Two types of settings are available: one setting uses pin 9, and the
other pin 11.


CAUTION
If the trigger signal is not set correctly, the shutter
speed control function of the camera does not operate
normally.

- When the camera connection section of the APU-332 is connected to the sensor head of the 3D Laser Vision Sensor: Pin 9 is used for setting.
- When the camera connection section of the APU-332 is connected to the camera for 2D image processing: Pin 11 is used for setting.

TIP
The reason why the trigger signal is set using pin 9
when the camera connection section of the APU-332 is
connected to the sensor head of the 3D Laser Vision
Sensor is that a camera cable dedicated to the 3D
Laser Vision Sensor is used for the camera cable
(ultimately connected to the sensor head) of the wrist
section of the robot. When a commercially available
camera cable (see Subsection 2.1.5, "Camera Cable")
is used, for example, to install the 3D Laser Vision
Sensor on a fixed-installation basis or to check for a
broken cable temporarily, use pin 11 for setting.

CAUTION
Before setting the trigger output signal, be sure to
unplug the power socket of the APU-332.


[Figure: APU-332 connections. The frame grabber board is connected to the APU-332 board by the adapter cable. The 3D vision sensor is connected to CAM1, camera #2 to CAM2, camera #3 to CAM3, and camera #4 to CAM4 (channel CH4, via the coaxial cable). Each camera connection carries DC IN and SYNC.]


2.10 PROTECTOR

Using the 3D Laser Vision Sensor requires a dedicated protector. The protector is connected to the printer port of the PC. If no protector is connected, the 3D Laser Vision Sensor cannot be used.
If no protector is connected, the following dialog box appears when the 3D Laser Vision Sensor attempts to detect it.

Clicking the OK button in the dialog box closes the dialog box. In this case, nothing is detected.
If no protector is connected to the PC, stop the 3D Laser Vision Sensor, connect a protector correctly, then start the 3D Laser Vision Sensor again.
If you want to use a printer, attach its printer cable to the far end of the protector. The printer can then be used in the same way as when it is connected directly to the printer port.

CAUTION
Turn off the power to the personal computer before
connecting the protector.


3 3D LASER VISION SENSOR BASIC OPERATIONS


3.1 START 3D LASER VISION SENSOR

Click the [Start] button of Windows, and select [Programs] → [FANUC 3D VISION] → [FANUC 3D VISION]. You will see the screen shown below.

3.2 EXIT 3D LASER VISION SENSOR

Select [eXit] from the [File] menu.

3.3 SELECT A CAMERA

The number of the camera currently selected is displayed in the camera number box at the left end of the tool bar. To change the camera number, click to the right of the camera number box and select the number of the camera you want.

3.4 SNAP A NEW IMAGE FROM A CAMERA

Select [Snap] from the [View] menu. Or, click the button on
the tool bar.


3.5 DISPLAY LIVE IMAGES

Select [Live] from the [View] menu. Or, click the button on the
tool bar. The button remains pressed and a live image is
displayed.
To stop displaying the live image, click the button to snap the image.

3.6 CHANGING THE SHUTTER SPEED MODE FOR SNAP


AND LIVE

The shutter speed mode of the camera for snapping images and
displaying live images can be changed. The current mode is
displayed in the lower-right corner of the screen. By choosing from
the modes displayed below, the shutter speed of the camera for
subsequent image snapping and live image display is changed.
Moreover, when a program is executed, the display is changed
according to the setting of the program (see Chapter 8, "TEACHING
AND EXECUTION THE 2D VISION PROCESS", and Chapter 9,
"TEACHING AND TESTING THE 3D VISION PROCESS".)

Shutter speed mode and description:

Not Use:
    The shutter speed is not changed. In this case, the standard shutter speed is used.

Multi mode(2D), Multi mode(2D)E, Multi mode(3D):
    In the multi-speed modes, multiple images obtained at different shutter speeds are processed to generate an image that has a wider apparent dynamic range and is more immune to brightness changes than a single image obtained at one shutter speed. However, these modes have the disadvantages that taking images requires a longer time and the image contrast difference becomes smaller.
    (2D) uses the shutter speed for 2D detection.
    (3D) is suitable for laser beam detection and produces images darker than (2D).
    (2D)E produces the same brightness as (2D), but generates an image that highlights the features of the image. In a program, (2D)E is used with the bin picking function only. For the bin picking function, refer to the operator's manual.

Single mode:
    This mode switches the shutter speed to obtain an image. The shutter speed is determined by the numeric value indicated to the right of the mode; a smaller number represents a brighter image. A number from 0 to 16 can be set. When 0 is set, the same brightness as with "Not Use" is produced.


3.7 TURN ON THE LASER

Select [Laser] from the [View] menu. Or, click the button on
the tool bar.
To turn off the laser, click the button again.

3.8 ENLARGING AND REDUCING IMAGES

Click the button on the tool bar.
The cursor changes shape. In this state, left-click on a point of an image. The point where you clicked moves to the center and the image is enlarged. If you right-click, the image is reduced. However, the image cannot be made smaller than the original size.

This image enlargement function can be used, for example, to identify the features of an image more precisely when teaching the location tool as described later (see Chapter 7, "TEACHING AND TESTING THE LOCATION TOOL").
To return the cursor to the original state, click the button on the tool bar again. To return an enlarged or reduced image to the original state, click the reset button on the tool bar.

3.9 MOVING IMAGES

Click the button on the tool bar.
The cursor changes shape. In this state, drag a point of an image to move the image.
To return the cursor to the original state, click the button on the tool bar again. To return a moved image to the original state, click the reset button on the tool bar.


3.10 ERASING ALARMS

During program execution or teaching, an alarm as indicated below may be displayed. After checking the alarm, click the erase button to erase it.

3.11 PAINTING AN IMAGE WITH A RED PEN

In teaching of the location tool as described later (see Section 7.1, "TRAIN MODEL"), an image may need to be painted using a red pen, or an image painted using a red pen may need to be restored to the original state. These operations can be performed as follows:
- Move the mouse while holding down the left button to paint an image.
- Move the mouse while holding down the right button to return a painted image to the original state.
For the thickness of the red pen, the following three options are available:
- "Thin" when the mouse is used for painting without any key
- "Medium" when the mouse is used while holding down the Ctrl key
- "Thick" when the mouse is used while holding down the Shift key
Choose an option suitable for the area to be taught.

3.12 STATUS OF FINDING

While the 3D Laser Vision Sensor is finding an object (for example, while a vision process is running), "Finding" appears at the lower right of the screen.


4 VISION DATA
The data used for setting or training the 3D Laser Vision Sensor is
called vision data.
Choose [3DV] or [2DV] from [View] on the menu bar, or
click the (3D measurement) button or the (2D
measurement) button on the tool bar. The list of vision data is
displayed to enable vision data to be taught.


4.1 VISION DATA TYPES

There are four types of vision data as described in the table below.

Camera setup
    Camera setup data is used to convert position information detected on an image into robot coordinates. There are three types of calibration methods for the 3D Laser Vision Sensor: 3D measurement calibration, simple 2-point calibration, and grid pattern calibration.
    The robot controller may use a camera setup name for execution, so set a camera setup name using alphanumeric characters or half-size kana characters. Up to 12 half-size characters may be used.
Location tool
    Location tool data is a 2D image processing tool that finds the taught shape on an image and recognizes its position. The data that describes the shape of an object to be found is called a model. The object to be found is taught as a model for the location tool, and whether or not it is found is checked.
    An arbitrary location tool name can be set using any characters that can be entered. No limit is imposed on the number of characters.
Program (vision process)
    The program defines what the 3D Laser Vision Sensor does in response to a request from the robot controller. The program is also called the vision process to distinguish it from the robot controller program. When a camera setup and location tool have been selected, and the program is started by the robot controller, it finds the part by executing the location tool, converts the found position into robot coordinates by using the camera setup, and then returns the result to the robot controller.
    The robot controller may use a program name for execution, so set a program name using alphanumeric characters or half-size kana characters. Up to 12 half-size characters may be used.
Robot
    Robot data describes the robot controller that communicates with the 3D Laser Vision Sensor via Ethernet.
    An arbitrary robot name can be set using any characters that can be entered. No limit is imposed on the number of characters.


4.2 3D MEASUREMENT AND 2D MEASUREMENT

The 3D Laser Vision Sensor consists of a laser projector and a camera. By using the camera alone, a 2D measurement can be made. By using the camera together with the laser projector, a 3D measurement can be made.
Measurement information obtained by a 2D measurement consists of a "2D position and angle" or a "3D view line".

[Figure: camera viewing a workpiece - 2D position and angle, 3D view line]

Measurement information obtained by a 3D measurement consists of a "3D position and posture" (in the case of disk measurement).

[Figure: 3D position and posture]

A wide variety of applications can be served by using 2D measurement and 3D measurement as appropriate for each case.
Information to be set for "3D measurement" differs from information to be set for "2D measurement". So, vision data for the two types of measurement is managed separately, and the vision data of one type cannot be shared with the other, except for some information items.
The table below indicates the vision data that can be shared between the two.

Camera setup
    Calibration data of "3D measurement" can be used by being imported to "2D measurement". The converse is not true. For information about importing, see Section 4.3, "LIST OF VISION DATA".
Robot
    Robot data, when set with either "2D measurement" or "3D measurement", can be used with the other.


4.3 LIST OF VISION DATA

Choose [3DV] or [2DV] from [View] on the menu bar, or click the corresponding button on the tool bar. The vision data list screen shown below appears.
On this screen, you can newly create, delete, copy, or rename vision data.

Creation
    Creates new vision data.
Deletion
    Deletes vision data.
Copy
    Copies vision data. Using this function together with "Paste" below creates the same data as the copy source under a different name.
Paste
    Pastes copied vision data. The same data as the copy source is created under a different name.
Large icon
    Displays a list of vision data names together with large icons.
Small icon
    Displays a list of vision data names together with small icons.
List
    Displays a list of vision data names together with small icons vertically.
Detail
    Displays a list of vision data names together with size and update date/time information.
Import
    Displayed only with "2D measurement". Camera setup data of "3D measurement" becomes usable with "2D measurement".


4.4 BACKING UP VISION DATA

All of the settings of taught camera setup data, location tool data, vision process data, and robot data are saved in the folder set as [Data folder] on the option setting screen (see Section 11.1, "OPTION SETTING").
All files in this folder contain user-taught data, so this data, once lost, cannot be restored. For safety, it is recommended to back up all files contained in this folder.
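The backup itself can be automated with a short script. The sketch below (not part of the FANUC software) simply copies the data folder to a timestamped destination; the paths in the usage example are hypothetical, and the actual [Data folder] path is the one configured on your option setting screen.

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup_vision_data(data_folder, backup_root):
    """Copy the entire vision data folder into a timestamped backup folder."""
    stamp = datetime.now().strftime("%Y-%m-%d-%H-%M-%S")
    dest = Path(backup_root) / f"VisionDataBackup{stamp}"
    shutil.copytree(data_folder, dest)  # copies every file in the data folder
    return dest

# Example (hypothetical paths):
# backup_vision_data(r"C:\FANUC\3DVISION\Data", r"D:\Backups")
```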

4.5 CREATING A VISION DATA LIST FILE

The settings of camera setup data, location tool data, vision process data, and robot data can be checked by opening each setting screen. Alternatively, the settings can be checked by creating and opening a list file (CSV format) containing the settings of all data.

Choose [Output Taught Data] from [Tool] on the menu bar. The
screen shown below is displayed. In a folder specified here, a file
listing all taught data is recorded.
The standard save destination is [Data folder] on the option setting
screen (see Section 11.1"OPTION SETTING").

Clicking the OK button stores the following files:

2D measurement program
    2Dprogram(date-time).csv (Example: 2Dprogram2003-08-27-13-32-22.csv)
3D measurement program
    3Dprogram(date-time).csv (Example: 3Dprogram2003-08-27-13-32-22.csv)
Camera setup data
    Calib(date-time).csv (Example: Calib2003-08-27-13-32-22.csv)
Robot data
    Robot(date-time).csv (Example: Robot2003-08-27-13-32-22.csv)
Location tool
    Tool(date-time).csv (Example: Tool2003-08-27-13-32-22.csv)

In the case of simultaneous execution, the files below are also saved at the same time. These files are related to the bin picking function only. For information about the stack/retrieve function, see the relevant operator's manual.

Overall search program
    SearchProgram(date-time).csv (Example: SearchProgram2003-08-27-13-32-22.csv)
Trial measurement program
    TrialProgram(date-time).csv (Example: TrialProgram2003-08-27-13-32-22.csv)
Overlap detection program
    OverlapProgram(date-time).csv (Example: OverlapProgram2003-08-27-13-32-22.csv)
Fine detection program
    FineProgram(date-time).csv (Example: FineProgram2003-08-27-13-32-22.csv)
Environment data
    Environment(date-time).csv (Example: Environment2003-08-27-13-32-22.csv)
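Judging from the examples above, the list files appear to follow the pattern <Type><YYYY-MM-DD-hh-mm-ss>.csv. Assuming that pattern (an inference from the examples, not a documented specification), such names could be generated and parsed as follows:

```python
import re
from datetime import datetime

def list_file_name(data_type, when):
    """Build a name like '2Dprogram2003-08-27-13-32-22.csv'."""
    return f"{data_type}{when.strftime('%Y-%m-%d-%H-%M-%S')}.csv"

def parse_list_file_name(name):
    """Split a list-file name back into (data_type, datetime), or None."""
    m = re.fullmatch(r"(.+?)(\d{4}-\d{2}-\d{2}-\d{2}-\d{2}-\d{2})\.csv", name)
    if not m:
        return None
    return m.group(1), datetime.strptime(m.group(2), "%Y-%m-%d-%H-%M-%S")
```

Such a helper can be useful, for example, to sort backed-up list files by their creation time.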

- 46 -
5.ROBOT COMMUNICATION SETTING B-81444EN/03

5 ROBOT COMMUNICATION SETTING


The operation explained in this chapter is necessary for the 3D Laser Vision Sensor to communicate with the robot via Ethernet.
When the robot controller invokes a vision process, the 3D Laser Vision Sensor finds the object in an image snapped by a camera, and sends the result and the found object position to the robot controller.
For communication between the 3D Laser Vision Sensor and the robot controller, "Robot" vision data must be set properly. By setting up "Robot", the robot controller can invoke the 3D Laser Vision Sensor process. And if you specify the "Robot" name in the 3D Laser Vision Sensor process, the 3D Laser Vision Sensor can return the found result and position to the robot controller.
This chapter explains the procedure for training the robot data.
Select the [Robot] tab, and double-click the robot icon that you want to train on the vision data list screen. The Robot Communication Setting screen shown below appears.

Before this setting can be made, an environment for using the network between the personal computer and the robot controller must already be established. For details, see Section 2.8, "ETHERNET".


5.1 COMMUNICATION CONDITION

Click the Change button on the Robot Communication Setting screen. You will see the screen as shown below.

Enable Communication
    If you check this item, Ethernet communication with this robot is enabled. If the robot controller specified by the IP address and the PC are connected (see Section 5.2, "CONNECTION STATUS"), the 3D Laser Vision Sensor process can be invoked from the robot controller. And if the robot is selected in the 3D Laser Vision Sensor process, the found result and found position are sent from the 3D Laser Vision Sensor to the robot controller. If you do not check this item, no communication is performed even when the robot is selected in the 3D Laser Vision Sensor process.
Protocol
    Choose "RobotServer" or "R90". Both are communication protocol names. Choosing "R90" speeds up communication unless multiple robot controllers are connected. If multiple robot controllers are connected, choose "RobotServer".
IP Address
    The IP address of the robot controller to connect to.

Upon completion of the setting, click the OK button. If "Enable Communication" is checked, clicking this button immediately starts connecting and returns the display to the Robot Communication Setting screen.

To cancel the setting, click the Cancel button. The setting returns to the one before modification, and the display returns to the Robot Communication Setting screen.

CAUTION
You must not specify the same IP address for multiple robots; that is, do not create multiple robot data items with the same IP address. Otherwise, when the robot controller invokes the 3D Laser Vision Sensor process, the same 3D Laser Vision Sensor process is invoked twice.


5.2 CONNECTION STATUS

If you enable communication, specify the IP address, and press the OK button on the Change of communication condition screen, the 3D Laser Vision Sensor software starts to connect to the robot immediately. The connection status is displayed on the Robot Communication Setting screen.

Connected (green)
    The 3D Laser Vision Sensor software is connected with the robot controller, and communication is enabled.
Not connected (red)
    The 3D Laser Vision Sensor software is not connected with the robot controller.
Connecting (yellow)
    The 3D Laser Vision Sensor software has issued a request to connect to the robot controller and is waiting for a reply. If this state lasts for a long time, the robot controller may be powered off, the cable may be disconnected, or the like.

5.3 COMMUNICATION TEST

When a connection has been established and communication is enabled, you can execute a communication test to confirm that Ethernet communication works correctly between the robot controller and the 3D Laser Vision Sensor. When you click the Send button, the specified data is sent to the specified register of the robot controller.
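The RobotServer and R90 protocols themselves are internal to the software, but when the communication test fails it can help to confirm basic network reachability of the controller's IP address independently. The sketch below only attempts a TCP connection; the address and port in the usage example are placeholder assumptions, not documented values.

```python
import socket

def controller_reachable(ip_address, port, timeout_s=2.0):
    """Return True if a TCP connection to the given address succeeds."""
    try:
        with socket.create_connection((ip_address, port), timeout=timeout_s):
            return True
    except OSError:
        return False

# Example (placeholder address and port):
# controller_reachable("192.168.0.10", 23)
```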

5.4 AUTOMATIC CONNECTION

If communication is enabled and the IP address is set on the Change of communication condition screen, the connection is established automatically when the 3D Laser Vision Sensor is started. If several robots are trained, the 3D Laser Vision Sensor connects to all the robot controllers specified in the robot data.
The state of connection with a trained robot can be checked using the alarm history.


The figure above shows alarm history information displayed when the 3D Laser Vision Sensor software is started in the normal state. The robot name is "Robot".
"Receive a reply" indicates that the network can be used between the personal computer and the robot controller. "Connection established" indicates that communication is established between the 3D Laser Vision Sensor software and the robot controller.

The figure above shows alarm history information displayed when the 3D Laser Vision Sensor software is started with the power to the robot controller turned off.
"No reply" indicates that the network cannot be used between the personal computer and the robot controller; as a result, the 3D Laser Vision Sensor software has failed to establish communication with the robot controller.

TIP
If the state of communication between the personal computer and the robot controller is unstable (for example, when a partly broken cable makes and breaks contact from time to time), "Receive a reply" and "No reply" are recorded in the alarm history from time to time.
If the 3D Laser Vision Sensor abruptly stops returning detection results to the robot, or abruptly takes a longer processing time, check the state of communication reported in the alarm history.


6 CALIBRATION
The 3D Laser Vision Sensor finds the target workpiece in the image received from the camera and informs the robot controller of the location of the workpiece. For the robot controller to perform position compensation, the positional data measured on the image with the 3D Laser Vision Sensor must be converted to positional data represented in the robot coordinate system (base frame or user frame). Performing this data conversion requires information about where the 3D Laser Vision Sensor is located and how it views the workpiece in the robot coordinate system. The operation performed to set up this information is called 3D Laser Vision Sensor calibration.


6.1 CALIBRATION TYPES

The 3D Laser Vision Sensor is provided with three types of camera calibration methods.

3D measurement: 3D measurement calibration
    This is the calibration method for 3D measurement. This calibration uses a dedicated grid pattern jig.
2D measurement: Simple 2-point calibration
    This is a camera calibration method for 2D measurement. It is a very simple method: just specify two points on an image.
2D measurement: Grid pattern calibration
    This is a camera calibration method for 2D measurement. This calibration uses a dedicated grid pattern jig of the default shape.

For 2D measurement, select the necessary calibration type from the two 2D measurement methods in the above table.


6.2 3D MEASUREMENT CALIBRATION

3D measurement calibration is dedicated to 3D measurement. It uses a calibration jig having a predefined shape.
Calibration is completed by executing the following in this order:
- 6.2.1 Setting of 3D Measurement Calibration
- 6.2.2 Detecting a Grid Pattern and a Laser
For use as a hand eye attached to the robot, the following is additionally required:
- 6.6.2 Setting a Sensor Coordinate System

TIP
This calibration must always be performed in the following cases:
- When the 3D Laser Vision Sensor is used for the first time, for example, at the initial installation or immediately after the sensor is replaced
- When the relative positional relationship between the camera unit and the laser unit may have changed, for example, because of a collision
When the 3D Laser Vision Sensor is installed or uninstalled, calibration is not required as long as no modification is made to the 3D Laser Vision Sensor itself.

6.2.1 Setting of 3D Measurement Calibration


The following dialog box is displayed by selecting the [CameraSetup] tab from the 3D measurement vision data list and double-clicking the camera calibration data to be taught.

Before starting calibration, make the following settings:

Camera No.
    Select the number of the camera to which the 3D Laser Vision Sensor is connected. (Default value: 1)


Grid spacing
    Enter the grid point interval of the grid pattern jig to be used. For information about the grid point interval, see Appendix B, "CALIBRATION JIG". (Default value: 15 mm)
Camera type
    The type of camera used with the 3D Laser Vision Sensor is "SONY ES-30". (Default value: "SONY ES-30") Usually, this value need not be changed. If it needs to be changed for some reason, click the Change… button, choose the desired type of camera from Camera Type on the screen shown below, then click the Close button.
Grid position
    Enter the grid pattern position of the calibration jig. Usually, +50 mm is to be set in Plane1, and -50 mm in Plane2. For information about the positions of the calibration jig specified by +50 mm and -50 mm, see Subsection 6.2.2, "Detecting a Grid Pattern and a Laser". (Default values: +50 mm in Plane1 and -50 mm in Plane2) The setting of Plane1 is applied when you click the Snap1 button; the setting of Plane2 is applied when you click the Snap2 button.

6.2.2 Detecting a Grid Pattern and a Laser


To detect grid patterns, first mount the grid pattern jig at the specified location on the 3D Laser Vision Sensor. Refer to the "Force sensor / 3D Laser Vision Sensor Maintenance Manual" for details of how to mount the grid pattern jig.


[Figure: Position for snap 1 (+50 mm); Position for snap 2 (-50 mm)]

First, put the grid pattern plate of the calibration jig in the place closer to the camera (+50 mm), and click the Snap1 button. A dialog box for teaching the window area appears. Teach the position of the window so that the grid pattern fits in the window, and click the OK button.

In that state, detection is performed: each grid dot is detected, and the laser beam is detected. Check the results as follows:
- Check that a circled plus sign is indicated at the four big grid points.
- Check that no plus sign is indicated at positions other than grid points.
- There may be other grid points with no plus sign indicated, but check that there are no more than several such grid points.
- Check that red lines representing laser beam points are displayed
across the window.
The figure below shows an example of correct detection.

Next, place the grid pattern plate of the calibration jig in the place farther from the camera (-50 mm), then click the Snap2 button. The dialog box for teaching the window is displayed. Teach the window properly so that the grid pattern plate fits in it, then click the OK button.
Make the same checks as when you clicked the Snap1 button. This completes calibration.

6.2.3 Calibration Data Detail Display


To see if calibration is successful, check that the state of the calibration data has changed to "Trained" (green).
Next, click the Show details button to check that grid points are detected correctly. If calibration has been performed according to the correct procedure, these checks will already have been made in the process described in Subsection 6.2.2, "Detecting a Grid Pattern and a Laser". However, check the following again here:
- Check that a circled plus sign is indicated at the four big grid points.
- Check that no plus sign is indicated at positions other than grid points.
- There may be other grid points with no plus sign indicated, but check that there are no more than several such grid points.


If detection has not been performed correctly, start all over again.
At the same time, the detail data of the calibrated camera can be checked.

Camera data
    Information such as the scale and lens distortion of the calibrated camera is displayed. Check "Focal length" and "Lens distortion" according to the following guideline:

Focal length
    Check that the deviation from the specified focal length (8 mm or 12 mm) of the lens used with the 3D Laser Vision Sensor is within 5%.
Lens distortion
    Check that the value is positive and is 0.01 or less.
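The guideline above amounts to a simple numeric check. The sketch below merely restates the stated thresholds; it is not a function of the 3D Laser Vision Sensor software.

```python
def camera_data_ok(focal_length_mm, nominal_focal_mm, lens_distortion):
    """Apply the guideline: focal length within 5% of the nominal value
    (8 mm or 12 mm), lens distortion positive and no greater than 0.01."""
    focal_ok = abs(focal_length_mm - nominal_focal_mm) / nominal_focal_mm <= 0.05
    distortion_ok = 0.0 < lens_distortion <= 0.01
    return focal_ok and distortion_ok
```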

Camera position
The position and posture of the camera in the sensor coordinate
system are displayed. For information about the sensor
coordinate system, see Section 6.6, "SENSOR COORDINATE
SYSTEM".


6.2.4 Laser Slit Detail Display


Information indicating the laser beam planes is displayed. (Px, Py, Pz) represents a point on a plane, and (Vx, Vy, Vz) represents a vector normal to the plane. This information is based on the sensor coordinate system described later. For information about the sensor coordinate system, see Section 6.6, "SENSOR COORDINATE SYSTEM".
Slit plane 1 data
    Information about the plane of slit 1 (from the lower-left corner to the upper-right corner of the screen) is displayed.
Slit plane 2 data
    Information about the plane of slit 2 (from the upper-left corner to the lower-right corner of the screen) is displayed.

This screen does not provide information used to determine whether calibration is successful.
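The point-on-plane and normal-vector form shown here is the standard representation for intersecting a camera view line with a laser slit plane, which is the underlying principle of slit-laser 3D measurement. The sketch below illustrates that geometry only; it is not the software's internal computation.

```python
def intersect_line_plane(origin, direction, plane_point, plane_normal):
    """Return the point where a view line meets a slit plane.

    The plane is given by a point (Px, Py, Pz) on it and a normal vector
    (Vx, Vy, Vz); the line by an origin and a direction vector.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(plane_normal, direction)
    if abs(denom) < 1e-12:
        return None  # line is parallel to the plane
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = dot(plane_normal, diff) / denom
    return tuple(o + t * d for o, d in zip(origin, direction))
```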

6.2.5 Setting Laser Slit Detection Parameters


In rare cases, the laser beam is detected poorly because of environmental causes (related particularly to brightness, such as illumination). In such a case, move the robot to change the brightness on the image, and adjust the parameters for laser beam detection. These measures may produce a better result.
Brightness Threshold
    This detection parameter is related to the difference in brightness between the area where the laser beam is directed and the background. If the background is bright, a better result may be obtained by decreasing this parameter value. If the laser beam glares, a better result may be obtained by increasing this parameter value. For tuning this parameter value, see Section 9.8, "SHOWING BRIGHTNESS DISTRIBUTION".
Slit Points Threshold
    This detection parameter is related to the number of laser beam points used for the calibration calculation. If calibration cannot be made to succeed by any other means, it may be enabled by decreasing this parameter value.

CAUTION
These parameters, when tuned properly, can produce a better calibration result. However, the precision of calibration can deteriorate, depending on how these parameters are tuned.
If the laser beam cannot be calibrated properly, measures such as adjusting the illumination environment and moving the robot to another position should be taken before tuning these parameters.

6.2.6 Checking the Precision of Calibration


After completion of calibration, the precision of calibration can be checked using the calibration jig. The calibration jig plate is actually measured at three installation locations, and the obtained measurement values are compared with the expected values to check the precision.

[Figure: Position for snap 1 (+50 mm); Reference position (0 mm); Position for snap 2 (-50 mm)]


For the measurement program, the disk measurement function for 3D measurement is used. For program creation, see Section 9.2, "DISK MEASUREMENT" in Chapter 9, "TEACHING AND TESTING THE 3D VISION PROCESS". In particular, Subsection 9.2.3, "Teaching Procedure", describes the teaching procedure using the example of creating a measurement program usable for checking the precision of calibration.
A model is created with the plate installed at the reference position (0 mm). If "Search angle" is disabled and "Search size" is set to about 80 to 120, a correct detection is made at any other installation position.
Set the origin of the model precisely at the center of the central dot of the calibration jig by using the image enlargement function (see Section 3.8, "ENLARGING AND REDUCING IMAGES") and the Center button (see Section 7.2, "MODIFYING LOCATION TOOL MODEL").

If the created program is executed for testing, and the results of comparing the theoretical values with the actually measured values satisfy the criteria indicated below, the precision of calibration is acceptable. For the status monitor, see Section 9.6, "VISION PROCESS TEST EXECUTION".

Criteria for checking
    The error of each of X, Y, and Z is within 0.5 mm.
    The error of each of W, P, and R is within 0.8 deg.
    The "LL Dist" of the status monitor is within 1.0 mm, and the "Lean" of the status monitor is within 1.0 deg.

Expected values at each plate position:
Position for snap 1 (+50 mm)
    (X, Y, Z, W, P, R) = (0, 0, +50, 0, 0, 0). The LL distance is 0 mm, and the inclination is 0 deg.
Reference position (0 mm)
    (X, Y, Z, W, P, R) = (0, 0, 0, 0, 0, 0). The LL distance is 0 mm, and the inclination is 0 deg.
Position for snap 2 (-50 mm)
    (X, Y, Z, W, P, R) = (0, 0, -50, 0, 0, 0). The LL distance is 0 mm, and the inclination is 0 deg.
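The criteria and expected values above can be combined into one pass/fail check. The sketch below merely restates the published thresholds and is not part of the software.

```python
def precision_ok(measured, expected, ll_dist, lean):
    """Check a measured pose (X, Y, Z, W, P, R) against the expected pose.

    Criteria: X/Y/Z errors within 0.5 mm, W/P/R errors within 0.8 deg,
    LL Dist within 1.0 mm, Lean within 1.0 deg.
    """
    pos_ok = all(abs(m - e) <= 0.5 for m, e in zip(measured[:3], expected[:3]))
    ang_ok = all(abs(m - e) <= 0.8 for m, e in zip(measured[3:], expected[3:]))
    return pos_ok and ang_ok and ll_dist <= 1.0 and lean <= 1.0

# Example: plate measured at the +50 mm snap position
# precision_ok((0.1, -0.2, 50.3, 0.1, 0.0, -0.2), (0, 0, 50, 0, 0, 0), 0.3, 0.2)
```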

The figure below shows an example of the measurement result at the reference position (0 mm) when the calibration is correct. The figure indicates that the criteria are satisfied.


6.3 SIMPLE 2-POINT CALIBRATION

The simple 2-point calibration is a camera calibration method for 2D applications. Assuming that the camera image plane is parallel to the X-Y plane of the robot user coordinate system, it simply converts the scale and the X- and Y-axis directions between the two planes. This is a very simple method, but it provides enough accuracy for usual 2D robot guidance applications.

CAUTION
The 3D Laser Vision Sensor can handle only the X-Y plane of the robot user coordinate system; for instance, the Y-Z plane and the Y-X plane (the reverse side of the X-Y plane) cannot be handled.
Be sure to perform calibration at the same height as the actual workpiece. If the calibration plane is not at the same height as the actual workpiece, an error can result.

Select the [CameraSetup] tab, double-click the camera setup icon that you want to teach on the vision data list screen, then select the simple 2-point calibration. You will see the screen shown below.

From the Camera No. drop-down list, select the number of the camera you want to calibrate.
Robot to get pos. is used to get the current robot position via Ethernet communication during camera calibration teaching. If you click the Select button, the list of robots is displayed as follows. Select the robot from the list.


Prepare two points in the field of view of the camera that the robot can touch up with its TCP. The two points must be at the same height as the workpiece. You will get better accuracy when you place the two points as far apart from each other as possible, as shown below.

[Figure: two points P1 and P2 placed far apart within the camera FOV]

Click the 1st Point tab.
Click the Select button of "Vision coord". You will see a red crosshair cursor on the image. Drag the cursor onto the center of the first point, and then click OK. Move the robot to touch up the center of the first point with the TCP, and copy the X and Y values of the robot position from the teach pendant to the X and Y boxes of "World coord".
Click the 2nd Point tab.
Click the Select button of "Vision coord". You will see a red crosshair cursor on the image. Drag the cursor onto the center of the second point, and click OK. Move the robot to touch up the center of the second point with the TCP, and copy the X and Y values of the robot position from the teach pendant to the X and Y boxes of "World coord".
After you complete the training, press the Show details button to display the calibration data details screen, and confirm that the value of the scale (the length per pixel) is adequate. See Section 6.5, "SHOW DETAILS OF CAMERA CALIBRATION DATA" in this chapter about the calibration data details screen.
When you click the OK button to exit from the screen, the taught data is saved. When you click the Cancel button to exit from the screen, the taught data is lost.
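Conceptually, the two taught point pairs determine a scale and a rotation between the image plane and the X-Y plane of the user coordinate system. The sketch below illustrates such a mapping under the manual's parallel-plane assumption; the software's actual internal computation (including any image-axis flip) may differ.

```python
import math

def two_point_map(v1, v2, w1, w2):
    """Return a function mapping image (vision) coordinates to world X-Y,
    given two pairs: image points v1, v2 and their world positions w1, w2."""
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
    dwx, dwy = w2[0] - w1[0], w2[1] - w1[1]
    scale = math.hypot(dwx, dwy) / math.hypot(dvx, dvy)  # mm per pixel
    # rotation from the image segment P1->P2 to the world segment
    angle = math.atan2(dwy, dwx) - math.atan2(dvy, dvx)
    c, s = math.cos(angle), math.sin(angle)

    def to_world(p):
        x, y = p[0] - v1[0], p[1] - v1[1]
        return (w1[0] + scale * (c * x - s * y),
                w1[1] + scale * (s * x + c * y))
    return to_world
```

The value `scale` here corresponds to the "length per pixel" shown on the calibration data details screen.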


Robot position
    When the personal computer and robot controller are connected via Ethernet, the current robot position can be obtained by using the Record button instead of reading the X and Y values of the current position. Click the Record button while the center of point 1 or point 2 is touched up by the TCP of the robot.

CAUTION
Before the robot position can be obtained,
communication must be enabled. For the setting of
communication, see Chapter 5, "ROBOT
COMMUNICATION SETTING". Moreover, before
obtaining the current robot position, check that a user
coordinate system and tool coordinate system are
selected correctly on the robot controller side.

Find
    If the object representing point 1 or point 2 is a circle or an ellipse, the object can be detected and the center of point 1 or point 2 set automatically by clicking the Find button, instead of clicking the Select button and setting the center by dragging the red + mark. If the object is not a circle or an ellipse, do not use this method, because precise detection cannot be performed.


6.4 GRID PATTERN CALIBRATION (2D MEASUREMENT)


The grid pattern calibration is the camera calibration method for 2-D
applications. It is applicable to wider varieties of applications and
provides better accuracy than the simple 2-point calibration.

Grid pattern calibration uses a grid pattern jig of the default shape.
For details about the grid pattern jig, see Appendix B, "CALIBRATION JIG".

Be careful about the following points when you use the grid pattern
calibration for 2-D application:

NOTE
If the workpiece height does not match the height at which
calibration was performed, a parallax error occurs. Place the
calibration fixture at the same height as the workpiece.
It is assumed that the workpiece height corresponds to Z
= 0 in the user coordinate system. Set up the user
coordinate system so that it matches the workpiece
height.
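The parallax error in the note above can be estimated with a simple pinhole-camera sketch. The function name and the small-angle model are illustrative assumptions, not values from the product documentation.

```python
def parallax_error_mm(height_diff_mm, radial_dist_mm, camera_dist_mm):
    """Rough pinhole-camera estimate of parallax error.

    height_diff_mm : workpiece height minus calibration-plane height
    radial_dist_mm : distance of the feature from the optical axis
    camera_dist_mm : distance from the lens to the calibration plane
    Returns the apparent horizontal shift of the feature in mm.
    """
    # A feature off the optical axis is viewed along a slanted ray, so a
    # height difference shifts its apparent position proportionally.
    return height_diff_mm * radial_dist_mm / camera_dist_mm
```

For example, a 10 mm height mismatch viewed 200 mm off-axis by a camera 1000 mm away would produce roughly a 2 mm position error, which is why the fixture height should match the workpiece height.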

Select the [CameraSetup] tab from the vision data list, double-click
the camera calibration data to be taught, and then select the grid
pattern calibration as the desired calibration type. You will see the
screen as shown below.

Before starting calibration, make the following settings:


1. From Camera No., select the camera number you want to
calibrate.
"Robot to get pos." is used to get the current robot position via
Ethernet communication during camera calibration teaching.
If you click the Select button, the list of robots is displayed as
follows. Select the robot from the list.


3. Enter the interval between grid points of a grid pattern jig in Grid
spacing.
4. Click the Change button to select the type of a camera to be used.
After selecting the camera type from Camera Type, click the
Close button.

6.4.1 Find Grid Pattern


Place the grid pattern jig in the field of view of the camera.
At this time, set the grid pattern jig so that the camera direction and
the plane of the grid pattern jig form an angle of 30 to 45 degrees.
First, find a grid pattern and then identify the relative position between
the camera and the grid pattern.
Click the Snap button to snap a new image of the grid pattern jig.
Click the Find grid pattern button. The 3D Laser Vision Sensor
detects the grid pattern and plots a red crosshair cursor on the found
grids as shown below.


Confirm if the grid pattern is properly detected.


- Make sure a crosshair cursor in a circle appears on each of the
four big grids.
- Check that no plus sign is indicated at positions other than the
grid points.
- A few grid points may have no plus sign indicated, but check
that there are no more than several such points.
If the 3D Laser Vision Sensor does not find the grid pattern properly,
try again by snapping a new picture.

6.4.2 Grid Pattern Frame

Click the Grid Pattern Frame button to set up the grid pattern frame;
that is, the information about where the calibration jig is placed in
the robot coordinate system when you calibrate the camera.
The 3D Laser Vision Sensor provides two methods to set up the grid
pattern frame, as shown below.

Method Description
Direct Directly enter the location and orientation of the grid frame in
 the robot user coordinate system (see Appendix B, "CALIBRATION JIG")
 in XYZWPR format.
3-point Enter the X, Y, and Z values of three points that define the grid
 pattern frame (the origin, the X-axis point, and the Y-axis point).
 The 3D Laser Vision Sensor computes the location and orientation of
 the grid frame.
Direct method
Select "Direct" from Setup method.


Enter a numeric value in each of X, Y, Z, W, P, and R of the frame.


If the grid pattern jig is attached to the hand of the robot, click the
Record button while the camera is viewing the grid pattern jig. The
coordinates to be set as the current robot position are read. Check
that the tool coordinate system is set to match the grid pattern
coordinates.
Usually, set 0 for X, Y, and Z as the offset of the grid pattern origin.

3-point method
Select "3-points" from Setup method.
Move the robot to touch up the origin, X-axis and Y-axis points of the
calibration jig by its TCP, and enter X, Y and Z of each robot position.
(For the origin, X-axis point, and Y-axis point, see Appendix B,
"CALIBRATION JIG".)
Alternatively, clicking the Record button while the TCP of the robot
touches up a point reads the current robot position into the setting.
Under Offset of origin, enter the offset from the touched up origin to
the grid pattern origin, which can be obtained from the drawing of the
calibration jig.
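The 3-point computation described above can be sketched as follows. The helper names are illustrative, not the product's actual code; the sketch mirrors the usual way a robot user frame is built from an origin, a point on the +X axis, and a point in the +Y half of the XY plane.

```python
def _sub(a, b):
    return [a[i] - b[i] for i in range(3)]

def _cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def _unit(a):
    n = sum(x * x for x in a) ** 0.5
    return [x / n for x in a]

def frame_from_3_points(origin, x_point, y_point):
    """Build a frame from three touched-up points.

    Returns (x_axis, y_axis, z_axis, origin) of the grid pattern frame.
    """
    x_axis = _unit(_sub(x_point, origin))
    # Z is perpendicular to both X and the rough Y direction.
    z_axis = _unit(_cross(x_axis, _sub(y_point, origin)))
    # Recompute Y so the three axes are exactly orthonormal.
    y_axis = _cross(z_axis, x_axis)
    return x_axis, y_axis, z_axis, list(origin)
```

Note that the Y-axis touch-up point only needs to lie in the +Y half-plane; the computed Y axis is re-orthogonalized against X, which is why a touch-up error on that point matters less than one on the origin or X-axis point.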

After you complete the training, click the Show details button to
display the Calibration data details screen, and confirm that the value
of the focal length is reasonable. See also Section 6.5, "SHOW
DETAILS OF CAMERA CALIBRATION DATA" for details about the
Calibration data details screen.

CAUTION
Regardless of whether “Direct” or “3-points” is chosen, communication must be enabled
before the robot position can be obtained.
For the setting of communication, see Chapter 5, "ROBOT COMMUNICATION
SETTING". Moreover, before obtaining the current robot position, check that a user
coordinate system and tool coordinate system are selected correctly on the robot
controller side.

6.4.3 Example of Setting a Grid Pattern Frame


An example of setting a grid pattern frame is provided below.
When a user coordinate system is set using the calibration
jig…
In this case, the grid pattern frame matches the user coordinate
system. Set 0 for all of X, Y, Z, W, P, and R according to
“Direct”.
When the calibration jig is attached to the hand of the
robot…
In this case, the design drawing of the hand indicates the
positional relationships between the flange of the robot and the
grid pattern. So, set the tool coordinate system to match the
grid pattern. By doing so, the current robot position directly
indicates the grid pattern frame. Move the robot so that the grid
pattern is within the view of the camera, then enter the current
robot position at that time according to “Direct”.
When the calibration jig is fastened to an appropriate
table…
Touch up the three points representing the grid pattern frame
with the TCP of the robot in the same way as for setting the user
coordinate system of the robot, then enter the X, Y, and Z values
of the current robot position.
In this case, three touch-up pins that enable touch-up by the TCP
of the robot must be prepared with the calibration jig as shown
below. Those touch-up pins must be parallel with the grid
points of the grid pattern. Set the deviation between the
coordinate origin to be touched up and the grid point origin as a
grid pattern origin offset. In the figure below, the grid pattern
origin offset is (X=a, Y=b, Z=-c).
(Figure: calibration jig with touch-up pins at the origin, the X-axis
point, and the Y-axis point; the offsets a, b, and c run from the
touched-up origin to the grid pattern origin)


6.5 SHOW DETAILS OF CAMERA CALIBRATION DATA

Click the Show details button. You will see the screen as shown
below.
Data displayed for simple 2-point calibration differs from data
displayed for grid pattern calibration.

Item Description
Camera data Scale, lens distortion and so on of the calibrated camera.
Camera Location and orientation of the camera in the robot
position coordinate system.

For simple 2-point calibration

For grid pattern calibration

- 70 -
6.CALIBRATION B-81444EN/03

6.6 SENSOR COORDINATE SYSTEM

When calibration is performed, a coordinate system is set in front of
the sensor. This coordinate system is referred to as a "sensor
coordinate system" (or "sensor reference coordinate system").

(Figure: sensor coordinate system set approximately 400 mm in front of
the sensor along the Z axis; the distance depends on the sensor type)

CAUTION
The term "sensor coordinate system" is used in two ways:
A. Coordinate system set by calibration as described here.
B. Value indicating the position of the coordinate system of
“A.” above relative to the mechanical interface coordinate
system of the robot (value set as a tool coordinate system).
Practically, both “A.” and “B.” represent the same thing.
However, be careful not to confuse one with the other.

6.6.1 Sensor Coordinate System and Measurement Results


Actual measurement results are output as values indicated in the
sensor coordinate system or values converted to values in the
(currently selected) user coordinate system of the robot to which the
3D Laser Vision Sensor is attached.


(Figure: relationship among the user coordinate system Σu, the wrist
mechanical interface coordinate system Σf, and the sensor reference
coordinate system Σs)

3D Laser Vision Sensor output = ① or ②

② = Measurement expressed in the sensor reference coordinate system Σs
③ = Position of sensor reference coordinate system Σs (fixed)
    relative to robot wrist mechanical interface Σf
  = Referred to as a "sensor coordinate system"
④ = Position of mechanical interface Σf relative to user coordinate system Σu
⑤ = Position of sensor reference coordinate system Σs relative to
    user coordinate system Σu

Output ① = ⑤ × ②
         = ④ × ③ × ②
Output ② = ②

Whether the output format is ① or ② depends on the setting of
each vision process. For details, see Chapter 9, "TEACHING AND
TESTING THE 3D VISION PROCESS".
- ④ represents a value obtained as the current robot position, and
③ represents a user-set value. As seen from the formulas
above, precise output ① cannot be obtained unless a "sensor
coordinate system" ③ is set precisely. For the setting of a
sensor coordinate system, see Subsection 6.6.2, "Setting a Sensor
Coordinate System".
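The output formulas above can be sketched with 4x4 homogeneous transforms. The function and variable names below are illustrative assumptions, not part of the product's API.

```python
import numpy as np

def measurement_in_user_frame(T_u_f, T_f_s, p_sensor):
    """Minimal illustration of Output ① = ④ × ③ × ②.

    T_u_f    : pose of the wrist mechanical interface Σf in the user
               frame Σu (item ④, the current robot position)
    T_f_s    : pose of the sensor frame Σs relative to Σf (item ③,
               the "sensor coordinate system" set as a tool frame)
    p_sensor : (x, y, z) measurement expressed in Σs (item ②)
    Returns the measurement expressed in Σu (item ①).
    """
    # Promote the point to homogeneous coordinates, then chain the
    # transforms: user <- flange <- sensor <- measured point.
    p = np.array([p_sensor[0], p_sensor[1], p_sensor[2], 1.0])
    return (T_u_f @ T_f_s @ p)[:3]
```

The chain makes the dependency explicit: any error in ③ propagates directly into ①, which is why the sensor coordinate system must be set precisely (Subsection 6.6.2).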
6.6.2 Setting a Sensor Coordinate System
As described in the previous subsection, the precise setting of a sensor
coordinate system is a key to ensuring final work precision. The 3D
Laser Vision Sensor provides a function for setting a sensor
coordinate system in a simple way.


TIP
A sensor coordinate system must always be set in the
following cases:
- After performing 3D measurement calibration. (For
details, see Section 6.2, "3D MEASUREMENT
CALIBRATION".)
- If installation position reproducibility is doubtful, for
example, when the 3D Laser Vision Sensor is installed and
removed without positioning pins. The setting is
unnecessary if simple re-calibration can be performed.
(For details, see Section 6.7, "SIMPLE
RE-CALIBRATION".)

Preparation and procedure


Prepare the calibration jig described in Appendix B, and install it
beforehand at an arbitrary position that is within the view of the 3D
Laser Vision Sensor.
Before this setting, 3D Laser Vision Sensor calibration must already
have been performed. For 3D Laser Vision Sensor calibration, see
Section 6.2, "3D MEASUREMENT CALIBRATION".

TIP
In the setting of a sensor coordinate system, the
position of the calibration jig need not be reproduced
precisely. However, if reproducibility is available, the
program described later can be reused to speed up
restoration work required as in the case where the
sensor is replaced and a sensor coordinate system
needs to be set again.

The procedure is described below.

First, set approximate sensor coordinate system values with a
specified tool coordinate system (TF = 1 here). (TF: tool coordinate
system number)
These approximate values may have a precision as obtained from the
drawing or as measured actually in a simplified way.

Next, create a robot program by referencing the sample program
described later.
In the indicated portions, specify the sensor coordinate system number
(TF = 1), the grid pattern dot interval, and the 3D Laser Vision
Sensor calibration data name. This program assumes that the 3D Laser
Vision Sensor calibration data name is "CALIB".

Then, cause the robot to measure the calibration plate by placing it in
three different postures. Furthermore, referencing the example of
position data described later, adjust the position and posture while
paying attention so that all dots of the calibration jig are within the
view of the camera.


When
PCVIS RUN CALIB_MNT* [ST=R[1],OF=PR[1]]
(where * represents 1, 2, or E)
of the program is executed, the following screen is displayed:

Set the window so that the window contains all grid patterns.
When all points are successfully detected, the screen shown below is
displayed.

Check the results of detection. If all dots are detected successfully,
click the OK button.

If even one dot is not detected successfully, the following is
displayed:


By adjusting the position and brightness, retry until all dots are
detected successfully. Upon successful completion, a measurement
is automatically made using a laser beam.

CAUTION
In each measurement based on the sensor coordinate
system setting function, an error occurs if all dots are
not detected each time. Adjust the position and
brightness so that all dots can be detected.

Perform this operation for all three points.


In the example of position data described later, the remaining two
points are used for teaching so that the calibration jig is viewed as
follows:


Upon completion of this program, the sensor coordinate system values
are updated to precise values.

Sample program
The sample program for setting a sensor coordinate system uses two
register areas and one position register area. In the sample program,
these registers are referred to as follows:
Register : R[1], R[2]
Position register : PR[1]
When an area other than these areas is used, modify the program
accordingly, observing the caution described below.
These register areas are used for the following purposes:
R[1] : Measurement processing status
R[2] : Calibration jig dot interval (in mm)
PR[1] : Measurement processing results


CAUTION
When using registers not used in this sample program,
be sure to use a series of registers. This means that
when R[N] is used for measurement processing status
setting, R[N+1] must always be used for calibration jig
dot interval setting.
! Setting Sensor Frame
!
UFRAME_NUM = 0
UTOOL_NUM = 1
!
! Grid pattern pitch
!
R[2] = 15.0

J P[1] 30% FINE
R[1] = (-1)
PCVIS RUN CALIB_MNT1 [ST=R[1],OF=PR[1]]
WAIT R[1] <> (-1)
IF R[1] = 0, JMP LBL[999]

J P[2] 30% FINE
R[1] = (-1)
PCVIS RUN CALIB_MNT2 [ST=R[1],OF=PR[1]]
WAIT R[1] <> (-1)
IF R[1] = 0, JMP LBL[999]

J P[3] 30% FINE
R[1] = (-1)
PCVIS RUN CALIB_MNTE [ST=R[1],OF=PR[1]]
WAIT R[1] <> (-1)
IF R[1] = 0, JMP LBL[999]

PAUSE
UTOOL[1] = PR[1]
ABORT
LBL[999]
UALM[1]

Notes:
- In R[2], specify the grid pattern point interval to be used.
- In P[1] to P[3], teach positions for calibration jig measurement.
The distance between the calibration plate and the sensor is
approximately the reference standoff.
- Set P[1] and P[2] so that only R of the posture component differs
(important).
- In CALIB_MNT1, CALIB_MNT2, and CALIB_MNTE, "CALIB" is the
calibration data name of the 3D Laser Vision Sensor.


Example of position data


The robot positions of the teach data are described below.
The three taught points are used to adjust the position so that the
following posture condition is satisfied and that the entire grid pattern
can be viewed by the camera of the 3D Laser Vision Sensor.

P[1] : Posture in which the optical axis of the camera unit of the 3D
Laser Vision Sensor faces downward directly
P[2] : Posture in which only R is turned 90°
P[3] : Posture in which the sensor is inclined by a proper amount
and R is turned by -135° from P[1]
P[1]{
GP1:
UF : 0, UT : 1, CONF : 'N U T, 0, 0, 0',
X= 927.23 mm, Y= 20.86 mm, Z= -94.86 mm,
W= 0.00 deg, P = 0.00 deg, R= -90.00 deg
};
P[2]{
GP1:
UF : 0, UT : 1, CONF : 'N U T, 0, 0, 0',
X= 927.23 mm, Y= 20.86 mm, Z= -94.86 mm,
W= 0.00 deg, P = 0.00 deg, R= 0.00 deg
};
P[3]{
GP1:
UF : 0, UT : 1, CONF : 'N U T, 0, 0, 0',
X= 926.09 mm, Y = 13.07 mm, Z = -98.48 mm,
W= -26.00 deg, P = 6.00 deg, R= 135.00 deg
};

CAUTION
Proper position data depends on how the 3D Laser
Vision Sensor is installed on the robot. Note that the
posture above is not recommended in all cases.
However, the condition "Set P[1] and P[2] so that
only R of the posture component differs.” indicated
in the sample program is required at all times.


6.7 SIMPLE RE-CALIBRATION

If an accident occurs, such as the 3D Laser Vision Sensor interfering
with peripheral equipment in an application using the 3D Laser Vision
Sensor, the optical system of the sensor can be thrown out of
alignment depending on the extent of the interference, requiring 3D
measurement calibration (see Section 6.2, "3D MEASUREMENT
CALIBRATION"). To allow quick action to be taken after such an
accident, the simple re-calibration function is available.

This function makes a reference measurement when the 3D Laser
Vision Sensor is operating normally. The results are saved as
reference data. When re-calibration becomes necessary, a similar
measurement is made, and the difference between the results and the
reference data is used to perform calibration in a simplified way.

In simple re-calibration, an error in the installation of the 3D Laser
Vision Sensor relative to the robot is also corrected, so that sensor
coordinate system setting modification and nominal data reacquisition
by an application program are unnecessary.
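The idea of folding the observed difference back into the sensor frame can be sketched as follows. This is a heavily simplified illustration under the assumption that the jig is fixed; the function names are hypothetical and this is not the product's actual algorithm.

```python
import numpy as np

def recalibrated_sensor_frame(T_sensor_old, T_jig_ref, T_jig_now):
    """Fold jig-pose drift back into the sensor frame setting.

    T_sensor_old : previous sensor frame (4x4, relative to the flange)
    T_jig_ref    : jig pose measured in the sensor frame at reference time
    T_jig_now    : jig pose measured in the sensor frame now
    Because the jig has not moved, any change in its measured pose is
    attributed to the sensor; the returned frame keeps the jig's
    world pose the same as in the reference measurement.
    """
    # Correction E such that E @ T_jig_now == T_jig_ref.
    correction = T_jig_ref @ np.linalg.inv(T_jig_now)
    return T_sensor_old @ correction
```

If the new measurement matches the reference data exactly, the correction is the identity and the sensor frame is unchanged; any residual difference shifts the frame by exactly the observed drift.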

6.7.1 Execution of Simple Re-calibration

Preparation and procedure


Prepare the calibration jig indicated in Appendix B, and install it at a
fixed position on the base (installation surface) of the robot. If the
jig needs to be removed, ensure that the position and posture of the jig
can be reproduced precisely. This reproducibility precision directly
affects the precision in re-calibration.

Next, create a vision process for measuring the calibration jig (see
Chapter 9, "TEACHING AND TESTING THE 3D VISION
PROCESS"). In CameraSetup, specify the camera calibration data
subject to simple re-calibration. Moreover, this measurement uses
"disk measurement" (see Section 9.2, "DISK MEASUREMENT")
with 2D measurement set to "Use". The circular window cannot be
used. Set the model origin of the 2D measurement location tool
precisely at the center of the central dot of the calibration jig by
using the image enlargement function (see Section 3.8,
"ENLARGING AND REDUCING IMAGES") and the Center button
(see Section 7.2, "MODIFYING LOCATION TOOL MODEL").

It is recommended to back up the calibration data before starting
simple re-calibration.

The procedure is described below.


Create a robot program beforehand according to the sample program
introduced later. Two modes are available for re-calibration using a
robot program: reference data acquisition mode and calibration
execution mode.


When a robot program is executed in the reference data acquisition
mode, the following message appears first:

Detect a grid pattern according to the indicated procedure. At this
time, check that no plus signs are indicated at positions other than the
dots.

Next, when re-calibration becomes necessary, execute the same
sample program in the calibration execution mode. The following
message appears first:

Detect a grid pattern similarly according to the indicated procedure.
At this time, check that no plus signs are indicated at positions other
than the dots. Finally, the results of re-calibration are displayed. In
response to an inquiry about whether to save data by overwriting,
click the OK button. This completes re-calibration.


The reference data and the results of simple re-calibration are stored in
a file named (camera-calib-name).csv in the folder named 3DVision
under the data folder specified on the option setting screen.

Sample program
In simple re-calibration, five register areas and one position register
area are used. In the sample program introduced later, these registers
are referred to as follows:

Register : R[5] to R[9]
Position register : PR[100]

When an area other than these areas is used, modify the sample
program accordingly. These register areas are used for the following
purposes:

R[5] : Sensor coordinate system specification
R[6] : Position register area specification (100 in this case)
R[7] : Measurement processing status
R[8] : Calibration jig dot interval (in mm)
R[9] : Mode specification (1 = Reference data acquisition mode, 0
= Calibration execution mode)
PR[100] : Measurement processing results

The registers and position register above are temporarily used. The
values set in the registers need not be preserved.

CAUTION
Do not modify the position data, user coordinate
system, and tool coordinate system that are used for
execution in the reference data acquisition mode.
If any of these is modified, re-calibration cannot be
executed precisely in the calibration execution mode.


!
! PERCH CALIBRATION
! (sample program)
!
PAUSE
!
! Make sure a calibration target
! plate is fixed on an appropriate
! position, then go ahead.
!
PAUSE
!
! Confirm the following settings.
!
! R[5] : UTOOL_NUM of sensor
!        frame
! R[6] : free position register
!        number
! R[8] : grid space (mm)
! R[9] : calibration mode
!        1 : get standard data
!        0 : do calibration
!
R[5] = 1
R[6] = 100
R[8] = 15
R[9] = 0
!
UFRAME_NUM = 0
UTOOL_NUM = R[5]
!
! Point 1
! Caution : Orientation must be
! the same as Point 2.
!
J P[1] 30% FINE
WAIT 1.00(sec)
R[7] = (-1)
PCVIS RUN PROGRAM_PCH1 [ST=R[7],OF=PR[R[6]]]
WAIT R[7] <> (-1)
IF R[7] = 0,JMP LBL[999]
!
! Point 2
! Caution : Orientation must be
! the same as Point 1.
!
J P[2] 100% FINE
WAIT 1.00(sec)
R[7] = (-1)
PCVIS RUN PROGRAM_PCH2 [ST=R[7],OF=PR[R[6]]]
WAIT R[7] <> (-1)
IF R[7] = 0,JMP LBL[999]
ABORT
LBL[999]
UALM[1]

Notes:
- Teach the calibration jig measurement positions in P[1] and P[2].
The positions are to be within about ±50 mm of the reference standoff,
with the posture set to directly face the calibration jig along the
line of view. The posture components (W, P, R) of these two points
must be identical (important).
- PCVIS RUN (vision-process-name)_PCH1 and
(vision-process-name)_PCH2 call the program for 3D measurement of the
calibration jig mentioned earlier. Here, "PROGRAM" is the vision
process name.


6.7.2 Differences from 3D Measurement Calibration


Simple re-calibration and 3D measurement calibration internally
perform the same calculation to obtain the same data, that is,
calibration data. However, the conditions used differ. The
differences in the conditions are described below. If suitable
conditions can be set, recovery can be sped up by performing 3D
measurement calibration once at teaching time and then automating
calibration using simple re-calibration.

Simple re-calibration
(1) Conditions and restrictions
- The position and posture of the calibration jig can be reproduced
precisely between the reference data acquisition mode and the
calibration execution mode.
- The settings of the user coordinate system and tool coordinate
system (sensor coordinate system) are the same between the
reference data acquisition mode and the calibration execution
mode.
- The position and posture of the robot are the same between the
reference data acquisition mode and the calibration execution
mode.
- The reference data acquisition mode is used for execution in the
state of precise calibration with respect to the fixed calibration
jig.
(2) Advantages
- The need to set a sensor coordinate system is eliminated.
- Each vision process need not reacquire nominal data.
- Fully automated program-based calibration is enabled at all times
including immediately after sensor replacement.
- A level of precision almost equivalent to that obtained in 3D
calibration can be obtained.

3D measurement calibration
(1) Conditions and restrictions
- After execution, in general, the sensor coordinate system needs
to be set again, and each vision process needs to reacquire
nominal data.
(2) Advantages
- Even in the following situations, in which simple re-calibration
cannot be used, calibration can be performed if a calibration jig
is available:
- When the 3D Laser Vision Sensor is installed for the first time
- When the environment does not allow the calibration jig for simple
re-calibration to be fastened, or the position and posture of
the calibration jig cannot be reproduced
- When any of user coordinate system setting data, sensor
coordinate system setting data, and robot position data set in the
reference data acquisition mode is lost or modified


7 TEACHING AND TESTING THE LOCATION TOOL
Select the [LocationTool] tab on the vision data list screen, and
double-click a location tool icon that you want to train. You will see
the screen as shown below.


7.1 TRAIN MODEL

Click the down arrow to the right of the camera number combination
box, and select the number of the camera you want.
Click the button to display the camera live image, place a part at
the center of the field of view of the camera, and then click the
button to snap a picture of the part.
Click the Train button. A red rectangle will be plotted on the image as
shown below.
You can move it by dragging inside, or resize by dragging handles at
corners. Adjust the rectangle to enclose your part, and then click the
OK button in the dialog box.

CAUTION
The size of a rectangle for enclosing a part is related to
“OverlapLimit” (see Section 7.4 "OPTIONAL
FUNCTION OF LOCATION TOOL").
Make a setting to match the size of a workpiece
whenever possible.

Next, you can select an area inside the rectangle that you do not want
to include in your model pattern. This is called the
"Don't care region”. Move your mouse on the image while pressing
the left button. The image turns red. The red area will not be included
in the model pattern. If there is anything extra in the background of a
part, or a part has an extra feature or stain which other parts usually
do not have, these foreign features can be excluded from the model to
be taught by coloring them in red. Move your mouse while pressing the
right button to turn the image back to gray scale. If you have a don't
care region, color it in red and then click the OK button in the dialog
box. Otherwise, just click the OK button.


TIP
The thickness of a red pen for painting an area or for
restoring an area painted in red to the original state
can be chosen from the following three options:
- "Thin" when the mouse is used for painting
without using any key
- "Medium" when the mouse is used while holding
down the Ctrl key
- "Thick" when the mouse is used while holding
down the Shift key
Choose an option suitable for an area to be taught.

The 3D Laser Vision Sensor starts learning the model pattern. When
finished, the red [Not Trained] will change to green [Trained].

TIP
Software diagnoses the taught model here, and
information to which you should pay attention is
displayed if any.
The same information as introduced in ”Diagnosis” in
Subsection 7.2.1, "Model Display" is displayed. See
the subsection.

In the 2D measurement mode, more than one model can be taught, and
it is possible to find out which of the taught models corresponds to a
detected part. In this case, each model must be assigned a separate
type number. Executing the 2D vision process informs the robot
controller of the type number of the detected model, thus enabling the
type of the detected model to be identified.
When you click the OK button to exit from the screen, the taught data
is saved.


When you click the Cancel button to exit from the screen, the taught
data is lost.

NOTE
You do not have to train the model under exactly the
same conditions as those of the execution period. The
conditions include the lighting, the workpiece surface,
and the background of the image.
If you train the model with a clearly visible workpiece
and less noise, you will get a better result.
For example, if the workpiece has a reflective or dirty
surface, you may get a better result by using a
mock-up workpiece with a plain surface.


7.2 MODIFYING LOCATION TOOL MODEL

A taught model can be modified or checked as required.

Train
By clicking the Train button, the model can be taught again
according to Section 7.1,"TRAIN MODEL". For teaching, the
image displayed at that time is used.
Mask
By clicking the Mask button, a feature to be ignored can be
painted using a red pen. To paint with the red pen, move the
mouse while holding down the button. Mask modification modifies only an area
painted using a red pen; other information items such as the
model origin are not modified.
Origin
Enter the coordinates of the model origin by specifying numeric
values.
Change
By using the cursor on the image, set the position of the model
origin.
Center
The software calculates the rotation center of the model pattern
and sets the model origin at the rotation center.


Show
A taught model is displayed. For details, see Subsection 7.2.1,
"Model Display".
Use reference point(s)
When a particular function is used, a point needs to be set in
addition to a model origin. Such a point is referred to as a
reference point. Checking this check box enables a reference
point to be set. For details, see Subsection 7.2.2,"Setting a
Reference Point".
Upon completion of modification, click the OK button in the dialog
box.
7.2.1 Model Display

Plot features
Checking this check box displays the feature points (green) of the
model on this screen.
Model contrasts
The contrasts of a taught model are displayed. Max., Min., and
Ave. indicate the highest contrast, lowest contrast, and, average
contrast of the taught model, respectively.
Diagnosis
Software diagnoses the taught model, and information to which
you should pay attention is displayed if any. Information
displayed here is also displayed at the time of model teaching.
TIP
Depending on the model, the following message may be displayed:

- The position cannot be determined with the trained model pattern. Modify the part for
the model pattern.
- The found position may be unreliable with the trained model pattern. In such a case,
consider to use the extra care region or to modify the part for the model.
- The orientation cannot be determined with the trained model pattern. Uncheck the
checkbox for the Angle in the parameter tab.
- The found orientation may be unreliable with the trained model pattern. In such a
case, consider to use the extra care region or to modify the part for the model.
- The size cannot be determined with the trained model pattern. Uncheck the
checkbox for the Size in the parameter tab.
- The found size may be unreliable with the trained model pattern. In such a case,
consider to use the extra care region or to modify the part for the model.
- The contrast of the trained pattern is low. Adjust the contrast threshold.


7.2.2 Setting a Reference Point


For "edge measurement" (see Section 9.4, "EDGE
MEASUREMENT") in 3D measurement, a point needs to be set in addition to a model
origin. Such a point is referred to as a reference point.
3D measurement enables one reference point to be set. With 2D
measurement, no reference point can be set.
Change
By using the cursor on the image, set the position of a reference
point.
TIP
Set the model origin and the reference point as far apart
from each other as possible.
If they are too close to each other, the warning
"2 points are too close." is issued and the setting
cannot be made.


7.3 ADJUST LOCATION TOOL PARAMETER

The location tool has several parameters for specifying the search area
and detection thresholds.
Select the [Parameters] tab. You will see the screen as shown below.

The following table lists the meaning of each parameter, the range of
its values, and how to set them.

Item Description
Number to find Number of objects you want to find.
The 3D Laser Vision Sensor attempts to find as many objects as possible within
the specified number. (Default : 1)
To change the default, enter a new value on the screen.
Accept A value from 1 to 100 that represents the threshold between success and failure.
If the score of a found object is equal to or greater than this value, the search is
successful. Otherwise, it is unsuccessful. The score is between 1 and 100.
(Default: 75)
If a small value is set, a detection error can occur, or the detection time can
increase.
Contrast thresh Threshold of contrast used during a search. The default is 30, but it may vary
depending on the model.


Item Description
Set a higher value if the 3D Laser Vision Sensor finds stains or other wrong objects
with low contrast.
Usually, you do not need to change this value.
Search area Area to be searched on the image. (The default is the entire image.)
When a smaller area is selected, the processing speeds up. To change the
search area, click the Change button. A rectangle is displayed on the image.
Make an adjustment in the same way as for model teaching.
Bit-mapped window Used when an arbitrary search area such as a circular search area or
doughnut-shaped search area is to be specified. (The default is the unused state)
To use this window, click the Edit button. Make an adjustment by using a red pen
in the same way as for “Mask” for a model.
To return the window to the unused state, click the Remove button.
Search angle Range of angles to search. (Default: -180 to 180 degrees) The range must be
within -180 to 180 degrees. The smaller the range, the faster the processing.
Uncheck this item when the part is circular or does not rotate.
Search size Range of sizes to search. (Default: 90 to 110%) The range must be within
50 to 200 percent of the taught size. The smaller the range, the faster the
processing. Uncheck this item when you do not want to search for various sizes.
This parameter is useful when the distance between the camera and object is
not stable and the size of the object in the image changes slightly.
Search inclination Inclination to be used for search. (Default: 0 to 30 degrees. However, this
parameter is disabled by default.)
An angle from 0 to 60 degrees can be set, with the inclination of the object at
teaching time taken as 0. The smaller the range, the faster the processing.
When the check box is unchecked, no inclination search is performed. This
function is useful when the relative orientation of the camera and object is not
stable, so the object appears with a slightly different shape.

When you click the OK button to exit from the screen, your changes
are saved.
When you click the Cancel button to exit from the screen, your
changes are lost.
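The way the Accept score, contrast threshold, and "Number to find" interact can be pictured as a simple candidate filter. The sketch below is purely illustrative, with invented candidate data; the sensor's actual scoring algorithm is internal.

```python
# Hypothetical illustration of how "Accept", "Contrast thresh", and
# "Number to find" prune candidate matches. The sensor's real scoring
# algorithm is internal; the candidate data here is invented.

def filter_candidates(candidates, accept=75, contrast_thresh=30, number_to_find=1):
    """Keep at most `number_to_find` candidates whose score and contrast
    meet the configured thresholds, best score first."""
    passed = [c for c in candidates
              if c["score"] >= accept and c["contrast"] >= contrast_thresh]
    passed.sort(key=lambda c: c["score"], reverse=True)
    return passed[:number_to_find]

candidates = [
    {"score": 92, "contrast": 55},  # passes both thresholds
    {"score": 80, "contrast": 20},  # too faint: rejected by Contrast thresh
    {"score": 60, "contrast": 70},  # rejected by the Accept threshold
]
print(filter_candidates(candidates, number_to_find=2))
```

Lowering `accept` admits weaker matches, which mirrors the manual's warning that a small value can cause detection errors or increase detection time.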


7.4 OPTIONAL FUNCTION OF LOCATION TOOL

The location tool has several optional functions for optimizing its
operation for a specific application.
Select the [Options] tab. You will see the screen as shown below.

The table below indicates the meaning, specifiable value range, and
setting method of each parameter.

Item Description
OverlapLimit If patterns found in the image overlap each other to a certain degree or more, the
location tool keeps only the pattern with the highest score and rejects the others.
Overlap is determined from the overlapping area of the rectangles enclosing the
models. (Default: 75%)
When 100% is set, rejection is disabled even if patterns overlap.
FitErrorLimit Set the allowable figure distortion between the model pattern and a pattern
appearing in the image, in picture elements (pixels). (Default: 1.5 pix)
A larger value allows greater figure distortion, so detection errors can occur; set a
value that matches the actual distortion.
A larger value may be appropriate if the feature section is soft and can actually
deform relative to the model, or if the workpiece is thick or strongly inclined while
the distance between the camera and workpiece is short.
TimeoutTime If detection processing takes longer than the value set here, it ends
unsuccessfully. (Default: 30,000 ms)
If 0 is set, detection processing runs without a time limit.
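The OverlapLimit behaviour can be sketched with axis-aligned rectangles. This is a hypothetical illustration; the rectangle representation and `overlap_ratio` measure are assumptions, not the sensor's internal method.

```python
def overlap_ratio(a, b):
    """Overlap area of rectangles (x1, y1, x2, y2) as a fraction of the smaller one."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    if w <= 0 or h <= 0:
        return 0.0                       # no overlap at all
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return (w * h) / min(area(a), area(b))

def reject_overlaps(results, limit=0.75):
    """Among results whose rectangles overlap by more than `limit`, keep only
    the one with the highest score. A limit of 1.0 (100%) disables rejection,
    matching the manual's description."""
    kept = []
    for r in sorted(results, key=lambda r: r["score"], reverse=True):
        if all(overlap_ratio(r["rect"], k["rect"]) <= limit for k in kept):
            kept.append(r)
    return kept
```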


The other setting items are described below.


Extra care region
This area focuses detection on a specific part of the model. For
example, it is used when the model is circular and its angle must
be found from a small feature such as a hole or key groove. For
details, see Subsection 7.4.1, "Extra Care Region".
Plot detected features
When this check box is checked, the feature points of the model
are displayed when the results of detection are displayed.
The feature points displayed as the results of detection are
indicated in two colors: green and red.

Color Description
Green Feature point for which the corresponding feature point was found in the image
Red Feature point for which the corresponding feature point was not found in the image

Enable Statistics Func.


If this check box is checked, statistical information on whether
the feature point corresponding to each feature point of the model
has been found in the image is collected. For details, see
Subsection 7.4.2, "Statistic Function".

7.4.1 Extra care region


For parts like the one in the figure below, whose orientation is
determined by only a small feature, the 3D Laser Vision Sensor
may find the orientation less reliably. In such a case, you can
obtain a stable orientation by using the extra care region.

Extra care region


When “Extra care region” is set to “Use”, the Extra care region
button becomes available; click it. In the initial state, the entire
model is painted. As shown below, erase the painting from the
region you want to use as the extra care region.


When an extra care region is used, the object is not detected
unless the extra care region is also detected. That is, even if the
object itself is detectable, it is not detected when its extra care
region cannot be found. The detection threshold of the extra care
region is set separately from the threshold of the entire object.

Parameter Description
Accept If the score of the detected extra care region is equal to or greater than this value,
the detection is successful; otherwise, it is not. Set a value from 1 to 100.
(Default: 70)

7.4.2 Statistic Function


If “Enable Statistics Func.” is checked, statistical information on
whether the feature corresponding to each feature point of the model
has been found in the image is recorded each time a detection
operation is successful.

Show
When you click this button after several detection operations, a
screen as shown below is displayed.


The color of a feature point displayed on the image represents the
probability that that portion of the model matched the feature of
the image. For example, a portion displayed in green matched the
image 100% of the time, a portion displayed in yellow matched
50% of the time, and a portion displayed in red matched 0% of the
time. A red portion has never matched the feature of the image, so
for stable detection, the red portion should be masked (made an
area to be ignored).
“N Found” is the number of samples used for the statistics. If this
value is not large enough, the statistical result can be biased. Note
that a bias can also result if detection is performed repeatedly on
similar images only.
The statistical function records information whenever a detection
operation is performed, whether from a detection test on the
location tool screen, program test execution, or execution from the
robot controller. However, nothing is recorded if no feature is
detected. If several features are detected in one detection
operation, each detected feature is recorded.
Statistical result data is cleared by executing “Train” or “Mask”.
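The per-feature-point statistics can be pictured as a found/total counter per model feature. The sketch below is a hypothetical model of that bookkeeping; the colour-mapping thresholds are illustrative, not the sensor's exact scale.

```python
class FeatureStats:
    """Accumulate, per model feature point, how often the corresponding
    image feature was found across successful detections (a hypothetical
    sketch of the Statistic Function's bookkeeping)."""
    def __init__(self, n_points):
        self.found = [0] * n_points
        self.total = 0

    def record(self, matched):
        """matched: list of booleans, one per model feature point."""
        self.total += 1
        for i, m in enumerate(matched):
            self.found[i] += bool(m)

    def probability(self, i):
        return self.found[i] / self.total if self.total else 0.0

    def color(self, i):
        """Illustrative colour scale: green ~100%, yellow ~50%, red ~0%."""
        p = self.probability(i)
        return "green" if p >= 0.9 else "yellow" if p >= 0.3 else "red"

stats = FeatureStats(3)
stats.record([True, True, False])
stats.record([True, False, False])
print([stats.color(i) for i in range(3)])  # point 2 never matched: a masking candidate
```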


7.5 TESTING LOCATION TOOL

A detection test checks whether detection is performed correctly.
Click the button to display a live image, then place a part in the
view of the camera.
Click the button to read the image of the part.

Find
On the image currently displayed, detection is executed.
Snap+Find
After an automatic snap operation, detection is executed.
Continuous
After automatic snap operation, the detection process is executed
in succession.

Hz, Vt

The results of detection are plotted on the image, and data items
such as processing time, position, angle, and score are displayed
as shown above. Vt and Hz represent the position as coordinate
values in the vertical and horizontal directions, respectively,
with the upper-left corner of the image as the origin.


The processing time displayed here does not include the time for
snapping an image or posting results; only the time required for
detection itself is displayed.

Show runner-up
If this check box is checked, an object that was not detected but
almost satisfied the required score, contrast, angle, size, and so
forth is displayed as a detection result.
This setting is valid for this detection test only.
An object that narrowly missed detection is displayed with a
yellow rectangle in the image, and its data is shown in red in the
table. By comparing the results shown in red with each parameter
of the location tool, you can identify the parameter(s) that caused
the object not to be detected.


CAUTION
This function displays objects that were rejected in the
final check of the detection results against the specified
parameter values. It does not guarantee that every
near-miss is displayed; for example, when the score
threshold is 70, an object whose score is between 60
and 70 is not necessarily displayed.

ImageFeatures
From the currently displayed image, the feature points to be used
for detection processing are extracted and displayed in yellow.
The display is affected by the search window, bit-mapped window,
and contrast threshold. This function is particularly useful for
adjusting the contrast threshold.


8 TEACHING AND EXECUTION THE 2D VISION PROCESS
On the 2D measurement vision data list, click the [VisionProcess] tab,
and double-click the vision process icon you want to use in teaching.
The Vision Process Training and Test screen shown below appears.


8.1 SETUP VISION PROCESS

Vision process teaching is divided into basic settings and applied
settings. Basic settings are mandatory and have no default values.
Applied settings have default values; modify them as required for
each application.
Basic Setting
Camera Setup
Select a camera setup used in the vision process. When you
press the Select button, the camera setup list is displayed.
Select the camera setup you want to use and click the OK button.
Location Tool
Select location tools used in this vision process. Click the Add
button. The location tool list will be displayed. Select the
location tool and click the OK button.
When the vision process is executed, the selected location tool
returns the found results (found positions). If "Number to find" in
the location tool is greater than one, the location tool returns the
results in descending order of score (when the option switch
“Sort results by size” is disabled).
You can select multiple location tools.
If multiple location tools are selected, the "Number to find" of
the first selected location tool becomes the number to find of the
vision process, and the results found by all selected location tools
are returned together in descending order of score (when the
option switch “Sort results by size” is disabled).
To switch the robot's motion depending on which location tool
detected the position, enable the option switch “Send Model ID to
robot” so that the type number of the detecting location tool is
output at the same time.

For details of the output, see Section 8.3, "EXECUTION BY
ROBOT CONTROLLER".
Robot to send to
Select robots to which the execution result of the vision process
is sent. Click the Add button. The robot list will be displayed.
Select the robot to which the result is sent, and click the OK
button. You can select multiple robots. The results are sent in the
order they are selected.

Applied setting


Shutter speed mode


The shutter speed mode is a function for changing the shutter
speed of the camera by software. It is useful when the sensor is
used in diverse environments. There are three shutter speed
modes, listed below. Select the one that suits your application.

Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by about several hundred milliseconds.
Single mode An image is taken according to the setting of “Speed const.”, and detection is
performed. Select a value from 0 (bright) to 16 (dark) to set “Speed const.”.
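The idea behind Multi mode — merging several exposures into one image with a wider dynamic range — can be sketched as a weighted average in normalized radiance. The merge below is only illustrative; the sensor's actual merge algorithm is not documented, and the weighting function is an assumption.

```python
def merge_exposures(images, exposures):
    """Merge same-scene images taken at different relative exposures into one
    wider-dynamic-range image. `images` are nested lists of 0-255 pixel
    values; `exposures` are relative exposure times. Each pixel is averaged
    in normalized radiance (value / exposure), weighting mid-range pixels
    (neither saturated nor black) most heavily."""
    h, w = len(images[0]), len(images[0][0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            num = den = 0.0
            for img, t in zip(images, exposures):
                v = img[y][x]
                wgt = 1.0 - abs(v - 127.5) / 127.5 + 1e-6  # trust mid-range values
                num += wgt * (v / t)                       # radiance estimate
                den += wgt
            out[y][x] = num / den
    return out
```

A dark exposure keeps highlights unsaturated while a bright one resolves shadows; the merged value falls between the two radiance estimates, which is the "wider dynamic range" effect the manual describes.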

Image Saving
Select a condition for saving an image. Select one of the
following three options:

Option Description
Don't save The vision process does not save images.
Save if failure When the vision process fails to find, it saves the image to a bmp file.
Saving such images is useful to investigate later.
Usually, select this option.
Save always When the vision process runs, it saves the image to a bmp file.
This is useful to evaluate stability of the system just after setting up the line.
Select "Save if failure" after you verify the system stability, because this option
uses a large amount of hard disk space.

The .bmp file is saved under the name
"(vision_process_name)@(date-time).bmp" in the image directory
specified on the option screen. See Chapter 11 for the option screen.

TIP
The amount of available hard disk space is limited.
Using too much of the hard disk may cause problems
such as the PC failing to boot. The 3D Laser Vision
Sensor therefore limits the number of image files and
does not save images beyond that limit. You can
change the limit on the Options screen. See Chapter 10,
"Options" for details.

Output Type
Select an output format used for the search result from the two
formats shown below.


Option Description
2-D Location and The 3D Laser Vision Sensor returns the 2-D location and orientation of the found
Orientation part to the robot position register. X and Y indicate the location, R indicates the
orientation, and Z, W, and P are always zero.
3-D Line The 3D Laser Vision Sensor returns the 3-D line passing through the camera
center point and the model origin of the found part to the robot position register.
X, Y, and Z indicate the intersection of the line and the calibration plane, W and P
indicate a vector along the line, and R is 0 when the option switch “Add angle
information to results” is disabled. When the option switch is enabled, the
detected angle is reflected in R.
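For the “3-D Line” output, X, Y, and Z are where the returned line meets the calibration plane. That geometry can be sketched as follows, assuming for illustration that the calibration plane is z = 0 in the sensor frame (the actual frames and angle conventions are defined by the sensor's calibration).

```python
def line_plane_intersection(camera_center, direction, plane_z=0.0):
    """Intersect the line through `camera_center` with direction vector
    `direction` (toward the found model origin) and the plane z = plane_z.
    Returns the intersection point (x, y, z). Illustrative geometry only."""
    cx, cy, cz = camera_center
    dx, dy, dz = direction
    if dz == 0:
        raise ValueError("line is parallel to the calibration plane")
    t = (plane_z - cz) / dz          # parameter where the line reaches the plane
    return (cx + t * dx, cy + t * dy, plane_z)

# Camera 500 mm above the plane, ray pointing down and slightly in +x:
print(line_plane_intersection((0.0, 0.0, 500.0), (0.1, 0.0, -1.0)))  # ~ (50.0, 0.0, 0.0)
```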

Options
Select other options. There are the following options.

Options Description
Log the resulting data When this switch is checked, each time the vision process is executed, the
processing time, found position, and other resulting data are logged in a file. The
file is created under the name "(vision-process-name).csv" in the “2dvision” folder
in the folder specified in Data folder on the Options screen. (See Chapter 11,
"Options”.) This file can be displayed as a graph with Excel or another
spreadsheet application.
Send model ID to robot When this switch is checked, the model ID of the location tool that found each
result is sent to the robot controller. Use this option when multiple location tools
are specified in the vision process and the model must be identified.
A model ID is specified on the location tool training screen. A model ID is stored
in the register with the same number as the position register in which the
corresponding found position is stored.
Sort results by size When two or more results are detected, they are usually output in descending
order of their score. If this item is checked, however, the results are output in the
order of their size, rather than their score.
Add angle information to Displayed only when “3-D Line” is selected. If this item is checked, rotation angle
results information is reflected in the final result of detection.

When you click OK to exit from the screen, the taught data is saved.
When you click Cancel to exit from the screen, the taught data is lost.


8.2 TEST VISION PROCESS

Place a part in the field of the camera view, and click the Run button.
The found position is plotted on the image.
The resulting values are displayed on the run-time monitor.
To display the run-time monitor, select [Monitor] from the [View]
menu of the 3D Laser Vision Sensor.

The time displayed on the run-time monitor is the total processing
time of the vision process, including the time for snapping the image.
However, the test performs no communication, so the displayed time
does not include communication time.

Image
Select an image subject to test execution.

Option Description
Snap from a camera In test execution, an image is snapped from the camera.
Load BMP files In test execution, an image file stored on hard disk is used. For details, see
Subsection 8.2.1, "Use BMP Files".

Continuous Run
This function is enabled when “Snap from a camera” is set.
Check this item and click the Run button. Image snapping and
detection are repeatedly executed.
Repeated execution lasts until you click Cancel in the following
dialog box:


8.2.1 Use BMP Files


When “Load BMP files” is set, a wildcard can be used for image file
name specification. If multiple image files are subject to detection as
the result of using a wildcard, a screen as shown below appears.

Clicking starts execution.


The meaning of each icon is indicated below.

Icon Description
Executes in succession in reverse order.
Executes using the immediately following image and stops execution temporarily.
Stops execution temporarily.
Executes using the immediately preceding image and stops execution temporarily.
Executes in succession.

By storing images with “Save always” selected in “Image Saving”,
this function lets you check the state of detection before and after
an intermittent stop caused, for example, by a detection error.
Moreover, after the location tool or vision process is modified, the
stored images can be used to check whether the change has any
side effects.
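Stepping a detection test through stored images matched by a wildcard can be reproduced offline. A minimal sketch using Python's glob, where `run_detection` is a hypothetical callback standing in for the detection step:

```python
import glob

def run_on_saved_images(pattern, run_detection, reverse=False):
    """Run a detection callback over every image file matching `pattern`
    (e.g. a '(vision_process_name)@*.bmp' wildcard), in forward or reverse
    order, collecting (filename, result) pairs for later comparison."""
    files = sorted(glob.glob(pattern), reverse=reverse)
    return [(f, run_detection(f)) for f in files]
```

Running the same file set before and after modifying a location tool and diffing the two result lists is one way to check for the "side effects" mentioned above.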


8.3 EXECUTION BY ROBOT CONTROLLER

A robot controller starts a vision process on the 3D Laser Vision
Sensor through Ethernet communication and receives the results.
The execution of a vision process by a robot controller is summarized
as follows:
- The vision process on the 3D Laser Vision Sensor is specified
by name for execution.
- If the robot corresponding to the robot controller has already been
set up on the 3D Laser Vision Sensor and a connection has been
established, the robot controller can start the vision process on the
3D Laser Vision Sensor.
- Results are sent to all robot controllers specified in the started
vision process on the 3D Laser Vision Sensor.
It is assumed that the robot controller is connected to the 3D Laser
Vision Sensor via an Ethernet cable and that communication is
possible. (On the "robot" data teach screen of the 3D Laser Vision
Sensor to be started, check that the state of connection with the robot
controller is “Connected”.)
To start the vision process for the 3D Laser Vision Sensor from the
robot controller, execute the following instruction with a TP program:

PCVIS RUN (vision-process-name)_PRO [ST=R[n], OF=PR[m]]

or

PCVIS RUN (vision-process-name) [ST=R[n], OF=PR[m]]

In (vision-process-name), enter the name of the 2D vision process to
be executed on the 3D Laser Vision Sensor. Up to 16 characters,
including "_PRO", can be entered.
In n and m, enter the numbers of the register and position register for
storing measurement results, respectively. (A number from 1 to 255
may be entered. If the number of a nonexistent register or position
register is specified, an error occurs at execution time.)
Measurement results are stored in a register and position register as
indicated below.

Location Description
R[n] A number indicating whether the measurement was successful. When the
measurement is successful, the detection count is stored. When the
measurement is unsuccessful, 0 is stored.
R[m] ~ If “Send model ID to robot” is checked, as many model IDs as the detection
count are stored in registers starting with R[m]. If “Send model ID to robot” is
unchecked, or when detection is unsuccessful, no data is stored.
PR[m] ~ As many measurement results as the detection count are stored in XYZWPR
format in position registers starting with PR[m]. When detection is unsuccessful,
no data is stored.
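The register layout above can be mirrored as a small data-packing routine. The sketch below is a hypothetical Python model of the mapping onto R[n], R[m]~, and PR[m]~; the actual transfer is performed by the controller and the sensor software.

```python
def pack_results(results, send_model_id=True):
    """Map found results onto the register layout described above.
    Each result is (model_id, (x, y, z, w, p, r)).
    Returns (status_register, id_registers, position_registers), i.e.
    the values that would land in R[n], R[m]~, and PR[m]~."""
    if not results:
        return 0, [], []            # R[n] = 0 on failure; nothing else stored
    ids = [mid for mid, _ in results] if send_model_id else []
    poses = [pose for _, pose in results]
    return len(results), ids, poses  # R[n] = detection count

status, ids, poses = pack_results([(1, (10.0, 20.0, 0.0, 0.0, 0.0, 45.0))])
```

This mirrors why the sample program below initializes R[1] to -1 and waits for it to change: R[n] receiving 0 or a positive count signals that the vision process has finished.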


A sample robot program for executing a 2D vision process is shown
below. (The vision process name is PROGRAM1.)

1: UTOOL_NUM = 1
2: R[1]=(-1)
3: PCVIS RUN PROGRAM1 [ST=R[1], OF=PR[2]]
4: WAIT R[1]<>(-1)
5: IF R[1] = 0, JMP LBL[99]
:
10: ! ERROR ABORT
11: LBL[99]
[End]

TIP
In the case of a 2D vision process, measurement
results are output in a "sensor coordinate system" at all
times.

CAUTION
Results sent from the 3D Laser Vision Sensor to the
robot controller are not compensation data but
measured position data. For the method of calculating
compensation data, see Appendix C, "ROBOT
POSITION COMPENSATION".

CAUTION
Ensure that the robot controller does not start another
vision process for the 3D Laser Vision Sensor before the
previously started one has completed (duplicate
execution is prohibited). In other words, after starting a
vision process from the robot controller, first check that
the success/failure information from that vision process
has been stored in the robot register, then start the next
vision process for the 3D Laser Vision Sensor.


9 TEACHING AND TESTING THE 3D VISION PROCESS

Select the [VisionProcess] tab from the 3D measurement vision data
list displayed on the screen, and double-click the icon of the vision
process you want to teach. The Vision Process Training and Test
dialog box shown below appears. (The following screens are for disk
measurement.)


9.1 TYPES OF 3D MEASUREMENT FUNCTIONS

3D measurement is provided with the following four functions:

Option Description
Disk measurement Measures and outputs the position and posture of a workpiece.
Displacement Measures and outputs the displacement (Z coordinate) of a workpiece. When 2D
measurement measurement is used together in this measurement, X, Y, and R coordinates are
also output.
Edge measurement Measures and outputs the edge point of a laser beam radiated onto a workpiece.
Cross section Measures and outputs a particular position such as the summit of a cross section
measurement based on the cross section figure of a workpiece obtained using a laser beam.

These functions are selected and set up during vision process
teaching. Before this, the camera setup (required) and the location
tool (if necessary) must be set up. If the functions are to be used to
detect more than one type of object, vision process teaching must be
carried out once for each object type.


9.2 DISK MEASUREMENT

The disk measurement function is used to obtain the position and
posture of a flat workpiece. This function is useful for applications
that measure workpieces with a relatively large flat area, such as
vehicle body panels and parts to be machined by a machine tool.

9.2.1 Information Obtained


Disk measurement finds the 3D position and posture of a workpiece
as shown below.

[Figure: The 3D Laser Vision Sensor consists of a laser slit beam
projector and a CCD camera. Laser beam image processing (laser ON)
yields the distance from the sensor and the posture of the plane on
which the laser beam lines lie; 2D image processing (laser OFF)
yields the position (x, y) and rotation θ of a 2D feature on the plane.
Synthesizing both yields the 3D position and posture information
(x, y, z, w, p, r).]

TIP
Measurement without 2D image processing (that is,
measurement based on laser beam image processing
only) is also possible. In this case, the position is
assumed to be the point of intersection of the two laser
beam lines, and the rotation θ is assumed to be 0.
Because only the laser beam is used, this type of
measurement can be made even in the dark. However,
position and rotation information from 2D features
cannot be obtained.


9.2.2 Settings

3D setting
The 3D measurement setting is the basic setting for obtaining 3D
information by using a laser beam. This setting must not be omitted.

Camera Setup
Select a type of camera setup to be used by the vision process.
Clicking the Select button displays a list of camera setup names.
Select a desired camera setup name and click the OK button.
Function
Select the measurement function used by the vision process:
either displacement measurement or disk measurement,
whichever is desired.
Window setting
This function extracts only the necessary portion of the laser
information. Not all of the laser beam radiated toward the
workpiece actually strikes it; windows are used to exclude the
laser beam that falls on objects other than the workpiece.
Click the Window setting button. A dialog box for teaching the
window area appears. Proceed as directed in the dialog.

Teach an area to be used for 3D measurement as a window.


Teach an area to be excluded from 3D measurement as a window.

If the setting is correct, click the OK button to end the setting.
If the setting needs to be corrected, click the Cancel button and
make the setting again.

Use circle window


When this check box is checked for window setting, a concentric
two-circle window is set.
Clicking the Window setting button displays a dialog box for
teaching the window. Follow the instructions in the dialog box.
First, set the center of the concentric circles.

Next, teach a point on the periphery of the outer circle of the
circular area to be used for 3D measurement.


Then, teach a point on the periphery of the inner circle of the
circular area to be excluded from 3D measurement.

If the setting is correct, click the OK button to end the setting.
If the setting needs to be corrected, click the Cancel button and
make the setting again.


Use window mask


When this check box is checked for window setting, a screen as
shown below is displayed at the end of window setting. Portions
to be excluded from 3D measurement can then be painted with a
red pen: move the mouse while holding the button down. When
you are done, click the OK button to end the setting. The screen
below shows a rectangular window, but the same adjustment can
be made with a circular window.

Parameter
Performing 3D measurement requires various adjustments
according to the object to be measured. Usually these parameters
do not need to be changed, but adjust them as required.
Pressing the Parameter button opens the following form.


Items Description
Distance Threshold Threshold of the distance between the two straight lines obtained from the two
laser beams. If the distance is longer than the threshold, plane detection fails.
(Default: 3.0 mm)
Brightness Threshold Threshold for laser beam extraction. If the laser beam point string is undetectable
because of a bright environment, decrease this value within the range that still
ensures correct point string detection.
If the reflected laser beam is too strong and glares, increase this value. (Default:
50)
When making an adjustment, find a threshold that separates the laser beam from
the background as described in Section 9.8, "SHOWING BRIGHTNESS
DISTRIBUTION".
Slit Points Threshold Threshold for extraction of the straight line obtained from the laser. If the plane to
be measured is small, decrease this value within the range that still ensures
correct detection. (Default: 50)
Slant Threshold This parameter limits the detected tilt angle. If the measurement result indicates
that the workpiece is tilted more than this angle, an error is reported. (Typical
value: 90 degrees)

After you finish with adjustments, click the OK button to apply what
you specified. If you want to discard what you specified, click the
Cancel button to close the form.
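Two of the acceptance checks above — the Distance Threshold between the two laser lines and the Slant Threshold on the resulting plane — amount to basic vector geometry. A hedged sketch, assuming the fitted lines are given as a point plus a unit direction vector (hypothetical data, not the sensor's internal representation):

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def line_gap(p1, d1, p2, d2):
    """Shortest distance between two 3-D lines, each given as a point and a
    unit direction vector."""
    n = cross(d1, d2)
    nn = math.sqrt(dot(n, n))
    w = tuple(b - a for a, b in zip(p1, p2))
    if nn < 1e-12:                                 # parallel lines
        proj = tuple(w[i] - dot(w, d1) * d1[i] for i in range(3))
        return math.sqrt(dot(proj, proj))
    return abs(dot(w, n)) / nn

def slant_deg(normal, up=(0.0, 0.0, 1.0)):
    """Tilt of the plane normal away from the reference direction, in degrees."""
    c = abs(dot(normal, up)) / math.sqrt(dot(normal, normal))
    return math.degrees(math.acos(min(1.0, c)))

def plane_accepted(p1, d1, p2, d2, normal, dist_thresh=3.0, slant_thresh=90.0):
    """Apply the two checks: line gap within Distance Threshold and plane
    tilt within Slant Threshold (defaults from the table above)."""
    return line_gap(p1, d1, p2, d2) <= dist_thresh and slant_deg(normal) <= slant_thresh
```

Two skew laser lines that miss each other by more than a few millimetres indicate a bad plane fit, which is exactly the failure the Distance Threshold is meant to catch.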

Shutter speed mode


The shutter speed mode is a function for changing the shutter
speed of the camera by software. This mode is useful for using
the sensor in various environments. Select one of the three
shutter speed modes indicated below.

Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi Mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by about several hundred milliseconds.
Single mode An image is taken according to the settings of “Speed const.” and “Snap times”,
and detection is performed.
Select a value from 0 (bright) to 16 (dark) to set “Speed const.”. When 0 is set,
the same brightness as set by “Not Use” is produced.
“Snap times” represents the number of snapping operations. If the state of laser
beam reception is poor as in the case of viewing a dark workpiece, the
measurement environment can be improved by specifying a number greater than
1. However, as a greater number is specified, a longer detection time is required.
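Multi Mode's wider dynamic range can be pictured as a per-pixel fusion of differently exposed snaps: each output pixel takes the best-exposed sample and normalizes it by that snap's relative exposure. This is a generic illustration of the idea, not FANUC's algorithm; the mid-range scoring rule and the exposure scales are assumptions.

```python
def fuse_exposures(images, exposure_scales):
    """Combine images taken at different shutter speeds into one
    wider-dynamic-range image (illustrative only).

    images: same-sized 2D lists of pixel values 0-255.
    exposure_scales: relative exposure of each image (1.0 = base).
    """
    h, w = len(images[0]), len(images[0][0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            best = None
            for img, scale in zip(images, exposure_scales):
                v = img[y][x]
                # Prefer mid-range pixels: neither clipped nor dark.
                score = -abs(v - 128)
                if best is None or score > best[0]:
                    best = (score, v / scale)
            out[y][x] = best[1]  # normalized radiance estimate
    return out

# A pixel saturated at the slow shutter is recovered from the
# fast-shutter (scale 0.25) image: 60 / 0.25 = 240.
light = [[255]]          # slow shutter, scale 1.0 (saturated)
dark = [[60]]            # fast shutter, scale 0.25
print(fuse_exposures([light, dark], [1.0, 0.25])[0][0])  # 240.0
```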

- 116 -
9.TEACHING AND TESTING THE 3D VISION PROCESS B-81444EN/03

2D setting
When a 3D measurement is made, 2D measurement information can
be used for position identification. A setting for this purpose is made
in 2D measurement setting. When 2D information is not required,
choose “Not Use” in the selection of “Use / Not Use” below.

Use / Not Use


When 2D measurement information is to be used, choose “Use”.
When 2D measurement information is not to be used, choose
“Not Use”.
When 2D measurement information is used, no 3D measurement
is made if detection fails in 2D measurement.
Location Tool
If “Use” is selected for 2D measurement, select a location tool to
be used with the vision process. Clicking the Select button
displays a list of location tools. Select a location tool to be used,
then click the OK button.
2D feature gap from plane
If you selected "Use" for 2D setting shown above, set the gap
between the height of the feature targeted for 2D measurement
and the height of the plane targeted for 3D measurement. If the
2D measurement features are closer to the 3D Laser Vision
Sensor than the 3D measurement plane, set a positive (+)
value.


[Figure: section example showing the 3D vision sensor above the workpiece, the 2D measurement features, the 3D measurement plane, and the gap (+) between them.]

Shutter speed mode


The setting of the shutter speed adjustment mode is almost the
same as for “3D setting”. However, the setting of “Snap times”
is not applicable here.

Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by about several hundred milliseconds.
Single mode An image is taken according to the setting of “Speed const.”, and detection is
performed. Select a value from 0 (bright) to 16 (dark) to set “Speed const.”.
When 0 is set, the same brightness as set by “Not Use” is produced.

Others


Set “Others” according to each application.
However, the setting of “Robot to send to” must not be omitted.

Robot to send to
Select a robot controller to which the results of vision process
execution are to be sent. Clicking the Add button displays a list
of robots. Select the name of a transmission destination robot
then click the OK button. Multiple robots can be selected.
When multiple robots are selected, the results of vision process
execution are sent to all selected robot controllers.
Output frame setting
The results of 3D Laser Vision Sensor measurement need to be
related to the coordinate system of the robot. Here, select one
of the three options indicated below to determine which
information is used to establish this relation.

Option Description
Sensor frame Position and posture information in the sensor coordinate system is output.
This option may be used in an application in which the 3D Laser Vision Sensor is
installed at a fixed position and the sensor coordinate system is set to the user
coordinate system.
User frame: By using the current robot position at the time of measurement, this option
Conv with robot CurPos converts output information to position and posture information in the user
coordinate system. Select this option usually.
Select a sensor coordinate system as the tool coordinate system and select an
appropriate user coordinate system before starting the vision process for the robot.
User frame: By using the value of the position register of the robot controller set by “PosReg”,
Conv with robot PosReg this option converts output information to position and posture information in the
user coordinate system.
This option may be used in an application that makes a measurement after holding
a workpiece then corrects the position where to place the workpiece.
Before starting the vision process for the robot, store in the position register the
position in the state where an appropriate user coordinate system is selected and a
sensor coordinate system is selected as the tool coordinate system.
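The "Conv with robot CurPos" option amounts to chaining homogeneous transforms: the sensor-frame result is first brought into the robot base frame via the current sensor (tool) pose, then into the user frame. A minimal sketch with plain 4x4 matrices follows; translation-only poses are used for brevity, and all names are illustrative, not the sensor software's API.

```python
def mat_mul(a, b):
    """Multiply two 4x4 homogeneous matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Homogeneous matrix for a pure translation."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def to_user_frame(t_base_sensor, t_base_user_inv, p_sensor):
    """Convert a sensor-frame measurement to the user frame.

    t_base_sensor: robot base -> sensor pose at measurement time
                   (the sensor frame selected as the tool frame).
    t_base_user_inv: inverse of robot base -> user frame.
    p_sensor: measured point as a homogeneous translation matrix.
    """
    t_base_point = mat_mul(t_base_sensor, p_sensor)
    return mat_mul(t_base_user_inv, t_base_point)

# Sensor 500 mm above the user-frame origin along Z; the user frame
# coincides with the robot base, so its inverse is the identity.
ident = translation(0, 0, 0)
t_bs = translation(0, 0, 500)
p = translation(10, 20, -100)        # point 100 mm below the sensor
result = to_user_frame(t_bs, ident, p)
print([row[3] for row in result[:3]])  # [10, 20, 400]
```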

CAUTION
The term "robot" used here means a robot controller
that started the 3D Laser Vision Sensor. The 3D
Laser Vision Sensor obtains the current position and
position register information from the robot controller
that started the 3D Laser Vision Sensor. Do not
confuse this robot controller with “Robot to send to”,
although both represent the same robot controller in
many cases.

Image and 3DV position data saving


Select a condition for saving images and 3D Laser Vision Sensor
position data. Select one of the options indicated in the table
below.

Option Description
Don’t save No image is saved.
Save if failure When a vision process is executed, images for which detection failed are saved in files.
Select this option usually.
By saving a failed image, the cause can be investigated later. For
investigation of the cause, test execution should be performed by using an image
file and adjusting the parameters. For test execution using an image file, see
Subsection 9.6.2, "Use BMP Files".
Save always Image files are saved each time a vision process is executed. This option is
useful in a stage where detection stability is evaluated, for example, immediately
after system startup.
After normal operation starts, switch the setting so that images are saved only
when undetected.

An image is saved in a file named (vision-process-name)12D@(date-time).bmp
(only when 2D measurement is used), (vision-process-name)13DSubt1@(date-time).bmp,
or (vision-process-name)13DSubt2@(date-time).bmp.
3D Laser Vision Sensor position data is saved in a file named
(vision-process-name)@(date-time).psn.
All of these files are saved in the folder specified in Image folder on the
option setting screen. (See Section 11.1, "OPTION SETTING".)
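The naming pattern above can be reproduced in a few lines. The date-time format is not specified in the manual, so an ISO-like stamp is assumed here purely for illustration; the function name is likewise hypothetical.

```python
from datetime import datetime

def saved_file_names(process_name, use_2d, when=None):
    """Build file names following the documented pattern.

    The exact date-time format is an assumption (ISO-like stamp);
    the manual only shows the surrounding pattern.
    """
    stamp = (when or datetime.now()).strftime("%Y%m%d-%H%M%S")
    names = []
    if use_2d:
        names.append(f"{process_name}12D@{stamp}.bmp")
    names.append(f"{process_name}13DSubt1@{stamp}.bmp")
    names.append(f"{process_name}13DSubt2@{stamp}.bmp")
    names.append(f"{process_name}@{stamp}.psn")  # sensor position data
    return names

print(saved_file_names("MYPROC", use_2d=False,
                       when=datetime(2004, 1, 2, 3, 4, 5)))
```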

CAUTION
The hard disk has a limited capacity. If it is used up,
the PC cannot be started, or other troubles occur. The
3D Laser Vision Sensor system limits the hard disk
area that can be used to save image data on the
Options screen. Be careful not to save more image
data than the limit.

Options
Set up option switches. There are two option switches as listed
below.

Items Description
Log the resulting data Each time the vision process is executed, the processing time, found position,
and other resulting data are logged in a file. The file is created under the
name "vision-process-name.csv" in the “3dvision” folder in the folder specified
in Data folder on the Options screen. (See Chapter 11, "Options”.) A .csv
file can be displayed as a graph with Excel or another spreadsheet application.
Measure narrow area For a workpiece whose laser beam radiation area is narrow, a failure is likely to
occur.
In such a case, this option may be able to stabilize detection. This option
increases the detection time by about several hundred milliseconds.
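The per-run .csv log lends itself to quick scripted analysis as an alternative to a spreadsheet. The sketch below averages one numeric column; the column name is a placeholder, since the manual does not list the actual headers of the log file.

```python
import csv
import io

def average_processing_time(csv_text, column="time_ms"):
    """Average one numeric column of a vision-process result log.

    "time_ms" is a hypothetical column name used for illustration;
    substitute the real header found in (vision-process-name).csv.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    values = [float(r[column]) for r in rows if r.get(column)]
    return sum(values) / len(values) if values else None

# Two logged runs taking 120 ms and 180 ms:
log = "time_ms,x,y\n120,1.0,2.0\n180,1.1,2.1\n"
print(average_processing_time(log))  # 150.0
```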

If you click the OK button to exit from this dialog box, the taught data
is saved. If you click the Cancel button to exit the dialog box, the
taught data is lost.
Test
Image
Select an image subject to test execution.

Option Description
Snap from a camera In test execution, an image is snapped from the camera.

Load BMP files In test execution, an image file stored on hard disk is used. When 2D
measurement is used, “BMP File(2D)” is to be set for detection. When 2D
measurement is not used, “BMP file(Slit1)” is to be set for detection. For
details, see Subsection 9.6.2, "Use BMP Files".

Continuous Run
This check box is enabled when “Snap from a camera” is
selected. For details, see Subsection 9.6.1, "Continuous Run".
Get base position data
If “Use” is selected for 2D measurement as described earlier, the
window for the laser is dynamically moved for 3D measurement
based on the detection results of 2D measurement. So, the
detected 2D measurement feature needs to be related to the
relative position of the window for the laser. To perform this
operation, check this check box and execute a test. The result
is not saved until you click the OK button or
Apply button. When data is acquired, the date and time of
acquisition is displayed.
CAUTION
Be sure to acquire reference position data in 3D
measurement that uses 2D measurement. Moreover,
perform this operation in the state where the window for
the laser was taught. If the program is executed without
satisfying this condition, the window is not set properly on
the image, resulting in an incorrect detection or a failure in
detection.

After the window for the laser is taught, the following can occur:
- You want to make a modification to the window for the laser.
- You did not acquire base position data.
In these cases, actions described below can be taken.

When you want to make a modification to the window for the laser…
(1) An image file named "(vision-process-name)_WindowTeach.bmp"
is included in the folder named "3dvision" under “Data
folder”. (For information about “Data folder”, see Section 11.1,
"OPTION SETTING".) Display this file by choosing
[File]→[Open BMP file...] from the main menu. This produces
the same state as present when the window for the laser was
taught.
(2) A modification to the window for the laser can be made by
making a setting again in this state. Base position data, if
already acquired correctly, need not be acquired again.

When you did not acquire base position data…


(1) In specification of “Image” in Test, select “Load BMP files”.
(2) Check “Get base position data”. Be sure to check this check
box before setting (3) below. Otherwise, an image file selection
in (3) cannot be made.
(3) In specification of “BMP File(2D)”, select "(vision-process-
name)_WindowTeach.bmp" in the folder named "3dvision"
under “Data folder”.
(4) Click the Run button. The result is satisfactory when a correct
detection is made.

9.2.3 Teaching Procedure


This subsection describes the basic teaching procedure for disk
measurement. As an example, the plate of the calibration jig is
measured. This procedure can be used to check the precision of
calibration.

(1) Place a workpiece and determine a measurement position.


(2) When 2D measurement is used, teach the location tool.
In this example, a pattern around the center of the plate of the
calibration jig is selected as a model.
Particularly when the origin of a model needs to be set precisely as
in a case where the precision of calibration is checked, set the
model center of the location tool precisely by using the image
enlargement function (see Section 3.8, "ENLARGING AND
REDUCING IMAGES") and the Center button (see Section 7.2,
"MODIFYING LOCATION TOOL MODEL").

(3) Make settings for 3D measurement. Set camera setup data,
measurement functions, a transmission destination robot, window,
and so forth. For details of setting, see Subsection 9.2.2,
"Settings".
Particularly when 2D measurement is used together, select “Use”
in the setting of 2D measurement, and set the location tool taught
in (2) above as “Location Tool”.
The screen below shows how the window is set. The portion
outside the plane is masked by setting after checking “Use window
mask”.

(4) When 2D measurement is used together, obtain reference position
data while settings for 3D measurement are made. The screens
below show an example of the results of obtaining a reference
position, and an output message.

(5) For confirmation, perform test execution. The screen below
shows an example of the results of detection. Make a parameter
adjustment as required while viewing the results of detection. As
a result of detection, laser beam points are displayed. For the
points display, two colors are used: red (thick) and light blue
(thin).

Color Description
Red (thick) Points extracted as a laser beam point string and used for calculation. If points
other than a point string such as noises are displayed in this color, the precision
may have deteriorated. In such a case, a parameter adjustment is required.
Light blue (thin) Points extracted as a laser beam point string but not used for calculation. In
some cases, points extracted correctly may be displayed in this color.

(6) When an adjustment is completed, the teaching ends.


9.3 DISPLACEMENT MEASUREMENT

Displacement measurement is used to obtain a Z value in a reference
coordinate system. Displacement measurement is useful for an
application that requires information such as a height from the
reference surface of a workpiece and the distance between a sensor
and workpiece.

9.3.1 Information Obtained


Displacement measurement finds the 3D position and rotation of a
workpiece as shown below. The "distance from the sensor" of the
plane where a laser beam line is placed is found using an average
value obtained by converting the laser slit point string to a 3D
position.

[Figure: With the laser ON, laser beam image processing obtains the distance d of the plane where the laser beam lines are placed from the sensor. With the laser OFF, 2D image processing obtains the position (x, y) and rotation θ of a 2D feature on the plane. The two results are synthesized into 3D position and posture information (x, y, z, 0, 0, r).]

TIP
Measurement not accompanied by 2D image
processing (that is, measurement based on laser beam
image processing only) is possible. In this case,
position and posture information other than the Z
coordinate cannot be obtained. Because only a laser
beam is used, this type of measurement has the
advantage that it can be made even in darkness.
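The displacement computation described above — the distance from the sensor taken as the average of the laser slit point string converted to 3D positions — can be sketched directly. The function name is illustrative.

```python
def displacement_from_points(points_3d):
    """Return the plane's displacement (distance from the sensor)
    as the average Z of the converted laser slit point string, per
    the description in Subsection 9.3.1.
    """
    zs = [p[2] for p in points_3d]
    return sum(zs) / len(zs)

# Three slit points converted to 3D, all near z = 100 mm:
pts = [(0.0, 0.0, 99.0), (1.0, 0.0, 101.0), (2.0, 0.0, 100.0)]
print(displacement_from_points(pts))  # 100.0
```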


9.3.2 Settings

3D setting
The 3D measurement setting is the basic setting for obtaining 3D
information by using a laser beam. This setting must not be omitted.

Camera Setup
Select a camera setup option to be used with the vision process.
Click the Select button to display a list of camera setup options.
Select a desired camera setup name then click the OK button.
Function
This setting item is used to select a measurement function to be
used with the vision process. Select Displacement.
Window setting
This setting item is used to collect only a required portion of
laser information. A laser beam cannot always be confined to
the workpiece; part of it may fall outside the workpiece. A
window is used to exclude a laser beam radiated outside the
workpiece from processing.
Clicking the Window setting button displays a dialog box for
teaching a window. Follow the instructions in the dialog box.

Teach an area to be used for 3D measurement as a window.


If the setting is correct, click the OK button to end the setting.
If the setting needs to be corrected, click the Cancel button to
make a setting again.

Use circle window


When this check box is checked for window setting, a circular
window is set.
Clicking the Window setting button displays a dialog box for
teaching a window. Follow the instructions in the dialog box.
At first, set the center of a circle.


Next, teach a point on the periphery of the circular area to be
used for 3D measurement.

If the setting is correct, click the OK button to end the setting.
If the setting needs to be corrected, click the Cancel button to
make a setting again.


Use window mask


When this check box is checked for window setting, a screen as
shown below is displayed at the end of window setting. At this
time, portions to be excluded from 3D measurement can be
painted using a red pen. To paint with the red pen, drag the
mouse while holding down the button. Upon completion of setting, click the
OK button to end the setting. In the screen shown below, a
rectangular window is used. However, this adjustment can be
made with a circular window as well.

Parameter
For 3D measurement, various adjustments may need to be made
for a measurement object. Usually, the parameters need not be
adjusted. However, adjust parameters if necessary. Click the
Parameter button. The form shown below appears.


Items Description
Brightness Threshold Threshold for laser beam extraction. If a laser beam point string is undetectable
because of a bright environment, decrease this value within the range that ensures
correct point string detection.
If a reflected laser beam is too strong and glares, increase this value. (Default:
50)
When making an adjustment, find a threshold that can separate a laser beam from
the background according to Section 9.8, "SHOWING BRIGHTNESS
DISTRIBUTION".
Displacement This parameter is valid when the option switch “Select slit points using dispersion
dispersion(STD) of displacement” in “Others” is checked. Because of noise on the image, some
points may represent an exceptional displacement when compared with others.
This parameter is used to statistically ignore those points. When a smaller value
is set, it is easier to remove noise components. However, necessary data is
discarded if the measurement range varies. (Default: 3 STD. STD means the
standard deviation.)
Displacement range(mm) This parameter is valid when the option switch “Select slit points using dispersion
of displacement” in “Others” is checked. This parameter is used to set a range of
displacement for an actual measurement object. Change the setting of this
parameter to match a measurement object. (Default: 100 mm)
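The two parameters above suggest a two-stage filter: a coarse cut against the expected displacement range, then a standard-deviation cut against the remaining points. The sketch below is one plausible reading, not FANUC's implementation; in particular, the use of the median as the range center is an assumption.

```python
import statistics

def select_slit_points(displacements, std_limit=3.0, range_mm=100.0):
    """Discard slit points whose displacement is statistically
    exceptional, mirroring "Displacement dispersion(STD)" and
    "Displacement range(mm)" (details assumed for illustration).
    """
    # Stage 1: drop points outside the expected displacement range,
    # centered (by assumption) on the median displacement.
    center = statistics.median(displacements)
    kept = [d for d in displacements if abs(d - center) <= range_mm / 2]
    if len(kept) < 2:
        return kept
    # Stage 2: drop points more than std_limit standard deviations
    # from the mean of the remaining points.
    mu = statistics.fmean(kept)
    sd = statistics.pstdev(kept)
    if sd == 0:
        return kept
    return [d for d in kept if abs(d - mu) <= std_limit * sd]

data = [99.0, 100.0, 101.0, 100.5, 99.5, 160.0]   # one noisy point
print(select_slit_points(data))  # [99.0, 100.0, 101.0, 100.5, 99.5]
```

Lowering `std_limit` makes it easier to remove noise, at the cost of discarding valid points when the measured surface genuinely varies, as the parameter table warns.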

Click the OK button when the adjustment is completed and the
new setting is to be applied. To cancel the adjustment, click
the Cancel button to close the form.
Shutter speed mode
The shutter speed mode is a function for changing the shutter
speed of the camera by software. This mode is useful for using
the sensor in various environments. Select one of the three
shutter speed modes indicated below.

Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by about several hundred milliseconds.
Single mode An image is taken according to the settings of “Speed const.” and “Snap times”,
and detection is performed.
Select a value from 0 (bright) to 16 (dark) to set “Speed const.”. When 0 is set,
the same brightness as set by “Not Use” is produced.
“Snap times” represents the number of snapping operations. If the state of laser
beam reception is poor as in the case of viewing a dark workpiece, the
measurement environment can be improved by specifying a number greater than
1. However, as a greater number is specified, a longer detection time is required.

2D setting
When a 3D measurement is made, 2D measurement information can
be used for position identification. A setting for this purpose is made
in 2D measurement setting. When 2D information is not required,
choose “Not Use” in the selection of “Use / Not Use” below.

Use / Not Use


When 2D measurement information is to be used, choose “Use”.
When 2D measurement information is not to be used, choose
“Not Use”.
When 2D measurement information is used, no 3D measurement
is made if detection fails in 2D measurement.

Location Tool
If “Use” is selected for 2D measurement, select a location tool to
be used with the vision process. Clicking the Select button
displays a list of location tools. Select a location tool to be used,
then click the OK button.
Shutter speed mode
The setting of the shutter speed adjustment mode is almost the
same as for “3D Setting”. However, the setting of “Snap times”
is not applicable in single mode.

Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by about several hundred milliseconds.
Single mode An image is taken according to the setting of “Speed const.”, and detection is
performed. Select a value from 0 (bright) to 16 (dark) to set “Speed const.”.
When 0 is set, the same brightness as set by “Not Use” is produced.

Others
Set “Others” according to each application.
However, the setting of “Robot to send to” must not be omitted.

Robot to send to
Select a robot controller to which the results of vision process
execution are to be sent. Clicking the Add button displays a list
of robots. Select the name of a transmission destination robot
then click the OK button. Multiple robots can be selected.

When multiple robots are selected, the results of vision process
execution are sent to all selected robot controllers.
Output frame setting
The results of 3D Laser Vision Sensor measurement need to be
related to the coordinate system of the robot. Here, select one
of the three options indicated below to determine which
information is used to establish this relation.

Option Description
Sensor frame Position and posture information in the sensor coordinate system is output.
This option is used in an application that measures the distance between the 3D
Laser Vision Sensor and a workpiece. Select this option usually.
User frame: By using the current robot position at the time of measurement, this option
Conv with robot CurPos converts output information to position and posture information in the user
coordinate system.
Select a sensor coordinate system as the tool coordinate system and select an
appropriate user coordinate system before starting the vision process for the robot.
User frame: By using the value of the position register of the robot controller set by “PosReg”,
Conv with robot PosReg this option converts output information to position and posture information in the
user coordinate system.
This option may be used in an application that makes a measurement after holding
a workpiece then corrects the position where to place the workpiece.
Before starting the vision process for the robot, store in the position register the
position in the state where an appropriate user coordinate system is selected and a
sensor coordinate system is selected as the tool coordinate system.

CAUTION
The term "robot" used here means a robot controller that
started the 3D Laser Vision Sensor. The 3D Laser
Vision Sensor obtains the current position and position
register information from the robot controller that started
the 3D Laser Vision Sensor. Do not confuse this robot
controller with the "transmission destination robot",
although both represent the same robot controller in many
cases.

Image and 3DV position data saving


Select a condition for saving images and 3D Laser Vision Sensor
position data. Select one of the three options indicated in the
table below.

Option Description
Don't save No image is saved.
Save if failure When a vision process is executed, images for which detection failed are saved in files.
Select this option usually.
By saving a failed image, the cause can be investigated later. For
investigation of the cause, test execution should be performed by using an image
file and adjusting the parameters. For test execution using an image file, see
Subsection 9.6.2, "Use BMP Files".
Save always Image files are saved each time a vision process is executed. This option is
useful in a stage where detection stability is evaluated, for example, immediately
after system startup.
After normal operation starts, switch the setting so that images are saved only
when undetected.
An image is saved in a file named (vision-process-name)12D@(date-time).bmp
(only when 2D measurement is used), (vision-process-name)13DSubt1@(date-time).bmp,
or (vision-process-name)13DSubt2@(date-time).bmp.
3D Laser Vision Sensor position data is saved in a file named
(vision-process-name)@(date-time).psn. All of these files are saved in the
folder specified in Image folder on the option setting screen. (See
Section 11.1, "OPTION SETTING".)

TIP
The hard disk has a limited capacity. If the hard disk
is fully used up, trouble such as a failure in starting up
the personal computer can arise. With the 3D Laser
Vision Sensor, a limitation on the hard disk area that
can be used to save images is set on the option setting
screen so that images are not saved beyond the set
area.

Options
Set a desired option switch. Four option switches are available
as indicated in the table below.

Items Description
Log the resulting data Each time a vision process is executed, the results of execution such as a
processing time and detection position are recorded in a file. A file named
(vision-process-name).csv is created in the folder named “3dvision” under the
folder specified in Data folder on the option setting screen. (See Chapter 11,
"OPTIONS".)
The data in this csv file can be formed, for example, into a graph by spreadsheet
software such as Excel®.
Measure narrow area If a laser beam is directed to a narrow area of a workpiece, detection may tend to
fail. In such a case, this option may be able to stabilize detection. This option
increases the detection time by about several hundred milliseconds.
Select slit points using Some points do not represent an actual measurement object, for example, due to
dispersion of displacement noise on the image. This option deletes those points by statistically processing
the overall displacement of points. (Default: ON)
The related parameters are “Displacement dispersion(STD)” and “Displacement
range(mm)”. (See the description of “Parameter” of “3D setting”.)
Select slit points using A slit usually has a uniform width. A portion placed over a hole edge, for
dispersion of slit width example, can be deformed or narrower than an ordinary slit, degrading the
precision of point string extraction compared with ordinary portions. This
option removes such a portion by performing statistical processing.
(Default: ON)

When you exit from the screen by clicking the OK button, the taught
data is saved.
When you exit from the screen by clicking the Cancel button, the
taught data is cancelled.
Test
Image
Select an image subject to test execution.

Option Description
Snap from a camera In test execution, an image is snapped from the camera.

Load BMP files In test execution, an image file stored on hard disk is used. When 2D
measurement is used, “BMP file(2D)” is to be set for detection. When 2D
measurement is not used, “BMP file(Slit1)” is to be set for detection. For details,
see Subsection 9.6.2, "Use BMP Files".

Continuous Run
This check box is enabled when “Snap from a camera” is
selected. For details, see Subsection 9.6.1, "Continuous Run".
Get base position data
If “Use” is selected for 2D measurement as described earlier, the
window for the laser is dynamically moved for 3D measurement
based on the detection results of 2D measurement. So, the
detected 2D measurement feature needs to be related to the
relative position of the window for the laser. To perform this
operation, check this check box and execute a test. The result
is not saved until you click the OK button or
Apply button. When data is acquired, the date and time of
acquisition is displayed.

CAUTION
Be sure to acquire reference position data in 3D
measurement that uses 2D measurement. Moreover,
perform this operation in the state where the window for
the laser was taught. If the program is executed without
satisfying this condition, the window is not set properly on
the image, resulting in an incorrect detection or a failure in
detection.

After the window for the laser is taught, the following can occur:
- You want to make a modification to the window for the laser.
- You did not acquire base position data.
In these cases, actions described below can be taken.

When you want to make a modification to the window for the laser…
(1) An image file named "(vision-process-name)_WindowTeach.bmp"
is included in the folder named "3dvision" under “Data
folder”. (For information about “Data folder”, see Section 11.1,
"OPTION SETTING".) Display this file by choosing
[File]→[Open BMP file...] from the main menu. This produces
the same state as present when the window for the laser was
taught.
(2) A modification to the window for the laser can be made by
making a setting again in this state. Base position data, if
already acquired correctly, need not be acquired again.


When you did not acquire base position data…


(1) In specification of “Image” in Test, select “Load BMP files”.
(2) Check “Get base position data”. Be sure to check this check
box before setting (3) below. Otherwise, an image file selection
in (3) cannot be made.
(3) In specification of “BMP File(2D)”, select "(vision-process-name)_WindowTeach.bmp"
in the folder named "3dvision"
under “Data folder”.
(4) Click the Run button. The result is satisfactory when a correct
detection is made.

9.3.3 Teaching Procedure


This subsection describes the basic teaching procedure for
displacement measurement. As an example, the plate of the
calibration jig is measured.

(1) Place a workpiece and determine a measurement position.


(2) When 2D measurement is used, teach the location tool.
In this example, a pattern around the center of the plate of the
calibration jig is selected as a model. When the origin of a model
needs to be set precisely, set the model center of the location tool
precisely by using the image enlargement function (see Section 3.8,
"ENLARGING AND REDUCING IMAGES") and the Center
button (see Section 7.2, "MODIFYING LOCATION TOOL
MODEL").

(3) Make settings for 3D measurement. Set camera setup data,
measurement functions, a transmission destination robot, window,
and so forth. For details of setting, see Subsection 9.3.2,
"Settings".
Particularly when 2D measurement is used together, select “Use”
in the setting of 2D measurement, and set the location tool taught
in (2) above as “Location Tool”.


The screen below shows how the window is set. The portion not to be measured is masked by setting the mask after checking “Use window mask”.

(4) When 2D measurement is used together, obtain reference position data while making the settings for 3D measurement. The screens below show an example of the results of obtaining a reference position, and an output message.

(5) For confirmation, perform test execution. The screen below shows an example of the results of detection. Make a parameter adjustment as required while viewing the results of detection. As a result of detection, laser beam points are displayed. For the point string display, two colors are used: red (thick) and light blue (thin).

Color Description
Red (thick) Points extracted as a laser beam point string and used for calculation. If points other than the point string, such as noise, are displayed in this color, the precision may have deteriorated. In such a case, a parameter adjustment is required.
Light blue (thin) Points extracted as a laser beam point string but not used for calculation. In some cases, correctly extracted points may be displayed in this color.

(6) When an adjustment is completed, the teaching ends.


9.4 EDGE MEASUREMENT

When a workpiece to be measured includes a linear step or boundary, this function measures one point on the straight line forming that boundary. This function is useful for an application that needs to find the position of a step or boundary in 3D.

9.4.1 Information Obtained


Edge measurement finds the 3D position of an end point by finding
the intersection on a virtual plane of a laser beam with a straight line
identified using 2D image processing.
[Figure: Edge measurement with the 3D Laser Vision Sensor (CCD camera and laser slit beam projector). Images are captured with the laser ON and with the laser OFF. Laser beam image processing converts the laser beam line to a 3D point string; 2D image processing acquires the straight line, including the end point, that intersects the laser slit beam. The two results are synthesized on a virtual plane to find the end point, yielding the 3D position of the end point and the direction of the straight line including it: (x, y, z, 0, 0, r).]

CAUTION
In edge measurement, the end point onto which the laser beam is projected is measured, regardless of the workpiece position. So, the identical position on the workpiece is not measured each time.
Note that workpiece position compensation using edge measurement requires combining multiple measurements.
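As a rough illustration of the synthesis described above (not the actual sensor software), the sketch below fits a 2D line to the laser point string on the virtual plane and intersects it with the edge line defined by the location tool origin and reference point. All function names and sample coordinates are illustrative assumptions.

```python
def line_through(p, q):
    # Line a*x + b*y = c through points p and q.
    a = q[1] - p[1]
    b = p[0] - q[0]
    return a, b, a * p[0] + b * p[1]

def fit_line(points):
    # Least-squares fit y = m*x + k to the laser point string,
    # returned in the same a*x + b*y = c form.
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return -m, 1.0, (sy - m * sx) / n   # -m*x + y = k

def intersect(l1, l2):
    # Intersection of two lines by Cramer's rule; None if parallel.
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Edge line L from the location tool (model origin and reference point).
edge = line_through((0.0, 0.0), (0.0, 10.0))
# Laser point string projected onto the virtual plane.
laser = fit_line([(-5.0, -5.0), (-2.0, -2.0), (3.0, 3.0)])
print(intersect(edge, laser))   # end point P: (0.0, 0.0)
```

The real software additionally carries the plane geometry needed to lift this 2D intersection back to a 3D position; the sketch shows only the planar step.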


9.4.2 Settings

3D setting
The 3D measurement setting is the basic setting for obtaining 3D
information by using a laser beam. This setting must not be omitted.

Camera Setup
Select a camera setup option to be used with the vision process.
Click the Select button to display a list of camera setup options.
Select a desired camera setup name then click the OK button.
Function
This setting item is used to select a measurement function to be
used with the vision process. Select “Edge”.

Window setting
This setting item is used to collect only the required portion of the laser information. The laser beam does not always fall entirely on the workpiece; part of it may fall outside. A window is used to exclude from processing any laser beam that falls outside the workpiece.
Clicking the Window setting button displays a dialog box for
teaching a window. Follow the instructions in the dialog box.


Teach an area to be used for 3D measurement as a window.

If the setting is correct, click the OK button to end the setting.


If the setting needs to be corrected, click the Cancel button to make
a setting again.

Use circle window


When this check box is checked for window setting, a circular
window is set.
Clicking the Window setting button displays a dialog box for
teaching a window. Follow the instructions in the dialog box.


At first, set the center of a circle.

Next, teach a point on the periphery of the circular area to be used for
3D measurement.

If the setting is correct, click the OK button to end the setting.


If the setting needs to be corrected, click the Cancel button to make a
setting again.


Use window mask


When this check box is checked for window setting, a screen as
shown below is displayed at the end of window setting. At this
time, portions to be excluded from 3D measurement can be
painted using a red pen. To paint with the red pen, drag the mouse while holding down the button. Upon completion of setting, click the
OK button to end the setting. In the screen shown below, a
rectangular window is used. However, this adjustment can be
made with a circular window as well.
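The effect of the mask can be pictured as a filter that drops laser points whose pixel cells were painted. The sketch below is a hypothetical reconstruction; the function name and data layout are assumptions, not the product's API.

```python
def apply_window_mask(points, mask_pixels):
    # Keep only laser points whose pixel cell was not painted red.
    # points: (x, y) image coordinates; mask_pixels: set of painted cells.
    return [(x, y) for x, y in points
            if (int(x), int(y)) not in mask_pixels]

mask = {(5, 5), (5, 6)}                  # painted (excluded) pixel cells
pts = [(5.2, 5.7), (8.0, 3.0), (5.4, 6.1)]
print(apply_window_mask(pts, mask))      # [(8.0, 3.0)]
```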

(Note on window setting)


The setting of a window for edge measurement is subject to a restriction that does not apply to the other functions: the center of the window must always be placed on the measurement side of the straight line including the edge to be found.


[Figure: Windows A and B placed on opposite sides of straight line L, which includes end point P. L is the straight line formed by the location tool origin and the reference point.]
A description is provided using the figure above.
Suppose that a laser beam is radiated to the workpiece and the figure
shown above is obtained. Suppose also that point P is the point to
be obtained finally by edge measurement. In this case, straight line
L is set with the location tool so that L is formed by the model origin and the reference point.
To obtain P in this state, a laser beam on the left side of straight line
L can be used on the image in a window like A. Alternatively, a
laser beam on the right side of straight line L can be used on the
image in a window like B.
From the position of the window center, the software determines
which laser beam to use for calculation.
The center of window A is placed on the left side of straight line L.
So, the software uses a laser beam on the left side of L.
The center of window B is placed on the right side of straight line L.
So, the software uses a laser beam on the right side of L.
Both windows A and B include, on the side opposite the window center relative to straight line L, a laser beam that is not to be used. Because of the behavior described above, however, only the points on the same side as the window center relative to straight line L are used, so the precision is not affected.
Keep this restriction in mind when setting a window.
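The side-of-line selection described above can be sketched with a 2D cross product. This is an illustrative reconstruction of the behavior, not the product's code.

```python
def side(p, q, pt):
    # Sign of the 2D cross product: > 0 if pt lies left of the
    # directed line p -> q, < 0 if right, 0 if on the line.
    return (q[0] - p[0]) * (pt[1] - p[1]) - (q[1] - p[1]) * (pt[0] - p[0])

def select_points(origin, ref, window_center, laser_points):
    # Use only laser points on the same side of line L as the window center.
    s = side(origin, ref, window_center)
    return [pt for pt in laser_points if side(origin, ref, pt) * s > 0]

# Line L runs vertically through the model origin and reference point.
origin, ref = (0.0, 0.0), (0.0, 10.0)
left_center = (-3.0, 5.0)               # window A: center left of L
pts = [(-2.0, 4.0), (2.0, 4.0), (-1.0, 7.0)]
print(select_points(origin, ref, left_center, pts))   # left-side points only
```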

CAUTION
The setting of a window for edge measurement is subject to a
restriction that does not apply to the other functions.
Make the setting correctly after understanding the
description above.

Parameter
For 3D measurement, various adjustments may need to be made
for a measurement object. Usually, the parameters need not be
adjusted. However, adjust parameters if necessary. Click the
Parameter button. The form shown below appears.


Items Description
Brightness Threshold Threshold for laser beam extraction. If a laser beam point string is undetectable
because of a bright environment, decrease this value within the range that ensures
correct point string detection.
If a reflected laser beam is too strong and glares, increase this value. (Default:
50)
When making an adjustment, find a threshold that can separate a laser beam from
the background according to Section 9.8, "SHOWING BRIGHTNESS
DISTRIBUTION".
Slit number Number of the laser beam to be used. With this function, only the one of the two laser beams whose number is specified here is used for measurement. (Default: 1)
1: Displayed from the lower-left corner to the upper-right corner on the screen
2: Displayed from the upper-left corner to the lower-right corner on the screen
Upper limit(mm) Among 3D positions found from individual laser beam points, those positions that
are larger than this value along the Z axis in the sensor coordinate system are not
used for calculation. Make a setting according to the positional relationship
between the 3D Laser Vision Sensor and workpiece. (Default: 50 mm)
Lower limit(mm) Among 3D positions found from individual laser beam points, those positions that
are smaller than this value along the Z axis in the sensor coordinate system are
not used for calculation. Make a setting according to the positional relationship
between the 3D Laser Vision Sensor and workpiece. (Default: -50 mm)
Distance limit(mm) Among 3D positions found from individual laser beam points, those positions that
are separated by a distance greater than this value from the straight line including
an edge to be found are not used for calculation. Make a setting according to the shape of the workpiece. (Default: 50 mm)
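How the three limit parameters could prune laser points can be sketched as follows. The function names and the point representation (x, y on the virtual plane, z along the sensor Z axis) are assumptions for illustration only.

```python
def dist_to_line(p, a, b):
    # Perpendicular distance from 2D point p to the line through a and b.
    ux, uy = b[0] - a[0], b[1] - a[1]
    return abs(ux * (p[1] - a[1]) - uy * (p[0] - a[0])) / (ux**2 + uy**2) ** 0.5

def filter_points(points_3d, line_a, line_b,
                  upper=50.0, lower=-50.0, dist_limit=50.0):
    # Drop points outside the Z window (Upper/Lower limit) or farther
    # from the edge line than Distance limit.
    return [(x, y, z) for x, y, z in points_3d
            if lower <= z <= upper
            and dist_to_line((x, y), line_a, line_b) <= dist_limit]

pts = [(1.0, 2.0, 10.0),    # kept
       (1.0, 2.0, 80.0),    # z above Upper limit
       (100.0, 2.0, 0.0)]   # too far from the edge line
print(filter_points(pts, (0.0, 0.0), (0.0, 10.0)))   # [(1.0, 2.0, 10.0)]
```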

Click the OK button when the adjustment is completed and the new setting is to be applied. To cancel the adjustment, click the Cancel button to close the form.
Shutter speed mode
The shutter speed mode is a function for changing the shutter speed of the camera by software. This mode is useful for using the sensor in various environments. Select one of the three shutter speed modes indicated below.

Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by several hundred milliseconds.
Single mode An image is taken according to the settings of “Speed const.” and “Snap times”,
and detection is performed.
Select a value from 0 (bright) to 16 (dark) to set “Speed const.”. When 0 is set,
the same brightness as set by “Not Use” is produced.
“Snap times” represents the number of snapping operations. If the state of laser
beam reception is poor as in the case of viewing a dark workpiece, the
measurement environment can be improved by specifying a number greater than
1. However, the greater the number specified, the longer the detection time.

2D setting
In edge measurement, a step needs to be detected as a straight line by
using 2D measurement. In the selection of “Use / Not Use” below,
only “Use” can be selected.

Use / Not Use


In edge measurement, only “Use” can be selected.
If no object is detected in 2D measurement, 3D measurement is
not performed.


Location Tool
Select a location tool to be used with the vision process.
Clicking the Select button displays a list of location tools.
Select a location tool to be used, then click the OK button.
Shutter speed mode
The setting of the shutter speed adjustment mode is almost the
same as for “3D setting”. However, the setting of Snap times is
not applicable in single mode.

Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by several hundred milliseconds.
Single mode An image is taken according to the settings of “Speed const.”, and detection is
performed. Select a value from 0 (bright) to 16 (dark) to set “Speed const.”.
When 0 is set, the same brightness as set by “Not Use” is produced.

Others
Set “Others” according to each application.
However, the setting of “Robot to send to” must not be omitted.

Robot to send to
Select a robot controller to which the results of vision process
execution are to be sent. Clicking the Add button displays a list
of robots. Select the name of a transmission destination robot
then click the OK button. Multiple robots can be selected.

When multiple robots are selected, the results of vision process execution are sent to the selected robot controllers in the order of selection.
Output frame setting
The results of 3D Laser Vision Sensor measurement need to be related to the coordinate system of the robot. Here, select one of the three options indicated below to determine which information is used to establish this relationship.

Option Description
Sensor frame Position and posture information in the sensor coordinate system is output.
Combined with KAREL, this option can serve a wide variety of cases, so it is the default.
User frame: By using the current robot position at the time of measurement, this option
Conv with robot CurPos converts output information to position and posture information in the user
coordinate system.
Select a sensor coordinate system as the tool coordinate system and select an
appropriate user coordinate system before starting the vision process for the robot.
User frame: By using the value of the position register of the robot controller set by “PosReg”,
Conv with robot PosReg this option converts output information to position and posture information in the
user coordinate system.
This option may be used in an application that makes a measurement after gripping a workpiece and then corrects the position where the workpiece is to be placed.
Before starting the vision process for the robot, store in the position register the
position in the state where an appropriate user coordinate system is selected and a
sensor coordinate system is selected as the tool coordinate system.
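The “Conv with robot CurPos” conversion amounts to applying the robot's current pose to the sensor-frame result. A minimal sketch, assuming the pose of the sensor (tool) frame in the user frame is available as a 4x4 homogeneous matrix; all values are illustrative.

```python
def matvec(T, p):
    # Apply a 4x4 homogeneous transform T to a 3D point p,
    # returning the transformed (x, y, z).
    v = (p[0], p[1], p[2], 1.0)
    return tuple(sum(T[r][c] * v[c] for c in range(4)) for r in range(3))

# Illustrative pose of the sensor (tool) frame in the user frame:
# rotate 90 degrees about Z, then translate by (100, 0, 50) mm.
T = [[0.0, -1.0, 0.0, 100.0],
     [1.0,  0.0, 0.0,   0.0],
     [0.0,  0.0, 1.0,  50.0],
     [0.0,  0.0, 0.0,   1.0]]

p_sensor = (10.0, 0.0, 5.0)    # measurement in the sensor frame
p_user = matvec(T, p_sensor)   # same point in the user frame
print(p_user)                  # (100.0, 10.0, 55.0)
```

The “Conv with robot PosReg” option works the same way, except that the pose comes from the stored position register instead of the current position.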

CAUTION
The term "robot" used here means the robot controller that started the 3D Laser Vision Sensor. The 3D Laser Vision Sensor obtains the current position and position register information from the robot controller that started it. Do not confuse this robot controller with the "transmission destination robot", although both represent the same robot controller in many cases.

Image and 3DV position data saving


Select a condition for saving images and 3D Laser Vision Sensor
position data. Select one of the three options indicated in the
table below.
Option Description
Don't save No image is saved.
Save if failure When a vision process is executed, images for which detection failed are saved to files. Usually, select this option.
By saving an image for which detection failed, the cause can be investigated later. To investigate the cause, perform test execution by using the image file and adjusting the parameters. For test execution using an image file, see Subsection 9.6.2, "Use BMP Files".
Save always Images are saved to files each time a vision process is executed. This option is useful in a stage where detection stability is evaluated, for example, immediately after system startup.
After normal operation starts, switch the setting so that images are saved only when detection fails.
Images are saved in two files. One file is (vision-process-name)12D@(date-time).bmp, and the other is (vision-process-name)13DSubt1@(date-time).bmp or (vision-process-name)13DSubt2@(date-time).bmp, corresponding to the slit number being used.
3D Laser Vision Sensor position data is saved in a file named (vision-process-name)@(date-time).psn.
All of these files are saved in the folder specified in Image folder on the option setting screen. (See Section 11.1, "OPTION SETTING".)

TIP
The hard disk has a limited capacity. If the hard disk fills
up completely, trouble such as a failure to start up the
personal computer can arise. With the 3D Laser Vision
Sensor, a limit on the hard disk area that can be used to
save images is set on the option setting screen so that
images are not saved beyond that area.

Options
Set a desired option switch. Two option switches are available
as indicated in the table below.
Items Description
Log the resulting data Each time a vision process is executed, the results of execution such as a
processing time and detection position are recorded in a file. A file named
(vision-process-name).csv is created in the folder named “3dvision” under the
folder specified in Data folder on the option setting screen. (See Chapter 11,
"OPTIONS".)
The data in this csv file can be turned into a graph, for example, by spreadsheet software such as Excel®.
Measure narrow area If a laser beam is directed to a narrow area of a workpiece, detection may tend to
fail. In such a case, this option may be able to stabilize detection. This option
increases the detection time by several hundred milliseconds.
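A log produced by “Log the resulting data” can be post-processed with any csv reader. A minimal sketch in Python; the column name "time_ms" is an assumption, since the manual does not list the csv layout — check the actual header row of your (vision-process-name).csv.

```python
import csv

def mean_processing_time(path):
    # Average a processing-time column of the result log.
    # "time_ms" is an assumed column name; adjust to the real header.
    with open(path, newline="") as f:
        times = [float(row["time_ms"]) for row in csv.DictReader(f)]
    return sum(times) / len(times)
```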

When you exit from the screen by clicking the OK button, the taught
data is saved.
When you exit from the screen by clicking the Cancel button, the
taught data is cancelled.
Test
Image
Select an image subject to test execution.

Option Description
Snap from a camera In test execution, an image is snapped from the camera.

Load BMP files In test execution, an image file stored on hard disk is used. For details, see
Subsection 9.6.2, "Use BMP Files".

Continuous Run
This check box is enabled when “Snap from a camera” is
selected.
For details, see Subsection 9.6.1, "Continuous Run".
Get base position data

In edge measurement, which always uses 2D measurement, the window for the laser is dynamically moved for 3D measurement based on the detection results of 2D measurement. So, the detected 2D measurement feature needs to be related to the relative position of the window for the laser. This relationship is established by performing test execution with this check box checked. The result is not saved until you click the OK button or Apply button. When data is acquired, the date and time of acquisition are displayed.
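The dynamic window movement can be pictured as shifting the taught window by the offset between the feature position recorded with the base data and the position just detected. A minimal sketch, with hypothetical names and values:

```python
def reposition_window(window, base_feature, found_feature):
    # Shift the laser window (x, y, width, height) by the offset between
    # the 2D feature at base-data acquisition and the current detection.
    x, y, w, h = window
    dx = found_feature[0] - base_feature[0]
    dy = found_feature[1] - base_feature[1]
    return (x + dx, y + dy, w, h)

base = (320.0, 240.0)      # feature position when base data was taken
found = (332.0, 236.0)     # feature position in the current image
print(reposition_window((100.0, 80.0, 60.0, 40.0), base, found))
# (112.0, 76.0, 60.0, 40.0)
```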

CAUTION
Be sure to acquire reference position data in 3D
measurement that uses 2D measurement. Moreover,
perform this operation in the state where the window for
the laser was taught. If the program is executed without
satisfying this condition, the window is not set properly on
the image, resulting in an incorrect detection or a failure in
detection.

After the window for the laser is taught, the following can occur:
- You want to make a modification to the window for the laser.
- You did not acquire base position data.
In these cases, actions described below can be taken.

When you want to make a modification to the window for the laser…
(1) An image file named "(vision-process-name)_WindowTeach.bmp" is included in the folder named "3dvision" under “Data folder”. (For information about “Data folder”, see Section 11.1, "OPTION SETTING".) Display this file by choosing [File]→[Open BMP file...] from the main menu. This reproduces the same state as when the window for the laser was taught.
(2) A modification to the window for the laser can then be made by making the setting again in this state. Base position data, if already acquired correctly, need not be acquired again.

When you did not acquire base position data…


(1) In specification of “Image” in Test, select “Load BMP files”.
(2) Check “Get base position data”. Be sure to check this check
box before setting (3) below. Otherwise, an image file selection
in (3) cannot be made.
(3) In specification of “BMP File(2D)”, select "(vision-process
-name)_WindowTeach.bmp" in the folder named "3dvision"
under “Data folder”.
(4) Click the Run button. The result is satisfactory when a correct
detection is made.


9.4.3 Teaching Procedure


Compared with disk measurement and displacement measurement, edge measurement requires a more complicated teaching procedure because a straight line passing through the edge to be found is set using a reference point. This subsection describes the procedure.

(1) Place a workpiece and determine a measurement position.


(2) Teach the model of the location tool. The model may include
features other than a straight line passing through an edge to be
found. The screen below shows an example of the model.

(3) Set a model origin at a point on the straight line passing through an
edge to be found. Set a model origin precisely by using the
image enlargement function.


(4) Set a reference point at a point on the line passing through an edge
to be found. At this time, separate the model origin from the
reference point as far as possible. Set a reference point precisely
by using the image enlargement function.

(5) Adjust the other parameters of the location tool. The screen
below shows the detection result after adjustment.

(6) Next, make settings for 3D measurement. Make the settings that need no adjustment, such as camera setup data, the measurement function, and the transmission destination robot. For details of setting, see Subsection 9.4.2, "Settings".
(7) Set a 3D measurement window. On this window setting, a
restriction applicable only to edge measurement is imposed. See
(Note on window setting) in Subsection 9.4.2, "Settings". Here, a
setting is made to use a laser beam on the left side of the straight
line including an edge to be found. The screen below shows an
example of teaching a window.

(8) Obtain reference position data. The screens below show an example of the results of obtaining a reference position, and an output message.

(9) For confirmation, perform test execution. The screen below shows the results of detection. Make a parameter adjustment as required while viewing the results of detection. As a result of detection, laser beam points are displayed. For the point string display, two colors are used: red (thick) and light blue (thin).


Color Description
Red (thick) Points extracted as a laser beam point string and used for calculation. If points other than the point string, such as noise, are displayed in this color, the precision may have deteriorated. In such a case, a parameter adjustment is required.
Light blue (thin) Points extracted as a laser beam point string but not used for calculation. In some cases, correctly extracted points may be displayed in this color.

(10) When an adjustment is completed, the teaching ends.


9.5 CROSS SECTION MEASUREMENT

By using the cross-sectional shape obtained with a laser beam, the cross section measurement function finds a position, such as the summit of a semicylindrical workpiece, that is difficult to measure with the conventional 2D measurement function.

9.5.1 Information Obtained


In cross section measurement, the cross section of a workpiece is
internally formed as an image by using a laser beam, and the 3D
position of a particular point on the cross section is found by 2D
image processing.
[Figure: Cross section measurement with the 3D Laser Vision Sensor (CCD camera and laser slit beam projector). With the laser ON, laser beam image processing converts the laser beam line to a cross section image; 2D image processing then detects the position of the particular point to be found from the cross section image, yielding the 3D position of the particular point on the cross section: (x, y, z, 0, 0, 0).]

CAUTION
In cross section measurement, the cross section of the location onto which the laser beam is projected is measured, regardless of the workpiece position. So, the identical location on the workpiece is not measured each time.
Note that workpiece position compensation using cross section measurement requires combining multiple measurements.
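The conversion from a laser point string to a cross section image can be pictured as quantizing the points on the cross-section plane into grid cells whose size is the Grid pitch parameter (Subsection 9.5.2). A minimal sketch with illustrative names; the real software produces a full 2D image rather than a set of cells.

```python
def rasterize(points, pitch=0.25):
    # Quantize cross-section points (u, v) in mm into pixel cells of
    # size `pitch` mm; the cell set stands in for the 2D cross section
    # image on which 2D image processing then runs.
    return {(int(round(u / pitch)), int(round(v / pitch)))
            for u, v in points}

section = [(0.0, 0.0), (0.10, 0.02), (0.30, 0.24)]
print(sorted(rasterize(section, pitch=0.25)))   # [(0, 0), (1, 1)]
```

A smaller pitch gives more cells per millimeter (higher resolution) but spreads the same points over more cells, which is why the manual notes that noise then has a greater influence.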

9.5.2 Settings

3D setting
The 3D measurement setting is the basic setting for obtaining 3D
information by using a laser beam. This setting must not be omitted.

Camera Setup
Select a camera setup option to be used with the vision process.
Click the Select button to display a list of camera setup options.
Select a desired camera setup name then click the OK button.
Function
This setting item is used to select a measurement function to be
used with the vision process. Select “Cross section”.
Window setting
This setting item is used to collect only the required portion of the laser information. The laser beam does not always fall entirely on the workpiece; part of it may fall outside. A window is used to exclude from processing any laser beam that falls outside the workpiece.

Clicking the Window setting button displays a dialog box for teaching a window. Follow the instructions in the dialog box.


Teach an area to be used for 3D measurement as a window.

If the setting is correct, click the OK button to end the setting.


If the setting needs to be corrected, click the Cancel button to make a
setting again.

Use circle window


When this check box is checked for window setting, a circular
window is set.

Clicking the Window setting button displays a dialog box for teaching a window. Follow the instructions in the dialog box.

At first, set the center of a circle.


Next, teach a point on the periphery of the circular area to be used for 3D measurement.

If the setting is correct, click the OK button to end the setting.


If the setting needs to be corrected, click the Cancel button to make
a setting again.


Use window mask


When this check box is checked for window setting, a screen as
shown below is displayed at the end of window setting. At this
time, portions to be excluded from 3D measurement can be
painted using a red pen. To paint with the red pen, drag the mouse while holding down the button. Upon completion of setting, click the
OK button to end the setting. In the screen shown below, a
rectangular window is used. However, this adjustment can be
made with a circular window as well.

Parameter
For 3D measurement, various adjustments may need to be made
for a measurement object. Usually, the parameters need not be
adjusted. However, adjust parameters if necessary. Click the
Parameter button. The form shown below appears.


Items Description
Brightness Threshold Threshold for laser beam extraction. If a laser beam point string is undetectable
because of a bright environment, decrease this value within the range that ensures
correct point string detection.
If a reflected laser beam is too strong and glares, increase this value. (Default:
50)
When making an adjustment, find a threshold that can separate a laser beam from
the background according to Section 9.8, "SHOWING BRIGHTNESS
DISTRIBUTION".
Slit number Number of the laser beam to be used. With this function, only the one of the two laser beams whose number is specified here is used for measurement. (Default: 1)
1: Displayed from the lower-left corner to the upper-right corner on the screen
2: Displayed from the upper-left corner to the lower-right corner on the screen
Upper limit(mm) Among 3D positions found from individual laser beam points, those positions that
are larger than this value along the Z axis in the sensor coordinate system are not
used for calculation. Make a setting according to the positional relationship
between the 3D Laser Vision Sensor and workpiece. (Default: 50 mm)
Lower limit(mm) Among 3D positions found from individual laser beam points, those positions that
are smaller than this value along the Z axis in the sensor coordinate system are
not used for calculation. Make a setting according to the positional relationship
between the 3D Laser Vision Sensor and workpiece. (Default: -50 mm)
Grid pitch(mm) This function performs detection by converting 3D cross section data based on a
laser beam to a 2D image. This value represents the scale of one pixel when 3D
cross section data is converted to a 2D image. As a smaller value is specified,
the resolution improves, but the projection area becomes smaller and noise has a greater influence. (Default: 0.25 mm)

Click the OK button when the adjustment is completed and the new setting is to be applied. To cancel the adjustment, click the Cancel button to close the form.


Shutter speed mode
The shutter speed mode is a function for changing the shutter speed of the camera by software. This mode is useful for using the sensor in various environments. Select one of the three shutter speed modes indicated below.

Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by several hundred milliseconds.
Single mode An image is taken according to the settings of “Speed const.” and “Snap times”,
and detection is performed.
Select a value from 0 (bright) to 16 (dark) to set “Speed const.”. When 0 is set,
the same brightness as set by “Not Use” is produced.
“Snap times” represents the number of snapping operations. If the state of laser
beam reception is poor as in the case of viewing a dark workpiece, the
measurement environment can be improved by specifying a number greater than
1. However, the greater the number specified, the longer the detection time.

2D setting
The cross section detection function converts cross section data
obtained using a laser beam to a 2D image and then identifies a position
by 2D measurement. So, the setting of 2D measurement must not be
omitted. In the selection of “Use / Not Use” below, only “Use” can
be selected.


Use / Not Use


In cross section detection, only “Use” can be selected.
Location Tool
Select a location tool to be used with the vision process.
Clicking the Select button displays a list of location tools.
Select a location tool to be used, then click the OK button.
Others
Set “Others” according to each application.
However, the setting of “Robot to send to” must not be omitted.

Robot to send to
Select a robot controller to which the results of vision process
execution are to be sent. Clicking the Add button displays a list
of robots. Select the name of a transmission destination robot
then click the OK button. Multiple robots can be selected.
When multiple robots are selected, the results of vision process
execution are sent to the selected robot controllers in the order of
selection.
Output frame setting
The results of 3D Laser Vision Sensor measurement need to be related to the coordinate system of the robot. Here, select one of the three options indicated below to determine which information is used to establish this relationship.

Option Description
Sensor frame Position and posture information in the sensor coordinate system is output.
Combined with KAREL, this option can serve a wide variety of cases, so it is the default.
User frame: By using the current robot position at the time of measurement, this option

Conv with robot CurPos converts output information to position and posture information in the user
coordinate system.
Select a sensor coordinate system as the tool coordinate system and select an
appropriate user coordinate system before starting the vision process for the robot.
User frame: By using the value of the position register of the robot controller set by “PosReg”,
Conv with robot PosReg this option converts output information to position and posture information in the
user coordinate system.
This option may be used in an application that makes a measurement after holding
a workpiece then corrects the position where to place the workpiece.
Before starting the vision process for the robot, store in the position register the
position in the state where an appropriate user coordinate system is selected and a
sensor coordinate system is selected as the tool coordinate system.

CAUTION
The term "robot" used here means the robot controller that started the 3D Laser
Vision Sensor. The 3D Laser Vision Sensor obtains the current position and
position register information from the robot controller that started it. Do not
confuse this robot controller with the "transmission destination robot", although
both represent the same robot controller in many cases.

Image and 3DV position data saving


Select a condition for saving images and 3D Laser Vision Sensor
position data. Select one of the three options indicated in the
table below.
Option Description
Don't save No image is saved.
Save if failure When a vision process is executed, images for which detection failed are saved to
files. This option is usually selected.
By saving an image for which detection failed, the cause can be investigated later.
For investigation of the cause, perform test execution using the image file while
adjusting the parameters. For test execution using an image file, see
Subsection 9.6.2, "Use BMP Files".
Save always Images are saved to files each time a vision process is executed. This option is
useful at a stage where detection stability is evaluated, for example, immediately
after system startup. For evaluation of detection stability, see Subsection 9.6.2,
"Use BMP Files".
After normal operation starts, switch the setting so that images are saved only
when detection fails.

Images are saved in two files. One file is
(vision-process-name)12D@(date-time).bmp, and the other is
(vision-process-name)13DSubt1@(date-time).bmp or
(vision-process-name)13DSubt2@(date-time).bmp, corresponding to the
slit number being used.
3D Laser Vision Sensor position data is saved in a file named
(vision-process-name)@(date-time).psn.
All of these files are saved in the folder specified in Image folder on the
option setting screen. (See Section 11.1, "OPTION SETTING".)

TIP
The hard disk has a limited capacity. If the hard disk is
fully used up, trouble such as a failure in starting up the
personal computer can arise. With the 3D Laser
Vision Sensor, a limit on the hard disk area that can be
used to save images is set on the option setting screen
so that images are not saved beyond that limit.

Options
Set a desired option switch. Three option switches are available
as indicated in the table below.

Items Description
Log the resulting data Each time a vision process is executed, the results of execution, such as the
processing time and detection position, are recorded in a file. A file named
(vision-process-name).csv is created in the folder named “3dvision” under the
folder specified in Data folder on the option setting screen. (See Chapter 11,
"OPTIONS".)
The data in this csv file can be formed into a graph, for example, by spreadsheet
software such as Excel®.
Measure narrow area If a laser beam is directed at a narrow area of a workpiece, detection may tend to
fail. In such a case, this option may be able to stabilize detection. This option
increases the detection time by several hundred milliseconds.
Use slit points which are Usually, a laser beam produces only one image on a workpiece. Depending on
not peak the shape and surface state of a workpiece, however, a pseudo image may
additionally appear on the workpiece (referred to as "secondary reflection").
If this phenomenon occurs on a workpiece, it cannot be determined which image
is the real one, so all possible points need to be considered. When this check
box is checked, all possible points are considered, but noise has a greater
influence. Check this check box only when secondary reflection is observed.
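The csv log described in the table above can also be summarized with a short script instead of spreadsheet software. The Python sketch below reads such a log and reports basic statistics for one numeric column. The column name "ProcTime(ms)" is an assumption for illustration only, not the actual header written by the software; check the header row of your own log file.

```python
import csv
import statistics

def summarize_log(csv_path, time_column="ProcTime(ms)"):
    """Read a (vision-process-name).csv results log and return basic
    statistics of one numeric column.  The column name used here is
    an assumed placeholder, not the documented FANUC header."""
    times = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                times.append(float(row[time_column]))
            except (KeyError, ValueError):
                continue  # skip rows without a numeric value in that column
    if not times:
        return None
    return {"count": len(times),
            "mean": statistics.mean(times),
            "max": max(times)}
```

A summary like this can reveal, for example, whether the detection time grows over a shift, before the data is ever graphed.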

When you exit from the screen by clicking the OK button, the taught
data is saved.
When you exit from the screen by clicking the Cancel button, the
taught data is cancelled.
Test
Image
Select an image subject to test execution.

Options Description
Snap from a camera In test execution, an image is snapped from the camera.
Load BMP files In test execution, an image file stored on hard disk is used. For details, see
Subsection 9.6.2, "Use BMP Files".

Continuous Run
This check box is enabled when “Snap from a camera” is
selected. For details, see Subsection 9.6.1, "Continuous Run".

Get base position data


This check box is not required for this function.
Get cross section image for teach


This check box is used to obtain the taught image of the location
tool required for cross section measurement. If this check box
is checked upon completion of teaching related to 3D
measurement, the Run button of Test is enabled even when no
data is set as a location tool on the 2D measurement side.
Clicking the Run button in this state acquires the taught image of
a cross section model. Teach the location tool by using the
image.

9.5.3 Teaching Procedure


The teaching procedure for cross section detection differs from the
procedures for other measurement functions. That is, an image is
created using a laser beam, then a location tool for 2D measurement is
taught using the image. This Subsection describes the procedure.

(1) Place a workpiece and determine a measurement position.


(2) Make settings for 3D measurement. Make settings that need not
be adjusted, such as camera setup data, the measurement function,
and transmission destination robot. For details of setting, see
Subsection 9.5.2, "Settings".
(3) Set a window for 3D measurement. The screen below shows an
example of teaching a window.

(4) Check the check box “Get cross section image for teach” then
click the Run button to obtain a cross section image. The screens
below show an example of execution and an output message.
This cross section image is saved as an image file under a file
name as indicated in the message. Teach the location tool by
using this image.


(5) At the end of cross section image acquisition, the relationship


between a cross section image and laser beam is indicated as
shown in the screen below. By viewing this screen, check the
relationship between a cross section image and laser beam.


The correspondence between real images and cross section images is


as follows:
Real image Cross section image

(6) By choosing [File] → [Open BMP file...] from the main menu,
display the cross section image obtained in (4). Then, teach the
location tool. Set a model origin precisely by enlarging the
image as shown in the screen below.

(7) Check the detection state by test execution of the location tool.
The screen below shows an example of the results of detection.

(8) Return to the 3D vision process then set the location tool.

(9) By test execution, check the results of detection. The screen


below shows an example of the results of detection. While
viewing the results of detection, make a parameter adjustment as
required. When an adjustment is completed, the teaching ends.


9.6 VISION PROCESS TEST EXECUTION

Place the target object within the view field of the camera, and click
the Run button.
The measurement results are displayed on the run-time monitor of the
vision process.
The run-time monitor appears when [Monitor] is selected from the
[View] menu of the 3D Laser Vision Sensor.

The time displayed on the run-time monitor is the total processing time
of the vision process. Vision process test execution does not involve
communication, so the displayed time includes no communication
time.

The output results of test execution are based on the coordinate system
indicated below. When an image is loaded from a file, the output
results differ depending on whether a position data file is present or
not.

Position data file Present Not present


Snap from a camera Sensor coordinate system Sensor coordinate system

Load BMP files Coordinate system used when the Sensor coordinate system
position data file was recorded

CAUTION
In test execution, the robot controller does not start the
3D Laser Vision Sensor, so the setting of “Output frame
setting” does not apply.
This is because the current position and position
register information, which is required for calculating
the output based on the user coordinate system,
cannot be acquired.
Therefore, note that the output results obtained by test
execution differ from the output results obtained when
the 3D Laser Vision Sensor is started by the robot
controller.

Information about 3D measurement parameters is displayed on the


status monitor.
To display the status monitor, choose [3DV Status Monitor] from
[View] on the menu bar of the 3D Laser Vision Sensor. There are
two types of screens: “Run-time Monitor” and “Latest Images”.


Run-time Monitor

Run-time Monitor Description


Slit1(pts) The number of points in the point string used for measurement processing for slit 1
(extending from the lower left part to upper right part on the screen) is indicated.
Slit2(pts) The number of points in the point string used for measurement processing for slit 2
(extending from the upper left part to lower right part on the screen) is indicated.
LLDist(mm) A value is indicated only for disk measurement.
This value represents the distance between the two straight lines obtained from the
two laser beams. A long distance indicates a failure in correct plane detection.
Lean(deg) A value is indicated only for disk measurement.
In test execution, this value indicates the inclination angle of the workpiece with
respect to the Z-axis of the coordinate system determined according to the table
presented above.
(Note: When the 3D Laser Vision Sensor is started by the robot controller, this
value indicates the inclination angle of the workpiece with respect to the Z-axis of
the coordinate system set by “Output frame setting”.)
Status Status of the detection. OK indicates that the detection is successful, and NG
indicates that the detection failed.

Latest Image

Last Image Description


Slit 1 The image of slit 1 (extending from the lower left part to upper right part on the
screen) in the latest measurement is displayed.
Slit 2 The image of slit 2 (extending from the upper left part to lower right part on the
screen) in the latest measurement is displayed.


9.6.1 Continuous Run


You can fetch and detect images repeatedly. Check the
“Continuous Run” option and click the Run button.
Continuous run continues until the Cancel button in the dialog box is
clicked.

9.6.2 Use BMP Files


When “Load BMP files” is set, a wildcard can be used for image file
name specification. If multiple image files are subject to detection as
the result of using a wildcard, a screen as shown below appears.

Clicking an icon starts execution.


The meaning of each icon is indicated below.

Icon Description
Executes in succession in reverse order.
Executes using the previous image and stops execution temporarily.
Stops execution temporarily.
Executes using the next image and stops execution temporarily.
Executes in succession.

If an intermittent stop occurs due to a detection error, for example, this
function enables the state of detection before and after the
intermittent stop to be checked, provided images were stored with
“Save always” selected in “Image and 3DV position data saving”.
Moreover, when the location tool or vision process is modified, the
stored images can be used to check whether the change has any side
effect.
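When experimenting with stored images outside the sensor software, the same wildcard expansion can be reproduced with a short script. This Python sketch is an illustration of the wildcard behaviour only, not part of the product; the file-name pattern shown is an assumed example.

```python
import glob
import os

def matching_images(pattern):
    """Expand a wildcard such as 'PROGRAM1*.bmp' into the sorted
    list of image files that a test run would step through."""
    return sorted(glob.glob(pattern))
```

Sorting the matches gives a stable, repeatable order when stepping forward and backward through the images.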


9.7 EXECUTION BY ROBOT CONTROLLER

A robot controller starts a vision process for the 3D Laser Vision


Sensor through Ethernet communication and receives results. The
execution of a vision process by a robot controller is summarized as
follows:
- The name of a vision process for the 3D Laser Vision Sensor is
specified for execution.
- If the "robot" data corresponding to the robot controller has
already been set in the 3D Laser Vision Sensor, and a connection
has been established, the robot controller can start the vision process
for the 3D Laser Vision Sensor.
- Results are sent to all robot controllers specified in the started
vision process for the 3D Laser Vision Sensor.
It is assumed that the robot controller is connected with the 3D Laser
Vision Sensor via an Ethernet cable and communication is possible.
(On the "robot" data teach screen of the 3D Laser Vision Sensor to be
started, check that the state of connection with the robot controller is
“Connected”.)
To start the vision process for the 3D Laser Vision Sensor from the
robot controller, execute the following instruction with a TP program:

PCVIS RUN (vision-process-name)_SLT [ST=R[n], OF=PR[m]]

In (vision-process-name), enter the name of a 3D vision process to be


executed for the 3D Laser Vision Sensor. Up to 16 characters
including "_SLT" can be entered.
In n and m, enter the numbers of a register and position register for
storing measurement results. (A number from 1 to 255 may be
entered. If the number of a nonexistent register or position register is
specified, an error occurs at execution time.)
Measurement results are stored in a register and position register as
indicated below.

Location Description
R[n] A number indicating whether a measurement is successful is stored. When a
measurement is successful, 1 is stored. When a measurement is unsuccessful, 0
is stored.
PR[m] Measurement results are stored in the XYZWPR format. When a measurement is
unsuccessful, no data is stored.

A sample robot program for executing a 3D vision process is indicated


below. (The vision program name is PROGRAM1.)


1: UTOOL_NUM = 1
2: R[1]=(-1)
3: PCVIS RUN PROGRAM1_SLT [ST=R[1], OF=PR[2]]
4: WAIT R[1]<>(-1)
5: IF R[1] = 0, JMP LBL[99]
:
10: ! ERROR ABORT
11: LBL[99]
[End]

CAUTION
If “User frame:Conv with robot CurPos” is set in
“Output frame setting” for the 3D vision process, be
sure to set the sensor coordinate system as the tool
coordinate system and set an appropriate user
coordinate system before starting the 3D Laser Vision
Sensor to make measurements. If a different
coordinate system is set as the tool coordinate
system, correct results cannot be output.

CAUTION
Results sent from the 3D Laser Vision Sensor to the
robot controller are not compensation data but
measured position data. For the method of
calculating compensation data, see Appendix C,
"ROBOT POSITION COMPENSATION".

CAUTION
Ensure that the robot controller does not start another
vision process for the 3D Laser Vision Sensor before
execution of the current vision process is completed
(duplicate execution is prohibited). In other words,
after starting a vision process for the 3D Laser Vision
Sensor from the robot controller, first check that
information on whether the measurement made by the
vision process was successful has been sent to the
register, then start the next vision process for the 3D
Laser Vision Sensor.


9.8 SHOWING BRIGHTNESS DISTRIBUTION

Of the 3D measurement parameters, those other than Brightness
Threshold can be checked on the status monitor (see Section 9.6,
"VISION PROCESS TEST EXECUTION"). This section describes
how to check information about Brightness Threshold.

First perform 3D measurement once. Then, select [3DV Status


Monitor] from the [View] menu of the 3D Laser Vision Sensor to
open the "3DV Status Monitor" screen. On the [Latest Images] tab
of the Status Monitor screen, display the image of slit 1 or slit 2.

Displaying slit images

Select [Brightness Distribution (D)] from the [Tool] menu of the 3D
Laser Vision Sensor. Then, on the slit image, use the mouse to
specify a point on the line for which you want to display the
brightness distribution.


The brightness distribution of the line including the specified point is


indicated.

Showing the distribution of this line.

When Brightness Threshold needs to be adjusted, use the brightness
distribution result to determine a threshold value that separates the
slit beam from the background.
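The idea of picking a threshold between the background level and the slit-beam peak can be sketched numerically. The Python heuristic below is an illustration only, not the sensor's actual algorithm: it takes one image row as a list of 0-255 intensities, estimates the background as the median, and suggests a threshold halfway to the peak.

```python
def suggest_threshold(row):
    """Suggest a Brightness Threshold for one image row.
    Heuristic sketch: background is estimated as the median intensity
    (the slit occupies only a few pixels), and the threshold is placed
    halfway between the background and the brightest pixel."""
    background = sorted(row)[len(row) // 2]  # median ~ background level
    peak = max(row)
    return (background + peak) // 2
```

In practice you would still inspect the displayed brightness distribution; a heuristic like this only gives a starting value to refine.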


10 CREATION AND EXECUTION OF A


ROBOT PROGRAM
This chapter presents a sample robot program, which executes the
following most frequently used application:
- Correct the position at which the robot holds a workpiece


10.1 CORRECTING THE POSITION AND POSTURE FOR


HOLDING A WORKPIECE

This section shows how to create and execute a program that instructs
the robot to operate in such a way that the robot changes its position
and posture for holding a workpiece as the position and orientation of
the workpiece change.
The following illustrates the basic principle:
Teaching
Measurement

When a workpiece is at a certain position and in a certain orientation, the robot
holds the workpiece in a certain way.
1. Measure the "certain position and certain orientation" in advance.
2. Teach "holding the workpiece in such a way".

Measurement
Automatic calculation

As the position and orientation of a workpiece change, the way of holding the
workpiece is changed accordingly.
1. Measure the current "position and orientation of the workpiece".
2. The way to hold the workpiece is calculated automatically.

The upper figure shows that a workpiece placed at a certain position is


measured then an operation for holding the workpiece is taught. This
is called "nominal execution" and the measurement results obtained
here are called "nominal data". In the lower figure, the position and
posture of the workpiece change from those in the upper figure. In
the state shown in the lower figure, a measurement is performed, and
from the results of the measurement and nominal data, the operation
for holding the workpiece is calculated automatically. This is called
"actual execution". Subsection 10.1.1, "Creating a Robot Program",
explains how to create a program that executes these operations, and
Subsection 10.1.2, "Executing a Robot Program", explains how to
execute the program.
For the overall teaching flow, see Section 1.3, "3D LASER VISION
SENSOR SETUP PROCEDURE".

10.1.1 Creating a Robot Program


First, create a robot program as shown below.
It is assumed that tool coordinate system 1 is the sensor coordinate
system of the 3D Laser Vision Sensor, that TF (tool frame) 2 is the
coordinate system of the hand, and that vision process PROGRAM1
is used for 3D measurement. For the cautions related to
teaching with a vision process, see Section 9.7, "EXECUTION BY
ROBOT CONTROLLER".

A 1: UTOOL_NUM = 1
2: J P[1] 50% FINE
3: R[1]=(-1)
B 4: PCVIS RUN PROGRAM1_SLT [ST=R[1], OF=PR[2]]
5: WAIT R[1]<>(-1)
6: IF R[1] = 0, JMP LBL[999]
7:
C 8: OFS_RJ3(10,2,0,10,0,11,0)
9: UFRAME[9] = PR[11]
10: PR[11] = UFRAME[9]
11: IF R[10] <> 1, JMP LBL[1]
12: R[10] = 0
13: PAUSE
14: LBL[1]
15: UTOOL_NUM = 2
D 16: J P[2] 50% CNT100
17: L P[3] 500 mm/sec CNT100
18: L P[4:GRASP POINT] 100 mm/sec FINE
19: L P[5] 300 mm/sec CNT100
:
30: ! ERROR ABORT
31: LBL[999]
32: UALM[1]
[End]

The meanings of the registers and position registers used by this robot
program are as follows:
R[1] : Execution status of vision process "PROGRAM1"
R[10]: Nominal flag (1: nominal execution, other than 1:

actual execution)
PR[2]: Execution result of vision process "PROGRAM1"
PR[10] : Nominal data
PR[11] : Compensation data
In addition, the robot program uses UF(user frame) 9.

This program consists of the following four major parts:


A : Moving the robot to a measurement position
B : Making a measurement (If measurement fails, a jump to label
[999] is made to perform error handling.)
C : Saving nominal data or calculating compensation data
D : Holding a workpiece
P[2] is an intermediate point of change from the posture for measuring
the workpiece to the posture for holding the workpiece.
P[3] is an approach point.
P[4] is a holding point.
P[5] is a retraction point.
Points are added or deleted as required.
"OFS_RJ3" in C is a KAREL program provided by FANUC. Under
the conditions of this program, the following processing is performed
internally:

Start
  Is R[10] = 1?
    Yes: Copy PR[2] to PR[10].
    No:  From PR[10] and PR[2], calculate compensation data,
         then write the compensation data to PR[11].
End

Because the calculated compensation data (PR[11]) uses sensor
interface method B (see Appendix C, "ROBOT POSITION
COMPENSATION"), the format of PR[11] needs to be converted to
matrix format. This is why the program places the content of PR[11]
in an empty user coordinate system and then restores it to the original
place.
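As an aside, one common way to formulate compensation of this kind with homogeneous transforms is sketched below in Python: the compensation is the transform that maps the nominal workpiece pose to the measured pose. This is an illustration of the general idea only; the actual data format of sensor interface method B is defined in Appendix C and may differ.

```python
def mat_mul(a, b):
    """Multiply two 4x4 homogeneous transforms given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(t):
    """Invert a rigid 4x4 transform: transpose the rotation block and
    rotate the negated translation."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [-sum(r[i][j] * t[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0, 0, 0, 1]]

def compensation(nominal, measured):
    """One common formulation of compensation data: the transform that
    carries the nominal pose onto the measured pose."""
    return mat_mul(measured, invert_rigid(nominal))
```

With identical rotations, this reduces to the plain translation difference between the measured and nominal positions.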


NOTE
For more details on the OFS_RJ3 specifications,
contact FANUC.

10.1.2 Executing a Robot Program

Nominal execution
(1) Place a workpiece at a certain position within the range that
ensures robot operation, and teach P[1], which is the
measurement position, in the robot program created as described
in Subsection 10.1.1, "Creating a Robot Program". Usually,
when vision process PROGRAM1 is taught, the position where
the workpiece is placed and the workpiece measurement position
should have been determined. So, this step requires only
teaching of these positions.
(2) After position teaching for P[1] is completed, enter 1 in R[10]
manually and execute the robot program starting from the first
line.
(3) As a result of program execution, the robot moves to the
measurement position, and a measurement is made. If the
measurement is successful, OFS_RJ3 internally sets the
measurement result in PR[10] as nominal data.
(4) Then, the robot program stops temporarily at the “PAUSE” line.
Set R[10] to 0 so that the nominal data is not updated by mistake
at the next execution.
(5) Subsequently, while keeping the workpiece in the state it was in
when measured in (3), teach the operation of holding the
workpiece from P[2] to P[5].
(6) Add an “Offset” instruction to the operation statements for
holding the workpiece in the program.

15: UTOOL_NUM = 2
16: J P[2] 50% CNT100 Offset,PR[11]
17: L P[3] 500 mm/sec CNT100 Offset,PR[11]
18: L P[4:GRASP POINT] 100 mm/sec FINE Offset,PR[11]
19: L P[5] 300 mm/sec CNT100 Offset,PR[11]

(7) Nominal execution ends.


CAUTION
To perform nominal execution, do the following:
- Check that the register indicating nominal execution
(the nominal flag) is set correctly.
- Exercise care so that the workpiece state at the time
of measurement is kept unchanged until the robot holds
the workpiece.
If one of the above conditions is not met, execution
must be performed again from the workpiece
measurement (from (2) in the above procedure).

Actual execution
(1) Check that 0 is set in R[10], and execute the robot program
starting from the first line.
(2) The workpiece is measured, and the operation of holding the
workpiece is performed according to the changes in workpiece
position and orientation.

CAUTION
To perform actual execution, check that the register indicating
nominal execution (the nominal flag) is not set. If
actual execution is performed with the nominal flag set,
the nominal data is rewritten. (In this case, the
nominal data can be restored by teaching the operation
of holding the workpiece again.)


10.2 OTHER APPLICATION EXAMPLES

Robot systems using the 3D Laser Vision Sensor can find a wide
variety of applications by modifying the robot program presented in
Subsection 10.1.1, "Creating a Robot Program". Such applications
include the following:

(1) The 3D Laser Vision Sensor mounted on another robot or
installed at a fixed position is used to correct a displacement in
holding of a workpiece, change the posture when aligning the
workpiece, and then set the workpiece in a jig for the next process.
(2) When a large workpiece is picked up, multiple locations are
measured, and the measurement results are synthesized to obtain
the measurement result of the entire workpiece.
(3) The measurement position is corrected by the first measurement
so that measurements can be made from the front side where
possible to improve accuracy.

Many other applications are possible. When you need details on


programs that execute the above application examples, or when you
want to know the feasibility of an application not presented above,
contact FANUC.


11 OPTIONS
In option setting, settings for the 3D Laser Vision Sensor other than
teaching of vision data are performed.


11.1 OPTION SETTING


Select [Options] from the [View] menu. The Options screen shown
below appears.

Item Description
StartUpCamera Camera number that is selected when you start up the 3D Laser Vision
Sensor. 1 is the default.
Data folder Folder name for storing the vision data that you trained.
C:¥Program Files¥Fanuc¥3D Vision¥Data is the default.
Image folder Folder name for storing images, such as unmatched images, used by the 3D
Laser Vision Sensor.
C:¥Program Files¥Fanuc¥3D Vision¥Image is the default.


Item Description
If image folder exceeds limit The vision process saves images to files when an image saving option is
selected in vision process training. If most of the hard disk is used up by
saved images, such as unmatched images, trouble such as the PC failing to
start may occur.
Even when most of the hard disk space is not yet used up, if the folder
specified in “Image folder” becomes too large, it can take too much time to
access the folder, abnormally increasing the processing time of the 3D Laser
Vision Sensor. The 3D Laser Vision Sensor therefore keeps the size of the
folder specified in “Image folder” from exceeding the limit specified in the
next item, “Size of image folder”.
This setting selects the action to be taken so that the limit is not exceeded.

Option Description
Don't save When the limit is reached, subsequent
images are not stored.
Save after deleting oldest When the limit is reached, the image file with
one the oldest date in the folder is deleted, and
the new image file is stored. This is the
default setting.

CAUTION
When the file system of the personal computer is
FAT32, it may take too much time to store an image if
“Save after deleting oldest one” is selected. As
described in Subsection 2.1.1, "Personal Computer",
the NTFS file system should be used. (At least, the
file system of the image folder should be NTFS.)
Size of image folder When changing this size, ensure that about 100 MB of hard disk space
remains even if the image folder becomes full. The maximum value is
1000 MB.
Show run-time monitor Check this box to automatically show the run-time monitor at start-up of the
at start-up 3D Laser Vision Sensor application. You can also show the run-time monitor
by clicking the [Monitor] button on the tool bar. OFF is the default.
Advanced mode By default, some items that are not normally used do not appear. Check this
box to show all items.
Keep image in last execution When this check box is checked, the images of the most recently executed
vision process are saved in the image folder as LastImage*.bmp. (*
indicates that there may be multiple files; the number of images saved varies
depending on the vision process used.) These files are used when trouble
occurs in the system. By default, this check box is checked.
Comm. to robot Settings for communication between the robot controller and the 3D Laser
Vision Sensor. Communication is used to invoke a 3D Laser Vision Sensor
process from the robot controller and to store the processing results in the
robot controller's registers and position registers.
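The “Save after deleting oldest one” behaviour described in the table above can be sketched as follows. This Python fragment illustrates the policy only, under the assumption that "oldest" means the oldest file modification time; it is not FANUC's implementation.

```python
import os

def enforce_image_folder_limit(folder, limit_bytes):
    """Sketch of the 'Save after deleting oldest one' policy: while the
    total size of the image folder exceeds the limit, delete the file
    with the oldest modification time.  Returns the resulting size."""
    files = [os.path.join(folder, name) for name in os.listdir(folder)
             if os.path.isfile(os.path.join(folder, name))]
    total = sum(os.path.getsize(p) for p in files)
    while total > limit_bytes and files:
        oldest = min(files, key=os.path.getmtime)  # oldest file first
        total -= os.path.getsize(oldest)
        os.remove(oldest)
        files.remove(oldest)
    return total
```

A policy like this trades disk space for history: the newest failure images always survive, while the oldest are sacrificed first.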


11.2 ETHERNET SETTING

Click the Property button of [Comm. to robot] – [Ethernet] on the
Options screen. The screen shown below appears. Here you can
set the automatic re-connection function of Ethernet communication.

Item Description
TCP/IP keep alive time Indicates the time in milliseconds that the Windows NT TCP/IP service waits
before determining that the communication partner has not returned a
response. The 3D Laser Vision Sensor uses this service to determine whether
the robot controller is powered off. When this value is too small,
communication may become unstable. When this value is too large, the 3D
Laser Vision Sensor cannot resume communication with the robot controller,
because it cannot determine whether the robot controller is powered off. The
default value for the TCP/IP service is two hours (7,200,000 milliseconds),
which is too long for this purpose. Therefore, change this value to about 15
seconds by entering 15,000. For this change to take effect, you must restart
the PC.
PING timeout time The 3D Laser Vision Sensor checks whether the robot controller is powered on
by periodically issuing Ping. Ping determines that the robot controller is
powered off after waiting for the time period specified in this field without
receiving a response. If this value is set too large and the robot controller
does not return a response, a heavy load is placed on the PC, degrading its
performance and response. Decreasing this value reduces the load on the
PC, but makes the operation of Ping unstable. The default value is 250
milliseconds.
PING interval time Indicates the interval in milliseconds at which Ping is issued to check whether
the robot controller is powered on. Decreasing this value imposes loads on
the PC. Increasing this value off-loads the PC, but it takes a longer time until
the 3D Laser Vision Sensor checks whether the robot controller is powered on.
The default value is five seconds.
Comm. software timeout Indicates the maximum time in seconds until which the 3D Laser Vision Sensor
waits for a response from the robot controller during communication with the
controller over the Ethernet. The default is 60 seconds.
No alarm outputs even if Checking this check box suppresses the alarm indication shown when the
specified program does not robot controller specifies execution of a program that is not present. This
exist check box should be checked only in the special case where more than one
personal computer is connected to one robot controller. By default, this
check box is not checked.
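The keep-alive, Ping timeout, and Ping interval settings above amount to a periodic liveness check with a timeout. A rough analogue in Python is sketched below; it uses a TCP connection attempt instead of ICMP Ping, and the default port number is a placeholder, not the actual FANUC port.

```python
import socket
import time

def robot_alive(host, port=60008, timeout=0.25):
    """Liveness check analogous to the Ping timeout: try a TCP
    connection and treat no answer within 'timeout' seconds as
    powered off.  The port is an assumed placeholder."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def wait_until_alive(host, port, interval=5.0, attempts=3):
    """Poll at a fixed interval, like the PING interval setting,
    giving up after a limited number of attempts."""
    for _ in range(attempts):
        if robot_alive(host, port):
            return True
        time.sleep(interval)
    return False
```

The same trade-off the manual describes applies here: a shorter timeout reacts faster but risks false "powered off" verdicts on a loaded network.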


TIP
When the PC vision instruction is executed in a
program on the robot controller, the vision process is
started on all personal computers (3D Laser Vision
Sensor software) for which the robot controller is set
as "robot" data and that are in the "connected" state.
For the connection between the robot controller and
personal computers (3D Laser Vision Sensor
software), see Chapter 5, "ROBOT COMMUNICATION
SETTING".

11.3 3DV OPTION SETTING

This section describes the optional settings for the 3D Laser Vision Sensor. Select
[3DV Options] from the [View] menu. The screen shown below appears.

Application
Select the kind of application to be used when training is performed. "2D, 3D
measurement" is the default. Checking "Bin picking" and then clicking the OK
button changes the menu configuration to display a menu specifically
designed for the "Bin picking" function. For information about "Bin picking",
refer to the separate operator's manual.

APPENDIX

A ERROR CODES
The following table lists the major error codes.
Error code: 80041007H
Message: No frame grabber was opend.
Cause: The frame grabber is not available to the software of the 3D Laser Vision Sensor.
Response: The frame grabber may not be inserted into the PCI bus securely, the frame grabber may be faulty, or more than one 3D Laser Vision Sensor program may be running on the same personal computer. Check for these items.

Error code: 80042003H
Message: 2D detection failed.
Cause: A failure occurred in 2D detection.
Response: Using a location tool, adjust parameters and teach models again so that 2D detection can be performed reliably.

Error code: 80042005H
Message: Image loading failed.
Cause: The video image file is not found or is damaged.
Response: Check for the video image file, and take pictures again as required.

Error code: 80042006H
Message: Image saving failed.
Cause: The target folder is not found or is write-protected.
Response: Check whether the target folder exists. Also check that the file property is correct.

Error code: 80042007H
Message: Pixel value cannot be translated to line.
Cause: Camera calibration has not been performed, or the "Scale factor" is invalid, such as being zero or less.
Response: Calibrate the camera (again).

Error code: 80042008H
Message: Position cannot be translated to pixel value.
Cause: The "CCD cell size" is invalid, such as being zero or less.
Response: Check "CCD cell size" in the Change camera type dialog box. Correct it as required, and calibrate the camera (again).

Error code: 80042009H
Message: Invalid program name.
Cause: An attempt was made to execute a command with a measurement number of 2 or greater, but the vision process name is different from that used during the previous execution of the process.
Response: Check the vision process name and measurement number, and correct them as required.

Error code: 8004200CH
Message: Invalid snap number.
Cause: An attempt was made to execute a command with a measurement number of 2 or greater, but the measurement number is not consecutive to that used during the previous execution of the command, or is invalid, such as being greater than the specified measurement count.
Response: Check the measurement number and correct it as required.

Error code: 80042011H
Message: Lack of points.
Cause: The number of bright points generated with laser beams as captured with the camera is insufficient.
Response: If the window is not appropriate, adjust the window size and position. If the laser beams are defeated by the background, decrease the amount of light from the outside, or decrease "Brightness Threshold" within the range that ensures correct detection.

Error code: 80042013H
Message: No plane.
Cause: No straight line forming a flat surface was found, or the distance between two straight lines is too large to form a flat surface.
Response: If the laser beam point string is insufficient, respond in the same way as for "80042011H Lack of points.". If the distance between the two straight lines obtained from laser beams exceeds "Distance Threshold", increase the threshold value according to the level difference, if any is found at a portion to which a laser beam is directed. Alternatively, the positional relationship for the sensor may have been changed by a cause such as a collision; in this case, re-calibration is required.

Error code: 80042014H
Message: No line.
Cause: It is impossible to form a straight line from a string of bright points generated with laser beams.
Response: Respond in the same way as for "80042011H Lack of points.". Alternatively, decrease "Slit Points Threshold" within the range that ensures correct detection to make forming a straight line easier. If the measurement object is narrow, check the "Measure narrow area" option.

Error code: 8004201BH
Message: Data form is opened.
Cause: The Camera Setup, Location Tool, or Vision Process dialog box is open, so it is impossible to switch the displayed menu.
Response: Before trying to switch the displayed menu, close the Camera Setup, Location Tool, or Vision Process dialog box.

Error code: 8004201FH
Message: Turning on laser failed.
Cause: An attempt was made to perform 3D measurement with the laser power supply turned off.
Response: The I/O board may not have been installed properly, or there may be a disconnection, such as a cable disconnection.

Error code: 80042023H
Message: Cannot get the current user frame.
Cause: The user coordinate system cannot be acquired from the robot.
Response: Check the status of communication with the robot controller.

Error code: 80042024H
Message: Cannot get the current tool frame.
Cause: The tool coordinate system cannot be acquired from the robot.
Response: Check the status of communication with the robot controller.

Error code: 80042025H
Message: Position register is uninitialized.
Cause: The position register of the robot accessed by the 3D Laser Vision Sensor has not been initialized.
Response: After saving the current position, initialize the position register.

Error code: 80042028H
Message: Iteration numbers exceed limits.
Cause: No solution could be obtained by performing the calculation a specified number of times.
Response: Adjust the window for the laser, or modify parameters so that just as many laser beam points as required can be detected. If this error occurs during setup of the sensor coordinate system, also check whether dots on the calibration jig can be detected correctly.

Error code: 8004202AH
Message: Cross section cannot be found.
Cause: The taught cross section model could not be detected in the cross section image generated by cross section measurement.
Response: Correct the parameter settings for the location tool to which the cross section model was taught. Alternatively, adjust the window for the laser, or modify parameters so that just as many laser beam points as required can be detected.

Error code: 8004202BH
Message: Fitting curve failed.
Cause: In edge measurement, laser beam points could not be fitted to a curve.
Response: Adjust the window for the laser, or modify parameters so that just as many laser beam points as required can be detected.

Error code: 8004202CH
Message: Needs reference point(s).
Cause: In edge measurement, no reference point has been set for the taught location tool.
Response: Set a reference point for the location tool.


B CALIBRATION JIG
This appendix explains a jig used for simple re-calibration (see
Section 6.7, "SIMPLE RE-CALIBRATION"), sensor coordinate
system setup (see Subsection 6.6.2, "Setting a Sensor Coordinate
System"), and grid pattern calibration (see Section 6.4, "GRID
PATTERN CALIBRATION (2D MEASUREMENT)").
In each calibration operation, the following grid points (grid pattern)
are shown to the 3D Laser Vision Sensor to allow automatic
recognition of the positional relationship between the jig and the 3D
Laser Vision Sensor, the distortion of the lens, and the focal length.
(Figure: calibration grid pattern, with the origin and the Y-direction labeled)

The jig must fit within the field of view of the 3D Laser Vision
Sensor.
The number of grid points must be 49 (= 7×7), as shown above.
Place the larger grid points in the shape of the letter "L", with the
corner of the "L" at the origin, as shown above.
The size of the jig used varies from one application to another
according to the size of the view field of the camera.
The minimum required number of grid points is as shown above
(about 7×7 = 49). The more grid points, the higher the precision.
However, if there are too many grid points, it may become impossible
to detect the object correctly.


The grid pattern must have large grid points arranged like the letter
"L", as shown above. The corner of the "L" is placed at the grid
pattern origin. A string of three large dots corresponds to the
X-direction, while a string of two large dots corresponds to the
Y-direction.
The large grid points have a diameter 1.5 times that of the small grid
points so that the calibration jig can be recognized even if it is placed
obliquely to the camera.
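The layout rules above can be made concrete with a short, hypothetical Python sketch that enumerates such a grid. The function name and the 20 mm pitch are illustrative assumptions, not values from this manual.

```python
def grid_pattern(n=7, pitch=20.0):
    """Return (x, y, is_large) tuples for an n-by-n calibration grid.

    The large dots (1.5x diameter) form an "L" whose corner is the
    origin: three in a row along +X and two in a column along +Y,
    sharing the origin dot.
    """
    large = {(0, 0), (1, 0), (2, 0),   # three large dots: X-direction
             (0, 1)}                   # second large dot: Y-direction
    return [(i * pitch, j * pitch, (i, j) in large)
            for j in range(n) for i in range(n)]

points = grid_pattern()
print(len(points))                            # 49 grid points (7x7)
print(sum(1 for _, _, big in points if big))  # 4 large dots form the "L"
```

Since the two arms of the "L" have different lengths (three dots versus two), the pattern has no rotational symmetry, which is what lets the sensor recover the jig's orientation even when the jig is viewed obliquely.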


C ROBOT POSITION COMPENSATION


There are two robot position compensation methods. Select either
one according to the application of your system.

Center interface method A:
- Compensation is applied to movement parallel to the X- or Y-axis
  and rotation around the Z-axis with respect to the model origin of a
  detected object.
- Position registers in Cartesian coordinate format (XYZWPR) are
  used for compensation. Actually, X, Y, and R are used.
- The 3D Laser Vision Sensor sends information about a detected
  position, rather than compensation data, to the robot controller.
  So, the nominal position is stored in a position register in the robot
  controller to obtain compensation data using the following formula:
      PR[3] = PR[2] - PR[1]
  where PR[1] = nominal data, PR[2] = actual data, and
  PR[3] = compensation data.
- While the actual amount of compensation can be predicted from
  compensation data, the compensation accuracy is affected by the
  model origin setting error and the TCP setting error.

Center interface method B:
- Compensation is applied to a selected user coordinate system
  itself, so it is usable for robot movement routes.
- Position registers in matrix format (Nxyz, Oxyz, Axyz, Lxyz) are
  used for compensation.
- The robot controller runs the dedicated KAREL program* on data
  received at its position register from the 3D Laser Vision Sensor to
  obtain compensation data.
- While the compensation accuracy is not affected by the model
  origin setting error and the TCP setting error, the compensation to
  be applied to each point cannot easily be found from the
  compensation data.

(*) If you want further information, please contact FANUC or its
branch offices.
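As a numeric illustration of the method A formula, the hypothetical Python sketch below (not controller code) computes PR[3] = PR[2] - PR[1] for the components method A actually uses (X, Y, and R); the sample position values are arbitrary.

```python
def compensation(nominal, actual):
    """PR[3] = PR[2] - PR[1], applied to the X, Y, and R components
    of an XYZWPR position (the components method A uses)."""
    return {k: actual[k] - nominal[k] for k in ("x", "y", "r")}

pr1 = {"x": 100.0, "y": 50.0, "r": 10.0}   # PR[1]: nominal (taught) data
pr2 = {"x": 103.5, "y": 48.0, "r": 12.0}   # PR[2]: actual detected data
pr3 = compensation(pr1, pr2)               # PR[3]: compensation data
print(pr3)  # {'x': 3.5, 'y': -2.0, 'r': 2.0}
```

Because PR[3] is a plain component-wise difference, the amount of shift and rotation applied to each taught point can be read directly from it, which is the prediction advantage of method A noted above.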


D RESTRICTIONS WITH A RAIL AXIS


When the 3D Laser Vision Sensor is installed on a robot with a rail
axis, several restrictions are imposed on the position of the rail axis
when a workpiece is measured, when the workpiece is taken out, and
at other operation stages. The restrictions may vary depending on the
application. This appendix explains two examples that have a
relatively wide application range. For other applications, contact
FANUC.

Picking up a workpiece by using the 3D Laser Vision Sensor as a hand eye


(1) When the rail axis is an "integrated axis"
Basically, there is no restriction on the position of the rail axis.
If, however, the orientation of the rail axis does not completely
match one axis of the robot coordinate system, the deviation is
added as an error. It is usually difficult to match these axes with
each other completely, so this application is not recommended
unless the system can tolerate a relatively large error.
(2) When the rail axis is a "non-integrated axis" or "nobot"
The position of the rail axis when a workpiece is measured and
the position of the rail axis when the workpiece is taken out must
be the same.

Correcting a displacement in holding of a workpiece by using the 3D Laser Vision Sensor installed at a fixed position

(1) When the rail axis is an "integrated axis"
Basically, there is no restriction on the position of the rail axis.
If, however, the orientation of the rail axis does not completely
match one axis of the robot coordinate system, the deviation is
added as an error. It is usually difficult to match these axes with
each other completely, so this application is not recommended
unless the system can tolerate a relatively large error.
(2) When the rail axis is a "non-integrated axis" or "nobot"
When the 3D Laser Vision Sensor is installed at a fixed position,
the sensor coordinate system of the 3D Laser Vision Sensor must
be set as the user coordinate system (explanation omitted). The
position of the rail axis when the user coordinate system is set
and the position of the rail axis at the time of measurement (when
a workpiece is shown to the 3D Laser Vision Sensor) must be the
same. When the displacement in holding of the workpiece is
corrected and the workpiece is then placed on the jig, however,
the position of the rail axis may differ.


Index

3
3D Laser Vision Sensor ..... 12
3D LASER VISION SENSOR ..... 25
3D LASER VISION SENSOR BASIC OPERATIONS ..... 36
3D Laser Vision Sensor System Configuration ..... 4
3D MEASUREMENT AND 2D MEASUREMENT ..... 43
3D MEASUREMENT CALIBRATION ..... 53
3DV OPTION SETTING ..... 191

A
ADJUST LOCATION TOOL PARAMETER ..... 93
APC-3322A ..... 31
AUTOMATIC CONNECTION ..... 49

B
BACKING UP VISION DATA ..... 45

C
CALIBRATION ..... 51
Calibration Data Detail Display ..... 56
CALIBRATION JIG ..... 197
CALIBRATION TYPES ..... 52
Camera Adapter and Adapter Cable ..... 12
Camera Cable ..... 13
CHANGING THE SHUTTER SPEED MODE FOR SNAP AND LIVE ..... 38
Checking the Precision of Calibration ..... 59
COMMUNICATION CONDITION ..... 48
Communication Test ..... 29
COMMUNICATION TEST ..... 49
CONFIGURATION ..... 11
Connecting Cables ..... 27
CONNECTING THE CAMERA ..... 30
CONNECTION STATUS ..... 49
Continuous Run ..... 175
CORRECTING THE POSITION AND POSTURE FOR HOLDING A WORKPIECE ..... 181
Creating a Robot Program ..... 182
CREATING A VISION DATA LIST FILE ..... 45
CREATION AND EXECUTION OF A ROBOT PROGRAM ..... 180
CROSS SECTION MEASUREMENT ..... 158

D
Detecting a Grid Pattern and a Laser ..... 54
Determining the IP Address ..... 27
Differences from 3D Measurement Calibration ..... 85
DISK MEASUREMENT ..... 112
DISPLACEMENT MEASUREMENT ..... 128
DISPLAY ..... 14
DISPLAY LIVE IMAGES ..... 38

E
EDGE MEASUREMENT ..... 142
ENLARGING AND REDUCING IMAGES ..... 39
ERASING ALARMS ..... 40
ERROR CODES ..... 195
ETHERNET ..... 27
Ethernet Cable ..... 13
ETHERNET COMMUNICATION SOFTWARE ..... 24
ETHERNET SETTING ..... 190
Example of Setting a Grid Pattern Frame ..... 70
Executing a Robot Program ..... 184
EXECUTION BY ROBOT CONTROLLER ..... 108, 176
Execution of Simple Re-calibration ..... 81
EXIT 3D LASER VISION SENSOR ..... 37
Extra care region ..... 96

F
Find Grid Pattern ..... 67
Frame Grabber Board ..... 12
FRAME GRABBER BOARD ..... 15

G
GRID PATTERN CALIBRATION (2D MEASUREMENT) ..... 66
Grid Pattern Frame ..... 68

I
I/O Board ..... 13
I/O BOARD ..... 19
Information Obtained ..... 112, 128, 142, 158
INSTALLATION SEQUENCE ..... 14
Installing Hardware ..... 15, 19
Installing the 3D Laser Vision Sensor Software ..... 25
Installing the Driver Software ..... 17, 21

L
Laser Beam ..... 5
Laser Slit Detail Display ..... 58
LIST OF VISION DATA ..... 44

M
Making Settings for the 3D Laser Vision Sensor Side ..... 28
Making Settings for the Robot Side ..... 27
Model Display ..... 91
MODIFYING LOCATION TOOL MODEL ..... 90
MOVING IMAGES ..... 39

O
OPTION SETTING ..... 188
OPTIONAL FUNCTION OF LOCATION TOOL ..... 95
OPTIONS ..... 187
OTHER APPLICATION EXAMPLES ..... 186
OUTLINE OF 3D LASER VISION SENSOR ..... 4
OVERVIEW OF THE MANUAL ..... 2

P
PAINTING AN IMAGE WITH A RED PEN ..... 40
PC I/O Cable ..... 13
Personal Computer ..... 12
PREFACE ..... 1
PROTECTOR ..... 35

R
RESTRICTIONS WITH A RAIL AXIS ..... 200
ROBOT COMMUNICATION SETTING ..... 47
ROBOT POSITION COMPENSATION ..... 199

S
Safety of Laser Sensor ..... 5
SELECT A CAMERA ..... 37
SENSOR COORDINATE SYSTEM ..... 73
Sensor Coordinate System and Measurement Results ..... 73
Setting a Reference Point ..... 92
Setting a Sensor Coordinate System ..... 74
Setting Laser Slit Detection Parameters ..... 58
Setting of 3D Measurement Calibration ..... 53
Setting the Camera ..... 30
Settings ..... 129, 143, 159
SETTINGS ..... 113
SETUP ..... 10
SETUP VISION PROCESS ..... 103
SHOW DETAILS OF CAMERA CALIBRATION DATA ..... 72
SHOWING BRIGHTNESS DISTRIBUTION ..... 178
SIMPLE 2-POINT CALIBRATION ..... 63
SIMPLE RE-CALIBRATION ..... 81
SNAP A NEW IMAGE FROM A CAMERA ..... 37
START 3D LASER VISION SENSOR ..... 37
Statistic Function ..... 97
STATUS OF FINDING ..... 40

T
TEACHING AND EXECUTION THE 2D VISION PROCESS ..... 102
TEACHING AND TESTING THE 3D VISION PROCESS ..... 110
TEACHING AND TESTING THE LOCATION TOOL ..... 86
Teaching Procedure ..... 124, 139, 154, 168
TEST VISION PROCESS ..... 106
TESTING LOCATION TOOL ..... 99
TRAIN MODEL ..... 87
TURN ON THE LASER ..... 39
TYPES OF 3D MEASUREMENT FUNCTIONS ..... 111

U
Uninstalling the 3D Laser Vision Sensor Software ..... 25
Uninstalling the Driver Software ..... 18, 23
Use BMP Files ..... 107, 175

V
VISION DATA ..... 41
VISION DATA TYPES ..... 42
VISION PROCESS TEST EXECUTION ..... 173

W
Warning Label ..... 5
Revision Record

FANUC VISION V-500iA/3DL 3D Sensor OPERATOR'S MANUAL (B-81444EN)

Edition 03 (Mar., 2004):
- Addition and modification for 3D Vision Sensor detection function
- Deletion of 3D Vision Sensor bin picking function
- Laser class of 3D Vision Sensor is changed (Class 3B to Class 3A)

Edition 02 (Mar., 2002):
- Addition and modification for 3D Vision Sensor detection function
- Addition and modification for 3D Vision Sensor bin picking function

Edition 01 (Feb., 2001):
- (First edition)
