! WARNING
This equipment generates, uses, and can radiate radio frequency energy and, if not installed and used in accordance with the instruction manual, may cause interference to radio communications. As temporarily permitted by regulation, it has not been tested for compliance with the limits for Class A computing devices pursuant to Subpart J of Part 15 of the FCC Rules, which are designed to provide reasonable protection against such interference. Operation of the equipment in a residential area is likely to cause interference, in which case the user, at his own expense, will be required to take whatever measures may be required to correct the interference.
! WARNING
Information appearing under WARNING concerns the
protection of personnel. It is boxed and in bold type to set it
apart from other text.
! CAUTION
Information appearing under CAUTION concerns the protection of
equipment, software, and data. It is boxed to set it apart from
other text.
FANUC Robotics North America, Inc. is not, and does not represent itself as an expert in laser safety
systems, equipment, or the specific safety aspects of your company and/or its workforce. It is your
responsibility as the owner, employer, or user to take such steps as may be necessary to ensure the
safety of all personnel in the workplace.
You as the owner, employer, Laser Safety Officer (LSO), or user of laser systems, are obligated to
monitor current safety standards and be sure your safety procedures are in conformance with current
standards.
The appropriate level of safety for an installation can best be determined by safety professionals most
familiar with the particular application or installation. FANUC Robotics therefore recommends that
each customer consult with such professionals in order to provide a workplace that allows for the safe
application, use, and operation of FANUC Robotics laser systems.
Safety References
The following laser safety considerations touch briefly on the reasonable and adequate use of laser
systems. Refer to the American National Standards Institute, Inc. (ANSI) Standard For the Safe Use
of Lasers, ANSI Z136.1-2000 and Specifications for Accident Prevention Signs, ANSI Z35.1-1972 for
additional information on laser safety.
Note American National Standards are subject to periodic review and users are cautioned to obtain the
latest editions.
The Laser Safety Officer (LSO) is an individual with the authority and responsibility to monitor and
enforce the control of laser hazards and to effect the knowledgeable evaluation and control of laser
hazards. The conditions under which the laser is used, the level of safety training of individuals using
the laser, and other environmental and personnel factors are important considerations in determining
the full extent of safety control measures. Since such situations require informed judgments by responsible persons, it is recommended that you appoint and thereafter consult a Laser Safety Officer (LSO).
Recommended duties of the LSO are detailed in the Standard for the Safe Use of Lasers, ANSI Z136.1-2000, Section 1, with additional information in Sections 3, 4, and 5. See item 1 in the References section.
Laser Safety
All personnel who intend to operate, program, repair, or otherwise use the laser system should be
familiar with the safeguarding devices identified in this section. The intent of this section is to help
make you aware that the listed components exist in the laser system. It is not intended as an exhaustive
list or as a thorough explanation of the devices.
FANUC Robotics recommends that all individuals associated with the laser system be trained in
an approved FANUC Robotics training course and become familiar with the proper operation of
the laser system.
This chapter describes the laser system as it is originally equipped by FANUC Robotics and does not
cover modifications or reconstructions that might be performed by other individuals or companies.
Note Refer to ANSI specifications and Occupational Safety and Health Administration (OSHA) guidelines for further information.
Lasers and laser systems are grouped into the following hazard classes:
• Class 1
• Class 2
• Class 3 (a and b)
• Class 4
It is your responsibility as the owner, employer, LSO, or user to take such steps as might be necessary
to ensure that the laser system is installed in accordance with installation specifications. The means
and degree of safeguarding your laser system should correspond directly to the type and level of
potential hazards presented by your specific installation and application.
Note The safeguarding measures described in this section are intended to reflect current industry
standards, and therefore your LSO should monitor such standards for further safeguarding.
• Access restriction
• Eye protection (refer to ANSI Z136.1, Section 4.6)
• Area controls
• Barriers, shrouds, beam stops, and so forth
• Administrative and procedural controls
• Education and training
Protective Housings
A protective housing shall be provided for all classes of lasers or laser systems. The protective
housing may require interlocks and labels.
Since service personnel may remove protective housings, e.g., for alignment, special safety procedures
may be required and the use of appropriate eyewear is recommended.
Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.1. Refer to item 1 in
the References section.
Portions of the protective housing that are intended to be removed only by service personnel, and whose removal permits direct access to laser radiation associated with a Class 3b or Class 4 laser or laser system, shall either be interlocked or require a tool for removal.
If the interlock can be bypassed or defeated, a warning label with the appropriate indications shall be
located on the protective housing near the interlock. The label shall include language appropriate to
the laser hazard. The interlock design shall not permit the service access panel to be replaced with
the interlock bypassed or defeated.
Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.3. Refer to item 1 in
the References section.
A Class 4 laser or laser system shall be provided with a master switch. This master switch shall effect
beam termination and/or system shutoff and shall be operated by a key, or by a coded access (such as
a computer code). The authority for access to the master switch shall be vested in the appropriate
supervisory personnel.
During periods of prolonged non-use (e.g., laser storage), the master switch shall be left in a disabled condition (key removed or equivalent).
A single master switch on a main control unit shall be acceptable for multiple laser installations where
the operational controls have been integrated.
All energy sources associated with Class 3b or Class 4 lasers or laser systems shall be designed to
permit lockout/tagout procedures required by the Occupational Safety and Health Administration of
the U.S. Department of Labor.
Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.4. Refer to item 1 in
the References section.
In applications of Class 3b or Class 4 lasers or laser systems where a beam path is unenclosed, a laser
hazard analysis shall be effected by the LSO to establish the NHZ if not furnished by the manufacturer.
The LSO will define the area where laser radiation is accessible at levels above the appropriate MPE,
or levels which could impair visual performance (applicable to only visible radiation, i.e., 400-700
nm).
If variable beam divergences are possible, an NHZ determination may be required for more than one beam divergence (i.e., minimum and maximum divergence). The LSO shall ensure that correct control measures are in place for planned operating divergences prior to laser operation.
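For illustration only: a commonly cited closed form for the direct (intrabeam) NHZ in laser-safety references is r = (1/φ)(√(4Φ/(π·MPE)) − a), where Φ is output power, φ is beam divergence, a is the exit aperture diameter, and MPE is expressed in W/cm². The sketch below uses hypothetical numbers and does not replace the LSO's hazard analysis or the manufacturer's NHZ data:

```python
import math

def nhz_direct_beam(power_w, divergence_rad, aperture_cm, mpe_w_cm2):
    """Approximate direct-beam (intrabeam) NHZ distance in cm.

    Implements r = (1/phi) * (sqrt(4*Phi / (pi*MPE)) - a), a commonly
    cited laser-safety relation. Illustrative only; a real analysis must
    follow ANSI Z136.1 and account for optics, pulse format, and so on.
    """
    return (math.sqrt(4.0 * power_w / (math.pi * mpe_w_cm2)) - aperture_cm) / divergence_rad

# Hypothetical CW beam: 100 W, 2 mrad divergence, 0.3 cm exit aperture,
# with an assumed MPE of 0.01 W/cm^2.
r_cm = nhz_direct_beam(100.0, 2e-3, 0.3, 1e-2)
```

Note how the relation behaves: a larger MPE (a less hazardous wavelength or exposure condition) shrinks the NHZ, while a tighter divergence extends it.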
In some cases, the total hazard assessment may be dependent upon the nature of the environment,
the geometry of the application or the spatial limitations of other hazards associated with the laser
use. This may include, for example, localized fume or radiant exposure hazards produced during
laser material processing or surgery, robotic working envelopes, location of walls, barriers, or other
equipment in the laser environment.
Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.6. Refer to item 1 in
the References section.
In applications of Class 3b or Class 4 lasers or laser systems where the beam path is confined by
design to significantly limit the degree of accessibility of the open beam, a hazard analysis shall be
effected by the LSO to establish the NHZ if not furnished by the manufacturer. The analysis will
define the area where laser radiation is accessible at levels above the appropriate MPE and will define
the zone requiring control measures. The LSO shall establish controls appropriate to the magnitude
and extent of the accessible radiation.
Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.6. Refer to item 1 in
the References section.
A Class 3b laser or laser system should, and a Class 4 laser or laser system shall, be provided with a remote interlock connector. The interlock connector facilitates electrical connections to an emergency master disconnect interlock, or to a room, entryway, floor, or area interlock, as may be required for a Class 4 controlled area.
When the terminals of the connector are open circuited, the accessible radiation shall not exceed
the appropriate MPE levels.
Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.7. Refer to item 1 in
the References section.
A Class 4 laser or laser system shall be provided with a permanently attached beam stop or attenuator.
The beam stop or attenuator shall be capable of preventing access to laser radiation in excess of the
appropriate MPE level when the laser or laser system output is not required, as in warm-up procedures.
There are a few instances, such as during service, when a temporary beam attenuator placed over the beam aperture can reduce the level of accessible laser radiation to levels at or below the applicable MPE level. In this case, the LSO may deem that laser eye protection is not required.
Note For those lasers or laser systems that do not require a warm-up time, the main power switch
may be substituted for the requirement of a beam stop or attenuator.
Follow the standard guidelines set forth in ANSI Z136.1-2000, section 4.3.8. Refer to item 1 in
the References section.
An alarm (for example, an audible sound such as a bell or chime), a warning light (visible through
protective eyewear), or a verbal “countdown” command for single pulse or intermittent operations
should be used with Class 3b, and shall be used with Class 4 lasers or laser systems during activation
or startup.
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.3.9.4. Refer to item 1 in
the References section.
A laser hazard analysis, including determination of the NHZ, shall be effected by the LSO. If the
analysis determines that the classification associated with the maximum level of accessible radiation
is Class 3b or Class 4, a laser controlled area shall be established and adequate control measures
instituted.
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.3.10. Refer to item 1 in
the References section.
All Class 4 area or entryway safety controls shall be designed to allow both rapid egress by laser
personnel at all times and admittance to the laser controlled area under emergency conditions.
All personnel who require entry into a laser controlled area shall be appropriately trained, provided
with appropriate protective equipment, and shall follow all applicable administrative and procedural
controls.
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.3.10.2. Refer to item 1 in
the References section.
Note The guidelines set forth below are intended to reflect current industry standards, and therefore your LSO should monitor such standards for further safeguarding developments.
A laser controlled area shall:
• Be controlled to permit lasers and laser systems to be operated only by personnel who have been
trained in the operation of the laser, laser system and laser safety.
• Be posted with the appropriate warning sign(s). An appropriate warning sign shall be posted at the
entryway(s) and, if deemed necessary by the LSO, should be posted within the laser controlled
area.
• Be operated in a manner such that the beam path is well defined and projects into a controlled airspace when the laser beam must extend beyond an indoor controlled area, particularly to the outdoors under adverse atmospheric conditions (e.g., rain, fog, or snow).
• Be under the direct supervision of an individual knowledgeable in laser safety.
• Be located so that access to the area by spectators is limited and requires approval.
• Have any potentially hazardous beam terminated in a beamstop of an appropriate material.
• Have only diffusely reflecting materials in or near the beam path, where feasible.
• Provide personnel within the laser controlled area with the appropriate eye protection.
• Have the laser secured such that the exposed beam path is above or below eye level of a person in
any standing or seated position, except as required for medical use.
• Have all windows, doorways, open portals, etc. from an indoor facility be either covered or
restricted in such a manner as to reduce the transmitted laser radiation to levels at or below
the applicable ocular MPE.
• Require storage or disabling (for example, removal of the key) of the laser or laser system when
not in use to prevent unauthorized use.
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.3.10.1. Refer to item 1 in
the References section.
In those conditions where removal of panels or protective housings, overriding of protective housing interlocks, or entry into the NHZ becomes necessary (such as for service), and the accessible laser radiation exceeds the applicable MPE, a temporary laser controlled area shall be devised for the laser or laser system.
Such an area, which by its nature will not have the built-in protective features as defined for a laser
controlled area, shall provide all safety requirements for all personnel, both within and outside the area.
A notice sign shall be posted outside the temporary laser controlled area to warn of the potential
hazard.
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.3.12 for Class 4 laser
systems. Refer to item 1 in the References section.
Alignment Procedures
Mechanical alignments are to be completed by your FANUC Robotics representative. Any alignments performed by anyone other than FANUC Robotics personnel could void the warranty.
SAFEGUARDING PERSONNEL
FANUC Robotics recommends that all personnel potentially associated with the laser systems be
trained in an approved FANUC Robotics training course and become familiar with the proper
operation of the system.
Note Additional guidelines for general robotic safety are set forth in the robotic safety section of the
manual. You should consult these guidelines before operating a robot and robotic systems.
In addition to the robot safety guidelines set forth in this section, the following precautions should be considered to help safeguard the teacher and operator of a laser system:
• Wear the necessary personal protective equipment. Refer to the Personal Protective Equipment
section for specifics.
• Do not watch laser operation without the proper protective eyewear as described in the Personal
Protective Equipment section. The robot warning indicator will turn ON just prior to laser
activation.
• Check that all laser safeguards are in place and functioning as intended.
• Make sure the work envelope is secured prior to initiating production operation. Securing the
area requires that all safety barriers are in place and all safety signals are active. Follow the
standard guidelines set forth in ANSI Z136.1-2000, for all safety barriers. Refer to item 1 in the
References section.
Your eyes could be exposed directly to the laser beam when the Maximum Permissible Exposure (MPE) has been exceeded.
Eye protection devices which are specifically designed for protection against radiation from Class 4
lasers or laser systems shall be administratively required and their use enforced when engineering
or other procedural and administrative controls are inadequate to eliminate potential exposure in
excess of the applicable MPE level.
Laser protective eyewear may include goggles, face shields, spectacles, or prescription eyewear
using special filter materials or reflective coatings (or combination of both) to reduce the potential
ocular exposure below the applicable MPE level.
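The attenuation a filter must provide is conventionally expressed as optical density, OD = log10(H0 / MPE), where H0 is the anticipated worst-case exposure. The sketch below is illustrative only, with hypothetical values; actual eyewear selection must follow ANSI Z136.1 and the eyewear manufacturer's data:

```python
import math

def required_optical_density(worst_case_exposure, mpe):
    """Minimum optical density so the transmitted exposure <= MPE.

    OD = log10(H0 / MPE); both arguments must share the same units
    (e.g., W/cm^2 for CW exposure, J/cm^2 for pulsed exposure).
    Illustrative sketch only, not a substitute for ANSI Z136.1.
    """
    return max(0.0, math.log10(worst_case_exposure / mpe))

# Hypothetical: a 100 W/cm^2 worst-case exposure against an assumed
# MPE of 1e-3 W/cm^2 calls for OD 5 at the laser wavelength.
od = required_optical_density(100.0, 1e-3)
```

Because OD is logarithmic, each additional unit of OD cuts the transmitted exposure by another factor of ten, which is why eyewear must be labeled with both OD and wavelength.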
Laser protective eyewear shall be specifically selected to withstand either direct or diffusely scattered
beams. In this case, the protective filter shall exhibit a damage threshold for a specified exposure
time, typically 10 seconds. The eyewear shall be used in a manner so that the damage threshold is
not exceeded in the “worst case” exposure scenario. Flammability is also an important factor in the selection of laser protective eyewear.
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.6.2. Refer to item 1 in
the References section.
Maintenance of the laser should follow procedures in the laser system documentation.
Where possible, perform maintenance without power to the robot and the laser. Use lockout and
tagout procedures as defined by your plant safety procedures, and release or block all stored energy,
such as air.
FANUC Robotics recommends that all repair operations be performed with the controller power
turned OFF and the attenuator bolted over the laser aperture.
Routine maintenance, such as cleaning, replacement of lenses, greasing gears, or changing air filters, can be performed with the attenuator in place.
Wear the necessary personal protective equipment. Refer to the Personal Protective Equipment section
for specifics. Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.6. Refer
to item 1 in the References section.
It is your responsibility as the owner, employer, LSO, or user to take such steps as might be necessary
to ensure that potentially affected personnel are aware of and use available protective equipment.
Protective Eyewear
All laser protective eyewear shall be clearly labeled with the optical density and wavelength for which
protection is afforded. Color coding or other distinctive identification of laser protective eyewear is
recommended in multi-laser environments. For specifics, refer to the Safeguarding Personnel guidance in ANSI Z49.1-1988 and ANSI Z136.1-2000 (items 6 and 1, respectively, in the References section).
Physical and chemical hazards to the eye can be reduced by the use of face shields, goggles, and
similar protective devices.
Periodic inspections should be made of protective eyewear to ensure it is in satisfactory condition.
All warning signs and labels should be displayed and contain appropriate warning and cautionary
statements. A label should be provided and placed on the laser housing or control panel. However,
if the housing and control panel are separated by more than two meters, labels should be placed on
both the housing and control panel.
• The warning labels affixed to the laser housing and control panels should not under any
circumstances be removed.
• In the event either label becomes detached, it should be immediately reattached to the housing
or the control panel.
Follow the standard guidelines set forth in ANSI Z35.1-1972 (or latest revision thereof) and Federal
Register 21CFR, section 1040.10 (g); Warning Labels; certification 1010.2 and 1010.3. Refer to item
1 in the References section.
Design of Signs
Sign dimensions, letter size and color, etc., shall be in accordance with American National Standard
Specification for Accident Prevention Signs, ANSI Z535 series (latest revision thereof).
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.7.1. Refer to item 1 in
the References section.
Note Refer to the applicable maintenance or operations manual for a description of the signs and
labels used in your specific system.
Symbols
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.7. Refer to item 1 in the
References section.
Note Classification labeling in accordance with the Federal Laser Product Performance Standard, as described in item 3 of the References section, can be used as a guideline for the labeling discussed in this section.
Figure 1. Sample Warning Sign for Certain Class 3b and Class 4 lasers
Signal Words
The following signal words are used with the ANSI Z535 design laser signs and labels:
• The signal word “Danger” shall be used with all signs and labels associated with all Class 3a
lasers and laser systems that exceed the appropriate MPE for irradiance and all Class 3b and Class
4 lasers and laser systems.
• The signal word “Caution” shall be used with all signs and labels associated with Class 2 laser
and laser systems and all Class 3a lasers and laser systems that do not exceed the appropriate
MPE for irradiance.
• The signal word “Notice” shall be used on signs posted outside a temporary laser controlled area, for example, during periods of service.
Note When a temporary laser controlled area is created, the area outside the temporary area remains Class 1, while the area within is either Class 3b or Class 4; the appropriate danger warning is also required within the temporary laser controlled area.
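The signal-word rules above reduce to a simple decision by laser class. The sketch below is a hypothetical summary (the function name and string encoding of classes are our assumptions, not part of the standard):

```python
def signal_word(laser_class: str, exceeds_mpe_irradiance: bool = False) -> str:
    """Select the sign/label signal word per the class rules above (sketch).

    laser_class: one of "2", "3a", "3b", "4" (string encoding assumed).
    exceeds_mpe_irradiance: for Class 3a only, whether the appropriate
    MPE for irradiance is exceeded.
    """
    if laser_class in ("3b", "4"):
        return "Danger"
    if laser_class == "3a":
        return "Danger" if exceeds_mpe_irradiance else "Caution"
    if laser_class == "2":
        return "Caution"
    raise ValueError(f"no signal word defined here for class {laser_class!r}")

# "Notice" is not selected by class: it appears on signs posted outside
# a temporary laser controlled area, e.g., during service.
```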
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.7.3. Refer to item 1 in
the References section.
Location of Signs
All signs shall be conspicuously displayed in locations where they will best serve to warn onlookers.
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 4.7.4.3. Refer to item 1 in the References section.
NON-BEAM HAZARDS
ANSI Z136.1, Section 7
You as the owner, employer, LSO, or user of laser systems, are obligated to monitor current safety
standards and be sure your safety procedures are in conformance with current standards. It is your
responsibility to take such steps as might be necessary to inform personnel of possible hazards within
a laser system.
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 7. Refer to item 1 in the
References section.
Electrical Hazards
The use of lasers or laser systems can present an electric shock hazard. This may occur from contact
with exposed utility power utilization, device control, and power supply conductors operating at
potentials of 50 volts and above. These exposures can occur during laser set up or installation,
maintenance, modification, and service where equipment protective covers are often removed to
allow access to active components as required for those activities. Those at risk include equipment installers, users, technicians, and uninformed members of the public, such as passersby.
The effect on those who accidentally come into contact with energized conductors at or above 50 volts can range from a minor “tingle,” to a startle reaction, to serious personal injury or death. Because current pathways, such as ground, are pervasive, it is not possible to characterize all the parameters of a situation well enough to predict the occurrence or outcome of an electric shock accident. Electric shock is a very serious opportunistic hazard, and loss of life has occurred during electrical servicing and testing of laser equipment incorporating high-voltage power supplies.
Protection against accidental contact with energized conductors by means of a barrier system is the
primary methodology to prevent electric shock accidents with laser equipment. Hazard warnings
and safety instructions extend the safety system to embody exposures caused by conditions of use,
maintenance, and service, and provide protection against the hazards of possible equipment misuse.
Additional electrical safety requirements are imposed upon laser devices, systems, and those who work with them by the United States Department of Labor, Occupational Safety and Health Administration (OSHA), the National Electrical Code (NFPA 70), and related state and local laws and regulations. These requirements govern equipment connection to the electrical utilization system, electrical protection parameters, and specific safety training, and must be observed with all laser installations. Potential electrical problems of this kind are frequently identified during laser facility audits.
Follow the standard guidelines set forth in ANSI Z136.1-2000, Section 7.2. Refer to item 1 in the
References section.
Fire Hazards
Class 4 laser beams represent a fire hazard. Enclosure of Class 4 laser beams can result in potential fire hazards if enclosure materials are likely to be exposed to irradiances exceeding 10 W/cm² or beam powers exceeding 0.5 W. In some situations where flammable compounds or substances exist, fires can also be initiated by Class 3 lasers. The LSO should encourage the use of flame-retardant materials wherever applicable in laser applications.
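As a rough screening sketch of the fire-hazard thresholds for enclosure materials (10 W/cm² irradiance or 0.5 W beam power, per ANSI Z136.1), the function below is illustrative only; the uniform-irradiance assumption and the function names are ours, not the standard's:

```python
import math

def average_irradiance_w_cm2(power_w, beam_diameter_cm):
    """Average beam irradiance, assuming power spread uniformly over a circle."""
    return power_w / (math.pi * (beam_diameter_cm / 2.0) ** 2)

def enclosure_fire_risk(power_w, beam_diameter_cm):
    """Flag the enclosure-material fire-hazard thresholds cited above:
    irradiance above 10 W/cm^2, or beam power above 0.5 W. Screening
    sketch only; a real assessment considers exposure time and material.
    """
    return (average_irradiance_w_cm2(power_w, beam_diameter_cm) > 10.0
            or power_w > 0.5)

# Hypothetical: a 200 W beam concentrated on a 0.2 cm spot exceeds
# both limits by a wide margin.
at_risk = enclosure_fire_risk(200.0, 0.2)
```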
Opaque laser barriers, e.g., curtains, can be used to block the laser beam from exiting the work area during certain operations. While these barriers can be designed to offer a range of protection, they normally cannot withstand high irradiance levels for more than a few seconds without some damage, e.g., production of smoke, open fire, or penetration. Users of commercially available laser barriers should obtain appropriate fire prevention information from the manufacturer. Users can also refer to the National Fire Protection Association (NFPA) Code #115 for further information on controlling laser-induced fires.
Operators of Class 4 lasers should be aware that unprotected wire insulation and plastic tubing can catch fire from intense reflected or scattered beams, particularly from lasers operating at invisible wavelengths.
REFERENCES
The following safety guideline references are cited in this section:
• 1 American National Standards Institute, Inc. (ANSI) Standard For the Safe Use of Lasers,
ANSI Z136.1-2000 and Specifications for Accident Prevention Signs, ANSI Z35.1-1972 (or
latest revisions thereof)
• 2 Connections and Maintenance Manual for FYSL
• 3 Federal Laser Product Performance Standard, Federal Register 21CFR, Part III; Vol 50, #161
- 8/20/85
• 4 American Conference of Governmental Industrial Hygienists
• 5 Polymeric Materials for use in Electric Equipment, Underwriters Laboratories Standard, UL
746C
• 6 American National Standards Institute, Inc. (ANSI) Standard Safety in Welding and Cutting,
ANSI Z49.1-1988; Chapter 4, Sections 4.2 and 4.3.
• 7 ANSI/RIA R15.06-1999 American National Standard for Industrial Robots and Robot Systems -
Safety Requirements
• 8 ANSI Z535 Safety Sign and Color Standards
• 9 Guidelines for Laser Safety and Hazard Assessment, Occupational Safety and Health Administration (OSHA) Pub 8-1.7, 8/19/91 (or latest revision thereof)
• 10 Compliance Guide for Laser Products, US Department of Health and Human Services, Food and Drug Administration (FDA) 86-8260, 9/85 (or latest revision thereof)
• 11 National Electrical Code/Joint Industrial Council (NEC/JIC)
• 12 Federal Register 21CFR, Part III; Vol. 50, #161, 8/20/85, Laser Products; Amendments to Performance Standard; Final Rule, along with labeling rules 1010.2 and 1010.3 (or latest revisions thereof) and the FDA Department of Health and Human Services, Parts 1000 and 1040
• 13 Public Law 90-602, 90th Congress, H.R. 10790, 10/18/68 (or latest revisions thereof)
• 14 Reviewing the Federal Standard for Laser Products, Lasers & Optronics, 3/88
• 15 JIC Electrical Standards for General Purpose Machine Tools, EGP-1-67
• 16 Electrical Standards for Mass Production Equipment, EMP-1-67
• 17 NEC/NFPA 79 Electrical Code (the minimum electrical requirements, together with any supplements applicable in your local area)
B-81444EN/03 CONTENTS
CONTENTS
1 PREFACE ......................................................................................................................1
1.1 OVERVIEW OF THE MANUAL........................................................................................................................ 2
1.2 OUTLINE OF 3D LASER VISION SENSOR .................................................................................................... 4
1.2.1 3D Laser Vision Sensor System Configuration ................................................................................... 4
1.2.2 Safety of Laser Sensor.......................................................................................................................... 5
1.2.3 Laser Beam ........................................................................................................................................... 5
1.2.4 Warning Label ...................................................................................................................................... 5
1.3 3D LASER VISION SENSOR SETUP PROCEDURE....................................................................................... 8
2 SETUP .........................................................................................................................10
2.1 CONFIGURATION ........................................................................................................................................... 11
2.1.1 Personal Computer............................................................................................................................. 12
2.1.2 Frame Grabber Board ........................................................................................................................ 12
2.1.3 3D Laser Vision Sensor ...................................................................................................................... 12
2.1.4 Camera Adapter and Adapter Cable.................................................................................................. 12
2.1.5 Camera Cable ..................................................................................................................................... 13
2.1.6 I/O Board............................................................................................................................................. 13
2.1.7 PC I/O Cable ....................................................................................................................................... 13
2.1.8 Ethernet Cable.................................................................................................................................... 13
2.2 INSTALLATION SEQUENCE.......................................................................................................................... 14
2.3 DISPLAY ........................................................................................................................................................... 14
2.4 FRAME GRABBER BOARD ........................................................................................................................... 15
2.4.1 Installing Hardware ........................................................................................................................... 15
2.4.2 Installing the Driver Software........................................................................................................... 17
2.4.3 Uninstalling the Driver Software ...................................................................................................... 18
2.5 I/O BOARD ....................................................................................................................................................... 19
2.5.1 Installing Hardware ........................................................................................................................... 19
2.5.2 Installing the Driver Software........................................................................................................... 21
2.5.3 Uninstalling the Driver Software ...................................................................................................... 23
2.6 ETHERNET COMMUNICATION SOFTWARE .............................................................................................. 24
2.7 3D LASER VISION SENSOR........................................................................................................................... 25
2.7.1 Installing the 3D Laser Vision Sensor Software ............................................................................... 25
2.7.2 Uninstalling the 3D Laser Vision Sensor Software .......................................................................... 25
2.8 ETHERNET ....................................................................................................................................................... 27
2.8.1 Connecting Cables .............................................................................................................................. 27
2.8.2 Determining the IP Address............................................................................................................... 27
2.8.3 Making Settings for the Robot Side................................................................................................... 27
2.8.4 Making Settings for the 3D Laser Vision Sensor Side...................................................................... 28
2.8.5 Communication Test........................................................................................................................... 29
2.9 CONNECTING THE CAMERA ....................................................................................................................... 30
2.9.1 Setting the Camera ............................................................................................................................ 30
2.9.2 APC-3322A.......................................................................................................................................... 31
2.10 PROTECTOR..................................................................................................................................................... 35
3 3D LASER VISION SENSOR BASIC OPERATIONS ..................................................36
3.1 START 3D LASER VISION SENSOR.............................................................................................................. 37
3.2 EXIT 3D LASER VISION SENSOR................................................................................................................. 37
3.3 SELECT A CAMERA........................................................................................................................................ 37
3.4 SNAP A NEW IMAGE FROM A CAMERA .................................................................................................... 37
3.5 DISPLAY LIVE IMAGES ................................................................................................................................. 38
3.6 CHANGING THE SHUTTER SPEED MODE FOR SNAP AND LIVE .......................................................... 38
3.7 TURN ON THE LASER.................................................................................................................................... 39
3.8 ENLARGING AND REDUCING IMAGES ..................................................................................................... 39
3.9 MOVING IMAGES ........................................................................................................................................... 39
3.10 ERASING ALARMS ......................................................................................................................................... 40
3.11 PAINTING AN IMAGE WITH A RED PEN .................................................................................................... 40
CONTENTS B-81444EN/03
A ERROR CODES.........................................................................................................193
1 PREFACE
This chapter describes the overview of this manual, outline of the
FANUC 3D Laser Vision Sensor, and safety precautions which should
be noted before operating this robot.
Overview
The "FANUC VISION V-500iA/3DL (3D Laser Vision Sensor)
Operator's Manual" describes how to operate the 3D Laser Vision
Sensor controlled by the R-J3 or R-J3iB control unit.
In this manual, only the operation and programming techniques for the
dedicated sensor functions are explained, assuming that the installation
and setup of the robots having the 3D Laser Vision Sensor are completed.
Refer to the "HANDLING TOOL Operations Manual" for other operations of
FANUC Robots.
CAUTION
This manual is based on 3D Laser Vision Sensor
software version 1.02.0118. Note that, depending on
the version, functions and setting items not described
in this manual may be available, or some notational
differences may be present.
Related manuals
The following manuals are available for the 3D Laser Vision Sensor.
R-J3iB Controller HANDLING TOOL Operations Manual (B-81464EN-2)
    Topics: Robot functions, operations, programming, interfaces, alarms
    Use: Applicable design, robot installation, teaching, adjustment

FANUC VISION V-500iA/3DL Operator's Manual (B-81444EN) (this manual)
    Topics: 3D Laser Vision Sensor function, operation, programming, alarms
    Use: Teaching, adjustment

R-J3iB Maintenance Manual (B-81465EN)
    Topics: Installation and set-up, connection to peripheral equipment, maintenance of the system
    Use: Installation, start-up, connection, maintenance

Force Sensor / 3D Laser Vision Sensor Mechanical Unit Maintenance Manual (B-81155EN)
    Topics: Connection of the sensors, robot, and control devices; maintenance of the sensors
    Use: Connection and maintenance of the sensors
System configuration
3D Laser Vision Sensor system includes a 3D Laser Vision Sensor
with the mechanical units and the robot controller.
Peripheral equipment and external control equipment can also be
added.
(Figure: system configuration showing the robot controller and the mechanical unit.)
Installing and connecting Install and connect the 3D Laser Vision Sensor on the floor.
3D vision sensor For details, see Chapter 2, "SETUP".
Robot setting
Set a robot controller to be connected with the 3D Laser Vision
Sensor.
For details, see Chapter 5, "ROBOT COMMUNICATION
SETTING".
Camera setup
Based on the camera setup and location tool settings mentioned above,
a vision process is created for an actual measurement. A vision
process for 2D measurement and a vision process for 3D measurement
must be created separately.
For details, see Chapter 8, "TEACHING AND EXECUTING THE 2D
VISION PROCESS", and Chapter 9, "TEACHING AND TESTING
THE 3D VISION PROCESS".
2 SETUP
This chapter describes the following operating procedures, which are
necessary to set up the 3D Laser Vision Sensor system.
2.1 CONFIGURATION
(Figure: system configuration. The R-J3iB controller connects to the PC over a 100Base-T Ethernet cable; the PC's frame grabber board connects, via the adapter cable and camera adapter, to the camera, and the sensor cable connects to the 3D Laser Vision Sensor.)
The following table lists the necessary devices and their quantity.
Device Quantity
1 Personal computer 1 set
2 Frame grabber board 1 set
3 3D Laser Vision Sensor 1 each
4 Camera adapter 1 each
5 Camera cable 1 set
6 Adapter cable 1 each
7 Sensor cable 1 set
8 I/O board 1 set
9 PC I/O cable 1 each
10 Ethernet cable 1 each or more (depending on the network configuration)
2.3 DISPLAY
(1) Click the right button of the mouse (right-click the mouse) on the
desktop at a portion where nothing is displayed.
(2) Select [Properties]. The [Display Properties] dialog box
appears.
(3) Select the [Settings] tab.
(4) Click the [Colors] button and [Screen area] button. A list of
modes appears. Select, from the list, a mode in which the color
palette supports 65536 colors or more, the desktop area supports
1024 × 768 pixels or more, and the refresh rate that matches the
specification of the monitor used with the PC.
(5) Click the OK button to apply the setting and close the dialog
box.
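The requirements in step (4) can be expressed as a simple check. The sketch below is purely illustrative (it is not part of the 3D Laser Vision Sensor software) and validates a display mode against the minimums stated above: a color palette of at least 65536 colors and a desktop area of at least 1024 × 768 pixels.

```python
# Hypothetical validator for the display requirements of Section 2.3:
# at least 65536 colors (16-bit) and a desktop area of at least 1024 x 768.
def display_mode_ok(width: int, height: int, colors: int) -> bool:
    """Return True if the display mode meets the manual's minimums."""
    return width >= 1024 and height >= 768 and colors >= 65536

# Examples:
print(display_mode_ok(1024, 768, 65536))    # meets the minimums
print(display_mode_ok(800, 600, 16777216))  # desktop area too small
```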
(6) On the screen below, choose “Search for a suitable driver for my
device [recommended]” then click Next >.
(7) On the screen below, check “CD-ROM drives” then click Next
>.
For details such as the settings of the status switches and jumper pins
on the frame grabber board, and insertion into the PCI bus slot, refer
to the applicable frame grabber board and PC operator's manuals.
NOTE
If new hardware detection is not performed
automatically, choose [Start] → [Settings] → [Control
Panel] → [Add/Remove Hardware].
Installing
To install the driver software, follow this procedure:
(1) Get a 3D Laser Vision Sensor software installation CD-ROM and
insert it into the CD-ROM drive of the PC.
(2) Execute SETUP.EXE in the folder D:\AvalData*** from
"Run…" (D:\ indicates the name of the CD-ROM drive and ***
indicates the version of the driver software.)
(3) Install the software as directed in the menu that appears.
(6) On the screen below, choose “Search for a suitable driver for my
device [recommended]” then click Next >.
(7) On the screen below, check “CD-ROM drives” then click Next
>.
Refer to the applicable I/O board and PC operator's manuals for details
of each step of the setup procedure, such as setting the jumper pins on
the I/O board and attaching the I/O board to the PCI bus.
NOTE
If new hardware detection is not performed
automatically, choose [Start] → [Settings] → [Control
Panel] → [Add/Remove Hardware].
Installing
To install the driver, follow this procedure:
(1) Get a 3D Laser Vision Sensor software installation CD-ROM and
insert it into the CD-ROM drive of the PC.
(2) Execute SETUP.EXE in the folder D:\CONTEC API-DIO
*****\Disk1 from "Run…" (D:\ indicates the name of the
CD-ROM drive and ***** indicates the version of the driver
software.)
(3) Install the software as directed in the menu that appears.
(4) Upon completion of driver installation, choose [Start] →
[Programs] → [CONTEC API-PIC(W32)] → [API-TOOL
Configuration]. The board is automatically recognized. The
screen below shows an example of how the board is recognized.
2.8 ETHERNET
(4) Set a name for the controller in "Robot name". (Any name, for
example "a," is acceptable.)
(5) Set the IP address of the controller in "Robot IP addr".
(6) Set up a subnet mask in "Subnet Mask".
(7) Set the name and IP address of the controller, the name and IP
address of the router (if the router is available), and the name and
IP address of the PC to which the 3D Laser Vision Sensor is
connected in “Host Name (LOCAL)" and "Internet Address”.
(8) Turn the power for the robot controller off and on again.
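Unless a router is in between, steps (5) to (7) above require the robot controller and the PC to be on the same IP subnet. A quick way to sanity-check the addresses before cycling controller power is sketched below; the addresses shown are placeholders, not values prescribed by this manual.

```python
import ipaddress

def same_subnet(robot_ip: str, pc_ip: str, subnet_mask: str) -> bool:
    """Check whether the robot controller and the PC share one subnet."""
    robot_net = ipaddress.IPv4Network(f"{robot_ip}/{subnet_mask}", strict=False)
    return ipaddress.IPv4Address(pc_ip) in robot_net

# Example with placeholder addresses:
print(same_subnet("192.168.0.10", "192.168.0.20", "255.255.255.0"))  # True
print(same_subnet("192.168.0.10", "192.168.1.20", "255.255.255.0"))  # False
```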
The models that can be used in this system include the following
seven: XC-ST70, XC-ST50, XC-ST30, XC-ES50, XC-ES30,
XC-EI50, and XC-EI30 from SONY.
2.9.2 APC-3322A
When a camera adapter and adapter cable recommended in Subsection
2.1.4, "Camera Adapter and Adapter Cable" are used, up to three
cameras for 2D image processing can be connected in addition to one
3D Laser Vision Sensor.
However, the method of connection depends on the model selected.
CAUTION
If the trigger signal line is not connected correctly, the
shutter speed switching function of the camera does not
operate normally.
TIP
The reason why the trigger signal line is connected to
the VIDEO2 terminal when the DC-700 is connected to
the sensor head of the 3D Laser Vision Sensor is that a
camera cable dedicated to the 3D Laser Vision Sensor
is used for the camera cable (ultimately connected to
the sensor head) of the wrist section of the robot.
When a commercially available camera cable (see
Subsection 2.1.5 "Camera Cable") is used, for
example, to install the 3D Laser Vision Sensor on a
fixed-installation basis or to check for a broken cable
temporarily, make a connection to the TRIG terminal.
(Figure: connection of the frame grabber board to camera adaptors #2, #3, and #4. Each adaptor connects its camera through DC IN and VIDEO 2 / SYNC; the VIDEO 1 lines to the frame grabber board are red (#2), orange (#3), and yellow (#4), and each adaptor also carries HD (blue), VD (purple), and TRIG (green) lines.)
CAUTION
If the trigger signal is not set correctly, the shutter
speed control function of the camera does not operate
normally.
TIP
The reason why the trigger signal is set using pin 9
when the camera connection section of the APU-332 is
connected to the sensor head of the 3D Laser Vision
Sensor is that a camera cable dedicated to the 3D
Laser Vision Sensor is used for the camera cable
(ultimately connected to the sensor head) of the wrist
section of the robot. When a commercially available
camera cable (see Subsection 2.1.5, "Camera Cable")
is used, for example, to install the 3D Laser Vision
Sensor on a fixed-installation basis or to check for a
broken cable temporarily, use pin 11 for setting.
CAUTION
Before setting the trigger output signal, be sure to
unplug the power socket of the APU-332.
(Figure: connection through the camera adapter. The 3D vision sensor connects to CAM1 and cameras #2 to #4 connect to CAM2 to CAM4, each with DC IN and /SYNC lines; the adapter's BOARD connector feeds the frame grabber board (channel CH4 shown).)
2.10 PROTECTOR
Clicking the OK button in the dialog box closes the dialog box. In
this case, nothing is detected.
If no protector is connected to the PC, stop the 3D Laser Vision
Sensor, connect the protector correctly, and then start the 3D Laser
Vision Sensor again.
If you want to use a printer, attach its printer cable to the far end of the
protector. In this case, the printer can be used in the same way as
when it is connected directly to the printer port.
CAUTION
Turn off the power to the personal computer before
connecting the protector.
Select [Snap] from the [View] menu. Or, click the button on
the tool bar.
Select [Live] from the [View] menu. Or, click the button on the
tool bar. The button remains pressed and a live image is
displayed.
To stop displaying live image, click the button to snap the
image.
The shutter speed mode of the camera for snapping images and
displaying live images can be changed. The current mode is
displayed in the lower-right corner of the screen. By choosing from
the modes displayed below, the shutter speed of the camera for
subsequent image snapping and live image display is changed.
Moreover, when a program is executed, the display is changed
according to the setting of the program (see Chapter 8, "TEACHING
AND EXECUTING THE 2D VISION PROCESS", and Chapter 9,
"TEACHING AND TESTING THE 3D VISION PROCESS").
Select [Laser] from the [View] menu. Or, click the button on
the tool bar.
To turn off the laser, click the button again.
While the 3D Laser Vision Sensor is finding the object (for example,
while a vision process is running), "Finding" appears at the lower
right of the screen.
4 VISION DATA
The data used for setting or training the 3D Laser Vision Sensor is
called vision data.
Choose [3DV] or [2DV] from [View] on the menu bar, or
click the (3D measurement) button or the (2D
measurement) button on the tool bar. The list of vision data is
displayed to enable vision data to be taught.
There are four types of vision data, as described below.

Camera setup
    Camera setup data is used to convert position information detected on an image into robot coordinates. There are three types of calibration methods for the 3D Laser Vision Sensor: 3D measurement calibration, simple 2-point calibration, and grid pattern calibration.
    The robot controller may use a camera setup name for execution, so set a camera setup name using "alphanumeric characters" or "half-size kana characters". Up to 12 half-size characters may be used.

Location tool
    Location tool data is a 2D image processing tool that finds the taught shape on an image and recognizes its position. The data that describes the shape of an object to be found is called a model. The object to be found is taught as a model for the location tool, and whether or not it is found is checked.
    An arbitrary location tool name can be set using any characters that can be entered. No limit is imposed on the number of characters.

Program (vision process)
    The program defines what the 3D Laser Vision Sensor does according to a request from the robot controller. The program is also called the vision process to distinguish it from the robot controller program. When the camera setup and location tool have been selected and the program is started by the robot controller, the program finds the part by executing the location tool, converts the found position into robot coordinates using the camera setup, and then returns the result to the robot controller.
    The robot controller may use a program name for execution, so set a program name using "alphanumeric characters" or "half-size kana characters". Up to 12 half-size characters may be used.

Robot
    Robot data describes the robot controller that communicates with the 3D Laser Vision Sensor via Ethernet.
    An arbitrary robot name can be set using any characters that can be entered. No limit is imposed on the number of characters.
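The naming rules repeated above for camera setup and program names (alphanumeric characters or half-size kana, at most 12 half-size characters) can be expressed as a small check. This is an illustrative sketch, not a function of the 3D Laser Vision Sensor software; the half-size (half-width) katakana range used here, U+FF61 to U+FF9F, is an assumption about what the manual means by "half-size kana".

```python
def valid_vision_name(name: str) -> bool:
    """Check a camera setup or program name against the manual's rules:
    alphanumeric or half-width katakana only, at most 12 characters."""
    if not 0 < len(name) <= 12:
        return False
    return all((c.isascii() and c.isalnum()) or "\uFF61" <= c <= "\uFF9F"
               for c in name)

print(valid_vision_name("CAM1"))        # True
print(valid_vision_name("A" * 13))      # False: too long
print(valid_vision_name("part-setup"))  # False: '-' is not allowed
```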
On this screen, you can newly create, delete, copy, or rename vision
data.
All of the settings of taught camera setup data, location tool data,
vision process data, and robot data are saved in the following folder
set on the option setting screen (see Section 11.1 "OPTION
SETTING"):
[Data folder]
All files in this folder contain user-taught data. So, such data, when
lost, cannot be restored. For safety, it is recommended to back up all
files contained in this folder.
The settings of camera setup data, location tool data, vision process
data, and robot data can be checked by opening each setting screen.
Besides, the settings can be checked by creating and opening a list file
(CSV format) containing the settings of all data.
Choose [Output Taught Data] from [Tool] on the menu bar. The
screen shown below is displayed. In a folder specified here, a file
listing all taught data is recorded.
The standard save destination is [Data folder] on the option setting
screen (see Section 11.1, "OPTION SETTING").
In the case of simultaneous execution, the files below are also saved at
the same time. These files are related to the bin picking function.
Item                        Description
Overall search program      SearchProgram(date-time).csv (Example: SearchProgram2003-08-27-13-32-22.csv)
Trial measurement program   TrialProgram(date-time).csv (Example: TrialProgram2003-08-27-13-32-22.csv)
Overlap detection program   OverlapProgram(date-time).csv (Example: OverlapProgram2003-08-27-13-32-22.csv)
Fine detection program      FineProgram(date-time).csv (Example: FineProgram2003-08-27-13-32-22.csv)
Environment data            Environment(date-time).csv (Example: Environment2003-08-27-13-32-22.csv)
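Judging from the examples above, the (date-time) part of each file name follows a year-month-day-hour-minute-second pattern. A sketch of how such names could be generated (the format string is inferred from the examples, not documented in this manual):

```python
from datetime import datetime

def taught_data_filename(prefix: str, when: datetime) -> str:
    """Build a file name like SearchProgram2003-08-27-13-32-22.csv."""
    return f"{prefix}{when.strftime('%Y-%m-%d-%H-%M-%S')}.csv"

stamp = datetime(2003, 8, 27, 13, 32, 22)
print(taught_data_filename("SearchProgram", stamp))
# SearchProgram2003-08-27-13-32-22.csv
```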
Item Description
Enable Communication  If you check this item, Ethernet communication with this robot is enabled. If the
                      robot controller specified by the IP address and the PC are connected (see Section
                      5.2, "CONNECTION STATUS"), the 3D Laser Vision Sensor process can be
                      invoked from the robot controller, and if the robot is selected in the 3D Laser
                      Vision Sensor process, the found result and found position are sent from the 3D
                      Laser Vision Sensor to the robot controller. If you do not check this item, no
                      communication is performed even when the robot is selected in the 3D Laser
                      Vision Sensor process.
Protocol              Choose “RobotServer” or “R90”. Both are communication protocol names.
                      Choosing “R90” speeds up communication unless multiple robot controllers
                      are connected. If multiple robot controllers are connected, choose
                      “RobotServer”.
IP Address            IP address of the robot controller to connect to.
IP Address IP address of the robot controller to connect.
To cancel the setting, click the Cancel button. The setting returns to
the one before modification, and the screen display returns to the
screen for setting communication with the robot controller.
CAUTION
You must not specify the same IP address for multiple
robots. Similarly, you must not create multiple robot
entries with the same IP address. Otherwise, when the
robot controller invokes the 3D Laser Vision Sensor
process, the same 3D Laser Vision Sensor process is
invoked twice.
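A configuration check like the one sketched below can catch the duplicate-IP mistake this CAUTION warns about before it causes a double invocation. The robot list shown is a hypothetical example, not data from this manual.

```python
def duplicate_ips(robots: dict) -> set:
    """Given {robot name: IP address}, return IP addresses used more than once."""
    seen, dupes = set(), set()
    for ip in robots.values():
        if ip in seen:
            dupes.add(ip)
        seen.add(ip)
    return dupes

robots = {"Robot1": "192.168.0.10", "Robot2": "192.168.0.11",
          "Robot3": "192.168.0.10"}  # Robot3 repeats Robot1's address
print(duplicate_ips(robots))         # {'192.168.0.10'}
```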
The figure above shows alarm history information displayed when the
3D Laser Vision Sensor software is started in the normal state.
The robot name is "Robot".
“Receive a reply” indicates that the network can be used between the
personal computer and robot controller. “Connection established”
indicates that communication is established between the 3D Laser
Vision Sensor software and robot controller.
The figure above shows alarm history information displayed when the
3D Laser Vision Sensor software is started with the power to the robot
controller turned off.
“No reply” indicates that the network cannot be used between the
personal computer and robot controller, and the result is that the 3D
Laser Vision Sensor software has failed to establish communication
with the robot controller.
TIP
If the state of communication between the personal
computer and robot controller is unstable (for example,
when the cable is partly broken and the broken point is
connected or disconnected from time to time), “Receive
a reply” and “No reply” are recorded in the alarm history
from time to time.
If the 3D Laser Vision Sensor abruptly stops returning
the result of detection to the robot or the 3D Laser
Vision Sensor abruptly consumes a longer processing
time, for example, check the state of communication
reported in the alarm history.
6 CALIBRATION
The 3D Laser Vision Sensor finds the target workpiece in the image
received from the camera and informs the robot controller of the
location of the workpiece. For the robot controller to perform
position compensation, the positional data on the image measured
with the 3D Laser Vision Sensor must have been converted to
positional data represented using the robot coordinate system (base
frame or user frame). Performing this data conversion requires
information about where, in the tool coordinate system or user
coordinate system, the 3D Laser Vision Sensor is mounted and in which
direction it is looking at the workpiece. The operation performed to
set up this information is called 3D Laser Vision Sensor calibration.
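To illustrate the kind of conversion involved (in the 2D case), the sketch below derives a similarity transform from two point pairs, in the spirit of the simple 2-point calibration described later in this chapter. It is an illustration of the underlying math only, not the sensor software's actual algorithm, and it ignores lens distortion.

```python
# Map image pixels to robot XY coordinates from two calibration point pairs.
# Complex numbers encode a 2D rotation plus scaling compactly.
def two_point_transform(p1, p2, r1, r2):
    """p1, p2: (x, y) in pixels; r1, r2: (x, y) in robot mm.
    Returns a function mapping pixel coordinates to robot coordinates."""
    p1, p2 = complex(*p1), complex(*p2)
    r1, r2 = complex(*r1), complex(*r2)
    a = (r2 - r1) / (p2 - p1)  # rotation and scale
    b = r1 - a * p1            # translation
    def to_robot(p):
        z = a * complex(*p) + b
        return (z.real, z.imag)
    return to_robot

# Example: 0.5 mm/pixel, no rotation, origin offset (100, 200) mm.
to_robot = two_point_transform((0, 0), (200, 0), (100.0, 200.0), (200.0, 200.0))
print(to_robot((100, 0)))  # (150.0, 200.0)
```

Placing the two calibration points far apart, as this chapter recommends, reduces the effect of touch-up error on the computed scale `a`.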
TIP
This calibration must be performed at all times in the
following cases:
- When the 3D Laser Vision Sensor is used for the first
time, for example, in the initial installation or
immediately after the sensor is replaced
- When the relative positional relationship between the
camera unit and laser unit may have changed for a
cause such as a collision
When the 3D Laser Vision Sensor is installed or
uninstalled, calibration is not required if no modification
is made to the 3D Laser Vision Sensor itself.
Grid spacing
Enter the grid point interval of the grid pattern jig to be used.
For information about a grid point interval, see Appendix B,
"CALIBRATION JIG". (Default value: 15 mm)
Camera type
The type of camera used with the 3D Laser Vision Sensor is
“SONY ES-30”. (Default value: “SONY ES-30”)
Usually, this value need not be changed. If it must be
changed for some reason, click the Change… button, then choose a
desired type of camera on the screen shown below. Choose a
desired type of camera from Camera Type then click the Close
button.
Grid position
Enter the grid pattern position of the calibration jig. Usually,
+50 mm is to be set in Plane1, and -50 mm is to be set in Plane2.
For information about the position of the calibration jig specified
by +50 mm and -50 mm, see Subsection 6.2.2, "Detecting a Grid
Pattern and a Laser". (Default values: +50 mm in Plane1 and
-50 mm in Plane2) The setting of Plane1 is reflected when you
click the Snap1 button. The setting of Plane2 is reflected when
you click the Snap2 button.
First, put the grid pattern plate of the calibration jig in the place closer
to the camera (+50 mm), and click the Snap1 button. A dialog box
for teaching the window area appears. Teach the position of the
window so that the grid pattern fits in the window, and click the OK
button.
- Check that red lines representing laser beam points are displayed
across the window.
The figure below shows an example of correct detection.
Next, place the grid pattern plate of the calibration jig in the place
farther from the camera (-50 mm), then click the Snap2 button. The
dialog box for teaching the window is displayed. Teach the window
properly so that the grid pattern plate fits nicely, then click the OK
button.
Make the same checks as done when you click the Snap1 button.
This completes calibration.
If detection has not been performed correctly, start all over again.
At the same time, the detail data of the calibrated camera can be
checked.
Camera data
Information such as the scale and lens distortion of the calibrated
camera is displayed. Check “Focal length” and “Lens
distortion” according to the following guideline:
Camera position
The position and posture of the camera in the sensor coordinate
system are displayed. For information about the sensor
coordinate system, see Section 6.6, "SENSOR COORDINATE
SYSTEM".
CAUTION
These parameters, when tuned properly, can produce a
better calibration result. However, the precision of
calibration can deteriorate, depending on the tuning of
these parameters.
If the laser beam cannot be calibrated properly,
measures such as adjusting the illumination
environment and moving the robot to another position
should be taken before tuning these parameters.
CAUTION
The 3D Laser Vision Sensor can handle only the X-Y
plane of the robot user coordinate system; for instance,
the Y-Z plane and the Y-X plane (the reverse side of the
X-Y plane) cannot be handled.
Be sure to perform calibration at the same height as an
actual workpiece. If the calibration plane is not at the
same height as an actual workpiece, an error can
result.
From the Camera No. drop-down list, select the number of the camera
you want to calibrate.
Robot to get pos. is used to get the current robot position via
Ethernet communication during camera calibration teaching.
If you click the Select button, the list of robots is displayed as follows.
Select the robot from the list.
Prepare two points in the field of view of the camera that the robot
can touch up with its TCP.
The two points must be at the same height as the workpiece. You will
get better accuracy when you place the two points as far apart from
each other as possible, as shown below.
(Figure: two touch-up points P1 and P2, marked with ×, placed far apart within the camera's field of view (FOV).)
Robot position
When the personal computer and robot controller are connected
with each other via Ethernet, the current robot position can be
obtained by using the Record button instead of by reading X and
Y of the current position. Click the Record button while the
center of point 1 or point 2 is touched up by the TCP of the
robot.
CAUTION
Before the robot position can be obtained,
communication must be enabled. For the setting of
communication, see Chapter 5, "ROBOT
COMMUNICATION SETTING". Moreover, before
obtaining the current robot position, check that a user
coordinate system and tool coordinate system are
selected correctly on the robot controller side.
Find
If the object representing point 1 or point 2 is a circle or ellipse,
the object can be detected and the center of point 1 or point 2 can
be automatically set by clicking the Find button, instead of by
clicking the Select button and setting the center of point 1 or
point 2 by dragging the red + mark. If the object is not a circle
or ellipse, do not use this method, because precise detection
cannot be performed.
A grid pattern calibration uses a grid pattern jig of the default shape.
For details of the grid pattern jig, see Appendix B, "CALIBRATION JIG".
Be careful about the following points when you use grid pattern
calibration for a 2D application:
NOTE
When the workpiece height does not match the height
after calibration, a parallax error occurs. Place the
calibration fixture at the same height as the workpiece.
It is assumed that the workpiece height corresponds to Z
= 0 in the user coordinate system. Set up the user
coordinate system so that it matches the workpiece
height.
Select the [CameraSetup] tab from the vision data list, and
double-click the camera calibration type to be taught to the robot, then
select the grid pattern calibration as the desired calibration type. You
will see the screen as shown below.
3. Enter the interval between grid points of a grid pattern jig in Grid
spacing.
4. Click the Change button to select the type of a camera to be used.
After selecting the camera type from Camera Type, click the
Close button.
Click the Grid Pattern Frame button to set up the grid pattern frame,
that is, the information about where the calibration jig is placed in
the robot coordinate system when you calibrate the camera.
The 3D Laser Vision Sensor provides the two methods of setting up the
grid pattern frame shown below.
Method   Description
Direct   Directly enter the location and orientation of the grid frame in the
         robot user coordinate system (see Appendix B, "CALIBRATION JIG")
         in XYZWPR format.
3-point  Enter X, Y, and Z of three points that indicate the coordinates
         of the grid pattern (origin, X-axis point, and Y-axis point). The
         3D Laser Vision Sensor computes the location and orientation
         of the grid frame.
Direct method
Select "Direct" from Setup method.
3-point method
Select "3-points" from Setup method.
Move the robot to touch up the origin, X-axis and Y-axis points of the
calibration jig by its TCP, and enter X, Y and Z of each robot position.
(For the origin, X-axis point, and Y-axis point, see Appendix B,
"CALIBRATION JIG".)
While the TCP of the robot is touching up each point, click the Record
button to read the current robot position into the setting.
Under Offset of origin, enter the offset from the touched up origin to
the grid pattern origin, which can be obtained from the drawing of the
calibration jig.
After you complete the teaching, click the Show details button to
display the Calibration data details screen, and confirm that the focal
length value is reasonable. For the Calibration data details screen,
see also Section 6.5, "SHOW DETAILS OF CAMERA
CALIBRATION DATA".
CAUTION
Regardless of whether “Direct” or “3-points” is chosen, communication must be enabled
before the robot position can be obtained.
For the communication setting, see Chapter 5, "ROBOT
COMMUNICATION SETTING". Also, before obtaining the current
robot position, check that the correct user coordinate system and tool
coordinate system are selected on the robot controller side.
(Figure: calibration jig touch-up points — origin, X-axis point, and Y-axis point)
Click the Show details button. You will see the screen as shown
below.
Data displayed for simple 2-point calibration differs from data
displayed for grid pattern calibration.
Item            Description
Camera data     Scale, lens distortion, and so on of the calibrated camera.
Camera position Location and orientation of the camera in the robot
                coordinate system.
(Figure: sensor coordinate system — the reference standoff along Z is
approximately 400 mm, depending on the sensor type)
CAUTION
The term "sensor coordinate system" is used in two ways:
A. Coordinate system set by calibration as described here.
B. Value indicating the position of the coordinate system of
“A.” above relative to the mechanical interface coordinate
system of the robot (value set as a tool coordinate system).
Practically, both “A.” and “B.” represent the same thing.
However, be careful not to confuse one with the other.
(Figure: coordinate frames Σf, Σs, and Σu, with transforms ① to ⑤
between them. Measurement: Output ① = ⑤ × ② = ④ × ③ × ②;
Output ② = ②.)
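The chained relation in the figure — the measurement output expressed as a product of intermediate frame transforms — can be illustrated with generic homogeneous transforms. The frames and values below are illustrative only, not taken from the figure:

```python
# A generic sketch of homogeneous-transform composition: chaining frame
# transforms multiplies their 4x4 matrices, just as the measurement output
# above is the product of the intermediate transforms.

def matmul(a, b):
    """Multiply two 4x4 homogeneous transforms."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Pure-translation homogeneous transform."""
    return [[1, 0, 0, x],
            [0, 1, 0, y],
            [0, 0, 1, z],
            [0, 0, 0, 1]]

# Composing two translations chains their offsets.
t = matmul(translation(10, 0, 0), translation(0, 5, 0))
print(t[0][3], t[1][3])  # 10 5
```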
TIP
A sensor coordinate system must always be set in the
following cases:
- After performing 3D measurement calibration. (For
details, see Section 6.2, "3D MEASUREMENT
CALIBRATION".)
- If installation position reproducibility is doubtful, as
when the 3D Laser Vision Sensor is installed and
removed without positioning pins. The setting is
unnecessary if simple re-calibration can be
performed. (For details, see Section 6.7, "SIMPLE
RE-CALIBRATION".)
TIP
In the setting of a sensor coordinate system, the
position of the calibration jig need not be reproduced
precisely. However, if it can be reproduced, the
program described later can be reused to speed up
restoration work, for example when the sensor is
replaced and a sensor coordinate system needs to be
set again.
When
PCVIS RUN CALIB_MNT* [ST=R[1],OF=PR[1]]
(where * represents 1, 2, or E)
of the program is executed, the following screen is displayed:
Set the window so that it contains the entire grid pattern.
When all points are successfully detected, the screen shown below is
displayed.
By adjusting the position and brightness, retry until all dots are
detected successfully. Upon successful completion, a measurement
is automatically made using a laser beam.
CAUTION
Each measurement made by the sensor coordinate
system setting function fails with an error unless all
dots are detected. Adjust the position and brightness
so that all dots can be detected.
Sample program
The sample program for setting a sensor coordinate system uses two
register areas and one position register area. In the sample program,
these registers are referred to as follows:
Register : R[1], R[2]
Position register : PR[1]
When an area other than these areas is used, modify the program
accordingly, observing the caution described below.
These register areas are used for the following purposes:
R[1] : Measurement processing status
R[2] : Calibration jig dot interval (in mm)
PR[1] : Measurement processing results
CAUTION
When using registers not used in this sample program,
be sure to use a series of registers. This means that
when R[N] is used for measurement processing status
setting, R[N+1] must always be used for calibration jig
dot interval setting.
! Setting Sensor Frame
!
UFRAME_NUM = 0
UTOOL_NUM = 1
!
! Grid pattern pitch
R[2] = 15.0
J P[1] 30% FINE
R[1] = (-1)
PAUSE
UTOOL[1] = PR[1]
ABORT
LBL[999]
UALM[1]

Notes:
- In R[2], specify the grid pattern point interval to be used.
- In P[1] to P[2], teach positions for calibration jig measurement.
  The distance between the calibration plate and sensor is
  approximately the reference standoff.
- Set P[1] and P[2] so that only the R posture component differs
  (important).
P[1] : Posture in which the optical axis of the camera unit of the 3D
Laser Vision Sensor faces downward directly
P[2] : Posture in which only R is turned 90°
P[3] : Posture in which the sensor is inclined by a proper amount
and R is turned by -135° from P[1]
P[1]{
GP1:
UF : 0, UT : 1, CONF : 'N U T, 0, 0, 0',
X= 927.23 mm, Y= 20.86 mm, Z= -94.86 mm,
W= 0.00 deg, P = 0.00 deg, R= -90.00 deg
};
P[2]{
GP1:
UF : 0, UT : 1, CONF : 'N U T, 0, 0, 0',
X= 927.23 mm, Y= 20.86 mm, Z= -94.86 mm,
W= 0.00 deg, P = 0.00 deg, R= 0.00 deg
};
P[3]{
GP1:
UF : 0, UT : 1, CONF : 'N U T, 0, 0, 0',
X= 926.09 mm, Y = 13.07 mm, Z = -98.48 mm,
W= -26.00 deg, P = 6.00 deg, R= 135.00 deg
};
CAUTION
Proper position data depends on how the 3D Laser
Vision Sensor is installed on the robot; the postures
above are not necessarily appropriate in all cases.
However, the condition "Set P[1] and P[2] so that
only the R posture component differs." indicated in
the sample program is required at all times.
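The required condition can be checked mechanically. The sketch below is an illustrative helper, not part of the product software, that verifies two XYZWPR postures differ only in the R component:

```python
# Illustrative check of the condition above: two XYZWPR positions must be
# identical except for R. Position values are hypothetical.

def differs_only_in_r(p1, p2, tol=1e-3):
    """p1, p2: dicts with keys X, Y, Z, W, P, R (mm and degrees)."""
    same = all(abs(p1[k] - p2[k]) <= tol for k in ("X", "Y", "Z", "W", "P"))
    return same and abs(p1["R"] - p2["R"]) > tol

p1 = {"X": 927.23, "Y": 20.86, "Z": -94.86, "W": 0.0, "P": 0.0, "R": -90.0}
p2 = {"X": 927.23, "Y": 20.86, "Z": -94.86, "W": 0.0, "P": 0.0, "R": 0.0}
print(differs_only_in_r(p1, p2))  # True
```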
Next, create a vision process for measuring the calibration jig (see
Chapter 9, "TEACHING AND TESTING THE 3D VISION
PROCESS"). In CameraSetup, specify the camera calibration data
subject to simple re-calibration. This measurement uses "disk
measurement" (see Section 9.2, "DISK MEASUREMENT") with 2D
measurement set to "Use". The circular window cannot be used.
Set the model origin of the 2D measurement location tool precisely at
the center of the central dot of the calibration jig by using the image
enlargement function (see Section 3.8, "ENLARGING AND
REDUCING IMAGES") and the Center button (see Section 7.2,
"MODIFYING LOCATION TOOL MODEL").
The reference data and the results of simple re-calibration are stored in
a file named (camera-calib-name).csv in the folder named 3DVision
under the data folder specified on the option setting screen.
Sample program
In simple re-calibration, five register areas and one position register
area are used. In the sample program introduced later, these registers
are referred to as follows:
Register : R[5] to R[9]
Position register : PR[R[6]]
When an area other than these areas is used, modify the sample
program accordingly. These register areas are used for the following
purposes:
R[5] : Tool coordinate system (sensor frame) number
R[6] : Number of the position register for measurement results
R[7] : Measurement processing status
R[8] : Grid spacing (in mm)
R[9] : Calibration mode (1: acquire reference data, 0: execute calibration)
The registers and position register above are used temporarily. The
values set in the registers need not be preserved.
CAUTION
Do not modify the position data, user coordinate
system, and tool coordinate system that are used for
execution in the reference data acquisition mode.
If any of these is modified, re-calibration cannot be
executed precisely in the calibration execution mode.
!
! PERCH CALIBRATION
! (sample program)
!
PAUSE
!
! Make sure a calibration target
! plate is fixed on an appropriate
! position, then go ahead.
!
PAUSE
!
! Confirm the following settings.
!
! R[5] : UTOOL_NUM of sensor
! frame
! R[6] : free position register
! number
! R[8] : grid space (mm)
! R[9] : calibration mode
! 1 : get standard data
! 0 : do calibration
!
R[5] = 1
R[6] = 100
R[8] = 15
R[9] = 0
!
UFRAME_NUM = 0
UTOOL_NUM = R[5]
!
! Point 1
! Caution : Orientation must be
! the same as Point 2.
!
J P[1] 30% FINE
WAIT 1.00(sec)
R[7] = (-1)
PCVIS RUN PROGRAM_PCH1 [ST=R[7],OF=PR[R[6]]]
WAIT R[7] <> (-1)
IF R[7] = 0,JMP LBL[999]
!
! Point 2
! Caution : Orientation must be
! the same as Point 1.
!
J P[2] 100% FINE
WAIT 1.00(sec)
R[7] = (-1)
PCVIS RUN PROGRAM_PCH2 [ST=R[7],OF=PR[R[6]]]
WAIT R[7] <> (-1)
IF R[7] = 0,JMP LBL[999]
ABORT
LBL[999]
UALM[1]

Notes:
- Teach calibration jig measurement positions in P[1] and P[2].
  The positions are to be within about ±50 mm of the reference
  standoff.
- The posture must be set to directly face the calibration jig along
  the view line. The posture components (W, P, R) of these two
  points must be identical (important).
- The PCVIS RUN lines call the program for 3D measurement of the
  calibration jig mentioned earlier. Specify
  (vision-process-name)_PCH1 and _PCH2; here, "PROGRAM" is the
  vision process name.
Simple re-calibration
(1) Conditions and restrictions
- The position and posture of the calibration jig can be reproduced
precisely between the reference data acquisition mode and the
calibration execution mode.
- The settings of the user coordinate system and tool coordinate
system (sensor coordinate system) are the same between the
reference data acquisition mode and the calibration execution
mode.
- The position and posture of the robot are the same between the
reference data acquisition mode and the calibration execution
mode.
- The reference data acquisition mode is executed while the sensor
is precisely calibrated with respect to the fixed calibration
jig.
(2) Advantages
- The need to set a sensor coordinate system is eliminated.
- Each vision process need not reacquire nominal data.
- Fully automated program-based calibration is enabled at all times
including immediately after sensor replacement.
- A level of precision almost equivalent to that obtained in 3D
calibration can be obtained.
3D measurement calibration
(1) Conditions and restrictions
- After execution, in general, the sensor coordinate system needs
to be set again, and each vision process needs to reacquire
nominal data.
(2) Advantages
- Even in situations where simple re-calibration cannot be
performed, as listed below, calibration can be performed as long
as a calibration jig is available:
- When the 3D Laser Vision Sensor is installed for the first time
- When the environment does not allow the calibration jig for
simple re-calibration to be fastened, or the position and posture
of the calibration jig cannot be reproduced
- When any of the user coordinate system setting data, sensor
coordinate system setting data, or robot position data set in the
reference data acquisition mode is lost or modified
B-81444EN/03 7.TEACHING AND TESTING THE LOCATION TOOL
Click the down arrow to the right of the camera number combination
box, and select the number of the camera you want.
Click the button to display the camera live image, place a part at
the center of the field of view of the camera, and then click the
button to snap a picture of the part.
Click the Train button. A red rectangle will be plotted on the image as
shown below.
You can move the rectangle by dragging inside it, or resize it by
dragging the handles at its corners. Adjust the rectangle to enclose
your part, and then click the OK button in the dialog box.
CAUTION
The size of the rectangle enclosing a part is related to
"OverlapLimit" (see Section 7.4, "OPTIONAL
FUNCTION OF LOCATION TOOL").
Match the rectangle to the size of the workpiece
whenever possible.
Next, you can mark an area inside the rectangle that you do not want
to include in your model pattern. This is called the "Don't care
region". Move your mouse on the image while pressing the left
button; the area turns red. The red area will not be included in the
model pattern. If there is anything extra in the background of a part,
or if a part has an extra feature or stain that other parts usually do not
have, such foreign material can be excluded from the model by
coloring it red. Move your mouse while pressing the right button to
turn the area back to gray scale. If you have a don't care region,
color it red and then click the OK button in the dialog box.
Otherwise, just click the OK button.
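Conceptually, the don't care region is a mask that excludes pixels from the taught pattern. The sketch below illustrates the idea only; it is not the actual training code:

```python
# Conceptual sketch of a "don't care region": pixels marked in the mask
# are simply excluded from the model pattern during teaching.

def build_model(image, dont_care):
    """Collect (row, col, value) features, skipping masked pixels."""
    model = []
    for r, row in enumerate(image):
        for c, value in enumerate(row):
            if not dont_care[r][c]:
                model.append((r, c, value))
    return model

image = [[10, 20],
         [30, 40]]
mask = [[False, True],    # exclude the stain at (0, 1)
        [False, False]]
print(build_model(image, mask))  # [(0, 0, 10), (1, 0, 30), (1, 1, 40)]
```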
TIP
The thickness of a red pen for painting an area or for
restoring an area painted in red to the original state
can be chosen from the following three options:
- "Thin" when the mouse is used for painting
without using any key
- "Medium" when the mouse is used while holding
down the Cntl key
- "Thick" when the mouse is used while holding
down the Shift key
Choose an option suitable for an area to be taught.
The 3D Laser Vision Sensor starts learning the model pattern. When
finished, the red [Not Trained] will change to green [Trained].
TIP
The software diagnoses the taught model here and
displays any information to which you should pay
attention.
The same information as introduced in "Diagnosis" in
Subsection 7.2.1, "Model Display" is displayed; see
that subsection.
In the 2D measurement mode, more than one model can be taught, and
it is possible to find out which of the taught models corresponds to a
detected part. In this case, each model must be assigned a separate
type number. Executing the 2D vision process informs the robot
controller of the type number of the detected model, thus enabling the
type of the detected model to be identified.
When you click the OK button to exit from the screen, the taught data
is saved.
When you click the Cancel button to exit from the screen, the taught
data is lost.
NOTE
You do not have to train the model under exactly the
same conditions as in the execution period. The
conditions include the lighting, the workpiece surface,
and the background of the image.
If you train the model with a clearly visible workpiece
and little noise, you will get better results.
For example, if the workpiece has a reflective or dirty
surface, you may get better results by using a mock-up
workpiece with a plain surface.
Train
By clicking the Train button, the model can be taught again
according to Section 7.1, "TRAIN MODEL". The image
displayed at that time is used for teaching.
Mask
By clicking the Mask button, a feature to be ignored can be
painted using a red pen. To paint, move the mouse while
holding down the button. Mask modification changes only the
painted area; other information items such as the model origin
are not modified.
Origin
Enter the coordinates of the model origin by specifying numeric
values.
Change
By using the cursor on the image, set the position of the model
origin.
Center
The software calculates the rotation center of the model pattern
and sets the model origin at the rotation center.
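A simplified stand-in for this calculation is shown below. Taking the centroid of the feature points is an illustrative approximation; the actual software may estimate the rotation center differently:

```python
# Simplified stand-in for the Center function: use the centroid of the
# model's feature points as the rotation center.

def centroid(points):
    """Average of 2-D feature point coordinates."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

features = [(0, 0), (4, 0), (4, 2), (0, 2)]
print(centroid(features))  # (2.0, 1.0)
```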
Show
A taught model is displayed. For details, see Subsection 7.2.1,
"Model Display".
Use reference point(s)
When a particular function is used, a point needs to be set in
addition to a model origin. Such a point is referred to as a
reference point. Checking this check box enables a reference
point to be set. For details, see Subsection 7.2.2,"Setting a
Reference Point".
Upon completion of modification, click the OK button in the dialog
box.
7.2.1 Model Display
Plot features
Checking this check box displays the feature points (green) of the
model on this screen.
Model contrasts
The contrasts of a taught model are displayed. Max., Min., and
Ave. indicate the highest, lowest, and average contrast of the
taught model, respectively.
Diagnosis
The software diagnoses the taught model and displays any
information to which you should pay attention. Information
displayed here is also displayed at the time of model teaching.
TIP
Depending on the model, the following message may be displayed:
- The position cannot be determined with the trained model pattern. Modify the part for
the model pattern.
- The found position may be unreliable with the trained model pattern. In such a case,
consider using the extra care region or modifying the part for the model.
- The orientation cannot be determined with the trained model pattern. Uncheck the
checkbox for the Angle in the parameter tab.
- The found orientation may be unreliable with the trained model pattern. In such a
case, consider using the extra care region or modifying the part for the model.
- The size cannot be determined with the trained model pattern. Uncheck the
checkbox for the Size in the parameter tab.
- The found size may be unreliable with the trained model pattern. In such a case,
consider using the extra care region or modifying the part for the model.
- The contrast of the trained pattern is low. Adjust the contrast threshold.
The location tool has several parameters for specifying the search
area and detection thresholds.
Select the [Parameters] tab. You will see the screen as shown below.
The following table lists the meaning of each parameter, the range of
its values, and how to set them.
Item Description
Number to find Number of objects you want to find. The 3D Laser Vision Sensor attempts to
find as many objects as possible within the specified number. (Default: 1)
To change the default, enter a new value on the screen.
Accept Value from 1 to 100 that represents the threshold between success and failure.
If the score of the found object is equal to or greater than this value, the search
is successful; otherwise, it is unsuccessful. The score is between 1 and 100.
(Default: 75)
If a small value is set, a detection error can occur, or the detection time can
increase.
Contrast thresh Threshold of contrast used during a search. The default is 30, which can
change depending on the model.
Set a higher value if the 3D Laser Vision Sensor finds stains or other wrong
objects with low contrast.
Usually, you do not need to change this value.
Search area Area to be searched on the image. (The default is the entire image.)
When a smaller area is selected, the processing speeds up. To change the
search area, click the Change button. A rectangle is displayed on the image.
Make an adjustment in the same way as for model teaching.
Bit-mapped window Used when an arbitrary search area, such as a circular or doughnut-shaped
search area, is to be specified. (The default is the unused state.)
To use this window, click the Edit button and make an adjustment using a red
pen in the same way as for "Mask" for a model.
To return the window to the unused state, click the Remove button.
Search angle Range of angle to search for. (Default: -180 to 180 degrees) The range must
be within -180 to 180 degrees. The smaller the range, the faster the
processing. Uncheck this item when the part is a circle or the part does not
rotate.
Search size Range of size to search for. (Default: 90 to 110%) The range must be within
50 to 200 percent of the taught size. The smaller the range, the faster the
processing. Uncheck this item when you do not want to search for various
sizes. This parameter is useful when the distance between the camera and
object is not stable and the size of the object in the image changes subtly.
Search inclination Inclination to be used for search. (Default: 0 to 30 degrees; however, this
parameter is disabled by default.)
An angle from 0 to 60 degrees can be set, with the inclination of the object at
the time of teaching being 0. When a smaller range is specified, the
processing speeds up. When the check box is unchecked, inclination search
is not performed. This function is useful when the directions of the camera
and object are unstable so that a slightly different shape is viewed.
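The interaction of "Accept" and "Number to find" can be sketched as follows. The scores and data structure are hypothetical; this is not the sensor's internal code:

```python
# Illustrative interaction of "Accept" and "Number to find": candidates at
# or above the accept threshold are kept, best scores first, up to the
# requested count.

def filter_results(candidates, accept=75, number_to_find=1):
    """candidates: dicts with a 'score' key in 1..100."""
    passed = [c for c in candidates if c["score"] >= accept]
    passed.sort(key=lambda c: c["score"], reverse=True)
    return passed[:number_to_find]

candidates = [{"id": 1, "score": 92}, {"id": 2, "score": 60},
              {"id": 3, "score": 81}]
print(filter_results(candidates, accept=75, number_to_find=2))
# [{'id': 1, 'score': 92}, {'id': 3, 'score': 81}]
```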
When you click the OK button to exit from the screen, your changes
are saved.
When you click the Cancel button to exit from the screen, your
changes are lost.
Location tool has some optional functions for optimizing its operation
for a specific application.
Select the [Options] tab. You will see the screen as shown below.
The table below indicates the meaning, specifiable value range, and
setting method of each parameter.
Item Description
OverlapLimit If patterns found in the image overlap each other to a certain degree or more,
the location tool keeps only the pattern with the highest score and rejects the
others. An overlap is determined using the overlapping area of the
rectangles enclosing the models. (Default: 75%)
When 100% is set, rejection is disabled even if there are overlapping patterns.
FitErrorLimit Allowable degree of figure distortion between the model pattern and a pattern
appearing in the image, specified in picture elements (pixels). (Default: 1.5 pix)
A larger value allows a greater figure distortion; as a result, detection errors
can occur, so set a value that matches the actual figure distortion.
A larger value may be appropriate if the feature section is soft and can actually
deform compared with the model, or if the workpiece is thick or strongly
inclined while the distance between the camera and workpiece is short.
TimeoutTime If detection processing takes longer than the value set here, the detection
processing ends unsuccessfully. (Default: 30,000 ms)
If 0 is set, detection processing is performed without a time limit.
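The OverlapLimit behavior can be sketched as follows. The overlap measure used here, intersection area over the smaller rectangle, is an illustrative assumption, not necessarily the sensor's exact formula:

```python
# Illustrative OverlapLimit-style rejection: when two found rectangles
# overlap beyond the limit, only the higher-scoring one is kept.

def overlap_ratio(a, b):
    """a, b: (left, top, right, bottom). Intersection over smaller area."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    if w <= 0 or h <= 0:
        return 0.0
    smaller = min((a[2]-a[0]) * (a[3]-a[1]), (b[2]-b[0]) * (b[3]-b[1]))
    return (w * h) / smaller

def reject_overlaps(results, limit=0.75):
    """Keep higher-scoring results; drop any that overlap a kept one."""
    kept = []
    for r in sorted(results, key=lambda r: r["score"], reverse=True):
        if all(overlap_ratio(r["rect"], k["rect"]) < limit for k in kept):
            kept.append(r)
    return kept

results = [{"score": 90, "rect": (0, 0, 10, 10)},
           {"score": 80, "rect": (1, 1, 11, 11)},   # heavy overlap: rejected
           {"score": 70, "rect": (50, 50, 60, 60)}]
print([r["score"] for r in reject_overlaps(results)])  # [90, 70]
```

With limit set to 1.0, nothing is rejected, mirroring the "100% disables rejection" behavior described above.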
Color Description
Green Feature point for which the corresponding feature point was found in the image
Red Feature point for which the corresponding feature point was not found in the image
When an extra care region is used, the object is not detected unless
the extra care region is also detected; even if the object itself is
detectable, the object is not detected when the extra care region
cannot be detected. The detection threshold of an extra care region
is set separately from the threshold of the entire object.
Parameter Description
Accept If the score of a detected extra care region is equal to or greater than this value,
the detection is successful; otherwise, it is not. Set a value from 1 to 100.
(Default: 70)
Show
When you click this button after several detection operations, a
screen as shown below is displayed.
Find
On the image currently displayed, detection is executed.
Snap+Find
After an automatic snap operation, detection is executed.
Continuous
After automatic snap operation, the detection process is executed
in succession.
The results of detection are plotted on the image, and data items
such as processing time, position, angle, and score are displayed
as shown above. Vt and Hz represent a position as coordinate
values in the vertical and horizontal directions, respectively,
with the upper-left corner of the image being the origin.
The processing time displayed here does not include the time for
snapping an image and posting results; only the time required for
detection is displayed.
Show runner-up
If this check box is checked, an object that is not detected but
almost satisfies the required score, contrast, angle, size, and so
forth is displayed as a detection result.
This setting is valid for this detection test only.
An object that is not detected with a slight deviation is displayed
with a yellow rectangle in the image, and its data is indicated in
red in the table. By comparing the results of detection indicated
in red with each parameter of the location tool, the parameter(s)
that caused the object to be not detected can be identified.
CAUTION
This function displays objects that are not detected
because they fail to satisfy the specified parameter
values in the final check of the detection results.
For example, when the score threshold is 70, this
function does not guarantee the display of every
object whose score is from 60 to less than 70.
ImageFeatures
From the currently displayed image, feature points to be used for
detection processing are detected and displayed in yellow. The
result of this display is affected by the search window,
bit-mapped window, and contrast threshold. In particular, this
function is useful for contrast threshold adjustment.
B-81444EN/03 8.TEACHING AND EXECUTION THE 2D VISION PROCESS
Applied setting
Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken at different shutter speeds and processed to
produce an image with a wider dynamic range than a single image taken at
one shutter speed. This mode is robust against environmental changes, but
the detection time increases by several hundred milliseconds.
Single mode An image is taken according to the setting of "Speed const.", and detection is
performed. Select a value from 0 (bright) to 16 (dark) to set "Speed const.".
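The multi-mode idea can be illustrated with a toy two-exposure fusion. This is a conceptual sketch of widening the dynamic range, not the sensor's actual algorithm:

```python
# Conceptual two-exposure fusion: pixels saturated in the bright exposure
# are taken from the darker one, widening the usable dynamic range.

SATURATED = 255

def fuse(bright, dark):
    """Per-pixel: prefer the bright exposure unless it is saturated."""
    return [[d if b >= SATURATED else b for b, d in zip(br, dr)]
            for br, dr in zip(bright, dark)]

bright = [[120, 255],
          [255, 90]]
dark   = [[60, 180],
          [200, 45]]
print(fuse(bright, dark))  # [[120, 180], [200, 90]]
```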
Image Saving
Select a condition for saving an image. Select one of the
following three options:
Option Description
Don't save The vision process does not save images.
Save if failure When the vision process fails to find, it saves the image to a BMP file.
Saving such images is useful for later investigation.
Usually, select this option.
Save always Whenever the vision process runs, it saves the image to a BMP file.
This is useful for evaluating the stability of the system just after setting up the
line. Select "Save if failure" after you verify the system stability, because this
option uses a large amount of hard disk space.
TIP
The amount of available hard disk space is limited.
Using too much hard disk space may cause problems
such as the PC failing to boot up. The 3D Laser
Vision Sensor has a limit on the number of image files
and does not save images beyond the limit. You can
change the limit on the Options screen. See Chapter 10,
"Options" for details.
Output Type
Select an output format used for the search result from the two
formats shown below.
Option Description
2-D Location and Orientation The 3D Laser Vision Sensor returns the 2-D location and
orientation of the found parts to the robot position register. X and Y indicate the
location, R indicates the orientation, and Z, W, and P are always zero.
3-D Line The 3D Laser Vision Sensor returns, to the robot position register, the 3-D line
passing through the camera center point and the model origin of the found part.
X, Y, and Z indicate the intersection of the line and the calibration plane, W and
P indicate a vector along the line, and R is 0 when the option switch "Add angle
information to results" is disabled. When the option switch is enabled, the
detected angle is reflected.
Options
Select other options. There are the following options.
Options Description
Log the resulting data When this switch is checked, each time the vision process is executed,
the processing time, found position, and other resulting data are logged in a file.
The file is created under the name "(vision-process-name).csv" in the "2dvision"
folder in the folder specified in Data folder on the Options screen. (See Chapter
11, "Options".) This file can be displayed as a graph with Excel or another
spreadsheet application.
Send model ID to robot When this switch is checked, the model ID of a found location tool is
sent to the robot controller. This option switch is used to identify the model
when you specify multiple location tools in the vision process.
A model ID is specified on the location tool training screen. The model ID is
stored in the register with the same number as the position register in which the
found position is stored.
Sort results by size When two or more results are detected, they are usually output in
descending order of score. If this item is checked, however, the results are
output in order of size rather than score.
Add angle information to results Displayed only when "3-D Line" is selected. If this item
is checked, rotation angle information is reflected in the final result of detection.
When you click OK to exit from the screen, the taught data is saved.
When you click Cancel to exit from the screen, the taught data is lost.
Place a part in the field of the camera view, and click the Run button.
The found position is plotted on the image.
The resulting values are displayed on the run-time monitor.
To display the run-time monitor, select [Monitor] from the [View]
menu of the 3D Laser Vision Sensor.
The time displayed on the run-time monitor is the total processing
time of the vision process, including the time for snapping the image.
However, the test does not perform communication, so the
displayed time does not include the communication time.
Image
Select an image subject to test execution.
Option Description
Snap from a camera In test execution, an image is snapped from the camera.
Load BMP files In test execution, an image file stored on hard disk is used. For details, see
Subsection 8.2.1, "Use BMP Files".
Continuous Run
This function is enabled when “Snap from a camera” is set.
Check this item and click the Run button. Image snapping and
detection are repeatedly executed.
Repeated execution lasts until you click Cancel in the following
dialog box:
Icon Description
Executes in succession in reverse order.
Executes using the immediately following image and stops execution temporarily.
Stops execution temporarily.
Executes using the immediately preceding image and stops execution temporarily.
Executes in succession.
or
Location Description
R[n] A number indicating whether the measurement is successful is stored. When
the measurement is successful, the detection count is stored; when it is
unsuccessful, 0 is stored.
R[m] ~ If "Send model ID to robot" is checked, as many type numbers as the detection
count are stored in registers starting with R[m]. When detection is
unsuccessful, or when "Send model ID to robot" is unchecked, no data is stored.
PR[m] ~ As many measurement results as the detection count are stored in XYZWPR
format in position registers starting with PR[m]. When detection is
unsuccessful, no data is stored.
1: UTOOL_NUM = 1
2: R[1]=(-1)
3: PCVIS RUN PROGRAM1 [ST=R[1], OF=PR[2]]
4: WAIT R[1]<>(-1)
5: IF R[1] = 0, JMP LBL[99]
:
10: ! ERROR ABORT
11: LBL[99]
[End]
TIP
In the case of a 2D vision process, measurement
results are output in a "sensor coordinate system" at all
times.
CAUTION
Results sent from the 3D Laser Vision Sensor to the
robot controller are not compensation data but
measured position data. For the method of calculating
compensation data, see Appendix C, "ROBOT
POSITION COMPENSATION".
CAUTION
Ensure that before the execution of the vision process
for the 3D Laser Vision Sensor started by the robot
controller is completed, the robot controller does not
start another vision process for the 3D Laser Vision
Sensor (prohibition of duplicate execution). In other
words, check first that after starting a vision process for
the 3D Laser Vision Sensor by the robot controller,
information on whether a measurement made by the
vision process is successful or not has been sent to a
robot register, then start the next vision process for the
3D Laser Vision Sensor.
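The interlock described in this CAUTION mirrors the R[1]=(-1) / WAIT R[1]<>(-1) pattern of the sample program above. The following is a hedged sketch only: the class and method names are hypothetical and are not part of any FANUC API; they simply model the rule that a new vision process may be started only after the previous one has written its result register.

```python
# Hypothetical sketch (not FANUC API): the robot-side interlock described in
# the CAUTION above. The result register is set to -1 before the vision
# process is started; the next start is allowed only after the sensor has
# written the result back, prohibiting duplicate execution.
class VisionInterlock:
    def __init__(self):
        self.result_register = None   # models R[n]; None = no process started yet

    def start_process(self):
        # A second start while R[n] is still -1 would overlap the running
        # vision process, so it is rejected.
        if self.result_register == -1:
            raise RuntimeError("previous vision process still running")
        self.result_register = -1     # corresponds to R[1]=(-1) in the TP program

    def receive_result(self, detection_count):
        # The sensor writes the detection count (or 0 on failure) into R[n].
        self.result_register = detection_count

lock = VisionInterlock()
lock.start_process()
try:
    lock.start_process()              # overlapping start: rejected
except RuntimeError:
    pass
lock.receive_result(2)                # measurement succeeded, 2 workpieces found
lock.start_process()                  # now a new process may be started
```

The same sequencing holds regardless of how many robots receive the results: the controller that started the sensor is the one that must wait for the register write.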
B-81444EN/03 9.TEACHING AND TESTING THE 3D VISION PROCESS
Option                     Description
Disk measurement           Measures and outputs the position and posture of a workpiece.
Displacement measurement   Measures and outputs the displacement (Z coordinate) of a workpiece. When 2D measurement is used together in this measurement, X, Y, and R coordinates are also output.
Edge measurement           Measures and outputs the edge point of a laser beam radiated onto a workpiece.
Cross section measurement  Measures and outputs a particular position, such as the summit of a cross section, based on the cross section figure of a workpiece obtained using a laser beam.
TIP
Measurement without 2D image processing (that is, measurement based on laser beam image processing only) is possible. In this case, the position is assumed to be the point of intersection of the two laser beam lines, and the rotation θ is assumed to be 0. Because only a laser beam is used, this type of measurement has the advantage that it can be made in darkness. However, actual position and rotation information cannot be obtained.
9.2.2 Settings
3D setting
The 3D measurement setting is the basic setting for obtaining 3D
information by using a laser beam. This setting must not be omitted.
Camera Setup
Select a type of camera setup to be used by the vision process.
Clicking the Select button displays a list of camera setup names.
Select a desired camera setup name and click the OK button.
Function
This setting item is used to select a measurement function to be used by the vision process. Select displacement measurement or disk measurement, whichever is desired.
Window setting
This function enables only the necessary portion of the laser
information to be taken out. Not all parts of the laser beam
radiated toward the workpiece strike it. Windows are used to
exclude the laser beam radiated to any object other than the
workpiece.
Click the Window setting button. A dialog box for teaching the
window area appears. Proceed as directed in the dialog.
Parameter
Performing 3D measurement may require various adjustments according to the object to be measured. Usually, the parameters need not be changed, but adjust them as required. Clicking the Parameter button opens the following form.
Items                  Description
Distance Threshold     Threshold of the distance between the two straight lines obtained from the two laser beams. If the distance is longer than the threshold, the two lines are regarded as not lying on one plane, resulting in a failure in plane detection. (Default: 3.0 mm)
Brightness Threshold   Threshold for laser beam extraction. If a laser beam point string is undetectable because of a bright environment, decrease this value within the range that ensures correct point string detection. If a reflected laser beam is too strong and glares, increase this value. (Default: 50)
                       When making an adjustment, find a threshold that can separate a laser beam from the background according to Section 9.8, "SHOWING BRIGHTNESS DISTRIBUTION".
Slit Points Threshold  Threshold for extraction of a straight line obtained by the laser. If the plane to be measured is small, decrease this value within the range that ensures correct detection. (Default: 50)
Slant Threshold        This parameter limits the detected tilting angle. If the measurement result indicates that the workpiece is tilted more than this angle, an error is reported. (Typical value: 90 degrees)
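The acceptance checks implied by the Distance Threshold and Slant Threshold parameters can be sketched as below. This is an illustration only: the function name and the way the geometry is summarized into two scalars are assumptions, not the sensor's actual implementation.

```python
# Hedged sketch of the plane-detection acceptance checks described above.
DISTANCE_THRESHOLD_MM = 3.0   # "Distance Threshold" default
SLANT_THRESHOLD_DEG = 90.0    # "Slant Threshold" typical value

def plane_detection_ok(line_distance_mm, tilt_deg):
    """Reject the plane if the two laser lines are farther apart than the
    distance threshold, or if the detected tilt exceeds the slant threshold."""
    if line_distance_mm > DISTANCE_THRESHOLD_MM:
        return False          # the two lines do not lie on one plane
    if abs(tilt_deg) > SLANT_THRESHOLD_DEG:
        return False          # workpiece reported as tilted beyond the limit
    return True

assert plane_detection_ok(1.2, 15.0) is True
assert plane_detection_ok(4.5, 15.0) is False   # lines too far apart
assert plane_detection_ok(1.2, 95.0) is False   # tilt beyond slant threshold
```

Loosening either threshold trades robustness for precision, which is why the defaults should normally be left unchanged.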
After you finish with adjustments, click the OK button to apply what
you specified. If you want to discard what you specified, click the
Cancel button to close the form.
Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi Mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by about several hundred milliseconds.
Single mode An image is taken according to the settings of “Speed const.” and “Snap times”,
and detection is performed.
Select a value from 0 (bright) to 16 (dark) to set “Speed const.”. When 0 is set,
the same brightness as set by “Not Use” is produced.
“Snap times” represents the number of snapping operations. If the state of laser
beam reception is poor as in the case of viewing a dark workpiece, the
measurement environment can be improved by specifying a number greater than
1. However, as a greater number is specified, a longer detection time is required.
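One plausible reading of "Snap times" is that several frames are snapped and combined so that a weak laser line stands out from random noise. The averaging below is an assumption used for illustration, not the sensor's documented algorithm.

```python
# Illustrative sketch only: combine several snapped frames by per-pixel
# averaging so that random noise is suppressed while the (repeatable) laser
# line brightness is preserved.
def combine_snaps(frames):
    """Average per-pixel brightness over all snapped frames."""
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(width)]
            for y in range(height)]

# Two noisy 1x3 frames of the same scene: the middle pixel is the laser line.
frame_a = [[10, 200, 14]]
frame_b = [[14, 210, 10]]
combined = combine_snaps([frame_a, frame_b])
# combined == [[12.0, 205.0, 12.0]]
```

This also shows why a larger "Snap times" value lengthens the detection time: each extra snap adds one full image acquisition.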
2D setting
When a 3D measurement is made, 2D measurement information can
be used for position identification. A setting for this purpose is made
in 2D measurement setting. When 2D information is not required,
choose “Not Use” in the selection of “Use / Not Use” below.
(Figure: a 3D vision sensor above a workpiece, showing the 2D measurement features, the 3D measurement plane, the gap (+), and an example section.)
Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by about several hundred milliseconds.
Single mode An image is taken according to the setting of “Speed const.”, and detection is
performed. Select a value from 0 (bright) to 16 (dark) to set “Speed const.”.
When 0 is set, the same brightness as set by “Not Use” is produced.
Others
Robot to send to
Select a robot controller to which the results of vision process
execution are to be sent. Clicking the Add button displays a list
of robots. Select the name of a transmission destination robot
then click the OK button. Multiple robots can be selected.
When multiple robots are selected, the results of vision process
execution are sent to all selected robot controllers.
Output frame setting
The results of 3D Laser Vision Sensor measurement need to be related to the coordinate system of the robot. Here, select one of the three options indicated below to determine which information is used to establish this relationship.
Option Description
Sensor frame Position and posture information in the sensor coordinate system is output.
This option may be used in an application in which the 3D Laser Vision Sensor is
installed at a fixed position and the sensor coordinate system is set to the user
coordinate system.
User frame: Conv with robot CurPos  By using the current robot position at the time of measurement, this option converts output information to position and posture information in the user coordinate system. Select this option usually.
                                    Select a sensor coordinate system as the tool coordinate system and select an appropriate user coordinate system before starting the vision process for the robot.
User frame: Conv with robot PosReg  By using the value of the position register of the robot controller set by “PosReg”, this option converts output information to position and posture information in the user coordinate system.
                                    This option may be used in an application that makes a measurement after holding a workpiece then corrects the position where the workpiece is placed.
                                    Before starting the vision process for the robot, store the position in the position register while an appropriate user coordinate system is selected and a sensor coordinate system is selected as the tool coordinate system.
CAUTION
The term "robot" used here means a robot controller
that started the 3D Laser Vision Sensor. The 3D
Laser Vision Sensor obtains the current position and
position register information from the robot controller
that started the 3D Laser Vision Sensor. Do not
confuse this robot controller with “Robot to send to”,
although both represent the same robot controller in
many cases.
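The frame conversion behind "User frame: Conv with robot CurPos" can be sketched as a pose composition: the measurement expressed in the sensor frame is mapped into the user frame through the robot's current position. Planar (x, y, θ) poses are used here for brevity; the real conversion operates on full XYZWPR data, and the function name is illustrative only.

```python
import math

# Conceptual sketch: compose a pose expressed in frame A with the pose of
# frame A itself, yielding the pose in the outer (user) frame.
def compose(pose_a, pose_b):
    """pose_a: frame A in the user frame; pose_b: measurement in frame A."""
    ax, ay, at = pose_a
    bx, by, bt = pose_b
    c, s = math.cos(at), math.sin(at)
    return (ax + c * bx - s * by, ay + s * bx + c * by, at + bt)

# The robot's current position places the sensor frame at (100, 50),
# rotated 90 degrees, in the user frame.
sensor_frame_in_user = (100.0, 50.0, math.pi / 2)
# Measurement result in the sensor frame: 10 mm along the sensor X axis.
measured_in_sensor = (10.0, 0.0, 0.0)
measured_in_user = compose(sensor_frame_in_user, measured_in_sensor)
# measured_in_user is approximately (100.0, 60.0, pi/2)
```

The PosReg variant is the same composition, except that the stored position register value stands in for the current robot position.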
Option Description
Don’t save No image is saved.
Save if failure When a vision process is executed, images not detected are saved in files.
Select this option usually.
By saving an image not detected, the cause can be investigated later. For
investigation of the cause, test execution should be performed by using an image
file and adjusting the parameters. For test execution using an image file, see
Subsection 9.6.2, "Use BMP Files".
Save always Image files are saved each time a vision process is executed. This option is
useful in a stage where detection stability is evaluated, for example, immediately
after system startup.
After normal operation starts, switch the setting so that images are saved only
when undetected.
CAUTION
The hard disk has a limited capacity. If it is fully used up, trouble such as a failure in starting up the personal computer can arise.
Options
Set up option switches. There are two option switches as listed
below.
Items Description
Log the resulting data Each time the vision process is executed, the processing time, found position,
and other resulting data are logged in a file. The file is created under the
name "vision-process-name.csv" in the “3dvision” folder in the folder specified
in Data folder on the Options screen. (See Chapter 11, "Options”.) A .csv
file can be displayed as a graph with Excel or another spreadsheet application.
Measure narrow area For a workpiece whose laser beam radiation area is narrow, a failure is likely to
occur.
In such a case, this option may be able to stabilize detection. This option
increases the detection time by about several hundred milliseconds.
If you click the OK button to exit from this dialog box, the taught data
is saved. If you click the Cancel button to exit the dialog box, the
taught data is lost.
Test
Image
Select an image subject to test execution.
Option Description
Snap from a camera In test execution, an image is snapped from the camera.
Load BMP files In test execution, an image file stored on hard disk is used. When 2D
measurement is used, “BMP File(2D)” is to be set for detection. When 2D
measurement is not used, “BMP file(Slit1)” is to be set for detection. For
details, see Subsection 9.6.2, "Use BMP Files".
Continuous Run
This check box is enabled when “Snap from a camera” is
selected. For details, see Subsection 9.6.1, "Continuous Run".
Get base position data
If “Use” is selected for 2D measurement as described earlier, the
window for the laser is dynamically moved for 3D measurement
based on the detection results of 2D measurement. So, the
detected 2D measurement feature needs to be related to the
relative position of the window for the laser. This operation is
conducted by test execution performed by checking this check
box. The result is not saved until you click the OK button or
Apply button. When data is acquired, the date and time of
acquisition is displayed.
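The relationship captured by "Get base position data" can be sketched as follows: the laser window is tied to the 2D feature, so at run time it is shifted by however far the detected feature has moved from the stored reference position. The names below are illustrative, not the software's internal terms.

```python
# Hedged sketch of the dynamic window shift described above.
def shift_window(window, base_feature, detected_feature):
    """Translate the taught window by the 2D feature's displacement from
    its reference (base) position."""
    dx = detected_feature[0] - base_feature[0]
    dy = detected_feature[1] - base_feature[1]
    x, y, width, height = window
    return (x + dx, y + dy, width, height)

taught_window = (120, 80, 60, 40)      # window taught over the laser beam
base_feature = (150, 100)              # feature position stored by this check box
detected_feature = (165, 90)           # feature found in the current image
runtime_window = shift_window(taught_window, base_feature, detected_feature)
# runtime_window == (135, 70, 60, 40)
```

This also explains the CAUTION that follows: without valid base position data, the displacement is computed from a meaningless reference and the window lands in the wrong place.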
CAUTION
Be sure to acquire reference position data in 3D
- 121 -
B-81444EN/03 9.TEACHING AND TESTING THE 3D VISION PROCESS
After the window for the laser is taught, the following can occur:
- You want to make a modification to the window for the laser.
- You did not acquire base position data.
In these cases, actions described below can be taken.
Color Description
Red (thick) Points extracted as a laser beam point string and used for calculation. If points
other than a point string such as noises are displayed in this color, the precision
may have deteriorated. In such a case, a parameter adjustment is required.
Light blue (thin) Points extracted as a laser beam point string but not used for calculation. In
some cases, points extracted correctly may be displayed in this color.
TIP
Measurement without 2D image processing (that is, measurement based on laser beam image processing only) is possible. In this case, position and posture information other than the Z coordinate cannot be obtained. Because only a laser beam is used, this type of measurement has the advantage that it can be made in darkness.
9.3.2 Settings
3D setting
The 3D measurement setting is the basic setting for obtaining 3D
information by using a laser beam. This setting must not be omitted.
Camera Setup
Select a camera setup option to be used with the vision process.
Click the Select button to display a list of camera setup options.
Select a desired camera setup name then click the OK button.
Function
This setting item is used to select a measurement function to be
used with the vision process. Select Displacement.
Window setting
This setting item is used to collect only the required portion of the laser information. The laser beam is not always confined to the workpiece; part of it may fall outside the workpiece. A window is used to exclude from processing any laser beam radiated outside the workpiece.
Clicking the Window setting button displays a dialog box for
teaching a window. Follow the instruction of the dialog box.
Parameter
For 3D measurement, various adjustments may need to be made
for a measurement object. Usually, the parameters need not be
adjusted. However, adjust parameters if necessary. Click the
Parameter button. The form shown below appears.
Items                         Description
Brightness Threshold          Threshold for laser beam extraction. If a laser beam point string is undetectable because of a bright environment, decrease this value within the range that ensures correct point string detection. If a reflected laser beam is too strong and glares, increase this value. (Default: 50)
                              When making an adjustment, find a threshold that can separate a laser beam from the background according to Section 9.8, "SHOWING BRIGHTNESS DISTRIBUTION".
Displacement dispersion(STD)  This parameter is valid when the option switch “Select slit points using dispersion of displacement” in “Others” is checked. Because of noise on the image, some points may represent an exceptional displacement when compared with others. This parameter is used to statistically ignore those points. When a smaller value is set, it is easier to remove noise components. However, necessary data is discarded if the measurement range varies. (Default: 3 STD, where STD means the standard deviation.)
Displacement range(mm)        This parameter is valid when the option switch “Select slit points using dispersion of displacement” in “Others” is checked. This parameter is used to set a range of displacement for an actual measurement object. Change the setting of this parameter to match the measurement object. (Default: 100 mm)
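The dispersion-based point selection can be sketched as a standard-deviation filter: points whose displacement deviates from the mean by more than "Displacement dispersion(STD)" standard deviations are ignored. The exact statistic the sensor uses is not documented here, so treat this as one plausible reading.

```python
import statistics

# Hedged sketch of "Select slit points using dispersion of displacement".
DISPERSION_STD = 3.0   # "Displacement dispersion(STD)" default

def filter_displacements(values, n_std=DISPERSION_STD):
    """Keep only points within n_std standard deviations of the mean."""
    mean = statistics.fmean(values)
    std = statistics.pstdev(values)
    if std == 0:
        return list(values)
    return [v for v in values if abs(v - mean) <= n_std * std]

# Twenty points on a flat surface at 10 mm plus one noise spike at 100 mm.
points = [10.0] * 20 + [100.0]
kept = filter_displacements(points)
# The 100 mm spike lies far outside 3 standard deviations and is removed.
```

This is also why the manual warns that necessary data is discarded if the measurement range varies: a genuinely stepped workpiece produces a wide legitimate spread that the same statistic would trim.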
Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by about several hundred milliseconds.
Single mode An image is taken according to the settings of “Speed const.” and “Snap times”,
and detection is performed.
Select a value from 0 (bright) to 16 (dark) to set “Speed const.”. When 0 is set,
the same brightness as set by “Not Use” is produced.
“Snap times” represents the number of snapping operations. If the state of laser
beam reception is poor as in the case of viewing a dark workpiece, the
measurement environment can be improved by specifying a number greater than
1. However, as a greater number is specified, a longer detection time is required.
2D setting
When a 3D measurement is made, 2D measurement information can
be used for position identification. A setting for this purpose is made
in 2D measurement setting. When 2D information is not required,
choose “Not Use” in the selection of “Use / Not Use” below.
Location Tool
Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by about several hundred milliseconds.
Single mode An image is taken according to the setting of “Speed const.”, and detection is
performed. Select a value from 0 (bright) to 16 (dark) to set “Speed const.”.
When 0 is set, the same brightness as set by “Not Use” is produced.
Others
Set “Other” according to each application.
However, the setting of “Robot to send to” must not be omitted.
Robot to send to
Select a robot controller to which the results of vision process
execution are to be sent. Clicking the Add button displays a list
of robots. Select the name of a transmission destination robot
then click the OK button. Multiple robots can be selected.
Option Description
Sensor frame Position and posture information in the sensor coordinate system is output.
This option is used in an application that measures the distance between the 3D
Laser Vision Sensor and a workpiece. Select this option usually.
User frame: Conv with robot CurPos  By using the current robot position at the time of measurement, this option converts output information to position and posture information in the user coordinate system.
                                    Select a sensor coordinate system as the tool coordinate system and select an appropriate user coordinate system before starting the vision process for the robot.
User frame: Conv with robot PosReg  By using the value of the position register of the robot controller set by “PosReg”, this option converts output information to position and posture information in the user coordinate system.
                                    This option may be used in an application that makes a measurement after holding a workpiece then corrects the position where the workpiece is placed.
                                    Before starting the vision process for the robot, store the position in the position register while an appropriate user coordinate system is selected and a sensor coordinate system is selected as the tool coordinate system.
CAUTION
The term "robot" used here means a robot controller that started the 3D Laser Vision Sensor. The 3D Laser Vision Sensor obtains the current position and position register information from the robot controller that started the 3D Laser Vision Sensor. Do not confuse this robot controller with the "transmission destination robot", although both represent the same robot controller in many cases.
Option Description
Don't save No image is saved.
Save if failure When a vision process is executed, images not detected are saved in files.
Select this option usually.
By saving an image not detected, the cause can be investigated later. For
investigation of the cause, test execution should be performed by using an image
file and adjusting the parameters. For test execution using an image file, see
Subsection 9.6.2, "Use BMP Files".
Save always Image files are saved each time a vision process is executed. This option is
useful in a stage where detection stability is evaluated, for example, immediately
after system startup.
After normal operation starts, switch the setting so that images are saved only when undetected.
An image is saved in a file named (vision-process-name)12D@(date-time).bmp (only when 2D measurement is used), (vision-process-name)13DSubt1@(date-time).bmp, or (vision-process-name)13DSubt2@(date-time).bmp.
3D Laser Vision Sensor position data is saved in a file named (vision-process-name)@(date-time).psn. All of these files are saved in a folder specified in Image folder on the option setting screen. (See Section 11.1, "OPTION SETTING".)
TIP
The hard disk has a limited capacity. If the hard disk
is fully used up, trouble such as a failure in starting up
the personal computer can arise. With the 3D Laser
Vision Sensor, a limitation on the hard disk area that
can be used to save images is set on the option setting
screen so that images are not saved beyond the set
area.
Options
Set a desired option switch. Four option switches are available
as indicated in the table below.
Items                                                Description
Log the resulting data                               Each time a vision process is executed, the results of execution such as the processing time and detection position are recorded in a file. A file named (vision-process-name).csv is created in the folder named “3dvision” under the folder specified in Data folder on the option setting screen. (See Chapter 11, "OPTIONS".) The data in this csv file can be turned into a graph, for example, by spreadsheet software such as Excel®.
Measure narrow area                                  If a laser beam is directed to a narrow area of a workpiece, detection may tend to fail. In such a case, this option may be able to stabilize detection. This option increases the detection time by about several hundred milliseconds.
Select slit points using dispersion of displacement  Some points do not represent an actual measurement object, for example, due to noise on the image. This option deletes those points by statistically processing the overall displacement of points. (Default: ON) The related parameters are “Displacement dispersion(STD)” and “Displacement range(mm)”. (See the description of “Parameter” of “3D setting”.)
Select slit points using dispersion of slit width    Of a slit that usually has a uniform width, a portion that lies over a hole edge, for example, can be deformed or narrower than an ordinary slit, resulting in degraded point string extraction precision compared with ordinary portions. This option removes such a portion by statistical processing. (Default: ON)
When you exit from the screen by clicking the OK button, the taught
data is saved.
When you exit from the screen by clicking the Cancel button, the
taught data is cancelled.
Test
Image
Option Description
Snap from a camera In test execution, an image is snapped from the camera.
Load BMP files In test execution, an image file stored on hard disk is used. When 2D
measurement is used, “BMP file(2D)” is to be set for detection. When 2D
measurement is not used, “BMP file(Slit1)” is to be set for detection. For details,
see Subsection 9.6.2, "Use BMP Files".
Continuous Run
This check box is enabled when “Snap from a camera” is
selected. For details, see Subsection 9.6.1, "Continuous Run".
Get base position data
If “Use” is selected for 2D measurement as described earlier, the
window for the laser is dynamically moved for 3D measurement
based on the detection results of 2D measurement. So, the
detected 2D measurement feature needs to be related to the
relative position of the window for the laser. This operation is
conducted by test execution performed by checking this check
box. The result is not saved until you click the OK button or
Apply button. When data is acquired, the date and time of
acquisition is displayed.
CAUTION
Be sure to acquire reference position data in 3D
measurement that uses 2D measurement. Moreover,
perform this operation in the state where the window for
the laser was taught. If the program is executed without
satisfying this condition, the window is not set properly on
the image, resulting in an incorrect detection or a failure in
detection.
After the window for the laser is taught, the following can occur:
- You want to make a modification to the window for the laser.
- You did not acquire base position data.
In these cases, actions described below can be taken.
The screen below shows how the window is set. The portion not to be measured is masked by checking “Use window mask” and then setting the mask.
Color Description
Red (thick) Points extracted as a laser beam point string and used for calculation. If points
other than a point string such as noises are displayed in this color, the precision
may have deteriorated. In such a case, a parameter adjustment is required.
Light blue (thin) Points extracted as a laser beam point string but not used for calculation. In
some cases, points extracted correctly may be displayed in this color.
(Figure: the 3D Laser Vision Sensor combines a CCD camera with a laser slit beam projector; edge measurement uses both 2D image processing and laser beam image processing.)
CAUTION
In edge measurement, an end point to which a laser beam is projected is measured, regardless of how the workpiece position differs. So, an identical position on a workpiece is not measured each time.
Note that for workpiece position compensation using edge measurement, multiple measurements need to be combined with each other.
9.4.2 Settings
3D setting
The 3D measurement setting is the basic setting for obtaining 3D
information by using a laser beam. This setting must not be omitted.
Camera Setup
Select a camera setup option to be used with the vision process.
Click the Select button to display a list of camera setup options.
Select a desired camera setup name then click the OK button.
Function
This setting item is used to select a measurement function to be
used with the vision process. Select “Edge”.
Window setting
This setting item is used to collect only the required portion of the laser information. The laser beam is not always confined to the workpiece; part of it may fall outside the workpiece. A window is used to exclude from processing any laser beam radiated outside the workpiece.
Clicking the Window setting button displays a dialog box for
teaching a window. Follow the instruction of the dialog box.
Next, teach a point on the periphery of the circular area to be used for
3D measurement.
(Figure: the laser beam image, with straight line L defined by the model origin and reference point, target point P, and candidate windows A and B on either side of L.)
A description is provided using the figure above.
Suppose that a laser beam is radiated to the workpiece and the figure
shown above is obtained. Suppose also that point P is the point to
be obtained finally by edge measurement. In this case, straight line
L is set with the location tool so that L is obtained by the model
origin and reference point.
To obtain P in this state, a laser beam on the left side of straight line
L can be used on the image in a window like A. Alternatively, a
laser beam on the right side of straight line L can be used on the
image in a window like B.
From the position of the window center, the software determines
which laser beam to use for calculation.
The center of window A is placed on the left side of straight line L.
So, the software uses a laser beam on the left side of L.
The center of window B is placed on the right side of straight line L.
So, the software uses a laser beam on the right side of L.
Both of windows A and B include an opposite-side laser beam not
to be used relative to straight line L. From the specification above,
however, only those points that are on the same side relative to
straight line L are used, so that the precision is not affected.
Keep this restriction in mind when setting a window.
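The side-selection rule above reduces to a point-versus-line test: the sign of a 2D cross product tells which side of directed line L the window center lies on. The geometry below is an illustration of that rule, not the sensor's actual code.

```python
# Sketch of the window-side rule: which side of directed line p0->p1 does a
# point lie on? The sign of the 2D cross product decides.
def side_of_line(p0, p1, point):
    """Return 1 if point is left of the directed line p0->p1, -1 if right,
    0 if on the line."""
    cross = ((p1[0] - p0[0]) * (point[1] - p0[1])
             - (p1[1] - p0[1]) * (point[0] - p0[0]))
    return (cross > 0) - (cross < 0)

# Straight line L runs up the image through x = 100.
l0, l1 = (100, 0), (100, 200)
window_a_center = (60, 50)     # left of L: the beam left of L is used
window_b_center = (140, 50)    # right of L: the beam right of L is used
assert side_of_line(l0, l1, window_a_center) == 1
assert side_of_line(l0, l1, window_b_center) == -1
```

Because only points on the window center's side of L are used, stray beam segments on the opposite side inside the same window do not affect the result, exactly as described above.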
CAUTION
On the setting of a window for edge measurement, a
restriction not applicable to the other functions is
imposed.
Make a correct setting after understanding the
description above.
Parameter
For 3D measurement, various adjustments may need to be made
for a measurement object. Usually, the parameters need not be
adjusted. However, adjust parameters if necessary. Click the
Parameter button. The form shown below appears.
Items Description
Brightness Threshold Threshold for laser beam extraction. If a laser beam point string is undetectable
because of a bright environment, decrease this value within the range that ensures
correct point string detection.
If a reflected laser beam is too strong and glares, increase this value. (Default:
50)
When making an adjustment, find a threshold that can separate a laser beam from
the background according to Section 9.8, "SHOWING BRIGHTNESS
DISTRIBUTION".
Slit number          Number of the laser beam to be used. With this function, only the laser beam corresponding to the number specified here is used for measurement. (Default: 1)
                     1: Displayed from the lower-left corner to the upper-right corner on the screen
                     2: Displayed from the upper-left corner to the lower-right corner on the screen
Upper limit(mm) Among 3D positions found from individual laser beam points, those positions that
are larger than this value along the Z axis in the sensor coordinate system are not
used for calculation. Make a setting according to the positional relationship
between the 3D Laser Vision Sensor and workpiece. (Default: 50 mm)
Lower limit(mm) Among 3D positions found from individual laser beam points, those positions that
are smaller than this value along the Z axis in the sensor coordinate system are
not used for calculation. Make a setting according to the positional relationship
between the 3D Laser Vision Sensor and workpiece. (Default: -50 mm)
Distance limit(mm) Among 3D positions found from individual laser beam points, those positions that
are separated by a distance greater than this value from the straight line including
an edge to be found are not used for calculation. Make a setting according to the
figure of the workpiece. (Default: 50 mm)
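The three limits in the table above act as a filter on candidate laser points. The sketch below assumes the point-to-line distance has already been computed; the function name and signature are illustrative only.

```python
# Hedged sketch of the Upper/Lower/Distance limit checks for edge measurement.
UPPER_LIMIT_MM = 50.0      # "Upper limit(mm)" default
LOWER_LIMIT_MM = -50.0     # "Lower limit(mm)" default
DISTANCE_LIMIT_MM = 50.0   # "Distance limit(mm)" default

def point_usable(z_mm, distance_to_edge_line_mm):
    """A laser point is used for calculation only if its sensor-frame Z lies
    inside the upper/lower limits and it is close enough to the straight line
    that includes the edge to be found."""
    if not (LOWER_LIMIT_MM <= z_mm <= UPPER_LIMIT_MM):
        return False       # outside the Z window around the sensor
    return distance_to_edge_line_mm <= DISTANCE_LIMIT_MM

assert point_usable(10.0, 5.0) is True
assert point_usable(80.0, 5.0) is False    # above Upper limit
assert point_usable(-60.0, 5.0) is False   # below Lower limit
assert point_usable(10.0, 75.0) is False   # too far from the edge line
```

Tightening these limits around the expected sensor-to-workpiece distance discards reflections and background hits before the edge point is computed.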
Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by several hundred milliseconds.
Single mode An image is taken according to the settings of “Speed const.” and “Snap times”,
and detection is performed.
Select a value from 0 (bright) to 16 (dark) to set “Speed const.”. When 0 is set,
the same brightness as set by “Not Use” is produced.
“Snap times” represents the number of snapping operations. If the state of laser
beam reception is poor as in the case of viewing a dark workpiece, the
measurement environment can be improved by specifying a number greater than
1. However, as a greater number is specified, a longer detection time is required.
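Multi mode's wider dynamic range can be sketched roughly as a multi-exposure merge (a simplified illustration, not FANUC's actual algorithm; the hat-shaped weighting is an assumption):

```python
import numpy as np

def merge_exposures(images, exposures):
    """Merge images taken at different shutter speeds into one
    wider-dynamic-range estimate. Each pixel is normalized by its
    exposure time and weighted so that mid-range values count most,
    while saturated or underexposed values count least."""
    acc = np.zeros(np.asarray(images[0]).shape, dtype=float)
    wsum = np.zeros_like(acc)
    for img, t in zip(images, exposures):
        img = np.asarray(img, dtype=float)
        # Hat weighting: 1.0 at mid-gray (127.5), 0.0 at 0 and 255.
        w = 1.0 - np.abs(img / 255.0 - 0.5) * 2.0
        acc += w * img / t
        wsum += w
    return acc / np.maximum(wsum, 1e-9)
```

A pixel that is well exposed in at least one image then gets a usable value even when other exposures clip it.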
2D setting
In edge measurement, a step needs to be detected as a straight line by
using 2D measurement. In the selection of “Use / Not Use” below,
only “Use” can be selected.
Location Tool
Select a location tool to be used with the vision process.
Clicking the Select button displays a list of location tools.
Select a location tool to be used, then click the OK button.
Shutter speed mode
The setting of the shutter speed adjustment mode is almost the
same as for “3D setting”. However, the setting of Snap times is
not applicable in single mode.
Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by several hundred milliseconds.
Single mode An image is taken according to the settings of “Speed const.”, and detection is
performed. Select a value from 0 (bright) to 16 (dark) to set “Speed const.”.
When 0 is set, the same brightness as set by “Not Use” is produced.
Others
Set “Others” according to each application.
However, the setting of “Robot to send to” must not be omitted.
Robot to send to
Select a robot controller to which the results of vision process
execution are to be sent. Clicking the Add button displays a list
of robots. Select the name of a transmission destination robot
then click the OK button. Multiple robots can be selected.
Option Description
Sensor frame Position and posture information in the sensor coordinate system is output.
Because KAREL programs can use this output in a wide variety of cases, this is the
default option.
User frame: By using the current robot position at the time of measurement, this option
Conv with robot CurPos converts output information to position and posture information in the user
coordinate system.
Select a sensor coordinate system as the tool coordinate system and select an
appropriate user coordinate system before starting the vision process for the robot.
User frame: By using the value of the position register of the robot controller set by “PosReg”,
Conv with robot PosReg this option converts output information to position and posture information in the
user coordinate system.
This option may be used in an application that measures a workpiece after holding
it and then corrects the position where the workpiece is to be placed.
Before starting the vision process for the robot, store in the position register the
position in the state where an appropriate user coordinate system is selected and a
sensor coordinate system is selected as the tool coordinate system.
CAUTION
The term "robot" used here means the robot controller that started the 3D Laser Vision Sensor. The 3D
Laser Vision Sensor obtains the current position and position register information from the robot
controller that started it. Do not confuse this robot controller with the "transmission destination
robot", although both are the same robot controller in many cases.
TIP
The hard disk has a limited capacity. If the hard disk
is fully used up, trouble such as a failure in starting up
the personal computer can arise. With the 3D Laser
Vision Sensor, a limit on the hard disk area that can
be used to save images is set on the option setting
screen so that images are not saved beyond that area.
Options
Set a desired option switch. Two option switches are available
as indicated in the table below.
Items Description
Log the resulting data Each time a vision process is executed, the results of execution such as a
processing time and detection position are recorded in a file. A file named
(vision-process-name).csv is created in the folder named “3dvision” under the
folder specified in Data folder on the option setting screen. (See Chapter 11,
"OPTIONS".)
The data in this csv file can be formed, for example, into a graph by spreadsheet
software such as Excel®.
Measure narrow area If a laser beam is directed to a narrow area of a workpiece, detection may tend to
fail. In such a case, this option may be able to stabilize detection. This option
increases the detection time by several hundred milliseconds.
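For example, the log can be read and summarized with a few lines of Python instead of a spreadsheet (the column names in `sample` are hypothetical; the actual columns depend on the vision process):

```python
import csv
import io

# Hypothetical log contents -- real column names depend on the vision process.
sample = """time,process_ms,x,y,z
10:00:01,350,12.1,3.4,101.2
10:00:05,410,12.3,3.1,101.0
"""

# In practice, open("(vision-process-name).csv") instead of io.StringIO.
rows = list(csv.DictReader(io.StringIO(sample)))
times = [float(r["process_ms"]) for r in rows]
print(sum(times) / len(times))  # average processing time -> 380.0
```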
When you exit from the screen by clicking the OK button, the taught
data is saved.
When you exit from the screen by clicking the Cancel button, the
taught data is cancelled.
Test
Image
Select an image subject to test execution.
Option Description
Snap from a camera In test execution, an image is snapped from the camera.
Load BMP files In test execution, an image file stored on hard disk is used. For details, see
Subsection 9.6.2, "Use BMP Files".
Continuous Run
This check box is enabled when “Snap from a camera” is
selected.
For details, see Subsection 9.6.1, "Continuous Run".
Get base position data
CAUTION
Be sure to acquire reference position data in 3D
measurement that uses 2D measurement. Moreover,
perform this operation in the state where the window for
the laser was taught. If the program is executed without
satisfying this condition, the window is not set properly on
the image, resulting in an incorrect detection or a failure in
detection.
After the window for the laser is taught, the following can occur:
- You want to make a modification to the window for the laser.
- You did not acquire base position data.
In these cases, actions described below can be taken.
(3) Set a model origin at a point on the straight line passing through an
edge to be found. Set a model origin precisely by using the
image enlargement function.
(4) Set a reference point at a point on the line passing through an edge
to be found. At this time, separate the model origin from the
reference point as far as possible. Set a reference point precisely
by using the image enlargement function.
(5) Adjust the other parameters of the location tool. The screen
below shows the detection result after adjustment.
Color Description
Red (thick) Points extracted as a laser beam point string and used for calculation. If points
other than a point string such as noises are displayed in this color, the precision
may have deteriorated. In such a case, a parameter adjustment is required.
Light blue (thin) Points extracted as a laser beam point string but not used for calculation. In
some cases, points extracted correctly may be displayed in this color.
CCD camera
Laser slit beam projector
Laser ON
CAUTION
In cross section measurement, the cross section at the location where the laser beam
strikes the workpiece is measured, regardless of the workpiece position. The same
location on a workpiece is therefore not necessarily measured each time.
Note that for workpiece position compensation using cross section measurement,
multiple measurements need to be combined with each other.
9.5.2 Settings
3D setting
The 3D measurement setting is the basic setting for obtaining 3D
information by using a laser beam. This setting must not be omitted.
Camera Setup
Select a camera setup option to be used with the vision process.
Click the Select button to display a list of camera setup options.
Select a desired camera setup name then click the OK button.
Function
This setting item is used to select a measurement function to be
used with the vision process. Select “Cross section”.
Window setting
This setting item is used to collect only the required portion of
laser information. A laser beam does not always fall entirely on
the workpiece; part of it may be projected outside the workpiece.
A window is used to exclude from processing any laser beam
projected outside the workpiece.
Parameter
For 3D measurement, various adjustments may need to be made
for a measurement object. Usually, the parameters need not be
adjusted. However, adjust parameters if necessary. Click the
Parameter button. The form shown below appears.
Items Description
Brightness Threshold Threshold for laser beam extraction. If a laser beam point string is undetectable
because of a bright environment, decrease this value within the range that ensures
correct point string detection.
If a reflected laser beam is too strong and glares, increase this value. (Default:
50)
When making an adjustment, find a threshold that can separate a laser beam from
the background according to Section 9.8, "SHOWING BRIGHTNESS
DISTRIBUTION".
Slit number Number of the laser beam to be used. Of the two laser beams, only the one
corresponding to the number specified here is used for measurement.
(Default: 1)
1: Displayed from the lower-left corner to the upper-right corner on the screen
2: Displayed from the upper-left corner to the lower-right corner on the screen
Upper limit(mm) Among 3D positions found from individual laser beam points, those positions that
are larger than this value along the Z axis in the sensor coordinate system are not
used for calculation. Make a setting according to the positional relationship
between the 3D Laser Vision Sensor and workpiece. (Default: 50 mm)
Lower limit(mm) Among 3D positions found from individual laser beam points, those positions that
are smaller than this value along the Z axis in the sensor coordinate system are
not used for calculation. Make a setting according to the positional relationship
between the 3D Laser Vision Sensor and workpiece. (Default: -50 mm)
Grid pitch(mm) This function performs detection by converting the 3D cross section data obtained
from a laser beam into a 2D image. This value represents the size of one pixel
when the 3D cross section data is converted to a 2D image. As a smaller value is
specified, the resolution improves, but the projection area becomes smaller and
noise has a greater influence. (Default: 0.25 mm)
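The grid-pitch conversion can be pictured as quantizing the cross-section points into pixels of `pitch` mm (an illustrative sketch only; the sensor's actual projection is not documented here):

```python
import numpy as np

def project_to_grid(points_xy, pitch=0.25):
    """Quantize 2D cross-section points (in mm) into a binary image
    where one pixel spans `pitch` mm. A smaller pitch gives a finer
    image, but each point covers a smaller area, so noise shows up
    more strongly -- the trade-off the Grid pitch parameter controls."""
    pts = np.asarray(points_xy, dtype=float)
    origin = pts.min(axis=0)                      # shift so indices start at 0
    idx = np.floor((pts - origin) / pitch).astype(int)
    img = np.zeros(idx.max(axis=0) + 1, dtype=np.uint8)
    img[idx[:, 0], idx[:, 1]] = 1                 # mark occupied pixels
    return img
```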
Click the OK button when the adjustment is completed and the new
setting is to be applied. When canceling the attempt for adjustment,
click the Cancel button to close the form.
Option Description
Not Use The shutter speed mode is disabled. The shutter speed is not adjusted.
Multi mode Multiple images are taken by changing the shutter speed, and are processed to
produce an image that has a wider dynamic range when compared with a single
image taken using one shutter speed. This mode is robust against environmental
changes, but the detection time increases by several hundred milliseconds.
Single mode An image is taken according to the settings of “Speed const.” and “Snap times”,
and detection is performed.
Select a value from 0 (bright) to 16 (dark) to set “Speed const.”. When 0 is set,
the same brightness as set by “Not Use” is produced.
“Snap times” represents the number of snapping operations. If the state of laser
beam reception is poor as in the case of viewing a dark workpiece, the
measurement environment can be improved by specifying a number greater than
1. However, as a greater number is specified, a longer detection time is required.
2D setting
The cross section detection function converts cross section data
obtained using a laser beam to a 2D image then identifies a position
by 2D measurement. So, the setting of 2D measurement must not be
omitted. In the selection of “Use / Not Use” below, only “Use” can
be selected.
Robot to send to
Select a robot controller to which the results of vision process
execution are to be sent. Clicking the Add button displays a list
of robots. Select the name of a transmission destination robot
then click the OK button. Multiple robots can be selected.
When multiple robots are selected, the results of vision process
execution are sent to the selected robot controllers in the order of
selection.
Output frame setting
The results of 3D Laser Vision Sensor measurement need to be
related to the coordinate system of the robot. Here, select one
of the three options indicated below to determine which
information is to be used for relationship.
Option Description
Sensor frame Position and posture information in the sensor coordinate system is output.
Because KAREL programs can use this output in a wide variety of cases, this is the
default option.
User frame: By using the current robot position at the time of measurement, this option
Conv with robot CurPos converts output information to position and posture information in the user
coordinate system.
Select a sensor coordinate system as the tool coordinate system and select an
appropriate user coordinate system before starting the vision process for the robot.
User frame: By using the value of the position register of the robot controller set by “PosReg”,
Conv with robot PosReg this option converts output information to position and posture information in the
user coordinate system.
This option may be used in an application that measures a workpiece after holding
it and then corrects the position where the workpiece is to be placed.
Before starting the vision process for the robot, store in the position register the
position in the state where an appropriate user coordinate system is selected and a
sensor coordinate system is selected as the tool coordinate system.
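The conversion this table describes amounts to one homogeneous-transform multiply: with the sensor frame selected as the tool frame, the robot's current position gives the sensor pose in the user frame, so a sensor-frame result maps into the user frame as sketched below (an illustration; the function and matrix names are assumptions):

```python
import numpy as np

def to_user_frame(T_sensor_in_user, p_sensor):
    """Convert a measured position from the sensor frame to the user
    frame. T_sensor_in_user is the 4x4 pose of the sensor frame in
    the user frame -- i.e., the robot's current position when the
    sensor frame is selected as the tool frame, as the manual
    requires before starting the vision process."""
    p = np.append(np.asarray(p_sensor, dtype=float), 1.0)  # homogeneous point
    return (T_sensor_in_user @ p)[:3]
```

The "Conv with robot PosReg" option is the same calculation with the stored position register supplying `T_sensor_in_user` instead of the current position.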
CAUTION
The term "robot" used here means the robot controller that started the 3D Laser
Vision Sensor. The 3D Laser Vision Sensor obtains the current position and
position register information from the robot controller that started it. Do not
confuse this robot controller with the "transmission destination robot", although
both are the same robot controller in many cases.
TIP
The hard disk has a limited capacity. If the hard disk is
fully used up, trouble such as a failure in starting up the
personal computer can arise. With the 3D Laser Vision
Sensor, a limit on the hard disk area that can be used to
save images is set on the option setting screen so that
images are not saved beyond that area.
Options
Set a desired option switch. Three option switches are available
as indicated in the table below.
Items Description
Log the resulting data Each time a vision process is executed, the results of execution such as a
processing time and detection position are recorded in a file. A file named
(vision-process-name).csv is created in the folder named “3dvision” under the
folder specified in Data folder on the option setting screen. (See Chapter 11,
"OPTIONS".)
The data in this csv file can be formed, for example, into a graph by spreadsheet
software such as Excel®.
Measure narrow area If a laser beam is directed to a narrow area of a workpiece, detection may tend to
fail. In such a case, this option may be able to stabilize detection. This option
increases the detection time by several hundred milliseconds.
Use slit points which are Usually, a laser beam produces only one image on a workpiece. Depending on
not peak the shape and surface state of a workpiece, however, a pseudo image may
additionally appear on the workpiece (referred to as "secondary reflection").
When this phenomenon occurs, it cannot be determined which image is the real
one, so all possible points need to be considered. When this check box is
checked, all possible points are considered, but noise has a greater influence.
Check this check box only when secondary reflection is observed.
When you exit from the screen by clicking the OK button, the taught
data is saved.
When you exit from the screen by clicking the Cancel button, the
taught data is cancelled.
Test
Image
Select an image subject to test execution.
Options Description
Snap from a camera In test execution, an image is snapped from the camera.
Load BMP files In test execution, an image file stored on hard disk is used. For details, see
Subsection 9.6.2, "Use BMP Files".
Continuous Run
This check box is enabled when “Snap from a camera” is
selected. For details, see Subsection 9.6.1, "Continuous Run".
This check box is used to obtain the taught image of the location
tool required for cross section measurement. If this check box
is checked upon completion of teaching related to 3D
measurement, the Run button of Test is enabled even when no
data is set as a location tool on the 2D measurement side.
Clicking the Run button in this state acquires the taught image of
a cross section model. Teach the location tool by using the
image.
(4) Check the check box “Get cross section image for teach” then
click the Run button to obtain a cross section image. The screens
below show an example of execution and an output message.
This cross section image is saved as an image file under a file
name as indicated in the message. Teach the location tool by
using this image.
(6) By choosing [File] → [Open BMP file...] from the main menu,
display the cross section image obtained in (4). Then, teach the
location tool. Set a model origin precisely by enlarging the
image as shown in the screen below.
(7) Check the detection state by test execution of the location tool.
The screen below shows an example of the results of detection.
(8) Return to the 3D vision process then set the location tool.
Place the target object within the view field of the camera, and click
the Run button.
The measurement results are displayed on the run-time monitor of the
vision process.
The run-time monitor appears when [Monitor] is selected from the
[View] menu of the 3D Laser Vision Sensor.
The time displayed on the run-time monitor is the total processing
time of the vision process. However, vision process test execution
involves no communication, so the displayed time includes no
communication time.
The output results of test execution are based on the coordinate system
indicated below. When an image is loaded from a file, the output
results differ depending on whether a position data file is present or
not.
Load BMP files Coordinate system used when the position data file was recorded;
otherwise, the sensor coordinate system
CAUTION
In test execution, the robot controller does not start the
3D Laser Vision Sensor, so the setting of “Output frame
setting” does not apply.
This is because the current position and position
register information, which is required for calculating
the output based on the user coordinate system,
cannot be acquired.
Therefore, note that the output results obtained by test
execution differ from the output results obtained when
the 3D Laser Vision Sensor is started by the robot
controller.
Run-time Monitor
Latest Image
Icon Description
Executes in succession in reverse order.
Executes using the previous image and stops execution temporarily.
Stops execution temporarily.
Executes using the next image and stops execution temporarily.
Executes in succession.
Location Description
R[n] A number indicating whether a measurement is successful is stored. When a
measurement is successful, 1 is stored. When a measurement is unsuccessful, 0
is stored.
PR[m] Measurement results are stored in the XYZWPR format. When a measurement is
unsuccessful, no data is stored.
1: UTOOL_NUM = 1
2: R[1]=(-1)
3: PCVIS RUN PROGRAM1_SLT [ST=R[1], OF=PR[2]]
4: WAIT R[1]<>(-1)
5: IF R[1] = 0, JMP LBL[99]
:
10: ! ERROR ABORT
11: LBL[99]
[End]
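The status-register handshake in lines 2-4 of the program above can be modeled as follows (a sketch: the register dictionary and `start_fn` stand in for the real controller interfaces):

```python
import time

def run_vision_process(registers, start_fn, reg=1, poll_s=0.1, timeout_s=30.0):
    """Handshake used by the TP program above: preset the status
    register to -1, start the vision process, then wait until the
    sensor writes 0 (failure) or 1 (success) into the register."""
    registers[reg] = -1            # R[1] = (-1)
    start_fn()                     # PCVIS RUN PROGRAM1_SLT ...
    deadline = time.monotonic() + timeout_s
    while registers[reg] == -1:    # WAIT R[1] <> (-1)
        if time.monotonic() > deadline:
            raise TimeoutError("no result from vision process")
        time.sleep(poll_s)
    return registers[reg] == 1     # True when the measurement succeeded
```

Presetting -1 is what makes the wait unambiguous: any value other than -1 proves the sensor has written a result, so the program never reads a stale register.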
CAUTION
If “User frame:Conv with robot CurPos” is set in
“Output frame setting” for the 3D vision process, be
sure to set the sensor coordinate system as the tool
coordinate system and set an appropriate user
coordinate system before starting the 3D Laser Vision
Sensor to make measurements. If a different
coordinate system is set for each tool coordinate
system, correct results cannot be output.
CAUTION
Results sent from the 3D Laser Vision Sensor to the
robot controller are not compensation data but
measured position data. For the method of
calculating compensation data, see Appendix C,
"ROBOT POSITION COMPENSATION".
CAUTION
Ensure that the robot controller does not start another vision
process for the 3D Laser Vision Sensor before the vision process it
has already started is completed (prohibition of duplicate
execution). In other words, after the robot controller starts a
vision process for the 3D Laser Vision Sensor, first check that the
success/failure information for that measurement has been sent to
the register, and only then start the next vision process.
10 CREATION AND EXECUTION OF A ROBOT PROGRAM
This section shows how to create and execute a program that instructs
the robot to operate in such a way that the robot changes its position
and posture for holding a workpiece as the position and orientation of
the workpiece change.
The following illustrates the basic principle:
(Figure: teaching (measurement), then actual execution (measurement and
automatic calculation). As the position and orientation of a workpiece
change, the way to hold the workpiece is changed accordingly.)
1. Measure the current "position and orientation of a workpiece".
2. The way to hold the workpiece is calculated automatically.
The measurement results obtained in the state shown in the upper figure
are called "nominal data". In the lower figure, the position and
posture of the workpiece change from those in the upper figure. In
the state shown in the lower figure, a measurement is performed, and
from the results of the measurement and nominal data, the operation
for holding the workpiece is calculated automatically. This is called
"actual execution". Subsection 10.1.1, "Creating a Robot Program",
explains how to create a program that executes these operations, and
Subsection 0, "Executing a Robot Program", explains how to execute
the program.
For the overall teaching flow, see Section 1.3, "3D LASER VISION
SENSOR SETUP PROCEDURE".
A 1: UTOOL_NUM = 1
2: J P[1] 50% FINE
3: R[1]=(-1)
B 4: PCVIS RUN PROGRAM1_SLT [ST=R[1], OF=PR[2]]
5: WAIT R[1]<>(-1)
6: IF R[1] = 0, JMP LBL[999]
7:
C 8: OFS_RJ3(10,2,0,10,0,11,0)
9: UFRAME[9] = PR[11]
10: PR[11] = UFRAME[9]
11: IF R[10] <> 1, JMP LBL[1]
12: R[10] = 0
13: PAUSE
14: LBL[1]
15: UTOOL_NUM = 2
D 16: J P[2] 50% CNT100
17: L P[3] 500 mm/sec CNT100
18: L P[4:GRASP POINT] 100 mm/sec FINE
19: L P[5] 300 mm/sec CNT100
:
30: ! ERROR ABORT
31: LBL[999]
32: UALM[1]
[End]
The meanings of the registers and position registers used by this robot
program are as follows:
R[1] : Execution status of vision process "PROGRAM1"
R[10]: Nominal flag (1: nominal execution, other than 1:
actual execution)
PR[2]: Execution result of vision process "PROGRAM1"
PR[10] : Nominal data
PR[11] : Compensation data
In addition, the robot program uses user frame (UF) 9.
(Flowchart: after the start, execution branches on whether R[10] = 1.)
NOTE
For more details on the OFS_RJ3 specifications,
contact FANUC.
Nominal execution
(1) Place a workpiece at a certain position within the range that
ensures robot operation, and teach P[1], which is the
measurement position, in the robot program created as described
in Subsection 10.1.1, "Creating a Robot Program". Usually,
when vision process PROGRAM1 is taught, the position where
the workpiece is placed and the workpiece measurement position
should have been determined. So, this step requires only
teaching of these positions.
(2) After position teaching for P[1] is completed, enter 1 in R[10]
manually and execute the robot program starting from the first
line.
(3) As a result of program execution, the robot moves to the
measurement position, and a measurement is made. If the
measurement is successful, OFS_RJ3 internally sets the
measurement result in PR[10] as nominal data.
(4) Then, the robot program stops temporarily at the “PAUSE” line.
Set R[10] to 0 not to update the nominal data by mistake at the
next execution.
(5) Subsequently, while keeping the workpiece in the state present
when it was measured in (3), teach the operation of holding the
workpiece from P[2] to P[5].
(6) Add an “Offset” instruction to the operation statements for
holding the workpiece in the program.
15: UTOOL_NUM = 2
16: J P[2] 50% CNT100 Offset,PR[11]
17: L P[3] 500 mm/sec CNT100 Offset,PR[11]
18: L P[4:GRASP POINT] 100 mm/sec FINE Offset,PR[11]
19: L P[5] 300 mm/sec CNT100 Offset,PR[11]
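The compensation in PR[11] follows the usual nominal-versus-actual formulation; since the OFS_RJ3 internals are not documented here ("contact FANUC"), the sketch below shows only the generic calculation, with hypothetical names:

```python
import numpy as np

def offset_from_nominal(T_nominal, T_actual):
    """Generic position-compensation sketch: the transform that maps
    the nominal workpiece pose to the actually measured pose, as a
    4x4 homogeneous matrix. Applying it to each taught point (as the
    Offset,PR[11] instructions do) shifts the whole holding motion
    along with the workpiece."""
    return T_actual @ np.linalg.inv(T_nominal)
```

In the program above, PR[10] plays the role of `T_nominal` (recorded once at nominal execution) and each measurement supplies `T_actual`; the result is the compensation stored in PR[11].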
CAUTION
To perform nominal execution, observe the following points:
- Check that the register indicating nominal execution
(the nominal flag) is set correctly.
- Exercise care so that the workpiece state at the time
of measurement is kept unchanged until the robot holds
the workpiece.
If either of the above conditions is not met, execution
must be performed again from the workpiece
measurement (from (2) in the above procedure).
Actual execution
(1) Check that 0 is set in R[10], and execute the robot program
starting from the first line.
(2) The workpiece is measured, and the operation of holding the
workpiece is performed according to the changes in workpiece
position and orientation.
CAUTION
To perform actual execution, check that the register
indicating nominal execution (the nominal flag) is not set.
If actual execution is performed with the nominal flag set,
the nominal data is rewritten. (In this case, the
nominal data can be restored by teaching the operation
of holding the workpiece again.)
Robot systems using the 3D Laser Vision Sensor can find a wide
variety of applications by modifying the robot program presented in
Subsection 10.1.1, "Creating a Robot Program". Such applications
include the following:
11 OPTIONS
Option setting covers the settings for the 3D Laser Vision Sensor
other than the teaching of vision data.
Item Description
StartUpCamera Camera number that is selected when you start up the 3D Laser Vision
Sensor. 1 is the default.
Data folder Folder name in which to store the vision data that you trained.
C:¥Program Files¥Fanuc¥3D Vision¥Data is the default.
Image folder Folder name in which to store images, such as unmatched images, used by the 3D
Laser Vision Sensor.
C:¥Program Files¥Fanuc¥3D Vision¥Image is the default.
If image folder exceeds limit The vision process saves images to files when you select the image saving option
in vision process training. If most of the hard disk is used up by saved images
such as unmatched images, trouble such as the PC failing to start may occur.
Even when most of the hard disk space is not yet used up, if the folder
specified in “Image folder” becomes too large, it can take too much time to
access the folder, abnormally increasing the processing time of the 3D Laser
Vision Sensor. In the 3D Laser Vision Sensor, the size of the folder specified
in “Image folder” is kept from exceeding the limit specified in the next
item, “Size of image folder”.
This setting selects the action to be taken so that the limit is not exceeded.
Option Description
Don’t save When the limit is reached, the subsequent
images are not stored.
Save after deleting oldest one When the limit is reached, the image file with
the oldest date in the folder is deleted, and a
new image file is stored. This setting is used
by default.
CAUTION
When the file system of the personal computer is
FAT32, it may take too much time to store an image if
“Save after deleting oldest one” is selected. As
described in Subsection 2.1.1, "Personal Computer",
the NTFS file system should be used. (At least, the
file system of the image folder should be NTFS.)
Size of image folder When changing this size, ensure that about 100 MB of hard disk space
remains even if the image folder becomes full. The maximum value is
1000 MB.
Show run-time monitor Check this box to automatically show the run-time monitor at start-up of the 3D Laser
at start-up Vision Sensor application. You can also show the run-time monitor by clicking
the [Monitor] button on the tool bar. OFF is the default.
Advanced mode By default, items that are not normally used are hidden. Check this box
to show all items.
Keep image in last execution When this check box is checked, the images of the most recently executed
vision process are saved in the image folder as LastImage*.bmp. (The *
indicates that there can be multiple files; the number of images saved varies
depending on the vision process used.) These files are used when trouble
occurs in the system. By default, this check box is checked.
Comm. to robot Setting for communication between the robot controller and 3D Laser Vision
Sensor. Communication is used for invoking the 3D Laser Vision Sensor
process from the robot controller and allowing the program to store the
processing results in robot controller's registers and position registers.
TCP/IP keep alive time: The time in milliseconds that the Windows NT TCP/IP service waits before determining that the communication partner has not returned a response. The 3D Laser Vision Sensor uses this service to determine whether the robot controller is powered off. If this value is too small, communication may become unstable. If it is too large, the 3D Laser Vision Sensor cannot resume communication with the robot controller, because it cannot determine whether the controller is powered off. The default value for the TCP/IP service is two hours (7,200,000 milliseconds), which is too long for this purpose, so change the value to about 15 seconds by entering 15,000. For this change to take effect, you must restart the PC.
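The setting above is a system-wide Windows NT value; for reference, ordinary socket APIs expose a per-socket equivalent. The following Python sketch is purely illustrative (it is not part of the 3D Laser Vision Sensor software) and guards the platform-specific TCP_KEEPIDLE option:

```python
import socket

def make_keepalive_socket(idle_seconds: int = 15) -> socket.socket:
    """Create a TCP socket that probes an idle peer after idle_seconds.

    Illustrative only: the manual's setting is a system-wide Windows NT
    value (15,000 ms); this shows the per-socket analogue.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    # TCP_KEEPIDLE (idle time before the first probe) is platform-specific.
    if hasattr(socket, "TCP_KEEPIDLE"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, idle_seconds)
    return sock
```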
PING timeout time: The 3D Laser Vision Sensor checks whether the robot controller is powered on by periodically issuing Ping. Ping determines that the robot controller is powered off when the time specified in this field elapses without a response. If this value is set too large and the robot controller does not return a response, a heavy load is placed on the PC, slowing its response. Decreasing this value reduces the load on the PC but makes the operation of Ping unstable. The default value is 250 milliseconds.
PING interval time: The interval in milliseconds at which Ping is issued to check whether the robot controller is powered on. Decreasing this value increases the load on the PC. Increasing it reduces the load on the PC, but it takes longer for the 3D Laser Vision Sensor to detect whether the robot controller is powered on. The default value is five seconds.
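The timeout-and-interval behaviour described for Ping can be sketched as a simple polling loop. This is a hypothetical illustration, not the sensor's actual implementation; `probe` stands in for one real ICMP ping:

```python
import time
from typing import Callable

def monitor_power(probe: Callable[[float], bool],
                  timeout_s: float = 0.25,
                  interval_s: float = 5.0,
                  cycles: int = 3) -> bool:
    """Poll the robot controller's power state.

    probe(timeout_s) stands in for one Ping: it returns True if a response
    arrives within timeout_s seconds. The defaults mirror the values above
    (250 ms timeout, 5 s interval); the function itself is a hypothetical
    sketch, not part of the 3D Laser Vision Sensor software.
    """
    powered_on = False
    for _ in range(cycles):
        powered_on = probe(timeout_s)   # one Ping with the timeout applied
        if not powered_on:
            break                       # no response: treat as powered off
        time.sleep(interval_s)          # wait one interval before re-checking
    return powered_on
```

A caller would pass something like `lambda t: ping(controller_host, t)`, where `ping` is whatever ICMP helper the application uses.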
Comm. software timeout: The maximum time in seconds that the 3D Laser Vision Sensor waits for a response from the robot controller during communication over the Ethernet. The default is 60 seconds.
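In ordinary socket code, such a response timeout corresponds to a per-socket timeout. A minimal sketch, assuming hypothetical host/port values (the sensor's actual protocol is not described here):

```python
import socket

COMM_TIMEOUT_S = 60  # default "Comm. software timeout" from the table above

def open_controller_link(host: str, port: int) -> socket.socket:
    """Open a TCP connection that raises socket.timeout when the peer
    does not respond within COMM_TIMEOUT_S seconds.

    Hypothetical sketch: host and port are placeholders, and the actual
    protocol used by the 3D Laser Vision Sensor is not documented here.
    """
    sock = socket.create_connection((host, port), timeout=COMM_TIMEOUT_S)
    sock.settimeout(COMM_TIMEOUT_S)  # also applies to later send()/recv()
    return sock
```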
No alarm outputs even if specified program does not exist: Checking this check box suppresses the alarm indication when the program whose execution the robot controller specifies is not present. Check this box only in the special case where more than one personal computer is connected to one robot controller. By default, this check box is not checked.
TIP
When the PC vision instruction is executed in a program on the robot controller, all personal computers (running the 3D Laser Vision Sensor software) on which that robot controller is set as "robot" data and which are in the "connected" state are started.
For the connection between the robot controller and personal computers (3D Laser Vision Sensor software), see Chapter 5, "ROBOT COMMUNICATION SETTING".
Optional setup for the 3D Laser Vision Sensor: Select [3DV Options] from the [View] menu. You will see the screen shown below.
Application: Select the kind of application to be used when training is performed. "2D, 3D measurement" is the default. Checking "Bin picking" and then clicking the OK button changes the menu configuration to display a menu specifically designed for the "Bin picking" function. For information about "Bin picking", refer to the separate operator's manual.
APPENDIX
A ERROR CODES
The following table lists the major error codes.
Error code: 80041007H
Message: No frame grabber was opened.
Cause: The frame grabber is not available to the 3D Laser Vision Sensor software.
Response: The frame grabber may not be inserted securely into the PCI bus, the frame grabber may be faulty, or more than one 3D Laser Vision Sensor program may be running on the same personal computer. Check these items.

Error code: 80042003H
Message: 2D detection failed.
Cause: A failure occurred in 2D detection.
Response: Using a location tool, adjust the parameters and teach the models again so that 2D detection can be performed reliably.

Error code: 80042005H
Message: Image loading failed.
Cause: The video image file is not found or is damaged.
Response: Check the video image file, and take pictures again as required.

Error code: 80042006H
Message: Image saving failed.
Cause: The target folder is not found or is write-protected.
Response: Check whether the target folder exists. Also check that the file properties are correct.

Error code: 80042007H
Message: Pixel value cannot be translated to line.
Cause: Camera calibration has not been performed, or the "Scale factor" is invalid (for example, zero or less).
Response: Calibrate the camera (again).

Error code: 80042008H
Message: Position cannot be translated to pixel value.
Cause: The "CCD cell size" is invalid (for example, zero or less).
Response: Check "CCD cell size" in the Change camera type dialog box. Correct it as required, and calibrate the camera (again).

Error code: 80042009H
Message: Invalid program name.
Cause: An attempt was made to execute a command with a measurement number of 2 or greater, but the vision process name differs from that used during the previous execution of the process.
Response: Check the vision process name and measurement number, and correct them as required.

Error code: 8004200CH
Message: Invalid snap number.
Cause: An attempt was made to execute a command with a measurement number of 2 or greater, but the measurement number is not consecutive to that used during the previous execution of the command, or is invalid (for example, greater than the specified measurement count).
Response: Check the measurement number and correct it as required.

Error code: 80042011H
Message: Lack of points.
Cause: The number of bright points generated by the laser beams, as captured by the camera, is insufficient.
Response: If the window is not appropriate, adjust the window size and position. If the laser beams are overwhelmed by the background, decrease the amount of outside light, or decrease "Brightness Threshold" within the range that ensures correct detection.

Error code: 80042013H
Message: No plane.
Cause: No straight line forming a flat surface was found, or the distance between two straight lines is too large to form a plane.
Response: If the laser beam point string is insufficient, respond in the same way as for "80042011H Lack of points.".
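Software that drives the 3D Laser Vision Sensor may want to turn these codes back into readable messages. A minimal sketch using the codes from the table above (the helper name and lookup-table approach are hypothetical, not part of the sensor software):

```python
# Error codes and messages from the table above.
ERROR_MESSAGES = {
    0x80041007: "No frame grabber was opened.",
    0x80042003: "2D detection failed.",
    0x80042005: "Image loading failed.",
    0x80042006: "Image saving failed.",
    0x80042007: "Pixel value cannot be translated to line.",
    0x80042008: "Position cannot be translated to pixel value.",
    0x80042009: "Invalid program name.",
    0x8004200C: "Invalid snap number.",
    0x80042011: "Lack of points.",
    0x80042013: "No plane.",
}

def describe_error(code: int) -> str:
    """Format a code the way the manual prints it, e.g.
    '80042011H Lack of points.'; unknown codes get a generic message."""
    return f"{code:08X}H " + ERROR_MESSAGES.get(code, "Unknown error code.")
```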
B CALIBRATION JIG
This appendix explains a jig used for simple re-calibration (see
Section 6.7, "SIMPLE RE-CALIBRATION"), sensor coordinate
system setup (see Subsection 6.6.2, "Setting a Sensor Coordinate
System"), and grid pattern calibration (see Section 6.4, "GRID
PATTERN CALIBRATION (2D MEASUREMENT)").
In each calibration operation, the following grid points (grid pattern) are shown to the 3D Laser Vision Sensor to allow automatic recognition of the positional relationship between the jig and the 3D Laser Vision Sensor, the lens distortion, and the focal length.
(Figure: calibration grid pattern, showing the Y axis and the origin at the corner of the "L" formed by the larger dots.)
The jig must fit within the field of view of the 3D Laser Vision
Sensor.
The number of grid points must be 49 (= 7 × 7) as shown above.
Place the larger grid points in the shape of the letter "L", and set the corner of the "L" at the origin, as shown above.
The size of the jig used varies from one application to another
according to the size of the view field of the camera.
The minimum required number of grid points is as shown above (7 × 7 = 49). The more grid points there are, the higher the precision. However, if there are too many grid points, it may become impossible to detect them correctly.
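The layout rules above can be expressed programmatically, for example to generate an ideal jig pattern for printing or verification. A sketch under assumed dimensions (the 10 mm pitch and 4 mm dot diameter are illustrative; the manual does not specify them):

```python
def make_grid_pattern(n: int = 7, pitch: float = 10.0,
                      small_diameter: float = 4.0):
    """Return a list of (x, y, diameter) dots for an n x n calibration grid.

    The corner of the "L" is the origin: three large dots run along +X and
    two along +Y (sharing the origin), each 1.5x the small-dot diameter,
    as described above. Pitch and diameter values are assumptions.
    """
    large = 1.5 * small_diameter
    # Large dots: a 3-dot string on the X axis and a 2-dot string on the
    # Y axis, both starting at the origin cell (0, 0).
    large_cells = {(0, 0), (1, 0), (2, 0), (0, 1)}
    dots = []
    for j in range(n):        # rows (Y direction)
        for i in range(n):    # columns (X direction)
            d = large if (i, j) in large_cells else small_diameter
            dots.append((i * pitch, j * pitch, d))
    return dots
```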
The grid pattern must have large grid points arranged like the letter "L" as shown above.
The corner of the "L" is placed at the grid pattern origin. A string of three large dots corresponds to the X direction, while a string of two large dots corresponds to the Y direction.
The large grid points have a diameter 1.5 times that of the small grid
points so that the calibration jig can be recognized even if it is placed
obliquely to the camera.
Index
E
Execution of Simple Re-calibration .... 81
EXIT 3D LASER VISION SENSOR .... 37
Extra care region .... 96

F
Find Grid Pattern .... 67
Frame Grabber Board .... 12
FRAME GRABBER BOARD .... 15

G
GRID PATTERN CALIBRATION (2D MEASUREMENT) .... 66
Grid Pattern Frame .... 68

I
I/O Board .... 13
I/O BOARD .... 19
Information Obtained .... 112, 128, 142, 158
INSTALLATION SEQUENCE .... 14
Installing Hardware .... 15, 19
Installing the 3D Laser Vision Sensor Software .... 25
Installing the Driver Software .... 17, 21

L
Laser Beam .... 5
Laser Slit Detail Display .... 58
LIST OF VISION DATA .... 44

M
Making Settings for the 3D Laser Vision Sensor Side .... 28
Making Settings for the Robot Side .... 27
Model Display .... 91
MODIFYING LOCATION TOOL MODEL .... 90
MOVING IMAGES .... 39

O
OPTION SETTING .... 188
OPTIONAL FUNCTION OF LOCATION TOOL .... 95
OPTIONS .... 187
OTHER APPLICATION EXAMPLES .... 186
OUTLINE OF 3D LASER VISION SENSOR .... 4
OVERVIEW OF THE MANUAL .... 2

P
PAINTING AN IMAGE WITH A RED PEN .... 40
PC I/O Cable .... 13
Personal Computer .... 12
PREFACE .... 1
PROTECTOR .... 35

R
RESTRICTIONS WITH A RAIL AXIS .... 200
ROBOT COMMUNICATION SETTING .... 47
ROBOT POSITION COMPENSATION .... 199

S
Safety of Laser Sensor .... 5
SELECT A CAMERA .... 37
SENSOR COORDINATE SYSTEM .... 73
Sensor Coordinate System and Measurement Results .... 73
Setting a Reference Point .... 92
Setting a Sensor Coordinate System .... 74
Setting Laser Slit Detection Parameters .... 58
Setting of 3D Measurement Calibration .... 53
Setting the Camera .... 30
Settings .... 129, 143, 159
SETTINGS .... 113
SETUP .... 10
SETUP VISION PROCESS .... 103
SHOW DETAILS OF CAMERA CALIBRATION DATA .... 72
Revision Record
01    Feb., 2001