Automated Imaging Association Robotic Industries Association

International Conference for Vision Guided Robotics

Proceedings

September 30 – October 2, 2008 Sheraton Detroit Novi, Novi, Michigan USA

Welcome to the International Conference for Vision Guided Robotics!
In today’s economy, the need for vision guided robotics technologies is greater than ever. The Robotic Industries Association (RIA) and the Automated Imaging Association (AIA) bring you this joint conference to help you apply these technologies in order to boost productivity, reduce costs and increase quality.

We hope you will find great ideas from the presenters as well as your fellow attendees. And, we encourage you to meet with the vendors in the tabletop exhibit area who can offer you products that meet your specific needs.

In addition to this conference, RIA and AIA offer a host of valuable resources that can help you when you return to your company. We recommend visiting our websites (www.robotics.org and www.machinevisiononline.org) to find free technical papers, case studies, and information on upcoming events such as The Vision Show (end of March 2009 in Phoenix, Arizona) and the International Robots, Vision & Motion Control Show (June 2009 in Chicago, Illinois).

Your feedback is very important to us, so please take the time to complete your evaluation form and submit it to us onsite (or send it in to our office after the conference). If you prefer, you can always talk to our staff in person, either here this week or by calling 734-994-6088 to share your ideas.

Thanks so much for coming, and enjoy the conference!

Sincerely,

Jeffrey A. Burnstein Executive Vice President Robotic Industries Association

Dana Whalls Managing Director Automated Imaging Association

© 2008 ALL RIGHTS RESERVED The contents of this book may not be copied or further disseminated without the written approval of the Robotic Industries Association, the Automated Imaging Association or the individual authors.

DISCLAIMER The papers presented in this Proceedings book are the personal expressions and positions of the respective author and presenter. These views are not those of the Association, nor are they necessarily endorsed by the Association or its members. This conference was presented by the Association to allow robot and vision topics to be openly discussed and diverse views disseminated. Comments about the contents will be forwarded to the authors by the Association. 900 Victors Way, Suite 140, Ann Arbor, Michigan 48108 Telephone: 1-734-994-6088 Fax: 1-734-994-3338

Table of Contents
2008 - 2009 Events
About Robotic Industries Association (RIA)
About Automated Imaging Association (AIA)
Exhibitor Index
Exhibitor Listings
Conference Speakers
ICVGR Conference Agenda
Corporate Sponsors
Media Sponsors

Automation Technologies Council

2008 - 2009 Events
Mark Your Calendar!

National Robot Safety Conference Indianapolis Marriott East

October 6-9, 2008 Indianapolis, Indiana USA

AIA Networking Reception VISION 2008

November 5, 2008 Stuttgart, Germany

16th Annual Robotics Industry Forum Portofino Bay Hotel

November 5-7, 2008 Orlando, Florida USA

17th Annual AIA Business Conference Marriott Coronado Island Resort

February 4-6, 2009 Coronado, California USA

AIA International Pavilion Shanghai Exhibition Center

March 24 – 26, 2009 Shanghai, China

The Vision Show Phoenix Convention Center

March 31-April 2, 2009 Phoenix, Arizona USA

International Robots, Vision & Motion Control Show Donald E. Stephens Convention Center

June 9 – 11, 2009 Rosemont (Chicago), Illinois USA

For full details on these or other events, visit

www.Robotics.org ~ or ~ www.MachineVisionOnline.org
or call 1-734-994-6088

About Robotic Industries Association (RIA)
Robotic Industries Association (RIA) is the only trade association in North America organized specifically to serve the field of robotics. Founded in 1974, RIA is dedicated to the exchange of technical and trade-related information between robot manufacturers, distributors, corporate users, accessory equipment and systems suppliers, consultants, research groups and international organizations. RIA is the common ground where these groups can come together to discuss challenges and solutions dealing with the implementation of robotic technology. Some 285 companies are members of RIA. Members receive many benefits, including discounts on RIA workshops, conferences and resources.

About Automated Imaging Association (AIA)
Founded in 1984, AIA was organized specifically to promote the global use of image capture and analysis technology and now represents more than 300 machine vision suppliers, system integrators, users, researchers, and consulting firms from 27 nations. AIA sponsors many educational conferences and workshops, including the International Robots, Vision & Motion Control Show, The Vision Show and the annual AIA Business Conference. AIA also produces an annual Machine Vision Market Study. Be sure to visit Machine Vision Online (www.machinevisiononline.org), the world’s leading resource for machine vision information on the internet.

Exhibitor Index
AIA/RIA Standards Activities
Basler Vision Technologies
CCS America, Inc.
Cognex Corporation
Components Express, Inc.
DENSO Robotics
Dunkley International, Inc.
EPSON Robots
FANUC Robotics America, Inc.
Hitachi Kokusai Electric America Ltd.
HTE, Inc.
ISRA VISION SYSTEMS, Inc.
item North America
KUKA Robotics
LEONI Engineering Products & Services, Inc.
LMI Technologies Inc.
Matrox Imaging
Motoman, Inc.
Multi-Contact USA
MVTec, LLC
Nachi Robotic Systems Inc.
Northwire, Inc.
Pilz Automation Safety L.P.
Radix Controls Inc.
Schneider Optics, Inc.
SICK, Inc.
Stäubli Robotics
StockerYale, Inc.
Tectivity, Inc.
United Sales & Services Inc.
Valentine Robotics, Inc.
VMT-Pepperl + Fuchs, Inc.
WireCrafters LLC

Exhibitor Listings
Automated Imaging Association / Robotic Industries Association 900 Victors Way Suite 140 Ann Arbor, Michigan 48108 Phone: 734-994-6088 Fax: 734-994-3338 Email: jfryman@robotics.org Web: www.Robotics.org www.MachineVisionOnline.org Contact: Jeff Fryman

Cognex Corporation One Vision Drive Natick, Massachusetts 01760 Phone: 508-650-3000 Fax: 508-650-3344 Email: mktg@cognex.com Web: www.cognex.com Contact: John Keating

Dunkley International, Inc. 1910 Lake Street Kalamazoo, Michigan 49001 Phone: 269-343-5583 Fax: 269-343-5614 Email: pcallan@dunkleyintl.com Web: www.dunkleymachinevision.com Contact: Pat Callan

The Robotic Industries Association is an Accredited Standards Developer responsible for Industrial Robot Standards in the United States. Working with ISO and ANSI, the RIA sponsors the R15 series of standards, most notably the R15.06 Robot Safety standard and the National Adoption of ISO 10218-1. RIA hosts annual robot safety events throughout North America. The Automated Imaging Association is the world leader in sponsoring interoperability standards for digital machine vision applications. The Camera Link® and GigE Vision™ standards are renowned for their ability to allow integration of total vision systems using the best solution of components available from multiple suppliers.
Basler Vision Technologies 855 Springdale Drive Suite 610 Exton, Pennsylvania 19341 Phone: 610-280-0171 Fax: 610-280-7608 Email: tim.coggins@baslerweb.com Web: www.baslerweb.com Contact: Tim Coggins

Cognex Corporation is the world’s leading provider of vision systems, vision software, and vision sensors used in manufacturing automation. Cognex is also a leader in industrial ID readers.
Components Express, Inc. 10330 Argonne Woods Drive Suite 100 Woodridge, Illinois 60517 Phone: 630-257-0605 Fax: 630-257-0603 Email: rberst@componentsexpress.com Web: www.componentsexpress.com Contact: Ray Berst

Dunkley International is a supplier of turnkey vision systems. Current systems range from high-speed fruit and vegetable inspection to final inspection of heavy-duty truck transmissions. While much of our business is geared towards very high volume systems, we also manufacture many custom one-of-a-kind systems. Whether you need a complete vision and robotic cell or a vision system added to your existing line, we can help.
EPSON Robots 18300 Central Avenue Carson, California 90746 Phone: 562-290-5958 Fax: 562-290-5999 Email: rick_brookshire@ea.epson.com Website: www.robots.epson.com Contact: Rick Brookshire

EPSON Robots is the global leader in PC-controlled precision factory automation, with a product line of hundreds of easy-to-use SCARA, Cartesian and 6-axis robots.

FANUC Robotics America, Inc. 3900 W. Hamlin Road Rochester Hills, Michigan 48309 Phone: 800-IQ-ROBOT Fax: 248-276-4227 Email: marketing@fanucrobotics.com Web: www.fanucrobotics.com Contact: Ed Roney

FANUC Robotics America, Inc. is the leading supplier of industrial robots and robotic systems. Over 200,000 robots are installed worldwide, and more than 200 robot variations are available to work in a wide range of applications. The combination of the world’s most reliable robots, process expertise, support services, regional locations and a network of system integrators provides manufacturers in virtually every industry the tools they need to reduce costs, improve quality, maximize productivity, and increase their competitive position in the global market.

Machine Vision Cables, Camera Link® Cables, Cables for GigE Vision™, FireWire Cables, Analog Cables, SCSI Cables, Internal Ribbon Cables, Camera Enclosures, Transformers.
DENSO Robotics 3900 Via Oro Avenue Long Beach, California 90810 Phone: 888-476-2689 Fax: 310-952-7502 Email: info@densorobotics.com Web: www.densorobotics.com Contact: Greg Johnson

DENSO offers a wide range of compact, four-axis SCARA and five- and six-axis articulated robots, for payloads up to 20 kg and reaches from 350 to 1,300 mm. Repeatability is to within ±0.015 mm. Standard, dust- and mist-proof, and cleanroom models are available. ANSI and CE compliance enables global deployment. UL-listed models are available for both the US and Canada. Easy-to-use programming and 3-D offline simulation software, controllers and teaching pendants are also offered.

Basler Vision Technologies specializes in state-of-the-art digital camera solutions for a wide variety of demanding vision applications. Over 20 years of industry expertise and product development are evident in Basler's extensive application knowledge and broad product offering. Area scan and line scan cameras utilize both CCD and CMOS sensors, and FireWire, Gigabit Ethernet, and Camera Link interface technologies.
CCS America, Inc. 5 Burlington Woods Burlington, Massachusetts 01803 Phone: 781-272-6900 Fax: 781-272-6902 Email: sales@ccsamerica.com Web: www.ccsamerica.com Contact: Barbara Gagnon

CCS is a global manufacturer of LED lighting for machine vision. Due to its quality and advanced lighting technologies, CCS Inc. is the #1 supplier for vision systems in the world and has the largest market share in Japan. CCS has more than 300 different types of standard products in stock and has designed over 2,000 custom lighting solutions to date. The latest products include our new high-power ring and dome lights.

Hitachi Kokusai Electric America Ltd. 150 Crossways Park Drive Woodbury, New York 11797 Phone: 817-490-5124 Fax: 817-490-6116 Email: phyllis.vela@hitachikokusai.com Web: www.hitachikokusai.us Contact: Phyllis Vela

Hitachi Kokusai Electric America Ltd. manufactures and sells miniaturized high-speed, high-resolution analog and digital cameras. Cameras are available as monochrome or color. Outputs include Camera Link®, FireWire 1394A & 1394B, and GigE Vision™ interfaces. Come see our new GigE camera line offerings.

HTE, Inc. 1100 Opdyke Auburn Hills, Michigan 48326 Phone: 248-371-1918 Fax: 248-371-2185 Email: dreed@hte.net Web: www.hte.net Contact: Daniel Reed

HTE is an application engineering distributor providing hardware and software products specializing in track and trace, error proofing, direct part marking, machine vision and plant floor data collection. Now offering 2D and 3D vision guided robotic solutions based on Shafi Reliabot and Siemens vision.

ISRA VISION SYSTEMS, Inc. 3350 Pine Tree Road Lansing, Michigan 48911 Phone: 517-887-8878 Fax: 517-887-8444 Email: info.usa@isravision.com Web: www.isravision.com Contact: Diane Rizer

ISRA VISION SYSTEMS develops flexible turnkey machine vision solutions for industrial applications. ISRA specializes in 2D and 3D robot guidance, web inspection, bead inspection, and assembly inspection. Years of experience in machine vision, robotic technology and industrial automation provide cost-effective, integrated solutions, fully installed and performance guaranteed.

item North America 925 Glaser Parkway Akron, Ohio 44306 Phone: Fax: Email: Web: Contact: Rick Fascione

All manufacturing, assembly or automation processes require a robust sub-structure, base or platform as the starting point for design, development and implementation. item North America provides this sub-structure utilizing structural aluminum and modular components to replace welded steel with a more efficient, flexible and visually appealing alternative. Machine bases, sub-structures, frames, safety, laser and robotic enclosures are easily designed with the support of item’s in-house engineering group.

KUKA Robotics 22500 Key Drive Clinton Township, Michigan 48036 Phone: 866-USE-KUKA Fax: 866-FAX-KUKA Email: rebeccamarkel@kukarobotics.com Web: www.kukarobotics.com Contact: Rebecca Markel

KUKA Robotics offers a broad range of highly modular robots, covering all common payload categories, from 3 kg to 1000 kg. Over two thirds of the 75,000 KUKA robots installed in the field use our open architecture PC-based controller, making KUKA the number one PC-controlled robot manufacturer in the world. KUKA controllers are also available for integration with other components of your automation systems. Other products include SoftPLC, Remote Service, KUKA SIM simulation software, Networking Services and a variety of dress packages. In addition, our Systems Partners - experts in their respective industries - offer key technologies that transform the KUKA robot into an application-specific solution. Our advanced KUKA College enables fast learning through flexible training systems that simulate a variety of real-world applications. KUKA Robotics offers a 24-hour service hotline as well as engineering services.

LEONI Engineering Products & Services, Inc. 2505 Industrial Row Drive Troy, Michigan 48084 Phone: 248-655-1900 Fax: 248-655-1905 Email: chris.miller@leoni.com Web: www.leoni-robotic-solutions.com Contact: Chris Miller

Matrox Imaging 1055 St. Regis Boulevard Dorval Quebec H9P 2T4 Canada Phone: 514-822-6000, x2438 Fax: 514-822-6298 Email: bruno.parent@matrox.com Web: www.matroximaging.com Contact: Bruno Parent

Matrox Imaging is a leading provider of component-level solutions to OEMs and integrators involved in various manufacturing sectors. Products include cameras, interface boards and processing platforms, all designed to provide optimum price-performance within a common software environment. Matrox Imaging offers a comprehensive collection of software tools for calibrating (2D and 3D), enhancing and transforming images, locating objects, extracting and measuring features, reading character strings, and decoding and verifying identification marks.
Motoman, Inc. 805 Liberty Lane West Carrollton, Ohio 45449 Phone: 937-847-6200 Fax: 937-847-6277 Email: info@motoman.com Web: www.motoman.com Contact: Greg Garmann

High-speed, high-performance Motoman robots feature payloads from 3-500 kg and are available with integrated vision capability to facilitate multi-processing in a wide range of applications, including: arc welding; assembly; coating; dispensing; material cutting; material handling; material removal; and spot welding. Integrated vision is used for part finding, robot guidance, identification, and inspection.

Multi-Contact USA 5560 Skylane Boulevard Santa Rosa, California 95403 Phone: 440-243-4929 Fax: 440-243-6628 Email: d.rababy@multi-contact.com Web: www.multi-contact.usa.com Contact: Dave Rababy

Multi-Contact is a world-class provider of industrial robotic cable connectors up to 250 amp capacity. Our multilam technology has virtually unlimited applications due to our design flexibility. Our connectors offer both standard and custom-designed solutions for a wide and diverse spectrum of applications. Multi-Contact can provide reliable and cost-effective solutions for your interconnection requirements. The robotic line of connectors is small in size and high in performance.

Tailor Made Robotic Cable and Cable Management Solutions. Providing Global Field Service and Project Management.
LMI Technologies Inc. 1673 Clivedon Avenue Delta, British Columbia V3M 6V5 Canada Phone: 604-636-1011 Fax: 604-516-8368 Email: info@lmitechnologies.com Web: www.lmitechnologies.com Contact: Dan Howe

LMI Technologies Inc. is a research and manufacturing organization specializing in machine vision applied technologies. The LMI brands include FireSync, Sensors That See, HexSight, and maestro.

MVTec, LLC One Broadway, Fl 14 Cambridge, Massachusetts 02142 Phone: 617-401-2112 Fax: 617-401-3617 Email: eisele@mvtec.com Web: www.mvtec.com Contact: Heiko Eisele

Radix Controls Inc. 2105 Fasan Drive Oldcastle, Ontario N0R 1L0 Canada Phone: 519-737-1012 Fax: 519-737-1810 Email: info@radixcontrols.com Web: www.radixcontrols.com Contact: Ross Rawlings

For over 15 years, Radix Controls Inc. has been providing North American manufacturers with the high-tech tools they need to keep their production lines competitive. Our vision experts specialize in vision inspection design and integration in the automotive, food & beverage, pharmaceutical & packaging markets. We also recently won the Product Innovation Award from the Windsor Chamber of Commerce for one of our proprietary vision products – Tool Tracker.

Schneider Optics, Inc. 285 Oser Avenue Hauppauge, New York 11788 Phone: 631-761-5000, x204 Fax: 631-761-5090 Email: industrial@schneideroptics.com Web: www.schneideroptics.com Contact: Stuart Singer

Stäubli Robotics 201 Parkway West Duncan, South Carolina 29334 Phone: 864-486-1980 Fax: 864-486-5497 Email: d.arceneaux@staubli.com Web: www.staubli.com Contact: David Arceneaux

MVTec provides standard software for machine vision applications including algorithms for 2D and 3D vision-based robotics guidance.
Nachi Robotic Systems Inc. 22285 Roethel Drive Novi, Michigan 48375 Phone: 248-305-6542 Fax: 248-605-6542 Email: marketing@nachirobotics.com Web: www.nachirobotics.com Contact: Karen Lewis

Nachi Robotic Systems Inc. provides successful robotic solutions for several applications including: spot welding, arc welding, sealing, dispensing, material handling, machine loading and unloading, buffing, palletizing, assembly, roller hemming, die-casting, deburring, and press-to-press handling. Nachi robots can handle load capacities from 5 to 700 kg. Nachi is a full-service supplier and certified to ISO 9001:2000.

Northwire, Inc. 110 Prospect Way Osceola, Wisconsin 54020 Phone: 715-294-2121 Fax: 715-294-3727 Email: cableinfo@northwire.com Web: www.northwire.com Contact: Ken Anderson

Northwire Endurance™ Vision assemblies – the most rugged assemblies for vision system applications. Northwire has been producing high quality industrial grade cable for over 36 years. That standard of quality has gone into our vision cable assemblies. The high-quality connectors and Northwire’s advanced, industrial-grade cables provide ultra-reliable interconnectivity in motion and vision system applications, which include CCXC Analog Video, MVC-800 FireWire, GEV1000™ GigE Vision™ and Camera Link® cable assemblies.

Pilz Automation Safety L.P. 7150 Commerce Boulevard Canton, Michigan 48187 Phone: 734-354-0275 Fax: 734-354-3355 Email: info@pilzusa.com Web: www.pilz.com Contact: Customer Service

Stäubli is a mechatronics solution provider with three dedicated divisions: textile machinery, connectors and robotics. Founded in 1892, Stäubli is known worldwide for the quality of its methods and processes. Featuring high productivity and precision, Stäubli robots offer solutions for all industries. The comprehensive product range includes small 4-axis SCARA robots as well as 6-axis medium- to heavy-duty robots with payloads ranging from 1 kg to 250 kg, featuring superior quality and performance.
StockerYale, Inc. 275 Kesmark Montreal, Quebec H3M 1R2 Canada Phone: 514-685-1005 Fax: 514-685-3307 Email: lasers@stockeryale.com Web: www.stockeryale.com Contact: Customer Service

Schneider Optics designs, develops, and manufactures high-performance lenses for machine vision, robotics, document scanning, industrial inspection and metrology, gauging, military, surveillance, & other image processing applications. Standard products include Compact C-mount lenses, Bilateral Telecentric lenses, a modular Macro system, large format lenses (area & line scan), 3-CCD lenses and industrial filters. Custom lens solutions are also available. Key markets include Machine Vision, Robotics, Document Scanning, Industrial Inspection, 2D/3D Metrology, Surveillance, & Hyperspectral Imaging.
SICK, Inc. 6900 W 110th Street Minneapolis, Minnesota 55438 Phone: 952-941-6780 Fax: 952-941-9287 Email: brian.mcmorris@sick.com Web: www.sickusa.com Contact: Brian McMorris

Whether safeguarding robot assembly areas or inspecting finished product, companies can count on SICK for innovative products and top-notch expertise to deliver a wide range of product application solutions. Products from SICK initiate, inspect, confirm, monitor, and safeguard the movement of product in industries that use robotics for automation. With the customer as our focus and innovation as our guide, SICK is equipped to deliver unique and superior products to the robotics industry.

StockerYale, Inc. is an independent designer and manufacturer of structured light lasers, LED modules and fluorescent illumination products, as well as phase masks and specialty optical fibers for use in a wide range of markets and industries including machine vision, industrial inspection, telecommunications, military, utilities, and medical.
Tectivity, Inc. 3099 Tall Timbers Milford, Michigan 48380 Phone: 248-676-9797 Fax: 248-676-9796 Email: info@tectivity.com Web: www.tectivity.com Contact: Jon Heywood

Pilz Automation Safety L.P. manufactures and offers a complete line of safe automation solutions and control products. The line includes safety relays for automation applications, safety and general-purpose PLCs, lockout/tagout systems utilizing safety controls, motion control systems, monitoring relays, touch screen HMIs, e-stop pushbuttons, safety sensors, two-hand enabling devices and light curtains. Certified engineers and qualified consultants are available to design systems, manage projects, perform risk assessments, perform machine/plant reviews, install equipment and train personnel.

Manufacturer of the VideoModule, LEDModule, and Laser-Module family of protective enclosures for robot applications. Also, we are a distributor of lighting, lensing, CCD cameras, filters, cables, etc.

United Sales & Services Inc. 32549 Schoolcraft Road Livonia, Michigan 48150 Phone: 734-522-8100 Fax: 734-522-0818 Email: rweber@ussvision.com Web: www.ussvision.com Contact: Ron Weber

USS United Sales & Services Inc. was established in 1990 and has become the largest, most diverse total turnkey integrator in North America. USS is the largest integrator of DVT/Cognex products. Our success has enabled us to obtain a global blanket agreement with General Motors, providing total turnkey vision error-proofing solutions. USS has developed with Cognex the USS Exact Scan, which is capable of inspecting an entire vehicle for errors at end of assembly. USS has also developed the USS Tracker, in conjunction with Shafi technologies, which mounts on the end of any robot and reads the entire bead in real time. These are a few of the capabilities we provide to all of our clients in the most efficient and cost-effective manner.
Valentine Robotics, Inc. 36625 Metro Court Sterling Heights, Michigan 48312 Phone: 586-979-9900 Fax: 586-979-9901 Email: andy@valentinerobotics.com Web: www.valentinerobotics.com Contact: Andrew Valentine

WireCrafters LLC 6208 Strawberry Lane Louisville, Kentucky 40214 Phone: 800-626-1816 Fax: 502-361-3857 Email: bsemones@wirecrafters.com Web: www.wirecrafters.com Contact: Butch Semones Supplier of physical barriers for robotic work cells along with value adds such as weld curtains and interlocks.

Valentine Robotics is the North American distributor of Scorpion Vision robot guidance and machine vision software. We deliver turnkey robot and vision systems for all application types. We offer machine vision software, components, kits, studies and integration. Visit www.valentinerobotics.com. Free trial software and integrator opportunities are available!
VMT-Pepperl + Fuchs, Inc. 3600 Green Court Suite 490 Ann Arbor, Michigan 48105 Phone: 269-823-4650 Fax: 330-486-0288 Email: vmt-info@us.pepperl-fuchs.com Web: www.vmt-gmbh.com Contact: Todd Belt

The Pepperl+Fuchs VMT group has over 20 years of success in applying complete turnkey systems for industrial image processing applications. System solutions are based on self-developed software products adaptable to clients’ specific needs. With an easy-to-use test and calibration process along with multiple redundancies, we can customize a solution to increase safety, improve quality, speed up production, and reduce costs.

Conference Speakers
Mr. Robert Anderson New Technology Manager, Advanced Manufacturing Engineering Chrysler LLC 800 Chrysler Drive, CIMS 482-04-16 Auburn Hills, Michigan 48326 Phone: 248-944-6076 Fax: 248-841-6272 Email: ra2@chrysler.com

Mr. David Arceneaux Business Development – Marketing Manager Stäubli Corporation – Robotics Division 201 Parkway West PO Box 189 Duncan, South Carolina 29334 Phone: 864-486-5416 Fax: 864-486-5497 Email: d.arceneaux@staubli.com

Mr. David Dechow President Aptúra Machine Vision Solutions 3130 Sovereign Drive Suite 5A Lansing, Michigan 48911 Phone: (517) 272-7820, x11 Fax: (866) 575-1583 Email: ddechow@apturavision.com

Mr. René Dencker Eriksen Chief Technology Officer Scape Technologies Kochsgade 31 C, 3. sal DK-5000 Odense C Denmark Phone: +45 70 25 31 13 Fax: +45 70 25 31 14 Email: rde@scapetechnologies.com

Mr. Greg Garmann Software & Controls Technology Leader Motoman, Inc. 1050 Dorset Road Troy, Ohio 45373 Phone: 937-440-2668 Fax: 937-440-2626 Email: greg.garmann@motoman.com

Mr. Babak Habibi President & CTO Braintech Inc. 102 - 930 West 1st Street North Vancouver, British Columbia V7P 3N4 Canada Phone: 604-988-6440 Fax: 604-986-6131 Email: bhabibi@braintech.com

Mr. Eric Hershberger Senior Engineer Applied Manufacturing Technologies 219 Kay Industrial Drive Orion, Michigan 48359 Phone: 248-409-2000 Fax: 248-409-2027 Email: ehershberger@appliedmfg.com

Mr. John Keating Product Marketing Manager Cognex Corporation 1 Vision Drive Natick, Massachusetts 01760 Phone: 508-650-3000 Fax: 508-650-3338 Email: john.keating@cognex.com

Mr. Jens Kuehnle Research Associate Fraunhofer Institute Manufacturing Engineering and Automation (IPA) Nobelstrasse 12 70569 Stuttgart Germany Phone: +49 711 970 1861 Fax: +49 711 970 1004 Email: kuehnle@ipa.fraunhofer.de

Mr. Jerry Lane Great Lakes Office Director Applied Research Associates 48320 Harbor Drive Chesterfield Township, Michigan 48047 Phone: 586-242-7778 Fax: 802-728-9871 Email: glane@ara.com

Mr. Eric Lewis President Flexomation, LLC 586 Northland Boulevard Cincinnati, Ohio 45240 Phone: 513-825-0555 Fax: 513-825-1870 Email: info@flexomation.com

Mr. Frank Maslar Technical Specialist Ford Motor Company 6100 Mercury Drive Dearborn, Michigan 48239 Phone: 313-805-3904 Email: fmaslar@ford.com

Mr. Michael Muldoon Business Solutions Engineer AV&R Vision & Robotics Inc. (Averna Vision & Robotics) 269 Rue Prince Montreal, Quebec H3C 2N4 Canada Phone: 514-788-1420 Fax: 514-866-5830 Email: michael.muldoon@avr-vr.com

Mr. Mark Noschang Manager of Applications Engineering for North America Adept Technology, Inc. 11133 Kenwood Road Cincinnati, Ohio 45242 Phone: 513-792-0266, x106 Fax: 513-792-0274 Email: mark.noschang@adept.com

Mr. Steven Prehn Senior Product Manager – Vision FANUC Robotics, Inc. 3900 W. Hamlin Road Rochester Hills, Michigan 48309 Phone: 248-276-4065 Email: steven.prehn@fanucrobotics.com

Mr. Bob Rochelle North American Sales Manager Kawasaki Robotics (USA), Inc. 28140 Lakeview Drive Wixom, Michigan 48393 Phone: (248) 446-4211 Fax: (248) 446-4200 Email: bob.rochelle@kri-us.com

Mr. Adil Shafi President SHAFI Innovation, Inc. 8060 Kensington Court Brighton, MI 48116-8520 Phone: (248) 446-8200 Fax: (248) 446-8282 Email: adil.shafi@shafiinc.com

Ms. Jane Shi Senior Research Scientist General Motors Corporation 30500 Mound Road MC 480-106-359 Warren, MI 48090-9040 Phone: 586-986-0353 Fax: 586-986-0574 Email: jane.shi@gm.com

Mr. Kevin Taylor Vice President ISRA VISION SYSTEMS, Inc. 3350 Pine Tree Road Lansing, Michigan 48911 Phone: 517-887-8878 Fax: 517-887-8444 Email: ktaylor@isravision.com

Mr. James Wells Senior Staff Research Engineer General Motors Corporation 30500 Mound Road Warren, Michigan 48090 Phone: 810-602-9879 Fax: 856-856-0574 Email: james.w.wells@gm.com

Mr. Steven West Development Manager – Robotic Vision Technology ABB, Inc. 1250 Brown Road Auburn Hills, Michigan 48326 Phone: 248-393-7120 Fax: 248-391-8532 Email: steven.w.west@us.abb.com

Mr. Brian Windsor Business Development Manager – Machine Vision SICK, Inc. 6900 West 110th Street Minneapolis, Minnesota 55438 Phone: 952-941-6780 Fax: 952-941-9287 Email: brian.windsor@sick.com

Mr. David Wyatt Staff Engineer Applied Manufacturing Technologies 219 Kay Industrial Drive Orion, Michigan 48359 Phone: 248-409-2073 Fax: 248-409-2027 Email: dwyatt@appliedmfg.com

ICVGR Conference Agenda
Tuesday, September 30, 2008
7:00 am to 8:00 am – Registration and Continental Breakfast
8:00 am to 8:15 am – Conference Overview and Introductory Remarks
8:15 am to 9:45 am – The Basics of Robotics; Bob Rochelle, North American Sales Manager, Kawasaki Robotics (USA), Inc.
9:45 am to 10:00 am – Break
10:00 am to Noon – The Basics of Machine Vision; David Dechow, President, Aptúra Machine Vision Solutions
Noon to 1:30 pm – Group Luncheon
1:30 pm to 5:00 pm – Successfully Integrating Vision Guided Robotics; David Dechow, President, Aptúra Machine Vision Solutions
Evening – Optional Group Dinner

Wednesday, October 1, 2008
7:30 am to 8:30 am – Registration and Continental Breakfast
8:30 am to 8:45 am – Review of Day One and Preview of Day Two; Moderator: Frank Maslar, Advanced Manufacturing Technology Development, Ford Motor Company
8:45 am to 9:30 am – Technology Advances in 2D Vision Guided Robotics; John Keating, In-Sight Product Manager, Cognex Corporation
9:30 am to 10:15 am – Top Lessons Learned in Vision Guidance Applications; Eric Hershberger, Senior Engineer, & David Wyatt, Staff Engineer, Applied Manufacturing Technologies
10:15 am to 10:30 am – Break
10:30 am to 11:00 am – How Advancements in Vision Guidance Are Making Flexible Feeding Applications Desirable; Eric Lewis, President, Flexomation
11:00 am to 11:30 am – Vision Guided Robot Applications for Packaging & Flexible Feeding; Mark Noschang, Applications Engineer, Adept Technology
11:30 am to 1:30 pm – Group Lunch and Tabletop Exhibits: See the offerings of leading vision and robotics companies from around the world who can assist you with your specific needs.

ICVGR Conference Agenda
Wednesday, October 1, 2008
1:30 pm to 2:15 pm – High Accuracy Robot Calibration, Wireless Networking, and Related Technical Issues; Eric Hershberger, Senior Engineer, & David Wyatt, Staff Engineer, Applied Manufacturing Technologies
2:15 pm to 2:45 pm – Vision Based Line Tracking; Frank Maslar, Advanced Manufacturing Technology Development, Ford Motor Company
2:45 pm to 3:00 pm – Break
3:00 pm to 3:30 pm – Case Study: Robots & Vision in the Automated Pharmacy; David Arceneaux, Business Development & Marketing, Stäubli Robotics
3:30 pm to 4:00 pm – Unmanned Systems Intelligence, Vision and Automation Concepts for Combat Engineer and Other Battlefield Missions; Jerry Lane, Director, Great Lakes Office, Applied Research Associates, Inc.
4:00 pm to 5:15 pm – Tabletop Exhibit Viewing and Reception: Your chance to spend more time with the exhibitors while enjoying refreshments and networking with your peers.
6:00 pm to 10:00 pm – Exclusive Dinner/Comedy Event

To maximize networking, on October 1st attendees will be transported by bus for dinner at one of metro Detroit’s finest Italian restaurants, Andiamo. This will be immediately followed by a comedy show at The Second City, whose unique brand of social and political satire mixed with improvisation has delighted audiences for over 45 years. Sports comedy will be the theme – be prepared for raucous laughs!

Thursday, October 2, 2008
7:30 am to 8:30 am – Continental Breakfast
Moderator: Frank Maslar, Advanced Manufacturing Technology Development, Ford Motor Company
8:30 am to 9:15 am – International Trends and Applications in 3D Vision Guided Robotics; Adil Shafi, President, SHAFI Innovation Inc.
9:15 am to 9:45 am – Advances in 3D Vision Guided Robotics at Fraunhofer IPA; Jens Kuehnle, Research Associate, Fraunhofer IPA
9:45 am to 10:15 am – Vision Guided Part Loading/Unloading from Racks for Automotive Applications – Lessons Learned; Robert Anderson, New Technology Manager, Chrysler LLC
10:15 am to 10:30 am – Break
10:30 am to 11:00 am – Random Bin Picking Technical Challenges and Approach; Babak Habibi, CTO, Braintech Inc.

ICVGR Conference Agenda
Thursday, October 2, 2008
11:00 am to 11:30 am – Random Bin Picking Applications/Solutions; Steven West, Business Development Manager, ABB Inc.
11:30 am to Noon – The Need for Generic 3D Bin Picking; René Dencker Eriksen, CTO, Scape Technologies
Noon to 1:00 pm – Group Luncheon
1:00 pm to 1:45 pm – Robot Visual Servoing – Opportunities and Challenges Ahead; Jane Shi, Senior Research Scientist, & James Wells, Senior Staff Research Engineer, General Motors Corporation
1:45 pm to 2:15 pm – 3D Robot Guidance for Cosmetic Sealer Applications; Kevin Taylor, Vice President, ISRA Vision Systems, Inc.
2:15 pm to 2:45 pm – Combining Machine Vision and Robotics to Mimic Complex Human Tasks; Michael Muldoon, Business Solutions Engineer, Averna Vision & Robotics
2:45 pm to 3:00 pm – Break
3:00 pm to 3:30 pm – Using 3D Laser Scanning for Robotic Guidance; Brian Windsor, Business Development Manager, SICK, Inc.
3:30 pm to 4:00 pm – Vision Options for “Dual Arm” Robot Guidance; Greg Garmann, Software & Controls Technology Leader, Motoman Inc.
4:00 pm to 4:30 pm – Distance, Pitch & Yaw from a 2D Image; Steve Prehn, Senior Product Manager – Vision, FANUC Robotics America
4:30 pm to 5:00 pm – VGR Panel Discussion: Your opportunity to ask specific questions and get insight from this experienced panel of VGR leaders.

Thank you to our Corporate Sponsors:

Thank you to our Media Sponsors:

The Basics of Robotics

Presented by:

Bob Rochelle Kawasaki Robotics USA

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Bob Rochelle North American Sales Manager Kawasaki Robotics USA

Bob Rochelle Kawasaki Robotics USA 28140 Lakeview Drive Wixom, Michigan 48393 Phone: 248-446-4211 Fax: 248-446-4200 Email: bob.rochelle@kri-us.com
Bob Rochelle has a Bachelor’s and Master’s degree in Engineering from Virginia Tech and holds numerous US and International patents in the automation and food packaging fields. He has been in the Automation Industry for over 25 years and has held positions as Design Engineer, Project Manager, R & D Engineer, Engineering Manager, Sales Engineer and Sales Manager. He is currently the North American Sales Manager at Kawasaki Robotics with responsibility for robot and system sales through a direct sales staff or via an Integrator Network located throughout North, South and Central America. Bob is a veteran seminar speaker and has taught General Engineering, Project Management and Robotics for Baker College in Southeast Michigan. He is also the Chair for the RIA’s New Markets Committee.

The Basics of Robotics
Bob Rochelle North American Sales Manager Kawasaki Robotics

References
Robotic Industries Association – www.robotics.org
Kawasaki Robotics (USA) Inc. – www.kawasakirobotics.com
Denso Robotics – www.densorobotics.com
Advance Products Corp – www.advanceproductscorp.com
Practical Robotics Services – www.prsrobots.com
TDI Covers – www.tdicovers.com
Adept Technology – www.adepttechnology.com
PAR Systems, Inc – www.par.com
Conveying Industries Inc – www.conveyind.com
ANSI / RIA – Standard R15.06-1999
Handbook of Industrial Robotics – Edited by Shimon Y. Nof
The Top 10 Application Mistakes – Article by George Martin

Outline
Flexible Automation
The Robot Industry – Yesterday, Today, Tomorrow
Terms and Types of Robots
Basic Robot Technology – Mechanical, Controls, Programming
Tooling
Robots in Systems
Robot Based Systems
Vision
Examples – Case Studies
Final Thoughts

Robotics = Flexible Automation
Manual
Quick product change
Breaks
Monotonous tasks
Health claims

Dedicated Automation
High volume
Requires set-up time
More maintenance
Air cylinders / actuators
Rigid conveyors / fixtures

Flexible Automation
Quick product change
Programmable
Higher initial system cost
Repeatability by fixtures
Changeable cell configuration
Responds to part changes

The Robot Industry

History Today Tomorrow General Terms Types of Industrial Robots

First “Robots”
Steam Man and Electric Man
Robota – Czech word for “forced labor” or “serf”

Karel Capek - Rossum’s Universal Robots
Written in 1920, Premiered in Prague in 1921 Translated into English and performed in New York in 1923

Isaac Asimov
Coined the word “Robotics”
In the 1950s wrote the Robot Series – part of the Foundation Series
Drafted the Three Laws of Robotics

Today’s Industrial Robots
People
George Devol
Joseph Engelberger – “Father of Robotics”

History
1956 – George Devol & Joseph Engelberger met and began development work on the first commercial robot
Late 1956 – First working model
1961 – First installation: GM die cast part extractor
Patented in 1961; formed Unimation

Early Industrial Robots
Unimation – “Universal Automation”
Unimate Robot
4,000 lb arm
Step-by-step commands stored on a magnetic drum
Hydraulic actuators
$100,000-plus price

Puma
Programmable Universal Machine for Assembly

Robot Industry - Today
Over 850,000 at work today
Over 100,000 sold per year
Revenue
$5,000,000,000 – robots
$15,000,000,000 – systems

Growth rate greater than 18% yearly
Largest Users
Automotive – 47%
Electronics – 15%

Major Applications
Material Handling – 39%
Welding – 30%
Assembly – 8%

Robots Today
90% of industries that could use robotic automation have yet to consider purchasing their first robot.

Applications
Spot Welding Arc Welding Coating & Dispensing
Less than 10 pounds Greater than 10 pounds

Assembly
Less than 10 pounds Greater than 10 pounds

Material Handling
Packaging / Palletizing Machine Tending Body Shop Other Material Handling

Material Removal Inspection
Defined by Robotics Industry Association

New Markets and Applications
Service Industry
RoboBar, Food Service
Care for the Elderly
Humanoids

Medical and Pharmaceutical Industries
Prescription Dispensers
Lab Automation
Surgery Systems – Doctor Guidance
Prosthetics
Research and Design

Construction
Manufactured Housing

Machining

Flexible Manufacturing

Terms and Types of Robots

Common Industry Terms and Concepts Various Types of Industrial Robots

General Terminology
Work Envelope, Work Space or Reach
The set of points representing the maximum extent or reach of the robot hand or working tool in all directions. Also referred to as the working envelope or robot operating envelope. All encompassing range of motion

Payload
The maximum total weight that can be applied to the end of the robot arm without a sacrifice of any of the applicable published specifications of the robot. Weight carrying capacity

Cycle Time or Speed
Execution time for one task

The Axes – Degrees of Freedom
Degrees of Freedom - Axes
One of a limited number of ways in which a robot joint may move.

Joint 1 - Base Rotation
Joint 2 - Rotation of the lower arm
Joint 3 - Rotation of the upper arm
Joint 4 - Swivel of the upper arm
Joint 5 - Bend of the wrist
Joint 6 - Rotation of tool mounting plate
Joint 7 - Traverse, Turntable, or other motions

Coordinates
Base or World - Origin is in the robot base
Tool Coordinates - Origin is the Tool Center Point
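The distinction between base (world) and tool coordinates can be illustrated with a small sketch. This is not any vendor's controller code; the function and names are hypothetical, and the example is reduced to the planar (X, Y, rotation) case:

```python
import math

def tool_to_base(tool_point, tcp_pose):
    """Convert a point expressed in tool coordinates to base (world)
    coordinates for the planar case.

    tcp_pose = (x, y, theta): position of the Tool Center Point and the
    tool rotation, both expressed in the base frame.
    """
    tx, ty, theta = tcp_pose
    px, py = tool_point
    # Rotate the tool-frame point by theta, then translate by the TCP position.
    bx = tx + px * math.cos(theta) - py * math.sin(theta)
    by = ty + px * math.sin(theta) + py * math.cos(theta)
    return (bx, by)

# A point 100 mm out along the tool X axis, with the TCP at (500, 200) mm
# in base coordinates and the tool rotated 90 degrees:
print(tool_to_base((100, 0), (500, 200, math.pi / 2)))  # -> approx (500, 300)
```

The same idea extends to 6 axes with 4×4 homogeneous transforms; the planar case just keeps the rotation-then-translation structure visible.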

Multiple Axis System
Axis 7 - Turntable

Axis 1 to 6 - Robot

Axis 8 and 9 – Part Rotators Axis 10 and 11 – Part Spinners

Common Industrial Robots
Cartesian / Gantry SCARA Telescopic Parallel Articulated Modular

Cartesian / Gantry Robots
Four Plus Axes
Simple Motions – Linear X, Y, Z; Tool Rotation
Components – Base / Superstructure, Arm / Runway, Telescope / Carriage, Controls

Packaging / Machining / Water Jet Cutting / Palletizing

SCARA Robots
Four Degrees of Freedom / Advanced Control
One Linear Axis and multiple rotary axes

Motions
Rotational Linear Z Axis
300 mm 25 mm

Highly Accurate
± 0.015 mm

Fast and Vibration Free
Adept Cycle: 0.30 – 0.35 seconds

Packaging / Assembly / Insertion

Telescopic Robots

Clean Room applications 3, 4 and 5 Axis designs Specific to Application
Wafer Handling Systems Flat Panel Screens Semi Conductor Industry – Clean Room

Parallel Robots
Tripod with three axes Hexapod with six axes Very Stiff Accurate High Speed

High Speed Pick and Place

Articulated Robots
Most Common / Most Flexible 4, 5 or 6 Degrees of Freedom
Rotational Motions

Modular Robots
System with a combination of robot types

Beyond Industrial Robots?

Robot Technology
• Robot Mechanical Components
• Robot Controls
• Robot Programming
• Robot Communication

Controller

Arm or Manipulator
[Slide figure: an articulated arm with callouts for the Joint 1 and Joint 2 motors at the base, the Joint 3 motor in the rear, the Joint 4, 5 & 6 motors, the arms, wrist, counterbalance, and fork lift pockets.]

Mounting and Environment
Mounting
Floor, Ceiling or Walls Proper Fasteners - no Casters Tracks or Traverse Units

Typical Environmental Specifications
IP65 / IP67 Standard
Ambient Temperature: 0 – 52 °C
Relative Humidity: 35% – 85%, Non-Condensing
Optional: Clean Room / Washdown
Hazardous Duty Units – Spray Painting

Robot Controllers
Two Components: Controller and Teach Pendant
Design: Microprocessor-based, Programmable
Generally One Controller per Robot; Multi-Controllers available

Teach Pendant
Design
Hand-held
Programmer’s Interface to Robot Controller and Programs
LCD Display
Hard Keys for Functions / Keyboard

Functions
Communicates with Controller
Deadman Switches
E-Stop
Monitor
Teaching / Programming
User Interface to Robot
Operator’s System Interface Possibility

Communication and Networks
Discrete I/O – Photocouplers, relays, transistors; relay module add-ons; remote I/O to PLC’s
DeviceNet – Master, Slave, Master & Slave
Profibus – Master, Slave, Master & Slave
Interbus
Ethernet – TCP/IP, I/O adaptor
RS232 / RS485
Internet / Intranet

Programming
Teach Pendant
Programmer holds the teach pendant Manually teaches the robot

Off Line Programming
Teach Pendant Programming

Program written remotely
Higher level language
Loaded into Robot Controller
Touch up required
No additional hardware is needed.

Check Programs
Slow speed operation

Program Storage
Flash RAM PC Hard Drive Other media

PC Programming

Basic Robot Motion Teaching
Motion Instruction
Defines a target position

Interpolation Instruction
Defines how to get to the position Joint Move - Robot articulates any axis to accomplish the move Linear Move - Maintains the tool in the orientation specified Circular Move - Generated by defining three points and a radius to scribe a circle

Speed
Expressed in percent of full speed or a software settable maximum speed.

Termination Instruction
Defines the approach to the target position; expressed as a number [1 - 9], most to least accurate.
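To illustrate the difference between the interpolation types above, here is a minimal sketch of the straight-line path a Linear Move maintains. The helper name is hypothetical and this is not any vendor's robot language; a real controller interpolates continuously and solves joint angles for every point along the path:

```python
def linear_move(start, target, steps):
    """Generate evenly spaced Cartesian waypoints from start to target.

    A Linear Move keeps the tool on this straight line; a Joint Move, by
    contrast, interpolates each axis independently, so the tool path
    between the same two positions is generally curved.
    """
    return [
        tuple(s + (t - s) * i / steps for s, t in zip(start, target))
        for i in range(steps + 1)
    ]

# Five waypoints from (0, 0, 0) to (100, 50, 0) mm:
path = linear_move((0.0, 0.0, 0.0), (100.0, 50.0, 0.0), 4)
print(path[2])  # midpoint -> (50.0, 25.0, 0.0)
```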

Additional Programming Activities
Activities to be completed before moving to the next target position
I/O switching
Data acquisition

Repeatability
The ability of the robot to return to a preprogrammed position; the closeness of agreement of repeated position movements, under the same conditions, to the same location.

Assume repeatability to be ±0.004″. The robot can position anywhere within the resulting 0.008″-diameter circle centered on the taught point and still fall within its repeatability specification.
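That geometric claim is easy to check numerically. A minimal sketch with hypothetical names, for the 2D case:

```python
import math

REPEATABILITY = 0.004  # inches; the +/- radius around the taught position

def within_repeatability(taught, actual, spec=REPEATABILITY):
    """True if the achieved position lies inside the repeatability circle
    (diameter 2 * spec = 0.008 in) centered on the taught position."""
    dx = actual[0] - taught[0]
    dy = actual[1] - taught[1]
    return math.hypot(dx, dy) <= spec

# 0.0036 in from the taught point: inside the +/-0.004 in spec.
print(within_repeatability((10.000, 5.000), (10.003, 5.002)))  # True
# 0.005 in away: outside the spec.
print(within_repeatability((10.000, 5.000), (10.005, 5.000)))  # False
```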

Robots In Systems
Who’s Who in Robot System Industry Tooling Control Systems Systems Vision Safety

Who’s Who in the Robotics Industry
Robot Manufacturers
Manufactures the robot Provides robot training, maintenance and service

System Integrator [System Builder]
Integrate the robot into a system to perform a specified task
Independent business, industry specific, some allegiance to robot manufacturer Has knowledge of End User’s business

Designs and builds the robot based system
Purchases robot and all peripheral equipment Designs and builds systems, writes and maintains programs Trained on entire cell / provides training on system

Provides system components, installation, training, service and support

End Users
Uses the robotic-based system in production or processing Knows what is required to accomplish tasks Ultimate user - needs training, service, maintenance, spare parts

Tooling / End Effectors / E.O.A.T
The tool attached to the robot manipulator or arm that actually performs the work. Examples
Vacuum Cups, Grippers, Spatulas / Fingers, Spray Nozzles, Dispensers, Buffing Wheels, Machine Tools, Water Jets, Welding Torches / Resistance Welding Guns, Saws, Laser Cutters, Ladles

Adds to the Work Envelope Adds to the Payload / Torque / Inertia

Tooling Considerations
Parts Fixtures – Repeatable and Positive
Sensors – Part Locators / Verification of Action / QC
Tool Changers – Quick Change / Machine Set-up
Environmental Considerations
No Parts Fixture? Can Locate

Do I move the part ? Do I work on a stationary part?

System Control Philosophy
Philosophy 1
Robot Controller does all System I/O, Tooling Control, Motion Control, Operator Interface

Philosophy 2
Robot Controller – Tooling Control, Motion Control
PLC or PC – System I/O, Operator Interface

Philosophy 3
Robot Controller – Motion Control only
PLC or PC – System I/O, Tooling Control, Operator Interface

Robot System Safety
Responsibility:
Robot Manufacturer
Integrator / System Builder / Installer
User

Refer to Resources:
ANSI / RIA R15.06-1999
OSHA Standards
CUL / UL [Underwriters Laboratories]
Hazardous materials requirements
Local Codes
Good manufacturing practices
Plant Standards
Personnel training policies

ANSI/RIA R15.06-1999
Robotic Industries Association Ann Arbor, Michigan 48106 (734) 994-6088 www.robotics.org

System Development Process
Identify the System Specifications
What do you want to do? Existing Process, Reach, Payload, Speed, Operator Involvement, QC Issues, Interface with Production System, Technological Capability of User
Who is going to integrate the system? End User, Integrator, Robot Manufacturer, or a Combination

System Design and Build
Preliminary Layouts and Design Proposal Space Required, Parts Movement, Tooling, Safety Concerns, I/O, Interfaces and Communication, Operator Involvement Simulations / Cycle Time Study / Verification Tests Build and test the system prior to shipment

System Start Up and Commissioning
Installation, Start-up and Customer Acceptance Continuous Improvement

Industrial Robot Systems
System Components
Robot and Controller
Arm Dressing and Risers
End of Arm Tooling
Parts Fixtures or Locators
Interfaces – Pneumatics, Sensors, Electrical Components, Cables
Peripheral Equipment – Varies by application
PLC or External Control – Communication via Network or Discrete I/O
Safety Components – Fence, Gates, Interlocks, Light Curtains, Barriers, Awareness Beacons

Selecting a Systems Integrator
Determine if the Integrator has experience in your industry
Transferable knowledge

Evaluate the Integrator’s background and capabilities
Full Service Commercial Issues

Check references
The Integrator’s Robot Manufacturers

Prepare for disaster
What happens?

After sale maintenance
Integrator / Robot manufacturer

Cost
Is the lowest bid the best?

Vision Systems
Peripheral Equipment
Camera
Camera Controller
Light Source
Calibration Check Means

Robot Components
Robot and Controller
Interface to Camera Controller
Software

Applications
Part Location
Inspection
Bin Picking
Real Time Feedback

Bin Picking

Locating or Orientating Parts
Cameras

Camera Parts Rack

Robot Guidance

Real Time
Welding Seam Sealing Dispensing

10 Reasons to Invest in Robotics
1. Reduce Operating Costs 2. Improve Product Quality and Consistency 3. Improve Quality of Work Environment for Employees 4. Increase Production Output Rates 5. Increase Product Manufacturing Flexibility 6. Reduce Material Waste and Increase Yield 7. Comply With Health and Safety Standards 8. Reduce Labor Turnover and Difficulty Recruiting Workers 9. Reduce Capital Costs like Inventory, WIP 10. Save Space in High Value Manufacturing Areas

Expect System Reliability > 99.5%

The Green Sand Casting Process
Green Sand Casting Process
Create the mold – a mixture of sand, clay and moisture: simple, low-cost materials that can be reused or regenerated
Pour molten metal into the molds
Remove the parts
Machining or clean-up is required

Green Sand Cast Parts
Require surface finish Lowest cost casting process

Labor intensive process
Automated mold creation
Recently automated pouring process
Manual parts removal

Robotic Pouring
Customer’s Results
Four times the capacity (impeded by peripheral equipment) – one part every 30 seconds
Reduced labor by three per shift
Energy reduction – automatic furnace lid closure provides insulation
Operator safety is vastly improved
Reduced material use – the same quantity for every part
Parts consistency – 100% reliable, repeatable process
Increased parts quality – more consistent metal heat and pour efficiency

10+ Mistakes in Robot Integration
1. Underestimating Payload and Inertia.
2. Expecting the Robot to Do Too Much.
3. Underestimating Cable Management Issues.
4. Not Considering All Current and Future Application Needs.
5. Misunderstanding Accuracy and Repeatability.
6. Focusing on the Robot Alone.
7. Not Planning for Disaster.
8. Overlooking the Need for Options or Peripheral Equipment for a System.
9. Not Fully Utilizing the Capabilities of a Robot.
10. Choosing a Robot or System Solution Solely on Price.
11. Thinking That Robots Are Too Complicated.
12. Failure to Consider Using Robotic Technology.
Expect System Reliability > 99.5%

Applications
Welding – Spot Welding, Plasma Welding
Coating & Dispensing – Glue Dispensing, Paint
Packaging / Palletizing – Bag Palletizing, Box Palletizing, Muffin Packaging
Material Handling – Press Tending, Forging, Investment Casting, Machine Tending, Die Casting

Defined by Robotic Industries Association

Contact Information

Bob Rochelle
North American Sales Manager

Kawasaki Robotics
28140 Lakeview Drive Wixom, Michigan 48393 USA Telephone: 248-446-4211 Email: bob.rochelle@kri-us.com Web: www.kawasakirobotics.com

The Basics of Machine Vision

Presented by:

David Dechow Aptúra Machine Vision Solutions

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

David Dechow President Aptúra Machine Vision Solutions

David Dechow Aptúra Machine Vision Solutions 3130 Sovereign Drive, Suite 5A Lansing, Michigan 48911 Phone: 517-272-7820, x11 Email: ddechow@aptura.com
David Dechow is president and founder of Aptúra Machine Vision Solutions, LLC. Mr. Dechow has worked in the field of machine vision for over 25 years as a programmer, engineer, and manager. He served 14 years on the AIA board of directors, and was a two term president of that board. Mr. Dechow is the 2007 recipient of the AIA Automated Imaging Achievement Award honoring industry leaders for outstanding contributions in industrial and/or scientific imaging. Mr. Dechow is a regular speaker at conferences and seminars, and a frequent contributor to industry trade journals and magazines and has served on the editorial boards of Vision Systems Design magazine and Quality Magazine’s Vision and Sensors.

The Basics of Machine Vision
David Dechow President Aptúra Machine Vision Solutions

Session Outline

1 2 3 4

Overview/Introduction to Machine Vision

Imaging, Lighting and Optics

Machine Vision – Getting Data From Images

Application Analysis and Specification

The Basics of Machine Vision

MACHINE VISION INTRODUCTION

Overview
• Machine Vision
– Machine vision is the substitution of the human visual sense and judgment capabilities with a video camera and computer to perform an inspection task. It is the automatic acquisition and analysis of images to obtain desired data for controlling or evaluating a specific part or activity. – Key Points:
• Automatic – self-acting
• Acquisition and analysis – machine vision uses both
• Non-contact
• Data acquisition – value of the technology
• Control – necessary for reasonable ROI

Overview
• Machine Vision Integration
– Machine vision systems integration is the process where significant value is added to a machine vision component by the incorporation of software, peripheral hardware, mechanical devices, materials and engineering.

Overview
• Prerequisite Integration Expertise:
– Application-based lighting and optics
– Understanding of imaging and input devices
– Electrical and mechanical engineering
– Industrial automation systems and components
– Machine vision algorithms
– Programming and/or system configuration
– Project management and customer support

Process Overview
[Slide diagram: the inspection process flow – Initiate Inspection (external event) → Acquire Image (hardware execution; camera and, if applicable, strobe trigger; recipe changeovers, multiple images/lights, part tracking) → Analysis (software execution of the inspection program) → Results (determine part status and communicate results; part tracking, multiple results, other data) → Process Result (external event).]

System Architectures
• Machine Vision Systems
[Slide diagram: Camera (Lens, Imager, Electronics) – Power/Control Signal – Frame Grabber or other signal conversion – Digital Image – Computer.]

System Architectures
• Machine Vision Systems, continued
[Slide diagram: Camera (Lens, Imager, Electronics) – Power/Control Signal – Frame Grabber or other signal conversion – Digital Image – Computer, with an optional computer for the operator interface.]

System Architectures
• Machine Vision Systems, continued
[Slide diagram: Camera (Lens, Imager, Electronics) – Power/Control Signal – Frame Grabber or other signal conversion – Digital Image – Computer, with a connected monitor for the operator interface.]

System Architectures
• Machine vision hardware is an “image delivery system”!
– Differentiation of products at the hardware level is limited
• Physical structure and system layout
• Available camera resolutions
• Input/output options
• Other hardware integration capability

System Architectures
• Machine vision software drives component capability, reliability, and usability
– Available image processing and analysis tools – Ability to manipulate imaging and system hardware – Methodology for inspection task configuration

• Main component differentiation is the software implementation • Often, system software complexity increases with system capability

The Basics of Machine Vision

IMAGING, LIGHTING AND OPTICS

Imaging, Lighting and Optics
• Key Issues
– Imaging
• Application requirements will dictate image space and camera resolutions

– Lighting
• The purpose of lighting for machine vision is to create the highest level of contrast between features to be inspected relative to the background or other features • Competent lighting technique contributes over 80% to the success of an application

– Optics
• Most machine vision applications use “off the shelf” optics • Select proper machine vision quality lenses

Imaging Basics
• Image Acquisition
– Performed by a light-gathering silicon device
• CCD, CID, CMOS …

– The imaging chip comes in a variety of physical layouts
• Area • Line

– Size of the chip varies widely as does the number of individual picture elements (pixels)
• Typical area chip for machine vision: from 0.3 to 4+ Mpix
• Physical sizes from ¼″ diagonal up
• Typical line scan array: from 1K to 12K+ pixels
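Resolution figures like these translate directly into the smallest feature an application can resolve. A small sketch, assuming the common rule of thumb that a feature should span at least a few pixels (the exact number depends on the vision tool; the function names are hypothetical):

```python
def mm_per_pixel(fov_mm, pixels):
    """Spatial resolution: field of view divided by the number of sensor
    pixels along the same axis."""
    return fov_mm / pixels

def min_detectable_feature(fov_mm, pixels, pixels_per_feature=3):
    """Rule-of-thumb smallest feature: a feature should cover at least a
    few pixels to be detected reliably."""
    return mm_per_pixel(fov_mm, pixels) * pixels_per_feature

# A 100 mm field of view imaged onto a 1600-pixel-wide sensor:
print(mm_per_pixel(100, 1600))            # 0.0625 mm per pixel
print(min_detectable_feature(100, 1600))  # 0.1875 mm
```

Working this arithmetic backward from the required feature size is one way to choose between, say, a 0.3 Mpix and a 4 Mpix camera.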

Imaging Basics
• Cameras
– Image sensor supported by electronic circuitry and packaged for industrial use – Final output may be analog or digital
• RS170, CCIR, NTSC, PAL, USB, FireWire (1394 a/b), Camera Link, GigE

[Slide diagram: Camera containing Lens, Imager, and Electronics; Power/Control in, Signal out.]

Imaging Basics
• Digital image representation
– Common thread is the internal representation of the image as seen by most algorithms – The image is stored as a single or multiple image planes containing arrays of integer numbers – Each number represents one small section of the image – a pixel (picture element) – Lower numbers are darker, higher numbers are lighter
• Typical range is 0–63, 0–127, or 0–255

Imaging Basics
• Internal representation – gray scale image
[Slide figure: a grid of 8-bit gray-scale pixel values – a dark feature (values roughly in the 40s through 120s) surrounded by a bright background of 255s.]
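That array-of-integers representation is easy to sketch in code. The pixel values below are illustrative, not taken from any specific sensor, and the function names are hypothetical:

```python
# A tiny 5x5 gray-scale "image": low numbers are dark, high numbers light.
image = [
    [255, 255, 110,  60, 255],
    [255, 105,  46,  42, 120],
    [112,  51,  44,  41,  57],
    [255, 101,  48,  46, 115],
    [255, 255, 108,  62, 255],
]

def binarize(img, threshold=128):
    """Classify each pixel: 1 = feature (darker than threshold),
    0 = background."""
    return [[1 if px < threshold else 0 for px in row] for row in img]

def dark_pixel_count(img, threshold=128):
    """Count feature pixels - the kind of simple measurement a blob
    analysis tool builds on."""
    return sum(sum(row) for row in binarize(img, threshold))

print(dark_pixel_count(image))  # -> 17
```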

Imaging Basics
• Color Images
– Color images commonly are acquired and internally represented as three planes of digital data – one each for Red, Green, and Blue – Difference between 3-chip color and Bayer Filter – Other representations such as HSI, LAB are derived from the RGB data
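A minimal sketch of deriving one value from the three planes: the simple mean used here is the "I" of HSI, and it is only one choice – luma-style weightings, for example, weight green more heavily. The function name is hypothetical:

```python
def intensity(r, g, b):
    """Intensity (the 'I' of HSI) for one pixel: the mean of the values
    in the Red, Green, and Blue planes."""
    return (r + g + b) / 3

# One pixel sampled from the same position in each of the three planes:
print(intensity(90, 150, 60))  # 100.0
```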

Lighting Basics
• Illumination for machine vision must be designed for imaging, not human viewing
– Selection must be made relative to light structure, position, color, diffusion – We need to know how light works so our light selections are not “hit and miss” guesswork

[Slide diagram: a light source striking a surface, showing diffuse reflection, specular reflection, refraction/absorption, and transmitted light (if the object is not completely opaque).]

– Light is both absorbed and reflected to some degree from all surfaces
• When an object is clear or translucent, light is also transmitted
• Angle of incidence = angle of reflection

Lighting Basics
• Dedicated lighting must be used for machine vision with few exceptions. • Where feasible, LED illumination is the best source
– Long life with minimal degradation of intensity – Able to be structured into a variety of shapes
• May be directional or diffuse

– May be strobed at very high duty cycles and overdriven to many times nominal current specifications – Available in many visible and non-visible colors

• Other sources – fluorescent, fiber-optics

Lighting Basics
• Lighting Techniques
– The goal of lighting for machine vision applications usually is to maximize the contrast (grayscale difference) between features of interest and surrounding background – Techniques are categorized generally by the direction of the illumination source
• Most may be achieved with different sources
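The "grayscale difference" that lighting is meant to maximize can be quantified. Michelson contrast is one common metric (an assumption here – the slide does not name a specific metric), computed from the mean gray levels of the feature and background:

```python
def michelson_contrast(feature_gray, background_gray):
    """Michelson contrast between two mean gray levels:
    0 = indistinguishable, 1 = maximum contrast."""
    hi = max(feature_gray, background_gray)
    lo = min(feature_gray, background_gray)
    if hi + lo == 0:
        return 0.0
    return (hi - lo) / (hi + lo)

# A dark feature (mean gray 50) on a bright field (mean gray 200):
print(round(michelson_contrast(50, 200), 2))  # 0.6
```

Comparing this number across candidate lighting setups gives an objective way to pick the technique that best separates the features of interest.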

Lighting Basics
• Direct bright-field illumination
– Sources: high-angle ring lights (shown), spot-lights, bar-lights (shown); LED’s or Fiber-optic guides – Uses: general illumination of relatively high-contrast objects; light reflection to camera is mostly specular

Images: CCS America; www.ccsamerica.com

Lighting Basics
• Diffuse bright-field illumination
– Sources: high-angle diffuse ring lights (shown), diffuse bar-lights; LED’s or fluorescent – Uses: general illumination of relatively high-contrast objects; light reflection to camera is mostly diffuse

Images: CCS America; www.ccsamerica.com

Lighting Basics
• Direct dark-field illumination
– Sources: low-angle ring lights (shown), spot-lights, bar-lights; LED’s or Fiber-optic guides – Uses: illumination of geometric surface features; light reflection to camera is mostly specular – “dark field” is misleading – the “field” or background may be light relative to surface objects

Images: CCS America

Lighting Basics
• Diffuse dark-field illumination
– Sources: diffuse, low-angle ring lights (shown), spot-lights, bar-lights; LED’s or fluorescent – Uses: non-specular illumination of surfaces, reducing glare; may hide unwanted surface features

Images: CCS America

Lighting Basics
• Diffuse backlight
– Sources: highly diffused LED or fluorescent area lighting
– Uses: provide an accurate silhouette of a part

Images: CCS America

Lighting Basics
• Structured light
– Sources: focused LED linear array, focused or patterned lasers
– Uses: highlight geometric shapes, create contrast based upon shape, provide 3D information in 2D images

Images: CCS America, Stocker & Yale; www.stockeryale.com

Lighting Basics
• On-axis (coaxial) illumination
– Sources: directed, diffused LED or fiber-optic area
– Uses: produce more even illumination on specular surfaces, may reduce low-contrast surface features, may highlight high-contrast geometric surface features depending on reflective angle
Images: CCS America

Lighting Basics
• Collimated illumination
– Sources: collimated LED backlights or fiber-optic sources with collimating optics
– Uses: sharp, repeatable silhouette edges for gauging; often paired with telecentric lenses

Images: Edmund Optics; www.edmundoptics.com, CCS America

Lighting Basics
• Constant Diffuse Illumination (CDI – “cloudy day illumination”)
– Sources: specialty integrated lighting
– Uses: provides completely non-specular, non-reflecting continuous lighting from all reflective angles; good for reflective or specular surfaces

[Diagram: camera viewing the object through a beam splitter, with on-axis and off-axis light sources]

Images: Siemens; www.nerlite.com

Lighting Basics
• Other lighting considerations
– Color
• The effect of monochromatic light on colored features
• Camera response to different colors
• White light and color imaging
• Non-visible “colors”

– Light degradation over time; component life, heat dissipation
– Light intensity and uniformity
– Strobing
– Elimination of ambient and other stray light

Optics Basics
• Application of optical components
– Machine vision requires a fundamental understanding of the physics of lens design and performance
– Task is to competently specify the correct lens
• Create a desired field of view (FOV)
• Achieve a specific or acceptable working distance (WD)
• Project the image on a selected sensor based on sensor size – primary magnification (PMAG)

– Goal, as always, is to create the highest level of contrast between features of interest and the surrounding background, with the greatest possible imaging accuracy

Optics Basics
• Considerations for lens selection
– Magnification, focal length, depth of focus (DOF), f-number, resolution, diffraction limits, aberrations (chromatic, spherical, field curvature, distortion), parallax, image size, etc.
– The physics of optical design is well known and can be mathematically modeled and/or empirically tested
• Specification or control of most of the lens criteria is out of our hands

Optics Basics
• Considerations for lens selection
– Practical specifications for machine vision: PMAG (as dictated by focal length) and WD to achieve a desired FOV
• Use a simple lens calculator and/or manufacturer lens specifications
• Simple approach – state the required FOV, the sensor size (based on physical selection of camera and resolution), and a desired working distance – then calculate the lens focal length
• Test your results

– Always use a high-resolution machine vision lens NOT a security lens
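The lens calculation described above can be sketched as a small helper. This is a thin-lens approximation, and the example numbers (sensor size, FOV, working distance) are illustrative, not taken from the talk:

```python
def focal_length_mm(fov_mm: float, sensor_mm: float, working_distance_mm: float) -> float:
    """Estimate lens focal length from FOV, sensor size, and working distance.

    Thin-lens approximation: PMAG = sensor / FOV, and
    f = WD * PMAG / (1 + PMAG). For WD >> f this is close to the
    common shortcut f ~= WD * sensor / FOV.
    """
    pmag = sensor_mm / fov_mm  # primary magnification
    return working_distance_mm * pmag / (1.0 + pmag)

# Example: a 100 mm FOV on an 8.8 mm-wide (2/3") sensor at 500 mm WD
# suggests roughly a 40 mm lens.
print(round(focal_length_mm(100.0, 8.8, 500.0), 1))  # 40.4
```

In practice, pick the nearest stock focal length and verify the resulting FOV and working distance with real parts, as the slide advises.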

Optics Basics
• Why use machine vision lenses only
– Light gathering capability and resolution

Images: Edmund Optics; www.edmundoptics.com

Optics Basics
• Specialty Lenses
– Telecentric – Microscope stages – Macro, long WD

The Basics of Machine Vision

MACHINE VISION – GETTING DATA FROM IMAGES

Machine Vision – Getting Data out of Images
• Inspection Concepts
– What are the capabilities and limitations of machine vision technology for the target application
• Requirement: specify a processing direction to take with respect to system architecture, and the ability to specify deliverables, performance, and acceptance criteria

– Analysis of the inspection concept can be subdivided by general type of inspection
• Assembly Verification/Recognition
• Flaw Detection
• Gauging/Metrology
• Location/Guidance
• OCR/OCV

Machine Vision – Getting Data out of Images
• Assembly Verification/Object Recognition
– Feature presence/absence, identification, differentiation of similar features
– Imaging Issues
• Must create adequate contrast between feature and background
• Accommodate part and process variations
• May require flexible lighting/imaging for varying features
• For feature presence/absence, feature should cover approx. 1% of the field of view (med. resolution camera), more for identification or differentiation
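The ~1%-of-FOV guideline above is easy to sanity-check. A minimal sketch, where the camera resolution is just an example value:

```python
def min_feature_pixels(width_px: int, height_px: int, fraction: float = 0.01) -> int:
    """Pixels a feature should cover for reliable presence/absence detection,
    per the ~1%-of-FOV rule of thumb for a medium-resolution camera."""
    return int(width_px * height_px * fraction)

# On a 640x480 camera, 1% of the FOV is 3072 px -- roughly a 55x55 px blob.
print(min_feature_pixels(640, 480))  # 3072
```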

Machine Vision – Getting Data out of Images
• Defect/Flaw Detection
– A flaw is an object that is different from the normal immediate background
– Imaging Issues
• Must have sufficient contrast and geometric features to be differentiable from the background and other “good” objects
• Typically must be a minimum of 3x3 pixels in size and possibly up to 50x50 pixels if contrast is low and defect classification is required
• Reliable object classification may not be possible depending upon geometric shape of the flaws

Machine Vision – Getting Data out of Images
• Gauging/Metrology
– Measurement of features
– There are physical differences between gauging features in an image produced by a camera, and the use of a gauge that contacts a part. These differences usually cannot be reconciled

Machine Vision – Getting Data out of Images
• Gauging/Metrology
– Gauging concepts
• Resolution, repeatability, accuracy
• Sub-pixel measurement
• Measurement tolerances
• Resolution must be approximately 1/10 of required accuracy in order to achieve gauge reliability/repeatability
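The 1/10 rule above translates directly into a minimum pixel count across the field of view. A sketch, where the sub-pixel factor is an assumption you must validate per vision tool:

```python
import math

def min_pixels_across_fov(fov_mm: float, tolerance_mm: float,
                          rule: float = 10.0, subpixel: float = 1.0) -> int:
    """Minimum sensor pixels across the FOV so the effective resolution
    is ~1/`rule` of the required tolerance.

    `subpixel` < 1 credits sub-pixel measurement (e.g. 0.25 for a claimed
    1/4-pixel edge tool) -- verify any such vendor claim empirically.
    """
    mm_per_pixel = (tolerance_mm / rule) / subpixel
    return math.ceil(fov_mm / mm_per_pixel)

# 50 mm FOV with a +/-0.05 mm tolerance: ~10000 px across the FOV without
# sub-pixel credit, or ~2500 px if a 1/4-pixel edge tool is trusted.
print(min_pixels_across_fov(50.0, 0.05))
print(min_pixels_across_fov(50.0, 0.05, subpixel=0.25))
```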

Machine Vision – Getting Data out of Images
• Gauging/Metrology
– Imaging Issues
• Lighting to get a repeatable edge
– Backlighting, collimated light

• Telecentric lenses • Calibration
– Correction for image perspective/plane – Calibration error stack-up

Machine Vision – Getting Data out of Images
• Location/Guidance
– Identification and location of an object in 2D or 3D space
– May be in a confusing field of view
– Imaging Issues
• Measurement tolerances and accuracies as described for gauging/metrology applications
• Sub-pixel resolutions may be better than discrete gauging results
• For guidance applications, the stack-up error in robot motion may be significant

Machine Vision – Getting Data out of Images
• OCR/OCV
– Optical Character Recognition/Verification – reading or verifying printed characters
– Can be fooled by print variations
– Verification is difficult depending upon the application
– Imaging Issues
• Consistent presentation of the character string
• May require extensive pre-processing

The Basics of Machine Vision

APPLICATION ANALYSIS AND SPECIFICATION

Application Analysis and Specification
• Define the target application and inspection criteria
– Describe the desired inspection
• Avoid discussion of machine vision technique

– Clearly define good part criteria and bad part criteria
– What is the reason for the inspection
– What will happen to a bad part

Application Analysis and Specification
• Define the part(s) to be inspected
– Include physical detail about geometric structure, features
– Identify all possible part variations; color, size, structure
– Describe the materials and surface finish of the part
– Will the part change over time
– Get photos, samples

Application Analysis and Specification
• Production Process Analysis
– Background information about how the part is manufactured and moved
– Production rates, number of shifts
– What factors in the process cause the bad parts
– Benefits of implementing inspection
• What happens if a bad part gets through
• Will costs, yield, quality be improved

– What is the cost of a falsely rejected part
• Can rejects be recovered/repaired

Application Analysis and Specification
• Part Handling and Presentation
– Existing automation
• Physical description of the mechanism/conveyor including background objects, surfaces, and colors at the target inspection station
• Physical envelope available for inspection components
• Mounting surfaces available for inspection components
• Other processes taking place at the inspection station
• Other physical constraints or obstacles
• Reject method
• Interfacing method to existing controls system
– Inspection triggering, reject signal, alarms
– Operator interface requirements

Application Analysis and Specification
• Part Handling and Presentation
– Environment
• Type of environment: factory, lab, clean-room, wash-down, hazardous, etc.
• Air quality in the vicinity of the inspection
– Oil, smoke, debris

• Dirt, oil, lubricant, water, other contaminants on parts
• Things damaging to cameras: weld spatter, reflected laser light, radiation, etc.
• Ambient light
• Temperature and humidity
• Shock or vibration

Application Analysis and Specification
• Business Issues
– Scope of supply/deliverables; who is responsible for what
• Engineering: design, integration, shipping, installation
• Hardware components
• Warranties
• Documentation and training

– Contractual items
• Performance guarantees
• Terms
• IP ownership

Application Analysis and Specification
• Once the constraints of the application are fully identified, the system performance can be quantified
• The performance criteria of the system should include
– Actual inspection capability (measurement tolerance, feature detection, etc.) with respect to the target application
– Throughput and speed of inspection
– Anticipated lighting and imaging methodology
– General overview of the operation of the inspection system
– Description of the automation and appropriate performance related to a specific process if applicable

Application Analysis and Specification
• Exceptions and limitations
– The project specification must identify all non-obvious exceptions and limitations to the performance of the system
– Include all possible unknowns

Application Analysis and Specification
• Acceptance Criteria
– How to prove the machine is functioning properly
– How to resolve differences in opinion regarding machine function
– Clearly state acceptance criteria AND methodology in quantifiable terms
– Acceptance will be based on stated performance criteria

Application Analysis and Specification
• Acceptance Criteria
– Analysis of system performance must be done using a verifiable sample or challenge set of parts
• Verifiable: All parties agree that each specific challenge part meets the stated criteria, either reject criteria or feature size if a gauging application

– Static testing is done with challenge parts
– A gauge R&R is appropriate for gauging applications
– Production testing can be done with parallel visual inspection
• Rejected parts will be judged against the set of challenge parts

– The acceptance criteria will list false accept and false reject rates
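One way to tabulate the false-accept and false-reject rates called out above is from a challenge-set run. A sketch with entirely hypothetical data:

```python
def acceptance_rates(results):
    """results: list of (is_defective, was_rejected) pairs from running
    the system against the agreed challenge set of parts."""
    bad = [rejected for defective, rejected in results if defective]
    good = [rejected for defective, rejected in results if not defective]
    false_accept = bad.count(False) / len(bad)    # defective parts that passed
    false_reject = good.count(True) / len(good)   # good parts rejected
    return false_accept, false_reject

# Hypothetical run: 4 defective parts (1 slipped through),
# 6 good parts (1 wrongly rejected).
run = [(True, True)] * 3 + [(True, False)] + [(False, False)] * 5 + [(False, True)]
fa, fr = acceptance_rates(run)
print(fa, fr)
```

A real acceptance test would use a far larger, verifiable parts bank and state the allowed rates in the specification.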

Contact Information

David Dechow
President

Aptúra Machine Vision Solutions
3130 Sovereign Drive, Suite 5A Lansing, Michigan 48911 USA Telephone: 517-272-7820, x11 email: ddechow@aptura.com www.aptura.com

Successfully Integrating Vision Guided Robotics

Presented by:

David Dechow Aptúra Machine Vision Solutions

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

David Dechow President Aptúra Machine Vision Solutions

David Dechow Aptúra Machine Vision Solutions 3130 Sovereign Drive, Suite 5A Lansing, Michigan 48911 Phone: 517-272-7820, x11 Email: ddechow@aptura.com
David Dechow is president and founder of Aptúra Machine Vision Solutions, LLC. Mr. Dechow has worked in the field of machine vision for over 25 years as a programmer, engineer, and manager. He served 14 years on the AIA board of directors, and was a two term president of that board. Mr. Dechow is the 2007 recipient of the AIA Automated Imaging Achievement Award honoring industry leaders for outstanding contributions in industrial and/or scientific imaging. Mr. Dechow is a regular speaker at conferences and seminars, and a frequent contributor to industry trade journals and magazines and has served on the editorial boards of Vision Systems Design magazine and Quality Magazine’s Vision and Sensors.

Presentation not available at time of production.

International Conference for Vision Guided Robotics

A Special thanks to our Moderator:

Frank Maslar Ford Motor Company

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Frank Maslar Technical Specialist Ford Motor Company

Frank Maslar Ford Motor Company 36200 Plymouth Road Livonia, Michigan 48150 Phone: 313-805-3904 Email: fmaslar@ford.com
Key Responsibilities: Work with universities and key suppliers to develop and implement advanced manufacturing technology in the manufacturing of powertrain systems. Areas of focus include vision systems and traceability. Previous Positions Held: Advanced Manufacturing and Quality Engineer at Ford Electronics; Research Scientist at Siemens Corporate Research. Degrees: B.S.M.E., Penn State

Technology Advances in 2D Vision Guided Robotics

Presented by:

John Keating Cognex Corporation

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

John Keating Principal Product Marketing Manager Cognex Corporation

John Keating Cognex Corporation 1 Vision Drive Natick, Massachusetts 01760 Phone: 508-650-3000 Fax: 508-650-3338 Email: john.keating@cognex.com
John Keating is a Principal Product Marketing Manager for In-Sight® vision systems at Cognex Corporation. He holds a B.S. in Electrical Engineering from Boston University and an MBA from Babson College. Since joining Cognex in 1994, he has held roles in applications engineering management, as well as a variety of positions in industry and product marketing.

Technology Advances in 2D Vision Guided Robotics
John Keating
Principal Product Marketing Manager

Cognex Corporation

Types of Robotic-Vision Applications
• Vision Guided Robotics (VGR)
– Alignment & placement of parts
– Provides X, Y, Θ to robot

VGR Plus Inspection
– Inspect parts while providing positional data
– Assembly verification, product identification, defect detection

Robotic-assisted Inspection
– Robot presents part to vision system for inspection
– End-of-arm vision system maneuvered around part

Types of Robotic-Vision Applications
• End-of-Arm Mounted Vision
– Vision moves with robot
– Application Concerns:
• Cabling
• Weight
• Perspective distortion

Fixed Mount Vision
– Vision separated from robot
– Application Concerns:
• Occlusion
• Multiple planes of inspection

Vision Guided Robotic Applications
• Palletizing/Depalletizing
– Place/Remove parts on pallets

Conveyer Tracking
– Locate unfixtured parts on conveyer and place them in package

Component Assembly
– Locate unfixtured parts and assemble to other components

Machine Tending
– Locate unfixtured parts on conveyer and place into CNC work cells

Robotic Inspection
– Use robot to manipulate part or camera to inspect critical features of part

Types of Industries Using VGR
• Automotive & Aerospace
• Consumer Electronics
• Semiconductor & Solar Cell
• Consumer Packaged Goods
• Food & Beverage
• Pharmaceutical
• Medical Devices

Applications Examples for Vision Guided Robotics
• Wide variety of industries
• Wide range of robot manufacturers
– ABB, Adept, Denso, Epson, Fanuc, Kawasaki, Kuka, Mitsubishi, Motoman, Staubli, Yamaha, and others

And they cover…
– Description of application and challenges
– Enabling technology to overcome challenges
– Benefit to the customer

De-Palletizing Application
in the Food Industry
• Application:
– Stacked pallets of juice boxes need to be depalletized for distribution

Challenges:
– Various juice box sizes and configurations
– Parts move slightly when on pallet
– High speed of production line must minimize robot movement

Solution:
– High resolution, robot-mounted vision system with 6 mm lens and large (6 feet) field of view
– Non-linear calibration algorithm ensures accurate placement
– 30% reduction in robot cycle time

Enabling Technology: Non-linear Calibration

[Images: distorted (caused by lens) vs. undistorted (after non-linear calibration)]

• Correct lens distortion
– Removing distortion from short focal length lens

• Correct perspective distortion in 2 planes
– Calibrate conveyor and pallet planes separately

• Ensures positional accuracy of robot
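A minimal numeric sketch of the kind of non-linear lens-distortion correction described above, using the common radial polynomial model. The coefficients here are illustrative; a production system would obtain them from a calibration target, not hand-pick them:

```python
import numpy as np

def distort(xy, k1, k2):
    """Apply the radial distortion model x_d = x * (1 + k1*r^2 + k2*r^4),
    with xy in normalized image coordinates."""
    r2 = float(np.dot(xy, xy))
    return xy * (1.0 + k1 * r2 + k2 * r2 * r2)

def undistort(xy_d, k1, k2, iters=20):
    """Invert the model by fixed-point iteration (there is no closed form)."""
    xy = np.array(xy_d, dtype=float)
    for _ in range(iters):
        r2 = float(np.dot(xy, xy))
        xy = xy_d / (1.0 + k1 * r2 + k2 * r2 * r2)
    return xy

pt = np.array([0.40, 0.30])                  # true (undistorted) point
seen = distort(pt, k1=-0.20, k2=0.05)        # where the lens reports it
recovered = undistort(seen, k1=-0.20, k2=0.05)
print(np.allclose(recovered, pt, atol=1e-9))  # True
```

Real toolkits also model tangential distortion and perspective (homography) per plane, as the conveyor/pallet note above implies.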

Component Assembly Application
in the Automotive Industry
• Application:
– Locate holes and assemble rivnuts into automotive frame

Challenges:
– Part is not precision-fixtured
– Need .003” accuracy to ensure proper assembly
– 27 different hole locations
• 5 different planes of view
• Varying surface finishes

Rivnut

Hole

Solution:
– Wrist-mounted, high resolution vision system
Assembly Cross-Section

Enabling Technology: High Accuracy Gauging
• High Resolution Vision Systems
– Provides greater detail in image

• Non-Linear Calibration
– Removes lens and perspective distortion

• Highly Accurate Measurements

[Diagram: standard resolution (640×480) vs. higher resolution (1024×768, 1600×1200) sensor formats]

Conveyer Tracking Application
in the Food Industry
• Application:
– Bagged food packets
– Pick-and-place from conveyor into shipment boxes
– Packets vary in size and can sometimes overlap – need flexible solution to provide exact location

Challenges:
– Patterned background
– Non-uniform lighting
– Overlapping parts
– Specular reflection from bags

Solution:
– Fixed Mount Vision System with geometric pattern finding


Enabling Technology: Advanced Pattern Finding Algorithm

Occlusion

Out of focus

Confusing Background

Trained Part

180° Rotation

Reversed Polarity

Scale Change & Dim Lighting

Accurate Part Location Under Challenging Conditions

De-Palletizing Application
in the Automotive Industry
• Application:
– Stacked pallets of automotive wheels are placed at machining center
– Experiencing problems with incorrectly loaded parts

Challenges:
– Large variety of wheel types
– Part finish varies due to part processing
– Part cannot be shrouded resulting in variable lighting
– Part type changes
– Part is loosely placed in bin

Solution:
– Fixed-mount vision system communicating to 6-axis robot
– 4 Month Project Payback

Robotic Inspection Application
in the Durable Goods Industry
• Application:
– Inspect washing machine
– Inspect controls, LEDs, and labels for correct placement and surface finish

Challenges:
– Large variety of panel colors and configurations
– Large area to inspect for small defects

Solution:
– Six-axis robot presents washer panels to vision system for inspection

Conveyer Tracking Application
in the Pharmaceutical Industry
• Application:
– Pharmaceutical product tubes need to be located on conveyer and placed into package for distribution

Challenges:
– Tubes are loosely placed on conveyer
– Range of product sizes

Solution:
– Fixed-mount vision system
– Pick-and-Place Robot

Enabling Technology: Robot Communications
• Communication Flexibility
– Serial and Ethernet based
– Formatted strings, specific drivers, and native mode commands

Point-and-Click Configuration
– Build up formatted communication strings quickly

Code Samples
– Robot and vision system sample code available
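Formatted-string communication of this kind typically amounts to sending a delimited pose over a socket. A hedged sketch only: the delimiter, field order, and CR/LF terminator below are hypothetical, since each robot vendor defines its own protocol:

```python
import socket

def format_pose(x_mm: float, y_mm: float, theta_deg: float) -> str:
    """Build a comma-delimited X, Y, Theta string with a CR/LF terminator
    (hypothetical format -- match whatever your robot's parser expects)."""
    return f"{x_mm:.3f},{y_mm:.3f},{theta_deg:.3f}\r\n"

def send_pose(host: str, port: int, x_mm: float, y_mm: float, theta_deg: float) -> None:
    """Push one located-part pose to the robot controller over TCP."""
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(format_pose(x_mm, y_mm, theta_deg).encode("ascii"))

print(repr(format_pose(152.4, -20.75, 90.0)))
```

Vendor-specific drivers and native-mode commands, as mentioned above, wrap this same idea with error handling and handshaking.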

Robotic Inspection Application
in the Electronics Industry
• Application:
– Consumer electronics stereo components are assembled in a flexible automation cell

Challenges:
– Verify that the correct components are being assembled
– Match model number with database description in computer system

Solution:
– Robot presents part to fixed-mount vision system
– OCV algorithm “reads” part number
– OPC communications to SCADA system

Enabling Technology: OPC & ActiveX
• ActiveX Display Control embeds images and graphics into third-party HMIs
• Built-in OPC Servers make OPC communications point-and-click simple
• Software Development Kits allow integrators to develop custom operator interfaces

Machine Tending Application
in the Automotive Industry
• Application Description
– Robot locates parts on conveyer and places them into machine press

Challenges:
– Safety concerns prohibit manual intervention
– Parts are in random orientation on conveyer

Solution:
– Fixed mount vision system suited to a rugged, industrial environment

Enabling Technology: Rugged Hardware
• IP67 rated protection
– Eliminates need for enclosures and reduces weight for arm-mounted applications

High flex cables
– Lengthens cable life
– Minimizes downtime required to replace cables

Lightweight camera choices
– Minimize weight at the end of the robot

Integrated Power and Communications
– Power Over Ethernet technology reduces cabling to one

Component Assembly Application
in the Electronics Industry
• Application:
– Inserts need to be loaded into an injection mold housing

Challenges:
– Need flexible solution to accommodate a wide range of parts
– Heavy Industrial Environment

Solution:
– Industrially-hardened fixed mount vision system

Robotic Inspection Application
in the Automotive Industry
• Application:
– Need multiple inspections on variety of parts

Challenges:
– Need to achieve 70 inspections in under 1 minute
– Small lot-size production and need to minimize machine changeover

Solution
– Robot-mounted vision system
– High speed & high accuracy system for multi-point inspection

Enabling Technology: DSP Performance Advancements
• Higher Performance
– Embedded image processing
– Ability to run more powerful algorithms

Smaller Footprint
– Reduction in component size due to 90nm process design
– Smaller form factors and lighter weights to fit into tight-spaced production lines

In-Sight 5600: World’s fastest vision sensor uses a 1 GHz TI DSP
Lower Cost
– Driving overall vision system prices down with greater performance

[Chart: DSP price per unit declining from ~4600 to ~3000 over 2002–2011]

Advantages of Vision Guided Robotics
• Elimination of Costly Fixtures
– Reduced capital investment costs

Improved Robotic Accuracy
– Lower downtime and improve process efficiency

Processing Multiple Part Types on the Same Production Line
– Improved equipment flexibility & reduced machine changeover time

Improved Quality Resulting from Vision Inspection
– Command price premiums for a higher quality product

Key Vision Technologies
• Robust Location Algorithms
– Part Finding under challenging conditions
– Eliminate perspective distortion
– Translation to robot coordinate system

Tell your robot where to go

Communications
– Availability of Robot Protocols
– Making the application code work together

Hardware Performance
– Processing Throughput
– Size & Weight
– Environmental protection

Simple communications to robots and lower end-of-arm tooling weight

Add Vision to Robotic Applications to Improve…
• Return on Investment
ROI = (Gain from Investment – Cost of Investment) / Cost of Investment

• Operational Equipment Effectiveness (OEE)
OEE = Availability × Performance × Quality
• Product Quality
Cost of Quality = Internal Failure Cost + External Failure Cost + Inspection Cost + Prevention Cost
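The three formulas above (ROI, OEE, Cost of Quality) compute directly. A sketch with made-up numbers, purely for illustration:

```python
def roi(gain: float, cost: float) -> float:
    """ROI = (gain from investment - cost of investment) / cost of investment."""
    return (gain - cost) / cost

def oee(availability: float, performance: float, quality: float) -> float:
    """OEE = Availability x Performance x Quality (each a 0-1 fraction)."""
    return availability * performance * quality

def cost_of_quality(internal_failure: float, external_failure: float,
                    inspection: float, prevention: float) -> float:
    return internal_failure + external_failure + inspection + prevention

# Illustrative numbers only:
print(roi(gain=180_000, cost=120_000))                    # 0.5 (50% return)
print(round(oee(0.90, 0.95, 0.99), 3))                    # 0.846
print(cost_of_quality(40_000, 25_000, 15_000, 10_000))    # 90000
```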

Improve Return on Investment
ROI = (Gain from Investment – Cost of Investment) / Cost of Investment

Elimination of costly mechanical fixtures

Reduced Capital Investment Costs

Improve Operational Equipment Effectiveness
OEE = Availability x Performance x Quality
• Improve robotic placement accuracy
Increase production speeds and reduce downtime

Process multiple part types on the same production line
Make equipment more flexible Reduce machine change-over time

Improved Equipment Efficiency

Improve Product Quality
Cost of Quality = Internal Failure Cost + External Failure Cost + Inspection Cost + Prevention Cost

Cost of Quality
– Cost of Poor Quality: Internal Failure Costs, External Failure Costs
– Cost of Good Quality: Appraisal Costs, Prevention Costs

Improve Product Quality
Cost of Quality = Internal Failure Cost + External Failure Cost + Inspection Cost + Prevention Cost

Direct Costs – Defects identified in manufacturing
– Reduce scrap & rework costs
– Reduce labor costs from manual inspection
– Improve reason-for-reject data to drive process improvement

Indirect Costs – Defects that escape the plant
– Brand reputation, customer complaints, warranty claims, product recalls

Improved Product Quality

Contact Information

John Keating
Principal Product Marketing Manager

Cognex Corporation
1 Vision Drive Natick, MA 01760 USA Telephone: 1-508-650-3000 email: john.keating@cognex.com www.cognex.com

Top Lessons Learned in Vision Guidance Applications

Presented by:

Eric Hershberger and David Wyatt Applied Manufacturing Technologies

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Eric Hershberger Senior Engineer Applied Manufacturing Technologies

Eric Hershberger Applied Manufacturing Technologies 219 Kay Industrial Drive Orion, Michigan 48359 Phone: 248-409-2000 Fax: 248-409-2027 Email: ehershberger@appliedmfg.com
Eric has a degree in Computer Science from Michigan Tech. He enjoys working with vision systems and loves robot calibration and performance testing.

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

David Wyatt Staff Engineer Applied Manufacturing Technologies

David Wyatt Applied Manufacturing Technologies 219 Kay Industrial Drive Orion, Michigan 48359 Phone: 248-409-2073 Fax: 248-409-2027 Email: dwyatt@appliedmfg.com
David was educated as an Electrical Engineer at the University of Missouri, is a Charter Member of the Machine Vision Association of SME, the founder of Midwest Integration and is a Staff Engineer at Applied Manufacturing Technologies. David started in vision guidance at Delco Electronics. At Delco Electronics, David was involved in the writing of many of the core vision algorithms in use today and served as the Machine Intelligence Chairman for General Motors. At Midwest Integration, David performed hundreds of automation projects earning the Vendor of the Year Award from Day and Zimmerman Inc. as well as the US Small Business Administration’s Award for Excellence.

Top Lessons Learned in Vision Guidance Applications
David R. Wyatt Staff Engineer Applied Manufacturing Technologies

Fixture Repeatability
• Yes, vision relaxes fixture requirements
• No, vision will not eliminate the need for fixtures
• The capability of the vision system must exceed the repeatability of the fixture.
• Determine fixture repeatability before doing anything else. (6 DOF)

Vision Repeatability
• If the part cannot be located with repeatability, the application will never work.
• A representative supply of parts must be maintained for testing and re-testing purposes
– A minimum bank of 30 parts per part type should be identified and stored
– Parts bank must have every visible difference expected (such as color or pattern differences)
– Do not use scrap parts unless you are sure the differences will not matter

Lighting
• Lighting will make or break the application
– Weeks of effort now avoids years of pain later.
– Use long life LEDs and matching filters.

• Send parts to lighting companies or distributors for free evaluations
– Get a gray scale distribution of foreground and background and actual pictures as .bmp files
– Must have more than 25 levels of gray between foreground and background.
• Recommend 50 or 100 gray levels out of 255.

• Block all direct exposure to sunlight
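The 25-gray-level rule above can be checked directly on a captured image. A numpy sketch; the threshold-mask approach and the synthetic values are assumptions for illustration, whereas a lighting vendor's evaluation would supply a proper histogram:

```python
import numpy as np

def gray_separation(image: np.ndarray, fg_mask: np.ndarray) -> float:
    """Mean gray-level separation between foreground and background.
    Rule of thumb from the talk: want > 25 levels, ideally 50-100 of 255."""
    fg = float(image[fg_mask].mean())
    bg = float(image[~fg_mask].mean())
    return abs(fg - bg)

# Synthetic example: a bright part (~180) on a darker belt (~110).
img = np.full((100, 100), 110, dtype=np.uint8)
img[30:70, 30:70] = 180
mask = img > 145
print(gray_separation(img, mask))  # 70.0 -- comfortably above 25
```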

Test Plan
• Test plan defines when the job is done.
– The test plan is the specification applied.

• The parts bank is used to execute the test plan.
– New parts get added to the bank.

• The test plan is run after each change to the software or tooling.

Vision Tools
• Invest in great vision tools
• Always buy as much resolution and as big a tool box as your budget can afford
• Cognex and DVT are different platforms
– You get what you pay for – Test all platforms, never assume

• Use grayscale tools
– Auto-thresholding is self-fulfilling
– Use Geometric Pattern Recognition (GPR)
– Recognize that GPR often has no reference point

Internal Support
• Vision works better when the plant wants it to work.
• Find a local champion.
– Teach him/her the system and test plan.

• Get a support agreement.
• Have an exit plan.

Operators
• Involve Operators early
– They know what is going on at the line level – They know what part problems exist – They know what didn’t work before and why – They can help avoid mistakes and delays

2D versus 3D
• A camera is a 2D sensor
• We can get 3D info from cameras
• Make sure we don’t make 3D decisions on 2D data
– Avoid using shadows to gauge height
– Avoid using reduced feature sizes as an indication of distance
– Perspective can be valid in certain applications

Global vs. Local Calibration
• It is possible to improve robot accuracy in a smaller work envelope
– Accuracy will decrease outside of that smaller envelope

• It is always best to calibrate the vision system as close to the work area as possible

Contact Information
David Wyatt
Staff Engineer

Applied Manufacturing Technologies
219 Kay Industrial Drive Orion, Michigan 48359 USA Telephone: 248-409-2000 email: dwyatt@appliedmfg.com www.appliedmfg.com

How Advancements in Vision Guidance are Making Flexible Feeding Applications Desirable

Presented by:

Eric Lewis Flexomation

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Eric Lewis President Flexomation, LLC

Eric Lewis Flexomation, LLC 586 Northland Boulevard Forest Park, Ohio 45240 Phone: 513-825-0555 Fax: 513-825-1870 Email: eric.lewis@flexomation.com
Bio Not Available at time of Print.

Presentation not available at time of production.

Vision Guided Robot Applications for Packaging & Flexible Feeding

Presented by:

Mark Noschang Adept Technology

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Mark Noschang Manager of Applications Engineering for North America Adept Technology, Inc.

Mark Noschang Adept Technology, Inc. 11133 Kenwood Road Cincinnati, Ohio 45242 Phone: 513-792-0266, x106 Fax: 513-792-0274 Email: mark.noschang@adept.com
Mark Noschang was appointed Manager of Applications Engineering for North America in July of 2008. He joined Adept in October 1997 as an applications engineer in the company’s Cincinnati, Ohio office. In his tenure, he served as a senior applications engineer, as well as fulfilling roles in the training department. Mr. Noschang holds a Bachelor’s Degree in Electrical Engineering from the University of Cincinnati.

Vision Guided Robot Applications for Packaging and Flexible Feeding
Mark Noschang Manager of Applications Engineering Adept Technology, Inc.

Agenda
• Introduction
• Motivation
• System Components
• Pitfalls for system implementation

Packaging and Flexible Feeding?

Flex Feeding???

Packaging vs. Feeding
• Packaging:
– A method by which products are enclosed to provide containment and protection, allowing for easier shipment, distribution, and sale

• Flexible Feeding:
– A method by which parts are taken from bulk storage to a known orientation, typically for assembly operations

What is Flexible Material Handling?
• “A method of taking parts from bulk to a known orientation that can handle multiple part sizes and styles.”
• Abilities:
– Handle a wide variety of part types
– Perform frequent model changeovers quickly / easily
– Process multiple parts and models simultaneously
– Respond quickly to part design changes

Market Business Drivers
1. Higher Throughput per Factory Space
Factory Space is at a Premium

2. Increased Labor & Liabilities Costs 3. Localized Production
Need to Produce Near Customers Regulatory Constraints

4. Increased Product Breadth
Shorter Time to Volume
Mass Customization of Products

5. Clean Product Handling
Need for Sterile Packaging

Reasons for Flexible Handling
• Short product life
• Frequent part changeover
• Reduce engineering / startup time
• Allow software-only part changeovers
• Hard tooled solutions are often expensive
• Provide consistent cost for global production

Flexibility is Key to Success
• Dedicated and specialized equipment is limiting
– Products and packaging changes – Consumers / retailers demand customized products

• Robots can easily be adapted to new products
– Minimal changes to tooling / software – Robust and tightly integrated vision is critical

• Flexible automation provides an agile solution
– Respond quickly to changes in the market – Minimize inventory levels

Cost Per Placement
[Chart: cost per placement, manual vs. robot, 1990–2006]

Source: US Department of Labor, International Federation of Robotics

Underlying Technology
Method of presenting parts + Method of locating parts + Method of manipulating parts = Flexible Material Handling

System Layout
• Methods of presentation
– Continuous conveyor – Indexing conveyor – Part feeders

• Vision systems are used to locate the product
• The robot picks and places the product
• The system software ties it all together
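How the system software "ties it all together" can be illustrated with the two coordinate transforms involved: a camera-to-robot calibration, plus a conveyor-encoder offset for parts that keep moving after the image is taken. A minimal sketch, not Adept's actual software; the calibration values, frame conventions, and encoder scaling are hypothetical:

```python
import math

def camera_to_robot(px, py, scale, theta, tx, ty):
    """Map a camera pixel location to robot base coordinates (mm)
    using a calibrated 2D rigid transform (scale, rotation, translation)."""
    xr = scale * (px * math.cos(theta) - py * math.sin(theta)) + tx
    yr = scale * (px * math.sin(theta) + py * math.cos(theta)) + ty
    return xr, yr

def tracked_pick_target(px, py, cal, belt_mm_per_count, count_at_image, count_now):
    """Offset the vision result by how far the conveyor has moved
    since the image was taken (belt assumed to run along robot +X)."""
    x, y = camera_to_robot(px, py, *cal)
    travel = belt_mm_per_count * (count_now - count_at_image)
    return x + travel, y

# Hypothetical calibration: 0.5 mm/pixel, no rotation, offsets in mm.
cal = (0.5, 0.0, 100.0, 50.0)
# Part seen at pixel (200, 80); belt has advanced 500 encoder counts since.
print(tracked_pick_target(200, 80, cal, 0.1, 1000, 1500))
```

The same bookkeeping generalizes to continuous tracking: the robot controller keeps re-evaluating the encoder term while it moves to the pick point.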

Enabling Technologies
1. Multiple Feeding Methods
2. Conveyor Tracking Support
3. Robust Vision Integration
4. High Speed Robotics
5. Integrated Compact Controls
6. Tight Connectivity to System Components

Expectation vs. Reality
“Flexible parts feeding is a lot like education spending. Most folks want only the best schools and the best teachers for their children. But, when it comes time to pay the piper, the enthusiasm quickly evaporates.” - John Sprovieri, Senior Editor, Assembly Magazine

Questions For Planning a System
• What is the overall cycle time required?
• Does the customer require average or absolute cycle time?
• How is the machine flow controlled?
• How many stable states does the part have, and can vision differentiate them?
• Is the pick state the most stable state?
• Can the customer provide parts for testing?
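The average-versus-absolute cycle time question comes down to rate arithmetic; a rough sizing sketch (the 85% efficiency derating for vision retries and conveyor gaps is an assumed figure, not a vendor number):

```python
import math

def required_cycle_time(parts_per_minute):
    """Seconds available per pick-and-place to meet a line rate."""
    return 60.0 / parts_per_minute

def robots_needed(parts_per_minute, robot_cycle_s, efficiency=0.85):
    """How many robots a line needs, derating each robot's ideal rate
    for losses such as vision retries and gaps in part presentation."""
    effective_rate = (60.0 / robot_cycle_s) * efficiency  # picks/min per robot
    return math.ceil(parts_per_minute / effective_rate)

print(required_cycle_time(120))   # 0.5 s available per part at 120 ppm
print(robots_needed(120, 0.9))    # 3 robots for 120 ppm with a 0.9 s cycle
```

Note the distinction the slide raises: a robot that *averages* 0.9 s may still miss an *absolute* 0.9 s requirement on worst-case picks, which is why the derating factor matters.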

Key to Success in Part Presentation
• Goals for feeding systems
– Provide part singulation / separation
– Provide contrast between parts and background
– Provide parts in stable state
– Present parts to system in desired stable state

Key to Success in Part Presentation
• Selecting the proper type of feed system
– Flexible feeders
• Belt type is best for soft parts
• Platen type is best for rigid parts
• Grass style is best for fragile or rolling parts

Key to Success in Part Presentation
• Selecting the proper type of feed system
– Conveyor systems
• Variable belt materials / colors
• Cleated / textured belts
• Backlit to provide greater contrast
• Variable speeds to provide part separation

Keys to Success for Vision Systems
• Ability to see what you need to see
– Must have proper lighting to highlight the parts
– Must have sufficient resolution to identify key features
– Must have proper lens / mounting
– Must have comprehensive set of vision tools

• MUST CONTROL THE ENVIRONMENT!

Keys to Success for Vision Systems
• Powerful / complete vision toolset
– Retrain models with a single click
– Allow for measurements and inspections
– Make any vision tool relative to any other tool
– Allow for complex “filtering” of good / bad parts based on any vision result

Keys to Success for Vision Systems
• Automated calibration process
• Support for multiple camera mounting configurations

Keys to Success in Robotic Systems
• Pick the correct robot for the job

Key to Success in Robot Systems
1. Table-Top SCARA Robots
High Speed robots for Assembly, Handling & Packaging

2. Cartesian & Linear Robots
Configurable, Precision robots for Assembly & Handling

3. 6-Axis Articulated Robots
High Dexterity Robots for 3-D Assembly & Handling

4. Floor Mount SCARA Robots
High Payload robots for Industrial Applications

5. Parallel Kinematics Robots
Ultra High Speed robots for Packaging

Key to Success in Robot Systems
• Selecting the right robot for the job
• Feasibility study for specific application
• Rigid mounting for all system hardware
• Tight integration of all components
• Attention to system layout
• Clear definition of cycle
• Attention to “details”

Benefits for Packaging and Feeding
• In-feed parts do not require fixturing
• Part changeover can be a simple software change
• Different parts can be handled with the same feed system
• Robots and feeders can be redeployed
• Vision provides flexibility

• Results:
– Saves engineering time
– Saves start-up time
– Allows equipment to be redeployed
– Saves money

Trends in Packaging and Feeding
• Food, consumer & household products, and personal care products are becoming more specialized
• Companies desire to keep manufacturing close to consumers
• The use of contract packagers is increasing
• Labor and worker liability costs are increasing
• Handling randomly oriented product from conveyors is required by many companies
• Flexible automation enables companies to compete and flourish in a global economy

Contact Information
Mark Noschang
Manager of Applications Engineering

Adept Technology, Inc.
11133 Kenwood Road Cincinnati, Ohio 45242 USA Telephone: 513-792-0266, x106 email: mark.noschang@adept.com www.adept.com

High Accuracy Robot Calibration, Wireless Networking, and Related Technical Issues

Presented by:

Eric Hershberger and David Wyatt Applied Manufacturing Technologies

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

David Wyatt Staff Engineer Applied Manufacturing Technologies

David Wyatt Applied Manufacturing Technologies 219 Kay Industrial Drive Orion, Michigan 48359 Phone: 248-409-2073 Fax: 248-409-2027 Email: dwyatt@appliedmfg.com
David was educated as an Electrical Engineer at the University of Missouri, is a Charter Member of the Machine Vision Association of SME, the founder of Midwest Integration and is a Staff Engineer at Applied Manufacturing Technologies. David started in vision guidance at Delco Electronics. At Delco Electronics, David was involved in the writing of many of the core vision algorithms in use today and served as the Machine Intelligence Chairman for General Motors. At Midwest Integration, David performed hundreds of automation projects earning the Vendor of the Year Award from Day and Zimmerman Inc. as well as the US Small Business Administration’s Award for Excellence.

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Eric Hershberger Senior Engineer Applied Manufacturing Technologies

Eric Hershberger Applied Manufacturing Technologies 219 Kay Industrial Drive Orion, Michigan 48359 Phone: 248-409-2000 Fax: 248-409-2027 Email: ehershberger@appliedmfg.com
Eric has a degree in Computer Science from Michigan Tech. He enjoys working with vision systems and loves robot calibration and performance testing.

Wireless Vision Systems and High Accuracy Vision Guidance
Eric Hershberger Senior Engineer Applied Manufacturing Technologies

Wireless Vision – A Reality
• New Gigabit Ethernet (GigE Vision™) camera standard
• New IEEE 802.11n draft protocol
• Routers available on the market
• Fewer wires, less expensive high-flex cables
• Easy to integrate new cameras with older vision systems
• Problems and issues
• Other options

GigE Vision™ Standard
• New GigE Vision™ camera standard
– 1000 megabits per second (Mbps)

• Lots of camera manufacturers
– DALSA, Basler, Prosilica

• Fast image transfer
• 100 m cable runs
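A back-of-envelope check on what the 1000 Mbps figure buys in frame rate: divide usable link bandwidth by frame size. An illustrative sketch; the 10% protocol-overhead figure is an assumption, not from the GigE Vision specification:

```python
def max_frame_rate(width, height, bytes_per_pixel, link_mbps=1000, overhead=0.10):
    """Rough upper bound on frames/sec a GigE link can carry,
    derating the raw link rate for protocol overhead."""
    usable_bits = link_mbps * 1e6 * (1 - overhead)
    frame_bits = width * height * bytes_per_pixel * 8
    return usable_bits / frame_bits

# A 1-megapixel, 8-bit mono camera on gigabit Ethernet:
print(round(max_frame_rate(1024, 1024, 1)))  # about 107 fps
```

On a shared wireless link the draft-n throughput (well below its theoretical rate) replaces `link_mbps`, which is why resolution and frame rate need to be budgeted before committing to a wireless design.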

IEEE 802.11n draft protocol
• Not approved until ~December 2009
– Hardware already for sale as Draft-n compliant – Backwards compatible, but not recommended

• MIMO – Multiple-input Multiple-output
– More antennas – Theoretical 600Mbps

• 5.0 GHz Recommended
– Older protocols use the 2.4 GHz band

Routers Recommended
• Linksys WRT600N
• Best implementation of the 5.0 GHz draft-n
• 12 V power
• Easy to mount to the EOAT
• Built-in GigE switch
• Best to use

Fewer Wires, Less Expensive High-Flex Cables

• For a full wireless solution, only a 12 V power cable is required at the EOAT
• High-flex Gigabit Ethernet cables are less expensive than current analog / Camera Link® and FireWire cables

Easy to Integrate

• Visual Studio 2005 recommended
• All manufacturers have SDKs
– Lots of examples and good tech support – Basic routines for image transfer easy to implement

• Easily interfaced with older Vision Pro systems
– Specifically v3.4

• VB scripting can work

Problems and Issues

• Make sure plant IT is on board
• Draft-n is not an approved standard
– Future hardware incompatibility
– Mixed hardware: vendors might not implement the draft the same way

• The 5.0 GHz band is clear now; in the future…?
• Custom hardware is available for analog systems, >$20k

Other Options
• Bluetooth 3.0
– Up to 480 Mbps
– Ultra-wideband (UWB)

• Nanny cams, or wireless Ethernet-based cameras
– Typically a CMOS imager
– Lower resolution
– 2.4 GHz band

High Accuracy Vision Guidance
• What can we expect the robot accuracy to be?
• Robot calibration
• High accuracy tool center points (TCP)
• Calibration between robot and vision
• Calibration must occur near the work… global vs. local calibrations

Robot Accuracy
• Off-the-shelf robots are repeatable but not accurate
• Typical off-the-shelf robot accuracy will be 3-5 mm, depending upon the robot arm
• The robot will be very repeatable: 0.02-0.04 mm
• Calibrated robots can have an accuracy of 0.3-0.5 mm
• Use of external equipment can increase the accuracy to less than 0.1 mm
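Figures like these are often combined root-sum-square into a first-order system error budget. This is an illustrative convention, not a method from the talk, and the sample numbers are hypothetical:

```python
import math

def accuracy_budget(*errors_mm):
    """Root-sum-square combination of independent error sources (mm).
    A common first-order estimate; real error sources are not always
    independent, so treat the result as an optimistic bound."""
    return math.sqrt(sum(e * e for e in errors_mm))

# Hypothetical budget: calibrated-robot accuracy (0.4 mm)
# + repeatability (0.03 mm) + vision measurement error (0.1 mm):
print(round(accuracy_budget(0.4, 0.03, 0.1), 3))  # 0.413 mm
```

The exercise makes the slide's point concrete: once the robot is calibrated, the robot term dominates the budget, so further vision accuracy buys little until the robot itself improves.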

Robot Calibration

• Metris K600 robot calibration
• Dynalog DynaCal system
• Absolute Accuracy systems from KUKA and ABB – calibrated from the factory
• Laser trackers can be used
• Factory calibration is becoming more prevalent

High Accuracy TCPs
• Portable CMMs can be used for best accuracy
• Metris and Dynalog can calculate a high accuracy TCP

Robot to Vision Calibration
• 2D systems: typically a dot calibration grid – a manual process
• 3D systems: auto calibration available
• Portable CMMs can be used for very high accuracy calibration
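What a 2D dot-grid calibration computes can be sketched as a least-squares similarity fit from pixel coordinates to taught robot coordinates. A minimal illustration with synthetic data; commercial packages also model lens distortion and perspective:

```python
def fit_pixel_to_robot(pixels, robots):
    """Least-squares 2D similarity transform (scale + rotation + translation)
    mapping dot-grid pixel centers to the taught robot positions."""
    n = len(pixels)
    pcx = sum(p[0] for p in pixels) / n; pcy = sum(p[1] for p in pixels) / n
    qcx = sum(q[0] for q in robots) / n; qcy = sum(q[1] for q in robots) / n
    sxx = num_a = num_b = 0.0
    for (px, py), (qx, qy) in zip(pixels, robots):
        dx, dy = px - pcx, py - pcy          # pixel deviations from centroid
        ex, ey = qx - qcx, qy - qcy          # robot deviations from centroid
        num_a += dx * ex + dy * ey
        num_b += dx * ey - dy * ex
        sxx += dx * dx + dy * dy
    a, b = num_a / sxx, num_b / sxx          # a = s*cos(theta), b = s*sin(theta)
    tx = qcx - (a * pcx - b * pcy)
    ty = qcy - (b * pcx + a * pcy)
    return a, b, tx, ty

def apply(params, px, py):
    """Map a pixel location through the fitted transform."""
    a, b, tx, ty = params
    return a * px - b * py + tx, b * px + a * py + ty

# Synthetic grid: 0.2 mm/pixel, 90-degree rotation, offset (50, 20) mm.
pix = [(0, 0), (100, 0), (0, 100), (100, 100)]
rob = [apply((0.0, 0.2, 50.0, 20.0), x, y) for x, y in pix]
p = fit_pixel_to_robot(pix, rob)
print([round(v, 6) for v in p])  # recovers (0.0, 0.2, 50.0, 20.0)
```

With more than the minimum number of dots, the same fit averages out per-dot measurement noise, which is one reason grid calibration beats touching up a single point.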

Global vs. Local Calibration
• It is possible to improve robot accuracy in a smaller work envelope
– Outside of that smaller work envelope, accuracy will degrade

• Always best to calibrate the vision system as close to the work area as possible

High Accuracy Vision Guidance
• The combination of robot calibration, a high accuracy TCP, and vision optics can improve your VGR project
• Robot calibration alone can help increase the accuracy of offline programming downloads

Contact Information
Eric Hershberger
Senior Engineer

Applied Manufacturing Technologies
219 Kay Industrial Drive Orion, Michigan 48359 USA Telephone: 248-409-2000 email: ehershberger@appliedmfg.com www.appliedmfg.com

Vision Based Line Tracking

Presented by:

Frank Maslar Ford Motor Company

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Frank Maslar Technical Specialist Ford Motor Company

Frank Maslar Ford Motor Company 36200 Plymouth Road Livonia, Michigan 48150 Phone: 313-805-3904 Email: fmaslar@ford.com
Key Responsibilities: Work with universities and key suppliers to develop and implement advanced manufacturing technology in the manufacturing of powertrain systems. Areas of focus include vision systems and traceability. Previous Positions Held: Advanced Manufacturing and Quality Engineer at Ford Electronics Research Scientist at Siemens Corporate Research Degrees: B.S.M.E. Penn State

Vision Based Line Tracking
Frank Maslar Technical Specialist Ford Motor Company

Background
• Interactive directed research project between Ford Advanced Manufacturing Technology Development and Purdue University Robot and Vision Lab • Team Leaders
– Ford – Frank Maslar – Purdue University – Professor Avinash Kak

Current Robotic Applications
Assembly: 3%
Arc Welding: 13%
Painting: 18%
Dispensing: 3%
Inspection: 1%
Material Handling: 27%
Spot Welding: 31%
Material Removal: 4%

Source: Robotic Industries Association

Opportunities
[Timeline: 1910s, 1940s, 1990s]

Moving Line Assembly

• Eliminate wasted time during transfer of parts into and out of assembly stations
• Eliminate cost associated with stop stations

Line Tracking Status Overview

• Enhanced Accuracy
–3 mm

• Multi-loop control
–Enhanced robustness

• Visual tracking systems
–Geometric model-based approach –Appearance-model-based approach

Control Architecture

[Block diagram: Coarse Control, Fine Control #1, and Fine Control #2 feed a Control Arbitrator, which drives the Robot Controller]

Coarse Control

Without specularity detection

With specularity detection

Detection & Compensation of specular Highlights on target object

Stereo space parameters

Appearance-based 3D Rigid Object Tracking • General Idea
Appearance Model (Template)

Perspective projection
The estimated 3D pose

3D model

Current Image

• Extension in the presence of occlusions

Results

Research provided by:

Avinash C. Kak
Professor of Electrical and Computer Engineering Electrical and Computer Engineering Purdue University EE Building West Lafayette, Indiana 47907 765-494-3551 kak@purdue.edu http://rvl4.ecn.purdue.edu/~kak/

Contact Information
Frank Maslar
Technical Specialist Advanced Manufacturing Technology Department

Ford Motor Company
36200 Plymouth Road Livonia, Michigan 48150 USA Telephone: 313-390-2132 email: fmaslar@ford.com www.ford.com

Case Study: Robots & Vision in Life Sciences and Automated Pharmacy

Presented by:

David Arceneaux Stäubli Robotics

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

David Arceneaux Business Development & Marketing Manager Stäubli Robotics

David Arceneaux Stäubli Robotics 201 Parkway West Duncan, South Carolina 29334 Phone: 864-486-5416 Fax: 864-486-5467 Email: d.arceneaux@staubli.com
Arceneaux is Business Development-Marketing Manager for Stäubli Robotics, a manufacturer of innovative 4- and 6-axis industrial and cleanroom robotic solutions. During the past 5 years, his key responsibilities have included business development, marketing, and strategic management. Stäubli Robotics and its employees are corporate members of the Robotic Industries Association (RIA), the Association for Laboratory Automation (ALA), Semiconductor Equipment and Materials International (SEMI), the Society of the Plastics Industry (SPI), and the Society of Manufacturing Engineers (SME).

Robots & Vision in Life Sciences and Automated Pharmacy
David Arceneaux Business Development-Marketing Manager Stäubli Robotics

Agenda
• Why Automate?
• Where is Automation being utilized?
• Why Use Vision in Robotic Automation?
• Selecting the Right Automated Solution
• Types of Robotic Automation
• Applications
• Case Study (RIVA)

Why Automate?
Industry Driving Forces
• High throughput (productivity and efficiency)
• Reproducibility, reliability, accuracy, quality, and traceability
• Automation of repetitive tasks
• No cross-contamination (human/product or product/human)
• Miniaturization
• Biocompatibility (cleanroom)
• Safe handling of sensitive and hazardous products
Photo Courtesy: High Resolution Engineering

Why Automate?
Top 5 Reasons to Automate

Reasons to Automate: 1=Low; 5=High

Increased productivity means that identification of new drugs and healthcare products can be done more quickly, thus reducing the time to market.
JALA, (2006). 2006 ALA Survey on Industrial Laboratory Automation

Where is Automation being utilized?
The hottest area utilizing laboratory automation is in high throughput screening (HTS) Still many growth opportunities within most life science segments

Why Use Vision in Robotic Automation?
Key Benefits
Elimination of Costly Fixtures: reduced capital investment costs

Improved Robotic Accuracy: lower downtime and improved process efficiency

Multiple Sample Processing on the Same Production Platform: improved equipment flexibility and reduced changeover times

Improved Quality Resulting from Vision Inspection: command price premiums for a higher quality product

Selecting the Right Automated Solution
Key Success Factors
Reliability: automated systems need to work consistently for extended periods with minimal human intervention (24/7).
Scalability: consideration of current equipment and future growth should include automation that is modular and scalable.
Flexibility: automation needs to be easily reconfigurable when the need arises.
Data Storage and Scheduling: with the increasing sophistication of automation equipment and the increasing volume of information created, software is an important link in the success of the system.

Types of Robotic Automation
Three of the most common geometries for laboratory robots are:
– Cartesian (three mutually perpendicular axes)
– Cylindrical (parallel action arm pivoted about a central point)
– Anthropomorphic (multi-jointed, human-like configuration)
Anthropomorphic (5-6 axis) robots generally provide more flexible, “human-like” automation that includes transfer, handling, weighing, extraction and general manipulation of media.

Applications:
Compact Cell Automation
Compact cells can include all the devices needed for plate storage, plate retrieval, plate replication and high throughput screening. Flexible robotic automation allows devices to be positioned closer to the robot for a compact solution with all of the capacity and functionality of the larger systems offered in the industry.

Applications:
Bench Top Automation
Bench top robotic systems are inexpensive, flexible robotic solutions capable of performing a wide array of applications. Cells can be configured with two or more devices utilizing very little floor space.

Applications:
Cell culture automation
• Cell culture is and has historically been an essential component of the drug discovery toolbox.
• Cell culture provides the proteins, membrane preparations and other raw materials required for biological research.

• In recent years this demand for cells and new cell lines has increased dramatically with the emergence of high-throughput screening reinforcing the need for robotic automation.

Applications:
Ultra High Throughput Automation
Robots are capable of UHTS applications with a throughput of over 1 million assays per day. These systems are built for industrial use and are capable of running 24 hours a day, 7 days a week.

Case Study: Automated Pharmacy (RIVA)
ROBOTIC IV AUTOMATION
The new standard in IV admixture compounding. Providing INTELLIGENT SOLUTIONS to deal with the issues of…
– Safety
– Efficiency and effectiveness
– The ever-changing standards of the regulatory environment

Background
St. Boniface Research Centre developed technology
The St. Boniface Hospital & Research Foundation was established in 1971 as part of the centennial celebrations of the Hospital. The Foundation acts as the primary fundraising body for the St. Boniface General Hospital Inc., and promotes excellence in health care research related to the prevention and treatment of disease, the promotion of good health, and improvements in patient care.

RIVA is the first product of its kind

The Product

RIVA
Robotic IV Automation

The Product Overview
RIVA is an integrated system designed to automate the process of preparing IV admixtures in the hospital pharmacy.
Designed with input from 300+ pharmacists
Fully enclosed ISO Class 5 environment (cleaning procedure in a mini-environment; daily UV sterilization process, FDA required)
Fully automated preparation of syringe and bag doses, in any combination, in many sizes
Multiple independent verifications (visual, weight) for a complete electronic audit trail
Up to 600 labeled, patient-specific or batch doses per 8-hour shift with one operator
Designed to operate in the currently evolving regulatory environment

The Market
Target Market Outlook
– Market size: 1,750 hospitals in the main target profile
• 510 with 400+ beds; 1,240 with 200 to 400 beds
– (All manual processes, with high turnover of personnel)

– Based on feedback to date and IV volumes
• On average, 2 RIVA units per hospital: 3,500 units
– (Two areas of need: antibiotics and chemotherapy, which must be kept separate)

Value Proposition to Hospitals
There are several categories of potential savings to the hospital; these vary depending on the institution and its goals
– Resource savings (3 techs per shift; ROI in less than one year)
– Drug wastage savings (staggering losses today: up to 40% of IV preparations are scrapped, versus 99% usage with RIVA)
– Reduction or elimination of pre-fills from third parties
– Cost avoidance of major construction for USP 797 compliance (the cost of a “cleanroom”)
– Inventory reduction (better control and loss prevention, with limited resources)

Conclusion
Just about any laboratory and hospital can take advantage of some of the advances in automation; the questions are what to automate and to what extent. The options cover the spectrum from islands of automation, which retain some manual processes, to fully automated integrated systems. The optimal degree of laboratory automation depends on the laboratory setting and considerations of cost, throughput, and flexibility. Other considerations include the time required to complete the installation, the space available, the proportion of tests that are routine, the availability of skilled technicians, safety, and reliability. The reasons for automation have become so compelling that it is no longer simply a competitive advantage for laboratories, but a competitive necessity!

Contact Information

David Arceneaux Business Development-Marketing Manager Staubli Robotics 201 Parkway West Duncan, South Carolina 29334 USA Telephone: 864-486-5416 email: d.arceneaux@staubli.com www.staubli.com

Unmanned Systems Intelligence, Vision and Automation Concepts for Combat Engineer and Other Battlefield Missions

Presented by:

Jerry Lane Applied Research Associates, Inc.

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Jerry Lane Director Great Lakes Office Applied Research Associates

Jerry Lane Applied Research Associates 48320 Harbor Drive Chesterfield Twp., Michigan 48047 Phone: 586-242-7778 Email: glane@ara.com
Jerry joined ARA in 2005 to work on multiple unmanned systems projects including the Modular Robotic Control System for Route Clearance and Combat Engineer missions, the man-packable Nighthawk mini-unmanned aerial vehicle and the obstacle/stair climbing robot, and the Lightweight Reconnaissance Vehicle (LRV). Jerry’s prior assignment for 28 ½ years was at TARDEC leading various advanced vehicle technologies and robotic vehicle projects. Jerry is also on the Board of Directors of the Michigan Chapter of NDIA and First Vice President of the Association for Unmanned Vehicle Systems International (AUVSI). Jerry is the Co-Founder and Co-Chair of the Intelligent Ground Vehicle Competition (IGVC). Jerry holds a Bachelor of Mechanical Engineering and Masters of Business Administration from the University of Detroit.

Unmanned Systems:
Intelligence, vision and automation concepts for combat engineer and other battlefield missions

Jerry Lane Director Great Lakes Office Applied Research Associates

Unmanned Systems: Intelligence, vision and automation concepts for combat engineer and other battlefield missions

• Combat engineer functions such as route clearance, Explosive Ordnance Disposal (EOD), countermine and other high-risk battlefield missions can be performed by intelligent, capable unmanned systems.
• Unmanned systems with advanced manipulators, machine vision, planners and navigation can automate dangerous battlefield functions.
• Unmanned systems can save soldiers’ lives while increasing the efficiency of operations.
• The integration of advanced vision and manipulation will provide new levels of semi-autonomous performance on the battlefield by relieving the soldier of the dull, tedious moment-to-moment control of current tele-operated systems.

Robotic Route Clearance for Convoys

Robotic Route Clearance
Robotic Control System +
Construction Equipment
– Armor option
– Supportable
– Low cost
– Optional equipment
– High power hydraulics available

+
Multiple IED Disruptors
– Rollers & Chains
– Rakes & Cutters
– Jammers
– GPR/EM Detectors

Robotic Concept with IED Disruptor Options

Rhino

Cat 924G/H with Robotic Control System

IED Drag Chains

USAF All-purpose Remote Transport System (ARTS) - Tool Attachments
• Standard Tool Attachments For ARTS
– Forks
– Loader Bucket
– Brush Cutter
– Three-point Hitch
– Surface Clearance Windrow Assembly
– Backhoe
– Gripper
– Manipulator Arms
– Water Cannon
– High Energy Access And Disablement Device (HEADD)
– Small Munitions Disrupter (SMUD)

• Tool Attachments with ARTS

Robotic Skid Steer RC50/RC60 for USMC
10 Delivered & 5 in Iraq

USMC Grappler dexterity with simulated UXO

US Army Humanitarian Demining

Detection Robotic Platform

Neutralization

Humanitarian Demining System Development Effort
Sponsored by US Army Night Vision and Electronic Sensors Directorate (NVESD), Fort Belvoir, Virginia, USA Integrate Detection and Neutralization Technologies on a Robotic Platform

ARA Robotic controls on Cat 247b with backhoe

IED Command & Trip Wire Cutter Options

Lightweight Reconnaissance Vehicle (LRV) for under vehicle Inspection

LRV Videos & Specifications

Stair Climbing

Mountain Terrain

– Climbs and descends stairs, rocks and rubble up to 11” in height
– Low cost surveillance robot (starting at under $25k)
– Self-righting; performs under vehicle inspections (planes, trains and autos)
– 0.003 lux color cameras with IR illumination; simple tele-operation
– 15 lbs (man-packable); carries small payloads and pulls large payloads
– Up to a 7 mile range with its lithium polymer batteries

LRV Under Vehicle Inspection

LRV: High mobility, low profile, backpackable reconnaissance UGV

Nighthawk
Mini Unmanned Aerial Vehicle
– Weight: 1.5 lbs (with payloads)
– Rolls into a 6” x 20” tube
– No assembly required
– Quiet electric propulsion
– Advanced GPS navigation autopilot
– Forward and side looking cameras
– IR option

Nighthawk In-Flight

UAV Launch, Control, & Recovery for Immediate Situational Awareness

UAV/MAV launch solutions: [launcher, side and front views]

Recovery solutions: [deployable capture system, side and front views; launcher and recovery deployed on a State Police sedan (Ford Crown Victoria) and a CIA SUV (Chevy Suburban)]

Systems integration solutions: [HMMWV-M1025 side views, launcher and recovery stowed]

Enhanced Situation Awareness On-Board Deployment of UAV

Robotic Checkpoint Objective Capability

Robotic Checkpoint (Near Term 3-4 mos.)

Vehicle Stopped at Robotic Checkpoint
Inspection Robot w/grappler Blocking Robot w/remote M240

LRV under vehicle inspection robot

Inspection Robot removing cargo

Inspection robot removes hidden ordnance

Robotic Checkpoint Control

Suspect Car removed by blocking robot with forks

Suspect Car threat neutralized by water cannon

Contact Information
Jerry Lane
Director Great Lakes Office

Applied Research Associates Inc.
48320 Harbor Drive Chesterfield Twp. Michigan 48047 USA Telephone: 586-242-7778 Email: glane@ara.com www.ara.com

International Trends and Applications in 3D Vision Guided Robotics

Presented by:

Adil Shafi SHAFI Innovation Inc.

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Adil Shafi President SHAFI Innovation, Inc.

Adil Shafi SHAFI Innovation, Inc. 803 Lakeshore Drive Houghton, Michigan 49931 Phone: 734-516-6761 Email: adil.shafi@shafi-inc.com
Adil Shafi is founder and president of SHAFI, Inc. and SHAFI Innovation, Inc. (www.shafiinc.com) He has worked in the robotics and vision industry for the last 20 years and his companies have pioneered more than 100 Software Solutions in the area of Vision Guided Robotics. His company's RELIABOT software runs on equipment worth $500 million and is ranked #1 in the world for AutoRacking and Bin Picking. Adil received three degrees in Computer Science and Electrical Engineering from Michigan Technological University and worked in Chicago, Manhattan and the Silicon Valley prior to founding his companies in Michigan. He works closely with many industry, academic and government organizations and has travelled to more than 90 states and countries in the world.

Presentation not available at time of production.

Advances in 3D Vision Guided Robotics at Fraunhofer IPA

Presented by:

Jens Kuehnle Fraunhofer IPA

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Jens Kuehnle Research Associate Fraunhofer Institute Manufacturing Engineering and Automation (IPA)

Jens Kuehnle Fraunhofer Institute Manufacturing Engineering and Automation (IPA) Nobelstrasse 12 70569 Stuttgart Germany Phone: 49 711 970 1861 Fax: 49 711 970 1004 Email: kuehnle@ipa.fraunhofer.de
Jens Kuehnle studied pure and applied mathematics at Ulm University, Germany and at San Diego State University, California, USA. He completed his diploma in 2005 and his Master of Science in 2006. Since 2007, he is employed as research associate in the Information Processing department at Fraunhofer Institute Manufacturing Engineering and Automation (IPA), Stuttgart, Germany. His areas of interest include software development for 3D data evaluation, 3D metrology and computed tomography, 3D perception in robot vision and dimensional inspection.

Advances in 3D Vision Guided Robotics at Fraunhofer IPA
Jens Kuehnle Research Associate Fraunhofer IPA

Structure

• Motivation
• 3D Measurement Principles
• Applications:
– 3D Obstacle Avoidance for Service Robots
– 3D Object Recognition and Localization for Automation and Handling Engineering
– Case Studies on Bin Picking
• Summary

© Fraunhofer IPA

Fraunhofer Society in Germany

58 institutes at 40 locations; 12,800 employees; a budget of 1.2 billion euros


Fraunhofer IPA, Stuttgart


Focal Fields of Research at Fraunhofer IPA
Corporate Management
Life cycle management
Product creation management
Technical risk management
Hazardous materials and recycling management
Quality management
Factory planning and design
Work-flow and process planning
Production system and plant management
Production optimization
Order processing management
Inbound and outbound logistics
Manufacturing logistics
Supplier parks
Network logistics and supply chain management

Automation
Digital Factory
Visualization and interaction
Technical information systems
New fields of application
Human-machine interaction
Industrial and service robots
Assembly systems
Robot assistance systems
Sensor technology, mechatronics and microsystems engineering
Signal and power transmission
Measurement and testing technology
Ultraclean technology
Rapid product development

Surface Technology
Coating processes and plant for paints and powder coating
Deposition of metals and metal compounds (electroplating, PVD/CVD)
Optimization / planning of processes and plants
Generation of functional layers and nano layers
Analysis and testing techniques for components, layers, surface engineering processes and sources of defects
Simulation of coating processes and spatial flow


Department Technical Information Processing

• 3D Object Recognition
• Robot Vision
• Industrial Image Processing
• Digital Signal Analysis
• Measurement and Testing Technology
• 3D Modeling and Reverse Engineering
• Rapid Prototyping, Rapid Product Development
• Industrial Computer Tomography


Motivation

Motivation
Challenges of Industrial Production:
– innovation/product life cycle ▼
– zero defect production ▲
– part complexity ▲
– robot usage in human environments ▲
– robot accomplishes complex tasks (e.g., parts handling) ▲

Effects on Machine/Robot Vision:
– reconfigurability of systems ▲
– measurement in manufacturing cycle ▲
– intuitive MM-communication; no expert knowledge ▲
– 1st Law of Robotics: collision avoidance ▲
– perception ▲


Motivation
Challenges of Industrial Production (innovation/product life cycle ▼, zero defect production ▲, part complexity ▲, robot usage in human environments ▲, robot accomplishing complex tasks such as parts handling ▲) motivate the two examples treated in this talk:
• real-time modeling of a 3D workspace with memory
• object recognition and localization of industrial parts


3D Measurement Principles

Rough Classification
Triangulation:
• Sheet of Light
• Structured Light
• Stereo
(De)Modulation:
• Time-of-Flight
• Interferometry
Focus Determination:
• Depth-from-(De)Focus

All principles yield a 3D point cloud or a 2.5D depth image of the object's 3D surface.
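The relationship between a 2.5D depth image and a 3D point cloud can be sketched with standard pinhole back-projection (the intrinsics fx, fy, cx, cy below are placeholder values, not parameters of any sensor named in this talk):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a 2.5D depth image (metres) into a 3D point cloud
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop invalid (zero-depth) pixels

# tiny 2x2 example: every pixel 1 m away, principal point at (0.5, 0.5)
pts = depth_to_points(np.ones((2, 2)), fx=1.0, fy=1.0, cx=0.5, cy=0.5)
```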


Triangulation
Sheet of Light (laser line projection, part movement, 2D camera):
• Accuracy < 1 mm
• One profile per measurement; linear movement necessary
• E.g. SICK IVP Ruler E600: 10,000 profiles/sec., 1,536 points each

Structured Light:
• Accuracy < 1 mm
• Depends on contrast and hence on ambient light
• E.g. GOM ATOS III: 4,000,000 points in 2 sec.


Time-of-Flight
Rotated 2D Laser Scanner:
• Accuracy > 3 mm
• One profile per measurement; pivoting movement necessary
• E.g. SICK LMS400: 500 profiles/sec.

Time-of-Flight Camera:
• Accuracy > 10 mm
• Provides depth/intensity image
• Still somewhat prototypical
• E.g. MESA Imaging SwissRanger SR3000: 176 x 144 pixels at 25 Hz
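Phase-based time-of-flight cameras of this class recover range from the phase shift of a modulated light signal. A minimal sketch of the arithmetic, assuming a 20 MHz modulation frequency (a typical value for such cameras, not a figure from this talk):

```python
import math

C = 299_792_458.0   # speed of light, m/s

def tof_distance(phase_rad, f_mod_hz):
    """Range from the measured phase shift of the modulated signal:
    d = c * phi / (4 * pi * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz):
    """Beyond c / (2 * f_mod) the phase wraps and range becomes ambiguous."""
    return C / (2.0 * f_mod_hz)

# assumed 20 MHz modulation: half a cycle of phase is half the range window
d = tof_distance(math.pi, 20e6)   # about 3.75 m
r = unambiguous_range(20e6)       # about 7.5 m
```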


3D Obstacle Avoidance for Service Robots

Obstacle Detection
• A robot completes its tasks in 3D space, so obstacle detection should be done in 3D as well.
• State of the art (SOTA):
  – various sensor systems available (e.g., ultrasound, radar, laser rangefinder, rotated laser scanner, stereo rig)
  – problem: no 3D data set (ultrasound, radar, laser rangefinder) or generation of 3D data is expensive (time: rotated laser scanner; computation: stereo rig)
• Promising sensor alternative:
  – time-of-flight camera provides a 2.5D depth image at video frame rate


Moving Voxel Model with Memory
• 3D capture of robot environment in a voxel model:
  – axis-aligned uniform grid
  – parameters: dimension in x/y/z, voxel size
  – voxel states: unknown (white), free (green), occupied (red)
  – voxel time stamp
  – e.g. 6.4 m x 6.4 m x 3.2 m with voxel size 0.05 m yields 1,048,576 voxels
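The data structure above can be sketched as a uniform grid with per-voxel state and time stamp; the class below is an illustrative reconstruction using the slide's example dimensions, not IPA's implementation:

```python
import numpy as np

UNKNOWN, FREE, OCCUPIED = 0, 1, 2    # voxel states as on the slide

class VoxelModel:
    """Axis-aligned uniform grid with per-voxel state and time stamp."""
    def __init__(self, dims_m=(6.4, 6.4, 3.2), voxel_size=0.05):
        self.size = voxel_size
        self.shape = tuple(int(round(d / voxel_size)) for d in dims_m)
        self.state = np.full(self.shape, UNKNOWN, dtype=np.uint8)
        self.stamp = np.zeros(self.shape, dtype=np.float64)

    def update(self, points, t, state=OCCUPIED):
        """Mark the voxels containing the given (N, 3) points in metres."""
        idx = np.floor(points / self.size).astype(int)
        ok = np.all((idx >= 0) & (idx < self.shape), axis=1)
        i, j, k = idx[ok].T
        self.state[i, j, k] = state
        self.stamp[i, j, k] = t        # time stamp keeps data up-to-date

m = VoxelModel()                       # 128 x 128 x 64 = 1,048,576 voxels
m.update(np.array([[1.0, 1.0, 1.0]]), t=0.0)
```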


Moving Voxel Model with Memory
• Typically, viewing frustum < model dimensions
  – data registered within the model
• Model follows the robot's movement
  – maintain information in the overlap
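Making the model follow the robot while keeping the information in the overlap amounts to scrolling the grid by whole voxels; a generic sketch of that shift (not IPA's implementation):

```python
import numpy as np

UNKNOWN = 0   # state value for voxels with no information yet

def shift_model(state, offset):
    """Translate the voxel grid by whole voxels as the robot moves.
    Data in the overlap region is kept; voxels that scroll into view
    start out as 'unknown' again."""
    out = np.full_like(state, UNKNOWN)
    src, dst = [], []
    for n, o in zip(state.shape, offset):
        src.append(slice(max(0, -o), n - max(0, o)))
        dst.append(slice(max(0, o), n + min(0, o)))
    out[tuple(dst)] = state[tuple(src)]
    return out

grid = np.zeros((4, 4, 4), dtype=np.uint8)
grid[0, 0, 0] = 2                       # one occupied voxel
shifted = shift_model(grid, (1, 0, 0))  # shift contents one voxel along x
```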


Experimental Platform
• DESIRE platform (www.service-robotik-initiative.de)
  – stereo rig: Marlin 145C2
  – 3D sensor: SR3000 on Schunk PW70 pan-tilt unit
  – 7-DOF arm: KUKA LBR3 with Schunk SAH robotic hand
  – laser scanner: SICK S300; biaxial drive
• Voxel model used for obstacle avoidance:
  – 3D collision check of the robot with its environment
  – complements 2D path planning (projection of voxels into the xy-plane)


Performance of Voxelization Algorithm
• Time-of-flight camera SR3000 used
  – calibration + depth correction + bad pixel removal + robot filtering
• Model update rate:
  – CPU algorithm: about 10 Hz
  – GPU algorithm: about 25 Hz
• Time stamps allow keeping the data up-to-date
• More than one sensor can be used


3D Object Recognition and Localization for Automation and Handling Engineering Case Studies on Bin Picking

Object Recognition and Localization
Part supply spectrum (state of the art): sorted parts → partly ordered (2D camera) → randomly stored (3D sensor)

Sorted parts:
• Parts stored in special carriers
• Parts supplied totally ordered
• Vision system redundant
• However, carriers are specifically adapted to the parts stored; variations in the parts usually require the carriers to be changed as well
• Carriers are space-consuming

Object Recognition and Localization
Sorted parts (separated by machinery):
• Parts get separated by special machinery
• Vision system handles parts lying separated from each other
• However, the machinery is specifically adapted to the parts; variations in the parts usually require the machinery to be changed as well
• Machinery is space-consuming

Object Recognition and Localization
Partly ordered (2D camera):
• Parts in a known plane (e.g., belt) with only 3 DOF (translation x/y, rotation z)
  – one 2D camera
  – image processing
• Parts in arbitrary position (6 DOF) with identifiable features (e.g., corners, holes)
  – one (or more) 2D camera(s)
  – photogrammetry

Object Recognition and Localization
Randomly stored (3D sensor):
• Parts "thrown in a box", supplied unordered (e.g., bin picking)
• Parts in arbitrary position (6 DOF) with no identifiable features
  – 3D sensor

Problems like „Bin Picking“
• Applications:
  – handling of known parts with a robot
  – e.g., supplying chaotically stored parts to the manufacturing chain
• A vision system that recognizes and localizes the specified object should meet certain criteria:
  – speed: in time with the fabrication process
  – robustness: 100% performance even on partial data
  – degree of automation: amateur vs. expert operability
  – flexibility towards variations in the parts handled
  – variety of objects that can be handled

Different Approaches at Fraunhofer IPA
• Bin Picking ≠ Bin Picking; applications differ in:
  – part features (e.g., geometry)
  – form of supply
  – etc.
• At Fraunhofer IPA, two different approaches are studied:
  – Approach 1 (database-driven): algorithm uses the CAD model of the part considered
  – Approach 2 (fitting of geometric primitives): algorithm based on best-fitting geometric primitives within the part

Approach 1: Database
• Offline: database contains certain orientations of the considered part (rotated CAD model)
• Online: comparison of scan data with the orientations in the database
• Characteristics (excerpt):
  – depth histograms
  – normal vector histograms
  – significant bounding volumes on the part (positive) and next to the part (negative)

Approach 1: Overview
1. Generation of a point cloud, e.g. with a rotated laser scanner
2. Object localization of workpieces with the offline-generated database
3. Gripping point calculation, movement planning and gripping of a workpiece


Approach 1: Status
• Current state of development:
  – tool to generate the database implemented
  – flexible object localization and gripping-point calculation finished
  – prototype of a bin picking system for shafts implemented
• Currently under development:
  – integration of a robot cell at Hirschvogel Umformtechnik GmbH (Denklingen, Germany) with 8 s cycle time by the end of 2008
  – tests with different types of objects (e.g. ring, housing)

Approach 2 (basics): Segmentation of Geometric Primitives
• Geometric primitives are: plane, sphere, cylinder, cone, torus
• Best-fit principle (non-linear least squares): for each measurement point X_i, let X_i' be its base point (orthogonal projection) on the primitive's surface F, giving the orthogonal distance vector

    d_i = X_i − X_i',    dᵀ = (d_1, ..., d_m)

• Objective function, with weighting matrix PᵀP:

    σ0² = dᵀ PᵀP d

• Optimization problem over the primitive's parameters a:

    min_a  min_{ {X_i'}_{i=1..m} ∈ F }  σ0²( {X_i'(a)} )
Approach 2 (basics): Best-Fitting of Geometric Primitives
• In case the point cloud represents more than one geometric primitive, the iterative fitting can be complemented with an automatic segmentation into inliers and outliers:
  – error of fit ≈ order of magnitude of the measurement error

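A minimal sketch of such an inlier/outlier split for one primitive (a plane fitted via SVD), with the threshold set to the order of the sensor's measurement error as described above; the function name and threshold are illustrative, not IPA's code:

```python
import numpy as np

def segment_plane(pts, tol):
    """Least-squares plane through the centroid (via SVD), then split the
    cloud into inliers/outliers by orthogonal distance vs. `tol`."""
    c = pts.mean(axis=0)
    # plane normal = right singular vector of the smallest singular value
    n = np.linalg.svd(pts - c)[2][-1]
    d = np.abs((pts - c) @ n)
    return pts[d <= tol], pts[d > tol]

# a 5x5 grid on the plane z = 0 plus one outlier 1 m above it
g = np.linspace(-1.0, 1.0, 5)
plane = np.array([[x, y, 0.0] for x in g for y in g])
pts = np.vstack([plane, [[0.0, 0.0, 1.0]]])
inl, outl = segment_plane(pts, tol=0.1)
```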

Approach 2:
• Parts are represented by means of the geometric primitives contained within them (meta-model), e.g. cylinders representing a crankshaft
• Idea: geometric primitives carry enough information to enable proper recognition and localization
• Algorithm combines segmentation and best fit of geometric primitives, followed by registration of the meta-model against the cylinders found in the scan data

Approach 2: Overview
1. Scanning the unordered scene (scan data)
2. Recognition/localization (found cylinders)
3. Collision-free gripping
4. Supplying ordered

Approach 2: Status
• Current state of development:
  – prototype of a bin picking system for "simple" parts implemented
• Currently under development:
  – treatment of more complex parts which can also contain free-form surfaces (e.g., crankshaft)
  – tool to automatically teach the system to handle a yet unknown part
  – tool to automatically generate all parameters needed


Performance of Bin Picking Algorithms
• Computing time:
  – Approach 1: about 0.5 sec. with database < 40 MB
  – Approach 2: about 0.25 sec.
• Accuracy of localization adaptable:
  – Approach 1: dependent on the rotatory resolution of the database (typically 2°)
  – Approach 2: bounded by sensor inaccuracy (typically < 0.5 mm)
• Not dependent on the 3D sensor used
• Successfully adapted to different parts
• Outstanding recognition rate for the parts tested
• Adaptable to different parts:
  – Approach 1: definition of bounding volumes
  – Approach 2: examination of geometric primitives (relation, symmetry, ...)

Summary

• Situations occur where 2D does not suffice; 3D algorithms are needed.
• 3D data is typically more complex than 2D (depth map, point cloud).
• 3D algorithms are computationally more expensive than 2D ones.
• Fraunhofer IPA has many years of experience with 2D/3D.
• Examples of 3D vision guided robotics at Fraunhofer IPA:
  – real-time modeling of a 3D workspace with memory
  – object recognition and localization of industrial parts


Acknowledgments

I would like to thank my colleagues at Fraunhofer IPA, in particular Martin Stotz, Thomas Ledermann and Dennis Fritsch.

This work was partly funded within the research project DESIRE by the German Federal Ministry of Education and Research (BMBF) under grant no. 01IME01A.

Thank you for your Attention!

Contact Information
Jens Kuehnle
Research Associate

Fraunhofer Institute for Manufacturing Engineering and Automation (IPA)
Nobelstrasse 12 D-70569 Stuttgart Germany Telephone: +49 711 970 1861 email: kuehnle@ipa.fraunhofer.de

www.ipa.fraunhofer.de

Vision Guided Part Loading/Unloading from Racks for Automotive Applications – Lessons Learned

Presented by:

Robert Anderson Chrysler LLC

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Robert Anderson New Technology Manager Advance Manufacturing Engineering Chrysler LLC

Robert Anderson Chrysler LLC 800 Chrysler Drive, CIMS 482-04-16 Auburn Hills, Michigan 48326 Phone: 248-944-6076 Fax: 248-841-6272 Email: ra2@chrysler.com
Robert Anderson is currently the New Technology Manager in the Advance Manufacturing Engineering Department, Chrysler LLC. Mr. Anderson is responsible for coordination of technology development and deployment activities for North American assembly plants. Mr. Anderson has developed and used robotic vision systems during most of his career. Mr. Anderson received Bachelor's and Master's of Science degrees from the Ohio State University and a Master's of Engineering degree from the University of Michigan. Before coming to Chrysler, Mr. Anderson worked for General Electric and Automatix, Inc. Mr. Anderson has four patents in powder metallurgy and laser processing and has published several technical articles.

Vision Guided Part Loading & Unloading from Racks for Automotive Applications – Lessons Learned

Robert Anderson
New Technology Manager Advance Manufacturing Engineering

Chrysler, LLC

Automotive Part Loading / Unloading

Topics
• Robotic rack load / unload experience
• Chrysler flexibility strategy
• Why use vision?
• Vision techniques / approaches
• Design, integration & operations lessons
• Plant considerations
• Summary

Scope
• Automotive body shop
• Loading & unloading part racks
• Practical lessons learned
• Specific material delivery, but lessons adaptable to other delivery and conveyance methods
• Topics not covered:
  – image processing techniques, inspection, gauging, bin picking, best-fit guidance, adhesive detection, general assembly, . . .

U.S. Auto Market Fragmentation Drives Lean Processes

(Chart: an increase of about 70%, from 181 in 2000 to 306 in 2008, drives flexible processes.)

Robotic Vision Experience
• Early experience
  – First vision-aided rack unload application in 1997 using stationary cameras.
  – Robot-mounted cameras for rack unloading used first in 2006.
• Current applications
  – Belvidere Assembly was the first Chrysler program to use a comprehensive vision rack unload strategy.
  – All but one Chrysler assembly plant and all Stamping/Sub-Assembly fabrication plants use rack load / unload vision systems.
  – Approximately 75 robotic vision rack load / unload applications.
  – Load and unload a wide range of body shop components: doors, floorpans, body sides, rails, crossmembers, . . .
• Future
  – Standardize systems, incorporate best practices, improve system integration and robustness.
  – Possibly reduce the number of assemblies loaded/unloaded with vision.
  – Looking at bin picking but need a solid business case.

Rack Load / Unload Applications

Why Use Vision ? – Reduce Cost
• Productivity, not technology
• Must consider:
  – direct & indirect labor
  – vision system cost
  – tooling & rack costs
  – training
  – impact on production rate, MTBF, MTTR
  – part transportation
• Must reduce total life cost!

Why Use Vision ? – Flexibility & Floor Space

Why Use Vision ? – Productivity
• Cycle time improvement?
• MTBF, MTTR impact?

Why Use Vision ? – Logistics
Logistics factors:
• One-time rack build and engineering change cost
• Production day rate
• Rack density
• Rack fleet quantity
• Transportation cost
• Part movement & damage
• Rack change frequency

Even the loss or gain of one or two parts per rack costs or saves thousands.
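To see why one part per rack matters, consider a back-of-the-envelope calculation (all figures below are hypothetical, for illustration only, not Chrysler data):

```python
# Hypothetical figures: how one extra part per rack compounds over a year.
PARTS_PER_DAY = 1000      # production day rate (assumed)
COST_PER_TRIP = 15.0      # transport cost per rack trip, dollars (assumed)
DAYS_PER_YEAR = 240       # production days (assumed)

def annual_transport_cost(parts_per_rack):
    """Yearly rack transport spend at a given rack density."""
    racks_per_day = PARTS_PER_DAY / parts_per_rack
    return racks_per_day * COST_PER_TRIP * DAYS_PER_YEAR

# gaining one part per rack (20 -> 21) saves thousands per year
saving = annual_transport_cost(20) - annual_transport_cost(21)
```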

Why use vision?
• Reduce costs: definitely
• Flexibility, floor space: sometimes
• Ergonomics: sometimes
• Productivity: sometimes
• Logistics: no

Select application carefully !

Vision Approaches
• 2D
  – X, Y, Rz
  – Uses auxiliary sensors or a palletizing routine to compensate for Z.
  – May use a robot-mounted or stationary camera.
  – Object cannot tilt out of plane.
  – Often used for flat, stacked sheet metal and lower-accuracy applications.
• 2½D
  – Feature size and shape change interpreted as change in Z.
  – Requires a 3D object model.
  – Object tilt introduces measurement error.
  – Often used for parts which have little rotation in the rack.
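For the 2D case, the (X, Y, Rz) offset the robot needs can be recovered from two or more matched feature points with a planar rigid fit. A generic numpy sketch (a standard Procrustes/Kabsch solution, not any vendor's algorithm; the feature coordinates are made up):

```python
import numpy as np

def offset_2d(ref, cur):
    """Rigid 2D offset (x, y, Rz) mapping reference feature points onto
    the currently imaged ones (planar Procrustes / Kabsch)."""
    rc, cc = ref.mean(axis=0), cur.mean(axis=0)
    U, _, Vt = np.linalg.svd((ref - rc).T @ (cur - cc))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cc - R @ rc
    return t[0], t[1], np.arctan2(R[1, 0], R[0, 0])

# three matched features: part shifted by (5, -2) mm, rotated 10 degrees
ref = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 30.0]])
a = np.deg2rad(10.0)
R0 = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
cur = ref @ R0.T + [5.0, -2.0]
x, y, rz = offset_2d(ref, cur)
```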

Vision Approaches
• 3D with one robot-mounted camera
  – 6 degrees of freedom (X, Y, Z, Rx, Ry, Rz)
  – May use multiple robot-mounted cameras for larger objects
  – Requires a 3D object model
  – Lens size & lighting placement critical
• 3D with multiple cameras
  – May use stationary cameras or a combination of robot and stationary cameras
  – May not require an object model
  – Requires clear lines of sight for stationary-mounted cameras
  – Often used for larger objects
  – Suited to applications which require vision for parts & racks
  – Stationary-mounted cameras less prone to operational problems

System Variations
• Architecture
  – Sensor type: single monochrome CCD camera; one or multiple CCD cameras + laser (triangulation)
  – Sensor mounting: robot; stationary; combination of robot & stationary
  – Robot offsets: 2D & 3D by part model comparison; true 3D by triangulation / stereo
  – Vision system: PC / software; integrated with robot; smart cameras
  – HMI: separate vision screen & keyboard; vision access via robot teach pendant; laptop for setup only
• Optics
  – Lighting: LED; max. response; variable intensity control; none
  – Camera: manufacturer; max. response; pixel size / total resolution
  – Lens: required field of view and resolution; focus, aperture, shutter speed; manual / automatic adjustability
  – Camera housing: manufacturer industrial; integrator
• Support
  – Engineering / integration: vision system supplier; vision, controls, robot integrator; system integrator / line builder
  – Start-up: turn-key; contractual to main OEM; direct
  – Training: contractual to OEM

Robot Mounted Camera: No Incident Angle

Robot Mounted Camera(s): Low Incident Angle
(Diagram: brightness across the object surface varies by factors of roughly 1.15 to 2.5 depending on incident angle.)

Stationary Mounted Cameras

Vision Approach Lesson Learned
• One solution will not likely be best for all applications
  – drives increased cost and complexity
• Custom solutions lack the advantages of standardization
  – need consistent vision components, integration and user interface to optimize operation and troubleshooting procedures
• Standardize by application
  – flat, stacked sheet metal such as floor panels
  – vertical, 3D objects such as rails
  – completed assembly loading such as doors, hoods
  – small components such as cross-members
  – installation applications such as roof and glass fitting
  – can also group by positioning accuracy required

Image Processing
• Lighting
  – robustness for production conditions
• Feature recognition
  – search windows, field of view, accuracy, redundancy, accommodating part movement

Use 3D Vision
• 2½D vision was used due to lack of part features within the field of view
• Select part features to enable use of a 3D object model; apply concurrent engineering, if required, to add product features

Required Accuracy

(Figures: hand-off table; robot-to-robot hand-off.)
• Hand-off tables require less accuracy and may eliminate the need for vision.
• Robot-to-robot hand-off and direct placement in geometry stations require greater accuracy.

Robot Grippers, Tooling
• Interference & tolerances between parts, racks, robots, grippers
• Camera, proximity sensor placement
  – careful design required for multiple models
• Robust design for camera stability
• Direct robot hand-off and passing tables
• Access all parts, including sides / back of rack
• Use of auxiliary sensors
• Use of an alternate back-up process?

Robot Programming

Robot / System Programming
• More complex, innovative programming
• All rack conditions: first, last, front, back, top, bottom, right, left
• Cable movement, protection
• Standards for consistent programming
• Extensive calibration functionality
• System diagnostics
  – image processing & fault messages in user language
  – prevent interference and crashes due to calculated vision offsets
• Recovery
  – semi-automatic routines
  – well documented job aids
  – back-up palletizing operation

(Screenshot: "3D AutoRacking," a partnered solution by Cognex and SHAFI; RELIABOT PC robotic software interface (www.shafiinc.com) showing vision system status, rack full/empty, left vs. right and short vs. long checks, and the computed X/Y/Z/Rx/Ry/Rz offsets.)
Racks – The Biggest Problem

Vision systems capable of adapting to production racks or racks designed for vision ?

Special Racks vs. Rack Validation

Production Rack Considerations

(Photos: a rod for the robot to open rack flippers; a problem with color variation.)

Typical Rack Problems
• Lack of design coordination
  – vision vs. non-vision application
  – clearance with robotic gripper, camera
  – robot programming limitations
  – lack of validation early in development
  – traditional latch mechanisms
• Inconsistent rack fabrication, reuse of old racks
• Improper rack loading
• Parts shifting during shipping
• Rack locating within the station
• Rack damage

Rack Loading, Part Movement

• Parts shift in rack
• Shift during transport
• Manual loading non-repeatability

How much movement is acceptable / excessive?

Rack Recommendations
• Analyze total cost of ownership
– Capital, labor, productivity, maintenance, logistics

• Use a concurrent, integrated process
– Multiple-supplier interaction is key: part design, process design, tooling design, rack design, robot simulation, rack build, validation, . . .

• Use design best practices
– Design racks for vision applications
– Careful consideration of tolerance stack-up
– Best practices include: paint, clearances, fabrication practices, latches, rack locating, dunnage features, etc.

• Implement a rack maintenance program • Use common and / or flexible racks

Integrated Rack / Process Roadmap
(Timeline chart, from program start through final program specification, design release, pre-production and vehicle launch:
• Conduct preliminary logistics studies; conduct rack studies; develop resource allocation plan
• Develop material handling master plan; hold cross-functional workshop; verify workshop assumptions
• Initiate rack design; conduct design reviews; make / buy review; design validation
• Complete and approve prototype racks; prototype validation
• Build pre-production racks; build production racks; conduct rack validation
• Continuous improvements; consolidated lessons learned
Cross-functional milestones were added to the original material handling process.)

Common, Flexible Racks

Integration
• Concept demonstration
• Establish roles & responsibilities
  – various suppliers: vision equipment, vision integrator, robot manufacturer, line builder / system integrator
  – turn-key responsibility for lead supplier
• Design, engineering
  – product features and vision
  – multiple-supplier design reviews, FMEA
  – seamless system integration
• Pre-production validation
  – validate in / out of tolerance conditions regarding part positioning, angle, surface finish, ambient light, etc.
• Acceptance testing
• Start-up & early production support

Productivity, not technology; factory, not laboratory!

“Good” Applications
• Repeatable part features
• Repeatable part presentation
• Less precision required (passing tables, clamping, tooling)
• Robust image processing
  – insensitive to ambient light, part finish, part movement
  – redundant feature recognition
• Robust tooling and sensors
• Easy recovery
• Duplication of, or high similarity to, other successful applications

Standardize across programs and plants.

“Challenging” Applications
• Parts, tooling, racks & vision system not designed concurrently
  – feature search area too small & image processing not tolerant of production variations
  – rack & tooling engineering changes during start-up
• Part movement during transport and/or robot pickup
• Precision robot placement and processing required
• Cumbersome robot recovery impacts MTTR

Perform careful upfront engineering.

Vision Not Required

Don’t use vision unless it is certainly the best alternative

Plant Considerations - Training
• Know your audience
  – limited technical information with extensive hands-on
  – identify specialists and generalists
• Don't train too early (or too late)
  – build it into the master schedule & account for personnel availability
• Training is not a one-time event
  – conduct "fire drills"
  – conduct refresher training

Plant Considerations – Readiness
Job aids should cover: calibration, vision failures, retry / abort, communication failures.

Vision failure troubleshooting (problem → probable cause):
• Trigger is OK, response fails → lost communications between vision computer & robot
• Offsets exceed limits (min. X, max. Y, max. Rx, etc.) → camera moved, or limits too narrow
• High uncertainty value → feature on edge of search window, or feature found in wrong location
• Repeatedly missing feature → search window or lighting problem
• Problem typically with one style only → part repeatability problem; tends to exclude camera failure
• Fail with all features "C" → camera unplugged or cable damaged
• Fail with all features "0" → no part; camera moved / failed; light changed / failed

Readiness assessment:
• System start-up: connections, power-up, software navigation, . . .
• Operation: robot programming, lighting, reference images, diagnostics, . . .
• Fault recovery, restore, backup
• System maintenance: alignment, replacement, calibration, cleaning, . . .
• Engineering changes
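The "offsets exceed limits" and "high uncertainty value" checks above amount to a guard applied to every computed offset before the robot moves. An illustrative sketch (the limit values, field names and threshold are hypothetical, not from any Chrysler system):

```python
# Hypothetical per-axis limits for a computed vision offset.
LIMITS = {"x": 25.0, "y": 25.0, "z": 15.0,     # mm
          "rx": 3.0, "ry": 3.0, "rz": 5.0}     # degrees
MAX_UNCERTAINTY = 0.8                           # illustrative threshold

def offset_ok(offset, uncertainty):
    """Return False to raise a fault and retry instead of moving."""
    if uncertainty > MAX_UNCERTAINTY:
        return False
    return all(abs(offset[axis]) <= limit for axis, limit in LIMITS.items())

good = offset_ok({"x": 1.2, "y": -0.4, "z": 0.0,
                  "rx": 0.1, "ry": 0.0, "rz": 0.5}, uncertainty=0.2)
bad = offset_ok({"x": 40.0, "y": 0.0, "z": 0.0,
                 "rx": 0.0, "ry": 0.0, "rz": 0.0}, uncertainty=0.2)
```

Rejecting an out-of-range offset and re-imaging is what prevents the crashes due to calculated vision offsets mentioned earlier.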

Plant Considerations – Optimization
(Images: the same search window, once with shadow influence and once with lighting optimized.)

Build system availability and expertise into your process.

Plant Considerations – Problem Solving
• 50% of all downtime and 55% of all occurrences were caused by two clamps (C and D) and a vision error.
• The data revealed that C and D faults can be traced back to dunnage point B being out of position.
• Two other clamps, PP2 and PP3, account for 46% of remaining downtime and 55% of remaining occurrences and can be caused by dunnage point B and/or proximity switch problems.
(Diagram, Robot 1: end-effector clamp points A–F and part-present switches PP1–PP3 with their FIS message numbers; legend distinguishes dunnage-related points, proximity-switch-related points, message number codes, and end-effector clamp points; problem area at lower dunnage point B; points E and F are Dodge-only, 319/320 Chrysler-only.)

LH outer body side aperture, four-week period 07/23 to 08/17/2007:

FIS Msg | Time    | Occurrences | Message description
247     | 1:09:23 | 51          | ISRA VISION GENERAL ERROR
272     | 1:15:50 | 63          | PP1 - PART PRESENT 1 FAULT
273     | 0:30:35 | 22          | PP2 - PART PRESENT 2 FAULT
274     | 1:02:22 | 85          | PP3 - PART PRESENT 3 FAULT
318     | 0:06:17 | 3           | A-MAT HNDLNG CYL A EXT FAULT
319     | 0:08:41 | 19          | B-MAT HNDLNG CYL B EXT FAULT
320     | 1:03:06 | 125         | C-MAT HNDLNG CYL C EXT FAULT
321     | 1:05:30 | 59          | D-MAT HNDLNG CYL D EXT FAULT
322     | 0:01:09 | 2           | E-MAT HNDLNG CYL E EXT FAULT
323     | 0:03:36 | 2           | F-MAT HNDLNG CYL F EXT FAULT
TOTALS  | 6:26:29 | 429         |

Plant Considerations – Maintenance
• Establish comprehensive PM practices
• Ensure vision system & robot software back-ups
• Stock required spare parts, especially high-flex cables
• Establish a process to identify and repair bad racks
– Identify responsibility, source, funding

• Identify long term corrective actions to reduce and eliminate need for rack maintenance
– Both vision and rack improvements

Plant Considerations - Support
• Comprehensive robot programming required
– Integrated vision system operation & robot programming

• Establish service agreements up front
– What service is provided with system ? – Ensure on-going support is provided if plant support is uncertain

• Material handling coordination
– Especially early in the program

Summary – Lessons for Suppliers
• One vision technique or approach is likely not optimal for all applications
• Know outside constraints: racks, part variations, tooling, robot programming, etc.
– All suppliers & customer groups must work together.

• Robust solutions are a must
– Validate all production situations

• System operation must be easy for plant personnel
– Recovery to minimize MTTR is crucial

Summary – Lessons / Questions for Users
• Does the use of vision really save me money considering all cost factors?
– Are there less complicated alternatives?

• Is this a proven application?
– Innovation is good, but make sure there is adequate beta-production validation

• Does my plant have culture to accept, learn and maintain complex systems?
– Do I (plan to) have vision technicians and well trained maintenance personnel ?

Summary – Lessons / Questions for Users
• Do I have a detailed specification governing design, performance, integration, support, etc. ? • Do I have a trusted, turnkey supplier ? • Does my start-up plan allow time and material to fine tune process and work out bugs ? • Has the system been designed with careful attention to the racks ?
– Added up-front costs are likely required to meet expected production requirements.

Integration is the key !

Contact Information

Robert Anderson
New Technology Manager Advanced Manufacturing Engineering

Chrysler, LLC
Auburn Hills, Michigan Telephone: 248 944 6076 email: ra2@chrysler.com

Acknowledgements to ISRA Vision and to SHAFI, Inc. a Braintech company, for providing some presentation material.

Random Bin Picking Technical Challenges and Approach

Presented by:

Babak Habibi Braintech Inc.

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Babak Habibi CTO Braintech, Inc.

Babak Habibi Braintech, Inc. 102 - 930 West 1st Street North Vancouver, British Columbia V7P 3N4 Canada Phone: 604-988-6440 Fax: 604-986-6131 Email: bhabibi@braintech.com
Bio Not Available at time of print.

Presentation not available at time of production.

Random Bin Picking Applications/Solutions

Presented by:

Steven West ABB, Inc.

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Steven West Business Development Manager ABB, Inc.

Steven West ABB, Inc. 1250 Brown Road Auburn Hills, Michigan 48326 Phone: 248-391-9000 Fax: 248-391-7390 Email: steven.w.west@us.abb.com
Bio Not Available at time of Print.

Random Bin Picking Applications and Solutions
Steven W. West Development Manager ABB Inc.

Random Bin Picking
The Robotic Bin Picking Challenge: • Extreme part overlap and occlusion • Significant lighting variability and shadowing • Shortage of distinct features on parts • Collision avoidance with other parts, tools and bins • Factory ready and RELIABLE!

Random Bin Picking

Automotive OEM pilot cell: an ABB IRB 2400 picks connecting rods and places them on a feeder conveyor

RBP – Software (examples)

Braintech eVision Factory™:
• Multi Part Candidate Tracking, to offer a choice of identified pickable parts
• Pick Candidate Re-verification, to enable efficient re-use of identified parts from previous pick cycles
• Multi Grasp Point Selection, to provide multiple grasping points and scenarios for a given part, while also increasing speed and accuracy when grasping each part
• Advanced 3D Range Data Analysis, to confidently move to and grasp parts without collisions

ABB BinWare™:
• Simplified User Interface provides easy set-up for the most common operations required to develop a robotic bin picking system
• Robust Motion Planner identifies the optimal path for the robot to pick a part from the bin
• Position Reachability checks whether the robot can reach the image capture point and the pick point selected by the vision system
• Collision Avoidance detects if the robot tool and arm will collide with the bin walls

RBP – System Features
Standardized compliant tooling handles minor collisions, making the gripper more likely to grasp the part:
• 6-DOF compliance
• compliance devices in series with variable amounts of compliance and degrees of freedom
• adjustable (pneumatic) compliance controlled by the robot program
• robot measures tool compliance (air-based: air pressure; spring-based: proximity sensors)
• use one or more digital limits / signals, or use analog I/O
• limits trigger the robot to stop or change direction

Auto recovery from collision detect / motion supervision faults: the robot is capable of automatically restarting should the gripper collide with a neighboring part. (Video: collision recovery.)

RBP – Optimal Applications
Part types:
• 3D geometry
• Dull surface finish
• Semi-planar with 2 – 3 resting positions
• Examples: castings, forgings, plastic molded parts, flat stamped blanks

Optimal Application

Optimal Application

RBP – Optimal Applications
• Easily recognizable 2D features for initial detection of possible candidates
• Edges that can be consistently used to detect the part footprint, even if they are the silhouette (very important)
• Bins that have angled side walls near the bottom

RBP – Optimal Applications
Complexity factors:
• Parts that link, hook or wedge together
• Multi-planar parts that rest in more than three positions
• Small parts, less than 10 cm
• Limited pick points
• Parts with deep cavities
• Parts that appear to have self-similar areas
• Bottom and sides of the bin (deep bins with 90-degree angles, a "deep box")

RBP – Optimal Applications
Semi-structured RBP:
• 3D picks from exit conveyors
• RBP module solves part overlapping challenges
• Does not require parts to be singulated

RBP – Gripper Design
Considerations
• End-of-arm tooling
  – narrow footprint for gripper fingers
  – rounded edges on gripper fingers and gripper body
  – hooks, suction cups and magnetic grippers
  – multi pick point grippers (e.g. ID and OD fingers)
  – cable and hose management
• Re-grip and flipper stands
• Part-present switches, and sensors to detect multiple parts picked or parts that have slipped out of position prior to placement
• Robot uses a specialized tool to move parts away from bin walls
• EOAT design for RBP necessitates "out of the box" thinking

RBP – Gripper Design
“out of the box thinking”

Ishikawa Namiki Laboratory – high speed multi-finger hand

RBP – Cell Design
Considerations • Bin location and orientation – Bin location techniques: • Mechanical • Vision • Robotic • Bin logistics • Bin geometry – New programs – Existing programs • Safety system • Robot reach and mounting – Inverted, shelf-mount, standard • Manual back-up

RBP – Application Development
Step 1: Define requirements and Design Concepts:
Outputs: • Simulation Model • Equipment specification including robot model, robot risers, bin / box locators, safety system, etc... • Gripper concept drawn in 3D model • Cycle time study • Proposal and budget for Step 2

Step 2:

Design and Build Test System:

Outputs: • Design and build prototype gripper • Perform 10,000 pick test run • Measure pick rate quality, system availability, and cycle time • Proposal and budget for Step 3

Step 3:

Design and Build Production Pilot System

Outputs: • 1st production ready unit • Lessons learned from pilot installation • Standardization of factory specific requirements

RBP – Business Case
The “Traditional” Business Case for Robot Automation 1. Reduce operating costs 2. Improve product quality & consistency 3. Improve quality of work for employees 4. Increase production output rates 5. Increase product manufacturing flexibility 6. Reduce material waste and increase yield 7. Comply with safety rules and improve workplace health & safety 8. Reduce labour turnover and difficulty of recruiting workers 9. Reduce capital costs (inventory, work in progress) 10. Save space in high value manufacturing areas
Based on research carried out by the International Federation of Robotics (IFR) Published in World Robotics 2005

RBP – Business Case
Random Bin Picking - Business Case “Adders”

1. Perfectly ergonomic: eliminates noise from vibratory feeding systems; eliminates the potential for worker injury (repetitive motion; workers hit by forklifts moving bins into position; injuries resulting from dropped parts, pinched fingers, etc.)

2. Reduce MRO costs: robotic bin picking cells are mechanically simple systems (robot and gripper), much less complex than vibratory feeding and/or mechanical positioning systems; lift assists and hoists often require frequent maintenance and spare parts

3. Improve productivity: robots don't stop to talk about the “big game,” causing the line to starve for parts


RBP – OEM Pilot Project
Objectives OEM Automotive Customer – Reduction of packaging and material handling costs. The pilot project offers this customer an opportunity to measure the production readiness of bin picking technology without disrupting their production process. ABB – Prove that robotic bin picking technology is production ready and develop factory-based lessons learned to support this customer and other customers on future bin picking orders.

RBP - Pilot Project Performance
Metric                 Test 1     Test 2
Total Parts Picked     1125       3005
System Failures        2          5
System Availability    99.8222%   99.8336%
Total Faults           42         147
Pick Rate Quality      96.2667%   95.1082%

Faults: Do not stop the robot or require manual intervention. •Dropped part •No part present in gripper •Two parts linked together •Automatic recovery from collision detect

System Failures: Require operator intervention or a restart of the system. •Gripper stuck between parts •Axis 4 joint out of range •Vision system does not detect pickable parts at the bottom or side of the bin (on average, 13 parts remain at the bottom of the bin when the system times out)
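The availability and quality percentages reported above are consistent with simple ratios over total parts picked. A minimal sketch of that arithmetic (the exact formulas ABB used are an assumption inferred from the published numbers):

```python
def pilot_metrics(total_picked, system_failures, total_faults):
    """Return (availability, pick_rate_quality) as percentages,
    assuming availability = 1 - failures/picks and
    quality = 1 - faults/picks."""
    availability = 100.0 * (1 - system_failures / total_picked)
    quality = 100.0 * (1 - total_faults / total_picked)
    return availability, quality

for name, picked, failures, faults in [("Test 1", 1125, 2, 42),
                                       ("Test 2", 3005, 5, 147)]:
    avail, qual = pilot_metrics(picked, failures, faults)
    print(f"{name}: availability {avail:.4f}%, pick rate quality {qual:.4f}%")
```

Running this reproduces the table's figures (e.g. Test 1: 99.8222% availability, 96.2667% pick rate quality).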

Contact Information
Steven W. West
Development Manager, Vision Guided Robotics

ABB Inc.
1250 Brown Road Auburn Hills, Michigan 48326 USA Telephone: 248-393-7120 email: steven.w.west@us.abb.com www.abb.com

The Need for Generic 3D Bin Picking

Presented by:

René Dencker Eriksen Scape Technologies

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

René Dencker Eriksen Chief Technology Officer Scape Technologies

René Dencker Eriksen Scape Technologies Kochsgade 31 C, 3. sal DK-5000 Odense C Denmark Phone: 45 65 25 66 03 Fax: 45 70 25 31 14 Email: rde@scapetechnologies.com
Ph.D. in Computer Systems Engineering, University of Southern Denmark. Subject: Efficient interpretation of digital images in a structured environment. Three years as Assistant Professor at the Maersk Mc-Kinney Moller Institute for Production Technology at the University of Southern Denmark. Co-founder of Scape Technologies A/S. Chief Technology Officer at Scape Technologies.

Generic 3D Bin Picking
René Dencker Eriksen Chief Technology Officer Scape Technologies

Overview
• About Scape Technologies • True 3D Bin-Picker – Why and What? • Demands from the Industry • Challenges in Bin-Picking • Case Studies • Final Words

About Scape Technologies
• Established 1st of February, 2004 • Spin-off of University of Southern Denmark • Focus on industrial bin-picking based on a patented vision-algorithm.

Why Use a Bin-Picker?
• Rationalization: Labor reduction, fewer mechanical systems

• Productivity: Higher efficiency, 24-7 production

• Flexibility: Reconfigurable, handles many different objects, takes up little space

• Workforce: Replaces unattractive jobs, avoids repetitive tasks for your workforce

Demands from the Industry
• Fast cycle times • Generic system to keep total cost low • Easy to use (as easy as any 2D system) • Handling of different part types • Light conditions may change • Small size of robot work cell • Specific robot brand • Constraints on placement

True 3D Bin-Picker
SCAPE workcell components (diagram): SCAPE Workcell Manager, standard fluorescent lamps, FireWire camera, SCAPE Workcell Studio, handling station, SCAPE Tool unit, SCAPE communication server, robot controller (Ethernet, RS232), SCAPE computer, and pallet box (up to 4 frames high)

Bin-Picker Cell Building

Challenges in Bin-Picking
• Recognition of parts • Background is cluttered • Robot path planning is complicated • Collision avoidance is necessary • Gripping strategy is needed • Gripper design • Re-grip, precision grip, and orientation control may be needed

Challenge 1: Recognition
• 6 degrees of freedom make the parts much harder to recognize. • Recognition cannot be trained in a generic way from real images.

Challenge 2: How to grip a part
• Must have multiple grip positions on the parts. Two reasons can prevent a grip: 1) the grip position on the part is facing downwards; 2) the grip is impossible because the robot would collide with the bin frame.

Challenge 3: Gripper Design
We need grip positions from all sides of the part. In most cases this requires two or more grippers on the “Tool-Unit”. Gripper examples:

Challenge 3: Gripper Design
Flexible Tool-Unit

Challenge 4: Path Planning
• Must avoid robot joint limits • Must include collision avoidance

Challenge 5: Re-grip
• A “Handling Station” is in most cases needed for: • Re-grip so parts can be placed correctly • Precision grip • Orientation control to ensure correct grip

Challenge 5: Re-grip
Handling Station Examples

Case Study: Picking Disks
Requirements: • Total cycle time: 10 seconds • Handle 71 different disks (diameter and thickness) • Two pallet frames • Robot has many other tasks, so bin-picking can only take about 5 seconds.

Case Study: Picking Disks

Case Study: Picking Rotor Cans
Requirements: • Cycle time: 9.5 seconds • 4 pallet frames • Remove slip sheet between each layer

Case Study: Picking Rotor Cans

Case Study: Picking pipes
Demo Cell Used by AH Automation Example of: • Completely randomly placed parts • Deep bin (4 pallet frames) • Need for multiple grippers

Case Study: Picking pipes

Case Study: Split Cone and Nuts
Requirements: • Bin-pick two different products from two bins • Compact • Robot has many other tasks outside the bins

Case Study: Split Cone and Nuts

Final Words True 3D Bin-Picking is available now!
Things to come: • Even more generic • Improved path planning • More automated training of parts • Handling a bigger variety of part types

Contact Information
René Dencker Eriksen
Chief Technology Officer

Scape Technologies
Kochsgade 31 C, 3. sal DK-5000 Odense C Denmark Telephone: 45 70 25 31 13 email: rde@scapetechnologies.com www.scapetechnologies.com

Robot Visual Servoing – Opportunities and Challenges Ahead

Presented by:

Jane Shi and James Wells General Motors Corporation

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Jane Shi Staff Researcher GM R&D Center

Jane Shi GM R&D Center MC 480-106-359 30500 Mound Road Warren, Michigan 48090 Phone: 586-986-0353 Fax: 586-986-0574 Email: jane.shi@gm.com
Currently, Jane Shi works at the General Motors R&D Center in Warren, Michigan as a staff researcher, focusing on the fundamental challenges in achieving reliable, robust, and capable autonomous and intelligent robotic systems for automotive assembly. Her delivered research results range from analytic models and innovative methods to data analysis and related practical tools that address a variety of automotive manufacturing challenges in order to improve flexibility, efficiency, and reliability. Dr. Shi joined the GM R&D Center in 2002. Prior to 2002, Dr. Shi’s work experience includes FANUC Robotics America, Inc. (1994-2002) and NIST (1988-1989). Jane earned her Ph.D. in robotics from Kansas State University in 1995. Dr. Shi is a member of the IEEE and the Robotics and Automation (RA) Society.

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

James Wells Senior Staff Research Engineer GM R&D Manufacturing Systems Research

James Wells GM R&D Manufacturing Systems Research MC 480-106-359 30500 Mound Road Warren, Michigan 48090 Phone: 810-602-9879 Fax: 586-986-0574 Email: james.w.wells@gm.com
Jim joined GM’s Manufacturing Engineering organization in 1979 and has been working in the area of robotics since 1982. Jim has held engineering and management positions primarily responsible for robot application development and manufacturing program support, including simulation, paint operations, body assembly, and robot procurement and specifications. Jim joined the R&D MSR group in 2003 and is currently a Senior Staff Research Engineer working on developing advanced robotics with low cost flexible tooling and equipment for vehicle assembly. Jim has served SME Robotics International as Chairman of the RI Board of Advisors (1995) and is currently on the board of the RIA (Robotic Industries Association). Jim holds a Bachelor’s degree in Electrical Engineering from Rochester Institute of Technology and a Master’s degree in Engineering from Purdue.

Presentation not available at time of production.

3D Robot Guidance for Cosmetic Sealer Applications

Presented by:

Kevin Taylor ISRA VISION SYSTEMS

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Kevin Taylor Vice President ISRA VISION SYSTEMS

Kevin Taylor ISRA VISION SYSTEMS 3350 Pine Tree Road Lansing, Michigan 48911 Phone: 517-887-8878 Fax: 517-887-8444 Email: ktaylor@isravision.com
Kevin has been with ISRA Vision since 1999. His first years with ISRA were solely in a sales capacity. Several years ago he assumed the position of Vice President, with responsibility for the North American Business Unit. Prior to that, he spent 8 years selling automation to the automotive industry.

3D Robot Guidance for Cosmetic Sealer Applications
Kevin Taylor Vice President ISRA VISION

What is Cosmetic Sealer?
Cosmetic sealer is a sealer application where the appearance of the bead is important since it is painted and seen by the customer. Examples include:
Roof Ditch • Door Hem Flange • Hood • Trunk Lid

Why automate Cosmetic Sealer?
Cosmetic sealer applications are automated for the following reasons:
Greatly Improved Quality of Appearance Consistency of Applied Sealer Reduction of Manpower Reduced Material Consumption

Why is 3D vision required for Cosmetic Sealer applications?
Cosmetic sealer applications require VGR for the following reasons:
Non-repeatable Positioning of Car Body Non-repeatable Positioning of Part to be Sealed Tight Tolerances of Application

The vision system provides:
3D Position of the Part to be Sealed Clearance Verification for Dispense Nozzle Part Position can be Detected with Robot Moving or Stopping in Multiple Acquisition Positions

Picture courtesy of SCA Schucker

Picture courtesy of SCA Schucker

Picture courtesy of SCA Schucker

Picture courtesy of SCA Schucker

Video courtesy of Esys Corporation

• Application
– 3D Measurement of Part in Open or Closed Position
• Measurement Accuracy: ± 0.1 mm • Measurements: Typically four (4) measurement points per part • Measurement Time: Less than 1 second per measurement

– Verification of Bead Position (optional) – Verification of Bead Geometry (optional) – Verification of Bead Surface (optional)

• Multiple Configurations
Stationary Sensors
• Minimum cycle time • Inflexible for Multiple Styles • Multiple Sensors Required

Sensor on Robot: Multiple Measurements
• Maximum Flexibility • Lowest Cost Solution • Increased Cycle Time Due to Multiple Measurements • Robot Moving or Robot Stopped

• Sensor Requirements
• High Accuracy • Measurement Time of < 0.2 Seconds per Feature • Robust Construction for High Reliability • Compact Design • Automated Calibration • Possibility for 3D Measurement AND Inspection • Possible to Detect Edges and Holes to Allow Opening of Closures • Suitable for Robot Mounting

Typical Requirements for Sensors for High Precision Robot Guidance
– Highly Accurate to 0.05 mm – Mobile or Stationary Mounting – Measurement Features: Edges, Holes, Planes, Free Form Features, Flush & Gap – Sensor Lengths: 300 mm to 800 mm

• Measurement Principle
Area Lighting Multi Line Projector

Camera Plane

3D Point

• Multi Line Sensors are Robust even with Disturbed Features

• Software Requirements for Sealant Applications
• Combination of 3D Measurement and Inspection • Software Platform is Identical to Other Accepted Robot Guidance Systems • Improved Algorithms to Increase Robustness of Measurement • Automated Calibration

Software Improved with: • Bead Inspection • Color Functionality

• Ideal Software

Annotated screen regions: menu bar, tool bar, indicator bar, action area, interactive area, information area, status bar

• Ideal Software - Diagnostics
All indicators are green – system is ready for operation

Overview Button Selected

Display of last measurement result, group, and task

Display of current system messages

• Ideal Software - Diagnostics
Red indicator signals source of problem

Red indicator signals last measurement failed

In the overview image the failed feature is indicated

Text line for error and diagnostic messages

• Software: Environmental light and color independent measurement • Problem: Blooming
– Destruction of image information by overexposed pixels due to reflections in the imaging area

• Software: Environmental light and color independent measurement • Solution: Color Control Functionality
Completed Image: Images at each shutter time:

• Software: Environmental light and color independent measurement • Solution: Color Control Functionality

Camera image of black and light car body Optimal contrast for all colors!

Application Solution
– 3D Measurement of Open Door
• • • • Measurement Accuracy: ± 0.1 mm Mobile Sensor Mounted on Robot Measurements: Four (4) Features on Door Measurement Time: 0.2 Seconds for Measurement Plus Robot Movement

Measurement points for position detection Measurement points for clearance check

• 1 Sensor on Robot Measuring 4 Door Features

Offset Vector (x,y,z,Rx,Ry,Rz)
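An offset vector like this is typically converted to a homogeneous transform and composed with the nominal robot pose. A minimal sketch, assuming a fixed-angle Rz·Ry·Rx rotation convention (real robot controllers differ in rotation order and units, so treat this as illustrative only):

```python
import numpy as np

def offset_matrix(x, y, z, rx, ry, rz):
    """Build a 4x4 homogeneous transform from an (x, y, z, Rx, Ry, Rz)
    offset vector, with angles in radians. The Rz @ Ry @ Rx composition
    here is an assumption, not a documented controller convention."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # rotation part
    T[:3, 3] = [x, y, z]       # translation part
    return T

# Hypothetical correction: rotate a nominal point 90 deg about Z, then shift.
corrected = offset_matrix(1.5, -0.3, 2.0, 0, 0, np.pi / 2) @ np.array([10, 0, 0, 1])
# corrected[:3] is approximately [1.5, 9.7, 2.0]
```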

• Bead Inspection
– Bead Presence/Absence

Hem Flange – No Bead

Hem Flange – Bead Present

• Bead Inspection
– Bead Geometry

Bead Too Wide and Shallow

Bead too Narrow

• Bead Inspection
– Bead Position and Height

Bead Position 2 mm too low Bead 0.7 mm too thick

Bead Position 2 mm too high

• Bead Inspection
– Optional Possibility: Surface Inspection

• Bead Inspection
– Optional Possibility (In Development): Surface Inspection Scaling

Cosmetic Sealing

Contact Information
Kevin Taylor
Vice President

ISRA VISION
3350 Pine Tree Road Lansing, Michigan 48911 USA Telephone: 517-887-8878 email: ktaylor@isravision.com www.isravision.com

Combining Machine Vision and Robotics to Mimic Complex Human Tasks

Presented by:

Michael Muldoon Averna Vision & Robotics

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Michael Muldoon Business Solutions Engineer AV&R Vision & Robotics Inc. (Averna Vision & Robotics)

Michael Muldoon AV&R Vision & Robotics Inc. (Averna Vision & Robotics) 269 Rue Prince Montreal, Quebec H3C 2N4 Canada Phone: 514-788-1420 Fax: 514-866-5830 Email: michael.muldoon@avr-vr.com
Mike studied at the University of Windsor, graduating with a Bachelor of Applied Science in Electrical Engineering. Since then his main focus has been expanding machine vision and robotics technologies for applications ranging from laser welding to depalletizing to deburring and surface inspection. He spent the first part of his career in Windsor, Ontario, heavily involved in automotive production, where the volume and range of parts was always the challenge. Now, in Montreal, Quebec, he specializes in aerospace applications, which have different demands: low-volume, complex, high-precision production environments.

Combining Machine Vision and Robotics to Mimic Complex Human Tasks
Michael Muldoon, P.Eng Business Solutions Engineer AV&R Vision & Robotics
(Averna Vision & Robotics)

Presentation Agenda
• Automation Challenges • Aerospace Industry • Material Removal • Surface Inspection • CAD-to-Path Strategies • Performance Testing • Case Study 1: Surface Inspection & Gauging System • Case Study 2: Finishing & Inspection System

Automation Challenges
Some Areas Typically Left to Humans:
• High-accuracy material removal/part finishing; • Surface inspections on complex, brilliant surfaces; • Random bin picking; • Assembly where there are tight tolerances and part variations; • Low-volume & high-complexity tasks.

“Humans are wired for change and very easily adapt to changing conditions, but are not very consistent”

Aerospace Industry Challenges
• Low volume production (batches) & frequent changeovers • Tight tolerances and complex surfaces usually left to humans to finish & inspect • Aerospace engine manufacturers’ parts are typically shiny metal parts with complex surfaces and critical quality constraints • Two areas that are drivers of change: – Material Removal (ergonomics, cost savings, quality) – Surface Inspection (consistent quality, critical requirements, labor intensive)

Material Removal
Typical Material Removal Requirements (Deburring) • Break all sharp edges, remove all burrs • Create a radius on edge • Typical 0.005" - 0.025" Control Strategies & Process Variation: • Feed Control • Consistent part location and size • Inconsistent burr or flash size • Pressure Control • Inconsistent part location & size • Consistent burr or flash size • Tool/feedback compliance • Active/passive • Calibration • Tool wear • Part Variations & Location

Surface Inspection
Random Surface Defect Detection •Resolution
•Typical min. defect size 0.001-0.015"

•Defect Classification
•NI Particle Analysis VI

Defect Examples: • Dents • Scratches • Cracks • Pits • Ripples • Die Marks

Inspection Algorithms

Raw Image

Left 0.008"

Right 0.016"

Inspection Algorithms

Raw Image

Left 0.020"

Right 0.020"

CAD-to-Path Strategies
Simulation Environment Vision Environment (Station HMI) Real-World Production

Future: real-time analysis & adjustment of robot path/recipe (e.g. cast parts, MRO facilities)

Performance Testing
Statistical Analysis Requires: • Defined performance requirements • Test plan • Samples! • Ability to present useful information from the data

Pair    Po      Pe      Kappa
A1-A2   0.995   0.7485  0.98
A1-A3   0.985   0.7414  0.94
A2-A3   0.98    0.738   0.92

Overall Repeatability Kappa: 0.95 (Good Agreement)
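The kappa values reported here follow Cohen's formula, κ = (Po − Pe) / (1 − Pe), which measures agreement between two appraisers beyond what chance would produce. A quick check against the tabulated Po/Pe pairs:

```python
def cohens_kappa(p_o, p_e):
    """Cohen's kappa: observed agreement p_o corrected for
    chance agreement p_e."""
    return (p_o - p_e) / (1 - p_e)

# Po/Pe pairs from the repeatability study above.
pairs = {"A1-A2": (0.995, 0.7485),
         "A1-A3": (0.985, 0.7414),
         "A2-A3": (0.98, 0.738)}
for pair, (po, pe) in pairs.items():
    print(f"{pair}: kappa = {cohens_kappa(po, pe):.2f}")
```

This reproduces the 0.98 / 0.94 / 0.92 column; values above roughly 0.9 are conventionally read as very good agreement.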

Case Study 1: Surface Inspection & Gauging System
Description:
•Low volume production (batches) & frequent changeovers •Complex Surfaces usually left to human inspection •High levels of precision (+/-0.001") •Multiple inspection requirements: •Dimensional gauging •Surface Inspection •OCR Inspection

Solution Overview
Description:
•Stand-alone system •Approx. cycle time: 12 seconds per part •100% Inspection with guaranteed results • System Components: •Fanuc LR-Mate 200iC •LabVIEW 8.5 •NI-PCIe-1430 frame grabber •NI-PCI-6514 I/O card
Inspections: OCV serial number, root inspection, surface inspections, dimensional measurement

Case Study 2: Part Finishing & Inspection System
• Inspect serial number for part tracking & traceability • Deburr & create a radius on all finished edges of the root form • Inspect all finishing operations • Inspect for random surface defects (scratches, nicks, pits, etc.)

Blade Root Form

Solution Overview
•Robot: • Fanuc LR Mate 200iC • 5 kg payload • R-J3 controller •Vision System: • NI 1722 Smart Camera • ring light •Software: • LabVIEW 8.5 • inspection sequence • user interfaces (viewer, alarm, calibration) • NI Vision Acquisition algorithms
HMI Robot Controller PC

Bar Light

Robot

Final Quality Part

Finish

Inspect

Contact Information
Michael Muldoon, P.Eng
Business Solutions Engineer

AV&R Vision & Robotics Inc.
269 Rue Prince Montreal, Quebec Canada Telephone: 514-788-1420, x532 email: michael.muldoon@avr-vr.com www.avr-vr.com

Using 3D Laser Scanning for Robot Guidance

Presented by:

Brian Windsor SICK, Inc.

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Brian Windsor Business Development Manager - Machine Vision SICK, Inc.

Brian Windsor SICK, Inc. 6900 West 110th Street Minneapolis, Minnesota 55438 Phone: 810-923-1880 Fax: 248-997-1068 Email: brian.windsor@sick.com
Brian has been working at SICK, Inc. for 3 years and is currently a Business Development Manager for SICK’s Machine Vision products. He has been involved in technical sales and engineering of industrial sensor and machine vision products for the past 15 years.

Using 3D Laser Scanning for Robot Guidance
Brian Windsor Business Development SICK, Inc.

3D Imaging Technology

What is 3D Profiling?
Camera captures the laser light resulting in a contour of the surface

The complete shape is built up from contour slices as the object moves through the laser plane
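The slice-stacking described above can be sketched in a few lines: each laser profile is one row of heights, and successive profiles stack into a 2D range image. This is an illustration of the idea only, not the SICK camera API; the list-per-profile representation is an assumption.

```python
import numpy as np

def build_height_map(profiles):
    """Stack successive laser-line profiles into a 2D range image.
    Rows correspond to travel direction (one per trigger/encoder tick),
    columns to positions along the laser line."""
    return np.vstack([np.asarray(p, dtype=float) for p in profiles])

# Three contour slices of a hypothetical ramp-shaped object:
slices = [[0, 1, 2],
          [0, 2, 4],
          [0, 3, 6]]
height_map = build_height_map(slices)
print(height_map.shape)  # (3, 3)
```

Triggering each profile from an encoder pulse rather than a timer (as the "Profile Capturing" slide notes) keeps the row spacing uniform even when conveyor speed varies.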

3D Image Data

Intensity-based contrast

Height-based contrast

Height Resolution Calculation

Ordinary

Reverse ordinary

Specular

Look-away

Profile Capturing
Profiles can be captured based on time or encoder pulses

Inconsistent Speed

Slow Speed

Constant Speed

Scanning Methods
• Robot or other device moves camera, object is stationary • Object is moving on a conveyor, camera is stationary

Occlusion
• Camera Occlusion - the height of an object can block the laser creating areas of missing data • Laser Occlusion - the laser cannot ‘bend’ to see vertical surfaces

Performance Factors

3D Scanning Advantages

Application Example
• Application – Automated generation of order pallets for bulk and route delivery • Concept – Industrial robots on linear slides for pallet building – Laser guided vehicles for internal pallet transportation

Palletizing Application
• This concept includes a high-speed CCD camera and a laser line projector. • One image represents one profile line of the search area, where the grey value represents height information. • To acquire a whole image, a scan movement orthogonal to the laser line has to be performed. • In this 3D profile the different articles can be found. • Since this approach needs no article-specific features, there is no teach-in process required!

Palletizing Application

Camera and laser unit

Scan movement

Palletizing Application

Fridge Packs

Range Data

Vision Result

Palletizing Application

Vision result 20oz Crates Range Data
(Bottle groups can be separated, position of the trays can be found)

Palletizing Application

Vision result 2L Crates Grey Scale Picture
(Bottle groups can be separated, position of the trays can be found)

Palletizing Application

Random Bin Picking

Random bin picking of metal tubes using 3D triangulation camera to scan the bin

Belt Picking
• 3D triangulation camera is used to scan piles of scissor blades • The robot picks the blade on top

Contact Information
Brian Windsor
Business Development

SICK, Inc.
6900 West 110th St Minneapolis, Minnesota 55438 USA Telephone: 248-997-7618 email: brian.windsor@sick.com www.sickusa.com

Vision Options for “Dual Arm” Robot Guidance

Presented by:

Greg Garmann Motoman, Inc.

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Greg Garmann Software & Controls Technology Leader Motoman, Inc.

Greg Garmann Motoman, Inc. 1050 Dorset Road Troy, Ohio 45373 Phone: 937-440-2668 Fax: 937-440-2626 Email: greg.garmann@motoman.com
Computer Engineering degree from Wright State University, Dayton, Ohio. 21 years of experience in automation.

Vision Options for “Dual Arm” Robot Guidance
Greg Garmann
Software & Controls Technology Leader

Motoman Inc.

Robot Technology
• New developments in robot technology require new ways of working with vision systems. The “human-like” flexibility of movement of Motoman’s new dual-arm robots provides unique solutions for the automation world. This presentation will show options for handling vision opportunities with dual-arm robots.

Ultimate Flexibility

Gear Assembly

Air Conditioner Assembly

Air Conditioner Assembly

Spool Handling

Packing

Packing

Window Glass Sealing

Sunroof Assembly (Bolting)

Vision (Ageria) Solution

Vision (Ageria) Solution: Bolt Picking 1

Vision (Ageria) Solution: Bolt Picking 2

Parts Picking with Dual Arm

Contact Information
Greg Garmann
Software & Controls Technology Leader

Motoman Inc.
805 Liberty Lane West Carrollton, OH 45449 USA Telephone: 937-440-2668 email: greg.garmann@Motoman.com www.Motoman.com

Distance, Pitch & Yaw from a 2D Image

Presented by:

Steve Prehn FANUC Robotics America

International Conference for Vision Guided Robotics
September 30 – October 2, 2008

Steve Prehn Senior Product Manager - Vision FANUC Robotics America, Inc.

Steve Prehn FANUC Robotics America, Inc. 3900 W. Hamlin Road Rochester Hills, Michigan 48309 Phone: 248-276-4065 Email: steven.prehn@fanucrobotics.com
Steve Prehn has worked in the machine vision market for over 20 years. In addition to implementing over 200 vision systems, he has acted as product manager for VisionBlox software at Integral Vision, and for the CorrectPlace and CorrectPrint products at ESI. He is now product manager at FANUC, applying his knowledge of machine vision to extend the reach of iRVision into the material handling market. He has a Bachelor of Science in Electrical Engineering from DeVry Institute in Columbus, Ohio.

Distance, Pitch & Yaw from a 2D Image
Understanding the Dynamics of the Robotic / Vision Coordinate Interface

Steve Prehn Senior Product Manager - Vision FANUC Robotics America, Inc.

Image to Robot Relationship

In two-dimensional applications, the XY plane of the user frame specified here should be parallel to the target workpiece plane. How do you compensate when this is not the case?

Vision To Robot Transformations Considerations
• Camera mounting style
– Fixed position or Robot mounted camera

• Part Presentation issues
– In which axes is the part likely to move?
• X, Y, Rotation, Z, Pitch and Yaw

– Is the part consistent or is its presentation consistent? – Is it possible to correlate position from different perspectives? – Can structured light be used to help identify location?

2D Single Camera issue
Camera Image

• Height change creates a subtle apparent size change. • Are you sure the part size is not different – creating the same effect?

Distance Calculation from an Accurate Focal Length
• Knowns:
Calculate Height Known Width

• Calibrated Focal Length of Lens • Camera Array Size • If part size is known, calculate the distance of the part from the camera
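The distance calculation above is the standard pinhole relationship: the part's distance is the focal length times the ratio of real width to imaged width on the sensor. A minimal sketch (function name and example numbers are illustrative, not from the presentation):

```python
def distance_from_known_width(focal_mm, part_width_mm, imaged_px, pixel_pitch_mm):
    """Pinhole camera model: Z = f * W / w, where w is the width of the
    part's image on the sensor (pixels converted to mm via pixel pitch)."""
    sensor_width_mm = imaged_px * pixel_pitch_mm
    return focal_mm * part_width_mm / sensor_width_mm

# e.g. a 16 mm lens and a 100 mm part spanning 400 px on a 5 um-pitch sensor:
print(distance_from_known_width(16.0, 100.0, 400, 0.005))  # 800.0 (mm)
```

Note this is exactly why the previous slide's caution matters: a larger part at the nominal distance and the nominal part closer to the camera produce the same imaged width.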

Consistent part size
• Find parts at two known heights. 1) Calculate the scale change and correlate it to the height difference (delta-to-delta determines lens magnification). 2) The part size at this trained distance is then known.
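The two-height procedure can be sketched as follows. Since apparent size scales inversely with distance (w·Z is constant for a fixed part), observing the same part at two stand-offs separated by a known delta pins down both the absolute distance and the magnification constant. This is a hedged illustration of the geometry; the variable names and numbers are hypothetical.

```python
def calibrate_from_two_heights(w_far_px, w_near_px, height_delta):
    """Part imaged at two positions separated by height_delta
    (w_near_px is the apparent size at the nearer, larger-looking pose).
    From w_far * Z_far = w_near * (Z_far - height_delta):
      Z_far = w_near * height_delta / (w_near - w_far).
    Returns (Z_far, k) where k = w * Z, so later Z = k / w_px."""
    z_far = w_near_px * height_delta / (w_near_px - w_far_px)
    k = w_far_px * z_far
    return z_far, k

z_far, k = calibrate_from_two_heights(400, 500, 200)  # hypothetical px / mm
print(z_far, k / 500)  # 1000.0 800.0  -> near pose is indeed 200 mm closer
```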

Multi-plane Calibration

Find the Calibration Grid at Two Levels. The camera will be calibrated at every height between the levels

Stereo Triangulation Method
• Locate multiple points and calculate the Z offset from the two images • Height is found from the relationship between points within the two images • Clear edges and points will reduce confusion
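The "relationship between points" reduces, for rectified parallel-axis cameras, to the disparity equation Z = f·B / d: depth is focal length times baseline over the horizontal shift of the point between the two images. A minimal sketch (rectified geometry is an assumption, and the example numbers are made up):

```python
def depth_from_disparity(focal_px, baseline_mm, x_left_px, x_right_px):
    """Parallel-axis stereo triangulation: Z = f * B / disparity.
    focal_px is the focal length expressed in pixels; disparity is the
    horizontal offset of the same point between the two camera images."""
    disparity = x_left_px - x_right_px
    return focal_px * baseline_mm / disparity

# 1000 px focal length, 100 mm baseline, point at x=520 (left) / x=480 (right):
print(depth_from_disparity(1000.0, 100.0, 520.0, 480.0))  # 2500.0 (mm)
```

The slide's caution about clear edges follows directly: if the matched point differs slightly between the two images (as on round parts, noted below), the disparity, and hence the depth, is wrong.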

Camera 1 Image

Camera 2 Image

Stereo Triangulation Method
Camera 1 Image

Camera 2 Image • On round parts, transformations may not be applied to exactly the same point – creating the possibility of error.

A 2D Change of Perspective
Camera Image

• As part orientation changes in pitch and yaw, surface points converge or diverge.

Structured Light
2D Camera Image Projected Line from a Laser

* Images Courtesy of Sick - IVP

How Does 3DL Work?
Structured-light projector CCD Camera

Laser ON

Laser OFF

3D processing Detection of distance and pose by structured-light Composition

2D processing Detection of position and rotation of the object from 2D image

Yaw, Pitch and Z 3D data with position and orientation

X, Y and Roll

X, Y, Z, Yaw, Pitch and Roll

Image View:
Normal and Parallel

Image View:
Z Movement

Image View:
Extreme Yaw

Image View:
Extreme Pitch

Image View:
Normal and Parallel

3DL Vision Results

3DL Vision Results

Contact Information
Steve Prehn
Senior Product Manager - Vision

FANUC Robotics America, Inc.
3900 W. Hamlin Road Rochester Hills, Michigan 48309 USA Telephone: 248-276-4065 email: steven.prehn@fanucrobotics.com www.fanucrobotics.com

VGR Panel Discussion
