
Licence: LGPL
Toolbox home page: http://www.petercorke.com/vision
Discussion group: http://groups.google.com.au/group/robotics-tool-box

Copyright © 2011 Peter Corke
peter.i.corke@gmail.com
http://www.petercorke.com
September 2011

Preface

This, the third release of the Toolbox, represents a decade of development. The last release was in 2005 and this version captures a large number of changes over that period, but with extensive work over the last two years to support my new book "Robotics, Vision & Control".

Peter Corke
Robotics, Vision and Control

The practice of robotics and computer vision each involve the application of computational algorithms to data. The research community has developed a very large body of algorithms but for a newcomer to the field this can be quite daunting. For more than 10 years the author has maintained two open-source MATLAB® Toolboxes, one for robotics and one for vision. They provide implementations of many important algorithms and allow users to work with real problems, not just trivial examples. This new book makes the fundamental algorithms of robotics, vision and control accessible to all. It weaves together theory, algorithms and examples in a narrative that covers robotics and computer vision separately and together. Using the latest versions of the Toolboxes the author shows how complex problems can be decomposed and solved using just a few simple lines of code. The topics covered are guided by real problems observed by the author over many years as a practitioner of both robotics and computer vision. It is written in a light but informative style, is easy to read and absorb, and includes over 1000 MATLAB® and Simulink® examples and figures. The book is a real walk through the fundamentals of mobile robots, navigation, localization, arm-robot kinematics, dynamics and joint-level control, then camera models, image processing, feature extraction and multi-view geometry, and finally bringing it all together with an extensive discussion of visual servo systems.

ISBN 978-3-642-20143-1
springer.com

The Machine Vision Toolbox (MVTB) provides many functions that are useful in machine vision and vision-based control. It is a somewhat eclectic collection reflecting my personal interest in areas of photometry, photogrammetry and colorimetry. It includes over 100 functions spanning operations such as image file reading and writing, acquisition, display, filtering, blob, point and line feature extraction, mathematical morphology, homographies, visual Jacobians, camera calibration and color space conversion. The Toolbox, combined with MATLAB® and a modern workstation computer, is a useful and convenient environment for investigation of machine vision algorithms. For modest image sizes the processing rate can be sufficiently "real-time" to allow for closed-loop control. Focus-of-attention methods such as dynamic windowing (not provided) can be used to increase the processing rate. With input from a firewire or web camera (support provided) and output to a robot (not provided) it would be possible to implement a visual servo system entirely in MATLAB®.

An image is usually treated as a rectangular array of scalar values representing intensity or perhaps range. The matrix is the natural datatype for MATLAB® and thus makes the manipulation of images easily expressible in terms of arithmetic statements in the MATLAB® language. Many image operations such as thresholding, filtering and statistics can be achieved with existing MATLAB® functions. The Toolbox extends this core functionality with M-files that implement functions and classes, and MEX-files for some compute-intensive operations. It is possible to use MEX-files to interface with image acquisition hardware ranging from simple framegrabbers to robots. Examples for firewire cameras under Linux are provided.

The routines are written in a straightforward manner which allows for easy understanding.
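This matrix-style manipulation can be sketched as follows. The image file name is a placeholder; iread, ismooth and idisp are Toolbox functions documented later in this manual, and the parameter values are illustrative only:

```matlab
% Read an image as a matrix of doubles (hypothetical file name)
im = iread('scene.png', 'double', 'grey');

% Plain MATLAB matrix operations work directly on the image
bright = im > 0.7;           % threshold: logical image of bright pixels
avg = mean(im(:));           % mean intensity over the whole image

% Toolbox functions extend this core functionality
smoothed = ismooth(im, 2);   % Gaussian smoothing with sigma = 2
idisp(smoothed);             % display the result
```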
MATLAB® vectorization has been used as much as possible to improve efficiency, however some algorithms are not amenable to vectorization. If you have the MATLAB® compiler available then this can be used to compile bottleneck functions. Some particularly compute-intensive functions are provided as MEX-files and may need to be compiled for the particular platform.

This toolbox considers images generally as arrays of double precision numbers. This is extravagant on storage, though this is much less significant today than it was in the past.

This toolbox is not a clone of The MathWorks' own Image Processing Toolbox (IPT), although there are many functions in common. This toolbox predates IPT by many years, is open-source, and contains many functions that are useful for image feature extraction and control. It was developed under Unix and Linux systems and some functions rely on tools and utilities that exist only in that environment.

The manual is now auto-generated from the comments in the MATLAB® code itself, which reduces the effort of maintaining the code and a separate manual as I used to; the downside is that there are no worked examples or figures in the manual. However the book "Robotics, Vision & Control" provides a detailed discussion (over 600 pages, nearly 400 figures and 1000 code examples) of how to use the Toolbox functions to solve many types of problems in robotics, and I commend it to you.

Machine Vision Toolbox for MATLAB®        Copyright © Peter Corke 2011
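The double-precision convention can be sketched with the Toolbox's conversion functions (the image file name is a placeholder; idouble and iint are documented later in this manual):

```matlab
im8 = iread('scene.png');  % hypothetical file; returns the image as stored,
                           % typically uint8 with values 0..255
imd = idouble(im8);        % double image, values spanning [0, 1]
im8b = iint(imd);          % back to uint8, values spanning [0, 255]
```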

Contents

1 Introduction
   1.1 What's new
   1.2 Support
   1.3 How to obtain the Toolbox
   1.4 MATLAB version issues
   1.5 Use in teaching
   1.6 Use in research
       1.6.1 Other toolboxes
   1.7 Acknowledgements

2 Functions and classes
   Classes: AxisWebCamera, BagOfWords, Camera, CatadioptricCamera, CentralCamera, FeatureMatch, FishEyeCamera, Hough, LineFeature, Movie, PGraph, PointFeature, Polygon, Ray3D, RegionFeature, ScalePointFeature, SiftPointFeature, SphericalCamera, SurfPointFeature, Tracker, Video
   Functions: an alphabetical reference, from about and anaglyph through to zssd

Chapter 1

Introduction

1.1 What's new

Changes:

• New features:
• Bugfixes:
  – Improved error messages in many functions
  – Removed trailing commas from if and for statements

1.2 Support

There is no support! This software is made freely available in the hope that you find it useful in solving whatever problems you have to hand. I am happy to correspond with people who have found genuine bugs or deficiencies, but my response time can be long and I can't guarantee that I respond to your email. I can guarantee that I will not respond to any requests for help with assignments or homework, no matter how urgent or important they might be to you. That's what your teachers, tutors, lecturers and professors are paid to do.

You might instead like to communicate with other users via the Google Group called "Robotics Toolbox" http://groups.google.com.au/group/robotics-tool-box which is a forum for discussion. You need to sign up in order to post, and the signup process is moderated by me, so allow a few days for this to happen. I need you to write a few words about why you want to join the list so I can distinguish you from a spammer or a web-bot.

I am very happy to accept contributions for inclusion in future versions of the toolbox, and you will be suitably acknowledged.

1.3 How to obtain the Toolbox

The Machine Vision Toolbox is freely available from the Toolbox home page at

http://www.petercorke.com

The files are available in either gzipped tar format (.gz) or zip format (.zip). The web page requests some information from you such as your country, type of organization and application. This is just a means for me to gauge interest and to help convince my bosses (and myself) that this is a worthwhile activity.

The file vision.pdf is a manual that describes all functions in the Toolbox. It is auto-generated from the comments in the MATLAB® code and is fully hyperlinked: to external web sites, the table of contents to functions, and the "See also" functions to each other. A menu-driven demonstration can be invoked by the function rtdemo.

1.4 MATLAB version issues

The Toolbox has been tested under R2011a.

1.5 Use in teaching

This is definitely encouraged! You are free to put the PDF manual (vision.pdf) or the web-based documentation html/*.html on a server for class use. If you plan to distribute paper copies of the PDF manual then every copy must include the first two pages (cover and licence).

1.6 Use in research

If the Toolbox helps you in your endeavours then I'd appreciate you citing the Toolbox when you publish. The details are

@article{Corke05f,
  Author = {P. I. Corke},
  Journal = {IEEE Robotics and Automation Magazine},
  Month = nov,
  Number = {4},
  Pages = {16-25},
  Title = {Machine Vision Toolbox},
  Volume = {12},
  Year = {2005}
}

or "Machine Vision Toolbox", P. I. Corke, IEEE Robotics and Automation Magazine, 12(4), pp 16–25, November 2005, which is also given in electronic form in the CITATION file.

1.6.1 Other toolboxes

Matlab Central http://www.mathworks.com/matlabcentral is a great resource for user-contributed MATLAB code, and there are hundreds of modules available. VLFeat http://www.vlfeat.org is a great collection of advanced computer vision algorithms for MATLAB.

1.7 Acknowledgements

Last, but not least, this release includes functions for computing image plane homographies and the fundamental matrix, contributed by Nuno Alexandre Cid Martins of I.S.R., Coimbra. Functions such as SURF, MSER, graph-based segmentation and pose estimation are based on great code: RANSAC code by Peter Kovesi; pose estimation by Francesco Moreno-Noguer, Vincent Lepetit and Pascal Fua at the CVLab-EPFL; color space conversions by Pascal Getreuer; numerical routines for geometric vision by various members of the Visual Geometry Group at Oxford (from the web site of the Hartley and Zisserman book); the k-means and MSER algorithms by Andrea Vedaldi and Brian Fulkerson; the graph-based image segmentation software by Pedro Felzenszwalb; and the SURF feature detector by Dirk-Jan Kroon at U. Twente. The Camera Calibration Toolbox by Jean-Yves Bouguet is used unmodified. Some of the MEX files use some really neat macros that were part of the package VISTA, Copyright 1993, 1994 University of British Columbia. See the file CONTRIB for details.

Chapter 2

Functions and classes

Camera

Camera superclass

An abstract superclass for Toolbox camera classes.

Methods

plot           plot projection of world point to image plane
hold           control figure hold for image plane window
ishold         test figure hold for image plane
clf            clear image plane
figure         figure holding the image plane
mesh           draw shape represented as a mesh
point          draw homogeneous points on image plane
line           draw homogeneous lines on image plane
plot_camera    draw camera in world view
rpy            set camera attitude
move           clone Camera after motion
centre         get world coordinate of camera centre
delete         object destructor
char           convert camera parameters to string
display        display camera parameters

Properties (read/write)

npix    image dimensions (2 × 1)
pp      principal point (2 × 1)
rho     pixel dimensions (2 × 1) in metres
T       camera pose as homogeneous transformation

Properties (read only)

nu    number of pixels in u-direction
nv    number of pixels in v-direction
u0    principal point u-coordinate
v0    principal point v-coordinate

Notes

• Camera is a reference object.
• Camera objects can be used in vectors and arrays.
• This is an abstract class and must be subclassed and a project() method defined.

Camera.Camera

Create camera object

C = Camera(options) creates a default (abstract) camera with null parameters. Constructor for the abstract Camera class, used by all subclasses.

Options

'name', N          Name of camera
'image', IM        Load image IM to image plane
'resolution', N    Image plane resolution: N × N or N=[W H]
'sensor', S        Image sensor size (2 × 1) in metres
'centre', P        Principal point (2 × 1)
'pixel', S         Pixel size: S × S or S=[W H]
'noise', SIGMA     Standard deviation of additive Gaussian noise added to returned image projections
'pose', T          Pose of the camera as a homogeneous transformation
'color', C         Color of image plane background (default [1 1 0.8])

Notes

• Normally the class plots points and lines into a set of axes that represent the image plane. The 'image' option paints the specified image onto the image plane and allows points and lines to be overlaid.
• The object can create a window to display the Camera image plane; this window is protected and can only be accessed by the plot methods of this object.
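Since Camera itself is abstract, these options are passed through a concrete subclass constructor. A hypothetical sketch, in which the name and all parameter values are placeholders:

```matlab
% Illustrative only: a central-perspective camera built with
% Camera superclass options
cam = CentralCamera('name', 'cam1', ...
                    'resolution', [640 480], ...   % 640 x 480 pixels
                    'pixel', 10e-6, ...            % 10 um square pixels
                    'pose', transl(0, 0, -1));     % 1 m behind the origin
```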

See also

CentralCamera, fisheyecamera, CatadioptricCamera, SphericalCamera

Camera.centre

Get camera position

p = C.centre() is the 3-dimensional position of the camera centre (3 × 1).

Camera.char

Convert to string

s = C.char() is a compact string representation of the camera parameters.

Camera.clf

Clear the image plane

C.clf() removes all graphics from the camera's image plane.

Camera.delete

Camera object destructor

C.delete() destroys all figures associated with the Camera object and removes the object.

Camera.display

Display value

C.display() displays a compact human-readable representation of the camera parameters.

Notes
• This method is invoked implicitly at the command line when the result of an expression is a Camera object and the command has no trailing semicolon.

See also Camera.char

Camera.figure
Return figure handle
H = C.figure() is the handle of the figure that contains the camera's image plane graphics.

Camera.hold
Control hold on image plane graphics
C.hold() sets "hold on" for the camera's image plane.
C.hold(H) hold mode is set on if H is true (or > 0), and off if H is false (or 0).

Camera.ishold
Return image plane hold status
H = C.ishold() returns true (1) if the camera's image plane is in hold mode, otherwise false (0).

Camera.line
Plot homogeneous lines on image plane
C.line(L) plots lines on the camera image plane which are defined by columns of L (3 x N) considered as lines in homogeneous form: a.u + b.v + c = 0.

Camera.mesh
Plot mesh object on image plane
C.mesh(x, y, z, options) projects a 3D shape defined by the matrices x, y, z to the image plane and plots them. The matrices x, y, z are of the same size and the corresponding elements of the matrices define 3D points.

Options
'Tobj', T  Transform all points by the homogeneous transformation T before projecting them to the camera image plane.
'Tcam', T  Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane. Temporarily overrides the current camera pose C.T.

See also mesh, cylinder, sphere, mkcube, Camera.plot, Camera.hold, Camera.clf

Camera.move
Instantiate displaced camera
C2 = C.move(T) is a new camera object that is a clone of C but its pose is displaced by the homogeneous transformation T with respect to the current pose of C.

Camera.plot
Plot points on image plane
C.plot(p, options) projects world points p (3 x N) to the image plane and plots them. If p is 2 x N the points are assumed to be image plane coordinates and are plotted directly. If p has 3 dimensions (3 x N x S) then it is considered a sequence of point sets and is displayed as an animation.
uv = C.plot(p) as above but returns the image plane coordinates uv (2 x N).

Options
'Tobj', T       Transform all points by the homogeneous transformation T before projecting them to the camera image plane.
'Tcam', T       Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane. Temporarily overrides the current camera pose C.T.
'fps', N        Number of frames per second for point sequence display
'sequence'      Annotate the points with their index
'textcolor', C  Text color for annotation (default black)
'textsize', S   Text size for annotation (default 12)
'drawnow'       Execute MATLAB drawnow function

Additional options are considered MATLAB linestyle parameters and are passed directly to plot.

See also Camera.mesh, Camera.hold, Camera.clf

Camera.plot_camera
Display camera icon in world view
C.plot_camera(options) draw a camera as a simple 3D model in the current figure.

Options
'Tcam', T   Camera displayed in pose T (homogeneous transformation 4 x 4)
'scale', S  Overall scale factor (default 0.2 x maximum axis dimension)

Notes
• The graphic handles are stored within the Camera object.

Camera.point
Plot homogeneous points on image plane
C.point(p) plots points on the camera image plane which are defined by columns of p (3 x N) considered as points in homogeneous form.
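As a sketch of the plotting workflow described above (assuming the Toolbox is on the MATLAB path; mkcube is the Toolbox shape helper and transl is the companion Robotics Toolbox translation helper):

```matlab
% Project and plot world points with a concrete Camera subclass.
cam = CentralCamera('default');        % 1024 x 1024 camera, f = 8 mm
P = mkcube(0.2);                       % 3 x N vertices of a 0.2 m cube
cam.plot(P, 'Tobj', transl(0, 0, 1));  % cube centred 1 m along the optical axis
cam.hold();                            % retain the existing plot
cam.plot(P, 'Tobj', transl(0, 0, 2), 'r*');  % same cube, further away, red stars
```

The trailing 'r*' illustrates how unrecognized options fall through as MATLAB linestyle parameters.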

Camera.rpy
Set camera attitude
C.rpy(R, p, y) sets the camera attitude to the specified roll-pitch-yaw angles.
C.rpy(rpy) as above but rpy=[R, p, y].

CentralCamera
Perspective camera class
A concrete class for a central-projection perspective camera, a subclass of Camera.

The camera coordinate system is:

    0------------> u (X)
    |
    |   + (principal point)
    |
    v            Z-axis is into the page.
  v (Y)

This camera model assumes central projection, that is, the focal point is at z=0 and the image plane is at z=f. The image is not inverted.

Methods
project         project world points
K               camera intrinsic matrix
C               camera matrix
H               camera motion to homography
invH            decompose homography
F               camera motion to fundamental matrix
E               camera motion to essential matrix
invE            decompose essential matrix
fov             field of view
ray             Ray3D corresponding to point
plot            plot projection of world point on image plane
hold            control hold for image plane
ishold          test figure hold for image plane
clf             clear image plane
figure          figure holding the image plane
mesh            draw shape represented as a mesh
point           draw homogeneous points on image plane
line            draw homogeneous lines on image plane
plot_camera     draw camera in world view
plot_line_tr    draw line in theta/rho format
plot_epiline    draw epipolar line
flowfield       compute optical flow
visjac_p        image Jacobian for point features
visjac_p_polar  image Jacobian for point features in polar coordinates
visjac_l        image Jacobian for line features
visjac_e        image Jacobian for ellipse features
rpy             set camera attitude
move            clone Camera after motion
centre          get world coordinate of camera centre
estpose         estimate pose
delete          object destructor
char            convert camera parameters to string
display         display camera parameters

Properties (read/write)
npix        image dimensions in pixels (2 x 1)
pp          intrinsic: principal point (2 x 1)
rho         intrinsic: pixel dimensions (2 x 1) in metres
f           intrinsic: focal length
k           intrinsic: radial distortion vector
p           intrinsic: tangential distortion parameters
distortion  intrinsic: camera distortion [k1 k2 k3 p1 p2]
T           extrinsic: camera pose as homogeneous transformation

Properties (read only)
nu  number of pixels in u-direction
nv  number of pixels in v-direction
u0  principal point u-coordinate
v0  principal point v-coordinate

Notes
• Camera is a reference object.
• Camera objects can be used in vectors and arrays.

See also Camera

CentralCamera.CentralCamera
Create central projection camera object
C = CentralCamera() creates a central projection camera with canonic parameters: f=1 and name='canonic'.
C = CentralCamera(options) as above but with specified parameters.

CentralCamera.C
Camera matrix
C = C.C() is the 3 x 4 camera matrix, also known as the camera calibration or projection matrix.

Options
'name', N                Name of camera
'focal', F               Focal length [metres]
'distortion', D          Distortion vector [k1 k2 k3 p1 p2]
'distortion-bouguet', D  Distortion vector [k1 k2 p1 p2 k3]
'default'                Default camera parameters: 1024 x 1024, f=8mm, 10um pixels, camera at origin, optical axis is z-axis, u- and v-axes parallel to x- and y-axes respectively
'image', IM              Display an image rather than points
'resolution', N          Image plane resolution: N x N or N=[W H]
'sensor', S              Image sensor size in metres (2 x 1)
'centre', P              Principal point (2 x 1)
'pixel', S               Pixel size: S x S or S=[W H]
'noise', SIGMA           Standard deviation of additive Gaussian noise added to returned image projections
'pose', T                Pose of the camera as a homogeneous transformation
'color', C               Color of image plane background (default [1 1 0.8])

See also Camera, FishEyeCamera, CatadioptricCamera, SphericalCamera

CentralCamera.E
Essential matrix
E = C.E(T) is the essential matrix relating two camera views. The first view is from the current camera pose C.T and the second is a relative motion represented by the homogeneous transformation T.
E = C.E(C2) is the essential matrix relating two camera views described by camera objects C (first view) and C2 (second view).
E = C.E(F) is the essential matrix based on the fundamental matrix F (3 x 3) and the intrinsic parameters of camera C.

Reference
Y. Ma, J. Kosecka, S. Soatto, S. Sastry, "An invitation to 3D", Springer, 2003, p. 177.

See also CentralCamera.F, CentralCamera.invE
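A short sketch of constructing a parameterized camera from the options above and computing the essential matrix for a known relative motion (transl and trotx are homogeneous-transform helpers from the companion Robotics Toolbox):

```matlab
% Camera with explicit intrinsics, built from the constructor options.
cam = CentralCamera('focal', 0.008, 'pixel', 10e-6, ...
        'resolution', [1024 1024], 'centre', [512 512], 'name', 'mycam');
T = transl(0.1, 0, 0) * trotx(0.1);  % relative motion between the two views
E = cam.E(T);                        % 3 x 3 essential matrix for that motion
```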

E CentralCamera. The ﬁrst view is from the current camera pose C.F Fundamental matrix F = C.Kosecka.H Homography matrix H = C. from two viewpoints. Springer. d) is a 3 × 3 homography matrix for the camera observing the plane with normal n and at distance d. FUNCTIONS AND CLASSES CentralCamera. S. Reference Y.177 See also CentralCamera.T and the second is a relative motion represented by the homogeneous transformation T. p.F(C2) is the fundamental matrix relating two camera views described by camera objects C (ﬁrst view) and C2 (second view). F = C. See also CentralCamera. The ﬁrst view is from the current camera pose C.T and the second is after a relative motion represented by the homogeneous transformation T.Sastry.H(T. “An invitation to 3D”. n.F(T) is the fundamental matrix relating two camera views.K() is the 3 × 3 intrinsic parameter matrix.Soatto. S. J. 2003.Ma.CHAPTER 2.H CentralCamera. Machine Vision Toolbox for MATLAB R 23 Copyright c Peter Corke 2011 .K Intrinsic parameter matrix K = C.

CentralCamera.estpose
Estimate pose from object model and camera view
T = C.estpose(xyz, uv) is an estimate of the pose of the object defined by coordinates xyz (3 x N) in its own coordinate frame. uv (2 x N) are the corresponding image plane coordinates.

Reference
"EPnP: An accurate O(n) solution to the PnP problem", V. Lepetit, F. Moreno-Noguer, and P. Fua, Int. Journal on Computer Vision, vol. 81, pp. 155-166, Feb. 2009.

CentralCamera.flowfield
Optical flow
C.flowfield(v) displays the optical flow pattern for a sparse grid of points when the camera has a spatial velocity v (6 x 1).

See also quiver

CentralCamera.fov
Camera field-of-view angles
a = C.fov() are the field of view angles (2 x 1) in radians for the camera x and y (horizontal and vertical) directions.

CentralCamera.invE
Decompose essential matrix
s = C.invE(E) decomposes the essential matrix E (3 x 3) into the camera motion. In practice there are multiple solutions and s (4 x 4 x N) is a set of homogeneous transformations representing possible camera motion.

s = C.invE(E, p) as above but only solutions in which the world point p is visible are returned.

Reference
Hartley & Zisserman, "Multiview Geometry", Chap 9, p. 259.
Y. Ma, J. Kosecka, S. Soatto, S. Sastry, "An invitation to 3D", Springer, 2003, p. 116, pp. 120-122.

Notes
• The transformation is from view 1 to view 2.

See also CentralCamera.E

CentralCamera.invH
Decompose homography matrix
s = C.invH(H) decomposes the homography H (3 x 3) into the camera motion and the normal to the plane. In practice there are multiple solutions and s is a vector of structures with elements:
• T, camera motion as a homogeneous transform matrix (4 x 4), translation not to scale
• n, normal vector to the plane (3 x 1)

Notes
• There are up to 4 solutions.
• Only those solutions that obey the positive depth constraint are returned.
• The required camera intrinsics are taken from the camera object.
• The transformation is from view 1 to view 2.

Reference
Y. Ma, J. Kosecka, S. Soatto, S. Sastry, "An invitation to 3D", Springer, 2003, section 5.3.
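A round-trip sketch of the decomposition described above, assuming a default camera and a known motion (transl is a Robotics Toolbox helper):

```matlab
% Build an essential matrix from a known motion, then decompose it.
cam = CentralCamera('default');
T = transl(0.2, 0, 0);   % true motion from view 1 to view 2
E = cam.E(T);            % essential matrix for that motion
s = cam.invE(E);         % 4 x 4 x N candidate motions (translation up to scale)
```

One of the returned candidate transforms should agree with T up to the unrecoverable translation scale.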

CentralCamera.plot_epiline
Plot epipolar line
C.plot_epiline(f, p) plots the epipolar lines due to the fundamental matrix f and the image points p.
C.plot_epiline(f, p, ls) as above but draw lines using the line style arguments ls.
H = C.plot_epiline(f, p) as above but return a vector of graphic handles, one per line.

See also CentralCamera.H

CentralCamera.plot_line_tr
Plot line in theta-rho format
CentralCamera.plot_line_tr(L) plots lines on the camera's image plane that are described by columns of L with rows theta and rho respectively.

See also Hough

CentralCamera.project
Project world points to image plane
uv = C.project(p, options) are the image plane coordinates (2 x N) corresponding to the world points p (3 x N).
If Tcam (4 x 4 x S) is a transform sequence then uv (2 x N x S) represents the sequence of projected points as the camera moves in the world.

Options
'Tobj', T  Transform all points by the homogeneous transformation T before projecting them to the camera image plane.
'Tcam', T  Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane. Temporarily overrides the current camera pose C.T.

If Tobj (4 x 4 x S) is a transform sequence then uv (2 x N x S) represents the sequence of projected points as the object moves in the world.

See also Camera.plot

CentralCamera.ray
3D ray for image point
R = C.ray(p) returns a vector of Ray3D objects, one for each point defined by the columns of p.

Reference
Hartley & Zisserman, "Multiview Geometry", p. 162.

See also Ray3D

CentralCamera.visjac_e
Visual motion Jacobian for ellipse feature
J = C.visjac_e(E, pl) is the image Jacobian (5 x 6) for the ellipse E (5 x 1) described by u^2 + E1 v^2 - 2 E2 uv + 2 E3 u + 2 E4 v + E5 = 0. The ellipse lies in the world plane pl = (a, b, c, d) such that aX + bY + cZ + d = 0. The Jacobian gives the rates of change of the ellipse parameters in terms of camera spatial velocity.

Reference
B. Espiau, F. Chaumette, and P. Rives, "A New Approach to Visual Servoing in Robotics", IEEE Transactions on Robotics and Automation, vol. 8, pp. 313-326, June 1992.

See also CentralCamera.visjac_p, CentralCamera.visjac_p_polar, CentralCamera.visjac_l

CentralCamera.visjac_l
Visual motion Jacobian for line feature
J = C.visjac_l(L, pl) is the image Jacobian (2N x 6) for the image plane lines L (2 x N). Each column of L is a line in theta-rho format, and the rows are theta and rho respectively. The lines all lie in the plane pl = (a, b, c, d) such that aX + bY + cZ + d = 0. The Jacobian gives the rates of change of the line parameters in terms of camera spatial velocity.

Reference
B. Espiau, F. Chaumette, and P. Rives, "A New Approach to Visual Servoing in Robotics", IEEE Transactions on Robotics and Automation, vol. 8, pp. 313-326, June 1992.

See also CentralCamera.visjac_p, CentralCamera.visjac_p_polar, CentralCamera.visjac_e

CentralCamera.visjac_p
Visual motion Jacobian for point feature
J = C.visjac_p(uv, z) is the image Jacobian (2N x 6) for the image plane points uv (2 x N). The depth of the points from the camera is given by z which is a scalar for all points, or a vector (N x 1) of depth for each point. The Jacobian gives the image-plane point velocity in terms of camera spatial velocity.

Reference
"A tutorial on Visual Servo Control", Hutchinson, Hager & Corke, IEEE Trans. R&A, Vol 12(5), pp. 651-670, Oct. 1996.
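A minimal sketch of the point-feature Jacobian in use, mapping a camera spatial velocity to image-plane velocity for a single point:

```matlab
% Image Jacobian for one point at a known depth.
cam = CentralCamera('default');
uv = cam.project([0.2; 0.2; 2]);   % image coordinates of a point 2 m away
J = cam.visjac_p(uv, 2);           % 2 x 6 image Jacobian at depth 2 m
v = [0.1 0 0 0 0 0]';              % camera translating in x at 0.1 m/s
uvdot = J * v;                     % resulting image-plane velocity (pixels/s)
```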

See also CentralCamera.visjac_p_polar, CentralCamera.visjac_l, CentralCamera.visjac_e

CentralCamera.visjac_p_polar
Visual motion Jacobian for point feature
J = C.visjac_p_polar(rt, z) is the image Jacobian (2N x 6) for the image plane points rt (2 x N) described in polar form, radius and theta. The depth of the points from the camera is given by z which is a scalar for all points, or a vector (N x 1) of depths for each point. The Jacobian gives the image-plane polar point coordinate velocity in terms of camera spatial velocity.

Reference
"Combining Cartesian and polar coordinates in IBVS", P. Corke, F. Spindler, and F. Chaumette, in Proc. Int. Conf on Intelligent Robots and Systems (IROS), (St. Louis), pp. 5962-5967, Oct. 2009.

See also CentralCamera.visjac_p, CentralCamera.visjac_l, CentralCamera.visjac_e

SiftPointFeature
SIFT point corner feature object
A subclass of PointFeature for SIFT features.

Methods
plot        Plot feature position
plot_scale  Plot feature scale
distance    Descriptor distance
match       Match features
ncc         Descriptor similarity
uv          Return feature coordinate
display     Display value
char        Convert value to string

Properties
u           horizontal coordinate
v           vertical coordinate
strength    feature strength
theta       feature orientation [rad]
scale       feature scale
descriptor  feature descriptor (vector)
image_id    index of image containing feature

Properties of a vector of SiftPointFeature objects are returned as a vector. If F is a vector (N x 1) of SiftPointFeature objects then F.u is a 2 x N matrix with each column the corresponding u coordinate.

Notes
• SiftPointFeature is a reference object.
• SiftPointFeature objects can be used in vectors and arrays.
• The SIFT algorithm is patented and not distributed with this toolbox. You can download a SIFT implementation which this class can utilize; see README.SIFT.

References
"Distinctive image features from scale-invariant keypoints", D. Lowe, Int. Journal on Computer Vision, vol. 60, pp. 91-110, Nov. 2004.

See also isift, PointFeature, ScalePointFeature, SurfPointFeature

SiftPointFeature.SiftPointFeature
Create a SIFT point feature object
f = SiftPointFeature() is a point feature object with null parameters.
f = SiftPointFeature(u, v) is a point feature object with specified coordinates.
f = SiftPointFeature(u, v, strength) as above but with specified strength.

See also isift

SiftPointFeature.match
Match SIFT point features
m = F.match(f2, options) is a vector of FeatureMatch objects that describe candidate matches between the two vectors of SIFT features F and f2. Correspondence is based on descriptor similarity.

SiftPointFeature.plot_scale
Plot feature scale
F.plot_scale(options) overlay a marker to indicate feature point position and scale. If F is a vector then each element is plotted.
F.plot_scale(options, ls) as above but the optional line style arguments ls are passed to plot.

Options
'circle'    Indicate scale by a circle (default)
'clock'     Indicate scale by circle with one radial line for orientation
'arrow'     Indicate scale and orientation by an arrow
'disk'      Indicate scale by a translucent disk
'color', C  Color of circle or disk (default green)
'alpha', A  Transparency of disk, 1=opaque, 0=transparent (default 0.2)

SiftPointFeature.support
Support region of feature
out = F.support(im, w) is an image of the support region of the feature F, extracted from the image im in which the feature appears. The support region is scaled to w x w and rotated so that the feature's orientation axis is upward.
out = F.support(images, w) as above but if the features were extracted from an image sequence images then the feature is extracted from the appropriate image in the same sequence.
[out, T] = F.support(images, w) as above but returns the pose of the feature as a 3 x 3 homogeneous transform in SE(2) that comprises the feature position and orientation.
F.support(im, w) as above but the support region is displayed.

See also SiftPointFeature

SphericalCamera
Spherical camera class
A concrete class for a spherical-projection camera.

Methods
project      project world points
plot         plot/return world point on image plane
hold         control hold for image plane
ishold       test figure hold for image plane
clf          clear image plane
figure       figure holding the image plane
mesh         draw shape represented as a mesh
point        draw homogeneous points on image plane
line         draw homogeneous lines on image plane
plot_camera  draw camera
rpy          set camera attitude
move         copy of Camera after motion
centre       get world coordinate of camera centre
delete       object destructor
char         convert camera parameters to string
display      display camera parameters

Properties (read/write)
npix  image dimensions in pixels (2 x 1)
pp    intrinsic: principal point (2 x 1)
rho   intrinsic: pixel dimensions (2 x 1) in metres
T     extrinsic: camera pose as homogeneous transformation

Properties (read only)
nu  number of pixels in u-direction
nv  number of pixels in v-direction

Notes
• SphericalCamera is a reference object.
• SphericalCamera objects can be used in vectors and arrays.

See also Camera

SphericalCamera.SphericalCamera
Create spherical projection camera object
C = SphericalCamera() creates a spherical projection camera with canonic parameters: f=1 and name='canonic'.
C = SphericalCamera(options) as above but with specified parameters.

Options
'name', N   Name of camera
'pixel', S  Pixel size: S x S or S=[W H]
'pose', T   Pose of the camera as a homogeneous transformation

See also Camera, CentralCamera, FishEyeCamera, CatadioptricCamera

SphericalCamera.project
Project world points to image plane
pt = C.project(p, options) are the image plane coordinates for the world points p. The columns of p (3 x N) are the world points and the columns of pt (2 x N) are the corresponding spherical projection points; each column is phi (longitude) and theta (colatitude).

Options
'Tobj', T  Transform all points by the homogeneous transformation T before projecting them to the camera image plane.
'Tcam', T  Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane. Overrides the current camera pose C.T.

See also SphericalCamera.plot
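A minimal sketch of spherical projection as described above (transl is a Robotics Toolbox helper):

```matlab
% Spherical projection of a world point.
cam = SphericalCamera('name', 'sph');
P = [0.3; 0.4; 2];          % a world point
pt = cam.project(P);        % [phi; theta], longitude and colatitude
pt2 = cam.project(P, 'Tcam', transl(0, 0, -1));  % same point, displaced camera
```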

SphericalCamera.sph
Implement spherical IBVS for point features
results = sph(T)
results = sph(T, params)

Simulate spherical IBVS for a square target comprising 4 points placed in the world XY plane. The camera/robot is initially at pose T and is driven to the origin.

Two windows are shown and animated:
1. The camera view, showing the desired view (*) and the current view (o).
2. The external view, showing the target points and the camera.

The results structure contains time-history information about the image plane, camera pose, error, Jacobian condition number, error norm, image plane size and desired feature locations.

The params structure can be used to override simulation defaults by providing elements, defaults in parentheses:
target_size    the side length of the target in world units (0.5)
target_center  center of the target in world coords (0,0,2)
niter          the number of iterations to run the simulation (500)
eterm          a stopping criteria on feature error norm (0)
lambda         gain, can be scalar or diagonal 6 x 6 matrix (0.01)
ci             camera intrinsic structure (camparam)
depth          depth of points to use for Jacobian, scalar for all points; if null take actual value from simulation ([])

SEE ALSO: ibvsplot

SphericalCamera.sph2
Implement spherical IBVS for point features
results = sph2(T)
results = sph2(T, params)

Simulate spherical IBVS for a square target comprising 4 points placed in the world XY plane. The camera/robot is initially at pose T and is driven to the origin.

Two windows are shown and animated:
1. The camera view, showing the desired view (*) and the current view (o).

2. The external view, showing the target points and the camera.

The results structure contains time-history information about the image plane, camera pose, error, Jacobian condition number, error norm, image plane size and desired feature locations.

The params structure can be used to override simulation defaults by providing elements, defaults in parentheses:
target_size    the side length of the target in world units (0.5)
target_center  center of the target in world coords (0,0,3)
niter          the number of iterations to run the simulation (500)
eterm          a stopping criteria on feature error norm (0)
lambda         gain, can be scalar or diagonal 6 x 6 matrix (0.01)
ci             camera intrinsic structure (camparam)
depth          depth of points to use for Jacobian, scalar for all points; if null take actual value from simulation ([])

SEE ALSO: ibvsplot

SphericalCamera.visjac_p
Visual motion Jacobian for point feature
J = C.visjac_p(pt, z) is the image Jacobian (2N x 6) for the image plane points pt (2 x N) described by phi (longitude) and theta (colatitude). The depth of the points from the camera is given by z which is a scalar for all points, or a vector (N x 1) for each point. The Jacobian gives the image-plane velocity in terms of camera spatial velocity.

Reference
"Spherical image-based visual servo and structure estimation", P. I. Corke, in Proc. IEEE Int. Conf. Robotics and Automation, (Anchorage), pp. 5550-5555, May 3-7 2010.

See also CentralCamera.visjac_p, CentralCamera.visjac_p_polar, CentralCamera.visjac_l, CentralCamera.visjac_e

SurfPointFeature
SURF point corner feature object
A subclass of PointFeature for SURF features.

Methods
plot        Plot feature position
plot_scale  Plot feature scale
distance    Descriptor distance
match       Match features
ncc         Descriptor similarity
uv          Return feature coordinate
display     Display value
char        Convert value to string

Properties
u           horizontal coordinate
v           vertical coordinate
strength    feature strength
scale       feature scale
theta       feature orientation [rad]
descriptor  feature descriptor (vector)
image_id    index of image containing feature

Properties of a vector of SurfPointFeature objects are returned as a vector. If F is a vector (N x 1) of SurfPointFeature objects then F.u is a 2 x N matrix with each column the corresponding u coordinate.

Notes
• SurfPointFeature is a reference object.
• SurfPointFeature objects can be used in vectors and arrays.

Reference
Herbert Bay, Andreas Ess, Tinne Tuytelaars, Luc Van Gool, "SURF: Speeded Up Robust Features", Computer Vision and Image Understanding (CVIU), Vol. 110, No. 3, pp. 346-359, 2008.

See also isurf, PointFeature, ScalePointFeature, SiftPointFeature

SurfPointFeature.SurfPointFeature
Create a SURF point feature object
f = SurfPointFeature() is a point feature object with null parameters.
f = SurfPointFeature(u, v) is a point feature object with specified coordinates.
f = SurfPointFeature(u, v, strength) as above but with specified strength.

See also isurf

SurfPointFeature.match
Match SURF point features
m = F.match(f2, options) is a vector of FeatureMatch objects that describe candidate matches between the two vectors of SURF features F and f2. Correspondence is based on descriptor similarity.
[m, C] = F.match(f2, options) as above but returns a correspondence matrix where each row contains the indices of corresponding features in F and f2 respectively.

Options
'thresh', T  Match threshold (default 0.05)
'median'     Threshold at the median distance

Notes
• For no threshold set to [].

See also FeatureMatch
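A sketch of the matching workflow described above; the image filenames are placeholders, and iread is the Toolbox image-loading function:

```matlab
% Match SURF features between two overlapping views.
im1 = iread('view1.png', 'grey');    % placeholder filenames
im2 = iread('view2.png', 'grey');
sf1 = isurf(im1);                    % vectors of SurfPointFeature objects
sf2 = isurf(im2);
m = sf1.match(sf2, 'thresh', 0.05);  % vector of candidate FeatureMatch objects
```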

SurfPointFeature.plot_scale
Plot feature scale
F.plot_scale(options) overlay a marker to indicate feature point position and scale. If F is a vector then each element is plotted.
F.plot_scale(options, ls) as above but the optional line style arguments ls are passed to plot.

Options
'circle'    Indicate scale by a circle (default)
'clock'     Indicate scale by circle with one radial line for orientation
'arrow'     Indicate scale and orientation by an arrow
'disk'      Indicate scale by a translucent disk
'color', C  Color of circle or disk (default green)
'alpha', A  Transparency of disk, 1=opaque, 0=transparent (default 0.2)

SurfPointFeature.support
Support region of feature
out = F.support(im, w) is an image of the support region of the feature F, extracted from the image im in which the feature appears. The support region is scaled to w x w and rotated so that the feature's orientation axis is upward.
out = F.support(images, w) as above but if the features were extracted from an image sequence images then the feature is extracted from the appropriate image in the same sequence.
[out, T] = F.support(images, w) as above but returns the pose of the feature as a 3 x 3 homogeneous transform in SE(2) that comprises the feature position and orientation.
F.support(im, w) as above but the support region is displayed.

See also SurfPointFeature
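A sketch combining plot_scale and support; im is any greyscale image, and idisp is the Toolbox image-display function:

```matlab
% Overlay feature scale markers, then inspect one feature's support region.
sf = isurf(im);                     % SURF features from the image
idisp(im);                          % display the image
sf(1:20).plot_scale('disk', 'color', 'g', 'alpha', 0.3);  % first 20 features
s = sf(1).support(im, 50);          % 50 x 50 support region, orientation upward
idisp(s);
```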

AxisWebCamera
Image from Axis webcam
A concrete subclass of ImageSource that acquires images from a web camera built by Axis Communications (www.axis.com).

Methods
grab   Acquire and return the next image
size   Size of image
close  Close the image source
char   Convert the object parameters to human readable string

See also ImageSource, Video

AxisWebCamera.AxisWebCamera
Axis web camera constructor
a = AxisWebCamera(url, options) is an AxisWebCamera object that acquires images from an Axis Communications (www.axis.com) web camera.

Options
'uint8'          Return image with uint8 pixels (default)
'float'          Return image with float pixels
'double'         Return image with double precision pixels
'grey'           Return greyscale image
'gamma', G       Apply gamma correction with gamma=G
'scale', S       Subsample the image by S in both directions
'resolution', S  Obtain an image of size S=[W H]

Notes
• The specified 'resolution' must match one that the camera is capable of, otherwise the result is not predictable.
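A minimal acquisition sketch using the constructor above; the URL is a placeholder for your camera's address:

```matlab
% Acquire one frame from an Axis web camera and release the connection.
cam = AxisWebCamera('http://webcam.example.com', 'resolution', [640 480]);
im = cam.grab();   % acquire the next image
cam.close();       % close the image source
```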

AxisWebCamera.char
Convert to string
A.char() is a string representing the state of the camera object in human readable form.

See also AxisWebCamera.display

AxisWebCamera.close
Close the image source
A.close() closes the connection to the web camera.

AxisWebCamera.grab
Acquire image from the camera
im = A.grab() is an image acquired from the web camera.

Notes
• Some web cameras have a fixed picture taking interval, and this function will return the most recently captured image held in the camera.

BagOfWords
Bag of words class
The BagOfWords class holds sets of features for a number of images and supports image retrieval by comparing new images with those in the 'bag'.

Methods
isword       Return all features assigned to word
occurrences  Return number of occurrences of word
remove_stop  Remove stop words
wordvector   Return word frequency vector
wordfreq     Return words and their frequencies
similarity   Compare two word bags
contains     List the images that contain a word
exemplars    Display examples of word support regions
display      Display the parameters of the bag of words
char         Convert the parameters of the bag of words to a string

Properties
K        The number of clusters specified
nstop    The number of stop words specified
nimages  The number of images in the bag

Reference
J. Sivic and A. Zisserman, "Video Google: a text retrieval approach to object matching in videos", in Proc. Ninth IEEE Int. Conf. on Computer Vision, pp. 1470-1477, Oct. 2003.

See also PointFeature

BagOfWords.BagOfWords
Create a BagOfWords object
b = BagOfWords(f, k) is a new bag of words created from the feature vector f and with k words. f can also be a cell array, as produced by isurf() for an image sequence. The features are sorted into k clusters and each cluster is termed a visual word.
b = BagOfWords(f, b2) is a new bag of words created from the feature vector f but clustered to the words (and stop words) from the existing bag b2.

Notes
• Uses the MEX function vl_kmeans to perform clustering (vlfeat.org).
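A sketch of building a bag of words from an image sequence; images is assumed to be a sequence loaded elsewhere, and the word count of 500 is an arbitrary illustrative choice:

```matlab
% Cluster SURF features from an image sequence into visual words.
sf = isurf(images);         % cell array of SURF features, one per image
bag = BagOfWords(sf, 500);  % cluster the features into 500 visual words
```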

BagOfWords.char
Convert to string

s = B.char() is a compact string representation of a bag of words.

BagOfWords.contains
Find images containing word

k = B.contains(w) is a vector of the indices of images in the sequence that contain one or more instances of the word w.

BagOfWords.display
Display value

B.display() displays the parameters of the bag in a compact human readable form.

Notes
• This method is invoked implicitly at the command line when the result of an expression is a BagOfWords object and the command has no trailing semicolon.

See also BagOfWords.char

BagOfWords.exemplars
Display exemplars of words

B.exemplars(w, images, options) displays examples of the support regions of the words specified by the vector w. The examples are displayed as a table of thumbnail images. The original sequence of images from which the features were extracted must be provided as images.

Options
  ‘ncolumns’, N     Number of columns to display (default 10)
  ‘maxperimage’, M  Maximum number of exemplars to display from any one image (default 2)
  ‘width’, w        Width of each thumbnail [pixels] (default 50)

BagOfWords.isword
Features from words

f = B.isword(w) is a vector of feature objects that are assigned to any of the word w. If w is a vector of words the result is a vector of features assigned to all the words in w.

BagOfWords.occurrence
Word occurrence

n = B.occurrence(w) is the number of occurrences of the word w across all features in the bag.

BagOfWords.remove_stop
Remove stop words

B.remove_stop(n) removes the n most frequent words (the stop words) from the bag. All remaining words are renumbered so that the word labels are consecutive.

BagOfWords.wordfreq
Word frequency statistics

[w,n] = B.wordfreq() is a vector of word labels w and the corresponding elements of n are the number of occurrences of that word.

BagOfWords.wordvector
Word frequency vector

wf = B.wordvector(J) is the word frequency vector for the J’th image in the bag. The vector is K × 1 and the angle between any two WFVs is an indication of image similarity.

Notes
• The word vector is expensive to compute so a lazy evaluation is performed on the first call to this function.

CatadioptricCamera
Catadioptric camera class

A concrete class for a catadioptric camera, a subclass of Camera.

Methods
  project      project world points to image plane
  plot         plot/return world point on image plane
  hold         control hold for image plane
  ishold       test figure hold for image plane
  clf          clear image plane
  figure       figure holding the image plane
  mesh         draw shape represented as a mesh
  point        draw homogeneous points on image plane
  line         draw homogeneous lines on image plane
  plot_camera  draw camera
  rpy          set camera attitude
  move         copy of Camera after motion
  centre       get world coordinate of camera centre
  delete       object destructor
  char         convert camera parameters to string
  display      display camera parameters
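Since the angle between word frequency vectors indicates image similarity, a pairwise comparison can be sketched as below. This is an illustrative computation, not a documented method; it assumes a BagOfWords object bag built as described above.

```matlab
% Compare images 1 and 5 in the bag by the angle between their
% word frequency vectors (smaller angle = more similar).
wf1 = bag.wordvector(1);   % K x 1 word frequency vector, image 1
wf5 = bag.wordvector(5);   % K x 1 word frequency vector, image 5
c = dot(wf1, wf5) / (norm(wf1) * norm(wf5));  % cosine of angle between WFVs
theta = acos(c);           % angle in radians; near zero for similar images
```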

Properties (read/write)
  npix  image dimensions in pixels (2 × 1)
  pp    intrinsic: principal point (2 × 1)
  rho   intrinsic: pixel dimensions (2 × 1) [metres]
  f     intrinsic: focal length [metres]
  p     intrinsic: tangential distortion parameters
  T     extrinsic: camera pose as homogeneous transformation

Properties (read only)
  nu  number of pixels in u-direction
  nv  number of pixels in v-direction
  u0  principal point u-coordinate
  v0  principal point v-coordinate

Notes
• Camera is a reference object.
• Camera objects can be used in vectors and arrays.

See also CentralCamera, Camera

CatadioptricCamera.CatadioptricCamera
Create central projection camera object

C = CatadioptricCamera() creates a central projection camera with canonic parameters: f=1 and name=’canonic’.

C = CatadioptricCamera(options) as above but with specified parameters.

Options
  ‘name’, N        Name of camera
  ‘focal’, F       Focal length (metres)
  ‘default’        Default camera parameters: 1024 × 1024, f=8mm, 10um pixels, camera at origin, optical axis is z-axis, u- and v-axes parallel to x- and y-axes respectively
  ‘projection’, M  Catadioptric model: ‘equiangular’ (default), ‘sine’, ‘equisolid’, ‘stereographic’
  ‘k’, K           Parameter for the projection model
  ‘maxangle’, A    The maximum viewing angle above the horizontal plane
  ‘resolution’, N  Image plane resolution: N × N or N=[W H]
  ‘sensor’, S      Image sensor size in metres (2 × 1)
  ‘centre’, P      Principal point (2 × 1)
  ‘pixel’, S       Pixel size: S × S or S=[W H]
  ‘noise’, SIGMA   Standard deviation of additive Gaussian noise added to returned image projections
  ‘pose’, T        Pose of the camera as a homogeneous transformation

Notes
• The elevation angle range is from -pi/2 (below the mirror) to maxangle above the horizontal plane.

See also Camera, FishEyeCamera, SphericalCamera

CatadioptricCamera.project
Project world points to image plane

uv = C.project(p, options) are the image plane coordinates for the world points p. The columns of p (3 × N) are the world points and the columns of uv (2 × N) are the corresponding image plane points.

Options
  ‘Tobj’, T  Transform all points by the homogeneous transformation T before projecting them to the camera image plane.
  ‘Tcam’, T  Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane. Temporarily overrides the current camera pose C.T.

See also Camera.plot
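A short usage sketch of the constructor and project, with arbitrary illustrative option values and world points:

```matlab
% Create an equiangular catadioptric camera and project three world
% points onto its image plane.  Option values are illustrative only.
cam = CatadioptricCamera('name', 'panocam', ...
    'projection', 'equiangular', 'maxangle', pi/4);
P = [0.1 0.2 0.3;      % world points, one per column (3 x N)
     0.2 0.1 0.2;
     1.0 1.5 2.0];
uv = cam.project(P);   % 2 x N image plane coordinates
cam.plot(P);           % display the projected points on the image plane
```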

FeatureMatch
Feature correspondence object

This class represents the correspondence between two PointFeature objects. A vector of FeatureMatch objects can represent the correspondence between sets of points.

Methods
  plot     Plot corresponding points
  show     Show summary statistics of corresponding points
  ransac   Determine inliers and outliers
  inlier   Return inlier matches
  outlier  Return outlier matches
  subset   Return a subset of matches
  display  Display value of match
  char     Convert value of match to string

Properties
  p1        Point coordinates in view 1 (2 × 1)
  p2        Point coordinates in view 2 (2 × 1)
  p         Point coordinates in view 1 and 2 (4 × 1)
  distance  Match strength between the points

Properties of a vector of FeatureMatch objects are returned as a vector. If F is a vector (N × 1) of FeatureMatch objects then F.p1 is a 2 × N matrix with each column the corresponding view 1 point coordinate.

Note
• FeatureMatch is a reference object.
• FeatureMatch objects can be used in vectors and arrays.
• Operates with all objects derived from PointFeature, such as ScalePointFeature, SurfPointFeature and SiftPointFeature.

See also PointFeature, SurfPointFeature, SiftPointFeature

FeatureMatch.FeatureMatch
Create a new FeatureMatch object

m = FeatureMatch(f1, f2, s) is a new FeatureMatch object describing a correspondence between point features f1 and f2 with a strength of s.

m = FeatureMatch(f1, f2) as above but the strength is set to NaN.

Notes
• Only the coordinates of the PointFeature are kept.

See also PointFeature, SurfPointFeature, SiftPointFeature

FeatureMatch.char
Convert to string

s = M.char() is a compact string representation of the match object. If M is a vector then the string has multiple lines, one per element.

FeatureMatch.display
Display value

M.display() displays a compact human-readable representation of the feature pair. If M is a vector then the elements are printed one per line.

Notes
• This method is invoked implicitly at the command line when the result of an expression is a FeatureMatch object and the command has no trailing semicolon.

See also FeatureMatch.char

FeatureMatch.inlier
Inlier features

m2 = M.inlier() is a subset of the FeatureMatch vector M that are considered to be inliers.

Notes
• Inliers are not determined until after RANSAC is run.

See also FeatureMatch.outlier, FeatureMatch.ransac

FeatureMatch.outlier
Outlier features

m2 = M.outlier() is a subset of the FeatureMatch vector M that are considered to be outliers.

Notes
• Outliers are not determined until after RANSAC is run.

See also FeatureMatch.inlier, FeatureMatch.ransac

FeatureMatch.p
Feature point coordinate pairs

p = M.p() is a 4 × N matrix containing the feature point coordinates. Each column contains the coordinates of a pair of corresponding points [u1,v1,u2,v2].

See also FeatureMatch.p1, FeatureMatch.p2

FeatureMatch.p1
Feature point coordinates from view 1

p = M.p1() is a 2 × N matrix containing the feature point coordinates from view 1. These are the (u,v) properties of the feature F1 passed to the constructor.

See also FeatureMatch.FeatureMatch, FeatureMatch.p, FeatureMatch.p2

FeatureMatch.p2
Feature point coordinates from view 2

p = M.p2() is a 2 × N matrix containing the feature point coordinates from view 2. These are the (u,v) properties of the feature F2 passed to the constructor.

See also FeatureMatch.FeatureMatch, FeatureMatch.p, FeatureMatch.p1

FeatureMatch.plot
Show corresponding points

M.plot() overlays the correspondences in the FeatureMatch vector M on the current figure. The figure must comprise views 1 and 2 side by side, for example by:

idisp({im1,im2})
m.plot()

M.plot(ls) as above but the optional line style arguments ls are passed to plot.

Notes
• Using IDISP as above adds UserData to the figure, and an error is created if this UserData is not found.

See also idisp

FeatureMatch.ransac
Apply RANSAC

M.ransac(func, options) applies the RANSAC algorithm to fit the point correspondences to the model described by the function func. The options are passed to the RANSAC() function. Elements of the FeatureMatch vector have their status updated in place to indicate whether they are inliers or outliers.

Example

f1 = isurf(im1);
f2 = isurf(im2);
m = f1.match(f2);
m.ransac( @fmatrix, 1e-4);

See also fmatrix, homography, ransac

FeatureMatch.show
Display summary statistics of the FeatureMatch vector

M.show() is a compact summary of the FeatureMatch vector M that gives the number of matches, inliers and outliers (and their percentages).

FeatureMatch.subset
Subset of matches

m2 = M.subset(n) is a FeatureMatch vector with no more than n elements sampled uniformly from M.

FishEyeCamera
Fish eye camera class

A concrete class for a fisheye lens projection camera. The camera coordinate system is:

  0------------> u, X
  |
  |
  |   + (principal point)
  |
  |
  v, Y

Z-axis is into the page.

This camera model assumes central projection, that is, the focal point is at z=0 and the image plane is at z=f. The image is not inverted.

Methods
  project      project world points to image plane
  plot         plot/return world point on image plane
  hold         control hold for image plane
  ishold       test figure hold for image plane
  clf          clear image plane
  figure       figure holding the image plane
  mesh         draw shape represented as a mesh
  point        draw homogeneous points on image plane
  line         draw homogeneous lines on image plane
  plot_camera  draw camera
  rpy          set camera attitude
  move         copy of Camera after motion
  centre       get world coordinate of camera centre
  delete       object destructor
  char         convert camera parameters to string
  display      display camera parameters

Properties (read/write)
  npix  image dimensions in pixels (2 × 1)
  pp    intrinsic: principal point (2 × 1)
  f     intrinsic: focal length [metres]
  rho   intrinsic: pixel dimensions (2 × 1) [metres]
  T     extrinsic: camera pose as homogeneous transformation

Properties (read only)
  nu  number of pixels in u-direction
  nv  number of pixels in v-direction

Notes
• Camera is a reference object.
• Camera objects can be used in vectors and arrays.

See also Camera

FishEyeCamera.FishEyeCamera
Create fisheye camera object

C = FishEyeCamera() creates a fisheye camera with canonic parameters: f=1 and name=’canonic’.

C = FishEyeCamera(options) as above but with specified parameters.

Options
  ‘name’, N        Name of camera
  ‘default’        Default camera parameters: 1024 × 1024, f=8mm, 10um pixels, camera at origin, optical axis is z-axis, u- and v-axes are parallel to x- and y-axes respectively
  ‘projection’, M  Fisheye model: ‘equiangular’ (default), ‘sine’, ‘equisolid’, ‘stereographic’
  ‘k’, K           Parameter for the projection model
  ‘resolution’, N  Image plane resolution: N × N or N=[W H]
  ‘sensor’, S      Image sensor size [metres] (2 × 1)
  ‘centre’, P      Principal point (2 × 1)
  ‘pixel’, S       Pixel size: S × S or S=[W H]
  ‘noise’, SIGMA   Standard deviation of additive Gaussian noise added to returned image projections
  ‘pose’, T        Pose of the camera as a homogeneous transformation

Notes
• If K is not specified it is computed such that the circular imaging region maximally fills the square image plane.

See also Camera, CentralCamera, CatadioptricCamera, SphericalCamera

FishEyeCamera.project
Project world points to image plane

uv = C.project(p, options) are the image plane coordinates for the world points p. The columns of p (3 × N) are the world points and the columns of uv (2 × N) are the corresponding image plane points.

Options
  ‘Tobj’, T  Transform all points by the homogeneous transformation T before projecting them to the camera image plane.
  ‘Tcam’, T  Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane. Temporarily overrides the current camera pose C.T.

See also FishEyeCamera.plot

Hough
Hough transform class

The Hough transform is a technique for finding lines in an image using a voting scheme. For every edge pixel in the input image a set of cells in the Hough accumulator (voting array) are incremented.

In this version of the Hough transform lines are described by:

  d = y cos(theta) + x sin(theta)
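A brief sketch of projection with a temporary camera pose via the ‘Tcam’ option; the pose and point values are illustrative only (transl here is assumed to be the usual Toolbox homogeneous-translation helper):

```matlab
% Project a world point with the camera temporarily displaced 0.2 m
% along its x-axis, without changing the stored camera pose.
cam = FishEyeCamera('name', 'fisheye', 'projection', 'equiangular');
P = [0.1; 0.2; 2.0];                             % a world point (3 x 1)
uv = cam.project(P, 'Tcam', transl(0.2, 0, 0));  % temporary pose override
```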

where theta is the angle the line makes to the horizontal axis, and d is the perpendicular distance between (0,0) and the line. A horizontal line has theta = 0; a vertical line has theta = pi/2 or -pi/2.

Methods
  plot     Overlay detected lines
  show     Display the Hough accumulator
  lines    Return line features
  char     Convert Hough parameters to string
  display  Display Hough parameters

Properties
  Nrho         Number of bins in rho direction
  Ntheta       Number of bins in theta direction
  A            The Hough accumulator (Nrho x Ntheta)
  rho          rho values for the centre of each bin vertically
  theta        Theta values for the centre of each bin horizontally
  edgeThresh   Threshold on relative edge pixel strength
  houghThresh  Threshold on relative peak strength
  suppress     Radius of accumulator cells cleared around peak
  interpWidth  Width of region used for peak interpolation

Notes
• Hough is a reference object.

See also LineFeature

Hough.Hough
Create Hough transform object

ht = Hough(E, options) is the Hough transform of the edge image E. For every pixel in the edge image E (H × W) greater than a threshold the corresponding elements of the accumulator are incremented. The voting array is 2-dimensional, with columns corresponding to theta and rows corresponding to offset (d). Theta spans the range -pi/2 to pi/2 in Ntheta steps. Offset is in the range -rho_max to rho_max where rho_max = max(W,H). By default the vote is incremented by

the edge strength, but votes can be made equal with the option ‘equal’. The threshold is determined from the maximum edge strength value x ht.edgeThresh.

Options
  ‘equal’           All edge pixels have equal weight, otherwise the edge pixel value is the vote strength
  ‘interpwidth’, W  Interpolation width (default 3)
  ‘houghthresh’, T  Set ht.houghThresh (default 0.5)
  ‘edgethresh’, T   Set ht.edgeThresh (default 0.1)
  ‘suppress’, W     Set ht.suppress (default 0)
  ‘nbins’, N        Set number of bins; if N is scalar set Nrho=Ntheta=N, else N = [Ntheta, Nrho]. Default 400 × 401.

Hough.char
Convert to string

s = HT.char() is a compact string representation of the Hough transform parameters.

Hough.display
Display value

HT.display() displays a compact human-readable string representation of the Hough transform parameters.

Notes
• This method is invoked implicitly at the command line when the result of an expression is a Hough object and the command has no trailing semicolon.

See also Hough.char

Hough.lines
Find lines

L = HT.lines() is a vector of LineFeature objects that represent the dominant lines in the Hough accumulator.

L = HT.lines(n) as above but returns no more than n LineFeature objects.

Lines are the coordinates of peaks in the Hough accumulator, refined to subpixel precision. The highest peak is found, then all elements in an HT.suppress radius around it are zeroed so as to eliminate multiple close peaks. The process is repeated for all peaks. The peak detection loop breaks early if the remaining peak has a strength less than HT.houghThresh times the maximum vote value.

See also Hough.plot, LineFeature

Hough.plot
Plot line features

HT.plot() overlays all detected lines on the current figure.

HT.plot(n) overlays a maximum of n strongest lines on the current figure.

HT.plot(n, ls) as above but the optional line style arguments ls are passed to plot.

H = HT.plot() as above but returns a vector of graphics handles for each line.

Hough.show
Display the Hough accumulator as image

s = HT.show() displays the Hough vote accumulator as an image using the hot colormap, where ‘heat’ is proportional to the number of votes.
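The typical edge-detection-to-lines pipeline can be sketched as below; the image filename is a placeholder.

```matlab
% Find the dominant lines in an image via edge detection and the
% Hough transform.  'church.png' is a placeholder filename.
im = iread('church.png', 'grey', 'double');
edges = icanny(im);               % edge image
ht = Hough(edges, 'suppress', 5); % clear a 5-cell radius around each peak
L = ht.lines(10);                 % up to 10 strongest LineFeature objects
idisp(im);
ht.plot(10, 'g');                 % overlay the lines in green
```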

See also colormap, hot

LineFeature
Line feature class

This class represents a line feature.

Methods
  plot       Plot the line segment
  seglength  Determine length of line segment
  display    Display value
  char       Convert value to string

Properties
  rho       Offset of the line
  theta     Orientation of the line
  strength  Feature strength
  length    Length of the line

Properties of a vector of LineFeature objects are returned as a vector. If L is a vector (N × 1) of LineFeature objects then L.rho is an N × 1 vector of the rho element of each feature.

Note
• LineFeature is a reference object.
• LineFeature objects can be used in vectors and arrays.

See also Hough, RegionFeature, PointFeature

LineFeature.LineFeature
Create a line feature object

L = LineFeature() is a line feature object with null parameters.

L = LineFeature(rho, theta, strength) is a line feature object with the specified properties; LENGTH is undefined.

L = LineFeature(rho, theta, strength, length) is a line feature object with the specified properties.

L = LineFeature(l2) is a deep copy of the line feature l2.

LineFeature.char
Convert to string

s = L.char() is a compact string representation of the line feature. If L is a vector then the string has multiple lines, one per element.

LineFeature.display
Display value

L.display() displays a compact human-readable representation of the feature. If L is a vector then the elements are printed one per line.

Notes
• This method is invoked implicitly at the command line when the result of an expression is a LineFeature object and the command has no trailing semicolon.

See also LineFeature.char

LineFeature.plot
Plot line

L.plot() overlays the line on the current plot.

L.plot(ls) as above but the optional line style arguments ls are passed to plot.

Notes
• If L is a vector then each element is plotted.

LineFeature.points
Return points on line segments

p = L.points(edge) is the set of points that lie along the line in the edge image edge.

See also icanny

LineFeature.seglength
Compute length of line segments

The Hough transform identifies lines but cannot determine their length. This method examines the edge pixels in the original image and determines the longest stretch of non-zero pixels along the line.

l2 = L.seglength(edge, gap) is a copy of the line feature object with the property length updated to the length of the line (pixels). Small gaps, less than gap pixels, are tolerated.

l2 = L.seglength(edge) as above but the maximum allowable gap is 5 pixels.

See also icanny

Movie
Class to read movie file

A concrete subclass of ImageSource that acquires images from a movie file.

Movie.Movie
Image source constructor

m = Movie(file, options) is a Movie object that returns frames from the movie file file.

Options
  ‘uint8’     Return image with uint8 pixels (default)
  ‘float’     Return image with float pixels
  ‘double’    Return image with double precision pixels
  ‘grey’      Return greyscale image
  ‘gamma’, G  Apply gamma correction with gamma=G
  ‘scale’, S  Subsample the image by S in both directions
  ‘skip’, S   Read every S’th frame from the movie

Movie.char
Convert to string

M.char() is a string representing the state of the movie object in human readable form.

Movie.close
Close the image source

M.close() closes the connection to the movie.

Movie.grab
Acquire next frame from movie

im = M.grab() acquires the next image from the movie.

im = M.grab(options) as above but allows the next frame to be specified.

Options
  ‘skip’, S   Skip frames, and return current+S frame
  ‘frame’, F  Return frame F within the movie

Notes
• If no output argument is given the image is displayed using IDISP.

PGraph
Simple graph class

g = PGraph() creates a 2D, planar, undirected graph.
g = PGraph(n) creates an n-d, undirected graph.

Graphs
• are undirected
• have symmetric cost edges (A to B is same cost as B to A)
• are embedded in a coordinate system
• have no loops (edges from A to A)
• vertices are represented by integer ids, vid
• edges are represented by integer ids, eid

Graph connectivity is maintained by a labeling algorithm and this is updated every time an edge is added.
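A minimal read loop might look like this; the filename is a placeholder, and it is assumed here that grab() returns an empty matrix at the end of the movie.

```matlab
% Display every 5th frame of a movie as greyscale images.
% 'traffic.mpg' is a placeholder filename.
mov = Movie('traffic.mpg', 'grey', 'skip', 5);
while true
    im = mov.grab();       % next frame (assumed empty at end of movie)
    if isempty(im), break; end
    idisp(im);
end
mov.close();
```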

Methods

Constructing the graph
  g.add_node(coord)     add vertex, return vid
  g.add_node(coord, v)  add vertex and edge to v, return vid
  g.add_edge(v1, v2)    add edge from v1 to v2, return eid
  g.clear()             remove all nodes and edges from the graph

Information from graph
  g.edges(e)        return vid for edge
  g.cost(e)         return cost for edge list
  g.coord(v)        return coordinate of node v
  g.component(v)    return component id for vertex
  g.connectivity()  return number of edges for all nodes
  g.plot()          plot the graph
  g.pick()          return vertex id closest to picked point
  char(g)           display summary info about the graph

Planning paths through the graph
  g.goal(v)  set goal vertex, and plan paths
  g.next(v)  return vid of neighbour of v closest to goal
  g.path(v)  return list of nodes from v to goal

Graph and world points
  g.closest(coord)    return vertex closest to coord
  g.distance(v1, v2)  distance between v1 and v2 as the crow flies
  g.distances(coord)  return sorted distances from coord to all vertices

To change the distance metric create a subclass of PGraph and override the method distance_metric().

Object properties (read/write)
  g.n  number of nodes

PGraph.PGraph
Graph class constructor

g = PGraph(d, options) returns a graph object embedded in d dimensions.

Options
  ‘distance’, M  Use the distance metric M for path planning
  ‘verbose’      Specify verbose operation

Note
• The distance metric is either ‘Euclidean’ or ‘SE2’ which is the sum of the squares of the difference in position and angle modulo 2pi.

PGraph.add_node
Add a node to the graph

v = G.add_node(x) adds a node with coordinate x, where x is D × 1, and returns the node id v.

v = G.add_node(x, v) adds a node with coordinate x and connected to node v by an edge.

v = G.add_node(x, v, C) adds a node with coordinate x and connected to node v by an edge with cost C.

PGraph.add_edge
Add an edge to the graph

E = G.add_edge(v1, v2) adds an edge between nodes with id v1 and v2, and returns the edge id E.

E = G.add_edge(v1, v2, C) adds an edge between nodes with id v1 and v2 with cost C.

PGraph.char
Convert graph to string

s = G.char() returns a compact human readable representation of the state of the graph including the number of vertices, edges and components.

PGraph.clear
Clear the graph

G.clear() removes all nodes and edges.

PGraph.closest
Find closest node

v = G.closest(x) return id of node geometrically closest to coordinate x.

[v,d] = G.closest(x) return id of node geometrically closest to coordinate x, and the distance d.

PGraph.connectivity
Graph connectivity

C = G.connectivity() returns the total number of edges in the graph.

PGraph.coord
Coordinate of node

x = G.coord(v) return coordinate vector, D × 1, of node id v.
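Building a tiny 2D graph and querying it can be sketched as follows; the coordinates are arbitrary.

```matlab
% Build a small 2D graph: three nodes in a chain, then query it.
g = PGraph();                 % 2D, planar, undirected graph
v1 = g.add_node([1 1]');      % node ids are returned
v2 = g.add_node([3 1]', v1);  % add node and an edge to v1 in one call
v3 = g.add_node([3 4]');
g.add_edge(v2, v3);
c = g.closest([2.9 1.2]');    % id of node nearest this coordinate
x = g.coord(c);               % its 2 x 1 coordinate vector
```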


PGraph.cost

Cost of edge

C = G.cost(E) return cost of edge id E.

PGraph.display

Display state of the graph

G.display() displays a compact human readable representation of the state of the graph including the number of vertices, edges and components.

See also

PGraph.char

PGraph.distance

Distance between nodes

d = G.distance(v1, v2) return the geometric distance between the nodes with id v1 and v2.

PGraph.distances

distance to all nodes

d = G.distances(v) returns vector of geometric distance from node id v to every other node (including v) sorted into increasing order by d. [d,w] = G.distances(v) returns vector of geometric distance from node id v to every other node (including v) sorted into increasing order by d where elements of w are the corresponding node id.



PGraph.edges

Find edges given vertex

E = G.edges(v) return the id of all edges from node id v.

PGraph.goal

Set goal node

G.goal(vg) sets the goal node for least-cost path planning through the graph. The cost of reaching every node in the graph connected to vg is computed.

See also

PGraph.path

Note

• Path cost is total distance from the goal.

PGraph.neighbours

Neighbours of a node

n = G.neighbours(v) return a vector of ids for all nodes which are directly connected neighbours of node id v. [n,C] = G.neighbours(v) return a vector n of ids for all nodes which are directly connected neighbours of node id v. The elements of C are the edge costs of the paths to the corresponding node ids in n.

PGraph.next

Find next node toward goal

v = G.next(vs) return the id of a node connected to node id vs that is closer to the goal.

See also

PGraph.goal, PGraph.path



PGraph.path

Find path to goal node

p = G.path(vs) return a vector of node ids that form a path from the starting node vs to the previously speciﬁed goal. The path includes the start and goal node id.

See also

PGraph.goal

PGraph.pick

Graphically select a node

v = G.pick() returns the id of the node closest to the point clicked by user on a plot of the graph.

See also

PGraph.plot

PGraph.plot

Plot the graph

G.plot(opt) plots the graph in the current figure. Nodes are shown as colored circles.

Options

  ‘labels’              Display node id (default false)
  ‘edges’               Display edges (default true)
  ‘edgelabels’          Display edge id (default false)
  ‘MarkerSize’, S       Size of node circle
  ‘MarkerFaceColor’, C  Node circle color
  ‘MarkerEdgeColor’, C  Node circle edge color
  ‘componentcolor’      Node color is a function of graph component



PGraph.showComponent

Show nodes of a graph component

G.showcomponent(C) plots the nodes that belong to graph component C.

PGraph.showVertex

Highlight a vertex

G.showVertex(v) highlights the vertex v with a yellow marker.

PGraph.vertices

Find vertices given edge

v = G.vertices(E) return the id of the nodes that deﬁne edge E.

PointFeature

Point feature object

A superclass for image corner features.

Methods

  plot      Plot feature position
  distance  Descriptor distance
  ncc       Descriptor similarity
  uv        Return feature coordinate
  display   Display value
  char      Convert value to string


Properties
  u           horizontal coordinate
  v           vertical coordinate
  strength    feature strength
  descriptor  feature descriptor (vector)

Properties of a vector of PointFeature objects are returned as a vector. If F is a vector (N × 1) of PointFeature objects then F.u is a 2 × N matrix with each column the corresponding point coordinate.

See also ScalePointFeature, SurfPointFeature, SiftPointFeature

PointFeature.PointFeature
Create a point feature object

f = PointFeature() is a point feature object with null parameters.

f = PointFeature(u, v) is a point feature object with specified coordinates.

f = PointFeature(u, v, strength) as above but with specified strength.

PointFeature.char
Convert to string

s = F.char() is a compact string representation of the point feature. If F is a vector then the string has multiple lines, one per element.

PointFeature.display
Display value

F.display() displays a compact human-readable representation of the feature. If F is a vector then the elements are printed one per line.

Notes
• This method is invoked implicitly at the command line when the result of an expression is a PointFeature object and the command has no trailing semicolon.

See also PointFeature.char

PointFeature.distance
Distance between feature descriptors

d = F.distance(f1) is the distance between feature descriptors, the norm of the Euclidean distance. If F is a vector then d is a vector whose elements are the distance between the corresponding element of F and f1.

PointFeature.match
Match point features

m = F.match(f2, options) is a vector of FeatureMatch objects that describe candidate matches between the two vectors of point features F and f2.

[m,C] = F.match(f2, options) as above but returns a correspondence matrix where each row contains the indices of corresponding features in F and f2 respectively.

Options
  ‘thresh’, T  Match threshold (default 0.05)
  ‘median’     Threshold at the median distance

See also FeatureMatch

PointFeature.ncc
Feature descriptor similarity

s = F.ncc(f1) is the similarity between feature descriptors which is a scalar in the interval -1 to 1, where 1 is perfect match.
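For example, matching corner features between two views can be sketched as below; im1 and im2 are assumed to be greyscale images already in the workspace, and the icorner option value is illustrative.

```matlab
% Match corner features between two views of a scene.
% Assumes im1 and im2 are greyscale images in the workspace.
f1 = icorner(im1, 'nfeat', 200);   % strongest 200 corner features
f2 = icorner(im2, 'nfeat', 200);
m = f1.match(f2, 'thresh', 0.05);  % vector of candidate FeatureMatch objects
m.show                             % summary statistics of the matches
```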

PointFeature.plot
Plot feature

F.plot() overlays a marker at the feature position.

F.plot(ls) as above but the optional line style arguments ls are passed to plot.

If F is a vector then each element is plotted.

Polygon
General polygon class

p = Polygon(vertices);

Methods
  plot          Plot polygon
  area          Area of polygon
  moments       Moments of polygon
  centroid      Centroid of polygon
  perimeter     Perimeter of polygon
  transform     Transform polygon
  inside        Test if points are inside polygon
  intersection  Intersection of two polygons
  difference    Difference of two polygons
  union         Union of two polygons
  xor           Exclusive or of two polygons
  display       Print the polygon in human readable form
  char          Convert the polygon to human readable string

Properties
  vertices  List of polygon vertices, one per column
  extent    Bounding box [minx maxx; miny maxy]
  n         Number of vertices

Notes
• Polygon is a reference class object.
• Polygon objects can be used in vectors and arrays.

Polygon.Polygon

Polygon class constructor

p = Polygon(v) is a polygon with vertices given by v, one column per vertex.

p = Polygon(C, wh) is a rectangle centred at C with dimensions wh=[WIDTH, HEIGHT].

Acknowledgement

The methods inside, intersection, difference, union, and xor are based on code written by: Kirill K. Pankratov, kirill@plume.mit.edu, http://puddle.mit.edu/~glenn/kirill/saga.html and require a licence. However the author does not respond to email regarding the licence, so use with care.

Polygon.area

Area of polygon

a = P.area() is the area of the polygon.

Polygon.centroid

Centroid of polygon

x = P.centroid() is the centroid of the polygon.

Polygon.char

String representation

s = P.char() is a compact representation of the polygon in human readable form.
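As a quick sketch of the constructor and methods above (assuming the Toolbox is on the MATLAB path; the vertex values are purely illustrative):

```matlab
% unit square, one vertex per column
p = Polygon([0 1 1 0; 0 0 1 1]);
a = p.area();      % area of the unit square
c = p.centroid();  % centroid, at the centre of the square
```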

Polygon.difference

Difference of polygons

d = P.difference(q) is polygon P minus polygon q.

Notes

• If polygons P and q are not intersecting, returns coordinates of P.
• If the result d is not simply connected or consists of several polygons, the resulting vertex list will contain NaNs.

Polygon.display

Display polygon

P.display() displays the polygon in a compact human readable form.

See also

Polygon.char

Polygon.inside

Test if points are inside polygon

in = P.inside(p) tests if points given by columns of p are inside the polygon. The corresponding elements of in are either true or false.

Polygon.intersect

Intersection of polygon with list of polygons

i = P.intersect(plist) indicates whether or not the Polygon P intersects with the polygons in plist. i(j) = 1 if P intersects plist(j), else 0.

Polygon.intersect_line

Intersection of polygon and line segment

i = P.intersect_line(L) is the intersection points of a polygon P with the line segment L=[x1 x2; y1 y2]. i is a 2 × N matrix with one column per intersection; each column is [x y]'.

Polygon.intersection

Intersection of polygons

i = P.intersection(q) is a Polygon representing the intersection of polygons P and q.

Notes

• If these polygons are not intersecting, returns empty polygon.
• If the intersection consists of several disjoint polygons (for non-convex P or q) then the vertices of i are the concatenation of the vertices of these polygons.

Polygon.moments

Moments of polygon

a = P.moments(p, q) is the pq'th moment of the polygon.

See also

mpq_poly

Polygon.perimeter

Perimeter of polygon

L = P.perimeter() is the perimeter of the polygon.

Polygon.plot

Plot polygon

P.plot() plots the polygon.

P.plot(ls) as above but passes the arguments ls to plot.

Polygon.transform

Transformation of polygon vertices

p2 = P.transform(T) is a new Polygon object whose vertices have been transformed by the 3 × 3 homogeneous transformation T.

Polygon.union

Union of polygons

i = P.union(q) is a Polygon representing the union of polygons P and q.

Notes

• If these polygons are not intersecting, returns a polygon with vertices of both polygons separated by NaNs.
• If the result P is not simply connected (such as a polygon with a "hole") the resulting contour consists of a counter-clockwise "outer boundary" and one or more clockwise "inner boundaries" around "holes".

Polygon.xor

Exclusive or of polygons

i = P.xor(q) is a Polygon representing the exclusive-or of polygons P and q.

Notes

• If these polygons are not intersecting, returns a polygon with vertices of both polygons separated by NaNs.

• If the result P is not simply connected (such as a polygon with a "hole") the resulting contour consists of a counter-clockwise "outer boundary" and one or more clockwise "inner boundaries" around "holes".

Ray3D

Ray in 3D space

This object represents a ray in 3D space, defined by a point on the ray and a direction unit-vector.

Methods

intersect  Intersection of ray with plane or ray
closest    Closest distance between point and ray
char       Ray parameters as human readable string
display    Display ray parameters in human readable form

Properties

P0  A point on the ray (3 × 1)
d   Direction of the ray, unit vector (3 × 1)

Notes

• Ray3D objects can be used in vectors and arrays

Ray3D.Ray3D

Ray constructor

R = Ray3D(p0, d) is a new Ray3D object defined by a point on the ray p0 and a direction vector d.

Ray3D.char

Convert to string

s = R.char() is a compact string representation of the Ray3D's value. If R is a vector then the string has multiple lines, one per element.

Ray3D.closest

Closest distance between point and ray

x = R.closest(p) is the point on the ray R closest to the point p.

[x,E] = R.closest(p) as above but also returns the distance E between x and p.

Ray3D.display

Display value

R.display() displays a compact human-readable representation of the Ray3D's value. If R is a vector then the elements are printed one per line.

Notes

• This method is invoked implicitly at the command line when the result of an expression is a Ray3D object and the command has no trailing semicolon.

See also

Ray3D.char

Ray3D.intersect

Intersection of ray with line or plane

x = R.intersect(r2) is the point on R that is closest to the ray r2. If R is a vector then x has multiple columns, one per element, corresponding to the intersection of R(i) with r2.

[x,E] = R.intersect(r2) as above but also returns the closest distance between the rays.
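An illustrative sketch of the Ray3D methods above; the particular rays are assumptions chosen so the closest-approach geometry is easy to see:

```matlab
r1 = Ray3D([0 0 0]', [1 0 0]');  % ray through the origin along the x-axis
r2 = Ray3D([0 1 0]', [0 0 1]');  % ray offset in y, along the z-axis
[x, E] = r1.intersect(r2);       % closest point on r1, and the gap E between the rays
xp = r1.closest([2 3 0]');       % point on r1 closest to a given point
```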

x = R.intersect(p) returns the point of intersection between the ray R and the plane p=(a,b,c,d) where aX + bY + cZ + d = 0. If R is a vector then x has multiple columns, corresponding to the intersection of R(i) with p.

RegionFeature

Region feature class

This class represents a region feature.

Methods

boundary       Return the boundary as a list
box            Return the bounding box
plot           Plot the centroid
plot_boundary  Plot the boundary
plot_box       Plot the bounding box
plot_ellipse   Plot the equivalent ellipse
display        Display value
char           Convert value to string

Properties

uc           centroid, horizontal coordinate
vc           centroid, vertical coordinate
umin         bounding box, minimum horizontal coordinate
umax         bounding box, maximum horizontal coordinate
vmin         bounding box, minimum vertical coordinate
vmax         bounding box, maximum vertical coordinate
area         the number of pixels
class        the value of the pixels forming this region
label        the label assigned to this region
children     a list of indices of features that are children of this feature
edgepoint    coordinate of a point on the perimeter
edge         a list of edge points, 2 × N matrix
perimeter    number of edge pixels
touch        true if region touches edge of the image
a            major axis length of equivalent ellipse
b            minor axis length of equivalent ellipse
theta        angle of major ellipse axis to horizontal axis
shape        aspect ratio b/a (always <= 1.0)
circularity  1 for a circle, less for other shapes
moments      a structure containing moments of order 0 to 2

Note

• RegionFeature is a reference object.
• RegionFeature objects can be used in vectors and arrays
• This class behaves differently to LineFeature and PointFeature when getting properties of a vector of RegionFeature objects. For example R.uc will be a list not a vector.

See also

iblobs, imoments

RegionFeature.RegionFeature

Create a region feature object

R = RegionFeature() is a region feature object with null parameters.

RegionFeature.boundary

Boundary in polar form

[d,th] = R.boundary() is a polar representation of the boundary with respect to the centroid. d(i) and th(i) are the distance to the boundary point and the angle respectively. These vectors have 400 elements irrespective of region size.

RegionFeature.box

Return bounding box

b = R.box() is the bounding box in standard Toolbox form [xmin,xmax; ymin,ymax].

RegionFeature.char

Convert to string

s = R.char() is a compact string representation of the region feature. If R is a vector then the string has multiple lines, one per element.

RegionFeature.display

Display value

R.display() is a compact string representation of the region feature. If R is a vector then the elements are printed one per line.

Notes

• this method is invoked implicitly at the command line when the result of an expression is a RegionFeature object and the command has no trailing semicolon.

See also

RegionFeature.char

RegionFeature.plot

Plot centroid

R.plot() overlays the centroid on the current plot. It is indicated with overlaid o- and x-markers.

R.plot(ls) as above but the optional line style arguments ls are passed to plot.

If R is a vector then each element is plotted.

RegionFeature.plot_boundary

Plot boundary

R.plot_boundary() overlays perimeter points on the current plot.

R.plot_boundary(ls) as above but the optional line style arguments ls are passed to plot.

Notes

• If R is a vector then each element is plotted.

See also

boundmatch

RegionFeature.plot_box

Plot bounding box

R.plot_box() overlays the bounding box of the region on the current plot.

R.plot_box(ls) as above but the optional line style arguments ls are passed to plot.

If R is a vector then each element is plotted.

RegionFeature.plot_ellipse

Plot equivalent ellipse

R.plot_ellipse() overlays the equivalent ellipse of the region on the current plot.

R.plot_ellipse(ls) as above but the optional line style arguments ls are passed to plot.

If R is a vector then each element is plotted.

ScalePointFeature

ScalePointCorner feature object

A subclass of PointFeature for features with scale.

Methods

plot        Plot feature position
plot_scale  Plot feature scale
distance    Descriptor distance
ncc         Descriptor similarity
uv          Return feature coordinate
display     Display value
char        Convert value to string

Properties

u           horizontal coordinate
v           vertical coordinate
strength    feature strength
scale       feature scale
descriptor  feature descriptor (vector)

Properties of a vector of ScalePointFeature objects are returned as a vector. If F is a vector (N × 1) of ScalePointFeature objects then F.u is a 2 × N matrix with each column the corresponding point coordinate.

See also

PointFeature, SurfPointFeature, SiftPointFeature

ScalePointFeature.ScalePointFeature

Create a scale point feature object

f = ScalePointFeature() is a point feature object with null parameters.

f = ScalePointFeature(u, v) is a point feature object with specified coordinates.

f = ScalePointFeature(u, v, strength) as above but with specified strength.

f = ScalePointFeature(u, v, strength, scale) as above but with specified feature scale.

ScalePointFeature.plot_scale

Plot feature scale

F.plot_scale(options) overlays a marker at the feature position.

F.plot_scale(options, ls) as above but the optional line style arguments ls are passed to plot.

If F is a vector then each element is plotted.

Options

'circle'    Indicate scale by a circle (default)
'disk'      Indicate scale by a translucent disk
'color', C  Color of circle or disk (default green)
'alpha', A  Transparency of disk, 1=opaque, 0=transparent (default 0.2)

Tracker

Track points in image sequence

This class assigns each new feature a unique identifier and tracks it from frame to frame until it is lost. A complete history of all tracks is maintained. During operation the image sequence is animated and the point features are overlaid along with annotation giving the unique identifier of the track.

Methods

plot          Plot all tracks
tracklengths  Length of all tracks

Properties

track    A vector of structures, one per active track.
history  A vector of track history structures with elements id and uv which is the path of the feature.

Tracker.Tracker

Create new Tracker object

T = Tracker(im, C, options) is a new tracker object. im (HxWxS) is an image sequence and C (S × 1) is a cell array of vectors of PointFeature subclass objects. The elements of the cell array are the point features for the corresponding element of the image sequence.

Options

'radius', R  Search radius for feature in next frame (default 20)
'nslots', N  Maximum number of tracks (default 800)
'thresh', T  Similarity threshold (default 0.8)
'movie', M   Write the frames as images into the folder M as with sequential filenames.

See also

PointFeature

Tracker.char

Convert to string

s = T.char() is a compact string representation of the Tracker parameters and status.

Tracker.display

Display value

T.display() displays a compact human-readable string representation of the Tracker object.

Notes

• This method is invoked implicitly at the command line when the result of an expression is a Tracker object and the command has no trailing semicolon.

See also

Tracker.char

Tracker.plot

Show feature trajectories

T.plot() overlays the tracks of all features on the current plot.

Tracker.tracklengths

Length of all tracks

T.tracklengths() is a vector containing the length of every track.
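A hypothetical end-to-end sketch; the corner detector call and its options are assumptions for illustration, not part of the Tracker documentation:

```matlab
% im is an HxWxS greyscale image sequence
for k = 1:size(im, 3)
    C{k} = icorner(im(:,:,k), 'nfeat', 200);  % PointFeature vector per frame
end
T = Tracker(im, C, 'radius', 20, 'thresh', 0.8);
T.plot();                 % overlay the track of every feature
len = T.tracklengths();   % how long each track survived
```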

Video

Class to read from local video camera

A concrete subclass of ImageSource that acquires images from a local camera.

Methods

grab   Acquire and return the next image
size   Size of image
close  Close the image source
char   Convert the object parameters to human readable string

See also

ImageSource, AxisWebCamera, Movie

Video.Video

Video camera constructor

v = Video(camera, options) is a Video object that acquires images from the local video camera specified by the string camera. If camera is '?' a list of available cameras, and their characteristics, is displayed.

Options

'uint8'          Return image with uint8 pixels (default)
'float'          Return image with float pixels
'double'         Return image with double precision pixels
'grey'           Return greyscale image
'gamma', G       Apply gamma correction with gamma=G
'scale', S       Subsample the image by S in both directions
'resolution', S  Obtain an image of size S=[W H]

Notes

• The specified 'resolution' must match one that the camera is capable of, otherwise the result is not predictable.

Video.char

Convert to string

V.char() is a string representing the state of the camera object in human readable form.

Video.close

Close the image source

V.close() closes the connection to the camera.

Video.grab

Acquire image from the camera

im = V.grab() acquires an image from the camera.

Notes

• the function will block until the next frame is acquired.

about

Compact display of variable type

about(x) displays a compact line that describes the class and dimensions of x.

about x as above but this is the command rather than functional form.

See also

whos

anaglyph

Convert stereo images to an anaglyph image

a = anaglyph(left, right) is an anaglyph image where the two images of a stereo pair are combined into a single image by coding them in two different colors. By default the left image is red, and the right image is cyan.

a = anaglyph(left, right, color) as above but the string color describes the color coding as a string with 2 letters, the first for left, the second for right, and each is one of:

'r'  red
'g'  green
'b'  blue
'c'  cyan
'm'  magenta

a = anaglyph(left, right, color, disp) as above but allows for disparity correction. If disp is positive the disparity is increased; if negative it is reduced. These adjustments are achieved by trimming the images. Use this option to make the images more natural/comfortable to view, useful if the images were captured with a non-human stereo baseline or field of view.

See also

stdisp

angdiff

Difference of two angles

d = angdiff(th1, th2) returns the difference between angles th1 and th2 on the circle. The result is in the interval [-pi pi). If th1 is a column vector, and th2 a scalar, then return a column vector where th2 is modulo subtracted from the corresponding elements of th1.

d = angdiff(th) returns the equivalent angle to th in the interval [-pi pi).
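For example, angles that straddle the ±pi wrap give a small difference rather than one close to 2*pi (a sketch; the values are purely illustrative):

```matlab
d = angdiff(-pi+0.1, pi-0.1);  % approximately 0.2, not -2*pi+0.2
```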

blackbody

Compute blackbody emission spectrum

E = blackbody(lambda, T) is the blackbody radiation power density [W/m3] at the wavelength lambda [m] and temperature T [K]. If lambda is a column vector, then E is a column vector whose elements correspond to those in lambda. For example:

l = [380:10:700]'*1e-9;   % visible spectrum
e = blackbody(l, 6500);   % emission of sun
plot(l, e)

boundmatch

Match boundary profiles

x = boundmatch(R1, r2) is the correlation of the two boundary profiles R1 and r2. Each is an N × 1 vector of distances from the centroid of an object to points on its perimeter at equal angular increments. x is also N × 1 and is a correlation whose peak indicates the relative orientation of one profile with respect to the other.

[x,s] = boundmatch(R1, r2) as above but also returns the relative scale s which is the size of object 2 with respect to object 1.

See also

RegionFeature.boundary, xcorr

bresenham

Generate a line

p = bresenham(x1, y1, x2, y2) is a list of integer coordinates for points lying on the line segment (x1,y1) to (x2,y2). Endpoints must be integer.

p = bresenham(p1, p2) as above but p1=[x1,y1] and p2=[x2,y2].
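A minimal sketch of bresenham for the endpoint form documented above (coordinates are illustrative):

```matlab
p = bresenham(1, 1, 5, 3);    % integer points along the segment from (1,1) to (5,3)
plot(p(:,1), p(:,2), 'o')     % each row of p is one point on the line
```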

[C.CHAPTER 2. See also CentralCamera ccdresponse CCD spectral response R = ccdresponse(lambda) is the spectral response of a typical silicon imaging sensor at the wavelength lambda.E] = camcald(d) as above but E is the maximum residual error after back substitution [pixels]. Reference An ancient Fairchild data book for a sensor with no IR ﬁlter ﬁtted.Z) is the coordinate of a world point and [U. If lambda is a vector then R is a vector of the same length whose elements are the response at the corresponding element of lambda. Machine Vision Toolbox for MATLAB R 91 Copyright c Peter Corke 2011 . FUNCTIONS AND CLASSES See also icanvas camcald Camera calibration from data points C = camcald(d) is the camera matrix (3 × 4) determined by least squares from corresponding world and image-plane points. The response is normalized in the range 0 to 1.Y. Notes: • This method cannot handle lense distortion.V] is the corresponding image plane coordinate. d is a table of points with rows of the form [X Y Z U V] where (X.

See also

rluminos

cie_primaries

Define CIE primary colors

p = CIE_PRIMARIES() is a 3-vector with the wavelengths [m] of the CIE 1976 red, green and blue primaries respectively.

circle

Compute points on a circle

circle(C, R, opt) plots a circle centred at C with radius R.

x = circle(C, R, opt) returns an N × 2 matrix whose rows define the coordinates [x,y] of points around the circumference of a circle centred at C and of radius R.

C is normally 2 × 1 but if 3 × 1 then the circle is embedded in 3D, and x is N × 3, but the circle is always in the xy-plane with a z-coordinate of C(3).

Options

'n', N  Specify the number of points (default 50)

closest

Find closest points in N-dimensional space

k = closest(a, b) is the correspondence for N-dimensional point sets a (N × NA) and b (N × NB). k (1 × NA) is such that the element J = k(I), that is, the I'th column of a is closest to the Jth column of b.

[k,d1] = closest(a, b) as above and d1(I)=|a(I)-b(J)| is the distance of the closest point.
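A short sketch combining the two functions above; the point sets are random and purely illustrative:

```matlab
x = circle([1 1]', 2, 'n', 20);  % 20 points on a circle of radius 2 centred at (1,1)
a = rand(2, 10); b = rand(2, 15);
k = closest(a, b);               % k(I) is the column of b closest to a(:,I)
```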

[k,d1,d2] = closest(a, b) as above but also returns the distance to the second closest point.

Notes

• Is a MEX file.

See also

distance

cmfrgb

Color matching function

The color matching function is the tristimulus required to match a particular wavelength excitation.

rgb = cmfrgb(lambda) is the CIE color matching function (N × 3) for illumination at wavelength lambda (N × 1) [m]. If lambda is a vector then each row of rgb is the color matching function of the corresponding element of lambda.

rgb = cmfrgb(lambda, E) is the CIE color matching (1 × 3) function for an illumination spectrum defined by intensity E (N × 1) and wavelength lambda (N × 1) [m].

Notes

• Data from http://cvrl.ioo.ucl.ac.uk
• The Stiles & Burch 2-deg CMFs are based on measurements made on 10 observers. The data are referred to as pilot data, but probably represent the best estimate of the 2 deg CMFs, since, unlike the CIE 2 deg functions (which were reconstructed from chromaticity data), they were measured directly.
• These CMFs differ slightly from those of Stiles & Burch (1955). As noted in footnote a on p. 335 of Table 1(5.5.3) of Wyszecki & Stiles (1982), the CMFs have been "corrected in accordance with instructions given by Stiles & Burch (1959)" and renormalized to primaries at 15500 (645.16), 19000 (526.32), and 22500 (444.44) cm-1.
• From Table I(5.5.3) of Wyszecki & Stiles (1982). Table 1(5.5.3) of Wyszecki & Stiles (1982) gives the Stiles & Burch functions in 250 cm-1 steps, while Table I(5.5.3) gives them in interpolated 1 nm steps.

See also

ccxyz

cmfxyz

Color matching function

The color matching function is the XYZ tristimulus required to match a particular wavelength excitation.

xyz = cmfxyz(lambda) is the CIE xyz color matching function (N × 3) for illumination at wavelength lambda (N × 1) [m]. If lambda is a vector then each row of xyz is the color matching function of the corresponding element of lambda.

xyz = cmfxyz(lambda, E) is the CIE xyz color matching (1 × 3) function for an illumination spectrum defined by intensity E (N × 1) and wavelength lambda (N × 1) [m].

Note

• CIE 1931 2-deg xyz CMFs from cvrl.ioo.ucl.ac.uk

See also

cmfrgb, ccxyz

col2im

Convert pixel vector to image

out = col2im(pix, imsize) is an image (HxWxP) comprising the pixel values in pix (N × P) with one row per pixel where N=H × W. imsize is a 2-vector (N,M).

out = col2im(pix, im) as above but the dimensions of out are the same as im.

Notes

• The number of rows in pix must match the product of the elements of imsize.
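As an illustrative sketch, the xyz CMFs above can be used to trace the spectral locus on the chromaticity plane:

```matlab
lambda = [400:5:700]'*1e-9;      % visible wavelengths [m]
xyz = cmfxyz(lambda);            % one tristimulus row per wavelength
s = sum(xyz, 2);
plot(xyz(:,1)./s, xyz(:,2)./s)   % x-y chromaticity of the spectral colors
```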

See also

im2col

colnorm

Column-wise norm of a matrix

cn = colnorm(a) returns an M × 1 vector of the norms of each column of the matrix a which is N × M.

colordistance

Colorspace distance

d = colordistance(im, rg) is the Euclidean distance on the rg-chromaticity plane from coordinate rg=[r,g] to every pixel in the color image im. d is an image with the same dimensions as im and the value of each pixel is the color space distance of the corresponding pixel in im.

Notes

• The output image could be thresholded to determine color similarity.
• Note that Euclidean distance in the rg-chromaticity space does not correspond well with human perception of color differences. Perceptually uniform spaces such as Lab remedy this problem.

See also

colorspace

colorize

Colorize a greyscale image

out = colorize(im, mask, color) is a color image where each pixel in out is set to the corresponding element of the greyscale image im or a specified color according to whether the corresponding value of mask is true or false respectively. The color is specified as a 3-vector (R,G,B).

out = colorize(im, func, color) as above but the mask is the return value of the function handle func applied to the image im, which returns a per-pixel logical result, eg. @isnan.

Examples

Display image with values < 100 in blue

out = colorize(im, im<100, [0 0 1])

Display image with NaN values shown in red

out = colorize(im, @isnan, [1 0 0])

Notes

• With no output arguments the image is displayed.

See also

imono, icolor, ipixswitch

colorkmeans

Color image segmentation by clustering

L = colorkmeans(im, k, options) is a segmentation of the color image im into k classes. A k-means clustering of the chromaticity of all input pixels is performed. The label image L has the same row and column dimension as im and each pixel has a value in the range 0 to k-1 which indicates which cluster the corresponding pixel belongs to.

[L,C] = colorkmeans(im, k) as above but also returns the cluster centres C (k × 2) where the I'th row is the rg-chromaticity of the I'th cluster and corresponds to the label I.

[L,C,R] = colorkmeans(im, k) as above but also returns the residual R, the root mean square error of all pixel chromaticities with respect to their cluster centre.

L = colorkmeans(im, C) is a segmentation of the color image im into k classes which are defined by the cluster centres C (k × 2) in chromaticity space. Pixels are assigned to the closest (Euclidean) centre. Since cluster centres are provided the k-means segmentation step is not required.

Options

Various options are possible to choose the initial cluster centres for k-means:

'random'  randomly choose k points from the input chromaticities
'spread'  randomly choose k values within the rectangle spanned by the input chromaticities
'pick'    interactively pick cluster centres

Notes

• The k-means clustering algorithm used in the first three forms is computationally expensive and time consuming.
• Clustering is performed in rg-chromaticity space.
• The residual is an indication of quality of fit; low is good.

colorname

Map between color names and RGB values

rgb = colorname(name) is the rgb-tristimulus value corresponding to the color specified by the string name.

name = colorname(rgb) is a string giving the name of the color that is closest (Euclidean) to the given rgb-tristimulus value.

XYZ = colorname(name, 'xy') is the XYZ-tristimulus value corresponding to the color specified by the string name.

name = colorname(XYZ, 'xy') is a string giving the name of the color that is closest (Euclidean) to the given XYZ-tristimulus value.

Notes

• Color name may contain a wildcard, eg. "?burnt"
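A hypothetical usage sketch of colorkmeans; the image file name and the choice of k are assumptions for illustration:

```matlab
im = iread('flowers1.png', 'double');   % hypothetical color image
[L, C, R] = colorkmeans(im, 4);         % 4 chromaticity clusters
idisp(L)                                % display the label image; R indicates fit quality
```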

• Based on the standard X11 color database rgb.txt.
• Tristimulus values are in the range 0 to 1

colorspace

Color space conversion of image

out = colorspace(s, im) converts the image im to a different color space according to the string s which specifies the source and destination color spaces, s = 'dest<-src', or alternatively, s = 'src->dest'. Input and output images have 3 planes.

[o1,o2,o3] = colorspace(s, im) as above but specifies separate output channels or planes.

colorspace(s, i1,i2,i3) as above but specifies separate input channels.

Supported color spaces are:

'RGB'              R'G'B' Red Green Blue (ITU-R BT.709 gamma-corrected)
'YPbPr'            Luma (ITU-R BT.601) + Chroma
'YCbCr'/'YCC'      Luma + Chroma ("digitized" version of Y'PbPr)
'YUV'              NTSC PAL Y'UV Luma + Chroma
'YIQ'              NTSC Y'IQ Luma + Chroma
'YDbDr'            SECAM Y'DbDr Luma + Chroma
'JPEGYCbCr'        JPEG-Y'CbCr Luma + Chroma
'HSV'/'HSB'        Hue Saturation Value/Brightness
'HSL'/'HLS'/'HSI'  Hue Saturation Luminance/Intensity
'XYZ'              CIE XYZ
'Lab'              CIE L*a*b* (CIELAB)
'Luv'              CIE L*u*v* (CIELUV)
'Lch'              CIE L*ch (CIELCH)

Notes

• RGB input is assumed to be gamma encoded
• RGB output is gamma encoded
• All conversions assume 2 degree observer and D65 illuminant.
• Color space names are case insensitive.
• When R'G'B' is the source or destination, it can be omitted. For example 'yuv<' is short for 'yuv<-rgb'.
• MATLAB uses two standard data formats for R'G'B': double data with intensities in the range 0 to 1, and uint8 data with integer-valued intensities from 0 to 255. As MATLAB's native datatype, double data is the natural choice, and

the R'G'B' format used by colorspace. However, for memory and computational performance, some functions also operate with uint8 R'G'B'. Given uint8 R'G'B' color data, colorspace will first cast it to double R'G'B' before processing.

• If im is an M × 3 array, like a colormap, out will also have size M × 3.

Author

Pascal Getreuer 2005-2006

diff2

Two-point difference

diff2(v) computes the 2-point difference for each point in the vector v.

distance

Euclidean distances between sets of points

d = distance(a, b) is the Euclidean distances between L-dimensional points described by the matrices a (L × M) and b (L × N) respectively. The distance d is M × N and element d(I,J) is the distance between points a(I) and b(J).

Example

A = rand(400,100);
B = rand(400,200);
d = distance(A,B);

Notes

• This is fully vectorized (VERY FAST!)
• It computes the Euclidean distance between two vectors by:

||A-B|| = sqrt ( ||A||^2 + ||B||^2 - 2*A.B )

Author

Roland Bunschoten, University of Amsterdam, Intelligent Autonomous Systems (IAS) group, Kruislaan 403, 1098 SJ Amsterdam, tel. (+31)20-5257524, bunschot@wins.uva.nl. Last Rev: Oct 29 16:35:48 MET DST 1999. Tested: PC Matlab v5.2 and Solaris Matlab v5.3. Thanx: Nikos Vlassis.

See also

closest

e2h

Euclidean to homogeneous

edgelist

Return list of edge pixels for region

E = edgelist(im, seed) returns the list of edge pixels of a region in the image im starting at edge coordinate seed (i,j). The result E is a matrix, each row is one edge point coordinate (x,y).

E = edgelist(im, seed, direction) returns the list of edge pixels as above, but the direction of edge following is specified. direction == 0 (default) means clockwise, non zero is counter-clockwise. Note that direction is with respect to y-axis upward, in matrix coordinate frame, not image frame.

Notes

• im is a binary image where 0 is assumed to be background, non-zero is an object.
• seed must be a point on the edge of the region.
• The seed point is always the first element of the returned edgelist.
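A minimal sketch of edge following on a synthetic binary image; the blob and the seed coordinate (here in the (i,j) ordering stated above) are assumptions for illustration:

```matlab
im = zeros(100, 100);
im(30:60, 20:80) = 1;          % a rectangular blob
E = edgelist(im, [30 20]);     % seed pixel on the blob's boundary
plot(E(:,1), E(:,2), '.')      % each row of E is an (x,y) edge point
```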

See also

ilabel

epidist

Distance of point from epipolar line

d = epidist(f, p1, p2) is the distance of the points p2 (2 × M) from the epipolar lines due to points p1 (2 × N) where f (3 × 3) is a fundamental matrix relating the views containing image points p1 and p2.

d (N × M) is the distance matrix where element d(i,j) is the distance from the point p2(j) to the epipolar line due to point p1(i).

Author

Based on fmatrix code by Nuno Alexandre Cid Martins, I.S.R., Coimbra, Oct 27, 1998.

See also

epiline, fmatrix

epiline

Draw epipolar lines

epiline(f, p) draws epipolar lines in the current figure based on points p (2 × N) and the fundamental matrix f (3 × 3). Points are specified by the columns of p.

epiline(f, p, ls) as above but draws lines using the line style arguments ls.

H = epiline(f, p, ls) as above but returns a vector of graphic handles, one per line drawn.

See also

fmatrix, epidist

fmatrix

Estimate fundamental matrix

f = fmatrix(p1, p2, options) is the fundamental matrix (3 × 3) that relates two sets of corresponding points p1 (2 × N) and p2 (2 × N) from two different camera views.

Notes

• The points must be corresponding, no outlier rejection is performed.
• Contains a RANSAC driver, which means it can be passed to ransac().
• f is a rank 2 matrix, that is, it is singular.

Reference

Hartley and Zisserman, 'Multiple View Geometry in Computer Vision', page 270.

Author

Based on fundamental matrix code by Peter Kovesi, School of Computer Science & Software Engineering, The University of Western Australia, http://www.csse.uwa.edu.au/.

See also

ransac, homography, epiline, epidist

gauss2d

Gaussian kernel

k = gauss2d(im, sigma) returns a unit volume Gaussian smoothing kernel. The Gaussian has a standard deviation of sigma, and the convolution kernel has a half size of w, that is, k is (2W+1) x (2W+1). If w is not specified it defaults to 2*sigma.

h2e
Homogeneous to Euclidean

hitormiss
Hit or miss transform

H = hitormiss(im, se) is the hit-or-miss transform of the binary image im with the structuring element se. Unlike standard morphological operations, se has three possible values: 0, 1 and don't care (represented by NaN).

See also imorph, ithin, itriplepoint, iendpoint

homline
Homogeneous line from two points

L = homline(x1, y1, x2, y2) returns a 3 × 1 vector which describes a line in homogeneous form that contains the two Euclidean points (x1,y1) and (x2,y2).

Homogeneous points X (3 × 1) on the line must satisfy L'*X = 0.

See also plot_homline
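The defining property L'*X = 0 can be checked directly; a minimal sketch:

```matlab
% Sketch only: the homogeneous line through (1,2) and (3,4), and a check
% that both points (in homogeneous form) lie on it.
L = homline(1, 2, 3, 4);
L' * [1; 2; 1]   % should be (close to) zero
L' * [3; 4; 1]   % should be (close to) zero
```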

homography
Estimate homography

H = homography(p1, p2) is the homography (3 × 3) that relates two sets of corresponding points p1 (2 × N) and p2 (2 × N) from two different camera views of a planar object.

Notes
• The points must be corresponding, no outlier rejection is performed.
• The points must be projections of points lying on a world plane.
• Contains a RANSAC driver, which means it can be passed to ransac().

Author
Based on homography code by Peter Kovesi, School of Computer Science & Software Engineering, The University of Western Australia, http://www.csse.uwa.edu.au/.

See also ransac, invhomog, fmatrix

homtrans
Apply a homogeneous transformation

p2 = homtrans(T, p) applies homogeneous transformation T to the points stored columnwise in p.
• If T is in SE(2) (3 × 3) and
  – p is 2 × N (2D points) they are considered Euclidean (R2)
  – p is 3 × N (2D points) they are considered projective (P2)
• If T is in SE(3) (4 × 4) and
  – p is 3 × N (3D points) they are considered Euclidean (R3)
  – p is 4 × N (3D points) they are considered projective (P3)

tp = homtrans(T, T1) applies homogeneous transformation T to the homogeneous transformation T1, that is tp = T*T1. If T1 is a 3-dimensional transformation then T is applied to each plane as defined by the first two dimensions, i.e. if T is N × N and T1 is NxNxP then the result is NxNxP.

See also e2h, h2e

homwarp
Warp image by an homography

out = homwarp(H, im, options) is a warp of the image im obtained by applying the homography H to the coordinates of every input pixel.

[out,offs] = homwarp(H, im, options) as above but offs is the offset of the warped tile out with respect to the origin of im.

Options
'full'          output image contains all the warped pixels, but its position with respect to the input image is given by the second return value offs
'extrapval', V  set unmapped pixels to this value
'roi', R        output image contains the specified ROI in the input image
'scale', S      scale the output by this factor
'dimension', D  ensure output image is D × D
'size', S       size of output image S=[W,H]

Notes
• The edges of the resulting output image will in general not be vertical and horizontal lines.

See also homography, itrim, interp2
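A minimal sketch of homtrans on Euclidean 2D points, using a hand-built SE(2) matrix (the numbers are illustrative):

```matlab
% Sketch only: rotate 2D points by 90 degrees and translate by (1,2)
% using a 3x3 homogeneous transform applied columnwise.
T = [0 -1 1; 1 0 2; 0 0 1];   % rotation pi/2, translation (1,2)
p = [0 1; 0 0];               % two Euclidean points, columnwise (2xN)
q = homtrans(T, p);           % transformed points, also 2xN
```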

humoments
Hu moments

phi = humoments(im) is the vector (7 × 1) of Hu moment invariants for the binary image im.

Notes
• im is assumed to be a binary image of a single connected region.

Reference
M-K. Hu, "Visual pattern recognition by moment invariants", IRE Trans. on Information Theory, IT-8, pp. 179-187, 1962.

See also npq

ianimate
Display an image sequence

ianimate(im, options) displays a greyscale image sequence im (HxWxN) where N is the number of frames in the sequence.

ianimate(im, features, options) displays a greyscale image sequence im with point features overlaid. features (N × 1) is a cell array whose elements are vectors of feature objects. The feature is plotted using the object's plot method and additional options are passed through to that method.

Examples
Animate image sequence:
  ianimate(seq);
Animate image sequence with overlaid corner features:
  c = icorner(im, 'nfeat', 200);  % compute corners
  ianimate(seq, c, 'gs');         % features shown as green squares

Options
'fps', F      set the frame rate (default 5 frames/sec)
'loop'        endlessly loop over the sequence
'movie', M    save the animation as a series of PNG frames in the folder M
'npoints', N  plot no more than N features per frame (default 100)
'only', I     display only the I'th frame from the sequence

See also PointFeature, iharris, isurf, idisp

ibbox
Find bounding box

box = ibbox(p) is the minimal bounding box that contains the points described by the columns of p (2 × N).

box = ibbox(im) is the minimal bounding box that contains the non-zero pixels in the image im.

The bounding box is a 2 × 2 matrix [XMIN XMAX; YMIN YMAX].

iblobs
features

f = iblobs(im, options) is a vector of RegionFeature objects that describe each connected region in the image im.

Options
'aspect', A       set pixel aspect ratio, default 1.0
'connect', C      set connectivity, 4 (default) or 8
'greyscale'       compute greyscale moments 0 (default) or 1
'boundary'        compute boundary (default off)
'area', [A1,A2]   accept only blobs with area in the interval A1 to A2
'shape', [S1,S2]  accept only blobs with shape in the interval S1 to S2
'touch'           ignore blobs that touch the edge (default accept)
'class', C        accept only blobs of pixel value C (default all)

The RegionFeature object has many properties including:
uc           centroid, horizontal coordinate
vc           centroid, vertical coordinate
umin         bounding box, minimum horizontal coordinate
umax         bounding box, maximum horizontal coordinate
vmin         bounding box, minimum vertical coordinate
vmax         bounding box, maximum vertical coordinate
area         the number of pixels
class        the value of the pixels forming this region
label        the label assigned to this region
children     a list of indices of features that are children of this feature
edgepoint    coordinate of a point on the perimeter
edge         a list of edge points, a 2 × N matrix
perimeter    number of edge pixels
touch        true if region touches edge of the image
a            major axis length of equivalent ellipse
b            minor axis length of equivalent ellipse
theta        angle of major ellipse axis to horizontal axis
shape        aspect ratio b/a (always <= 1.0)
circularity  1 for a circle, less for other shapes
moments      a structure containing moments of order 0 to 2

See also RegionFeature, ilabel, imoments

icanny
edge detection

E = icanny(im, options) returns an edge image using the Canny edge detector. The edges within im are marked by non-zero values in E, and larger values correspond to stronger edges.

Options
'sd', S   set the standard deviation for smoothing (default 1)
'th0', T  set the lower hysteresis threshold (default 0.1 x strongest edge)
'th1', T  set the upper hysteresis threshold (default 0.5 x strongest edge)

Author
Oded Comay, Tel Aviv University, 1996-7.

See also isobel, kdgauss

iclose
closing

out = iclose(im, se) is the image im after morphological closing with the structuring element se. This is a dilation followed by an erosion.

out = iclose(im, se, n) as above but the structuring element se is applied n times, that is, n dilations followed by n erosions.

Notes
• It is cheaper to apply a smaller structuring element multiple times than one large one, since the effective structuring element is the Minkowski sum of the structuring element with itself n times.

See also iopen, imorph
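A minimal sketch of morphological closing, assuming b is a binary image with small holes to fill:

```matlab
% Sketch only: fill small holes in an assumed binary image b by closing
% with a 5x5 square structuring element.
se  = ones(5, 5);
out = iclose(b, se);
```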

icolor
Colorize a greyscale image

C = icolor(im) is a color image C (HxWx3) where each color plane is equal to im (H × W).

C = icolor(im, color) as above but each output pixel is color (3 × 1) times the corresponding element of im.

Examples
Create a color image that looks the same as the greyscale image:
  c = icolor(im);
Create an aqua tinted version of the greyscale image:
  c = icolor(im, [0 1 1]);

See also imono, colorize, ipixswitch

iconcat
Concatenate images

C = iconcat(im, options) concatenates images from the cell array im. The images do not have to be of the same size, and smaller images are surrounded by background pixels which can be specified.

iconcat(im, options) as above but displays the concatenated images using idisp.

[C,u] = iconcat(im, options) as above but also returns the vector u whose elements are the coordinates of the left (or top in vertical mode) edge of the corresponding image.

Options
'dir', D    direction of concatenation: 'horizontal' (default) or 'vertical'
'bgval', B  value of unset background pixels

Notes
• Works for color or greyscale images.
• Direction can be abbreviated to the first character, 'h' or 'v'.
• In vertical mode all images are right justified.
• In horizontal mode all images are top justified.

See also idisp

iconv
Image convolution

C = iconv(im1, im2, options) convolves im1 with im2. The smaller image is taken as the kernel and convolved with the larger image. If the larger image is color (has multiple planes) the kernel is applied to each plane, resulting in an output image with the same number of planes.

Options
'same'   output image is same size as the largest input image (default)
'full'   output image is larger than the input image
'valid'  output image is smaller than the input image, and contains only valid pixels

Notes
• This function is a convenience wrapper for the builtin function CONV2.

See also conv2
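A minimal smoothing sketch; the kernel constructor kgauss is assumed to be available in the Toolbox, but any 2D kernel matrix would do:

```matlab
% Sketch only: Gaussian smoothing by convolving an assumed image im
% with a Gaussian kernel of standard deviation 2.
k      = kgauss(2);
smooth = iconv(im, k);   % same size as im by default
```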

icorner
Corner detector

f = icorner(im, options) is a vector of PointFeature objects describing detected corner features. This is a non-scale space detector and by default the Harris method is used. If im is an image sequence a cell array of PointFeature vectors is returned.

The PointFeature object has many properties including:
u           horizontal coordinate
v           vertical coordinate
strength    corner strength
descriptor  corner descriptor (vector)

Options
'cmin', CM       minimum corner strength
'cminthresh', CT minimum corner strength as a fraction of maximum corner strength
'edgegap', E     don't return features closer than E to the edge of image (default 2)
'suppress', R    don't return a feature closer than R pixels to an earlier feature (default 0)
'nfeat', N       return the N strongest corners (default Inf)
'detector', D    choose the detector where D is one of 'harris' (default), 'noble' or 'klt'
'sigma', S       kernel width for smoothing (default 2)
'deriv', D       kernel for gradient (default kdgauss(2))
'k', K           set the value of k for the Harris detector
'patch', P       use a P × P patch of surrounding pixel values as the feature vector. The vector has zero mean and unit norm.
'color'          specify that im is a color image not a sequence

Notes
• Corners are processed in order from strongest to weakest.
• The function stops when:
  – the corner strength drops below cmin
  – the corner strength drops below cMinThresh x strongest corner
  – the list of corners is exhausted
• Features are returned in descending strength order.
• If im has more than 2 dimensions it is either a color image or a sequence.
• If im is NxMxP it is taken as an image sequence and f is a cell array whose elements are feature vectors for the corresponding image in the sequence.
• If im is NxMx3 it is taken as a sequence unless the option 'color' is given.

• If im is NxMx3xP it is taken as a sequence of color images and f is a cell array whose elements are feature vectors for the corresponding color image in the sequence.
• The default descriptor is a vector [Ix* Iy* Ixy*] which are the unique elements of the structure tensor, where * denotes squared and smoothed.
• The descriptor is a vector of float types to save space.

References
• "A combined corner and edge detector", C.G. Harris and M.J. Stephens, Proc. Fourth Alvey Vision Conf., Manchester, pp. 147-151, 1988.
• "Finding corners", J.A. Noble, Image and Vision Computing, vol. 6, pp. 121-128, May 1988.
• "Good features to track", J. Shi and C. Tomasi, Proc. Computer Vision and Pattern Recognition, pp. 593-600, IEEE Computer Society, 1994.

See also PointFeature, isurf

icp
Point cloud alignment

T = icp(p1, p2, options) is the homogeneous transformation that best transforms the set of points p2 to p1 using the iterative closest point algorithm.

[T,d] = icp(p1, p2, options) as above but also returns the norm of the error between the transformed point set p2 and p1.

Options
'plot'           show the points p1 and p2 at each iteration, with a delay of 0.5 [sec]
'dplot', d       show the points p1 and p2 at each iteration, with a delay of d [sec]
'maxtheta', T    limit the change in rotation at each step to T (default 0.05 rad)
'maxiter', N     stop after N iterations (default 100)
'mindelta', T    stop when the relative change in error norm is less than T (default 0.001)
'distthresh', T  eliminate correspondences more than T x the median distance at each iteration

Notes
• Points can be 2- or 3-dimensional.
• For noisy data setting distthresh and maxtheta can help to prevent the solution from diverging.

Reference
"A method for registration of 3D shapes", P. Besl and H. McKay, IEEE Trans. Pattern Anal. Mach. Intell., vol. 14, no. 2, pp. 239-256, Feb. 1992.

idecimate
Decimate an image

s = idecimate(im, m) is a decimated version of the image im whose size is reduced by m (an integer) in both dimensions. The image is smoothed with a Gaussian kernel with standard deviation m/2 then subsampled.

s = idecimate(im, m, sd) as above but the standard deviation of the smoothing kernel is set to sd.

s = idecimate(im, m, []) as above but no smoothing is applied prior to decimation.

Notes
• If the image has multiple planes, each plane is decimated.
• Smoothing is used to eliminate aliasing artifacts and the standard deviation should be chosen as a function of the maximum spatial frequency in the image.

See also iscale, ismooth
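A minimal decimation sketch, assuming im is a greyscale image:

```matlab
% Sketch only: reduce an assumed image to quarter size in each dimension,
% with default anti-aliasing smoothing before each subsampling.
small = idecimate(im, 4);
% equivalently, halve twice:
small2 = idecimate(idecimate(im, 2), 2);
```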

idisp
image display tool

idisp(im, options) displays an image and allows interactive investigation of pixel value, linear profile, histogram and zooming. The image is displayed in a figure with a toolbar across the top. If im is a cell array of images, they are first concatenated (horizontally).

Options
'ncolors', N   number of colors in the color map (default 256)
'nogui'        display the image without the GUI
'noaxes'       no axes on the image
'noframe'      no axes or frame on the image
'plain'        no axes, frame or GUI
'bar'          add a color bar to the image
'print', F     write the image to file F in EPS format
'square'       display aspect ratio so that pixels are square
'wide'         make figure very wide, useful for displaying a stereo pair
'flatten'      display image planes (colors or sequence) as horizontally adjacent images
'ynormal'      y-axis increases upward
'cscale', C    C is a 2-vector that specifies the grey value range that spans the colormap
'xydata', XY   XY is a cell array whose elements are vectors that span the x- and y-axes respectively
'colormap', C  set colormap C (N × 3)
'grey'         color map: greyscale unsigned, zero is black, maximum value is white
'invert'       color map: greyscale unsigned, zero is white, maximum value is black
'signed'       color map: greyscale signed, positive is blue, negative is red, zero is black
'invsigned'    color map: greyscale signed, positive is blue, negative is red, zero is white
'random'       color map: random values, highlights fine structure
'dark'         color map: greyscale unsigned, darker than 'grey', good for superimposed graphics

Notes
• Color images are displayed in true color mode: pixel triples map to display pixels.
• Grey scale images are displayed in indexed mode: the image pixel value is mapped through the color map to determine the display pixel value.
• The minimum and maximum image values are mapped to the first and last element of the color map, which by default ('greyscale') is the range black to white.

User interface
• Left clicking on a pixel will display its value in a box at the top.
• The "histo" button displays a histogram of the pixel values in a new figure. If the image is zoomed, the histogram is computed over only those pixels in view.
• The "zoom" button requires a left-click and drag to specify a box which defines the zoomed view.
• The "line" button allows two points to be specified and a new figure displays intensity along a line between those points.

See also image, caxis, colormap, iconcat

idisplabel
Display an image with mask

idisplabel(im, labelimage, labels) displays only those image pixels which belong to a specific class. im is a greyscale N × M or color NxMx3 image, and labelimage is an N × M image containing integer pixel class labels for the corresponding pixels in im. The pixel classes to be displayed are given by the elements of labels which is a scalar or a vector of class labels. Non-selected pixels are displayed as white.

idisplabel(im, labelimage, labels, bg) as above but the grey level of the non-selected pixels is specified by bg in the range 0 to 1.

See also iblobs, icolorize, colorseg

idouble
Convert integer image to double

imd = idouble(im) returns an image with double precision elements in the range 0 to 1. The integer pixels are assumed to span the range 0 to the maximum value of their integer class.

See also iint
which by default (’greyscale’) is the range black to white. im is a greyscale N × M or color NxMx3 image.CHAPTER 2. labelimage. The pixel classes to be displayed are given by the elements of labels which is a scalar a vector of class labels. icolorize. FUNCTIONS AND CLASSES • The minimum and maximum image values are mapped to the ﬁrst and last element of the color map. colormap. See also image. idisplabel(im. colorseg idouble Convert integer image to double imd = idouble(im) returns an image with double precision elements in the range 0 to 1. See also iblobs. labelimage. iconcat idisplabel Display an image with mask idisplabel(im. bg) as above but the grey level of the non-selected pixels is speciﬁed by bg in the range 0 to 1. and labelimage is an N × M image containing integer pixel class labels for the corresponding pixels in im. labels) displays only those image pixels which belong to a speciﬁc class. caxis. The integer pixels are assumed to span the range 0 to the maximum value of their integer class. Non-selected pixels are displayed as white. See also iint Machine Vision Toolbox for MATLAB 116 R Copyright c Peter Corke 2011 . labels.

2. hitormiss igamma correction out = igamma(im.CHAPTER 2. • For images of type int the pixels are assumed in the range 0 to max integer value. ithin. ‘sRGB’) is a gamma decoded version of im using the sRGB decoding function (JPEG images sRGB encoded). See also itriplepoint. out = igamma(im. • For images of type double the pixels are assumed to be in the range 0 to 1. Machine Vision Toolbox for MATLAB 117 R Copyright c Peter Corke 2011 . FUNCTIONS AND CLASSES iendpoint Find end points in a binary skeleton image out = iendpoint(im) is a binary image where pixels are set if the corresponding pixel in the binary image im is the end point of a single-pixel wide line such as found in an image skeleton. All pixels are raised to the power gamma. gamma) is a gamma corrected version of im. • Gamma decoding is typically performed in the display with gamma=2. Notes • Gamma encoding is typically performed in a camera with gamma=0. Gamma encoding can be performed with gamma > 1 and decoding with gamma < 1. Computed using the hit-or-miss morphological operator.45. • For images with multiple planes the gamma correction is applied to all planes.

5).jpg’). 2006. 0. 1500. 100.CHAPTER 2. k. Int.m] = igraphseg(im. 167181. 59. FUNCTIONS AND CLASSES igraphseg Graph-based segmentation L = igraphseg(im. [l. L = igraphseg(im. Notes • Is a MEX ﬁle Author Pedro Felzenszwalb. 2004. Example im = iread(’58060. idisp(im) Reference “Efﬁcient graph-based image segmentation”. sigma) as above but m is the number of regions found. Felzenszwalb and D. vol. [L. Journal on Computer Vision. k is the scale parameter. k. L is an image of the same size as im where each element is the label assigned to the corresponding pixel in im. min is the minimum region size (pixels).5). pp.m] = igraphseg(im. imser Machine Vision Toolbox for MATLAB 118 R Copyright c Peter Corke 2011 . min. min) is a graph-based segmentation of the greyscale or color image im. P. min. and a larger value indicates a preference for larger regions. sigma) as above and sigma is the width of a Gaussian which is used to initially smooth the image (default 0. Sept. k. Huttenlocher. See also ithresh.

FUNCTIONS AND CLASSES ihist Image histogram ihist(im. For an image with multiple planes the histogram of each plane is given in a separate subplot.x] = ihist(im. ’normcdf’). For an image with multiple planes H is a matrix with one column per image plane. Options ‘nbins’ ‘cdf’ ‘normcdf’ ‘sorted’ number of histogram bins (default 256) compute a cumulative histogram compute a normalized cumulative histogram histogram but with occurrence sorted in descending order Example [h. • For a uint8 image the MEX function fhist is used See also hist Machine Vision Toolbox for MATLAB 119 R Copyright c Peter Corke 2011 .CHAPTER 2. bar(x.h). options) as above but also returns the bin coordinates as a column vectors. H = ihist(im.x] = ihist(im. [h. Notes • For a uint8 image the histogram spans the greylevel range 0-255 • For a ﬂoating point image the histogram spans the greylevel range 0-1 • For ﬂoating point images all NaN and Inf values are ﬁrst removed. options) displays the image histogram. options) is the image histogram as a column vector.h).x] = ihist(im). plot(x. [H.

FUNCTIONS AND CLASSES iint Convert image to integer class out = iint(im) is an image with 8-bit unsigned integer elements in the range 0 to 255. The value of parents(I) is the label of the parent or enclosing region of region I. A value of 0 indicates that the region has no single enclosing region.CHAPTER 2. for a multilevel image it means that it touches more than one other region. See also idouble iisum Sum of integral image s = iisum(ii. [L.y2).m.y1) and bottom-right (x2. same size as im. See also intgimage ilabel Label an image L = ilabel(im) performs connectivity analysis on the image im and returns a label image L. y2. [L. for a binary image this means the region touches the edge of the image.m] = ilabel(im) as above but returns the value of the maximum label value. x1. Region labels are in the range 1 to M. where each pixel value represents the integer region label assigned to the corresponding pixel in im. y1.parents] = ilabel(im) as above but also returns region hierarchy information. ii is a precomputed integral image. The ﬂoating point pixels values in im are assumed to span the range 0 to 1. x2) is the sum of pixels in the rectangular image region deﬁned by its top-left (x1. Machine Vision Toolbox for MATLAB 120 R Copyright c Peter Corke 2011 .

v) as above but the pixel on the line are set to v. Notes • Is a MEX ﬁle. p2.maxlabel. To use 8-way connectivity pass a second argument of 8. out = iline(im.class.CHAPTER 2. • The image can be binary or multi-level • Connectivity is performed using 4 nearest neighbours by default. [L.maxlabel. eg. IBLOBS is a higher level interface.class] = ilabel(im) as above but also returns the class of pixels within each region.parents. otherwise it does not. • This is a “low level” function. The pixels on the line are set to 1. each a 2-vector [X. p1. ipaste Machine Vision Toolbox for MATLAB 121 R Copyright c Peter Corke 2011 . Notes • Uses the Bresenham algorith • Only works for greyscale images • The line looks jagged since no anti-aliasing is performed See also bresenham. imoments iline Draw a line in an image out = iline(im. FUNCTIONS AND CLASSES [L.Y]. p1. ilabel(im. See also iblobs. If edge(I) is 1 then region I touches edge of the image.edge] = ilabel(im) as above but also returns the edge-touch status of each region. p2) is a copy of the image im with a line drawn between the points p1 and p2. iproﬁle. The value of class(I) is the value of the pixels that comprise region I. 8).parents.

Notes • Useful for tracking a template in an image sequence.0 Machine Vision Toolbox for MATLAB 122 R Copyright c Peter Corke 2011 . xmax.score] = imatch(im1.y).y) and of size s. If s is a scalar the search region is [-s. s. See also col2im imatch Template matching xm = imatch(im1. The rows correspond to horizontal positions of the template. a perfect match score is 1. y. ymax] relative to (x.and y-offsets relative to (x. s] % relative to (x.CHAPTER 2. centred at (x.CC] where (DX.y). im2. -s. ymin. x.y) and its half-width is H. • ZNCC matching is used. x. • im1 and im2 must be the same size. The return value is xm=[DX. s) works as above but also returns a matrix of matching score values for each template position tested. More generally s is a 4-vector s=[xmin. y. [xm. FUNCTIONS AND CLASSES im2col Convert an image to pixel per row format out = im2col(im) returns the image (HxWxP) as a pixel vector (N ×P ) where each row is a pixel value (1 × P ). The template in im1 is centred at (x.DY) are the x. s) is the matching subimage of im1 (template) within the image im2.DY. The pixels are in image column order and there are N=W × H rows. The template is searched for within im2 inside a rectangular region. • Is a MEX ﬁle.y) and CC is the similarity score (zero-mean normalized cross correlation) for the best match in the search region. w2. im2. and columns the vertical position. H.

v. All pixels are equally weighted. The RegionFeature object has many properties including: Machine Vision Toolbox for MATLAB 123 R Copyright c Peter Corke 2011 . effectively a greyscale image.CHAPTER 2. f = imoments(u. H) as above but the domain is w × H. effectively a binary image.u) = u and v(v. FUNCTIONS AND CLASSES See also isimilarity imeshgrid Domain matrices for image [u.v] = imeshgrid(w. [u. f = imoments(u.v] = imeshgrid(size) as above but the domain is described size which is scalar size× size or a 2-vector [w H].v] = imeshgrid(im) return matrices that describe the domain of image im and can be used for the evaluation of functions over the image. w) as above but the pixels have weights given by the vector w. [u.u) = v. The element u(v. See also meshgrid imoments Image moments f = imoments(im) is a RegionFeature object that describes the greyscale moments of the image im. v) as above but the moments are computed from the pixel coordinates given as vectors u and v.

m02. horizontal coordinate centroid. FUNCTIONS AND CLASSES uc vc area a b theta shape moments centroid. • This function does not perform connectivity. Notes • For a binary image the zeroth moment is the number of non-zero pixels.CHAPTER 2. if connected blobs are required then the ILABEL function must be used ﬁrst. vertical coordinate the number of pixels major axis length of equivalent ellipse minor axis length of equivalent ellipse angle of major ellipse axis to horizontal axis aspect ratio b/a (always <= 1. Different conversion functions are supported. m20. Options ‘r601’ ‘r709’ ITU recommendation 601 (default) ITU recommendation 709 See also colorize. icolor. m01. the elements are m00. m10. See also RegionFeature. m11. options) is a greyscale equivalent to the color image im. colorspace Machine Vision Toolbox for MATLAB 124 R Copyright c Peter Corke 2011 .0) a structure containing moments of order 0 to 2. imoments imono Convert color image to monochrome out = imono(im. ilabel. or its area.

part of VLFeat (vlfeat. [label.m] = imser(im. Sept. Pajdla. m. ’grey’. Urban. FUNCTIONS AND CLASSES imorph Morphological neighbourhood processing out = imorph(im. options) is a greyscale segmentation of the image im based on maximally stable extremal regions.org). 22. Matas. The labels [L. Notes • Is a wrapper for vl mser.png’. 761767. ’light’).m] = imser(im. options) as above but m is the number of regions found. 2004. Image and Vision Computing. L is an image of the same size as im where each element is the label assigned to the corresponding pixel in im. se. vol. pp. op) is the image im after morphological processing with the operator op and structuring element se.CHAPTER 2. O. Chum. ’double’). imser Maximally stable extremal regions L = imser(im. idisp(im) Reference “Robust wide-baseline stereo from maximally stable extremal regions”. Options ‘dark’ ‘light’ looking for dark features against a light background (default) looking for light features against a dark background Example im = iread(’castle_sign2. by Andrea Vedaldi and Brian Fulkerson • vl mser is a MEX ﬁle Machine Vision Toolbox for MATLAB 125 R Copyright c Peter Corke 2011 . and T. J.

CHAPTER 2. FUNCTIONS AND CLASSES See also ithresh.ˆ2). igraphseg inormhist Histogram normalization out = inormhist(im) is a histogram normalized version of the image im. Examples Create integral images for sum of pixels over rectangular regions i = intimage(im). Create integral images for sum of pixel squared values over rectangular regions i = intimage(im. See also ihist intgimage Compute integral image out = intimage(im) is an integral image corresponding to im. • Highlights image detail in dark areas of an image. Integral images can be used for rapid computation of summations over rectangular regions. Notes • The histogram of the normalized image is approximately uniform. Machine Vision Toolbox for MATLAB 126 R Copyright c Peter Corke 2011 .

imorph ipad Pad an image with constants out = ipad(im. n) is a padded version of the image im with a block of NaN values n pixels wide on the sides of im as speciﬁed by sides which is a string containing one or more of the characters: ‘t’ top ‘b’ bottom ‘l’ left ‘r’ right out = ipad(im. Machine Vision Toolbox for MATLAB 127 R Copyright c Peter Corke 2011 . out = iopen(im. that is n erosions followed by n dilations. sides. n) as above but the structuring element se is applied n times. sides. the effective structuing element is the Minkowski sum of the structuring element with itself n times. v) as above but pads with pixels of value v. This is an erosion followed by dilation. Notes • Cheaper to apply a smaller structuring element multiple times than one large one. n. See also iclose.CHAPTER 2. se. se) is the image im after morphological opening with the structuring element se. FUNCTIONS AND CLASSES See also iisum iopen Morphological opening out = iopen(im.

im2) is an image where each pixel is selected from the corresponding pixel in im1 or im2 according to the corresponding values of mask. im1. im2. otherwise p is the top-left corner (default) the coordinates of p start at zero.CHAPTER 2. otherwise im2 is selected. If the element of mask is zero im1 is selected.V]. 20. ’t’. ’tblr’. FUNCTIONS AND CLASSES Examples Add a band of zero pixels 20 pixels high across the top of the image: ipad(im. 10. p. Machine Vision Toolbox for MATLAB 128 R Copyright c Peter Corke 2011 . 0) Add a band of white pixels 10 pixels wide on all sides of the image: ipad(im. 255) Notes • Not a tablet computer. by default 1 is assumed im2 overwrites the pixels in im (default) im2 is added to the pixels in im im2 is set to the mean of pixel values in im2 and im See also iline ipixswitch Pixelwise image merge out = ipixswitch(mask. options) is the image im with the image im2 pasted in at the position p=[U. Options ‘centre’ ‘zero’ ‘set’ ‘add’ ‘mean’ The pasted image is centred at p. ipaste Paste an image into an image out = ipaste(im.

See also bresenham.v) for the corresponding row of p. [p. FUNCTIONS AND CLASSES Notes • im1 and im2 must have the same number of rows and columns • if im1 and im2 are both greyscale then out is greyscale • if either of im1 and im2 are color then out is color See also colorize iproﬁle Extract pixels along a line v = iproﬁle(im.CHAPTER 2. p1. p2) is a vector of pixel values extracted from the image im (HxWxP) between the points p1 (2 × 1) and p2 (2 × 1). p1. out is a cell array of images each one having R Machine Vision Toolbox for MATLAB 129 Copyright c Peter Corke 2011 . p2) as above but also returns the coordinates of the pixels for each point along the line. v (N × P ) has one row for each point along the line and the row is the pixel value which will be a vector for a multi-plane image. iline ipyramid Pyramidal image decomposition out = ipyramid(im) is a pyramid decomposition of input image im using Gaussian smoothing with standard deviation of 1.uv] = iproﬁle(im. Each row of uv is the pixel coordinate (u. Notes • The Bresenham algorithm is used to ﬁnd points along the line.

ipyramid

Pyramidal image decomposition

out = ipyramid(im) is a pyramid decomposition of input image im using Gaussian smoothing with standard deviation of 1. out is a cell array of images each one having dimensions half that of the previous image. The pyramid is computed down to a non-halvable image size.

out = ipyramid(im, sigma) as above but the Gaussian standard deviation is sigma.

out = ipyramid(im, sigma, n) as above but only n levels of the pyramid are computed.

See also

iscalespace, idecimate, ismooth

irank

Rank filter

out = irank(im, order, se) is a rank filtered version of im. Only pixels corresponding to non-zero elements of the structuring element se are ranked, and the order'th value in rank becomes the corresponding output pixel value. The highest rank, the maximum, is order=1.

out = irank(image, se, op, nbins) as above but the number of histogram bins can be specified.

out = irank(image, se, op, nbins, edge) as above but the processing of edge pixels can be controlled. The value of edge is:

'border'  the border value is replicated (default)
'none'    pixels beyond the border are not included in the window
'trim'    output is not computed for pixels whose window crosses the border, hence the output image has reduced dimensions
'wrap'    the image is assumed to wrap around

Examples

5 × 5 median filter:

irank(im, 12, ones(5,5));

3 × 3 non-local maximum:

se = ones(3,3); se(2,2) = 0;
im > irank(im, 1, se);

Notes

• Works for greyscale images only.
• A histogram method is used with nbins (default 256).
• Is a MEX file.

See also

imorph, ivar, iwindow
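The rank-filter idea can be shown with a small sketch. This is an illustrative Python version, not the Toolbox MEX implementation, and it hard-codes only the 'border' (replicate) edge behaviour:

```python
def rank_filter(im, order, se):
    """Rank filter: for each pixel take the order'th largest value (order=1
    is the maximum) among the neighbours selected by non-zero entries of the
    structuring element se. Border pixels are handled by replication."""
    H, W = len(im), len(im[0])
    h2, w2 = len(se) // 2, len(se[0]) // 2
    out = [[0] * W for _ in range(H)]
    for v in range(H):
        for u in range(W):
            vals = []
            for i, row in enumerate(se):
                for j, s in enumerate(row):
                    if s:
                        # replicate border: clamp coordinates to the image
                        y = min(max(v + i - h2, 0), H - 1)
                        x = min(max(u + j - w2, 0), W - 1)
                        vals.append(im[y][x])
            vals.sort(reverse=True)
            out[v][u] = vals[order - 1]
    return out
```

With a 3 × 3 all-ones structuring element, order=1 behaves as a maximum filter and order=5 (the middle of nine values) as a median filter, which removes isolated bright pixels.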

iread

Read image from file

im = iread() presents a file selection GUI from which the user can select an image file which is returned as a 2D or 3D matrix. On subsequent calls the initial folder is as set on the last call.

im = iread(file, options) reads the specified file and returns a matrix. If the path is relative it is searched for on the MATLAB search path. Wildcards are allowed in file names. If multiple files match, a 3D or 4D image is returned where the last dimension is the number of images in the sequence.

Options

'uint8'      return an image with 8-bit unsigned integer pixels in the range 0 to 255
'single'     return an image with single precision floating point pixels in the range 0 to 1
'double'     return an image with double precision floating point pixels in the range 0 to 1
'grey'       convert image to greyscale if it's color using ITU rec 601
'grey_709'   convert image to greyscale if it's color using ITU rec 709
'gamma', G   gamma value, either numeric or 'sRGB'
'reduce', R  decimate image by R in both dimensions
'roi', R     apply the region of interest R to each image, where R=[umin umax; vmin vmax]

Notes

• A greyscale image is returned as an H × W matrix
• A color image is returned as an HxWx3 matrix
• A greyscale image sequence is returned as an HxWxN matrix where N is the sequence length

• A color image sequence is returned as an HxWx3xN matrix where N is the sequence length

See also

idisp, imono, igamma, imwrite, path

irectify

Rectify stereo image pair

[out1,out2] = irectify(f, m, im1, im2) is a rectified pair of images corresponding to im1 and im2. f (3 × 3) is the fundamental matrix relating the two views and m is a FeatureMatch object containing point correspondences between the images.

[out1,out2,h1,h2] = irectify(f, m, im1, im2) as above but also returns the homographies h1 and h2 that warp im1 to out1 and im2 to out2 respectively.

Notes

• The resulting image pair are epipolar aligned.
• The resulting images may have negative disparity.
• Color images are not supported.

See also

FeatureMatch, istereo, homwarp, CentralCamera

ireplicate

Expand image

out = ireplicate(im, k) is an image where each pixel is replicated into a k × k tile. If im is H × W the result is (KH)x(KW).

See also

idecimate, iscale

iroi

Extract region of interest

out = iroi(im, R) is a subimage of the image im described by the rectangle R=[umin,umax; vmin,vmax].

out = iroi(im) as above but the image is displayed and the user is prompted to adjust a rubber band box to select the region of interest.

[out,R] = iroi(im) as above but returns the selected region of interest R=[umin umax; vmin vmax].

See also

idisp, iline

irotate

Rotate image

out = irotate(im, angle, options) is a version of the image im that has been rotated about its centre.

Options

'outsize', S    set size of out to H × W where S=[W,H]
'crop'          return central part of image, same size as im
'scale', S      scale the image size by S (default 1)
'extrapval', V  set background pixels to V (default 0)
'smooth', S     smooth image with Gaussian of standard deviation S

Notes

• Rotation is defined with respect to a z-axis into the image.
• Counter-clockwise is a positive angle.

See also

iscale

isamesize

Automatic image trimming

out = isamesize(im1, im2) is an image derived from im1 that has the same dimensions as im2, which is achieved by cropping and scaling.

out = isamesize(im1, im2, bias) as above but bias controls which part of the image is cropped. bias=0.5 is symmetric cropping, bias<0.5 moves the crop window up or to the left, while bias>0.5 moves the crop window down or to the right.

See also

iscale

iscale

Scale an image

out = iscale(im, s) is a version of im scaled in both directions by s which is a real scalar. s>1 makes the image larger, s<1 makes it smaller.

Options

'outsize', s    set size of out to H × W where s=[W,H]
'extrapval', V  set background pixels to V (default 0)
'smooth', s     smooth image with Gaussian of standard deviation s; s=[] means no smoothing (default s=1)

See also

ireplicate, idecimate, irotate

iscalemax

Scale space maxima

f = iscalemax(L, s) is a vector of ScalePointFeature objects which are the maxima, in space and scale, of the Laplacian of Gaussian (LoG) scale-space image sequence L (HxWxN). s (N × 1) is a vector of scale values corresponding to each plane of L.

Notes

• Features are sorted into descending feature strength.

See also

iscalespace, ScalePointFeature

iscalespace

Scale-space image sequence

[g,L,s] = iscalespace(im, n, sigma) is a scale space image sequence of length n derived from im (H × W). g (HxWxN) is the scale sequence, L (HxWxN) is the absolute value of the Laplacian of Gaussian (LoG) of the scale sequence, corresponding to each step of the sequence, and s (n × 1) is the vector of scales.

[g,L,s] = iscalespace(im, n) as above but sigma=1.

The standard deviation of the smoothing Gaussian is sigma. At each scale step the variance of the Gaussian increases by sigma^2. The first step in the sequence is the original image.

Notes

• The Laplacian is computed from the difference of adjacent Gaussians.

See also

iscalemax, ismooth, ilaplace, klog

iscolor

Test for color image

iscolor(im) is true (1) if im is a color image, that is, its third dimension is equal to three, else false (0).

ishomog

Test if argument is a homogeneous transformation

ishomog(T) is true (1) if the argument T is of dimension 4 × 4 or 4x4xN, else false (0).

ishomog(T, 'valid') as above, but also checks the validity of the rotation matrix.

See also

isrot, isvec

isift

SIFT feature extractor

sf = isift(im, options) returns a vector of SiftPointFeature objects representing scale and rotationally invariant interest points in the image im. The SiftPointFeature object has many properties including:

u           horizontal coordinate
v           vertical coordinate
strength    feature strength
descriptor  feature descriptor (128 × 1)
sigma       feature scale
theta       feature orientation [rad]

Options

'nfeat', N     set the number of features to return (default Inf)
'suppress', R  set the suppression radius (default 0)

Notes

• Greyscale images only.
• Features are returned in descending strength order.
• Wraps a MEX file from www.vlfeat.org
• Corners are processed in order from strongest to weakest.
• If im is HxWxN it is considered to be an image sequence and F is a cell array with N elements, each of which is the feature vectors for the corresponding image in the sequence.
• The SIFT algorithm is patented by University of British Columbia.
• ISURF is a functional equivalent.

Reference

David G. Lowe, "Distinctive image features from scale-invariant keypoints", International Journal of Computer Vision, 60, 2 (2004), pp. 91-110.

See also

SiftPointFeature, isurf, icorner

isimilarity

Locate template in image

s = isimilarity(T, im) is an image where each pixel is the ZNCC similarity of the template T (M × M) to the M × M neighbourhood surrounding the corresponding input pixel in im. s is the same size as im.

s = isimilarity(T, im, metric) as above but the similarity metric is specified by the function metric which can be any of @sad, @ssd, @ncc, @zsad, @zssd, @zncc.

Notes

• Similarity is not computed where the window crosses the image boundary, and these output pixels are set to NaN.
• The ZNCC function is a MEX file and therefore the fastest.
• User provided similarity metrics can be used: the function accepts two regions and returns a scalar similarity score.

See also

imatch, sad, ssd, ncc, zsad, zssd, zncc

isize

Size of image

n = isize(im, d) is the size of the d'th dimension of im.

[w,H] = isize(im) is the image height H and width w.

[w,H,p] = isize(im) is the image height H, width w and number of planes p. Even if the image has only two dimensions p will be one.

See also

size
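The ZNCC metric used by isimilarity above is simple to state as a formula on two equally sized patches: subtract each patch's mean, then normalize the correlation by both standard deviations. This is a rough Python sketch of the metric itself (not the Toolbox MEX code), operating on flat lists of pixel values:

```python
from math import sqrt

def zncc(a, b):
    """Zero-mean normalized cross-correlation of two equally sized patches
    (flat lists of pixel values). Returns a score in [-1, 1]; 1 means a
    perfect match, and the score is invariant to gain and offset changes."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    da = [x - ma for x in a]
    db = [x - mb for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den if den else 0.0
```

The invariance to intensity scale and offset is what makes ZNCC the preferred default over SAD or SSD for template matching under lighting changes.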

ismooth

Gaussian smoothing

out = ismooth(im, sigma) is the image im after convolution with a Gaussian kernel of standard deviation sigma.

out = ismooth(im, sigma, options) as above but the options are passed to CONV2.

Options

'full'   returns the full 2-D convolution (default)
'same'   returns out the same size as im
'valid'  returns the valid pixels only, those where the kernel does not exceed the bounds of the image

Notes

• By default (option 'full') the returned image is larger than the passed image.
• Smooths all planes of the input image.
• The Gaussian kernel has a unit volume.
• If the input image is integer it is converted to float, convolved, then converted back to integer.

See also

iconv, kgauss

isobel

Sobel edge detector

out = isobel(im) is an edge image computed using the Sobel edge operator applied to the image im. This is the norm of the vertical and horizontal gradients at each pixel. The Sobel kernel is:

| -1  0  1 |
| -2  0  2 |
| -1  0  1 |

out = isobel(im, dx) as above but applies the kernel dx and dx' to compute the horizontal and vertical gradients respectively.

[gx,gy] = isobel(im) as above but returns the gradient images.

[gx,gy] = isobel(im, dx) as above but returns the gradient images.

Notes

• Tends to produce quite thick edges.
• The resulting image is the same size as the input image.

See also

ksobel, icanny, iconv
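The Sobel computation above, gradient magnitude from the horizontal kernel and its transpose, can be sketched without any image library. This illustrative Python version (not the Toolbox code) computes only the valid interior pixels for simplicity, whereas the Toolbox returns an image of the same size:

```python
from math import sqrt

def sobel_magnitude(im):
    """Sobel edge magnitude sqrt(gx^2 + gy^2) using the 3x3 Sobel kernels.
    Only interior pixels are computed, so the output is (H-2) x (W-2)."""
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    H, W = len(im), len(im[0])
    out = []
    for v in range(1, H - 1):
        row = []
        for u in range(1, W - 1):
            gx = gy = 0
            for i in range(3):
                for j in range(3):
                    p = im[v + i - 1][u + j - 1]
                    gx += kx[i][j] * p
                    gy += kx[j][i] * p   # the vertical kernel is the transpose
            row.append(sqrt(gx * gx + gy * gy))
        out.append(row)
    return out
```

A vertical step edge produces a strong horizontal gradient and zero vertical gradient, while a constant image produces zero everywhere.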

isrot

Test if argument is a rotation matrix

isrot(R) is true (1) if the argument is of dimension 3 × 3, else false (0).

isrot(R, 'valid') as above, but also checks the validity of the rotation matrix.

See also

ishomog, isvec

istereo

Stereo matching

d = istereo(iml, imr, range, H, options) is a disparity image computed from the epipolar aligned stereo pair: the left image iml (H × W) and the right image imr (H × W). d (H × W) is the disparity and the value at each pixel is the horizontal shift of the corresponding pixel in iml as observed in imr. That is, the disparity d=d(v,u) means that imr(v,u-d) is the same world point as iml(v,u).

range is the disparity search range, which can be a scalar for disparities in the range 0 to range, or a 2-vector [DMIN DMAX] for searches in the range DMIN to DMAX.

H is the half size of the matching window, which can be a scalar for N × N or a 2-vector [N,M] for an N × M window.

[d,sim] = istereo(iml, imr, range, H, options) as above but returns sim which is the same size as d and the elements are the peak matching score for the corresponding elements of d. For the default matching metric ZNCC this varies between -1 (very bad) to +1 (perfect).

[d,sim,dsi] = istereo(iml, imr, range, H, options) as above but returns dsi which is the disparity space image (HxWxN) where N=DMAX-DMIN+1. The I'th plane is the similarity of iml to imr shifted by DMIN+I-1. That is, sim = max(dsi, [], 3).

[d,sim,p] = istereo(iml, imr, range, H, options) if the 'interp' option is given then disparity is estimated to sub-pixel precision using quadratic interpolation. In this case d is the interpolated disparity and p is a structure with elements A, B, dx. p.A and p.B are matrices the same size as d whose elements are the per pixel values of the interpolation polynomial coefficients. p.dx is the peak of the polynomial with respect to the integer disparity at which s is maximum (in the range -0.5 to +0.5). The interpolation polynomial is s = Ad^2 + Bd + C where s is the similarity score and d is disparity relative to the integer disparity at which s is maximum.

Options

'metric', M  string that specifies the similarity metric to use, which is one of 'zncc' (default), 'ncc', 'ssd' or 'sad'
'interp'     enable subpixel interpolation and d contains non-integer values (default false)

Notes

• Images must be greyscale.
• Disparity values for pixels within a half-window dimension (H) of the edges will not be valid and are set to NaN.

See also

irectify, stdisp
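The core block-matching search that istereo performs can be sketched compactly. The following illustrative Python version (not the Toolbox code) uses the SAD metric for simplicity rather than the default ZNCC, searches disparities 0..drange, and marks invalid border pixels as None rather than NaN:

```python
def stereo_sad(iml, imr, drange, h):
    """Dense stereo by block matching with SAD over a (2h+1)^2 window.
    For each left-image pixel, try horizontal shifts 0..drange of the right
    image and keep the shift with the lowest sum of absolute differences."""
    H, W = len(iml), len(iml[0])
    d = [[None] * W for _ in range(H)]
    for v in range(h, H - h):
        for u in range(h, W - h):
            best, best_d = None, None
            for disp in range(drange + 1):
                if u - disp < h:       # window would leave the right image
                    break
                sad = sum(abs(iml[v + i][u + j] - imr[v + i][u + j - disp])
                          for i in range(-h, h + 1) for j in range(-h, h + 1))
                if best is None or sad < best:
                    best, best_d = sad, disp
            d[v][u] = best_d
    return d
```

Because imr(v,u-d) should equal iml(v,u), a right image that is a one-pixel shifted copy of the left yields disparity 1 everywhere the search window is valid.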

istretch

Image normalization

out = istretch(im) is a normalized image in which all pixel values lie in the range 0 to 1. That is, a linear mapping where the minimum value of im is mapped to 0 and the maximum value of im is mapped to 1.

out = istretch(im, max) as above but pixel values lie in the range 0 to max.

See also

inormhist

isurf

SURF feature extractor

sf = isurf(im, options) returns a vector of SurfPointFeature objects representing scale and rotationally invariant interest points in the image im. The SurfPointFeature object has many properties including:

u           horizontal coordinate
v           vertical coordinate
strength    feature strength
descriptor  feature descriptor (64 × 1 or 128 × 1)
sigma       feature scale
theta       feature orientation [rad]

Options

'nfeat', N     set the number of features to return (default Inf)
'thresh', T    set Hessian threshold. Increasing the threshold reduces the number of features computed and reduces computation time.
'octaves', N   number of octaves to process (default 5)
'extended'     return 128-element descriptor (default 64)
'upright'      don't compute rotation invariance
'suppress', R  set the suppression radius (default 0). Features are not returned if they are within R [pixels] of an earlier (stronger) feature.

Notes

• Color images, or sequences, are first converted to greyscale.
• Features are returned in descending strength order.
• If im is HxWxN it is considered to be an image sequence and F is a cell array with N elements, each of which is the feature vectors for the corresponding image in the sequence.
• The sign of the Laplacian is not retained.
• Wraps an M-file implementation of OpenSurf by D. Kroon (U. Twente) or a MEX-file OpenCV wrapper by Petter Strandmark.

Reference

Herbert Bay, Andreas Ess, Tinne Tuytelaars, Luc Van Gool, "SURF: Speeded Up Robust Features", Computer Vision and Image Understanding (CVIU), Vol. 110, No. 3, pp. 346–359, 2008

See also

SurfPointFeature, isift, icorner

isvec

Test if argument is a vector

isvec(v) is true (1) if the argument v is a 3-vector, either a row- or column-vector, else false (0).

isvec(v, L) is true (1) if the argument v is a vector of length L, otherwise false (0).

See also

ishomog, isrot

ithin

Morphological skeletonization

out = ithin(im) is the binary skeleton of the binary image im. Any non-zero region is replaced by a network of single-pixel wide lines.

out = ithin(im, delay) as above but graphically displays each iteration of the skeletonization algorithm with a pause of delay seconds between each iteration.

See also

hitormiss, itriplepoint, iendpoint

ithresh

Interactive image threshold

ithresh(im) displays the image im in a window with a slider which adjusts the binary threshold.

ithresh(im, T) as above but the initial threshold is set to T.

Notes

• Greyscale image only.
• For a uint8 class image the slider range is 0 to 255.
• For a floating point class image the slider range is 0 to 1.0.

See also

idisp

itrim

Trim images

[out1,out2] = itrim(im1, im2) returns the central parts of images im1 and im2 as out1 and out2 respectively. When images are rectified or warped the shapes can become quite distorted and are embedded in rectangular images surrounded by black or NaN values. This function crops out the central rectangular region of each. It assumes that the undefined pixels in im1 and im2 have values of NaN. The same cropping is applied to each input image.

[out1,out2] = itrim(im1, im2, T) as above but the threshold T in the range 0 to 1 is used to adjust the level of cropping. The default is 0.5; a higher value will include fewer NaN values in the result, a lower value will include more.

See also

homwarp, irectify

op. The operation op is one of: ‘var’ variance ‘kurt’ Kurtosis or peakiness of the distribution ‘skew’ skew or asymmetry of the distribution out = ivar(im. The value of edge is: ‘border’ ‘none’ ‘trim’ ‘wrap’ the border value is replicated (default) pixels beyond the border are not included in the window output is not computed for pixels whose window crosses the border. hitormiss ivar Pixel window statistics out = ivar(im. Computed using the hit-or-miss morphological operator. ithin. se. The elements in the neighbourhood corresponding to non-zero elements in se are packed into a vector on which the required statistic is computed. that is where three single-pixel wide line intersect. Machine Vision Toolbox for MATLAB 145 R Copyright c Peter Corke 2011 . See also iendpoint. se. edge) as above but performance at edge pixels can be controlled. the image is assumed to wrap around Notes • Is a MEX ﬁle. These are the Voronoi points in an image skeleton.CHAPTER 2. FUNCTIONS AND CLASSES itriplepoint Find triple points out = itriplepoint(im) is a binary image where pixels are set if the corresponding pixel in the binary image im is a triple point. hence output image had reduced dimensions. op) is an image where each output pixel is the speciﬁed statistic over the pixel neighbourhood indicated by the structuring element se which should have odd side lengths.

iwindow

Generalized spatial operator

out = iwindow(im, se, func) is an image where each pixel is the result of applying the function func to a neighbourhood centred on the corresponding pixel in im. The neighbourhood is defined by the size of the structuring element se which should have odd side lengths. The elements in the neighbourhood corresponding to non-zero elements in se are packed into a vector (in column order from top left) and passed to the specified function handle func. The return value becomes the corresponding pixel value in out.

out = iwindow(image, se, func, edge) as above but performance of edge pixels can be controlled. The value of edge is:

'border'  the border value is replicated (default)
'none'    pixels beyond the border are not included in the window
'trim'    output is not computed for pixels whose window crosses the border, hence the output image has reduced dimensions
'wrap'    the image is assumed to wrap around

Example

Compute the maximum value over a 5 × 5 window:

iwindow(im, ones(5,5), @max);

Compute the standard deviation over a 3 × 3 window:

iwindow(im, ones(3,3), @std);

Notes

• Is a MEX file.
• Is slow since the function func must be invoked once for every output pixel.

See also

ivar, irank
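The generalized-window pattern above can be sketched in a few lines. This is an illustrative Python version, not the Toolbox MEX code; for simplicity it pads beyond-border pixels with a constant value rather than implementing the four edge options:

```python
def window_op(im, se, func, pad=0):
    """Apply func to the values selected by non-zero elements of se in the
    neighbourhood of every pixel; pixels beyond the border contribute pad
    (a simplification of the Toolbox's edge-handling options)."""
    H, W = len(im), len(im[0])
    h2, w2 = len(se) // 2, len(se[0]) // 2
    out = [[0] * W for _ in range(H)]
    for v in range(H):
        for u in range(W):
            vals = [im[v + i - h2][u + j - w2]
                    if 0 <= v + i - h2 < H and 0 <= u + j - w2 < W else pad
                    for i, row in enumerate(se)
                    for j, s in enumerate(row) if s]
            out[v][u] = func(vals)
    return out
```

Passing max, min or sum as func reproduces the dilation-, erosion- and box-filter-like behaviours the text describes, at the cost of one function call per output pixel.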

kcircle

Circular structuring element

k = kcircle(R) is a square matrix S × S, where S=2R+1, of zeros with a maximal centred circular region of radius R pixels set to one.

k = kcircle(R, s) as above but the kernel size s is explicitly specified.

Notes

• If R is a 2-element vector the result is an annulus of ones, and the two numbers are interpreted as inner and outer radii.

See also

ones, ktriangle, imorph

kdgauss

Derivative of Gaussian kernel

k = kdgauss(sigma) is a 2-dimensional derivative of Gaussian kernel (W × W) of width (standard deviation) sigma and centred within the matrix k whose half-width H = 3 × sigma and W=2 × H+1.

k = kdgauss(sigma, H) as above but the half-width is explicitly specified.

Notes

• This kernel is the horizontal derivative of the Gaussian, dG/dx.
• The vertical derivative, dG/dy, is k'.
• This kernel is an effective edge detector.

See also

kgauss, kdog, klog, iconv

kdog

Difference of Gaussian kernel

k = kdog(sigma1) is a 2-dimensional difference of Gaussian kernel equal to KGAUSS(sigma1) - KGAUSS(SIGMA2), where sigma1 > SIGMA2. By default SIGMA2 = 1.6*sigma1. The kernel is centred within the matrix k whose half-width H = 3 × sigma and W=2 × H+1.

k = kdog(sigma1, sigma2) as above but sigma2 is specified directly.

k = kdog(sigma1, sigma2, H) as above but the half-width H is specified.

Notes

• This kernel is similar to the Laplacian of Gaussian and is often used as an efficient approximation.

See also

kgauss, kdgauss, klog, iconv

kgauss

Gaussian kernel

k = kgauss(sigma) is a 2-dimensional unit-volume Gaussian kernel of width (standard deviation) sigma, and centred within the matrix k whose half-width is H=2 × sigma and W=2 × H+1.

k = kgauss(sigma, H) as above but the kernel half-width is specified.

See also

kdgauss, kdog, klog, iconv
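The kgauss construction above, sample the 2-D Gaussian on an integer grid of half-width H = 2 × sigma, then normalize to unit volume, is easy to sketch. This is an illustrative Python version, not the Toolbox code:

```python
from math import exp

def gauss_kernel(sigma, h=None):
    """2-D Gaussian kernel of standard deviation sigma, half-width h
    (default 2*sigma as in the text), normalized to unit volume."""
    if h is None:
        h = int(round(2 * sigma))
    k = [[exp(-(x * x + y * y) / (2 * sigma ** 2))
          for x in range(-h, h + 1)] for y in range(-h, h + 1)]
    total = sum(sum(row) for row in k)
    return [[v / total for v in row] for row in k]
```

Unit volume means convolving a constant image leaves it unchanged, which is why ismooth preserves overall brightness.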

klaplace

Laplacian kernel

k = klaplace() is the Laplacian kernel:

| 0  1  0 |
| 1 -4  1 |
| 0  1  0 |

Notes

• This kernel has an isotropic response to gradient.

See also

ilaplace, iconv

klog

Laplacian of Gaussian kernel

k = klog(sigma) is a 2-dimensional Laplacian of Gaussian kernel of width (standard deviation) sigma and centred within the matrix k whose half-width is H=3 × sigma, and W=2 × H+1.

k = klog(sigma, H) as above but the half-width H is specified.

See also

kgauss, kdog, kdgauss, iconv, zcross

kmeans

K-means clustering

[L,C] = kmeans(x, k, options) is k-means clustering of multi-dimensional data points x (D × N) where N is the number of points, and D is the dimension. The data is organized into k clusters based on Euclidean distance from cluster centres C (D × k). L is a vector (N × 1) whose elements indicate which cluster the corresponding element of x belongs to.

[L,C] = kmeans(x, k, c0) as above but the initial clusters c0 (D × k) is given and column I is the initial estimate of the centre of cluster I.

L = kmeans(x, C) is similar to above but the clustering step is not performed, it is assumed to have been completed previously. C (D × k) contains the cluster centroids and L (N × 1) indicates which cluster the corresponding element of x is closest to.

Options

'random'  initial cluster centres are chosen randomly from the set of data points x
'spread'  initial cluster centres are chosen randomly from within the hypercube spanned by x

Reference

Tou and Gonzalez, Pattern Recognition Principles, pp 94

ksobel

Sobel edge detector

k = ksobel() is the Sobel x-derivative kernel:

| -1  0  1 |
| -2  0  2 |
| -1  0  1 |

Notes

• This kernel is an effective horizontal edge detector
• The Sobel vertical derivative is k'


See also

isobel
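The kmeans entry above follows the classic Lloyd iteration: assign each point to its nearest centre, then move each centre to the mean of its members, and repeat until nothing changes. A rough Python sketch of that iteration (not the Toolbox code; points are tuples rather than a D × N matrix):

```python
def lloyd_kmeans(x, c0, iters=50):
    """Lloyd's iteration for k-means on points x (list of tuples), starting
    from initial centres c0. Returns (labels, centres)."""
    centres = [tuple(c) for c in c0]
    labels = [0] * len(x)
    for _ in range(iters):
        # assignment step: nearest centre by squared Euclidean distance
        labels = [min(range(len(centres)),
                      key=lambda k: sum((a - b) ** 2
                                        for a, b in zip(p, centres[k])))
                  for p in x]
        # update step: each centre moves to the mean of its members
        new = []
        for k in range(len(centres)):
            members = [p for p, l in zip(x, labels) if l == k]
            if members:
                dim = len(members[0])
                new.append(tuple(sum(p[d] for p in members) / len(members)
                                 for d in range(dim)))
            else:
                new.append(centres[k])   # empty cluster keeps its centre
        if new == centres:
            break
        centres = new
    return labels, centres
```

The result depends on the initial centres, which is why the Toolbox offers both 'random' and 'spread' initialization options.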

ktriangle

Triangular kernel

k = ktriangle(w) is a triangular kernel within a rectangular matrix k. The dimensions of k are w × w if w is scalar, or w(1) wide and w(2) high. The triangle is isosceles, full width at the bottom row of the kernel and with its apex in the top row.

Examples

>> ktriangle(3)
ans =
    0    1    0
    0    1    0
    1    1    1

See also

kcircle

lambda2rg

RGB chromaticity coordinates

rgb = lambda2rg(lambda) is the rg-chromaticity coordinate (1 × 2) for illumination at the specific wavelength lambda [metres]. If lambda is a vector (N × 1), then rgb (N × 2) is a matrix whose rows are the chromaticity coordinates at the corresponding elements of lambda.

rgb = lambda2rg(lambda, E) is the rg-chromaticity coordinate (1 × 2) for an illumination spectrum E (N × 1) and lambda (N × 1).



See also

cmfrgb, lambda2xy

lambda2xy

xy = lambda2xy(lambda) is the xy-chromaticity coordinate (1 × 2) for illumination at the specific wavelength lambda [metres]. If lambda is a vector (N × 1), then xy (N × 2) is a matrix whose rows are the chromaticity coordinates at the corresponding elements of lambda.

xy = lambda2xy(lambda, E) is the xy-chromaticity coordinate (1 × 2) for an illumination spectrum E (N × 1) and lambda (N × 1).

See also

cmfxyz, lambda2rg

loadspectrum

Load spectrum data

s = loadspectrum(lambda, filename) is spectral data (N × D) from file filename interpolated to wavelengths [metres] specified in lambda (N × 1). The spectral data can be scalar (D=1) or vector (D>1) valued.

[s,lambda] = loadspectrum(lambda, filename) as above but also returns the passed wavelength lambda.

Notes

• The file is assumed to have its first column as wavelength in metres; the remaining columns are linearly interpolated and returned as columns of s.



luminos

Photopic luminosity function

p = luminos(lambda) is the photopic luminosity function for the wavelengths in lambda. If lambda is a vector (N × 1), then p (N × 1) is a vector whose elements are the luminosity at the corresponding elements of lambda. Luminosity has units of lumens, which are the intensity with which wavelengths are perceived by the light-adapted human eye.

See also

rluminos

maxfilt

Maximum filter

MAXFILT(s [,w]) maximum filters a signal with a window of width w (default is 5).

SEE ALSO: medfilt, minfilt

pic 6/93

medfilt1

Median filter

y = medfilt1(x, w) is the one-dimensional median filter of the signal x computed over a sliding window of width w.

Notes

• A median filter performs smoothing but preserves sharp edges, unlike traditional smoothing filters.
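The edge-preserving behaviour of the sliding-window median is easy to demonstrate. A rough Python sketch of a 1-D median filter (not the Toolbox code), with the signal edge-replicated so the output has the same length as the input:

```python
def medfilt(x, w):
    """1-D median filter over a sliding window of odd width w; the signal
    is edge-replicated so the output has the same length as the input."""
    h = w // 2
    n = len(x)
    out = []
    for i in range(n):
        window = sorted(x[min(max(i + j, 0), n - 1)]
                        for j in range(-h, h + 1))
        out.append(window[h])   # middle element is the median
    return out
```

An isolated impulse is removed entirely, while a step edge passes through without being blurred, the property the note above describes.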


mkcube

Create cube

p = mkcube(s, options) is a set of points (3 × 8) that define the vertices of a cube of side length s and centred at the origin.

[x,y,z] = mkcube(s, options) as above but returns the rows of p as three vectors.

[x,y,z] = mkcube(s, 'edge', options) is a mesh that defines the edges of a cube.

Options

'facepoint'  Add an extra point in the middle of each face, in this case the returned value is 3 × 14 (8 vertices + 6 face centres)
'centre', C  The cube is centred at C (3 × 1) not the origin
'T', T       The cube is arbitrarily transformed by the homogeneous transform T
'edge'       Return a set of cube edges in MATLAB mesh format rather than points

See also

cylinder, sphere

mkgrid

Create grid of points

p = mkgrid(d, s, options) is a set of points (3 x d^2) that define a d × d planar grid of points with side length s. The points are the columns of p. If d is a 2-vector the grid is d(1) x d(2) points. If s is a 2-vector the side lengths are s(1) x s(2).

By default the grid lies in the XY plane, symmetric about the origin.

Options

'T', T  the homogeneous transform T is applied to all points, allowing the plane to be translated or rotated

mlabel

Label for mplot style graph

mlabel(lab1 lab2 lab3)

mplot

Plot multiple data

Plot y versus t in multiple windows.

MPLOT(y)
MPLOT(y, n)
MPLOT(y, n, {labels})

Where y is multicolumn data and the first column is time. n is a row vector specifying which variables to plot (1 is the first data column, or y(:,2)). labels is a cell array of labels for the subplots.

MPLOT(t, y)
MPLOT(t, y, n)
MPLOT(t, y, n, {labels})

Where y is multicolumn data and t is time. n is a row vector specifying which variables to plot (1 is the first data column, or y(:,2)). labels is a cell array of labels for the subplots.

MPLOT(S)

Where S is a structure and one element 't' is assumed to be time. Plot all other vectors versus time in subplots. Subplots are labelled as per the data fields.

mpq

Image moments

m = mpq(im, p, q) is the PQ'th moment of the image im. That is, the sum of I(x,y).x^p.y^q.

See also

mpq poly, npq, upq
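The moment definition above, the sum of I(x,y)·x^p·y^q over all pixels, directly gives the blob centroid as (m10/m00, m01/m00). A rough Python sketch (not the Toolbox code):

```python
def mpq_sketch(im, p, q):
    """PQ'th image moment: sum of I(x, y) * x^p * y^q over all pixels."""
    return sum(I * (x ** p) * (y ** q)
               for y, row in enumerate(im) for x, I in enumerate(row))

def centroid(im):
    """Centroid from the zeroth and first-order moments."""
    m00 = mpq_sketch(im, 0, 0)
    return mpq_sketch(im, 1, 0) / m00, mpq_sketch(im, 0, 1) / m00
```

For a binary image, m00 is simply the area of the region, so the centroid is the mean pixel coordinate.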

mpq poly

Polygon moments

m = MPQ POLY(v, p, q) is the PQ'th moment of the polygon with vertices described by the columns of v.

Notes

• The points must be sorted such that they follow the perimeter in sequence (counter-clockwise).
• If the points are clockwise the moments will all be negated, so centroids will still be correct.
• If the first and last point in the list are the same, they are considered to be a single vertex.

See also

mpq, npq poly, upq poly, Polygon

mtools

Simple/useful tools applying to all windows in a figure

ncc

Normalized cross correlation

m = ncc(i1, i2) is the normalized cross-correlation between the two equally sized image patches i1 and i2. The result m is a scalar in the interval -1 (non match) to 1 (perfect match) that indicates similarity.

Notes

• A value of 1 indicates identical pixel patterns.
• The ncc similarity measure is invariant to scale changes in image intensity.

See also

zncc, sad, ssd, isimilarity

niblack

Adaptive thresholding

T = niblack(im, k, w2) is the per-pixel (local) threshold to apply to image im. T has the same dimensions as im. The threshold at each pixel is a function of the mean and standard deviation computed over a W × W window, where W=2*w2+1.

[T,m,s] = niblack(im, k, w2) as above but returns the per-pixel mean m and standard deviation s.

Example

t = niblack(im, -0.2, 20);
idisp(im >= t);

Notes

• This is an efficient algorithm very well suited for binarizing text.
• w2 should be chosen to be half the "size" of the features to be segmented, for example, in text segmentation, the height of a character.
• A common choice is k=-0.2.

Reference

An Introduction to Digital Image Processing, W. Niblack, Prentice-Hall, 1986.

See also

otsu, ithresh

norm2

Columnwise norm

n = norm2(m)
n = norm2(a, b)

npq

Normalized central image moments

m = npq(im, p, q) is the PQ'th normalized central moment of the image im. That is, UPQ(im,p,q)/MPQ(im,0,0).

Notes

• The normalized central moments are invariant to translation and scale.

See also

npq poly, mpq, upq

npq poly

Normalized central polygon moments

m = NPQ POLY(v, p, q) is the PQ'th normalized central moment of the polygon with vertices described by the columns of v.

Notes
• The points must be sorted such that they follow the perimeter in sequence (counter-clockwise).
• If the points are clockwise the moments will all be negated, so centroids will still be correct.
• If the first and last point in the list are the same, they are considered as a single vertex.
• The normalized central moments are invariant to translation and scale.

See also
mpq_poly, npq, mpq, upq, Polygon

numcols
Return number of columns in matrix

nc = numcols(m) returns the number of columns in the matrix m.

See also
numrows

numrows
Return number of rows in matrix

nr = numrows(m) returns the number of rows in the matrix m.

See also
numcols

otsu
Threshold selection

T = otsu(im) is an optimal threshold for binarizing an image with a bimodal intensity histogram. T is a scalar threshold that maximizes the variance between the classes of pixels below and above the threshold T.

Example
t = otsu(im); idisp(im >= t);

Notes
• Performance for images with non-bimodal histograms can be quite poor.

Reference
A Threshold Selection Method from Gray-Level Histograms, N. Otsu, IEEE Trans. Systems, Man and Cybernetics, Vol SMC-9(1), Jan 1979, pp 62-66.

See also
niblack, ithresh

peak
Find peaks in vector

yp = peak(y, options) are the values of the maxima in the vector y.

[yp,i] = peak(y, options) as above but also returns the indices of the maxima in the vector y.

[yp,xp] = peak(y, x, options) as above but also returns the corresponding x-coordinates of the maxima in the vector y. x is the same length as y and contains the corresponding x-coordinates.
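The between-class-variance criterion that otsu maximizes can be sketched in pure Python, working directly on a grey-level histogram (a sketch of the criterion only; the Toolbox function operates on an image):

```python
def otsu(hist):
    # Choose t to maximize the between-class variance w0*w1*(m0 - m1)^2,
    # where class 0 is grey levels <= t and class 1 is levels > t.
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0.0
    for t in range(len(hist)):
        w0 += hist[t]          # pixel count in class 0
        sum0 += t * hist[t]    # intensity sum in class 0
        if w0 == 0 or w0 == total:
            continue  # one class empty: variance undefined
        w1 = total - w0
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

For a cleanly bimodal histogram the chosen t falls between the two modes.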

Options
'npeaks', N   Number of peaks to return (default all)
'scale', S    Only consider as peaks the largest value in the horizontal range +/- S points.
'interp', N   Order of interpolation polynomial (default no interpolation)
'plot'        Display the interpolation polynomial overlaid on the point data

Notes
• To find minima, use peak(-V).
• The 'interp' option fits points in the neighbourhood about the peak with an N'th order polynomial and its peak position is returned. Typically choose N to be odd.

See also
peak2

peak2
Find peaks in a matrix

zp = peak2(z, options) are the peak values in the 2-dimensional signal z.

[zp,ij] = peak2(z, options) as above but also returns the indices of the maxima in the matrix z. Use SUB2IND to convert these to row and column coordinates.

Options
'npeaks', N   Number of peaks to return (default all)
'scale', S    Only consider as peaks the largest value in the horizontal and vertical range +/- S points.
'interp'      Interpolate peak (default no interpolation)
'plot'        Display the interpolation polynomial overlaid on the point data

Notes
• To find minima, use peak2(-V).
• The 'interp' option fits points in the neighbourhood about the peak with a paraboloid and its peak position is returned.
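The 1-D search performed by peak can be sketched in pure Python (local maxima returned largest first; the 'scale' and 'interp' refinements are omitted from this sketch):

```python
def peak(y, npeaks=None):
    # A local maximum is an interior sample strictly greater than both neighbours.
    maxima = sorted((y[i] for i in range(1, len(y) - 1)
                     if y[i] > y[i - 1] and y[i] > y[i + 1]), reverse=True)
    return maxima[:npeaks] if npeaks else maxima
```

As in the Toolbox function, minima can be found by negating the signal first.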

See also
peak, sub2ind

pgmfilt
Pipe image through PGM utility

out = pgmfilt(im, pgmcmd) pipes the image im through a Unix filter program and returns its output as an image. The program given by the string pgmcmd must accept and return images in PGM format.

Notes
• Provides access to a large number of Unix command line utilities such as ImageMagick.

See also
pnmfilt, iread

plot2
Plot trajectories

plot2(p) plots a line with coordinates taken from successive rows of p. p can be N × 2 or N × 3. If p has three dimensions, ie. Nx2xM or Nx3xM, then the M trajectories are overlaid in the one plot.

plot2(p, ls) as above but the line style arguments ls are passed to plot.

See also
plot

plot_box
Draw a box on the current plot

PLOT_BOX(b, ls) draws a box defined by b=[XL XR; YL YR] with optional Matlab linestyle options ls.

PLOT_BOX(x1,y1, x2,y2, ls) draws a box with corners at (x1,y1) and (x2,y2), and optional Matlab linestyle options ls.

PLOT_BOX('centre', P, 'size', W, ls) draws a box with centre at P=[X,Y] and with dimensions W=[WIDTH HEIGHT].

PLOT_BOX('topleft', P, 'size', W, ls) draws a box with top-left at P=[X,Y] and with dimensions W=[WIDTH HEIGHT].

plot_circle
Draw a circle on the current plot

PLOT_CIRCLE(C, R, options) draws a circle on the current plot with centre C=[X Y] and radius R. If C=[X Y Z] the circle is drawn in the XY-plane at height Z. If C (2 × N or 3 × N) and R (1 × N) then a set of N circles are drawn with centre and radius taken from the columns of C and R.

Options
'edgecolor'   the color of the circle's edge, Matlab color spec
'fillcolor'   the color of the circle's interior, Matlab color spec
'alpha'       transparency of the filled circle: 0=transparent, 1=solid

Notes
• the option can be either a simple linespec (eg. 'r', 'g:') for a non-filled circle, or a set of name, value pairs that are passed to plot.

Examples
plot_circle(c, r, 'r');
plot_circle(c, r, 'fillcolor', 'b');
plot_circle(c, r, 'fillcolor', 'g', 'edgecolor', 'r', 'LineWidth', 5);

plot_ellipse
Draw an ellipse on the current plot

PLOT_ELLIPSE(a, ls) draws an ellipse defined by X'AX = 0 on the current plot, centred at the origin, with Matlab line style ls.

PLOT_ELLIPSE(a, C, ls) as above but centred at C=[X,Y]. If C=[X,Y,Z] the ellipse is parallel to the XY plane but at height Z.

plot_ellipse_inv
Plot an ellipse

plot_ellipse_inv(a, ls) as above. ls is the standard line styles.

plot_frame
Plot a coordinate frame represented by a homogeneous transformation

trplot(T, options) draws a coordinate frame corresponding to the homogeneous transformation T.

Options
'color', c   Specify color of the axes, Matlab colorspec
'axes'
'axis'

'name', n       Specify the name of the coordinate frame
'text_opts', to
'view', v       Specify the view angle for the Matlab axes
'width', w
'arrow'
'length', l     Specify length of the axes (default 1)

Examples:
trplot(T, 'color', 'r');
trplot(T, 'framename', 'B');

plot_homline
Draw a line in homogeneous form

H = PLOT_HOMLINE(L, ls) draws a line in the current figure, L.X = 0. The current axis limits are used to determine the endpoints of the line. Matlab line specification ls can be set. The return argument is a vector of graphics handles for the lines.

See also
homline

plot_point
Plot point features

PLOT_POINT(p, options) adds point markers to a plot, where p is 2 × N and each column is the point coordinate.

Options
'textcolor', colspec   Specify color of text
'textsize', size       Specify size of text
'bold'                 Text in bold font
'printf', {fmt, data}  Label points according to printf format string and corresponding element of data
'sequence'             Label points sequentially

Additional options are passed through to PLOT for creating the marker.

See also
plot, text

plot_poly
Plot a polygon

plotpoly(p, options) plot a polygon defined by columns of p which can be 2 × N or 3 × N.

Options
'fill'    the color of the polygon's interior, Matlab color spec
'alpha'   transparency of the filled polygon: 0=transparent, 1=solid

See also
plot, patch, Polygon

plot_sphere
Plot spheres

PLOT_SPHERE(C, R, color) add spheres to the current figure. C is the centre of the sphere and if it is a 3 × N matrix then N spheres are drawn with centres as per the columns. R is the radius and color is a Matlab color spec, either a letter or 3-vector.

H = PLOT_SPHERE(C, R, color) as above but returns the handle(s) for the spheres.

H = PLOT_SPHERE(C, R, color, alpha) as above but alpha specifies the opacity of the sphere, where 0 is transparent and 1 is opaque. The default is 1.

Notes
• The sphere is always added, irrespective of figure hold state.
• The number of vertices to draw the sphere is hardwired.

plotp
Plot trajectories

plotp(p) plots a set of points p, which by Toolbox convention are stored one per column. p can be 2 × N or 3 × N. By default a linestyle of 'bx' is used.

plotp(p, ls) as above but the line style arguments ls are passed to plot.

See also
plot, plot2

pnmfilt
Pipe image through PNM utility

out = pnmfilt(im, pnmcmd) pipes the image im through a Unix filter program and returns its output as an image. The program given by the string pnmcmd must accept and return images in PNM format.

Notes
• Provides access to a large number of Unix command line utilities such as ImageMagick.

See also
pgmfilt, iread

r2t
Convert rotation matrix to a homogeneous transform

T = r2t(R) is a homogeneous transform equivalent to an orthonormal rotation matrix R with a zero translational component.

Notes
• Functions for T in SE(2) or SE(3):
  – if R is 2 × 2 then T is 3 × 3, or
  – if R is 3 × 3 then T is 4 × 4.
• translational component is zero

See also
t2r

radgrad
Radial gradient

[gr,gt] = radgrad(im) is the radial and tangential gradient of the image im. At each pixel the image gradient vector is resolved into the radial and tangential directions.

[gr,gt] = radgrad(im, centre) as above but the centre of the image is specified as centre=[X,Y] rather than the centre pixel of im.

radgrad(im) as above but the result is displayed graphically.

See also
isobel

ramp
Create a ramp vector

ramp(n) output a vector of length n that ramps linearly from 0 to 1.
ramp(v) as above but the vector is the same length as v.
ramp(v, d) as above but elements increment by d.

See also
linspace

ransac
Random sample and consensus

m = ransac(func, x, T, options) is the ransac algorithm that robustly fits data x to the model represented by the function func. ransac classifies points that support the model as inliers and those that do not as outliers. x typically contains corresponding point data, one column per point pair. T is a threshold on how well a point fits the estimated model; if the fit residual is above the threshold the point is considered an outlier.

[m,in] = ransac(func, x, T, options) as above but returns the vector in of column indices of x that describe the inlier point set.

[m,in,resid] = ransac(func, x, T, options) as above but returns the final residual of applying func to the inlier set.

ransac determines the subset of points (inliers) that best fit the model described by the function func and the parameter m.

Options
'maxTrials', N      maximum number of iterations (default 2000)
'maxDataTrials', N  maximum number of attempts to select a non-degenerate data set (default 100)

Model function

out = func(R) is the function passed to RANSAC and it must accept a single argument R which is a structure:

R.cmd     the operation to perform (string)
R.debug   display what's going on (logical)
R.x       data to work on, N point pairs (6 × N)
R.t       threshold (1 × 1)
R.theta   estimated quantity to test (3 × 3)
R.misc    private data (cell array)

The function return value is also a structure:

out.s       sample size (1 × 1)
out.x       conditioned data (2D × N)
out.misc    private data (cell array)
out.inlier  list of inliers (1 × m)
out.valid   if data is valid for estimation (logical)
out.theta   estimated quantity (3 × 3)
out.resid   model fit residual (1 × 1)

The values of R.cmd are:

'size'         out.s is the minimum number of points required to compute an estimate to out.theta
'condition'    out.x = CONDITION(R.x) condition the point data
'decondition'  out.theta = DECONDITION(R.theta) decondition the estimated model data
'valid'        out.valid is true if a set of points is not degenerate, that is, they will produce a model. This is used to discard random samples that do not result in useful models.
'estimate'     [out.theta,out.resid] = EST(R.x) returns the best fit model and residual for the subset of points R.x. If this function cannot fit a model then out.theta = []. If multiple models are found out.theta is a cell array.
'error'        [out.inlier,out.theta] = ERR(R.theta, R.x, T) evaluates the distance from the model(s) R.theta to the points R.x and returns the model that best supports (most inliers) the data.

Notes
• For some algorithms (eg. fundamental matrix) it is necessary to condition the data to improve the accuracy of model estimation. For efficiency the data is conditioned once, and the data transform parameters are kept in the .misc element. The inverse conditioning operation is applied to the model to transform the estimate based on conditioned data to a model applicable to the original data.
• The functions FMATRIX and HOMOG are written so as to be callable from RANSAC, that is, they detect a structure argument.

References
• M.A. Fischler and R.C. Bolles. "Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography". Comm. Assoc. Comp. Mach., Vol 24, No 6, pp 381-395, 1981.
• Richard Hartley and Andrew Zisserman. "Multiple View Geometry in Computer Vision". Cambridge University Press, pp 101-113, 2001.
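The sample/estimate/count-consensus loop at the heart of ransac can be sketched in pure Python for a toy problem, robustly fitting a line y = a*x + b to 2D points. The callback interface, data conditioning and option handling of the Toolbox version are omitted, and all names here are the example's own:

```python
import random

def ransac_line(points, t, max_trials=2000):
    # Repeatedly: sample a minimal set, fit a model, count inliers, keep the best.
    best_model, best_inliers = None, []
    for _ in range(max_trials):
        (x1, y1), (x2, y2) = random.sample(points, 2)
        if x1 == x2:
            continue  # degenerate sample: no unique line, draw again
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [i for i, (x, y) in enumerate(points)
                   if abs(y - (a * x + b)) < t]  # residual below threshold t
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# Ten points on y = 2x + 1 plus two gross outliers.
random.seed(0)  # deterministic for the example
pts = [(x, 2 * x + 1) for x in range(10)] + [(3.0, 40.0), (7.0, -5.0)]
(a, b), inliers = ransac_line(pts, 0.1)
```

The outliers can never attract more than a couple of supporting points, so the largest consensus set recovers the true line.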

Author
Peter Kovesi
School of Computer Science & Software Engineering
The University of Western Australia
pk at csse uwa edu au
http://www.csse.uwa.edu.au/~pk

See also
fmatrix, homography

rg_addticks
Label spectral locus

RG_ADDTICKS() adds wavelength ticks to the spectral locus.

See also
xycolorspace

rluminos
Relative photopic luminosity function

p = rluminos(lambda) is the relative photopic luminosity function for the wavelengths in lambda. If lambda is a vector, then p is a vector whose elements are the relative luminosity at the corresponding elements of lambda. Relative luminosity lies in the interval 0 to 1, which indicates the intensity with which wavelengths are perceived by the light-adapted human eye.

See also
luminos

rotx
Rotation about X axis

R = rotx(theta) is a rotation matrix representing a rotation of theta about the x-axis.

See also
roty, rotz, angvec2r

roty
Rotation about Y axis

R = roty(theta) is a rotation matrix representing a rotation of theta about the y-axis.

See also
rotx, rotz, angvec2r

rotz
Rotation about Z axis

R = rotz(theta) is a rotation matrix representing a rotation of theta about the z-axis.

See also
rotx, roty, angvec2r
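These three functions return the standard elementary rotation matrices; a pure-Python sketch of the x-axis case, with a small matrix-vector helper that is the example's own:

```python
import math

def rotx(theta):
    # Rotation of theta radians about the x-axis, as a 3x3 row-major matrix.
    c, s = math.cos(theta), math.sin(theta)
    return [[1, 0, 0],
            [0, c, -s],
            [0, s, c]]

def matvec(R, v):
    # Multiply a 3x3 matrix by a 3-vector.
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]
```

A quarter turn about x maps the y-axis onto the z-axis; the y- and z-axis cases differ only in which row and column carry the cosine and sine terms.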

rpy2tr
Roll-pitch-yaw angles to homogeneous transform

T = rpy2tr(rpy) is an orthonormal rotation matrix equivalent to the specified roll, pitch, yaw angles which correspond to rotations about the X, Y, Z axes respectively. If rpy has multiple rows they are assumed to represent a trajectory and R is a three-dimensional matrix, where the last index corresponds to the rows of rpy.

T = rpy2tr(roll, pitch, yaw) as above but the roll-pitch-yaw angles are passed as separate arguments. If roll, pitch and yaw are column vectors then they are assumed to represent a trajectory and R is a three-dimensional matrix, where the last index corresponds to the rows of roll, pitch, yaw.

Notes
• in previous releases (<8) the angles corresponded to rotations about ZYX.
• many texts (Paul, Spong) use the rotation order ZYX.

See also
tr2rpy, eul2tr

rt2tr
Convert rotation and translation to homogeneous transform

TR = rt2tr(R, t) is a homogeneous transformation matrix formed from an orthonormal rotation matrix R and a translation vector t.

Notes
• Functions for R in SO(2) or SO(3):
  – If R is 2 × 2 and t is 2 × 1, then TR is 3 × 3.
  – If R is 3 × 3 and t is 3 × 1, then TR is 4 × 4.
• the validity of R is not checked
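The composition that rt2tr performs can be sketched in pure Python for both the SE(2) and SE(3) cases (row-major lists of lists stand in for Matlab matrices):

```python
def rt2tr(R, t):
    # Append the translation as the last column and a [0 ... 0 1] bottom row.
    n = len(R)
    T = [list(R[i]) + [t[i]] for i in range(n)]
    T.append([0] * n + [1])
    return T
```

For example a 2 × 2 identity rotation and translation (3, 4) gives a 3 × 3 SE(2) matrix.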

See also
t2r, r2t, tr2rt

sad
Sum of absolute differences

m = sad(i1, i2) is the sum of absolute differences between the two equally sized image patches i1 and i2. The result m is a scalar that indicates image similarity; a value of 0 indicates identical pixel patterns and it is increasingly positive as image dissimilarity increases.

See also
zsad, ssd, ncc, isimilarity

se2
Create planar translation and rotation transformation

T = se2(x, y, theta) is a 3 × 3 homogeneous transformation SE(2) representing translation x and y, and rotation theta in the plane.

T = se2(xy) as above where xy=[x,y] and rotation is zero.
T = se2(xy, theta) as above where xy=[x,y].
T = se2(xyt) as above where xyt=[x,y,theta].

See also
trplot2
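The SE(2) matrix that se2 constructs can be sketched in pure Python:

```python
import math

def se2(x, y, theta):
    # 3x3 planar homogeneous transform: rotation theta, translation (x, y).
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x],
            [s,  c, y],
            [0,  0, 1]]
```

With theta = 0 the rotation block is the identity and only the translation column is non-trivial.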

skew
Create skew-symmetric matrix

s = skew(v) is a skew-symmetric matrix and v is a 3-vector.

See also
vex

ssd
Sum of squared differences

m = ssd(i1, i2) is the sum of squared differences between the two equally sized image patches i1 and i2. The result m is a scalar that indicates image similarity; a value of 0 indicates identical pixel patterns and it is increasingly positive as image dissimilarity increases.

See also
zssd, sad, ncc, isimilarity

stdisp
Display stereo pair

stdisp(L, R) displays the stereo image pair L and R in adjacent windows. Two cross-hairs are created. Clicking a point in the left image positions a black cross-hair at the same pixel coordinate in the right image. Clicking the corresponding world point in the right image sets the green cross-hair and displays the disparity [pixels].

See also
idisp, istereo

t2r
Return rotational submatrix of a homogeneous transformation

R = t2r(T) is the orthonormal rotation matrix component of homogeneous transformation matrix T.

Notes
• Functions for T in SE(2) or SE(3):
  – If T is 4 × 4, then R is 3 × 3.
  – If T is 3 × 3, then R is 2 × 2.
• the validity of the rotational part is not checked

See also
r2t, tr2rt, rt2tr

tb_optparse
Standard option parser for Toolbox functions

[optout,args] = TB_OPTPARSE(opt, arglist) is a generalized option parser for Toolbox functions. It supports options that have an assigned value, boolean or enumeration types (string or int).

The software pattern is:

function(a, b, c, varargin)
opt.foo = true;
opt.bar = false;
opt.blah = [];
opt.choose = {'this', 'that', 'other'};

opt.select = {'#no', '#yes'};
opt = tb_optparse(opt, varargin);

Optional arguments to the function behave as follows:

'foo'          sets opt.foo <- true
'nobar'        sets opt.bar <- false
'blah', 3      sets opt.blah <- 3
'blah', {x,y}  sets opt.blah <- {x,y}
'that'         sets opt.choose <- 'that'
'yes'          sets opt.select <- 2 (the second element)

and can be given in any combination.

If neither of 'this', 'that' or 'other' are specified then opt.choose <- 'this'. If neither of 'no' or 'yes' are specified then opt.select <- 1.

Note:
• that the enumerator names must be distinct from the field names.
• that only one value can be assigned to a field; if multiple values are required they must be converted to a cell array.

By default if an option is given that is not a field of opt an error is declared.

The return structure is automatically populated with fields: verbose and debug. The following options are automatically parsed:

'verbose'    sets opt.verbose <- true
'debug', N   sets opt.debug <- N
'setopt', S  sets opt <- S
'showopt'    displays opt and arglist

Sometimes it is useful to collect the unassigned options and this can be achieved using a second output argument

[opt,arglist] = tb_optparse(opt, varargin);

which is a cell array of all unassigned arguments in the order given in varargin.

testpattern
Create test images

im = testpattern(type, w, args) creates a test pattern image. If w is a scalar the image is w × w else w(2) x w(1). The image is specified by the string type and one or two (type specific) arguments:

'rampx'    intensity ramp from 0 to 1 in the x-direction, args is the number of cycles.
'rampy'    intensity ramp from 0 to 1 in the y-direction, args is the number of cycles.
'sinx'     sinusoidal intensity pattern (from -1 to 1) in the x-direction, args is the number of cycles.
'siny'     sinusoidal intensity pattern (from -1 to 1) in the y-direction, args is the number of cycles.
'dots'     binary dot pattern, args are dot pitch (distance between centres), dot diameter.
'squares'  binary square pattern, args are pitch (distance between centres), square side length.
'line'     a line, args are theta (rad), intercept.

Examples

A 256 × 256 image with 2 cycles of a horizontal sawtooth intensity ramp:

testpattern('rampx', 256, 2);

A 256 × 256 image with a grid of dots on 50 pixel centres and 25 pixels in diameter:

testpattern('dots', 256, 50, 25);

Notes
• With no output argument the test pattern is displayed using idisp.

See also
idisp

tpoly
Generate scalar polynomial trajectory

[s,sd,sdd] = tpoly(s0, sf, n) is a trajectory of a scalar that varies smoothly from s0 to sf in n steps using a quintic (5th order) polynomial. Velocity and acceleration can be optionally returned as sd and sdd. The trajectory s, sd and sdd are n-vectors.

[s,sd,sdd] = tpoly(s0, sf, T) as above but specifies the trajectory in terms of the length of the time vector T.
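With zero velocity and acceleration at both ends, the quintic used by tpoly reduces to s = s0 + (sf - s0)(10 t^3 - 15 t^4 + 6 t^5) for normalized time t in [0, 1]; a pure-Python sketch returning just the position vector:

```python
def tpoly(s0, sf, n):
    # Quintic trajectory with zero velocity and acceleration at both ends.
    out = []
    for k in range(n):
        t = k / (n - 1)  # normalized time in [0, 1]
        out.append(s0 + (sf - s0) * (10 * t**3 - 15 * t**4 + 6 * t**5))
    return out

s = tpoly(0.0, 1.0, 50)
```

The profile starts and ends exactly at s0 and sf and is monotonic between them, since its derivative 30 t^2 (1 - t)^2 is never negative.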

tr2angvec
Convert rotation matrix to angle-vector form

[theta,v] = tr2angvec(R) is a rotation of theta about the vector v equivalent to the orthonormal rotation matrix R.

[theta,v] = tr2angvec(T) is a rotation of theta about the vector v equivalent to the rotational component of the homogeneous transform T.

Notes
• If no output arguments are specified the result is displayed.

See also
angvec2r, angvec2tr

tr2rpy
Convert a homogeneous transform to roll-pitch-yaw angles

rpy = tr2rpy(T, options) are the roll-pitch-yaw angles expressed as a row vector corresponding to the rotation part of a homogeneous transform T. The 3 angles rpy=[R,P,Y] correspond to sequential rotations about the X, Y and Z axes respectively.

rpy = tr2rpy(R, options) are the roll-pitch-yaw angles expressed as a row vector corresponding to the orthonormal rotation matrix R.

If R or T represents a trajectory (has 3 dimensions), then each row of rpy corresponds to a step of the trajectory.

Options
'deg'   Compute angles in degrees (radians default)
'zyx'   Return solution for sequential rotations about Z, Y, X axes (Paul book)

Notes
• There is a singularity for the case where THETA=0 in which case PHI is arbitrarily set to zero and PSI is the sum (PHI+PSI).

• Note that textbooks (Paul, Spong) use the rotation order ZYX.

See also
rpy2tr, tr2eul

tr2rt
Convert homogeneous transform to rotation and translation

[R,t] = tr2rt(TR) splits a homogeneous transformation matrix into an orthonormal rotation matrix R and a translation vector t.

Notes
• Functions for TR in SE(2) or SE(3):
  – If TR is 4 × 4, then R is 3 × 3 and t is 3 × 1.
  – If TR is 3 × 3, then R is 2 × 2 and t is 2 × 1.
• The validity of R is not checked.

See also
rt2tr, r2t, t2r

transl
Create translational transform

T = transl(x, y, z) is a homogeneous transform representing a pure translation.

T = transl(p) is a homogeneous transform representing a translation or point p=[x,y,z]. If p is an M × 3 matrix transl returns a 4x4xM matrix representing a sequence of homogeneous transforms such that T(:,:,i) corresponds to the i'th row of p.

p = transl(T) is the translational part of a homogeneous transform as a 3-element column vector. If T has three dimensions, ie. 4x4xN, then T is considered a homogeneous

transform sequence and an N × 3 matrix is returned where each row is the translational component of the corresponding transform in the sequence.

Notes
• Somewhat unusually this function performs a function and its inverse. An historical anomaly.

See also
ctraj

tristim2cc
Tristimulus to chromaticity coordinates

cc = tristim2cc(tri) is the chromaticity coordinate (1 × 2) corresponding to the tristimulus tri (1 × 3). If tri is RGB then cc is rg; if tri is XYZ then cc is xy. Multiple tristimulus values can be given as rows of tri (N × 3) in which case the chromaticity coordinates are the corresponding rows of cc (N × 2).

[c1,c2] = tristim2cc(tri) as above but the chromaticity coordinates are returned in separate vectors, each N × 1.

out = tristim2cc(im) is the chromaticity coordinates corresponding to every pixel in the tristimulus image im (HxWx3). out (HxWx2) has planes corresponding to r and g, or x and y.

[o1,o2] = tristim2cc(im) as above but the chromaticity is returned as separate images (H × W).

trnorm
Normalize a homogeneous transform

tn = trnorm(T) is a normalized homogeneous transformation matrix in which the rotation submatrix is guaranteed to be a proper orthogonal matrix. The O and A vectors are normalized and the normal vector is formed from O x A.
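For a single tristimulus triple the tristim2cc computation is just a normalization by the component sum; a pure-Python sketch:

```python
def tristim2cc(tri):
    # Chromaticity: first two components divided by the sum of all three.
    s = sum(tri)
    return (tri[0] / s, tri[1] / s)
```

For example the XYZ triple (1, 1, 2) gives the xy chromaticity (0.25, 0.25).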

Notes
• Used to prevent finite word length arithmetic causing transforms to become 'unnormalized'.

See also
oa2tr

trotx
Rotation about X axis

T = trotx(theta) is a homogeneous transformation representing a rotation of theta about the x-axis.

Notes
• translational component is zero

See also
rotx, troty, trotz

troty
Rotation about Y axis

T = troty(theta) is a homogeneous transformation representing a rotation of theta about the y-axis.

Notes
• translational component is zero

See also
roty, trotx, trotz

trotz
Rotation about Z axis

T = trotz(theta) is a homogeneous transformation representing a rotation of theta about the z-axis.

Notes
• translational component is zero

See also
rotz, trotx, troty

trprint
Compact display of homogeneous transformation

trprint(T, options) displays the homogeneous transform in a compact single-line format. If T is a homogeneous transform sequence then each element is printed on a separate line.

trprint T is the command line form of above, and displays in RPY format.

Options
'rpy'       display with rotation in roll/pitch/yaw angles (default)
'euler'     display with rotation in ZYX Euler angles
'angvec'    display with rotation in angle/vector format
'radian'    display angle in radians (default is degrees)
'fmt', f    use format string f for all numbers (default %g)
'label', l  display the text before the transform

See also
tr2eul, tr2rpy, tr2angvec

unit
Unitize a vector

vn = unit(v) is a unit vector parallel to v.

Note
• fails for the case where norm(v) is zero.

upq
Central image moments

m = upq(im, p, q) is the PQ'th central moment of the image im. That is, the sum of I(x,y).(x-x0)^p.(y-y0)^q where (x0,y0) is the centroid.

Notes
• The central moments are invariant to translation.

See also
upq_poly, mpq, npq
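The relationship between mpq and upq, and the translation invariance noted above, can be sketched in pure Python (images as lists of rows, with x as the column index and y as the row index, which is the example's own convention):

```python
def mpq(im, p, q):
    # PQ'th moment: sum of I(x, y) * x^p * y^q over all pixels.
    return sum(im[y][x] * x**p * y**q
               for y in range(len(im)) for x in range(len(im[0])))

def upq(im, p, q):
    # PQ'th central moment: the same sum taken about the centroid (x0, y0).
    m00 = mpq(im, 0, 0)
    x0, y0 = mpq(im, 1, 0) / m00, mpq(im, 0, 1) / m00
    return sum(im[y][x] * (x - x0)**p * (y - y0)**q
               for y in range(len(im)) for x in range(len(im[0])))
```

Shifting a pattern within the image shifts the centroid with it, so the central moments are unchanged.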


upq poly

Central polygon moments

m = UPQ_POLY(v, p, q) is the PQ'th central moment of the polygon with vertices described by the columns of v.

Notes

• The points must be sorted such that they follow the perimeter in sequence (counter-clockwise).
• If the points are clockwise the moments will all be negated, so centroids will still be correct.
• If the first and last point in the list are the same, they are considered as a single vertex.
• The central moments are invariant to translation.

See also

upq, mpq_poly, npq_poly

useﬁg

Change to a named figure or create a new figure

usefig('Foo') makes figure 'Foo' the current figure, creating it if it doesn't exist. h = usefig('Foo') as above, but returns the figure handle.

vex

Convert skew-symmetric matrix to vector

v = vex(s) is the vector whose skew-symmetric matrix is s.



Notes

• No checking is done to ensure that the matrix is skew-symmetric. • The function takes the mean of the two elements that correspond to each unique element of the matrix, ie. vx = 0.5*(s(3,2)-s(2,3))

See also

skew
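The pair skew and vex are inverses; a pure-Python sketch of both, using the averaging rule from the note above:

```python
def skew(v):
    # 3x3 skew-symmetric matrix built from the 3-vector v.
    x, y, z = v
    return [[0, -z, y],
            [z, 0, -x],
            [-y, x, 0]]

def vex(s):
    # Recover the vector, taking the mean of the two copies of each element.
    return [0.5 * (s[2][1] - s[1][2]),
            0.5 * (s[0][2] - s[2][0]),
            0.5 * (s[1][0] - s[0][1])]
```

vex(skew(v)) returns v for any 3-vector; as noted, vex does not check that its argument is actually skew-symmetric.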

xaxis

X-axis scaling

xaxis(max)
xaxis([min max])
xaxis(min, max)
xaxis restores automatic scaling for this axis

xycolorspace

Display spectral locus

xycolorspace() displays a fully colored spectral locus in terms of CIE x and y coordinates.
xycolorspace(p) as above but plots the points whose xy-chromaticity is given by the columns of p.
[im,ax,ay] = xycolorspace() as above but returns the spectral locus as an image im, with corresponding x- and y-axis coordinates ax and ay respectively.

Notes

• The colors shown within the locus only approximate the true colors, due to the gamut of the display device.



See also

rg_addticks

yaxis

Y-axis scaling

yaxis(max)
yaxis(min, max)
yaxis restores automatic scaling for this axis

zcross

Zero-crossing detector

iz = zcross(im) is a binary image with pixels set where the corresponding pixels in the signed image im have a zero crossing, a positive pixel adjacent to a negative pixel.

Notes

• Can be used in association with a Laplacian of Gaussian image to determine edges.

See also

ilog


zncc
Normalized cross correlation

m = zncc(i1, i2) is the zero-mean normalized cross-correlation between the two equally sized image patches i1 and i2. The result m is a scalar in the interval -1 to 1 that indicates similarity. A value of 1 indicates identical pixel patterns.

Notes
• The zncc similarity measure is invariant to affine changes in image intensity (brightness offset and scale).

See also
ncc, sad, ssd, isimilarity

zsad
Sum of absolute differences

m = zsad(i1, i2) is the zero-mean sum of absolute differences between the two equally sized image patches i1 and i2. The result m is a scalar that indicates image similarity; a value of 0 indicates identical pixel patterns and it is increasingly positive as image dissimilarity increases.

Notes
• The zsad similarity measure is invariant to changes in image brightness offset.

See also
sad, ssd, ncc, isimilarity

zssd
Sum of squared differences

m = zssd(i1, i2) is the zero-mean sum of squared differences between the two equally sized image patches i1 and i2. The result m is a scalar that indicates image similarity; a value of 0 indicates identical pixel patterns and it is increasingly positive as image dissimilarity increases.

Notes
• The zssd similarity measure is invariant to changes in image brightness offset.

See also
ssd, sad, ncc, isimilarity
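The zero-mean variant can be sketched in pure Python (representing patches as lists of rows is the example's own convention):

```python
def zssd(p1, p2):
    # Subtract each patch's mean, then sum the squared differences.
    a = [v for row in p1 for v in row]
    b = [v for row in p2 for v in row]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum(((x - ma) - (y - mb)) ** 2 for x, y in zip(a, b))
```

Adding a constant brightness offset to either patch leaves the result unchanged, which is the invariance noted above.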
