Licence: LGPL
Toolbox home page: http://www.petercorke.com/vision
Discussion group: http://groups.google.com.au/group/robotics-tool-box

Copyright © 2011 Peter Corke
peter.i.corke@gmail.com
September 2011
http://www.petercorke.com

Preface
This, the third release of the Toolbox, represents a decade of development. The last release was in 2005 and this version captures a large number of changes over that period, with extensive work over the last two years to support my new book “Robotics, Vision & Control”, described below.

The practice of robotics and computer vision each involve the application of computational algorithms to data. The research community has developed a very large body of algorithms but for a newcomer to the field this can be quite daunting. For more than 10 years the author has maintained two open-source MATLAB® Toolboxes, one for robotics and one for vision. They provide implementations of many important algorithms and allow users to work with real problems, not just trivial examples. This new book makes the fundamental algorithms of robotics, vision and control accessible to all. It weaves together theory, algorithms and examples in a narrative that covers robotics and computer vision separately and together. Using the latest versions of the Toolboxes the author shows how complex problems can be decomposed and solved using just a few simple lines of code. The topics covered are guided by real problems observed by the author over many years as a practitioner of both robotics and computer vision. It is written in a light but informative style, it is easy to read and absorb, and includes over 1000 MATLAB® and Simulink® examples and figures. The book is a real walk through the fundamentals of mobile robots, navigation, localization, arm-robot kinematics, dynamics and joint-level control, then camera models, image processing, feature extraction and multi-view geometry, and finally bringing it all together with an extensive discussion of visual servo systems.

Robotics, Vision and Control — Peter Corke, Springer, ISBN 978-3-642-20143-1, springer.com

The Machine Vision Toolbox (MVTB) provides many functions that are useful in machine vision and vision-based control. It is a somewhat eclectic collection reflecting my personal interest in areas of photometry, photogrammetry and colorimetry. It includes over 100 functions spanning operations such as image file reading and writing, acquisition, display, filtering, blob, point and line feature extraction, mathematical morphology, homographies, visual Jacobians, camera calibration and color space conversion. The Toolbox, combined with MATLAB® and a modern workstation computer, is a useful and convenient environment for investigation of machine vision algorithms. For modest image sizes the processing rate can be sufficiently “real-time” to allow for closed-loop control. Focus of attention methods such as dynamic windowing (not provided) can be used to increase the processing rate. With input from a firewire or web camera (support provided) and output to a robot (not provided) it would be possible to implement a visual servo system entirely in MATLAB®.

An image is usually treated as a rectangular array of scalar values representing intensity or perhaps range. The matrix is the natural datatype for MATLAB® and thus makes the manipulation of images easily expressible in terms of arithmetic statements in the MATLAB® language. Many image operations such as thresholding, filtering and statistics can be achieved with existing MATLAB® functions. The Toolbox extends this core functionality with M-files that implement functions and classes, and MEX-files for some compute-intensive operations. It is possible to use MEX-files to interface with image acquisition hardware ranging from simple framegrabbers to robots. Examples for firewire cameras under Linux are provided.

The routines are written in a straightforward manner which allows for easy understanding. MATLAB® vectorization has been used as much as possible to improve efficiency, however some algorithms are not amenable to vectorization. If you have the MATLAB compiler available then this can be used to compile bottleneck functions. Some particularly compute-intensive functions are provided as MEX-files and may need to be compiled for the particular platform.

This Toolbox considers images generally as arrays of double precision numbers. This is extravagant on storage, though this is much less significant today than it was in the past.

This Toolbox is not a clone of the MathWorks’ own Image Processing Toolbox (IPT) although there are many functions in common. This Toolbox predates IPT by many years, is open-source, and contains many functions that are useful for image feature extraction and control. It was developed under Unix and Linux systems and some functions rely on tools and utilities that exist only in that environment.

The manual is now auto-generated from the comments in the MATLAB code itself, which reduces the effort in maintaining code and a separate manual as I used to — the downside is that there are no worked examples and figures in the manual. However the book “Robotics, Vision & Control” provides a detailed discussion (over 600 pages, nearly 400 figures and 1000 code examples) of how to use the Toolbox functions to solve many types of problems in robotics, and I commend it to you.
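To illustrate this style of use, here is a minimal sketch that reads an image as a double-precision matrix and thresholds it with ordinary MATLAB operators; the image file name is only an assumed example, and iread and idisp are Toolbox functions documented later in this manual.

  % read an image as a greyscale double-precision matrix (file name is an assumed example)
  im = iread('street.png', 'grey', 'double');
  % ordinary MATLAB matrix operations apply directly to the image
  bright = im > 0.7;          % logical image of the brightest pixels
  % display the original and the thresholded result
  idisp(im);
  idisp(bright);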

Contents

1 Introduction
  1.1 What’s new
  1.2 Support
  1.3 How to obtain the Toolbox
  1.4 MATLAB version issues
  1.5 Use in teaching
  1.6 Use in research
    1.6.1 Other toolboxes
  1.7 Acknowledgements

2 Functions and classes

Chapter 1

Introduction

1.1 What’s new

Changes:

• New features:
  – x
• Bugfixes:
  – x
  – Improved error messages in many functions
  – Removed trailing commas from if and for statements

1.2 Support

There is no support! This software is made freely available in the hope that you find it useful in solving whatever problems you have to hand. I am happy to correspond with people who have found genuine bugs or deficiencies but my response time can be long and I can’t guarantee that I respond to your email.

I can guarantee that I will not respond to any requests for help with assignments or homework, no matter how urgent or important they might be to you. That’s what your teachers, tutors, lecturers and professors are paid to do.

You might instead like to communicate with other users via the Google Group called “Robotics Toolbox” http://groups.google.com.au/group/robotics-tool-box which is a forum for discussion. You need to signup in order to post, and the signup process is moderated by me so allow a few days for this to happen. I need you to write a few words about why you want to join the list so I can distinguish you from a spammer or a web-bot.

I am very happy to accept contributions for inclusion in future versions of the toolbox, and you will be suitably acknowledged.

I. 1. HOW TO OBTAIN THE TOOLBOX CHAPTER 1. The details are @article{Corke05f.3. This is just a means for me to gauge interest and to help convince my bosses (and myself) that this is a worthwhile activity. Journal = {IEEE Robotics and Automation Magazine}.petercorke. and the “See also” functions to each other. Number = {4}.1. INTRODUCTION few words about why you want to join the list so I can distinguish you from a spammer or a web-bot.pdf is a manual that describes all functions in the Toolbox. The web page requests some information from you regarding such as your country.zip). Volume = {12}.5 Use in teaching This is definitely encouraged! You are free to put the PDF manual (vision. If you plan to distribute paper copies of the PDF manual then every copy must include the first two pages (cover and licence). Author = {P. It is R auto-generated from the comments in the MATLAB code and is fully hyperlinked: to external web sites.html on a server for class use.3 How to obtain the Toolbox The Machine Vision Toolbox is freely available from the Toolbox home page at http://www.4 MATLAB version issues The Toolbox has been tested under R2011a.com The files are available in either gzipped tar format (. the table of content to functions.pdf or the web-based documentation html/*. Corke}. 1. type of organization and application. Year = {2005}. 1. A menu-driven demonstration can be invoked by the function rtdemo. 1. Month = nov. Title = {Machine Vision Toolbox}. Machine Vision Toolbox for MATLAB R 11 Copyright c Peter Corke 2011 .gz) or zip format (.6 Use in research If the Toolbox helps you in your endeavours then I’d appreciate you citing the Toolbox when you publish. The file robot.

1.6.1 Other toolboxes

Matlab Central http://www.mathworks.com/matlabcentral is a great resource for user contributed MATLAB code, and there are hundreds of modules available. VLFeat http://www.vlfeat.org is a great collection of advanced computer vision algorithms for MATLAB.

1.7 Acknowledgements

Last, but not least, this release includes functions for computing image plane homographies and the fundamental matrix, contributed by Nuno Alexandre Cid Martins of I.S.R., Coimbra; RANSAC code by Peter Kovesi; pose estimation by Francesco Moreno-Noguer, Vincent Lepetit and Pascal Fua at the CVLab-EPFL; color space conversions by Pascal Getreuer; numerical routines for geometric vision by various members of the Visual Geometry Group at Oxford (from the web site of the Hartley and Zisserman book); the k-means and MSER algorithms by Andrea Vedaldi and Brian Fulkerson; the graph-based image segmentation software by Pedro Felzenszwalb; and the SURF feature detector by Dirk-Jan Kroon at U. Twente. The Camera Calibration Toolbox by Jean-Yves Bouguet is used unmodified. Functions such as SURF, MSER, graph-based segmentation and pose estimation are based on great code; see the file CONTRIB for details. Some of the MEX files use some really neat macros that were part of the package VISTA, Copyright 1993, 1994 University of British Columbia.

Chapter 2

Functions and classes

Camera
Camera superclass

An abstract superclass for Toolbox camera classes.

Methods

  plot          plot projection of world point to image plane
  hold          control figure hold for image plane window
  ishold        test figure hold for image plane
  clf           clear image plane
  figure        figure holding the image plane
  mesh          draw shape represented as a mesh
  point         draw homogeneous points on image plane
  line          draw homogeneous lines on image plane
  plot_camera   draw camera in world view
  rpy           set camera attitude
  move          clone Camera after motion
  centre        get world coordinate of camera centre
  delete        object destructor
  char          convert camera parameters to string
  display       display camera parameters

Properties (read/write)

  npix   image dimensions (2 × 1)
  pp     principal point (2 × 1)
  rho    pixel dimensions (2 × 1) in metres
  T      camera pose as homogeneous transformation

Properties (read only)

  nu   number of pixels in u-direction
  nv   number of pixels in v-direction
  u0   principal point u-coordinate
  v0   principal point v-coordinate

Notes

• Camera is a reference object.
• Camera objects can be used in vectors and arrays.
• This is an abstract class and must be subclassed and a project() method defined.

Camera.Camera
Create camera object

Constructor for abstract Camera class, used by all subclasses.

C = Camera(options) creates a default (abstract) camera with null parameters.

Options

  ‘name’, N         Name of camera
  ‘image’, IM       Load image IM to image plane
  ‘resolution’, N   Image plane resolution: N × N or N=[W H]
  ‘sensor’, S       Image sensor size in metres (2 × 1)
  ‘centre’, P       Principal point (2 × 1)
  ‘pixel’, S        Pixel size: S × S or S=[W H]
  ‘noise’, SIGMA    Standard deviation of additive Gaussian noise added to returned image projections
  ‘pose’, T         Pose of the camera as a homogeneous transformation
  ‘color’, C        Color of image plane background (default [1 1 0.8])

Notes

• Normally the class plots points and lines into a set of axes that represent the image plane. The object can create a window to display the Camera image plane; this window is protected and can only be accessed by the plot methods of this object.
• The ‘image’ option paints the specified image onto the image plane and allows points and lines to be overlaid.

display() displays a compact human-readable representation of the camera parameters. Machine Vision Toolbox for MATLAB R 15 Copyright c Peter Corke 2011 .CHAPTER 2.display Display value C.delete() destroys all figures associated with the Camera object and removes the object.clf() removes all graphics from the camera’s image plane. Camera. fisheyecamera. Camera. SphericalCamera Camera. CatadioptricCamera.delete Camera object destructor C.centre() is the 3-dimensional position of the camera centre (3 × 1).centre Get camera position p = C. FUNCTIONS AND CLASSES See also CentralCamera. Camera. Camera.char Convert to string s = C.clf Clear the image plane C.char() is a compact string representation of the camera parameters.

hold(H) hold mode is set on if H is true (or > 0).line(L) plots lines on the camera image plane which are defined by columns of L (3 × N ) considered as lines in homogeneous form: a. Camera.v + c = 0.hold Control hold on image plane graphics C. Camera. otherwise false (0).figure Return figure handle H = C. C.line Plot homogeneous lines on image plane C.u + b.ishold() returns true (1) if the camera’s image plane is in hold mode. Machine Vision Toolbox for MATLAB R 16 Copyright c Peter Corke 2011 .figure() is the handle of the figure that contains the camera’s image plane graphics. FUNCTIONS AND CLASSES Notes • This method is invoked implicitly at the command line when the result of an expression is a Camera object and the command has no trailing semicolon. See also Camera.ishold Return image plane hold status H = C.CHAPTER 2. and off if H is false (or 0).char Camera.hold() sets “hold on” for the camera’s image plane. Camera.

sphere. Camera.T. cylinder. The matrices x.move Instantiate displaced camera C2 = C.mesh Plot mesh object on image plane C. See also mesh. y. FUNCTIONS AND CLASSES Camera. If p has 3 dimensions (3xNxS) then it is considered a sequence of point sets and is displayed as an animation.plot(p) as above but returns the image plane coordinates uv (2 × N ).plot Plot points on image plane C. uv = C.mesh(x.plot.clf Camera. y. Camera. z are of the same size and the corresponding elements of the matrices define 3D points. Camera. options) projects a 3D shape defined by the matrices x. z to the image plane and plots them. Machine Vision Toolbox for MATLAB R 17 Copyright c Peter Corke 2011 . Temporarily overrides the current camera pose C.move(T) is a new camera object that is a clone of C but its pose is displaced by the homogeneous transformation T with respect to the current pose of C. T Transform all points by the homogeneous transformation T before projecting them to the camera image plane.plot(p. mkcube.hold. y.CHAPTER 2. Camera. T ‘Tcam’. If p is 2 × N the points are assumed to be image plane coordinates and are plotted directly. z. Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane. options) projects world points p (3 × N ) to the image plane and plots them. Options ‘Tobj’.
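A short sketch of the plotting workflow described above, using the concrete CentralCamera subclass (documented below) since Camera itself is abstract; the camera parameter values and the cube placement are illustrative assumptions only.

  % a concrete camera; parameter values are illustrative
  cam = CentralCamera('focal', 0.015, 'pixel', 10e-6, ...
      'resolution', [1280 1024], 'centre', [640 512], 'name', 'mycamera');

  % vertices of a 0.2 m cube placed 1 m along the optical axis
  P = bsxfun(@plus, mkcube(0.2), [0 0 1]');

  uv = cam.plot(P);                        % project, plot and return image-plane points
  cam.hold(1);                             % hold the image plane
  cam.plot(P, 'Tobj', transl(0.1, 0, 0));  % replot after displacing the object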

T. T See also Camera. C Text color for annotation (default black) ‘textsize’. N Number of frames per second for point sequence display ‘sequence’ Annotate the points with their index ‘textcolor’.CHAPTER 2.plot camera(options) draw a camera as a simple 3D model in the current figure. ‘Tcam’. FUNCTIONS AND CLASSES Options Transform all points by the homogeneous transformation T before projecting them to the camera image plane. Machine Vision Toolbox for MATLAB R 18 Copyright c Peter Corke 2011 . T ‘scale’. T Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane.point(p) plots points on the camera image plane which are defined by columns of p (3 × N ) considered as points in homogeneous form. Camera. S Text size for annotation (default 12) ‘drawnow’ Execute MATLAB drawnow function Additional options are considered MATLAB linestyle parameters and are passed directly to plot. S Camera displayed in pose T (homogeneous transformation 4 × 4) Overall scale factor (default 0.2 x maximum axis dimension) Notes • The graphic handles are stored within the Camera object.clf Camera.plot camera Display camera icon in world view C. ‘Tobj’. Options ‘Tcam’.point Plot homogeneous points on image plane C. Overrides the current camera pose C. Camera.mesh.hold. Camera. ‘fps’.

rpy(R. FUNCTIONS AND CLASSES Camera. the focal point is at z=0 and the image plane is at z=f.p. Machine Vision Toolbox for MATLAB R 19 Copyright c Peter Corke 2011 . p. v Y This camera model assumes central projection. The image is not inverted. that is.rpy Set camera attitude C.rpy(rpy) as above but rpy=[R. CentralCamera Perspective camera class A concrete class for a central-projection perspective camera. a subclass of Camera. C.y]. y) sets the camera attitude to the specified roll-pitch-yaw angles. The camera coordinate system is: 0------------> u X | | | + (principal point) | | Z-axis is into the page.CHAPTER 2.

FUNCTIONS AND CLASSES Methods project K C H invH F E invE fov ray plot hold ishold clf figure mesh point line plot camera plot line tr plot epiline flowfield visjac p visjac p polar visjac l visjac e rpy move centre estpose delete char display project world points camera intrinsic matrix camera matrix camera motion to homography decompose homography camera motion to fundamental matrix camera motion to essential matrix decompose essential matrix field of view Ray3D corresponding to point plot projection of world point on image plane control hold for image plane test figure hold for image plane clear image plane figure holding the image plane draw shape represented as a mesh draw homogeneous points on image plane draw homogeneous lines on image plane draw camera in world view draw line in theta/rho format draw epipolar line compute optical flow image Jacobian for point features image Jacobian for point features in polar coordinates image Jacobian for line features image Jacobian for ellipse features set camera attitude clone Camera after motion get world coordinate of camera centre estimate pose object destructor convert camera parameters to string display camera parameters Properties (read/write) npix pp rho f k p distortion T image dimensions in pixels (2 × 1) intrinsic: principal point (2 × 1) intrinsic: pixel dimensions (2 × 1) in metres intrinsic: focal length intrinsic: radial distortion vector intrinsic: tangential distortion parameters intrinsic: camera distortion [k1 k2 k3 p1 p2] extrinsic: camera pose as homogeneous transformation Machine Vision Toolbox for MATLAB R 20 Copyright c Peter Corke 2011 .CHAPTER 2.

also known as the camera calibration or projection matrix. • Camera objects can be used in vectors and arrays See also Camera CentralCamera. Machine Vision Toolbox for MATLAB R 21 Copyright c Peter Corke 2011 .C() is the 3×4 camera matrix. C = CentralCamera(options) as above but with specified parameters. FUNCTIONS AND CLASSES Properties (read only) nu nv u0 v0 number of pixels in u-direction number of pixels in v-direction principal point u-coordinate principal point v-coordinate Notes • Camera is a reference object.CentralCamera Create central projection camera object C = CentralCamera() creates a central projection camera with canonic parameters: f=1 and name=’canonic’.C Camera matrix C = C. CentralCamera.CHAPTER 2.

CHAPTER 2. SIGMA ‘pose’. 2003.and y-axes respectively. IM ‘resolution’. S. optical axis is z-axis.F. S ‘centre’. C Name of camera Focal length [metres] Distortion vector [k1 k2 k3 p1 p2] Distortion vector [k1 k2 p1 p2 k3] Default camera parameters: 1024 × 1024.F(C2) is the essential matrix relating two camera views described by camera objects C (first view) and C2 (second view). p.E Essential matrix E = C. E = C. S ‘noise’.Kosecka. CentralCamera. u. 10um pixels. “An invitation to 3D”. D ‘distortion-bouguet’. f=8mm. E = C.T and the second is a relative motion represented by the homogeneous transformation T. N ‘sensor’. CatadioptricCamera.Sastry. Reference Y. The first view is from the current camera pose C.and v-axes parallel to x.8]) See also Camera.E(T) is the essential matrix relating two camera views. Display an image rather than points Image plane resolution: N × N or N=[W H] Image sensor size in metres (2 × 1) Principal point (2 × 1) Pixel size: S × S or S=[W H] Standard deviation of additive Gaussian noise added to returned image projections Pose of the camera as a homogeneous transformation Color of image plane background (default [1 1 0. T ‘color’. FUNCTIONS AND CLASSES Options ‘name’.Ma. S.177 See also CentralCamera.F(F) is the essential matrix based on the fundamental matrix F (3 × 3) and the intrinsic parameters of camera C. SphericalCamera CentralCamera. P ‘pixel’. camera at origin.Soatto. N ‘focal’. Springer. F ‘distortion’. J.invE Machine Vision Toolbox for MATLAB R 22 Copyright c Peter Corke 2011 . fisheyecamera. D ‘default’ ‘image’.
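As a sketch of typical use, the fragment below creates a camera with the documented ‘default’ parameters and queries its intrinsic and projection matrices; combining ‘default’ with the ‘name’ option is an assumption.

  cam = CentralCamera('default', 'name', 'cam1');  % 1024 x 1024, 10 um pixels, f = 8 mm

  K = cam.K();     % 3 x 3 intrinsic parameter matrix
  C = cam.C();     % 3 x 4 camera (projection) matrix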

d) is a 3 × 3 homography matrix for the camera observing the plane with normal n and at distance d. 2003.E CentralCamera.T and the second is after a relative motion represented by the homogeneous transformation T.Ma. The first view is from the current camera pose C. from two viewpoints.H CentralCamera.H(T. S. Reference Y.K Intrinsic parameter matrix K = C.Kosecka. S.177 See also CentralCamera.F Fundamental matrix F = C. See also CentralCamera.Soatto. p. Springer. F = C.F(T) is the fundamental matrix relating two camera views. FUNCTIONS AND CLASSES CentralCamera.K() is the 3 × 3 intrinsic parameter matrix. The first view is from the current camera pose C.T and the second is a relative motion represented by the homogeneous transformation T. J. n.H Homography matrix H = C. Machine Vision Toolbox for MATLAB R 23 Copyright c Peter Corke 2011 .Sastry.F(C2) is the fundamental matrix relating two camera views described by camera objects C (first view) and C2 (second view). “An invitation to 3D”.CHAPTER 2.
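A hedged sketch of relating two views with the methods above; the relative motion, plane normal and plane distance are illustrative values.

  cam1 = CentralCamera('default');
  T = transl(0.1, 0, 0) * troty(0.05);  % assumed relative camera motion
  cam2 = cam1.move(T);                  % second view, displaced by T

  F = cam1.F(cam2);                     % fundamental matrix between the two views
  E = cam1.E(F);                        % essential matrix from F and the intrinsics

  % homography induced by a world plane with normal n at distance d (illustrative plane)
  H = cam1.H(T, [0 0 1]', 2);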

flowfield(v) displays the optical flow pattern for a sparse grid of points when the camera has a spatial velocity v (6 × 1).fov() are the field of view angles (2 × 1) in radians for the camera x and y (horizontal and vertical) directions. Feb. Lepetit. FUNCTIONS AND CLASSES CentralCamera. a = C.invE(E) decomposes the essential matrix E (3 × 3) into the camera motion. See also quiver CentralCamera.CHAPTER 2. 155-166.estpose(xyz. pp. and P. uv) is an estimate of the pose of the object defined by coordinates xyz (3×N ) in its own coordinate frame. F. Int. CentralCamera. CentralCamera. 81.invE Decompose essential matrix s = C. Moreno-Noguer. 2009.fov Camera field-of-view angles. V. In practice there are multiple solutions and s (4x4xN) is a set of homogeneous transformations representing possible camera motion. Reference “EPnP: An accurate O(n) solution to the PnP problem”.flowfield Optical flow C. vol.estpose Estimate pose from object model and camera view T = C. uv (2×N ) are the corresponding image plane coordinates. Journal on Computer Vision. Machine Vision Toolbox for MATLAB R 24 Copyright c Peter Corke 2011 . Fua.
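A small sketch of the field-of-view and optical-flow queries above, assuming the spatial velocity vector is ordered as translational components followed by rotational components.

  cam = CentralCamera('default');

  a = cam.fov();                    % horizontal and vertical field of view in radians

  cam.flowfield([0 0 1 0 0 0]');    % flow due to translation along the optical axis
  cam.flowfield([0 0 0 0 1 0]');    % flow due to rotation about the camera y-axis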

Sastry. Springer.Kosecka.Kosecka. normal vector to the plane (3 × 3) Notes • There are up to 4 solutions • Only those solutions that obey the positive depth constraint are returned • The required camera intrinsics are taken from the camera object • The transformation is from view 1 to view 2. In practice there are multiple solutions and s is a vector of structures with elements: • T. p. Chap 9. camera motion as a homogeneous transform matrix (4 × 4). “An invitation to 3D”. Springer. J. translation not to scale • n.invE(E.Sastry. p) as above but only solutions in which the world point p is visible are returned. “An invitation to 3D”. section 5.Ma. Reference Y. s.Ma.CHAPTER 2. p116. s. Reference Hartley & Zisserman.invH Decompose homography matrix s = C. See also CentralCamera. 2003. p120122 Notes • The transformation is from view 1 to view 2.Soatto. s.E CentralCamera.3 Machine Vision Toolbox for MATLAB R 25 Copyright c Peter Corke 2011 . s. 2003. FUNCTIONS AND CLASSES s = C.invH(H) decomposes the homography H (3 × 3) into the camera motion and the normal to the plane.Soatto. J. “Multiview Geometry”. 259 Y.

p) as above but return a vector of graphic handles. ‘Tobj’.plot line tr Plot line in theta-rho format CentralCamera. ‘Tcam’.plot epiline(f.H CentralCamera.plot line tr(L) plots lines on the camera’s image plane that are described by columns of L with rows theta and rho respectively. T Machine Vision Toolbox for MATLAB R 26 Copyright c Peter Corke 2011 . p) plots the epipolar lines due to the fundamental matrix f and the image points p. CentralCamera. p. Temporarily overrides the current camera pose C. FUNCTIONS AND CLASSES See also CentralCamera. H = C.CHAPTER 2.plot epiline(f.plot epiline(f. options) are the image plane coordinates (2 × N ) corresponding to the world points p (3 × N ). C. ls) as above but draw lines using the line style arguments ls. See also Hough CentralCamera.project Project world points to image plane uv = C. T Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane. Options Transform all points by the homogeneous transformation T before projecting them to the camera image plane. If Tcam (4x4xS) is a transform sequence then uv (2xNxS) represents the sequence of projected points as the camera moves in the world.T. one per line.project(p.plot epiline Plot epipolar line C.

pp. FUNCTIONS AND CLASSES If Tobj (4x4x) is a transform sequence then uv (2xNxS) represents the sequence of projected points as the object moves in the world.c.CHAPTER 2. one for each point defined by the columns of p. Espiau. F.plot CentralCamera. p 162 See also Ray3D CentralCamera. IEEE Transactions on Robotics and Automation. Rives. See also Camera. and P. 313-326. 8.ray(p) returns a vector of Ray3D objects.ray 3D ray for image point R = C. The ellipse lies in the world plane pl = (a.2E2uv + 2E3u + 2E4v + E5 = 0.visjac e Visual motion Jacobian for point feature J = C.b. The Jacobian gives the rates of change of the ellipse parameters in terms of camera spatial velocity. pl) is the image Jacobian (5 × 6) for the ellipse E (5 × 1) described by u2 + E1v2 . June 1992.d) such that aX + bY + cZ + d = 0. Reference Hartley & Zisserman. Chaumette. Machine Vision Toolbox for MATLAB R 27 Copyright c Peter Corke 2011 . “Multiview Geometry”. vol.visjac e(E. “A New Approach to Visual Servoing in Robotics”. Reference B.
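A sketch of projecting world points under object and camera motion, and of back-projecting an image point to a ray; the point placement and displacements are illustrative.

  cam = CentralCamera('default');
  P = bsxfun(@plus, mkcube(0.2), [0 0 1]');          % cube vertices 1 m in front of the camera

  uv0 = cam.project(P);                               % nominal projection
  uv1 = cam.project(P, 'Tobj', transl(0, 0, 0.5));    % object moved 0.5 m further away
  uv2 = cam.project(P, 'Tcam', transl(-0.1, 0, 0));   % camera moved 0.1 m to the left

  r = cam.ray([512; 512]);                            % Ray3D through an image-plane point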

IEEE Transactions on Robotics and Automation. Oct.visjac p polar. CentralCamera. and the rows are theta and rho respectively.c. Espiau. pp. Hager & Corke. The depth of the points from the camera is given by z which is a scalar for all points. Rives.visjac p. Machine Vision Toolbox for MATLAB R 28 Copyright c Peter Corke 2011 . Vol 12(5).visjac e CentralCamera.d) such that aX + bY + cZ + d = 0.visjac p Visual motion Jacobian for point feature J = C. Chaumette.visjac l Visual motion Jacobian for line feature J = C. and P. IEEE Trans.CHAPTER 2. The Jacobian gives the image-plane point velocity in terms of camera spatial velocity. z) is the image Jacobian (2N × 6) for the image plane points uv (2 × N ). CentralCamera.b. R&A. or a vector (N × 1) of depth for each point.visjac l(L. pl) is the image Jacobian (2N × 6) for the image plane lines L (2 × N ). CentralCamera. F. See also CentralCamera. vol. The Jacobian gives the rates of change of the line parameters in terms of camera spatial velocity. June 1992. CentralCamera. Reference B. Hutchinson. 1996.visjac p(uv. FUNCTIONS AND CLASSES See also CentralCamera. The lines all lie in the plane pl = (a. Reference “A tutorial on Visual Servo Control”. “A New Approach to Visual Servoing in Robotics”.visjac l CentralCamera. Each column of L is a line in theta-rho format.visjac p. 8. pp 651-670. 313-326.visjac p polar.
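A minimal sketch of the point-feature Jacobian above; the image point and its depth are assumed values.

  cam = CentralCamera('default');

  J = cam.visjac_p([300; 300], 2);   % 2 x 6 Jacobian for one point at an assumed depth of 2 m

  % predicted image-plane velocity for a camera translating along the optical axis
  vel = J * [0 0 0.1 0 0 0]';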

Conf on Intelligent Robots and Systems (IROS). P. F. radius and theta. CentralCamera. CentralCamera. CentralCamera.visjac l. The depth of the points from the camera is given by z which is a scalar for all point. Machine Vision Toolbox for MATLAB R 29 Copyright c Peter Corke 2011 . The Jacobian gives the image-plane polar point coordinate velocity in terms of camera spatial velocity. 5962-5967. z) is the image Jacobian (2N × 6) for the image plane points rt (2 × N ) described in polar form. Int. Oct. I. FUNCTIONS AND CLASSES See also CentralCamera.visjac p polar(rt. Corke.visjac p.visjac e CentralCamera. Louis). or a vector (N × 1) of depths for each point. and F.visjac p polar Visual motion Jacobian for point feature J = C. Spindler.CHAPTER 2. (St. Chaumette.visjac e SiftPointFeature SIFT point corner feature object A subclass of PointFeature for SIFT features.visjac p polar. in Proc. See also CentralCamera. Reference “Combining Cartesian and polar coordinates in IBVS”. 2009. CentralCamera. pp.visjac l.

SIFT. SurfPointFeature Machine Vision Toolbox for MATLAB R 30 Copyright c Peter Corke 2011 .91-110. Nov.u is a 2×N matrix with each column the corresponding u coordinate. vol. You can download a SIFT implementation which this class can utilize.60. D. See README. 2004. If F is a vector (N ×1) of SiftCornerFeature objects then F. ScalePointFeature. FUNCTIONS AND CLASSES Methods plot plot scale distance match ncc uv display char Plot feature position Plot feature scale Descriptor distance Match features Descriptor similarity Return feature coordinate Display value Convert value to string Properties u horizontal coordinate v vertical coordinate strength feature strength theta feature orientation [rad] scale feature scale descriptor feature descriptor (vector) index of image containing feature image id Properties of a vector of SiftCornerFeature objects are returned as a vector. • SiftCornerFeature objects can be used in vectors and arrays • The SIFT algorithm is patented and not distributed with this toolbox. See also isift.Lowe. pp.CHAPTER 2. Journal on Computer Vision. Notes • SiftCornerFeature is a reference object. References “Distinctive image features from scale-invariant keypoints”. Int. PointFeature.

SiftPointFeature.plot scale(options) overlay a marker to indicate feature point position and scale. A Indicate scale by a circle (default) Indicate scale by circle with one radial line for orientation Indicate scale and orientation by an arrow Indicate scale by a translucent disk Color of circle or disk (default green) Transparency of disk. options) is a vector of FeatureMatch objects that describe candidate matches between the two vectors of SIFT features F and f2. 0=transparent (default 0. f = PointFeature(u. See also isift SiftPointFeature.SiftPointFeature Create a SIFT point feature object f = SiftPointFeature() is a point feature object with null parameters. ls) as above but the optional line style arguments ls are passed to plot. Correspondence is based on descriptor similarity. 1=opaque. C ‘alpha’. F.match Match SIFT point features m = F. Options ‘circle’ ‘clock’ ‘arrow’ ‘disk’ ‘color’. v.plot scale(options.plot scale Plot feature scale F. strength) as above but with specified strength. v) is a point feature object with specified coordinates. f = PointFeature(u. FUNCTIONS AND CLASSES SiftPointFeature.2) Machine Vision Toolbox for MATLAB R 31 Copyright c Peter Corke 2011 .CHAPTER 2.match(f2. If F is a vector then each element is plotted.
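A hedged sketch of extracting and matching SIFT features; as noted above this requires a separately downloaded SIFT implementation, and the image file names are assumed examples.

  im1 = iread('eiffel2-1.jpg', 'grey');   % assumed example images
  im2 = iread('eiffel2-2.jpg', 'grey');

  sf1 = isift(im1);                % vectors of SiftPointFeature objects
  sf2 = isift(im2);

  m = sf1.match(sf2);              % candidate correspondences (FeatureMatch objects)

  idisp(im1);
  sf1(1).plot_scale('clock');      % overlay position, scale and orientation of one feature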

support(images. w) as above but if the features were extracted from an image sequence images then the feature is extracted from the appropriate image in the same sequence. The support region is scaled to w × w and rotated so that the feature’s orientation axis is upward. extracted from the image im in which the feature appears. F. See also SiftPointFeature SphericalCamera Spherical camera class A concrete class a spherical-projection camera.support Support region of feature out = F. Machine Vision Toolbox for MATLAB R 32 Copyright c Peter Corke 2011 .support(im. out = F. [out. w) is an image of the support region of the feature F.CHAPTER 2.T] = F. w) as above but the support region is displayed.support(images. FUNCTIONS AND CLASSES SiftPointFeature. w) as above but returns the pose of the feature as a 3 × 3 homogeneous transform in SE(2) that comprises the feature position and orientation.support(im.

FUNCTIONS AND CLASSES Methods project plot hold ishold clf figure mesh point line plot camera rpy move centre delete char display project world points plot/return world point on image plane control hold for image plane test figure hold for image plane clear image plane figure holding the image plane draw shape represented as a mesh draw homogeneous points on image plane draw homogeneous lines on image plane draw camera set camera attitude copy of Camera after motion get world coordinate of camera centre object destructor convert camera parameters to string display camera parameters Properties (read/write) npix pp rho T image dimensions in pixels (2 × 1) intrinsic: principal point (2 × 1) intrinsic: pixel dimensions (2 × 1) in metres extrinsic: camera pose as homogeneous transformation Properties (read only) nu nv number of pixels in u-direction number of pixels in v-direction Note • SphericalCamera is a reference object. • SphericalCamera objects can be used in vectors and arrays See also Camera Machine Vision Toolbox for MATLAB R 33 Copyright c Peter Corke 2011 .CHAPTER 2.

FUNCTIONS AND CLASSES SphericalCamera. See also SphericalCamera.CHAPTER 2.plot Machine Vision Toolbox for MATLAB R 34 Copyright c Peter Corke 2011 . Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane. T Name of camera Pixel size: S × S or S(1)xS(2) Pose of the camera as a homogeneous transformation See also Camera. each column is phi (longitude) and theta (colatitude). T ‘Tcam’. S ‘pose’.project(p. T Transform all points by the homogeneous transformation T before projecting them to the camera image plane. N ‘pixel’. CatadioptricCamera SphericalCamera.SphericalCamera Create spherical projection camera object C = SphericalCamera() creates a spherical projection camera with canonic parameters: f=1 and name=’canonic’.project Project world points to image plane pt = C.T. fisheyecamera. The columns of p (3 × N ) are the world points and the columns of pt (2 × N ) are the corresponding spherical projection points. C = CentralCamera(options) as above but with specified parameters. Options ‘Tobj’. options) are the image plane coordinates for the world points p. Overrides the current camera pose C. CentralCamera. Options ‘name’.
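A small sketch of spherical projection; the world points and camera displacement are illustrative values.

  cam = SphericalCamera('name', 'sphere');

  P = [1 2; 1 0; 3 4];                 % two world points, one per column
  pt = cam.project(P);                 % rows are phi (longitude) and theta (colatitude)

  pt2 = cam.project(P, 'Tcam', transl(0.2, 0, 0));   % same points after displacing the camera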

params) Simulate IBVS with for a square target comprising 4 points is placed in the world XY plane. The camera/robot is initially at pose T and is driven to the orgin. camera pose.2) niter eterm lambda ci depth . showing the desired view (*) and the current view (o) 2.center of the target in world coords (0. scalar for If null take actual value all points.01) . from simulation ([]) SEE ALSO: ibvsplot SphericalCamera.the number of iterations to run the simulation (500) . error norm. The camera view.sph2 Implement spherical IBVS for point features results = sph(T) results = sph(T. Two windows are shown and animated: 1.a stopping criteria on feature error norm (0) .gain. image plane size and desired feature locations. The camera/robot is initially at pose T and is driven to the orgin.5) target center . The external view. showing the desired view (*) and the current view (o) Machine Vision Toolbox for MATLAB R 35 Copyright c Peter Corke 2011 . showing the target points and the camera The results structure contains time-history information about the image plane.the side length of the target in world units (0.depth of points to use for Jacobian. The camera view. FUNCTIONS AND CLASSES SphericalCamera.0. Two windows are shown and animated: 1. error. of 4-vector.CHAPTER 2. defaults in parentheses: target size .sph Implement spherical IBVS for point features results = sph(T) results = sph(T. The params structure can be used to override simulation defaults by providing elements. can be scalar or diagonal 6 × 6 matrix (0. params) Simulate IBVS with for a square target comprising 4 points is placed in the world XY plane.camera intrinsic structure (camparam) . Jacobian condition number.

Reference “Spherical image-based visual servo and structure estimation”.visjac p polar. z) is the image Jacobian (2N × 6) for the image plane points pt (2 × N ) described by phi (longitude) and theta (colatitude). See also CentralCamera. FUNCTIONS AND CLASSES 2.depth of points to use for Jacobian.3) niter eterm lambda ci depth . error norm.a stopping criteria on feature error norm (0) . I. The external view.5) target center . scalar for If null take actual value all points. Robotics and Automation. defaults in parentheses: target size . Jacobian condition number.camera intrinsic structure (camparam) . P. CentralCamera.center of the target in world coords (0.visjac p Visual motion Jacobian for point feature J = C.visjac l. 5550-5555. CentralCamera. IEEE Int. pp. (Anchorage). camera pose.0.01) . or a vector (N × 1) for each point. image plane size and desired feature locations. The depth of the points from the camera is given by z which is a scalar. of 4-vector.gain.visjac p(pt.visjac e Machine Vision Toolbox for MATLAB R 36 Copyright c Peter Corke 2011 . Corke. The params structure can be used to override simulation defaults by providing elements. The Jacobian gives the image-plane velocity in terms of camera spatial velocity. for all points.the number of iterations to run the simulation (500) . from simulation ([]) SEE ALSO: ibvsplot SphericalCamera. Conf. can be scalar or diagonal 6 × 6 matrix (0.CHAPTER 2. May 3-7 2010. in Proc. error.the side length of the target in world units (0. showing the target points and the camera The results structure contains time-history information about the image plane.
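A minimal sketch of the spherical point-feature Jacobian; the world point is an assumed value and its depth is taken as its range from the camera.

  cam = SphericalCamera();

  pt = cam.project([1; 1; 2]);         % phi/theta coordinates of a world point
  J  = cam.visjac_p(pt, norm([1 1 2]));  % 2 x 6 image Jacobian at that point's depth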

SurfPointFeature
SURF point corner feature object
A subclass of PointFeature for SURF features.

Methods
plot Plot feature position
plot scale Plot feature scale
distance Descriptor distance
match Match features
ncc Descriptor similarity
uv Return feature coordinate
display Display value
char Convert value to string

Properties
u horizontal coordinate
v vertical coordinate
strength feature strength
scale feature scale
theta feature orientation [rad]
descriptor feature descriptor (vector)
image id index of image containing feature

Properties of a vector of SurfPointFeature objects are returned as a vector. If F is a vector (N × 1) of SurfPointFeature objects then F.u is a 2 × N matrix with each column the corresponding u coordinate.

Notes
• SurfPointFeature is a reference object.
• SurfPointFeature objects can be used in vectors and arrays

Reference
Herbert Bay, Andreas Ess, Tinne Tuytelaars, Luc Van Gool, “SURF: Speeded Up Robust Features”, Computer Vision and Image Understanding (CVIU), Vol. 110, No. 3, pp. 346–359, 2008

See also
isurf, PointFeature, ScalePointFeature, SiftPointFeature

SurfPointFeature.SurfPointFeature
Create a SURF point feature object
f = SurfPointFeature() is a point feature object with null parameters.
f = SurfPointFeature(u, v) is a point feature object with specified coordinates.
f = SurfPointFeature(u, v, strength) as above but with specified strength.

See also
isurf

SurfPointFeature.match
Match SURF point features
m = F.match(f2, options) is a vector of FeatureMatch objects that describe candidate matches between the two vectors of SURF features F and f2. Correspondence is based on descriptor similarity.
[m,C] = F.match(f2, options) as above but returns a correspondence matrix where each row contains the indices of corresponding features in F and f2 respectively.

Options
‘thresh’, T Match threshold (default 0.05)
‘median’ Threshold at the median distance

Notes
• For no threshold set to [].

See also
FeatureMatch
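A minimal sketch of a SURF matching pipeline using the methods above; the image file names are placeholders and the subset size is arbitrary.

Example
im1 = iread('image1.png', 'grey', 'double');
im2 = iread('image2.png', 'grey', 'double');
f1 = isurf(im1);              % vectors of SurfPointFeature objects
f2 = isurf(im2);
m = f1.match(f2);             % vector of FeatureMatch objects
idisp({im1, im2});
ms = m.subset(50);            % a manageable sample of the matches
ms.plot('g');                 % overlay the correspondences on the side-by-side display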

SurfPointFeature.plot scale
Plot feature scale
F.plot scale(options) overlay a marker to indicate feature point position and scale.
F.plot scale(options, ls) as above but the optional line style arguments ls are passed to plot.
If F is a vector then each element is plotted.

Options
‘circle’ Indicate scale by a circle (default)
‘clock’ Indicate scale by circle with one radial line for orientation
‘arrow’ Indicate scale and orientation by an arrow
‘disk’ Indicate scale by a translucent disk
‘color’, C Color of circle or disk (default green)
‘alpha’, A Transparency of disk, 1=opaque, 0=transparent (default 0.2)

SurfPointFeature.support
Support region of feature
out = F.support(im, w) is an image of the support region of the feature F, extracted from the image im in which the feature appears. The support region is scaled to w × w and rotated so that the feature’s orientation axis is upward.
out = F.support(images, w) as above but if the features were extracted from an image sequence images then the feature is extracted from the appropriate image in the same sequence.
[out,T] = F.support(images, w) as above but returns the pose of the feature as a 3 × 3 homogeneous transform in SE(2) that comprises the feature position and orientation.
F.support(im, w) as above but the support region is displayed.

See also
SurfPointFeature

AxisWebCamera
Image from Axis webcam
A concrete subclass of ImageSource that acquires images from a web camera built by Axis Communications (www.axis.com).

Methods
grab Acquire and return the next image
size Size of image
close Close the image source
char Convert the object parameters to human readable string

See also
ImageSource, Video

AxisWebCamera.AxisWebCamera
Axis web camera constructor
a = AxisWebCamera(url, options) is an AxisWebCamera object that acquires images from an Axis Communications (www.axis.com) web camera.

Options
‘uint8’ Return image with uint8 pixels (default)
‘float’ Return image with float pixels
‘double’ Return image with double precision pixels
‘grey’ Return greyscale image
‘gamma’, G Apply gamma correction with gamma=G
‘scale’, S Subsample the image by S in both directions
‘resolution’, S Obtain an image of size S=[W H]

Notes:
• The specified ‘resolution’ must match one that the camera is capable of, otherwise the result is not predictable.
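A hedged sketch of acquiring a frame; the URL is a placeholder for a camera on your own network, and the grab and close methods are described below.

Example
cam = AxisWebCamera('http://192.168.0.10', 'grey', 'resolution', [640 480]);
im = cam.grab();      % fetch the next image
idisp(im);
cam.close();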

and this function will return the most recently captured image held in the camera.char() is a string representing the state of the camera object in human readable form.CHAPTER 2. AxisWebCamera.grab Acquire image from the camera im = A.close Close the image source A.grab() is an image acquired from the web camera. Notes • Some web cameras have a fixed picture taking interval. Machine Vision Toolbox for MATLAB R 41 Copyright c Peter Corke 2011 .close() closes the connection to the web camera.char Convert to string A.display AxisWebCamera. See also AxisWebCamera. FUNCTIONS AND CLASSES AxisWebCamera. BagOfWords Bag of words class The BagOfWords class holds sets of features for a number of images and supports image retrieval by comparing new images with those in the ‘bag’.

Methods
isword Return all features assigned to word
occurrences Return number of occurrences of word
remove stop Remove stop words
wordvector Return word frequency vector
wordfreq Return words and their frequencies
similarity Compare two word bags
contains List the images that contain a word
exemplars Display examples of word support regions
display Display the parameters of the bag of words
char Convert the parameters of the bag of words to a string

Properties
K The number of clusters specified
nstop The number of stop words specified
nimages The number of images in the bag

Reference
J. Sivic and A. Zisserman, “Video Google: a text retrieval approach to object matching in videos”, in Proc. Ninth IEEE Int. Conf. on Computer Vision, pp. 1470-1477, Oct. 2003.

See also
PointFeature

BagOfWords.BagOfWords
Create a BagOfWords object
b = BagOfWords(f, k) is a new bag of words created from the feature vector f and with k words. f can also be a cell array, as produced by ISURF() for an image sequence. The features are sorted into k clusters and each cluster is termed a visual word.
b = BagOfWords(f, b2) is a new bag of words created from the feature vector f but clustered to the words (and stop words) from the existing bag b2.

Notes
• Uses the MEX function vl kmeans to perform clustering (vlfeat.org).
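A minimal bag-of-words sketch; the file pattern is a placeholder, the word and stop-word counts are illustrative, and the remove_stop and wordvector methods are described below.

Example
images = iread('seq/*.png', 'grey', 'double');   % image sequence
sf = isurf(images);                              % cell array of SurfPointFeature vectors
bag = BagOfWords(sf, 2000);                      % cluster descriptors into 2000 visual words
bag.remove_stop(50);                             % drop the 50 most frequent (stop) words
wf = bag.wordvector(1);                          % word frequency vector for the first image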

char Convert to string s = B. images. Notes • This method is invoked implicitly at the command line when the result of an expression is a BagOfWords object and the command has no trailing semicolon. The examples are displayed as a table of thumbR Machine Vision Toolbox for MATLAB 43 Copyright c Peter Corke 2011 .display() displays the parameters of the bag in a compact human readable form. BagOfWords. See also BagOfWords. isurf BagOfWords.char() is a compact string representation of a bag of words. FUNCTIONS AND CLASSES See also PointFeature.exemplars(w.display Display value B.CHAPTER 2.contains(w) is a vector of the indices of images in the sequence that contain one or more instances of the word w.exemplars display exemplars of words B.char BagOfWords. BagOfWords. options) displays examples of the support regions of the words specified by the vector w.contains Find images containing word k = B.

n] = B. BagOfWords.occurrence Word occurrence n = B.remove stop Remove stop words B.isword Features from words f = B. Machine Vision Toolbox for MATLAB R 44 Copyright c Peter Corke 2011 .isword(w) is a vector of feature objects that are assigned to any of the word w.remove stop(n) removes the n most frequent words (the stop words) from the bag.wordfreq Word frequency statistics [w. Options ‘ncolumns’. If w is a vector of words the result is a vector of features assigned to all the words in w. w Number of columns to display (default 10) Maximum number of exemplars to display from any one image (default 2) Width of each thumbnail [pixels] (default 50) BagOfWords. The original sequence of images from which the features were extracted must be provided as images.CHAPTER 2. All remaining words are renumbered so that the word labels are consecutive.occurrence(w) is the number of occurrences of the word w across all features in the bag. N ‘maxperimage’. FUNCTIONS AND CLASSES nail images. M ‘width’. BagOfWords. BagOfWords.wordfreq() is a vector of word labels w and the corresponding elements of n are the number of occurrences of that word.

subclass of Camera.CHAPTER 2. Methods project plot hold ishold clf figure mesh point line plot camera rpy move centre delete char display project world points to image plane plot/return world point on image plane control hold for image plane test figure hold for image plane clear image plane figure holding the image plane draw shape represented as a mesh draw homogeneous points on image plane draw homogeneous lines on image plane draw camera set camera attitude copy of Camera after motion get world coordinate of camera centre object destructor convert camera parameters to string display camera parameters Machine Vision Toolbox for MATLAB R 45 Copyright c Peter Corke 2011 .wordvector(J) is the word frequency vector for the J’th image in the bag.wordvector Word frequency vector wf = B. FUNCTIONS AND CLASSES BagOfWords. The vector is K × 1 and the angle between any two WFVs is an indication of image similarity. Notes • The word vector is expensive to compute so a lazy evaluation is performed on the first call to this function CatadioptricCamera Catadioptric camera class A concrete class for a catadioptric camera.

C = CatadioptricCamera(options) as above but with specified parameters. Camera CatadioptricCamera. FUNCTIONS AND CLASSES Properties (read/write) npix pp rho f p T image dimensions in pixels (2 × 1) intrinsic: principal point (2 × 1) intrinsic: pixel dimensions (2 × 1) [metres] intrinsic: focal length [metres] intrinsic: tangential distortion parameters extrinsic: camera pose as homogeneous transformation Properties (read only) nu nv u0 v0 number of pixels in u-direction number of pixels in v-direction principal point u-coordinate principal point v-coordinate Notes • Camera is a reference object. • Camera objects can be used in vectors and arrays See also CentralCamera.CatadioptricCamera Create central projection camera object C = CatadioptricCamera() creates a central projection camera with canonic parameters: f=1 and name=’canonic’.CHAPTER 2. Machine Vision Toolbox for MATLAB R 46 Copyright c Peter Corke 2011 .

P ‘pixel’. A ‘resolution’. T ‘Tcam’. S ‘centre’. SphericalCamera CatadioptricCamera. 10um pixels. optical axis is z-axis. F ‘default’ ‘projection’. K ‘maxangle’. M ‘k’. ‘equisolid’. See also Camera. T Name of camera Focal length (metres) Default camera parameters: 1024 × 1024. Image plane resolution: N × N or N=[W H].and y-axes respectively. CatadioptricCamera. u. T Transform all points by the homogeneous transformation T before projecting them to the camera image plane. options) are the image plane coordinates for the world points p. N ‘sensor’. fisheyecamera. S ‘noise’. f=8mm. The columns of p (3 × N ) are the world points and the columns of uv (2 × N ) are the corresponding image plane points. See also Camera. Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane.plot Machine Vision Toolbox for MATLAB R 47 Copyright c Peter Corke 2011 . ‘stereographic’ Parameter for the projection model The maximum viewing angle above the horizontal plane.T.project(p.project Project world points to image plane uv = C. ‘sine’. SIGMA ‘pose’. Temporarily overrides the current camera pose C.CHAPTER 2. Options ‘Tobj’.and v-axes parallel to x. N ‘focal’. Standard deviation of additive Gaussian noise added to returned image projections Pose of the camera as a homogeneous transformation Notes • The elevation angle range is from -pi/2 (below the mirror) to maxangle above the horizontal plane. camera at origin. Image sensor size in metres (2 × 1) Principal point (2 × 1) Pixel size: S × S or S=[W H]. FUNCTIONS AND CLASSES Options ‘name’. Catadioptric model: ‘equiangular’ (default).
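A brief sketch of projection with a catadioptric model; the parameter values are illustrative only, and transl (from the companion Robotics Toolbox) is assumed to be available.

Example
cam = CatadioptricCamera('name', 'pano', 'projection', 'equiangular', 'maxangle', pi/4);
P = [0.5 0.5 1; -0.5 0.5 1; -0.5 -0.5 1; 0.5 -0.5 1]';   % world points, one per column
uv = cam.project(P);                                     % image plane points (2 x N)
uv2 = cam.project(P, 'Tobj', transl(0, 0.5, 0));         % points shifted 0.5 along y first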

FeatureMatch
Feature correspondence object
This class represents the correspondence between two PointFeature objects. A vector of FeatureMatch objects can represent the correspondence between sets of points.

Methods
plot Plot corresponding points
show Show summary statistics of corresponding points
ransac Determine inliers and outliers
inlier Return inlier matches
outlier Return outlier matches
subset Return a subset of matches
display Display value of match
char Convert value of match to string

Properties
p1 Point coordinates in view 1 (2 × 1)
p2 Point coordinates in view 2 (2 × 1)
p Point coordinates in view 1 and 2 (4 × 1)
distance Match strength between the points

Properties of a vector of FeatureMatch objects are returned as a vector. If F is a vector (N × 1) of FeatureMatch objects then F.p1 is a 2 × N matrix with each column the corresponding view 1 point coordinate.

Note
• FeatureMatch is a reference object.
• FeatureMatch objects can be used in vectors and arrays
• Operates with all objects derived from PointFeature, such as ScalePointFeature, SurfPointFeature and SiftPointFeature.

See also
PointFeature, SurfPointFeature, SiftPointFeature

FeatureMatch Create a new FeatureMatch object m = FeatureMatch(f1.char Convert to string s = M. s) is a new FeatureMatch object describing a correspondence between point features f1 and f2 with a strength of s. If M is a vector then the string has multiple lines. See also PointFeature. m = FeatureMatch(f1. SurfPointFeature. one per element. FUNCTIONS AND CLASSES FeatureMatch. Notes • This method is invoked implicitly at the command line when the result of an expression is a FeatureMatch object and the command has no trailing semicolon.display Display value M. Notes • Only the coordinates of the PointFeature are kept.char() is a compact string representation of the match object.CHAPTER 2.char Machine Vision Toolbox for MATLAB R 49 Copyright c Peter Corke 2011 . See also FeatureMatch. f2) as above but the strength is set to NaN. If M is a vector then the elements are printed one per line. f2.display() displays a compact human-readable representation of the feature pair. SiftPointFeature FeatureMatch. FeatureMatch.

ransac FeatureMatch.inlier Inlier features m2 = M.outlier Outlier features m2 = M.p Feature point coordinate pairs p = M.CHAPTER 2. Notes • Inliers are not determined until after RANSAC is run. FUNCTIONS AND CLASSES FeatureMatch.inlier.p() is a 4 × N matrix containing the feature point coordinates. FeatureMatch.v1.ransac FeatureMatch.outlier() is a subset of the FeatureMatch vector M that are considered to be outliers.u2.inlier() is a subset of the FeatureMatch vector M that are considered to be inliers. FeatureMatch. See also FeatureMatch.v2]. Notes • Outliers are not determined until after RANSAC is run.outlier. Machine Vision Toolbox for MATLAB R 50 Copyright c Peter Corke 2011 . Each column contains the coordinates of a pair of corresponding points [u1. See also FeatureMatch.

p2() is a 2 × N matrix containing the feature points coordinates from view 1.p FeatureMatch.plot(ls) as above but the optional line style arguments ls are passed to plot.p1.p1() is a 2 × N matrix containing the feature points coordinates from view 1.plot() overlays the correspondences in the FeatureMatch vector M on the current figure. These are the (u. FUNCTIONS AND CLASSES See also FeatureMatch.CHAPTER 2. FeatureMatch. FeatureMatch. for example by: idisp({im1.plot Show corresponding points M. See also FeatureMatch.im2}) m.FeatureMatch. Machine Vision Toolbox for MATLAB R 51 Copyright c Peter Corke 2011 .p FeatureMatch.plot() M.p2 FeatureMatch. FeatureMatch.v) properties of the feature F1 passed to the constructor.p2. These are the (u. The figure must comprise views 1 and 2 side by side.p1 Feature point coordinates from view 1 p = M.p2 Feature point coordinates from view 2 p = M. See also FeatureMatch. FeatureMatch.v) properties of the feature F2 passed to the constructor.p1. FeatureMatch.FeatureMatch.

Notes
• Using IDISP as above adds UserData to the figure, and an error is created if this UserData is not found.

See also
idisp

FeatureMatch.ransac
Apply RANSAC
M.ransac(func, options) applies the RANSAC algorithm to fit the point correspondences to the model described by the function func. The options are passed to the RANSAC() function. Elements of the FeatureMatch vector have their status updated in place to indicate whether they are inliers or outliers.

Example
f1 = isurf(im1);
f2 = isurf(im2);
m = f1.match(f2);
m.ransac( @fmatrix, 1e-4);

See also
fmatrix, homography, ransac

FeatureMatch.show
Display summary statistics of the FeatureMatch vector
M.show() is a compact summary of the FeatureMatch vector M that gives the number of matches, inliers and outliers (and their percentages).
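Continuing the example above, a hedged variation that fits a homography (appropriate when im1 and im2 view a roughly planar scene) and then inspects and plots the inliers.

Example
m.ransac(@homography, 1e-4);   % mark inliers/outliers in place
m.show();                      % number of matches, inliers and outliers
in = m.inlier;
ins = in.subset(40);
idisp({im1, im2});
ins.plot('g');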

CHAPTER 2.subset Subset of matches m2 = M. Methods project plot hold ishold clf figure mesh point line plot camera rpy move centre delete char display project world points to image plane plot/return world point on image plane control hold for image plane test figure hold for image plane clear image plane figure holding the image plane draw shape represented as a mesh draw homogeneous points on image plane draw homogeneous lines on image plane draw camera set camera attitude copy of Camera after motion get world coordinate of camera centre object destructor convert camera parameters to string display camera parameters Machine Vision Toolbox for MATLAB R 53 Copyright c Peter Corke 2011 . that is. Y This camera model assumes central projection. X | | | + (principal point) | | Z-axis is into the page. The camera coordinate system is: 0------------> u. The image is not inverted. FishEyeCamera Fish eye camera class A concrete class a fisheye lense projection camera.subset(n) is a FeatureMatch vector with no more than n elements sampled uniformly from M. FUNCTIONS AND CLASSES FeatureMatch. the focal point is at z=0 and the image plane is at z=f. v.

Options ‘name’. N ‘sensor’. Standard deviation of additive Gaussian noise added to returned image projections Pose of the camera as a homogeneous transformation R Machine Vision Toolbox for MATLAB 54 Copyright c Peter Corke 2011 . P ‘pixel’. ‘equisolid’. camera at origin. T Name of camera Default camera parameters: 1024 × 1024.axes respectively.and v-axes are parallel to x. N ‘default’ ‘projection’. f=8mm.and y.CHAPTER 2. K ‘resolution’. M ‘k’. FUNCTIONS AND CLASSES Properties (read/write) npix pp f rho T image dimensions in pixels (2 × 1) intrinsic: principal point (2 × 1) intrinsic: focal length [metres] intrinsic: pixel dimensions (2 × 1) [metres] extrinsic: camera pose as homogeneous transformation Properties (read only) nu nv number of pixels in u-direction number of pixels in v-direction Notes • Camera is a reference object. u.FishEyeCamera Create fisheyecamera object C = FishEyeCamera() creates a fisheye camera with canonic parameters: f=1 and name=’canonic’. ‘sine’. • Camera objects can be used in vectors and arrays See also Camera FishEyeCamera. Image sensor size [metres] (2 × 1) Principal point (2 × 1) Pixel size: S × S or S=[W H]. optical axis is z-axis. Fisheye model: ‘equiangular’ (default). S ‘centre’. S ‘noise’. ‘stereographic’ Parameter for the projection model Image plane resolution: N × N or N=[W H]. 10um pixels. C = FishEyeCamera(options) as above but with specified parameters. SIGMA ‘pose’.

The columns of p (3 × N ) are the world points and the columns of uv (2 × N ) are the corresponding image plane points. See also FishEyeCamera. options) are the image plane coordinates for the world points p. See also Camera.CHAPTER 2. Set the camera pose to the homogeneous transformation T before projecting points to the camera image plane. T Transform all points by the homogeneous transformation T before projecting them to the camera image plane. T ‘Tcam’. CentralCamera.project(p. For every edge pixel in the input image a set of cells in the Hough accumulator (voting array) are incremented. CatadioptricCamera. Options ‘Tobj’. FUNCTIONS AND CLASSES Notes • If K is not specified it is computed such that the circular imaging region maximally fills the square image plane. Temporarily overrides the current camera pose C.project Project world points to image plane uv = C.plot Hough Hough transform class The Hough transform is a technique for finding lines in an image using a voting scheme. SphericalCamera FishEyeCamera.T. In this version of the Hough transform lines are described by: d = y cos(theta) + x sin(theta) Machine Vision Toolbox for MATLAB R 55 Copyright c Peter Corke 2011 .

and d is the perpendicular distance between (0. with columns corresponding to theta and rows corresponding to offset (d). See also LineFeature Hough. Theta spans the range -pi/2 to pi/2 in Ntheta steps.CHAPTER 2.Hough Create Hough transform object ht = Hough(E. The voting array is 2-dimensional. FUNCTIONS AND CLASSES where theta is the angle the line makes to horizontal axis. a vertical line has theta = pi/2 or -pi/2. A horizontal line has theta = 0. For every pixel in the edge image E (H ×W ) greater than a threshold the corresponding elements of the accumulator are incremented. options) is the Hough transform of the edge image E.0) and the line. Methods plot show lines char display Overlay detected lines Display the Hough accumulator Return line features Convert Hough parameters to string Display Hough parameters Properties Nrho Ntheta A rho theta edgeThresh houghThresh suppress interpWidth Number of bins in rho direction Number of bins in theta direction The Hough accumulator (Nrho x Ntheta) rho values for the centre of each bin vertically Theta values for the centre of each bin horizontally Threshold on relative edge pixel strength Threshold on relative peak strength Radius of accumulator cells cleared around peak Width of region used for peak interpolation Notes • Hough is a reference object.H). Offset is in the range -rho max to rho max where rho max=max(W. By default the vote is incremented by Machine Vision Toolbox for MATLAB R 56 Copyright c Peter Corke 2011 .

CHAPTER 2. W ‘nbins’. Nrho].char() is a compact string representation of the Hough transform parameters.5) Set ht. Hough.houghThresh (default 0.display() displays a compact human-readable string representation of the Hough transform parameters. Options ‘equal’ ‘interpwidth’.edgeThresh (default 0. otherwise the edge pixel value is the vote strength Interpolation width (default 3) Set ht.1). T ‘edgethresh’. N All edge pixels have equal weight. Default 400 × 401. T ‘suppress’. Set ht. Notes • This method is invoked implicitly at the command line when the result of an expression is a Hough object and the command has no trailing semicolon.char Machine Vision Toolbox for MATLAB R 57 Copyright c Peter Corke 2011 . See also Hough.suppress (default 0) Set number of bins.display Display value HT. The threshold is determined from the maximum edge strength value x ht. W ‘houghthresh’. FUNCTIONS AND CLASSES the edge strength but votes can be made equal with the option ‘equal’. Hough.char Convert to string s = HT. if N is scalar set Nrho=Ntheta=N.edgeThresh. else N = [Ntheta.

Hough.lines
Find lines
L = HT.lines() is a vector of LineFeature objects that represent the dominant lines in the Hough accumulator.
L = HT.lines(n) as above but returns no more than n LineFeature objects.
Lines are the coordinates of peaks in the Hough accumulator. The highest peak is found, refined to subpixel precision, then all elements in an HT.suppress radius around it are zeroed so as to eliminate multiple close minima. The process is repeated for all peaks. The peak detection loop breaks early if the remaining peak has a strength less than HT.houghThresh times the maximum vote value.

See also
Hough.plot, LineFeature

Hough.plot
Plot line features
HT.plot() overlays all detected lines on the current figure.
HT.plot(n) overlays a maximum of n strongest lines on the current figure.
HT.plot(n, ls) as above but the optional line style arguments ls are passed to plot.
H = HT.plot() as above but returns a vector of graphics handles for each line.

See also
Hough.lines

Hough.show
Display the Hough accumulator as image
HT.show() displays the Hough vote accumulator as an image using the hot colormap, where ‘heat’ is proportional to the number of votes.
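A minimal end-to-end sketch of line finding; im is assumed to be a greyscale image containing straight edges, and the parameter values are illustrative.

Example
edges = icanny(im);               % edge image
h = Hough(edges, 'suppress', 5);  % build the vote accumulator
h.show();                         % display the accumulator with the hot colormap
L = h.lines(10);                  % up to 10 strongest lines as LineFeature objects
idisp(im);
h.plot(10, 'g');                  % overlay those lines on the image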

PointFeature Machine Vision Toolbox for MATLAB R 59 Copyright c Peter Corke 2011 . Note • LineFeature is a reference object. FUNCTIONS AND CLASSES See also colormap. If L is a vector (N × 1) of LineFeature objects then L. Methods plot seglength display char Plot the line segment Determine length of line segment Display value Convert value to string Properties rho Offset of the line theta Orientation of the line strength Feature strength length Length of the line Properties of a vector of LineFeature objects are returned as a vector. • LineFeature objects can be used in vectors and arrays See also Hough.rho is an N × 1 vector of the rho element of each feature.CHAPTER 2. RegionFeature. hot LineFeature Line feature class This class represents a line feature.

char Machine Vision Toolbox for MATLAB R 60 Copyright c Peter Corke 2011 . L = LineFeature(rho. strength) is a line feature object with the specified properties. length) is a line feature object with the specified properties.display Display value L. L = LineFeature(rho. LineFeature. LineFeature. theta. If L is a vector then the elements are printed one per line. See also LineFeature. strength.char Convert to string s = L. one per element. theta. Notes • This method is invoked implicitly at the command line when the result of an expression is a LineFeature object and the command has no trailing semicolon. LENGTH is undefined.display() displays a compact human-readable representation of the feature.CHAPTER 2. FUNCTIONS AND CLASSES LineFeature.char() is a compact string representation of the line feature. If L is a vector then the string has multiple lines.LineFeature Create a line feature object L = LineFeature() is a line feature object with null parameters. L = LineFeature(l2) is a deep copy of the line feature l2.

seglength Compute length of line segments The Hough transform identifies lines but cannot determine their length.seglength(edge. This method examines the edge pixels in the original image and determines the longest stretch of non-zero pixels along the line. See also icanny LineFeature. FUNCTIONS AND CLASSES LineFeature. L.points(edge) is the set of points that lie along the line in the edge image edge are determined. LineFeature. See also icanny Machine Vision Toolbox for MATLAB R 61 Copyright c Peter Corke 2011 . l2 = L.plot() overlay the line on current plot. less than gap pixels are tolerated. Small gaps.points Return points on line segments p = L.seglength(edge) as above but the maximum allowable gap is 5 pixels.plot Plot line L. l2 = L. gap) is a copy of the line feature object with the property length updated to the length of the line (pixels).plot(ls) as above but the optional line style arguments ls are passed to plot. Notes • If L is a vector then each element is plotted.CHAPTER 2.

Movie
Class to read movie file
A concrete subclass of ImageSource that acquires images from a movie file.

Movie.Movie
Image source constructor
m = Movie(file, options) is a Movie object that returns frames from the movie file file.

Options
‘uint8’ Return image with uint8 pixels (default)
‘float’ Return image with float pixels
‘double’ Return image with double precision pixels
‘grey’ Return greyscale image
‘gamma’, G Apply gamma correction with gamma=G
‘scale’, S Subsample the image by S in both directions
‘skip’, S Read every S’th frame from the movie

Movie.char
Convert to string
M.char() is a string representing the state of the movie object in human readable form.

Movie.close
Close the image source
M.close() closes the connection to the movie.
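A hedged sketch of reading a movie; the file name is a placeholder and the grab method is described below.

Example
mov = Movie('traffic.mpg', 'grey', 'double');
f1 = mov.grab();              % next frame
f5 = mov.grab('skip', 4);     % skip ahead four frames
about(f1)
mov.close();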

eid Graph connectivity is maintained by a labeling algorithm and this is updated every time an edge is added.grab Acquire next frame from movie im = M. planar. undirected graph create an n-d. undirected graph Graphs • are undirected • are symmetric cost edges (A to B is same cost as B to A) • are embedded in coordinate system • have no loops (edges from A to A) • vertices are represented by integer ids. Options ‘skip’.CHAPTER 2. Machine Vision Toolbox for MATLAB R 63 Copyright c Peter Corke 2011 . S ‘frame’. vid • edges are represented by integer ids.grab(options) as above but allows the next frame to be specified. and return current+S frame Return frame F within the movie Notes • If no output argument given the image is displayed using IDISP.grab() acquires the next image from the movie im = M. FUNCTIONS AND CLASSES Movie. PGraph Simple graph class g = PGraph() g = PGraph(n) create a 2D. F Skip frames.

return eid remove all nodes and edges from the graph Information from graph g.CHAPTER 2.goal(v) g.connectivity() g. v2) g.n number of nodes Machine Vision Toolbox for MATLAB R 64 Copyright c Peter Corke 2011 . Object properties (read/write) g.neighbours(v) g. v) g. return vid add vertex and edge to v.coord(v) g.clear() add vertex.path(v) set goal vertex.add node(coord) g.next(v) g.cost(e) g.plot() g.distance(v1. v2) distance between v1 and v2 as the crow flies g.distances(coord) return sorted distances from coord and vertices To change the distance metric create a subclass of PGraph and override the method distance metric(). return vid add edge from v1 to v2. FUNCTIONS AND CLASSES Methods Constructing the graph g.add edge(v1. and plan paths return d of neighbour of v closest to goal return list of nodes from v to goal Graph and world points g.add node(coord.pick() char(g) return vid for edge return cost for edge list return coordinate of node v return vid for edge return component id for vertex return number of edges for all nodes set goal vertex for path planning return vertex id closest to picked point display summary info about the graph Planning paths through the graph g.closest(coord) return vertex closest to coord g.edges(e) g.component(v) g.

add edge(v1. v2) add an edge between nodes with id v1 and v2. options) returns a graph object embedded in d dimensions. v2. PGraph.add node(x.PGraph Graph class constructor g = PGraph(d. FUNCTIONS AND CLASSES PGraph.add node Add a node to the graph v = G. Options ‘distance’. and returns the edge id E. PGraph. and returns the node id v. v) adds a node with coordinate x and connected to node v by an edge.add node(x.CHAPTER 2. Machine Vision Toolbox for MATLAB R 65 Copyright c Peter Corke 2011 . v = G. v = G. C) add an edge between nodes with id v1 and v2 with cost C.add node(x) adds a node with coordinate x.add edge(v1. M ‘verbose’ Use the distance metric M for path planning Specify verbose operation Note • The distance metric is either ‘Euclidean’ or ‘SE2’ which is the sum of the squares of the difference in position and angle modulo 2pi. where x is D × 1. C) adds a node with coordinate x and connected to node v by an edge with cost C.add edge Add an edge to the graph E = G. E = G. v.
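A minimal sketch of building a small 2D graph with these methods; the coordinates and the explicit edge cost are arbitrary.

Example
g = PGraph();                % 2D graph
v1 = g.add_node([1 1]');
v2 = g.add_node([3 1]');
v3 = g.add_node([3 4]');
g.add_edge(v1, v2);          % undirected edges
g.add_edge(v2, v3);
g.add_edge(v1, v3, 10);      % edge with an explicit cost
d = g.distance(v1, v3);      % straight-line distance between the two nodes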

connectivity Graph connectivity C = G. and the distance d.CLOSEST(x) return id of node geometrically closest to coordinate x. [v.clear Clear the graph G. edges and components.char Convert graph to string s = G. of node id v.d] = G. Machine Vision Toolbox for MATLAB R 66 Copyright c Peter Corke 2011 .closest Find closest node v = G. PGraph. PGraph.coord Coordinate of node x = G. PGraph.CLEAR() removes all nodes and edges.CHAPTER 2. D × 1. FUNCTIONS AND CLASSES PGraph.connectivity() returns the total number of edges in the graph.coord(v) return coordinate vector.closest(x) return id of node geometrically closest to coordinate x. PGraph.char() returns a compact human readable representation of the state of the graph including the number of vertices.

CHAPTER 2. FUNCTIONS AND CLASSES

PGraph.cost
Cost of edge
C = G.cost(E) return cost of edge id E.

PGraph.display
Display state of the graph
G.display() displays a compact human readable representation of the state of the graph including the number of vertices, edges and components.

See also
PGraph.char

PGraph.distance
Distance between nodes
d = G.distance(v1, v2) return the geometric distance between the nodes with id v1 and v2.

PGraph.distances
distance to all nodes
d = G.distances(v) returns vector of geometric distance from node id v to every other node (including v) sorted into increasing order by d. [d,w] = G.distances(v) returns vector of geometric distance from node id v to every other node (including v) sorted into increasing order by d where elements of w are the corresponding node id.


PGraph.edges
Find edges given vertex
E = G.edges(v) return the id of all edges from node id v.

PGraph.goal
Set goal node
G.goal(vg) for least-cost path through graph set the goal node. The cost of reaching every node in the graph connected to vg is computed.

See also
PGraph.path

Notes
• Cost is the total distance from the goal.

PGraph.neighbours
Neighbours of a node
n = G.neighbours(v) return a vector of ids for all nodes which are directly connected neighbours of node id v. [n,C] = G.neighbours(v) return a vector n of ids for all nodes which are directly connected neighbours of node id v. The elements of C are the edge costs of the paths to the corresponding node ids in n.

PGraph.next
Find next node toward goal
v = G.next(vs) return the id of a node connected to node id vs that is closer to the goal.

See also
PGraph.goal, PGraph.path


PGraph.path
Find path to goal node
p = G.path(vs) return a vector of node ids that form a path from the starting node vs to the previously specified goal. The path includes the start and goal node id.

See also
PGraph.goal

PGraph.pick
Graphically select a node
v = G.pick() returns the id of the node closest to the point clicked by user on a plot of the graph.

See also
PGraph.plot

PGraph.plot
Plot the graph
G.plot(opt) plot the graph in the current figure. Nodes are shown as colored circles.

Options
‘labels’ Display node id (default false)
‘edges’ Display edges (default true)
‘edgelabels’ Display edge id (default false)
‘MarkerSize’, S Size of node circle
‘MarkerFaceColor’, C Node circle color
‘MarkerEdgeColor’, C Node circle edge color
‘componentcolor’ Node color is a function of graph component
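Continuing the small graph built in the earlier sketch, a hedged example of plotting the graph and planning a path with the methods above.

Example
g.plot('labels');            % draw the graph with node ids
g.goal(v3);                  % set the goal and compute costs to reach it
p = g.path(v1);              % node ids along the path from v1 to the goal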


PGraph.showComponent
Plot the nodes of a graph component
G.showcomponent(C) plots the nodes that belong to graph component C.

PGraph.showVertex
Highlight a vertex
G.showVertex(v) highlights the vertex v with a yellow marker.

PGraph.vertices
Find vertices given edge
v = G.vertices(E) return the id of the nodes that define edge E.

PointFeature
PointCorner feature object
A superclass for image corner features.

Methods
plot Plot feature position
distance Descriptor distance
ncc Descriptor similarity
uv Return feature coordinate
display Display value
char Convert value to string


v. FUNCTIONS AND CLASSES Properties u horizontal coordinate v vertical coordinate strength feature strength descriptor feature descriptor (vector) Properties of a vector of PointFeature objects are returned as a vector. See also ScalePointFeature.PointFeature Create a point feature object f = PointFeature() is a point feature object with null parameters.CHAPTER 2. strength) as above but with specified strength. one per element. If F is a vector then the string has multiple lines. PointFeature. f = PointFeature(u. If F is a vector (N × 1) of PointFeature objects then F.char() is a compact string representation of the point feature.u is a 2 × N matrix with each column the corresponding point coordinate. If F is a vector then the elements are printed one per line. SurfPointFeature.display Display value F.char Convert to string s = F. v) is a point feature object with specified coordinates. f = PointFeature(u. PointFeature. Notes • This method is invoked implicitly at the command line when the result of an expression is a PointFeature object and the command has no trailing semicolon. Machine Vision Toolbox for MATLAB R 71 Copyright c Peter Corke 2011 .display() displays a compact human-readable representation of the feature. SiftPointFeature PointFeature.

distance Distance between feature descriptors d = F. PointFeature.CHAPTER 2. FUNCTIONS AND CLASSES See also PointFeature. Machine Vision Toolbox for MATLAB R 72 Copyright c Peter Corke 2011 . [m.char PointFeature.match(f2.05) Threshold at the median distance See also FeatureMatch PointFeature. options) as above but returns a correspodence matrix where each row contains the indices of corresponding features in F and f2 respectively.C] = F.match Match point features m = F. If F is a vector then D is a vector whose elements are the distance between the corresponding element of F and f1.match(f2. If F is a vector then d is a vector whose elements are the distance between the corresponding element of F and f1.distance(f1) is the distance between feature descriptors.ncc Feature descriptor similarity s = F. where 1 is perfect match. Options ‘thresh’. the norm of the Euclidean distance. options) is a vector of FeatureMatch objects that describe candidate matches between the two vectors of point features F and f2. T ‘median’ match threshold (default 0.ncc(f1) is the similarty between feature descriptors which is a scalar in the interval -1 to 1.
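A hedged sketch of matching two sets of corner features; icorner is assumed to return PointFeature objects with descriptors, and the option values are illustrative.

Example
c1 = icorner(im1, 'nfeat', 200);
c2 = icorner(im2, 'nfeat', 200);
m = c1.match(c2);             % FeatureMatch vector based on descriptor similarity
d = c1(1).distance(c2(1));    % Euclidean distance between two descriptors
s = c1(1).ncc(c2(1));         % normalized cross-correlation similarity, in -1 to 1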

miny maxy] Number of vertices Notes • this is reference class object • Polygon objects can be used in vectors and arrays R Machine Vision Toolbox for MATLAB 73 Copyright c Peter Corke 2011 .CHAPTER 2. F. If F is a vector then each element is plotted. Polygon .General polygon class p = Polygon(vertices).plot() overlay a marker at the feature position. Methods plot area moments centroid perimeter transform inside intersection difference union xor display char Plot polygon Area of polygon Moments of polygon Centroid of polygon Perimter of polygon Transform polygon Test if points are inside polygon Intersection of two polygons Difference of two polygons Union of two polygons Exclusive or of two polygons print the polygon in human readable form convert the polgyon to human readable string Properties vertices extent n List of polygon vertices.plot(ls) as above but the optional line style arguments ls are passed to plot. one per column Bounding box [minx maxx. FUNCTIONS AND CLASSES PointFeature.plot Plot feature F.

mit. union.area Area of polygon a = P. Polygon. Pankratov.edu. and xor are based on code written by: Kirill K. intersection. p = Polygon(C. so use with care.char String representation s = P.area() is the area of the polygon. difference. wh) is a rectangle centred at C with dimensions wh=[WIDTH. http://puddle.centroid() is the centroid of the polygon. Polygon.char() is a compact representation of the polgyon in human readable form. FUNCTIONS AND CLASSES Acknowledgement The methods inside. Polygon.centroid Centroid of polygon x = P. kirill@plume. However the author does not respond to email regarding the licence. one column per vertex.html and require a licence.edu/ glenn/kirill/saga. Polygon. Machine Vision Toolbox for MATLAB R 74 Copyright c Peter Corke 2011 . HEIGHT].Polygon Polygon class constructor p = Polygon(v) is a polygon with vertices given by v.mit.CHAPTER 2.

FUNCTIONS AND CLASSES Polygon.difference Difference of polygons d = P. See also Polygon.display() displays the polygon in a compact human readable form. Polygon. The corresponding elements of in are either true or false. Polygon.inside(p) tests if points given by columns of p are inside the polygon.display Display polygon P.intersect Intersection of polygon with list of polygons i = P. • If the result d is not simply connected or consists of several polygons. Machine Vision Toolbox for MATLAB R 75 Copyright c Peter Corke 2011 . returns coordinates of P.intersect(plist) indicates whether or not the Polygon P intersects with i(j) = 1 if p intersects polylist(j). else 0. resulting vertex list will contain NaNs. Notes • If polygons P and q are not intersecting.inside Test if points are inside polygon in = p.difference(q) is polygon P minus polygon q.char Polygon.CHAPTER 2.

intersection(q) is a Polygon representing the intersection of polygons P and q.moments Moments of polygon a = P.CHAPTER 2. See also mpq poly Polygon.intersect line Intersection of polygon and line segment i = P. each column is [x y]’. • If intersection consist of several disjoint polygons (for non-convex P or q) then vertices of i is the concatenation of the vertices of these polygons. returns empty polygon. FUNCTIONS AND CLASSES Polygon. Polygon. Polygon.perimeter Perimeter of polygon L = P.intersection Intersection of polygons i = P.perimeter() is the perimeter of the polygon.moments(p. Notes • If these polygons are not intersecting. i is an N × 2 matrix with one column per intersection. q) is the pq’th moment of the polygon. Machine Vision Toolbox for MATLAB R 76 Copyright c Peter Corke 2011 .intersect line(L) is the intersection points of a polygon P with the line segment L=[x1 x2. y1 y2].

Polygon.xor Exclusive or of polygons i = P.transform Transformation of polygon vertices p2 = P.plot Plot polygon P.plot() plot the polygon.transform(T) is a new Polygon object whose vertices have been transfored by the 3 × 3 homgoeneous transformation T. returns a polygon with vertices of both polygons separated by NaNs. Machine Vision Toolbox for MATLAB R 77 Copyright c Peter Corke 2011 . FUNCTIONS AND CLASSES Polygon.union(q) is a Polygon representing the union of polygons P and q. Notes • If these polygons are not intersecting. Polygon. • If the result P is not simply connected (such as a polygon with a “hole”) the resulting contour consist of counter. Polygon. returns a polygon with vertices of both polygons separated by NaNs.clockwise “outer boundary” and one or more clock-wise “inner boundaries” around “holes”.union(q) is a Polygon representing the union of polygons P and q.CHAPTER 2. Notes • If these polygons are not intersecting.union Union of polygons i = P. P.plot(ls) as above but pass the arguments ls to plot.

defined by a point on the ray and a direction unit-vector.clockwise “outer boundary” and one or more clock-wise “inner boundaries” around “holes”. Methods intersect closest char display Intersection of ray with plane or ray Closest distance between point and ray Ray parameters as human readable string Display ray parameters in human readable form Properties P0 d A point on the ray (3 × 1) Direction of the ray.CHAPTER 2.Ray3D Ray constructor R = Ray3D(p0. unit vector (3 × 1) Notes • Ray3D objects can be used in vectors and arrays Ray3D. Machine Vision Toolbox for MATLAB R 78 Copyright c Peter Corke 2011 . FUNCTIONS AND CLASSES • If the result P is not simply connected (such as a polygon with a “hole”) the resulting contour consist of counter. d) is a new Ray3D object defined by a point on the ray p0 and a direction vector d. Ray3D Ray in 3D space This object represents a ray in 3D space.

If R is a vector then the string has multiple lines.char() is a compact string representation of the Ray3D’s value.E] = R.display() displays a compact human-readable representation of the Ray3D’s value. Notes • This method is invoked implicitly at the command line when the result of an expression is a Ray3D object and the command has no trailing semicolon.intersect(r2) is the point on R that is closest to the ray r2. corresponding to the intersection of R(i) with r2. Ray3D.char Ray3D. See also Ray3D.intersect Intersetion of ray with line or plane x = R.intersect(r2) as above but also returns the closest distance between the rays. Ray3D. [x.CHAPTER 2. Machine Vision Toolbox for MATLAB R 79 Copyright c Peter Corke 2011 .closest(p) as above but also returns the distance E between x and p. one per element. [x.E] = R.closest(p) is the point on the ray R closest to the point p. If R is a vector then then x has multiple columns. If R is a vector then the elements are printed one per line.display Display value R.closest Closest distance between point and ray x = R. FUNCTIONS AND CLASSES Ray3D.char Convert to string s = R.
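A brief sketch of these ray operations; the two rays below intersect at (1,0,1) and the plane is z=2.

Example
r1 = Ray3D([0 0 1]', [1 0 0]');   % through (0,0,1) along the x-axis
r2 = Ray3D([1 0 0]', [0 0 1]');   % through (1,0,0) along the z-axis
[x, e] = r1.intersect(r2);        % x = [1 0 1]', e = 0 since the rays meet
p = r1.closest([1 2 1]');         % closest point on r1 to (1,2,1), also [1 0 1]'
x2 = r2.intersect([0 0 1 -2]');   % intersection with the plane 0x + 0y + 1z - 2 = 0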

Methods boundary box plot plot boundary plot box plot ellipse display char Return the boundary as a list Return the bounding box Plot the centroid Plot the boundary Plot the bounding box Plot the equivalent ellipse Display value Convert value to string Properties uc vc umin umax vmin vmax area class label children edgepoint edge perimeter touch a b theta shape circularity moments centroid. minimum horizontal coordinate bounding box.CHAPTER 2. maximum vertical coordinate the number of pixels the value of the pixels forming this region the label assigned to this region a list of indices of features that are children of this feature coordinate of a point on the perimeter a list of edge points 2 × N matrix number of edge pixels true if region touches edge of the image major axis length of equivalent ellipse minor axis length of equivalent ellipse angle of major ellipse axis to horizontal axis aspect ratio b/a (always <= 1. less for other shapes a structure containing moments of order 0 to 2 Machine Vision Toolbox for MATLAB R 80 Copyright c Peter Corke 2011 . vertical coordinate bounding box. minimum vertical coordinate bounding box.c. FUNCTIONS AND CLASSES x = R. maximum horizontal coordinate bounding box.b.intersect(p) returns the point of intersection between the ray R and the plane p=(a. RegionFeature Region feature class This class represents a region feature.d) where aX + bY + cZ + d = 0. corresponding to the intersection of R(i) with p. If R is a vector then x has multiple columns. horizontal coordinate centroid.0) 1 for a circle.

ymin.boundary Boundary in polar form [d. • RegionFeature objects can be used in vectors and arrays • This class behaves differently to LineFeature and PointFeature when getting properties of a vector of RegionFeature objects. RegionFeature. Machine Vision Toolbox for MATLAB R 81 Copyright c Peter Corke 2011 .uc will be a list not a vector. These vectors have 400 elements irrespective of region size. RegionFeature. one per element.box Return bounding box b = R.box() is the bounding box in standard Toolbox form [xmin. FUNCTIONS AND CLASSES Note • RegionFeature is a reference object.xmax.char() is a compact string representation of the region feature. d(i) and th(i) are the distance to the boundary point and the angle respectively.boundary() is a polar representation of the boundary with respect to the centroid.CHAPTER 2. imoments RegionFeature.RegionFeature Create a region feature object R = RegionFeature() is a region feature object with null parameters. RegionFeature. See also iblobs.th] = R. If R is a vector then the string has multiple lines. For example R.char Convert to string s = R. ymax].

It is indicated with overlaid o. See also RegionFeature.plot boundary(ls) as above but the optional line style arguments ls are passed to plot.display() is a compact string representation of the region feature.plot Plot centroid R.display Display value R. Notes • this method is invoked implicitly at the command line when the result of an expression is a RegionFeature object and the command has no trailing semicolon. FUNCTIONS AND CLASSES RegionFeature. R.plot() overlay the centroid on current plot. RegionFeature. Machine Vision Toolbox for MATLAB R 82 Copyright c Peter Corke 2011 . Notes • If R is a vector then each element is plotted.and xmarkers. R. If R is a vector then the elements are printed one per line.plot boundary plot boundary R.plot boundary() overlay perimeter points on current plot.plot(ls) as above but the optional line style arguments ls are passed to plot. If R is a vector then each element is plotted.CHAPTER 2.char RegionFeature.

plot box() overlay the the bounding box of the region on current plot. R. R. If R is a vector then each element is plotted. If R is a vector then each element is plotted. ScalePointFeature ScalePointCorner feature object A subclass of PointFeature for features with scale.plot box(ls) as above but the optional line style arguments ls are passed to plot.CHAPTER 2. Methods plot plot scale distance ncc uv display char Plot feature position Plot feature scale Descriptor distance Descriptor similarity Return feature coordinate Display value Convert value to string Machine Vision Toolbox for MATLAB R 83 Copyright c Peter Corke 2011 .plot ellipse() overlay the the equivalent ellipse of the region on current plot. FUNCTIONS AND CLASSES See also boundmatch RegionFeature.plot box Plot bounding box R. RegionFeature.plot ellipse(ls) as above but the optional line style arguments ls are passed to plot.plot ellipse Plot equivalent ellipse R.

A Indicate scale by a circle (default) Indicate scale by a translucent disk Color of circle or disk (default green) Transparency of disk. 1=opaque. SurfPointFeature. f = ScalePointFeature(u. strength.plot scale(options) overlay a marker at the feature position.2) Machine Vision Toolbox for MATLAB R 84 Copyright c Peter Corke 2011 . f = ScalePointFeature(u.CHAPTER 2. See also PointFeature.u is a 2 × N matrix with each column the corresponding point coordinate. Options ‘circle’ ‘disk’ ‘color’. SiftPointFeature ScalePointFeature. scale) as above but with specified feature scale. v. F. ls) as above but the optional line style arguments ls are passed to plot. If F is a vector then each element is plotted. C ‘alpha’.plot scale Plot feature scale F. f = ScalePointFeature(u. strength) as above but with specified strength. FUNCTIONS AND CLASSES Properties u horizontal coordinate v vertical coordinate strength feature strength scale feature scale descriptor feature descriptor (vector) Properties of a vector of ScalePointFeature objects are returned as a vector. v) is a point feature object with specified coordinates. 0=transparent (default 0. v. ScalePointFeature. If F is a vector (N × 1) of ScalePointFeature objects then F.plot scale(options.ScalePointFeature Create a scale point feature object f = ScalePointFeature() is a point feature object with null parameters.

During operation the image sequence is animated and the point features are overlaid along with annotation giving the unique identifier of the track. Machine Vision Toolbox for MATLAB R 85 Copyright c Peter Corke 2011 . M Search radius for feature in next frame (default 20) Maximum number of tracks (default 800) Similarity threshold (default 0.8) Write the frames as images into the folder M as with sequential filenames. The elements of the cell array are the point features for the corresponding element of the image sequence. T ‘movie’. N ‘thresh’. Methods plot tracklengths Plot all tracks Length of all tracks Properties track history A vector of structures. Options ‘radius’.Tracker Create new Tracker object T = Tracker(im. options) is a new tracker object. one per active track. FUNCTIONS AND CLASSES Tracker Track points in image sequence This class assigns each new feature a unique identifier and tracks it from frame to frame until it is lost. A vector of track history structures with elements id and uv which is the path of the feature. R ‘nslots’.CHAPTER 2. A complete history of all tracks is maintained. See also PointFeature Tracker. C. im (HxWxS) is an image sequence and C (S × 1) is a cell array of vectors of PointFeature subclass objects.
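A hedged sketch of tracking corner features through a short sequence; the file pattern and the icorner and Tracker parameters are placeholders.

Example
im = iread('seq/*.png', 'grey', 'double');       % H x W x S image sequence
c = cell(size(im, 3), 1);
for k = 1:size(im, 3)
    c{k} = icorner(im(:,:,k), 'nfeat', 200);     % PointFeature vector per frame
end
tracker = Tracker(im, c, 'radius', 15, 'nslots', 500);
tracker.plot();                                  % overlay the track history
len = tracker.tracklengths();                    % length of every track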

Tracker. Tracker.plot Show feature trajectories T.plot() overlays the tracks of all features on the current plot. Machine Vision Toolbox for MATLAB R 86 Copyright c Peter Corke 2011 . FUNCTIONS AND CLASSES See also PointFeature Tracker.tracklengths() is a vector containing the length of every track.display Display value T.char Tracker.char Convert to string s = T.CHAPTER 2.char() is a compact string representation of the Tracker parameters and status.tracklengths Length of all tracks T.display() displays a compact human-readable string representation of the Tracker object Notes • This method is invoked implicitly at the command line when the result of an expression is a Tracker object and the command has no trailing semicolon. See also Tracker.

CHAPTER 2. Methods grab size close char Aquire and return the next image Size of image Close the image source Convert the object parameters to human readable string See also ImageSource. AxisWebCamera. S ‘resolution’. If camera is ‘?’ a list of available cameras. • The specified ‘resolution’ must match one that the camera is capable of. options) is a Video object that acquires images from the local video camera specified by the string camera.Video Video camera constructor v = Video(camera. otherwise the result is not predictable. FUNCTIONS AND CLASSES Video Class to read from local video camera A concrete subclass of ImageSource that acquires images from a local camera. S Notes: Return image with uint8 pixels (default) Return image with float pixels Return image with double precision pixels Return greyscale image Apply gamma correction with gamma=G Subsample the image by S in both directions. Movie Video. Options ‘uint8’ ‘float’ ‘double’ ‘grey’ ‘gamma’. Machine Vision Toolbox for MATLAB R 87 Copyright c Peter Corke 2011 . and their characteristics is displayed. Obtain an image of size S=[W H]. G ‘scale’.
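A hedged sketch of grabbing from a local camera; the camera name is a placeholder that depends on your hardware (use ‘?’ to list what is available).

Example
Video('?')                                      % list available cameras
v = Video('my camera', 'grey', 'resolution', [640 480]);
im = v.grab();                                  % blocks until the next frame arrives
idisp(im);
v.close();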

FUNCTIONS AND CLASSES Video.char() is a string representing the state of the camera object in human readable form.grab() acquires an image from the camera.grab Acquire image from the camera im = V. about x as above but this is the command rather than functional form See also whos Machine Vision Toolbox for MATLAB R 88 Copyright c Peter Corke 2011 . Video. Notes • the function will block until the next frame is acquired. about Compact display of variable type about(x) displays a compact line that describes the class and dimensions of x.char Convert to string V. Video.close() closes the connection to the camera.CHAPTER 2.close Close the image source V.

disp) as above but allows for disparity correction. right.CHAPTER 2. FUNCTIONS AND CLASSES anaglyph Convert stereo images to an anaglyph image a = anaglyph(left. color) as above but the string color describes the color coding as a string with 2 letters. Machine Vision Toolbox for MATLAB R 89 Copyright c Peter Corke 2011 . d = angdiff(th) returns the equivalent angle to th in the interval [-pi pi). useful if the images were captured with a non-human stereo baseline or field of view. if negative it is reduced. Return the equivalent angle in the interval [-pi pi). If disp is positive the disparity is increased. and th2 a scalar then return a column vector where th2 is modulo subtracted from the corresponding elements of th1. Use this option to make the images more natural/comfortable to view. right. the second for right. The result is in the interval [-pi pi). These adjustments are achieved by trimming the images. the first for left. and each is one of: ‘r’ red ‘g’ green ‘b’ green ‘c’ cyan ‘m’ magenta a = anaglyph(left. right) is an anaglyph image where the two images of a stereo pair are combined into a single image by coding them in two different colors. If th1 is a column vector. color. and the right image is cyan. See also stdisp angdiff Difference of two angles d = angdiff(th1. By default the left image is red. a = anaglyph(left. th2) returns the difference between angles th1 and th2 on the circle.
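Two short sketches of the functions above; L and R are assumed to be a rectified stereo image pair.

Example
a = anaglyph(L, R);              % red/cyan anaglyph (default coding)
a2 = anaglyph(L, R, 'rc', 10);   % same coding with a 10 pixel disparity shift
idisp(a);
d = angdiff(0.1, 2*pi - 0.1);    % = 0.2, wrapped onto [-pi pi)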

then E is a column vector whose elements correspond to to those in lambda.y1) to (x2. y1. xcorr bresenham Generate a line p = bresenham(x1. Machine Vision Toolbox for MATLAB R 90 Copyright c Peter Corke 2011 . x is also N × 1 and is a correlation whose peak indicates the relative orientation of one profile with respect to the other.y2].y2). See also RegionFeature. FUNCTIONS AND CLASSES blackbody Compute blackbody emission spectrum E = blackbody(lambda. For example: l = [380:10:700]’*1e-9.CHAPTER 2. T) is the blackbody radiation power density [W/m3 ] at the wavelength lambda [m] and temperature T [K]. p2) as above but p1=[x1. % emission of sun plot(l. x2. r2) is the correlation of the two boundary profiles R1 and r2. y2) is a list of integer coordinates for points lying on the line segement (x1. r2) as above but also returns the relative scale s which is the size of object 2 with respect to object 1. % visible spectrum e = blackbody(l. Each is an N × 1 vector of distances from the centroid of an object to points on its perimeter at equal angular increments.s] = boundmatch(R1.boundary. If lambda is a column vector.y1] and p2=[x2. [x. e) boundmatch Match boundary profiles x = boundmatch(R1. p = bresenham(p1. Endpoints must be integer. 6500).

camcald
Camera calibration from data points

C = camcald(d) is the camera matrix (3 × 4) determined by least squares from corresponding world and image-plane points. d is a table of points with rows of the form [X Y Z U V] where (X,Y,Z) is the coordinate of a world point and [U,V] is the corresponding image plane coordinate.

[C,E] = camcald(d) as above but E is the maximum residual error after back substitution [pixels].

Notes
• This method cannot handle lens distortion.

See also CentralCamera

ccdresponse
CCD spectral response

R = ccdresponse(lambda) is the spectral response of a typical silicon imaging sensor at the wavelength lambda. The response is normalized in the range 0 to 1. If lambda is a vector then R is a vector of the same length whose elements are the response at the corresponding element of lambda.

Reference
An ancient Fairchild data book for a sensor with no IR filter fitted.

See also rluminos
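For example, the response can be plotted over the visible spectrum, reusing the wavelength vector from the blackbody example above:

    l = [380:10:700]'*1e-9;    % visible spectrum wavelengths [m]
    r = ccdresponse(l);        % normalized sensor response
    plot(l, r)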

cie_primaries
Define CIE primary colors

p = cie_primaries() is a 3-vector with the wavelengths [m] of the CIE 1976 red, green and blue primaries respectively.

See also rluminos

circle
Compute points on a circle

circle(C, R, opt) plot a circle centred at C with radius R.

x = circle(C, R, opt) return an N × 2 matrix whose rows define the coordinates [x,y] of points around the circumference of a circle centred at C and of radius R. C is normally 2 × 1 but if 3 × 1 then the circle is embedded in 3D, and x is N × 3, but the circle is always in the xy-plane with a z-coordinate of C(3).

Options
‘n’, N   Specify the number of points (default 50)

closest
Find closest points in N-dimensional space

k = closest(a, b) is the correspondence for N-dimensional point sets a (N × NA) and b (N × NB). k (1 × NA) is such that the element J = k(I), that is, the I’th column of a is closest to the Jth column of b.

[k,d1] = closest(a, b) as above and d1(I)=|a(I)-b(J)| is the distance of the closest point.

[k,d1,d2] = closest(a, b) as above but also returns the distance to the second closest point.

Notes
• Is a MEX file.

See also distance

cmfrgb
Color matching function

The color matching function is the tristimulus required to match a particular wavelength excitation.

rgb = cmfrgb(lambda) is the CIE color matching function (N × 3) for illumination at wavelength lambda (N × 1) [m]. If lambda is a vector then each row of rgb is the color matching function of the corresponding element of lambda.

rgb = cmfrgb(lambda, E) is the CIE color matching (1 × 3) function for an illumination spectrum defined by intensity E (N × 1) and wavelength lambda (N × 1) [m].

Notes
• Data from http://cvrl.ioo.ucl.ac.uk
• The Stiles & Burch 2-deg CMFs are based on measurements made on 10 observers. The data are referred to as pilot data, but probably represent the best estimate of the 2 deg CMFs, since, unlike the CIE 2 deg functions (which were reconstructed from chromaticity data), they were measured directly.
• From Table I(5.5.3) of Wyszecki & Stiles (1982). (Table 1(5.5.3) of Wyszecki & Stiles (1982) gives the Stiles & Burch functions in 250 cm-1 steps, while Table I(5.5.3) of Wyszecki & Stiles (1982) gives them in interpolated 1 nm steps.)
• These CMFs differ slightly from those of Stiles & Burch (1955). As noted in footnote a on p. 335 of Table 1(5.5.3) of Wyszecki & Stiles (1982), the CMFs have been “corrected in accordance with instructions given by Stiles & Burch (1959)” and renormalized to primaries at 15500 (645.16), 19000 (526.32), and 22500 (444.44) cm-1

R Machine Vision Toolbox for MATLAB 94 Copyright c Peter Corke 2011 . xyz = cmfxyz(lambda. Notes • The number of rows in pix must match the product of the elements of imsize. ccxyz col2im Convert pixel vector to image out = col2im(pix. imsize is a 2-vector (N.uk See also cmfrgb. E) is the CIE xyz color matching (1 × 3) function for an illumination spectrum defined by intensity E (N × 1) and wavelength lambda (N × 1) [m]. out = col2im(pix. FUNCTIONS AND CLASSES See also ccxyz cmfxyz matching function The color matching function is the XYZ tristimulus required to match a particular wavelength excitation.ucl.ioo. imsize) is an image (HxWxP) comprising the pixel values in pix (N × P ) with one row per pixel where N=H × W . Note • CIE 1931 2-deg xyz CMFs from cvrl. If lambda is a vector then each row of xyz is the color matching function of the corresponding element of lambda.CHAPTER 2.ac.M). im) as above but the dimensions of out are the same as im. xyz = cmfxyz(lambda) is the CIE xyz color matching function (N ×3) for illumination at wavelength lambda (N × 1) [m].

See also im2col

colnorm
Column-wise norm of a matrix

cn = colnorm(a) returns an M × 1 vector of the norms of each column of the matrix a which is N × M.

colordistance
Colorspace distance

d = colordistance(im, rg) is the Euclidean distance on the rg-chromaticity plane from coordinate rg=[r,g] to every pixel in the color image im. d is an image with the same dimensions as im and the value of each pixel is the color space distance of the corresponding pixel in im.

Notes
• The output image could be thresholded to determine color similarity.
• Note that Euclidean distance in the rg-chromaticity space does not correspond well with human perception of color differences. Perceptually uniform spaces such as Lab remedy this problem.

See also colorspace
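A usage sketch showing the thresholding idea mentioned in the notes; the image filename and the chromaticity coordinate are illustrative:

    im = iread('flowers1.png', 'double');   % a color image (filename illustrative)
    d = colordistance(im, [0.6 0.3]);       % distance from a reddish rg-chromaticity
    idisp(d < 0.1)                          % pixels of similar color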

colorize
Colorize a greyscale image

out = colorize(im, mask, color) is a color image where each pixel in out is set to the corresponding element of the greyscale image im or a specified color according to whether the corresponding value of mask is true or false respectively. The color is specified as a 3-vector (R,G,B).

out = colorize(im, func, color) as above but the mask is the return value of the function handle func applied to the image im, and returns a per-pixel logical result, eg. @isnan.

Examples
Display image with values < 100 in blue
    out = colorize(im, im<100, [0 0 1])
Display image with NaN values shown in red
    out = colorize(im, @isnan, [1 0 0])

Notes
• With no output arguments the image is displayed.

See also imono, icolor, ipixswitch

colorkmeans
Color image segmentation by clustering

L = colorkmeans(im, k, options) is a segmentation of the color image im into k classes. The label image L has the same row and column dimension as im and each pixel has a value in the range 0 to k-1 which indicates which cluster the corresponding pixel belongs to. A k-means clustering of the chromaticity of all input pixels is performed.

[L,C] = colorkmeans(im, k) as above but also returns the cluster centres C (k × 2) where the I’th row is the rg-chromaticity of the I’th cluster and corresponds to the label I. A k-means clustering of the chromaticity of all input pixels is performed.

[L,R] = colorkmeans(im, k) as above but also returns the residual R, the root mean square error of all pixel chromaticities with respect to their cluster centre.

L = colorkmeans(im, C) is a segmentation of the color image im into k classes which are defined by the cluster centres C (k × 2) in chromaticity space. Pixels are assigned to the closest (Euclidean) centre. Since cluster centres are provided the k-means segmentation step is not required.

Options
Various options are possible to choose the initial cluster centres for k-means:
‘random’  randomly choose k points from the set of input chromaticities
‘spread’  randomly choose k values within the rectangle spanned by the input chromaticities
‘pick’    interactively pick cluster centres

Notes
• The k-means clustering algorithm used in the first three forms is computationally expensive and time consuming.
• Clustering is performed in rg-chromaticity space.
• The residual is an indication of quality of fit, low is good.

colorname
Map between color names and RGB values

rgb = colorname(name) is the rgb-tristimulus value corresponding to the color specified by the string name.

XYZ = colorname(name, ‘xy’) is the XYZ-tristimulus value corresponding to the color specified by the string name.

name = colorname(rgb) is a string giving the name of the color that is closest (Euclidean) to the given rgb-tristimulus value.

name = colorname(XYZ, ‘xy’) is a string giving the name of the color that is closest (Euclidean) to the given XYZ-tristimulus value.

Notes
• Color name may contain a wildcard, eg. “?burnt”
• Based on the standard X11 color database rgb.txt.
• Tristimulus values are in the range 0 to 1
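A short sketch of the clustering workflow described above for colorkmeans; the image filename and the number of classes are illustrative:

    im = iread('flowers1.png', 'double');   % a color image (filename illustrative)
    [L, C] = colorkmeans(im, 4);            % segment into 4 chromaticity clusters
    idisp(L)                                % display the label image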

colorspace
Color space conversion of image

out = colorspace(s, im) converts the image im to a different color space according to the string s which specifies the source and destination color spaces, s = ‘dest<-src’, or alternatively, s = ‘src->dest’. Input and output images have 3 planes.

[o1,o2,o3] = colorspace(s, im) as above but specifies separate output channels or planes.

colorspace(s, i1,i2,i3) as above but specifies separate input channels.

Supported color spaces are:

‘RGB’              R’G’B’ Red Green Blue (ITU-R BT.709 gamma-corrected)
‘YPbPr’            Luma (ITU-R BT.601) + Chroma
‘YCbCr’/’YCC’      Luma + Chroma (“digitized” version of Y’PbPr)
‘YUV’              NTSC PAL Y’UV Luma + Chroma
‘YIQ’              NTSC Y’IQ Luma + Chroma
‘YDbDr’            SECAM Y’DbDr Luma + Chroma
‘JPEGYCbCr’        JPEG-Y’CbCr Luma + Chroma
‘HSV’/’HSB’        Hue Saturation Value/Brightness
‘HSL’/’HLS’/’HSI’  Hue Saturation Luminance/Intensity
‘XYZ’              CIE XYZ
‘Lab’              CIE L*a*b* (CIELAB)
‘Luv’              CIE L*u*v* (CIELUV)
‘Lch’              CIE L*ch (CIELCH)

Notes
• RGB input is assumed to be gamma encoded
• RGB output is gamma encoded
• All conversions assume 2 degree observer and D65 illuminant.
• Color space names are case insensitive.
• When R’G’B’ is the source or destination, it can be omitted. For example ‘yuv<’ is short for ‘yuv<-rgb’.
• MATLAB uses two standard data formats for R’G’B’: double data with intensities in the range 0 to 1, and uint8 data with integer-valued intensities from 0 to 255. As MATLAB’s native datatype, double data is the natural choice, and

the R’G’B’ format used by colorspace. However, for memory and computational performance, some functions also operate with uint8 R’G’B’. Given uint8 R’G’B’ color data, colorspace will first cast it to double R’G’B’ before processing.
• If im is an M × 3 array, like a colormap, out will also have size M × 3.

Author
Pascal Getreuer 2005-2006

diff2
diff2(v) compute 2-point difference for each point in the vector v.

distance
Euclidean distances between sets of points

d = distance(a,b) is the Euclidean distances between L-dimensional points described by the matrices a (L × M) and b (L × N) respectively. The distance d is M × N and element d(I,J) is the distance between points a(I) and b(J).

Example
    A = rand(400,100); B = rand(400,200);
    d = distance(A,B);

Notes
• This is fully vectorized (VERY FAST!)
• It computes the Euclidean distance between two vectors by:
  ||A-B|| = sqrt ( ||A||^2 + ||B||^2 - 2*A.B )

Author
Roland Bunschoten, University of Amsterdam, Intelligent Autonomous Systems (IAS) group, Kruislaan 403, 1098 SJ Amsterdam, tel. (+31)20-5257524, bunschot@wins.uva.nl. Last Rev: Oct 29 16:35:48 MET DST 1999. Tested: PC Matlab v5.2 and Solaris Matlab v5.3. Thanx: Nikos Vlassis.

See also closest

e2h
Euclidean to homogeneous

edgelist
Return list of edge pixels for region

E = edgelist(im, seed) return the list of edge pixels of a region in the image im starting at edge coordinate seed (i,j). The result E is a matrix, each row is one edge point coordinate (x,y).

E = edgelist(im, seed, direction) returns the list of edge pixels as above, but the direction of edge following is specified. direction == 0 (default) means clockwise, non zero is counter-clockwise. Note that direction is with respect to y-axis upward, in matrix coordinate frame, not image frame.

Notes
• im is a binary image where 0 is assumed to be background, non-zero is an object.
• seed must be a point on the edge of the region.
• The seed point is always the first element of the returned edgelist.

See also ilabel

epidist
Distance of point from epipolar line

d = epidist(f, p1, p2) is the distance of the points p2 (2 × M) from the epipolar lines due to points p1 (2 × N) where f (3 × 3) is a fundamental matrix relating the views containing image points p1 and p2.

d (N × M) is the distance matrix where element d(i,j) is the distance from the point p2(j) to the epipolar line due to point p1(i).

Author
Based on fmatrix code by Nuno Alexandre Cid Martins, I.S.R., Coimbra, Oct 27, 1998.

See also epiline, fmatrix

epiline
Draw epipolar lines

epiline(f, p) draws epipolar lines in current figure based on points p (2 × N) and the fundamental matrix f (3 × 3). Points are specified by the columns of p.

epiline(f, p, ls) as above but draw lines using the line style arguments ls.

H = epiline(f, p, ls) as above but return a vector of graphic handles, one per line drawn.

See also fmatrix, epidist
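A sketch of how these functions are typically used together, assuming p1 and p2 are corresponding point sets (2 × N) from two views; fmatrix is described below:

    f = fmatrix(p1, p2);        % estimate the fundamental matrix from correspondences
    epiline(f, p1, 'r');        % draw the epipolar lines due to p1 in the current figure
    d = epidist(f, p1, p2);     % point-to-epipolar-line distances, useful for checking outliers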

The University of Western Australia. homography. p2. http://www. Reference Hartley and Zisserman. that is. School of Computer Science & Software Engineering.edu. epidist gauss2d kernel k = gauss2d(im. Notes • The points must be corresponding. k is (2W+1) x (2W+1). See also ransac. options) is the fundamental matrix (3 × 3) that relates two sets of corresponding points p1 (2 × N ) and p2 (2 × N ) from two different camera views. that is. Author Based on fundamental matrix code by Peter Kovesi.CHAPTER 2. sigma) Returns a unit volume Gaussian smoothing kernel. • Contains a RANSAC driver. no outlier rejection is performed. If w is not specified it defaults to 2*sigma.uwa. page 270. it is singular. which means it can be passed to ransac(). c. epiline.au/. FUNCTIONS AND CLASSES fmatrix Estimate fundamental matrix f = fmatrix(p1.csse. ‘Multiple View Geometry in Computer Vision’. and the convolution kernel has a half size of w. • f is a rank 2 matrix. Machine Vision Toolbox for MATLAB 102 R Copyright c Peter Corke 2011 . The Gaussian has a standard deviation of sigma.

itriplepoint. FUNCTIONS AND CLASSES h2e Homogeneous to Euclidean hitormiss Hit or miss transform H = hitormiss(im.y1) and (x2.CHAPTER 2. Homogeneous points X (3 × 1) on the line must satisfy L’*X = 0. y1. 1 and don’t care (represented by NaN).y2). ithin. y2) returns a 3 × 1 vectors which describes a line in homogeneous form that contains the two Euclidean points (x1. iendpoint homline Homogeneous line from two points L = homline(x1. Unlike standard morphological operations S has three possible values: 0. See also plot homline Machine Vision Toolbox for MATLAB 103 R Copyright c Peter Corke 2011 . See also imorph. se) is the hit-or-miss transform of the binary image im with the structuring element se. x2.

School of Computer Science & Software Engineering. fmatrix homtrans Apply a homogeneous transformation p2 = homtrans(T.csse.uwa. • If T is in SE(2) (3 × 3) and – p is 2 × N (2D points) they are considered Euclidean (R2 ) – p is 3 × N (2D points) they are considered projective (p2 ) • If T is in SE(3) (4 × 4) and – p is 3 × N (3D points) they are considered Euclidean (R3 ) – p is 4 × N (3D points) they are considered projective (p3 ) Machine Vision Toolbox for MATLAB 104 R Copyright c Peter Corke 2011 .edu. Author Based on homography code by Peter Kovesi. http://www. • The points must be projections of points lying on a world plane • Contains a RANSAC driver. no outlier rejection is performed. invhomog.CHAPTER 2. which means it can be passed to ransac(). See also ransac. p) applies homogeneous transformation T to the points stored columnwise in p. Notes • The points must be corresponding. p2) is the homography (3 × 3) that relates two sets of corresponding points p1 (2×N ) and p2 (2×N ) from two different camera views of a planar object.au/. FUNCTIONS AND CLASSES homography Estimate homography H = homography(p1. The University of Western Australia.

See also homography. ie.CHAPTER 2. set unmapped pixels to this value output image contains the specified ROI in the input image scale the output by this factor ensure output image is D × D size of output image S=[W. If T1 is a 3-dimensional transformation then T is applied to each plane as defined by the first two dimensions. that is tp=T*T1. options) as above but offs is the offset of the warped tile out with respect to the origin of im.H] Notes • The edges of the resulting output image will in general not be be vertical and horizontal lines. interp2 Machine Vision Toolbox for MATLAB 105 R Copyright c Peter Corke 2011 . R ‘scale’. im. S ‘dimension’. options) is a warp of the image im obtained by applying the homography H to the coordinates of every input pixel. V ‘roi’. D ‘size’. but its position with respect to the input image is given by the second return value offs. im. See also e2h. [out. T1) applies homogeneous transformation T to the homogeneous transformation T1. h2e homwarp Warp image by an homography out = homwarp(H. itrim. FUNCTIONS AND CLASSES tp = homtrans(T. if T = N × N and T=NxNxP then the result is NxNxP.offs] = homwarp(H. S output image contains all the warped pixels. Options ‘full’ ‘extrapval’.

humoments
Hu moments

phi = humoments(im) is the vector (7 × 1) of Hu moment invariants for the binary image im.

Notes
• im is assumed to be a binary image of a single connected region

Reference
M-K. Hu, Visual pattern recognition by moment invariants. IRE Trans. on Information Theory, IT-8:pp. 179-187, 1962.

See also npq

ianimate
Display an image sequence

ianimate(im, options) displays a greyscale image sequence im (HxWxN) where N is the number of frames in the sequence.

ianimate(im, features, options) displays a greyscale image sequence im with point features overlaid. features (N × 1) cell array whose elements are vectors of feature objects. The feature is plotted using the object’s plot method and additional options are passed through to that method.

Examples
Animate image sequence:
    ianimate(seq);

Animate image sequence with overlaid corner features:
    c = icorner(im, 'nfeat', 200);  % compute corners
    ianimate(seq, c, 'gs');         % features shown as green squares

The bounding box is a 2 × 2 matrix [XMIN XMAX. M ‘npoints’. YMIN YMAX]. options) is a vector of RegionFeature objects that describe each connected region in the image im. FUNCTIONS AND CLASSES Options ‘fps’. box = ibbox(im) is the minimal bounding box that contains the non-zero pixels in the image im. iblobs features f = iblobs(im. N ‘only’. iharris. Machine Vision Toolbox for MATLAB 107 R Copyright c Peter Corke 2011 .CHAPTER 2. idisp ibbox Find bounding box box = ibbox(p) is the minimal bounding box that contains the points described by the columns of p (2 × N ). F ‘loop’ ‘movie’. isurf. I set the frame rate (default 5 frames/sec) endlessly loop over the sequence save the animation as a series of PNG frames in the folder M plot no more than N features per frame (default 100) display only the I’th frame from the sequence See also PointFeature.

The edges within im are marked by non-zero values in E. Machine Vision Toolbox for MATLAB 108 R Copyright c Peter Corke 2011 . A set pixel aspect ratio. FUNCTIONS AND CLASSES Options ‘aspect’. and larger values correspond to stronger edges. maximum vertical coordinate the number of pixels the value of the pixels forming this region the label assigned to this region a list of indices of features that are children of this feature coordinate of a point on the perimeter a list of edge points 2 × N matrix number of edge pixels true if region touches edge of the image major axis length of equivalent ellipse minor axis length of equivalent ellipse angle of major ellipse axis to horizontal axis aspect ratio b/a (always <= 1. C accept only blobs of pixel value C (default all) The RegionFeature object has many properties including: uc vc umin umax vmin vmax area class label children edgepoint edge perimeter touch a b theta shape circularity moments centroid. [A1.CHAPTER 2. horizontal coordinate centroid. [S1. minimum horizontal coordinate bounding box.S2] accept only blobs with shape in interval S1 to S2 ‘touch’ ignore blobs that touch the edge (default accept) ‘class’. options) returns an edge image using the Canny edge detector.A2] accept only blobs with area in interval A1 to A2 ‘shape’. default 1.0 ‘connect’. maximum horizontal coordinate bounding box.0) 1 for a circle. vertical coordinate bounding box. imoments icanny edge detection E = icanny(im. minimum vertical coordinate bounding box. less for other shapes a structure containing moments of order 0 to 2 See also RegionFeature. 4 (default) or 8 ‘greyscale’ compute greyscale moments 0 (default) or 1 ‘boundary’ compute boundary (default off) ‘area’. C set connectivity. ilabel.

Options
‘sd’, S   set the standard deviation for smoothing (default 1)
‘th0’, T  set the lower hysteresis threshold (default 0.1 x strongest edge)
‘th1’, T  set the upper hysteresis threshold (default 0.5 x strongest edge)

Author
Oded Comay, Tel Aviv University, 1996-7.

See also isobel, kdgauss

iclose
closing

out = iclose(im, se) is the image im after morphological closing with the structuring element se. This is a dilation followed by erosion.

out = iclose(im, se, n) as above but the structuring element se is applied n times, that is n dilations followed by n erosions.

Notes
• Cheaper to apply a smaller structuring element multiple times than one large one, the effective structuring element is the Minkowski sum of the structuring element with itself n times.

See also iopen, imorph
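A small morphological clean-up sketch using iclose together with iopen (documented later in this chapter); the structuring element size and threshold are illustrative and im is assumed to be a greyscale image in the range 0 to 1:

    se = ones(3, 3);            % a 3 x 3 square structuring element
    b  = im > 0.5;              % a binary image (threshold is illustrative)
    b  = iopen(b, se);          % remove small isolated foreground specks
    b  = iclose(b, se);         % fill small holes in the remaining regions
    idisp(b)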

Options ‘dir’. [0 1 1]). The images do not have to be of the same size. value of unset background pixels Machine Vision Toolbox for MATLAB 110 R Copyright c Peter Corke 2011 .options) concatenates images from the cell array im. FUNCTIONS AND CLASSES icolor Colorize a greyscale image C = icolor(im) is a color image C (HxWx3)where each color plane is equal to im (H × W ).CHAPTER 2. D ‘bgval’. ipixswitch iconcat Concatenate images C = iconcat(im. See also imono. colorize.options) as above but also returns the vector u whose elements are the coordinates of the left (or top in vertical mode) edge of the corresponding image. and smaller images are surrounded by background pixels which can be specified.u] = iconcat(im. [C.options) as above but displays the concatenated images using idisp. Create an aqua tinted version of the greyscale image c = icolor(im. iconcat(im. B direction of concatenation: ‘horizontal’ (default) or ‘vertical’. color) as above but each output pixel is color (3 × 1) times the corresponding element of im. Examples Create a color image that looks the same as the greyscale image c = icolor(im). C = icolor(im.

resulting in an output image with the same number of planes. and contains only valid pixels Notes • This function is a convenience wrapper for the builtin function CONV2. See also conv2 Machine Vision Toolbox for MATLAB 111 R Copyright c Peter Corke 2011 . The smaller image is taken as the kernel and convolved with the larger image. options) convolves im1 with im2. im2. If the larger image is color (has multiple planes) the kernel is applied to each plane. FUNCTIONS AND CLASSES Notes • Works for color or greyscale images • Direction can be abbreviated to first character.CHAPTER 2. Options ‘same’ ‘full’ ‘valid’ output image is same size as largest input image (default) output image is larger than the input image output image is smaller than the input image. ‘h’ or ‘v’ • In vertical mode all images are right justified • In horizontal mode all images are top justified See also idisp iconv Image convolution C = iconv(im1.

icorner
Corner detector

f = icorner(im, options) is a vector of PointFeature objects describing detected corner features. This is a non-scale space detector and by default the Harris method is used. If im is an image sequence a cell array of PointFeature vectors is returned.

The PointFeature object has many properties including:
u           horizontal coordinate
v           vertical coordinate
strength    corner strength
descriptor  corner descriptor (vector)

Options
‘cmin’, CM        minimum corner strength
‘cminthresh’, CT  minimum corner strength as a fraction of maximum corner strength
‘edgegap’, E      don’t return features closer than E to the edge of image (default 2)
‘suppress’, R     don’t return a feature closer than R pixels to an earlier feature (default 0)
‘nfeat’, N        return the N strongest corners (default Inf)
‘detector’, D     choose the detector where D is one of ‘harris’ (default), ‘noble’ or ‘klt’
‘sigma’, S        kernel width for smoothing (default 2)
‘deriv’, D        kernel for gradient (default kdgauss(2))
‘k’, K            set the value of k for Harris detector
‘patch’, P        use a P × P patch of surrounding pixel values as the feature vector. The vector has zero mean and unit norm.
‘color’           specify that im is a color image not a sequence

Notes
• Corners are processed in order from strongest to weakest.
• The function stops when:
  – the corner strength drops below cmin
  – the corner strength drops below cMinThresh x strongest corner
  – the list of corners is exhausted
• Features are returned in descending strength order
• If im has more than 2 dimensions it is either a color image or a sequence
• If im is NxMxP it is taken as an image sequence and f is a cell array whose elements are feature vectors for the corresponding image in the sequence.
• If im is NxMx3 it is taken as a sequence unless the option ‘color’ is given

with a delay of d [sec]. with a delay of 0.J. T ‘maxiter’. Fourth Alvey Vision Conf.G. limit the change in rotation at each step to T (default 0.001) eliminate correspondences more than T x the median distance at each iteration.05 rad) stop after N iterations (default 100) stop when the relative change in error norm is less than T (default 0. Shi and C. Stephens. J.. Harris and M.6. Computer Vision and Pattern Recognition. • “Good features to track”. • The default descriptor is a vector [Ix* Iy* Ixy*] which are the unique elements of the structure tensor. Options ‘dplot’. options) as above but also returns the norm of the error between the transformed point set p2 and p1. 1988. p2. show the points p1 and p2 at each iteration. See also PointFeature. Tomasi. 1994. C. May 1988.d] = icp(p1. 593-593. p2. IEEE Computer Society. FUNCTIONS AND CLASSES • If im is NxMx3xP it is taken as a sequence of color images and f is a cell array whose elements are feature vectors for the corresponding color image in the sequence. isurf icp Point cloud alignment T = icp(p1.Noble. J. [T. Proc. Proc.121-128. where * denotes squared and smoothed. d ‘plot’ ‘maxtheta’. • The descriptor is a vector of float types to save space References • “A combined corner and edge detector”. Image and Vision Computing. • “Finding corners”. T show the points p1 and p2 at each iteration. pp. vol.5 [sec]. pp 147-151. options) is the homogeneous transformation that best transforms the set of points p2 to p1 using the iterative closest point algorithm. Manchester. Machine Vision Toolbox for MATLAB 113 R Copyright c Peter Corke 2011 . T ‘distthresh’. pp. N ‘mindelta’.CHAPTER 2.

Notes
• Points can be 2- or 3-dimensional.
• For noisy data setting distthresh and maxtheta can help to prevent the solution from diverging.

Reference
“A method for registration of 3D shapes”, P. Besl and H. McKay, IEEE Trans. Pattern Anal. Mach. Intell., vol. 14, no. 2, pp. 239-256, Feb. 1992.

idecimate
Decimate an image

s = idecimate(im, m) is a decimated version of the image im whose size is reduced by m (an integer) in both dimensions. The image is smoothed with a Gaussian kernel with standard deviation m/2 then subsampled.

s = idecimate(im, m, sd) as above but the standard deviation of the smoothing kernel is set to sd.

s = idecimate(im, m, []) as above but no smoothing is applied prior to decimation.

Notes
• If the image has multiple planes, each plane is decimated.
• Smoothing is used to eliminate aliasing artifacts and the standard deviation should be chosen as a function of the maximum spatial frequency in the image.

See also iscale, ismooth
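A self-contained sketch for the icp function described above, using synthetic data; the point set and transform are illustrative:

    p1 = rand(3, 100);                        % a random 3D point cloud
    T0 = eye(4); T0(1:3,4) = [0.1; 0.2; 0];   % a small pure translation
    p2 = homtrans(T0, p1);                    % transformed copy of the points
    T  = icp(p1, p2);                         % T should approximate inv(T0), mapping p2 back onto p1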

idisp
image display tool

idisp(im, options) displays an image and allows interactive investigation of pixel value, histogram and zooming. The image is displayed in a figure with a toolbar across the top. If im is a cell array of images, they are first concatenated (horizontally).

User interface:
• Left clicking on a pixel will display its value in a box at the top.
• The “line” button allows two points to be specified and a new figure displays intensity along a line between those points.
• The “histo” button displays a histogram of the pixel values in a new figure. If the image is zoomed, the histogram is computed over only those pixels in view.
• The “zoom” button requires a left-click and drag to specify a box which defines the zoomed view.

Options
‘ncolors’, N    number of colors in the color map (default 256)
‘nogui’         display the image without the GUI
‘noaxes’        no axes on the image
‘noframe’       no axes or frame on the image
‘plain’         no axes, frame or GUI
‘bar’           add a color bar to the image
‘print’, F      write the image to file F in EPS format
‘square’        display aspect ratio so that pixels are square
‘wide’          make figure very wide, useful for displaying stereo pair
‘flatten’       display image planes (colors or sequence) as horizontally adjacent images
‘ynormal’       y-axis increases upward, image is inverted
‘cscale’, C     C is a 2-vector that specifies the grey value range that spans the colormap.
‘xydata’, XY    XY is a cell array whose elements are vectors that span the x- and y-axes respectively.
‘colormap’, C   set colormap C (N × 3)
‘grey’          color map: greyscale unsigned, zero is black, maximum value is white
‘invert’        color map: greyscale unsigned, zero is white, maximum value is black
‘signed’        color map: greyscale signed, positive is blue, negative is red, zero is black
‘invsigned’     color map: greyscale signed, positive is blue, negative is red, zero is white
‘random’        color map: random values, highlights fine structure
‘dark’          color map: greyscale unsigned, darker than ‘grey’, good for superimposed graphics

Notes
• Color images are displayed in true color mode: pixel triples map to display pixels
• Grey scale images are displayed in indexed mode: the image pixel value is mapped through the color map to determine the display pixel value.

See also iblobs. caxis. colorseg idouble Convert integer image to double imd = idouble(im) returns an image with double precision elements in the range 0 to 1. labelimage. colormap. labels. FUNCTIONS AND CLASSES • The minimum and maximum image values are mapped to the first and last element of the color map. See also image. im is a greyscale N × M or color NxMx3 image. Non-selected pixels are displayed as white. bg) as above but the grey level of the non-selected pixels is specified by bg in the range 0 to 1. idisplabel(im. icolorize. iconcat idisplabel Display an image with mask idisplabel(im. and labelimage is an N × M image containing integer pixel class labels for the corresponding pixels in im. labelimage. The pixel classes to be displayed are given by the elements of labels which is a scalar a vector of class labels. See also iint Machine Vision Toolbox for MATLAB 116 R Copyright c Peter Corke 2011 . labels) displays only those image pixels which belong to a specific class. The integer pixels are assumed to span the range 0 to the maximum value of their integer class.CHAPTER 2. which by default (’greyscale’) is the range black to white.

out = igamma(im. • For images of type int the pixels are assumed in the range 0 to max integer value. See also itriplepoint. gamma) is a gamma corrected version of im. hitormiss igamma correction out = igamma(im.45.2. All pixels are raised to the power gamma. Gamma encoding can be performed with gamma > 1 and decoding with gamma < 1. Notes • Gamma encoding is typically performed in a camera with gamma=0. • For images of type double the pixels are assumed to be in the range 0 to 1. ‘sRGB’) is a gamma decoded version of im using the sRGB decoding function (JPEG images sRGB encoded). • For images with multiple planes the gamma correction is applied to all planes. Machine Vision Toolbox for MATLAB 117 R Copyright c Peter Corke 2011 . • Gamma decoding is typically performed in the display with gamma=2.CHAPTER 2. Computed using the hit-or-miss morphological operator. ithin. FUNCTIONS AND CLASSES iendpoint Find end points in a binary skeleton image out = iendpoint(im) is a binary image where pixels are set if the corresponding pixel in the binary image im is the end point of a single-pixel wide line such as found in an image skeleton.

L = igraphseg(im. imser Machine Vision Toolbox for MATLAB 118 R Copyright c Peter Corke 2011 . Huttenlocher. k. Journal on Computer Vision. min.5). 100. [L. 0. k. 2004. P. 59. 167181. sigma) as above but m is the number of regions found. min) is a graph-based segmentation of the greyscale or color image im. Example im = iread(’58060. idisp(im) Reference “Efficient graph-based image segmentation”. Sept. See also ithresh.m] = igraphseg(im. L is an image of the same size as im where each element is the label assigned to the corresponding pixel in im.5). k is the scale parameter. k. min is the minimum region size (pixels). sigma) as above and sigma is the width of a Gaussian which is used to initially smooth the image (default 0. 1500. 2006. vol. pp. Int.CHAPTER 2.m] = igraphseg(im. Felzenszwalb and D. Notes • Is a MEX file Author Pedro Felzenszwalb.jpg’). and a larger value indicates a preference for larger regions. FUNCTIONS AND CLASSES igraphseg Graph-based segmentation L = igraphseg(im. [l. min.

For an image with multiple planes H is a matrix with one column per image plane.x] = ihist(im. plot(x. ’normcdf’). Notes • For a uint8 image the histogram spans the greylevel range 0-255 • For a floating point image the histogram spans the greylevel range 0-1 • For floating point images all NaN and Inf values are first removed. For an image with multiple planes the histogram of each plane is given in a separate subplot.h). FUNCTIONS AND CLASSES ihist Image histogram ihist(im. [H.x] = ihist(im. bar(x.h). options) as above but also returns the bin coordinates as a column vectors. Options ‘nbins’ ‘cdf’ ‘normcdf’ ‘sorted’ number of histogram bins (default 256) compute a cumulative histogram compute a normalized cumulative histogram histogram but with occurrence sorted in descending order Example [h. H = ihist(im. [h. options) displays the image histogram.CHAPTER 2. • For a uint8 image the MEX function fhist is used See also hist Machine Vision Toolbox for MATLAB 119 R Copyright c Peter Corke 2011 . options) is the image histogram as a column vector.x] = ihist(im).

A value of 0 indicates that the region has no single enclosing region. x2) is the sum of pixels in the rectangular image region defined by its top-left (x1.y1) and bottom-right (x2. Machine Vision Toolbox for MATLAB 120 R Copyright c Peter Corke 2011 . See also intgimage ilabel Label an image L = ilabel(im) performs connectivity analysis on the image im and returns a label image L. See also idouble iisum Sum of integral image s = iisum(ii. ii is a precomputed integral image.y2). x1.CHAPTER 2. y1. y2.m] = ilabel(im) as above but returns the value of the maximum label value.m. [L. The value of parents(I) is the label of the parent or enclosing region of region I. The floating point pixels values in im are assumed to span the range 0 to 1. FUNCTIONS AND CLASSES iint Convert image to integer class out = iint(im) is an image with 8-bit unsigned integer elements in the range 0 to 255. for a multilevel image it means that it touches more than one other region.parents] = ilabel(im) as above but also returns region hierarchy information. [L. Region labels are in the range 1 to M. where each pixel value represents the integer region label assigned to the corresponding pixel in im. same size as im. for a binary image this means the region touches the edge of the image.

[L,maxlabel,parents,class] = ilabel(im) as above but also returns the class of pixels within each region. The value of class(I) is the value of the pixels that comprise region I.

[L,maxlabel,parents,class,edge] = ilabel(im) as above but also returns the edge-touch status of each region. If edge(I) is 1 then region I touches edge of the image, otherwise it does not.

Notes
• Is a MEX file.
• The image can be binary or multi-level
• Connectivity is performed using 4 nearest neighbours by default. To use 8-way connectivity pass a second argument of 8, eg. ilabel(im, 8).
• This is a “low level” function, IBLOBS is a higher level interface.

See also iblobs, imoments

iline
Draw a line in an image

out = iline(im, p1, p2) is a copy of the image im with a line drawn between the points p1 and p2, each a 2-vector [X,Y]. The pixels on the line are set to 1.

out = iline(im, p1, p2, v) as above but the pixels on the line are set to v.

Notes
• Uses the Bresenham algorithm
• Only works for greyscale images
• The line looks jagged since no anti-aliasing is performed

See also bresenham, iprofile, ipaste
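For example, to draw a bright line segment into a blank image:

    im  = zeros(100, 100);
    out = iline(im, [10 20], [80 60]);   % pixels on the line are set to 1
    idisp(out)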

im2col
Convert an image to pixel per row format

out = im2col(im) returns the image (HxWxP) as a pixel vector (N × P) where each row is a pixel value (1 × P). The pixels are in image column order and there are N=W × H rows.

See also col2im

imatch
Template matching

xm = imatch(im1, im2, x, y, H, s) is the matching subimage of im1 (template) within the image im2. The template in im1 is centred at (x,y) and its half-width is H.

The template is searched for within im2 inside a rectangular region, centred at (x,y) and of size s. If s is a scalar the search region is [-s, s, -s, s] relative to (x,y). More generally s is a 4-vector s=[xmin, xmax, ymin, ymax] relative to (x,y).

The return value is xm=[DX,DY,CC] where (DX,DY) are the x- and y-offsets relative to (x,y) and CC is the similarity score (zero-mean normalized cross correlation) for the best match in the search region.

[xm,score] = imatch(im1, im2, x, y, w2, s) works as above but also returns a matrix of matching score values for each template position tested. The rows correspond to horizontal positions of the template, and columns the vertical position.

Notes
• Useful for tracking a template in an image sequence.
• Is a MEX file.
• im1 and im2 must be the same size.
• ZNCC matching is used, a perfect match score is 1.0
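A tracking-style sketch for imatch, assuming im1 and im2 are consecutive greyscale frames of the same size and a feature of interest lies near (u,v)=(100,120) in im1; all numbers are illustrative:

    xm = imatch(im1, im2, 100, 120, 8, 20);   % 17 x 17 template, search +/-20 pixels
    du = xm(1); dv = xm(2); cc = xm(3);       % offset of the best match and its ZNCC score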

v] = imeshgrid(im) return matrices that describe the domain of image im and can be used for the evaluation of functions over the image. w) as above but the pixels have weights given by the vector w. [u.u) = v.v] = imeshgrid(size) as above but the domain is described size which is scalar size× size or a 2-vector [w H]. effectively a greyscale image. [u. The RegionFeature object has many properties including: Machine Vision Toolbox for MATLAB 123 R Copyright c Peter Corke 2011 . All pixels are equally weighted. FUNCTIONS AND CLASSES See also isimilarity imeshgrid Domain matrices for image [u.v] = imeshgrid(w. effectively a binary image. f = imoments(u.u) = u and v(v.CHAPTER 2. The element u(v. f = imoments(u. H) as above but the domain is w × H. v) as above but the moments are computed from the pixel coordinates given as vectors u and v. v. See also meshgrid imoments Image moments f = imoments(im) is a RegionFeature object that describes the greyscale moments of the image im.

Notes • For a binary image the zeroth moment is the number of non-zero pixels.0) a structure containing moments of order 0 to 2. m01. or its area. horizontal coordinate centroid. • This function does not perform connectivity. ilabel. colorspace Machine Vision Toolbox for MATLAB 124 R Copyright c Peter Corke 2011 . imoments imono Convert color image to monochrome out = imono(im. options) is a greyscale equivalent to the color image im.CHAPTER 2. Options ‘r601’ ‘r709’ ITU recommendation 601 (default) ITU recommendation 709 See also colorize. vertical coordinate the number of pixels major axis length of equivalent ellipse minor axis length of equivalent ellipse angle of major ellipse axis to horizontal axis aspect ratio b/a (always <= 1. if connected blobs are required then the ILABEL function must be used first. the elements are m00. m02. Different conversion functions are supported. See also RegionFeature. m11. m20. icolor. FUNCTIONS AND CLASSES uc vc area a b theta shape moments centroid. m10.

2004. idisp(im) Reference “Robust wide-baseline stereo from maximally stable extremal regions”. Image and Vision Computing. Pajdla. se. Matas. The labels [L. pp. m. op) is the image im after morphological processing with the operator op and structuring element se. Options ‘dark’ ‘light’ looking for dark features against a light background (default) looking for light features against a dark background Example im = iread(’castle_sign2. J. imser Maximally stable extremal regions L = imser(im. [label. options) is a greyscale segmentation of the image im based on maximally stable extremal regions. FUNCTIONS AND CLASSES imorph Morphological neighbourhood processing out = imorph(im. L is an image of the same size as im where each element is the label assigned to the corresponding pixel in im. by Andrea Vedaldi and Brian Fulkerson • vl mser is a MEX file Machine Vision Toolbox for MATLAB 125 R Copyright c Peter Corke 2011 .CHAPTER 2. part of VLFeat (vlfeat.org).png’. Sept.m] = imser(im. Notes • Is a wrapper for vl mser. Chum. 22. and T.m] = imser(im. Urban. O. ’light’). ’grey’. vol. options) as above but m is the number of regions found. ’double’). 761767.

igraphseg inormhist Histogram normalization out = inormhist(im) is a histogram normalized version of the image im. • Highlights image detail in dark areas of an image. See also ihist intgimage Compute integral image out = intimage(im) is an integral image corresponding to im.ˆ2). Notes • The histogram of the normalized image is approximately uniform. Examples Create integral images for sum of pixels over rectangular regions i = intimage(im). Integral images can be used for rapid computation of summations over rectangular regions. Create integral images for sum of pixel squared values over rectangular regions i = intimage(im. FUNCTIONS AND CLASSES See also ithresh.CHAPTER 2. Machine Vision Toolbox for MATLAB 126 R Copyright c Peter Corke 2011 .

See also iisum

iopen
Morphological opening

out = iopen(im, se) is the image im after morphological opening with the structuring element se. This is an erosion followed by dilation.

out = iopen(im, se, n) as above but the structuring element se is applied n times, that is n erosions followed by n dilations.

Notes
• Cheaper to apply a smaller structuring element multiple times than one large one, the effective structuring element is the Minkowski sum of the structuring element with itself n times.

See also iclose, imorph

ipad
Pad an image with constants

out = ipad(im, sides, n) is a padded version of the image im with a block of NaN values n pixels wide on the sides of im as specified by sides which is a string containing one or more of the characters:

‘t’ top
‘b’ bottom
‘l’ left
‘r’ right

out = ipad(im, sides, n, v) as above but pads with pixels of value v.

If the element of mask is zero im1 is selected.V]. 10. 0) Add a band of white pixels 10 pixels wide on all sides of the image: ipad(im. ipaste Paste an image into an image out = ipaste(im. im2) is an image where each pixel is selected from the corresponding pixel in im1 or im2 according to the corresponding values of mask. Options ‘centre’ ‘zero’ ‘set’ ‘add’ ‘mean’ The pasted image is centred at p. im1. ’t’. otherwise im2 is selected. 255) Notes • Not a tablet computer. otherwise p is the top-left corner (default) the coordinates of p start at zero. p. Machine Vision Toolbox for MATLAB 128 R Copyright c Peter Corke 2011 . 20. by default 1 is assumed im2 overwrites the pixels in im (default) im2 is added to the pixels in im im2 is set to the mean of pixel values in im2 and im See also iline ipixswitch Pixelwise image merge out = ipixswitch(mask. FUNCTIONS AND CLASSES Examples Add a band of zero pixels 20 pixels high across the top of the image: ipad(im. ’tblr’. im2. options) is the image im with the image im2 pasted in at the position p=[U.CHAPTER 2.

out is a cell array of images each one having R Machine Vision Toolbox for MATLAB 129 Copyright c Peter Corke 2011 .v) for the corresponding row of p. p1. [p. FUNCTIONS AND CLASSES Notes • im1 and im2 must have the same number of rows and columns • if im1 and im2 are both greyscale then out is greyscale • if either of im1 and im2 are color then out is color See also colorize iprofile Extract pixels along a line v = iprofile(im. p2) as above but also returns the coordinates of the pixels for each point along the line. iline ipyramid Pyramidal image decomposition out = ipyramid(im) is a pyramid decomposition of input image im using Gaussian smoothing with standard deviation of 1. p1. p2) is a vector of pixel values extracted from the image im (HxWxP) between the points p1 (2 × 1) and p2 (2 × 1).uv] = iprofile(im. Each row of uv is the pixel coordinate (u. Notes • The Bresenham algorithm is used to find points along the line.CHAPTER 2. See also bresenham. v (N × P ) has one row for each point along the line and the row is the pixel value which will be a vector for a multi-plane image.

nbins. The value of edge is: ‘border’ ‘none’ ‘trim’ ‘wrap’ the border value is replicated (default) pixels beyond the border are not included in the window output is not computed for pixels whose window crosses the border. se. the maximum. hence output image had reduced dimensions. 12. se. out = imorph(image. se(2. ismooth irank Rank filter out = irank(im. sigma) as above but the Gaussian standard deviation is sigma. out = ipyramid(im. op. the image is assumed to wrap around Examples 5 × 5 median filter: irank(im.2) = 0. 1. FUNCTIONS AND CLASSES dimensions half that of the previous image. See also iscalespace. Machine Vision Toolbox for MATLAB 130 R Copyright c Peter Corke 2011 . ones(5. op. The pyramid is computed down to a nonhalvable image size. se) is a rank filtered version of im.CHAPTER 2. n) as above but only n levels of the pyramid are computed.3). The highest rank. is order=1. 3 × 3 non-local maximum: se = ones(3. out = imorph(image. Notes • Works for greyscale images only. out = ipyramid(im. se). nbins) as above but the number of histogram bins can be specified. Only pixels corresponding to non-zero elements of the structuring element se are ranked and the order’th value in rank becomes the corresponding output pixel value. edge) as above but the processing of edge pixels can be controlled. order. sigma. im > irank(im.5)). idecimate.

Options ‘uint8’ ‘single’ ‘double’ ‘grey’ ‘grey 709’ ‘gamma’. R return an image with 8-bit unsigned integer pixels in the range 0 to 255 return an image with single precision floating point pixels in the range 0 to 1. Wildcards are allowed in file names. R ‘roi’. where R=[umin umax. im = iread(file. options) reads the specified file and returns a matrix. ivar. FUNCTIONS AND CLASSES Notes • Is a MEX file. return an image with double precision floating point pixels in the range 0 to 1. G ‘reduce’. On subsequent calls the initial folder is as set on the last call. If multiple files match a 3D or 4D image is returned where the last dimension is the number of images in the sequence. If the path is relative it is searched for on Matlab search path. vmin vmax]. iwindow iread Read image from file im = iread() presents a file selection GUI from which the user can select an image file which is returned as 2D or 3D matrix. See also imorph. Notes • A greyscale image is returned as an H × W matrix • A color image is returned as an HxWx3 matrix • A greyscale image sequence is returned as an HxWxN matrix where N is the sequence length Machine Vision Toolbox for MATLAB 131 R Copyright c Peter Corke 2011 . either numeric or ‘sRGB’ decimate image by R in both dimensions apply the region of interest R to each image. • A histogram method is used with nbins (default 256).CHAPTER 2. convert image to greyscale if it’s color using ITU rec 601 convert image to greyscale if it’s color using ITU rec 709 gamma value.

igamma. k) is an image where each pixel is replicated into a k × k tile.CHAPTER 2. If im is H × W the result is (KH)x(KW). im2) as above but also returns the homographies h1 and h2 that warp im1 to out1 and im2 to out2 respectively.out2] = irectify(f. Machine Vision Toolbox for MATLAB 132 R Copyright c Peter Corke 2011 .h2] = irectify(f. FUNCTIONS AND CLASSES • A color image sequence is returned as an HxWx3xN matrix where N is the sequence length See also idisp. CentralCamera ireplicate Expand image out = ireplicate(im. im1. f (3 × 3) is the fundamental matrix relating the two views and m is a FeatureMatch object containing point correspondences between the images. imwrite.h1.out2. Notes • The resulting image pair are epipolar aligned. path irectify Rectify stereo image pair [out1. See also FeatureMatch. • The resulting images may have negative disparity. im1. m. im2) is a rectified pair of images corresponding to im1 and im2. homwarp. [out1. istereo. Notes • Color images are not supported. imono. m.

iscale iroi Extract region of interest out = iroi(im.vmin vmax]. angle. FUNCTIONS AND CLASSES See also idecimate. S set size of out to H × W where S=[W. options) is a version of the image im that has been rotated about its centre. Options ‘outsize’. S ‘crop’ ‘scale’.CHAPTER 2. See also idisp. out = iroi(im) as above but the image is displayed and the user is prompted to adjust a rubber band box to select the region of interest. vmin.H] return central part of image. iline irotate Rotate image out = irotate(im.vmax]. same size as im scale the image size by S (default 1) set background pixels to V (default 0) smooth image with Gaussian of standard deviation S Machine Vision Toolbox for MATLAB 133 R Copyright c Peter Corke 2011 .R] = iroi(im) as above but returns the selected region of interest R=[umin umax. [out.umax. S ‘extrapval’. V ‘smooth’.R) is a subimage of the image im described by the rectangle R=[umin.

bias<0. See also iscale iscale Scale an image out = iscale(im. while bias>0. s set size of out to H × W where s=[W.H] set background pixels to V (default 0) smooth image with Gaussian of standard deviation s. Options ‘outsize’. See also iscale isamesize Automatic image trimming out = isamesize(im1. s<1 makes it smaller. FUNCTIONS AND CLASSES Notes • Rotation is defined with respect to a z-axis into the image. s) is a version of im scaled in both directions by s which is a real scalar.CHAPTER 2. bias) as above but bias controls which part of the image is cropped. s=[] means no smoothing (default s=1) Machine Vision Toolbox for MATLAB 134 R Copyright c Peter Corke 2011 .5 moves the crop window up or to the left. V ‘smooth’. im2) is an image derived from im1 that has the same dimensions as im2 which is achieved by cropping and scaling.5 moves the crop window down or to the right. bias=0. im2. s>1 makes the image larger. s ‘extrapval’. • Counter-clockwise is a positive angle. out = isamesize(im1.5 is symmetric cropping.

corresponding to each step of the sequence.L. [g. L (HxWxN) is the absolute value of the Laplacian of Gaussian (LoG) of the scale sequence. At each scale step the variance of the Gaussian increases by sigma2 . s) is a vector of ScalePointFeature objects which are the maxima. n) as above but sigma=1. Notes • The Laplacian is computed from the difference of adjacent Gaussians. n. idecimate. s (N × 1) is a vector of scale values corresponding to each plane of L. irotate iscalemax Scale space maxima f = iscalemax(L.L. The first step in the sequence is the original image. See also iscalespace. ScalePointFeature iscalespace Scale-space image sequence [g. of the Laplacian of Gaussian (LoG) scale-space image sequence L (HxWxN). and s (n × 1) is the vector of scales.s] = iscalespace(im. Machine Vision Toolbox for MATLAB 135 R Copyright c Peter Corke 2011 .CHAPTER 2. in space and scale. g (HxWxN) is the scale sequence. The standard deviation of the smoothing Gaussian is sigma. sigma) is a scale space image sequence of length n derived from im (H × W ). FUNCTIONS AND CLASSES See also ireplicate. Notes • Features are sorted into descending feature strength.s] = iscalespace(im.

ishomog(T. ‘valid’) as above. ilaplace. isvec isift SIFT feature extractor sf = isift(im. klog iscolor Test for color image iscolor(im) is true (1) if im is a color image. The SiftPointFeature object has many properties including: Machine Vision Toolbox for MATLAB 136 R Copyright c Peter Corke 2011 . options) returns a vector of SiftPointFeature objects representing scale and rotationally invariant interest points in the image im. that is. ishomog Test if argument is a homogeneous transformation ishomog(T) is true (1) if the argument T is of dimension 4 × 4 or 4x4xN.CHAPTER 2. ismooth. FUNCTIONS AND CLASSES See also iscalemax. it its third dimension is equal to three. but also checks the validity of the rotation matrix. See also isrot. else false (0).

u           horizontal coordinate
v           vertical coordinate
strength    feature strength
descriptor  feature descriptor (128 × 1)
sigma       feature scale
theta       feature orientation [rad]

Options
‘nfeat’, N     set the number of features to return (default Inf)
‘suppress’, R  set the suppression radius (default 0)

Notes
• Greyscale images only.
• Features are returned in descending strength order.
• Wraps a MEX file from www.vlfeat.org
• Corners are processed in order from strongest to weakest.
• If im is HxWxN it is considered to be an image sequence and F is a cell array with N elements, each of which is the feature vectors for the corresponding image in the sequence.
• The SIFT algorithm is patented by University of British Columbia.
• ISURF is a functional equivalent.

Reference
David G. Lowe, “Distinctive image features from scale-invariant keypoints”, International Journal of Computer Vision, 60, 2 (2004), pp. 91-110.

See also SiftPointFeature, isurf, icorner

isimilarity
Locate template in image

s = isimilarity(T, im) is an image where each pixel is the ZNCC similarity of the template T (M × M) to the M × M neighbourhood surrounding the corresponding

d) is the size of the d’th dimension of im. ssd. s = isimilarity(T. the function accepts two regions and returns a scalar similarity score. metric) as above but the similarity metric is specified by the function metric which can be any of @sad. • The ZNCC function is a MEX file and therefore the fastest • User provided similarity metrics can be provided. See also imatch.CHAPTER 2. zsad. @zssd. @zsad. @ncc. zncc isize Size of image n = isize(im. See also size ismooth Gaussian smoothing out = ismooth(im. [w. sigma) is the image im after convolution with a Gaussian kernel of standard deviation sigma. FUNCTIONS AND CLASSES input pixel in im.H] = isize(im) is the image height H and width w. R Machine Vision Toolbox for MATLAB 138 Copyright c Peter Corke 2011 . Notes • Similarity is not computed where the window crosses the image boundary. @ssd. ncc. zssd. s is same size as im. Even if the image has only two dimensions p will be one. [w.H. sad.p] = isize(im) is the image height H and width w and number of planes p. and these output pixels are set to NaN. im.

out = ismooth(im, sigma, options) as above but the options are passed to CONV2.

Options
'full'    returns the full 2-D convolution (default)
'same'    returns out the same size as im
'valid'   returns the valid pixels only, those where the kernel does not exceed the bounds of the image.

Notes
• By default (option 'full') the returned image is larger than the passed image.
• Smooths all planes of the input image.
• The Gaussian kernel has a unit volume.
• If input image is integer it is converted to float, convolved, then converted back to integer.

See also
iconv, kgauss

isobel
Sobel edge detector
out = isobel(im) is an edge image computed using the Sobel edge operator applied to the image im. This is the norm of the vertical and horizontal gradients at each pixel. The Sobel kernel is:

|-1  0  1|
|-2  0  2|
|-1  0  1|

out = isobel(im, dx) as above but applies the kernel dx and dx' to compute the horizontal and vertical gradients respectively.
[gx,gy] = isobel(im) as above but returns the gradient images.
[gx,gy] = isobel(im, dx) as above but returns the gradient images.

Notes
• Tends to produce quite thick edges.
• The resulting image is the same size as the input image.

See also
ksobel, icanny, iconv

isrot
Test if argument is a rotation matrix
isrot(R) is true (1) if the argument is of dimension 3 × 3, else false (0).
isrot(R, 'valid') as above, but also checks the validity of the rotation matrix.

See also
ishomog, isvec

istereo
Stereo matching
d = istereo(iml, imr, range, H, options) is a disparity image computed from the epipolar aligned stereo pair: the left image iml (H × W) and the right image imr (H × W). d (H × W) is the disparity and the value at each pixel is the horizontal shift of the corresponding pixel in iml as observed in imr. That is, the disparity d=d(v,u) means that imr(v,u-d) is the same world point as iml(v,u).
range is the disparity search range, which can be a scalar for disparities in the range 0 to range, or a 2-vector [DMIN DMAX] for searches in the range DMIN to DMAX. H is the half size of the matching window, which can be a scalar for N × N or a 2-vector [N,M] for an N × M window.
[d,sim] = istereo(iml, imr, range, H, options) as above but returns sim which is the same size as d and the elements are the peak matching score for the corresponding elements of d. For the default matching metric ZNCC this varies between -1 (very bad) to +1 (perfect).
[d,sim,dsi] = istereo(iml, imr, range, H, options) as above but returns dsi which is the disparity space image (HxWxN) where N=DMAX-DMIN+1. The I'th plane is the similarity of iml to imr shifted by DMIN+I-1.
[d,sim,p] = istereo(iml, imr, range, H, options) if the 'interp' option is given then disparity is estimated to sub-pixel precision using quadratic interpolation. In this case d is the interpolated disparity and p is a structure with elements A, B, dx. The interpolation polynomial is s = Ad^2 + Bd + C where s is the similarity score and d is disparity relative to the integer disparity at which s is maximum. p.A and p.B are matrices the same size as d whose elements are the per pixel values of the interpolation polynomial coefficients. p.dx is the peak of the polynomial with respect to the integer disparity at which s is maximum (in the range -0.5 to +0.5).

Options
'metric', M   string that specifies the similarity metric to use which is one of 'zncc' (default), 'ncc', 'ssd' or 'sad'.
'interp'      enable subpixel interpolation and d contains non-integer values (default false)

Notes
• Images must be greyscale.
• Disparity values for pixels within a half-window dimension (H) of the edges will not be valid and are set to NaN.
• sim = max(dsi, [], 3)

See also
irectify, stdisp
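Example
A usage sketch, assuming left.png and right.png are a rectified stereo pair (hypothetical file names; iread is described elsewhere in this manual):
iml = iread('left.png', 'double');
imr = iread('right.png', 'double');
d = istereo(iml, imr, [40 90], 3);    % search disparities 40 to 90, half-window 3
idisp(d)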

istretch
Image normalization
out = istretch(im) is a normalized image in which all pixel values lie in the range 0 to 1. That is, a linear mapping where the minimum value of im is mapped to 0 and the maximum value of im is mapped to 1.
out = istretch(im, max) as above but pixel values lie in the range 0 to max.

See also
inormhist

isurf
SURF feature extractor
sf = isurf(im, options) returns a vector of SurfPointFeature objects representing scale and rotationally invariant interest points in the image im. The SurfPointFeature object has many properties including:

u            horizontal coordinate
v            vertical coordinate
strength     feature strength
descriptor   feature descriptor (64 × 1 or 128 × 1)
sigma        feature scale
theta        feature orientation [rad]

Options
'nfeat', N     set the number of features to return (default Inf)
'thresh', T    set Hessian threshold. Increasing the threshold reduces the number of features computed and reduces computation time.
'octaves', N   number of octaves to process (default 5)
'extended'     return 128-element descriptor (default 64)
'upright'      don't compute rotation invariance
'suppress', R  set the suppression radius (default 0). Features are not returned if they are within R [pixels] of an earlier (stronger) feature.

Notes
• Color images, or sequences, are first converted to greyscale.
• Features are returned in descending strength order.
• If im is HxWxN it is considered to be an image sequence and F is a cell array with N elements, each of which is the feature vectors for the corresponding image in the sequence.
• The sign of the Laplacian is not retained.
• Wraps an M-file implementation of OpenSurf by D. Kroon (U. Twente) or a MEX-file OpenCV wrapper by Petter Strandmark.

Reference
Herbert Bay, Andreas Ess, Tinne Tuytelaars, Luc Van Gool, "SURF: Speeded Up Robust Features", Computer Vision and Image Understanding (CVIU), Vol. 110, No. 3, pp. 346-359, 2008.

See also
SurfPointFeature, isift, icorner

isvec
Test if argument is a vector
isvec(v) is true (1) if the argument v is a 3-vector, else false (0).
isvec(v, L) is true (1) if the argument v is a vector of length L, either a row- or column-vector. Otherwise false (0).

See also
ishomog, isrot

ithin
Morphological skeletonization
out = ithin(im) is the binary skeleton of the binary image im. Any non-zero region is replaced by a network of single-pixel wide lines.
out = ithin(im, delay) as above but graphically displays each iteration of the skeletonization algorithm with a pause of delay seconds between each iteration.

See also
hitormiss, itriplepoint, iendpoint
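Example
A minimal sketch using a synthetic binary image (testpattern is described later in this chapter):
im = testpattern('squares', 256, 64, 32) > 0;
skel = ithin(im);
idisp(skel)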

ithresh
Interactive image threshold
ithresh(im) displays the image im in a window with a slider which adjusts the binary threshold.
ithresh(im, T) as above but the initial threshold is set to T.

Notes
• Greyscale image only.
• For a uint8 class image the slider range is 0 to 255.
• For a floating point class image the slider range is 0 to 1.0

See also
idisp

itrim
Trim images
[out1,out2] = itrim(im1,im2) returns the central parts of images im1 and im2 as out1 and out2 respectively. When images are rectified or warped the shapes can become quite distorted and are embedded in rectangular images surrounded by black or NaN values. This function crops out the central rectangular region of each. It assumes that the undefined pixels in im1 and im2 have values of NaN. The same cropping is applied to each input image.
[out1,out2] = itrim(im1,im2,T) as above but the threshold T in the range 0 to 1 is used to adjust the level of cropping. The default is 0.5, a higher value will include fewer NaN values in the result, a lower value will include more.

See also
homwarp, irectify

itriplepoint
Find triple points
out = itriplepoint(im) is a binary image where pixels are set if the corresponding pixel in the binary image im is a triple point, that is where three single-pixel wide lines intersect. These are the Voronoi points in an image skeleton. Computed using the hit-or-miss morphological operator.

See also
iendpoint, ithin, hitormiss

ivar
Pixel window statistics
out = ivar(im, se, op) is an image where each output pixel is the specified statistic over the pixel neighbourhood indicated by the structuring element se which should have odd side lengths. The elements in the neighbourhood corresponding to non-zero elements in se are packed into a vector on which the required statistic is computed.
The operation op is one of:

'var'    variance
'kurt'   Kurtosis or peakiness of the distribution
'skew'   skew or asymmetry of the distribution

out = ivar(im, se, op, edge) as above but performance at edge pixels can be controlled. The value of edge is:

'border'   the border value is replicated (default)
'none'     pixels beyond the border are not included in the window
'trim'     output is not computed for pixels whose window crosses the border, hence output image has reduced dimensions
'wrap'     the image is assumed to wrap around

Notes
• Is a MEX file.

See also
irank, iwindow
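Example
The local variance over a 5 × 5 window highlights textured regions (a sketch; im is any greyscale image):
v = ivar(im, ones(5,5), 'var');
idisp(v)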

iwindow
Generalized spatial operator
out = iwindow(im, se, func) is an image where each pixel is the result of applying the function func to a neighbourhood centred on the corresponding pixel in im. The neighbourhood is defined by the size of the structuring element se which should have odd side lengths. The elements in the neighbourhood corresponding to non-zero elements in se are packed into a vector (in column order from top left) and passed to the specified function handle func. The return value becomes the corresponding pixel value in out.
out = iwindow(image, se, func, edge) as above but performance of edge pixels can be controlled. The value of edge is:

'border'   the border value is replicated (default)
'none'     pixels beyond the border are not included in the window
'trim'     output is not computed for pixels whose window crosses the border, hence output image has reduced dimensions
'wrap'     the image is assumed to wrap around

Example
Compute the maximum value over a 5 × 5 window:
iwindow(im, ones(5,5), @max);
Compute the standard deviation over a 3 × 3 window:
iwindow(im, ones(3,3), @std);

Notes
• Is a MEX file.
• Is slow since the function func must be invoked once for every output pixel.

See also
ivar, irank

kcircle
Circular structuring element
k = kcircle(R) is a square matrix S × S where S=2R+1 of zeros with a maximal centered circular region of radius R pixels set to one.
k = kcircle(R, s) as above but s is explicitly specified.

Notes
• If R is a 2-element vector the result is an annulus of ones, and the two numbers are interpreted as inner and outer radii.

See also
ones, ktriangle, imorph

kdgauss
Derivative of Gaussian kernel
k = kdgauss(sigma) is a 2-dimensional derivative of Gaussian kernel (W × W) of width (standard deviation) sigma and centred within the matrix k whose half-width H = 3 × sigma and W = 2 × H+1.
k = kdgauss(sigma, H) as above but the half-width is explicitly specified.

Notes
• This kernel is the horizontal derivative of the Gaussian, dG/dx.
• The vertical derivative, dG/dy, is k'.
• This kernel is an effective edge detector.

See also
kgauss, kdog, klog, iconv

kdog
Difference of Gaussian kernel
k = kdog(sigma1) is a 2-dimensional difference of Gaussian kernel equal to kgauss(sigma1) - kgauss(sigma2), where sigma1 > sigma2. By default sigma2 = 1.6*sigma1. The kernel is centred within the matrix k whose half-width H = 3 × sigma and W = 2 × H+1.
k = kdog(sigma1, sigma2) as above but sigma2 is specified directly.
k = kdog(sigma1, sigma2, H) as above but the kernel half-width is specified.

Notes
• This kernel is similar to the Laplacian of Gaussian and is often used as an efficient approximation.

See also
kgauss, kdgauss, klog, iconv

kgauss
Gaussian kernel
k = kgauss(sigma) is a 2-dimensional unit-volume Gaussian kernel of width (standard deviation) sigma, and centred within the matrix k whose half-width is H = 2 × sigma and W = 2 × H+1.
k = kgauss(sigma, H) as above but the half-width H is specified.

See also
kdgauss, kdog, klog, iconv

klaplace
Laplacian kernel
k = klaplace() is the Laplacian kernel:

|0  1  0|
|1 -4  1|
|0  1  0|

Notes
• This kernel has an isotropic response to gradient.

See also
ilaplace, iconv

klog
Laplacian of Gaussian kernel
k = klog(sigma) is a 2-dimensional Laplacian of Gaussian kernel of width (standard deviation) sigma and centred within the matrix k whose half-width is H = 3 × sigma, and W = 2 × H+1.
k = klog(sigma, H) as above but the half-width H is specified.

See also
kgauss, kdog, kdgauss, iconv, zcross

kmeans
K-means clustering
[L,C] = kmeans(x, k, options) is k-means clustering of multi-dimensional data points x (D × N) where N is the number of points, and D is the dimension. The data is organized into k clusters based on Euclidean distance from cluster centres C (D × k). L is a vector (N × 1) whose elements indicate which cluster the corresponding element of x belongs to.
[L,C] = kmeans(x, k, c0) as above but the initial clusters c0 (D × k) is given and column I is the initial estimate of the centre of cluster I.
L = kmeans(x, C) is similar to above but the clustering step is not performed, it is assumed to have been completed previously. C (D × k) contains the cluster centroids and L (N × 1) indicates which cluster the corresponding element of x is closest to.

Options
'random'   initial cluster centres are chosen randomly from the set of data points x
'spread'   initial cluster centres are chosen randomly from within the hypercube spanned by x.

Reference
Tou and Gonzalez, Pattern Recognition Principles, pp 94

ksobel
Sobel edge detector
k = ksobel() is the Sobel x-derivative kernel:

|-1  0  1|
|-2  0  2|
|-1  0  1|

Notes
• This kernel is an effective horizontal edge detector
• The Sobel vertical derivative is k'


See also
isobel

ktriangle
Triangular kernel
k = ktriangle(w) is a triangular kernel within a rectangular matrix k. The dimensions of k are w × w if w is scalar or w(1) wide and w(2) high. The triangle is isosceles, full width at the bottom row of the kernel and with its apex in the top row.

Examples
>> ktriangle(3)
ans =
    |0 1 0|
    |0 1 0|
    |1 1 1|

See also
kcircle

lambda2rg
RGB chromaticity coordinates
rgb = lambda2rg(lambda) is the rg-chromaticity coordinate (1 × 2) for illumination at the specific wavelength lambda [metres]. If lambda is a vector (N × 1), then rgb (N × 2) is a matrix whose rows are the chromaticity coordinates at the corresponding elements of lambda.
rgb = lambda2rg(lambda, E) is the rg-chromaticity coordinate (1 × 2) for an illumination spectrum E (N × 1) and lambda (N × 1).
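Example
The chromaticity of monochromatic light (a sketch):
rg = lambda2rg(500e-9)                  % rg-chromaticity at 500 nm
locus = lambda2rg([400:10:700]'*1e-9);  % locus over the visible band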


See also
cmfrgb, lambda2xy

lambda2xy
xy = lambda2xy(lambda) is the xy-chromaticity coordinate (1 × 2) for illumination at the specific wavelength lambda [metres]. If lambda is a vector (N × 1), then xy (N × 2) is a matrix whose rows are the chromaticity coordinates at the corresponding elements of lambda.
xy = lambda2xy(lambda, E) is the xy-chromaticity coordinate (1 × 2) for an illumination spectrum E (N × 1) and lambda (N × 1).

See also
cmfxyz, lambda2rg

loadspectrum
Load spectrum data
s = loadspectrum(lambda, filename) is spectral data (N × D) from file filename interpolated to wavelengths [metres] specified in lambda (N × 1). The spectral data can be scalar (D=1) or vector (D>1) valued. [s,lambda] = loadspectrum(lambda, filename) as above but also returns the passed wavelength lambda.

Notes
• The file is assumed to have its first column as wavelength in metres, the remaining columns are linearly interpolated and returned as columns of s.
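Example
A usage sketch (the file name solar.dat is hypothetical; any file in the format described above will do):
lambda = [400:10:700]' * 1e-9;          % visible wavelengths [m]
s = loadspectrum(lambda, 'solar.dat');
plot(lambda, s)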


luminos
Photopic luminosity function
p = luminos(lambda) is the photopic luminosity function for the wavelengths in lambda. If lambda is a vector (N × 1), then p (N × 1) is a vector whose elements are the luminosity at the corresponding elements of lambda. Luminosity has units of lumens which are the intensity with which wavelengths are perceived by the light-adapted human eye.

See also
rluminos

maxfilt
maximum filter
MAXFILT(s [,w]) maximum filter a signal with window of width w (default is 5). SEE ALSO: medfilt, minfilt. pic 6/93

medfilt1
Median filter
y = medfilt1(x, w) is the one-dimensional median filter of the signal x computed over a sliding window of width w.

Notes
• A median filter performs smoothing but preserves sharp edges, unlike traditional smoothing filters.
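Example
A 3-point median filter removes an impulsive spike but preserves the step edge (a sketch; note this Toolbox function may shadow a function of the same name in other toolboxes):
x = [0 0 0 1 1 1 9 1 1 1];
y = medfilt1(x, 3);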

mkcube
Create cube
p = mkcube(s, options) is a set of points (3 × 8) that define the vertices of a cube of side length s and centred at the origin.
[x,y,z] = mkcube(s, options) as above but return the rows of p as three vectors.
[x,y,z] = mkcube(s, 'edge', options) is a mesh that defines the edges of a cube.

Options
'facepoint'    Add an extra point in the middle of each face, in this case the returned value is 3 × 14 (8 vertices + 6 face centres).
'centre', C    The cube is centred at C (3 × 1) not the origin
'T', T         The cube is arbitrarily transformed by the homogeneous transform T
'edge'         Return a set of cube edges in MATLAB mesh format rather than points.

See also
cylinder, sphere

mkgrid
Create grid of points
p = mkgrid(d, s, options) is a set of points (3 x d^2) that define a d × d planar grid of points with side length s. The points are the columns of p. If d is a 2-vector the grid is d(1)xd(2) points. If s is a 2-vector the side lengths are s(1)xs(2). By default the grid lies in the XY plane, symmetric about the origin.

Options
'T', T    the homogeneous transform T is applied to all points, allowing the plane to be translated or rotated.
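Example
A usage sketch (transl is described later in this chapter):
p = mkcube(0.2, 'T', transl([0.1 0.2 0.5]));   % cube vertices, translated
q = mkgrid(5, 1.0);                             % 5 x 5 planar grid, 1 m side length
[x, y, z] = mkcube(0.2, 'edge');                % edge (mesh) format
mesh(x, y, z)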

mlabel
for mplot style graph
mlabel(lab1 lab2 lab3)

mplot
multiple data
Plot y versus t in multiple windows.
MPLOT(y)
MPLOT(y, n)
MPLOT(y, n, {labels})
Where y is multicolumn data and first column is time. Plot all other vectors versus time in subplots. n is a row vector specifying which variables to plot (1 is first data column, or y(:,2)). labels is a cell array of labels for the subplots.
MPLOT(t, y)
MPLOT(t, y, n)
MPLOT(t, y, n, {labels})
Where y is multicolumn data and t is time. n is a row vector specifying which variables to plot (1 is first data column, or y(:,2)). labels is a cell array of labels for the subplots.
MPLOT(S)
Where S is a structure and one element 't' is assumed to be time. Plot all other vectors versus time in subplots. Subplots are labelled as per the data fields.

mpq
Image moments
m = mpq(im, p, q) is the PQ'th moment of the image im. That is, the sum of I(x,y).x^p.y^q.

See also
mpq poly, npq, upq

mpq poly
Polygon moments
m = MPQ POLY(v, p, q) is the PQ'th moment of the polygon with vertices described by the columns of v.

Notes
• The points must be sorted such that they follow the perimeter in sequence (counter-clockwise).
• If the points are clockwise the moments will all be negated, so centroids will still be correct.
• If the first and last point in the list are the same, they are considered to be a single vertex.

See also
mpq, npq poly, upq poly, Polygon

mtools
simple/useful tools to all windows in figure

ncc
Normalized cross correlation
m = ncc(i1, i2) is the normalized cross-correlation between the two equally sized image patches i1 and i2. The result m is a scalar in the interval -1 (non match) to 1 (perfect match) that indicates similarity.

Notes
• A value of 1 indicates identical pixel patterns.
• The ncc similarity measure is invariant to scale changes in image intensity.

See also
zncc, sad, ssd, isimilarity

niblack
Adaptive thresholding
T = niblack(im, k, w2) is the per-pixel (local) threshold to apply to image im. T has the same dimensions as im. The threshold at each pixel is a function of the mean and standard deviation computed over a W × W window, where W=2*w2+1.
[T,m,s] = niblack(im, k, w2) as above but returns the per-pixel mean m and standard deviation s.

Example
t = niblack(im, -0.2, 20);
idisp(im >= t);

Notes
• This is an efficient algorithm very well suited for binarizing text.
• w2 should be chosen to be half the "size" of the features to be segmented, for example, in text segmentation, the height of a character.
• A common choice of k=-0.2

Reference
An Introduction to Digital Image Processing, W. Niblack, Prentice-Hall, 1986.

See also
otsu, ithresh

norm2
columnwise norm
n = norm2(m)
n = norm2(a, b)

npq
Normalized central image moments
m = npq(im, p, q) is the PQ'th normalized central moment of the image im. That is UPQ(im,p,q)/MPQ(im,0,0).

Notes
• The normalized central moments are invariant to translation and scale.

See also
npq poly, mpq, upq

npq poly
Normalized central polygon moments
m = NPQ POLY(v, p, q) is the PQ'th normalized central moment of the polygon with vertices described by the columns of v.

Notes
• The points must be sorted such that they follow the perimeter in sequence (counter-clockwise).
• If the points are clockwise the moments will all be negated, so centroids will still be correct.
• If the first and last point in the list are the same, they are considered as a single vertex.
• The normalized central moments are invariant to translation and scale.

See also
mpq poly, mpq, npq, upq, Polygon

numcols
Return number of columns in matrix
nc = numcols(m) returns the number of columns in the matrix m.

See also
numrows

numrows
Return number of rows in matrix
nr = numrows(m) returns the number of rows in the matrix m.

See also
numcols

otsu
Threshold selection
T = otsu(im) is an optimal threshold for binarizing an image with a bimodal intensity histogram. T is a scalar threshold that maximizes the variance between the classes of pixels below and above the threshold T.

Example
t = otsu(im);
idisp(im >= t);

Notes
• Performance for images with non-bimodal histograms can be quite poor.

Reference
A Threshold Selection Method from Gray-Level Histograms, N. Otsu, IEEE Trans. Systems, Man and Cybernetics, Vol SMC-9(1), Jan 1979, pp 62-66.

See also
niblack, ithresh

peak
Find peaks in vector
yp = peak(y, options) are the values of the maxima in the vector y.
[yp,i] = peak(y, options) as above but also returns the indices of the maxima in the vector y.
[yp,xp] = peak(y, x, options) as above but also returns the corresponding x-coordinates of the maxima in the vector y. x is the same length as y and contains the corresponding x-coordinates.

Options
'npeaks', N   Number of peaks to return (default all)
'scale', S    Only consider as peaks the largest value in the horizontal range +/- S points
'interp', N   Order of interpolation polynomial (default no interpolation)
'plot'        Display the interpolation polynomial overlaid on the point data

Notes
• To find minima, use peak(-V).
• The interp option fits points in the neighbourhood about the peak with an N'th order polynomial and its peak position is returned. Typically choose N to be odd.

See also
peak2

peak2
Find peaks in a matrix
zp = peak2(z, options) are the peak values in the 2-dimensional signal z.
[zp,ij] = peak2(z, options) as above but also returns the indices of the maxima in the matrix z. Use SUB2IND to convert these to row and column coordinates.

Options
'npeaks', N   Number of peaks to return (default all)
'scale', S    Only consider as peaks the largest value in the horizontal and vertical range +/- S points
'interp'      Interpolate peak (default no interpolation)
'plot'        Display the interpolation polynomial overlaid on the point data

Notes
• To find minima, use peak2(-V).
• The interp option fits points in the neighbourhood about the peak with a paraboloid and its peak position is returned.

See also
peak, sub2ind
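Example
To find the five strongest local maxima in a smoothed image (a sketch; ismooth and testpattern are described in this chapter):
z = ismooth(testpattern('dots', 256, 64, 20), 5);
[zp, ij] = peak2(z, 'npeaks', 5);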

pgmfilt
Pipe image through PGM utility
out = pgmfilt(im, pgmcmd) pipes the image im through a Unix filter program and returns its output as an image. The program given by the string pgmcmd must accept and return images in PGM format.

Notes
• Provides access to a large number of Unix command line utilities such as ImageMagick.

See also
pnmfilt, iread

plot2
Plot trajectories
plot2(p) plots a line with coordinates taken from successive rows of p. p can be N × 2 or N × 3. If p has three dimensions, ie. Nx2xM or Nx3xM then the M trajectories are overlaid in the one plot.
plot2(p, ls) as above but the line style arguments ls are passed to plot.

See also
plot

plot box
a box on the current plot
PLOT BOX(b, ls) draws a box defined by b=[XL XR; YL YR] with optional Matlab linestyle options ls.
PLOT BOX(x1,y1, x2,y2, ls) draws a box with corners at (x1,y1) and (x2,y2), and optional Matlab linestyle options ls.
PLOT BOX('centre', P, 'size', W, ls) draws a box with center at P=[X,Y] and with dimensions W=[WIDTH HEIGHT].
PLOT BOX('topleft', P, 'size', W, ls) draws a box with top-left at P=[X,Y] and with dimensions W=[WIDTH HEIGHT].

plot circle
Draw a circle on the current plot
PLOT CIRCLE(C, R, options) draws a circle on the current plot with centre C=[X Y] and radius R. If C=[X Y Z] the circle is drawn in the XY-plane at height Z. If C (2 × N or 3 × N) and R (1 × N) then a set of N circles are drawn with centre and radius taken from the columns of C and R.

Options
'edgecolor'   the color of the circle's edge, Matlab color spec
'fillcolor'   the color of the circle's interior, Matlab color spec
'alpha'       transparency of the filled circle: 0=transparent, 1=solid.

Notes
• the option can be either a simple linespec (eg. 'r', 'g:') for a non-filled circle, or a set of name, value pairs that are passed to plot.

Examples
plot_circle(c, r, 'r');
plot_circle(c, r, 'fillcolor', 'b');
plot_circle(c, r, 'edgecolor', 'g', 'LineWidth', 5);

See also
plot

plot ellipse
Draw an ellipse on the current plot
PLOT ELLIPSE(a, ls) draws an ellipse defined by X'AX = 0 on the current plot, centred at the origin, with Matlab line style ls.
PLOT ELLIPSE(a, C, ls) as above but centred at C=[X,Y]. If C=[X,Y,Z] the ellipse is parallel to the XY plane but at height Z.

plot ellipse inv
Plot an ellipse
plot ellipse(a, xc, ls) ls is the standard line styles.

plot frame
Plot a coordinate frame represented by a homogeneous transformation
trplot(T, options) draws a coordinate frame corresponding to the homogeneous transformation T.

Options
'color', c    Specify color of the axes, Matlab colorspec
'axes'
'axis'

'name', n       Specify the name of the coordinate frame
'text_opts', n
'view', v       Specify the view angle for the Matlab axes
'width', w
'arrow'
'length', l     Specify length of the axes (default 1)

Examples:
trplot(T, 'color', 'r');
trplot(T, 'framename', 'B');

plot homline
Draw a line in homogeneous form
H = PLOT HOMLINE(L, ls) draws a line in the current figure L.X = 0. The current axis limits are used to determine the endpoints of the line. Matlab line specification ls can be set. The return argument is a vector of graphics handles for the lines.

See also
homline

plot point
point features
PLOT POINT(p, options) adds point markers to a plot, where p is 2 × N and each column is the point coordinate.

Options
'textcolor', colspec   Specify color of text
'textsize', size       Specify size of text
'bold'                 Text in bold font.
'printf', fmt, data    Label points according to printf format string and corresponding element of data
'sequence'             Label points sequentially
Additional options are passed through to PLOT for creating the marker.

See also
plot, text

plot poly
Plot a polygon
plotpoly(p, options) plot a polygon defined by columns of p which can be 2 × N or 3 × N.

options
'fill'    the color of the circle's interior, Matlab color spec
'alpha'   transparency of the filled circle: 0=transparent, 1=solid.

See also
plot, patch, Polygon

plot sphere
Plot spheres
PLOT SPHERE(C, R, color) add spheres to the current figure. C is the centre of the sphere and if it is a 3 × N matrix then N spheres are drawn with centres as per the columns. R is the radius and color is a Matlab color spec, either a letter or 3-vector.
H = PLOT SPHERE(C, R, color) as above but returns the handle(s) for the spheres.
H = PLOT SPHERE(C, R, color, alpha) as above but alpha specifies the opacity of the sphere where 0 is transparent and 1 is opaque. The default is 1.

NOTES
• The sphere is always added, irrespective of figure hold state.
• The number of vertices to draw the sphere is hardwired.

See also
plot

plotp
Plot trajectories
plotp(p) plots a set of points p, which by Toolbox convention are stored one per column. p can be N × 2 or N × 3. By default a linestyle of 'bx' is used.
plotp(p, ls) as above but the line style arguments ls are passed to plot.

See also
plot, plot2

pnmfilt
Pipe image through PNM utility
out = pnmfilt(im, pnmcmd) pipes the image im through a Unix filter program and returns its output as an image. The program given by the string pnmcmd must accept and return images in PNM format.

Notes
• Provides access to a large number of Unix command line utilities such as ImageMagick.

See also
pgmfilt, iread

r2t
Convert rotation matrix to a homogeneous transform
T = r2t(R) is a homogeneous transform equivalent to an orthonormal rotation matrix R with a zero translational component.

Notes
• functions for T in SE(2) or SE(3)
– if R is 2 × 2 then T is 3 × 3, or
– if R is 3 × 3 then T is 4 × 4.
• translational component is zero

See also
t2r

radgrad
Radial gradient
[gr,gt] = radgrad(im) is the radial and tangential gradient of the image im. At each pixel the image gradient vector is resolved into the radial and tangential directions.
[gr,gt] = radgrad(im, centre) as above but the centre of the image is specified as centre=[X,Y] rather than the centre pixel of im.
radgrad(im) as above but the result is displayed graphically.

See also
isobel

ramp
create a ramp vector
ramp(n) output a vector of length n that ramps linearly from 0 to 1
ramp(v) as above but vector is same length as v
ramp(v, d) as above but elements increment by d.

See also
linspace

ransac
Random sample and consensus
m = ransac(func, x, T, options) is the ransac algorithm that robustly fits data x to the model represented by the function func. ransac classifies points that support the model as inliers and those that do not as outliers. x typically contains corresponding point data, one column per point pair. T is a threshold on how well a point fits the estimated model; if the fit residual is above the threshold the point is considered an outlier. ransac determines the subset of points (inliers) that best fit the model described by the function func and the parameter m.
[m,in] = ransac(func, x, T, options) as above but returns the vector in of column indices of x that describe the inlier point set.
[m,in,resid] = ransac(func, x, T, options) as above but returns the final residual of applying func to the inlier set.

Options
'maxTrials', N       maximum number of iterations (default 2000)
'maxDataTrials', N   maximum number of attempts to select a non-degenerate data set (default 100)

Model function
out = func(R) is the function passed to RANSAC and it must accept a single argument R which is a structure:

R.cmd        the operation to perform which is either (string)
R.debug      display what's going on (logical)
R.x          data to work on, N point pairs (6 × N)
R.t          threshold (1 × 1)
R.theta      estimated quantity to test (3 × 3)
R.misc       private data (cell array)

The function return value is also a structure:

out.s        sample size (1 × 1)
out.x        conditioned data (2D × N)
out.misc     private data (cell array)
out.inlier   list of inliers (1 × m)
out.valid    if data is valid for estimation (logical)
out.theta    estimated quantity (3 × 3)
out.resid    model fit residual (1 × 1)

The values of R.cmd are:

'size'         out.s is the minimum number of points required to compute an estimate to out.theta
'condition'    out.x = CONDITION(R.x) condition the point data
'decondition'  out.theta = DECONDITION(R.theta) decondition the estimated model data
'valid'        out.valid is true if a set of points is not degenerate, that is they will produce a model. This is used to discard random samples that do not result in useful models.
'estimate'     [out.theta,out.resid] = EST(R.x) returns the best fit model and residual for the subset of points R.x
'error'        [out.x,out.theta] = ERR(R.theta, R.x, T) evaluates the distance from the model(s) R.theta to the points R.x

Notes
• EST returns the best model out.theta and the subset of R.x that best supports (most inliers) that model.
• If multiple models are found out.theta is a cell array.
• If this function cannot fit a model then out.theta = [].
• For some algorithms (eg. fundamental matrix) it is necessary to condition the data to improve the accuracy of model estimation. For efficiency the data is conditioned once, and the data transform parameters are kept in the .misc element. The inverse conditioning operation is applied to the model to transform the estimate based on conditioned data to a model applicable to the original data.
• The functions FMATRIX and HOMOG are written so as to be callable from RANSAC, that is, they detect a structure argument.

References
• M. Fischler and R. Bolles, "Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography", Comm. Assoc. Comp. Mach., Vol 24, No 6, pp 381-395, 1981
• Richard Hartley and Andrew Zisserman, "Multiple View Geometry in Computer Vision", Cambridge University Press, pp 101-113, 2001
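Example
A usage sketch, assuming corresponding image points p1 and p2 (2 × N) are available and using the Toolbox fmatrix model function (see See also below):
[F, in] = ransac(@fmatrix, [p1; p2], 1e-4);   % robust fundamental matrix estimate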

Author
Peter Kovesi
School of Computer Science & Software Engineering
The University of Western Australia
pk at csse uwa edu au
http://www.csse.uwa.edu.au/~pk

See also
fmatrix, homography

rg addticks
Label spectral locus
RG ADDTICKS() adds wavelength ticks to the spectral locus.

See also
xycolourspace

rluminos
Relative photopic luminosity function
p = rluminos(lambda) is the relative photopic luminosity function for the wavelengths in lambda. If lambda is a vector, then p is a vector whose elements are the relative luminosity at the corresponding elements of lambda. Relative luminosity lies in the interval 0 to 1 which indicates the intensity with which wavelengths are perceived by the light-adapted human eye.

See also
luminos

rotx
Rotation about X axis
R = rotx(theta) is a rotation matrix representing a rotation of theta about the x-axis.

See also
roty, rotz, angvec2r

roty
Rotation about Y axis
R = roty(theta) is a rotation matrix representing a rotation of theta about the y-axis.

See also
rotx, rotz, angvec2r

rotz
Rotation about Z axis
R = rotz(theta) is a rotation matrix representing a rotation of theta about the z-axis.

See also
rotx, roty, angvec2r

rpy2tr
Roll-pitch-yaw angles to homogeneous transform
T = rpy2tr(rpy) is an orthonormal rotation matrix equivalent to the specified roll, pitch, yaw angles which correspond to rotations about the X, Y, Z axes respectively. If rpy has multiple rows they are assumed to represent a trajectory and R is a three dimensional matrix, where the last index corresponds to the rows of rpy.
T = rpy2tr(roll, pitch, yaw) as above but the roll-pitch-yaw angles are passed as separate arguments. If roll, pitch and yaw are column vectors then they are assumed to represent a trajectory and R is a three dimensional matrix, where the last index corresponds to the rows of roll, pitch, yaw.

Note
• in previous releases (<8) the angles corresponded to rotations about ZYX.
• many texts (Paul, Spong) use the rotation order ZYX.

See also
tr2rpy, eul2tr

rt2tr
Convert rotation and translation to homogeneous transform
TR = rt2tr(R, t) is a homogeneous transformation matrix formed from an orthonormal rotation matrix R and a translation vector t.

Notes
• functions for R in SO(2) or SO(3)
– If R is 2 × 2 and t is 2 × 1, then TR is 3 × 3.
– If R is 3 × 3 and t is 3 × 1, then TR is 4 × 4.
• the validity of R is not checked

See also
t2r, r2t, tr2rt

sad
Sum of absolute differences
m = sad(i1, i2) is the sum of absolute differences between the two equally sized image patches i1 and i2. The result m is a scalar that indicates image similarity, a value of 0 indicates identical pixel patterns and is increasingly positive as image dissimilarity increases.

See also
zsad, ssd, ncc, isimilarity

se2
Create planar translation and rotation transformation
T = se2(x, y, theta) is a 3 × 3 homogeneous transformation SE(2) representing translation x and y, and rotation theta in the plane.
T = se2(xy) as above where xy=[x,y] and rotation is zero
T = se2(xy, theta) as above where xy=[x,y]
T = se2(xyt) as above where xyt=[x,y,theta]

See also
trplot2
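Example
A planar transform with a translation of (1, 2) and a rotation of 30 degrees (a sketch; trplot2, listed above, can display the corresponding coordinate frame):
T = se2(1, 2, 30*pi/180);
trplot2(T)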

skew
Create skew-symmetric matrix
s = skew(v) is a skew-symmetric matrix and v is a 3-vector.

See also
vex

ssd
Sum of squared differences
m = ssd(i1, i2) is the sum of squared differences between the two equally sized image patches i1 and i2. The result m is a scalar that indicates image similarity, a value of 0 indicates identical pixel patterns and is increasingly positive as image dissimilarity increases.

See also
zssd, sad, ncc, isimilarity

stdisp
Display stereo pair
stdisp(L, R) displays the stereo image pair L and R in adjacent windows. Two cross-hairs are created. Clicking a point in the left image positions a black cross hair at the same pixel coordinate in the right image. Clicking the corresponding world point in the right image sets the green crosshair and displays the disparity [pixels].

See also
idisp, istereo

t2r
Return rotational submatrix of a homogeneous transformation
R = t2r(T) is the orthonormal rotation matrix component of homogeneous transformation matrix T.

Notes
• functions for T in SE(2) or SE(3)
– If T is 4 × 4, then R is 3 × 3.
– If T is 3 × 3, then R is 2 × 2.
• the validity of rotational part is not checked

See also
r2t, tr2rt, rt2tr

tb optparse
Standard option parser for Toolbox functions
[optout,args] = TB OPTPARSE(opt, arglist) is a generalized option parser for Toolbox functions. It supports options that have an assigned value, boolean or enumeration types (string or int).
The software pattern is:

function(a, b, c, varargin)
opt.foo = true;
opt.bar = false;
opt.blah = [];
opt.choose = {'this', 'that', 'other'};

opt.select = {'#no', '#yes'};
opt = tb_optparse(opt, varargin);

Optional arguments to the function behave as follows:

'foo'          sets opt.foo <- true
'nobar'        sets opt.bar <- false
'blah', 3      sets opt.blah <- 3
'blah', {x,y}  sets opt.blah <- {x,y}
'that'         sets opt.choose <- 'that'
'yes'          sets opt.select <- 2 (the second element)

and these can be given in any combination.
If neither of 'this', 'that' or 'other' are specified then opt.choose <- 'this'. If neither of 'no' or 'yes' are specified then opt.select <- 1.

Note:
• that the enumerator names must be distinct from the field names.
• that only one value can be assigned to a field, if multiple values are required they must be converted to a cell array.

The return structure is automatically populated with fields: verbose and debug. The following options are automatically parsed:

'verbose'     sets opt.verbose <- true
'debug', N    sets opt.debug <- N
'setopt', S   sets opt <- S
'showopt'     displays opt and arglist

By default if an option is given that is not a field of opt an error is declared. Sometimes it is useful to collect the unassigned options and this can be achieved using a second output argument
[opt,arglist] = tb_optparse(opt, varargin);
which is a cell array of all unassigned arguments in the order given in varargin.

testpattern
Create test images
im = testpattern(type, w, args) creates a test pattern image. If w is a scalar the image is w × w else w(2)xW(1). The image is specified by the string type and one or two (type specific) arguments:

'rampx'     intensity ramp from 0 to 1 in the x-direction, args is the number of cycles.
'rampy'     intensity ramp from 0 to 1 in the y-direction, args is the number of cycles.
'sinx'      sinusoidal intensity pattern (from -1 to 1) in the x-direction, args is the number of cycles.
'siny'      sinusoidal intensity pattern (from -1 to 1) in the y-direction, args is the number of cycles.
'dots'      binary dot pattern, args are dot pitch (distance between centres), dot diameter.
'squares'   binary square pattern, args are pitch (distance between centres), square side length.
'line'      a line, args are theta (rad), intercept.

Examples
A 256 × 256 image with 2 cycles of a horizontal sawtooth intensity ramp:
testpattern('rampx', 256, 2);
A 256 × 256 image with a grid of dots on 50 pixel centres and 25 pixels in diameter:
testpattern('dots', 256, 50, 25);

Notes
• With no output argument the testpattern is displayed using idisp.

See also
idisp

tpoly
Generate scalar polynomial trajectory
[s,sd,sdd] = tpoly(s0, sf, n) is a trajectory of a scalar that varies smoothly from s0 to sf in n steps using a quintic (5th order) polynomial. Velocity and acceleration can be optionally returned as sd and sdd. The trajectory s, sd and sdd are n-vectors.
[s,sd,sdd] = tpoly(s0, sf, T) as above but specifies the trajectory in terms of the length of the time vector T.
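Example
A smooth scalar trajectory from 0 to 1 over 50 steps (a sketch):
[s, sd, sdd] = tpoly(0, 1, 50);
plot([s(:) sd(:) sdd(:)])     % position, velocity and acceleration profiles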

tr2angvec
Convert rotation matrix to angle-vector form
[theta,v] = tr2angvec(T) is a rotation of theta about the vector v equivalent to the rotational component of the homogeneous transform T.
[theta,v] = tr2angvec(R) is a rotation of theta about the vector v equivalent to the orthonormal rotation matrix R.

Notes
• If no output arguments are specified the result is displayed.

See also
angvec2r, angvec2tr

tr2rpy
Convert a homogeneous transform to roll-pitch-yaw angles
rpy = tr2rpy(T, options) are the roll-pitch-yaw angles expressed as a row vector corresponding to the rotation part of a homogeneous transform T. The 3 angles rpy=[R,P,Y] correspond to sequential rotations about the X, Y and Z axes respectively.
rpy = tr2rpy(R, options) are the roll-pitch-yaw angles expressed as a row vector corresponding to the orthonormal rotation matrix R.
If R or T represents a trajectory (has 3 dimensions), then each row of rpy corresponds to a step of the trajectory.

Options
'deg'   Compute angles in degrees (radians default)
'zyx'   Return solution for sequential rotations about Z, Y, X axes (Paul book)

Notes
• There is a singularity for the case where THETA=0 in which case PHI is arbitrarily set to zero and PSI is the sum (PHI+PSI).
• Note that textbooks (Paul, Spong) use the rotation order ZYX.

See also
rpy2tr, tr2eul
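Example
A round trip through rpy2tr (described earlier) and tr2rpy (a sketch):
T = rpy2tr(0.1, 0.2, 0.3);
rpy = tr2rpy(T)               % recovers [0.1 0.2 0.3]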

tr2rt
Convert homogeneous transform to rotation and translation
[R,t] = tr2rt(TR) split a homogeneous transformation matrix into an orthonormal rotation matrix R and a translation vector t.

Notes
• Functions for TR in SE(2) or SE(3)
– If TR is 4 × 4, then R is 3 × 3 and t is 3 × 1.
– If TR is 3 × 3, then R is 2 × 2 and t is 2 × 1.
• The validity of R is not checked.

See also
rt2tr, r2t, t2r

transl
Create translational transform
T = transl(x, y, z) is a homogeneous transform representing a pure translation.
T = transl(p) is a homogeneous transform representing a translation or point p=[x,y,z]. If p is an M × 3 matrix transl returns a 4x4xM matrix representing a sequence of homogenous transforms such that T(:,:,i) corresponds to the i'th row of p.
p = transl(T) is the translational part of a homogenous transform as a 3-element column vector. If T has three dimensions, ie. 4x4xN, then T is considered a homogeneous

transform sequence and returns an N × 3 matrix where each row is the translational component of the corresponding transform in the sequence.

Notes
• somewhat unusually this function performs a function and its inverse. An historical anomaly.

See also
ctraj

tristim2cc
Tristimulus to chromaticity coordinates
cc = tristim2cc(tri) is the chromaticity coordinate (1 × 2) corresponding to the tristimulus tri (1 × 3). Multiple tristimulus values can be given as rows of tri (N × 3) in which case the chromaticity coordinates are the corresponding rows of cc (N × 2). If tri is RGB then cc is rg, if tri is XYZ then cc is xy.
[c1,c2] = tristim2cc(tri) as above but the chromaticity coordinates are returned in separate vectors, each N × 1.
out = tristim2cc(im) is the chromaticity coordinates corresponding to every pixel in the tristimulus image im (HxWx3). out (HxWx2) has planes corresponding to r and g, or x and y.
[o1,o2] = tristim2cc(im) as above but the chromaticity is returned as separate images (H × W).

trnorm
Normalize a homogeneous transform
tn = trnorm(T) is a normalized homogeneous transformation matrix in which the rotation submatrix is guaranteed to be a proper orthogonal matrix. The O and A vectors are normalized and the normal vector is formed from O x A.

Notes
• Used to prevent finite word length arithmetic causing transforms to become 'unnormalized'.

See also
oa2tr

trotx
Rotation about X axis
T = trotx(theta) is a homogeneous transformation representing a rotation of theta about the x-axis.

Notes
• translational component is zero

See also
rotx, troty, trotz

troty
Rotation about Y axis
T = troty(theta) is a homogeneous transformation representing a rotation of theta about the y-axis.

Notes
• translational component is zero

See also
roty, trotx, trotz

trotz
Rotation about Z axis
T = trotz(theta) is a homogeneous transformation representing a rotation of theta about the z-axis.

Notes
• translational component is zero

See also
rotz, trotx, troty

trprint
Compact display of homogeneous transformation
trprint(T, options) displays the homogeneous transform in a compact single-line format. If T is a homogeneous transform sequence then each element is printed on a separate line.
trprint T is the command line form of above, and displays in RPY format.

Options
'rpy'        display with rotation in roll/pitch/yaw angles (default)
'euler'      display with rotation in ZYX Euler angles
'angvec'     display with rotation in angle/vector format
'radian'     display angle in radians (default is degrees)
'fmt', f     use format string f for all numbers, (default %g)
'label', l   display the text before the transform

See also
tr2eul, tr2rpy, tr2angvec

unit
Unitize a vector
vn = unit(v) is a unit vector parallel to v.

Note
• fails for the case where norm(v) is zero.

upq
Central image moments
m = upq(im, p, q) is the PQ'th central moment of the image im. That is, the sum of I(x,y).(x-x0)^p.(y-y0)^q where (x0,y0) is the centroid.

Notes
• The central moments are invariant to translation.

See also
upq poly, mpq, npq


upq poly
Central polygon moments
m = UPQ POLY(v, p, q) is the PQ’th central moment of the polygon with vertices described by the columns of v.

Notes
• The points must be sorted such that they follow the perimeter in sequence (counter-clockwise).
• If the points are clockwise the moments will all be negated, so centroids will still be correct.
• If the first and last point in the list are the same, they are considered as a single vertex.
• The central moments are invariant to translation.

See also
upq, mpq poly, npq poly

usefig
a named figure or create a new figure
usefig('Foo') make figure 'Foo' the current figure, if it doesn't exist create it.
h = usefig('Foo') as above, but returns the figure handle.

vex
Convert skew-symmetric matrix to vector
v = vex(s) is the vector whose corresponding skew-symmetric matrix is s.


Notes
• No checking is done to ensure that the matrix is skew-symmetric.
• The function takes the mean of the two elements that correspond to each unique element of the matrix, ie. vx = 0.5*(s(3,2)-s(2,3))

See also
skew
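Example
A round-trip sketch using skew (described earlier) and vex:
S = skew([1 2 3]);    % 3 x 3 skew-symmetric matrix
v = vex(S)            % recovers the vector [1 2 3]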

xaxis
X-axis scaling
xaxis(max)
xaxis([min max])
xaxis(min, max)
xaxis restore automatic scaling for this axis

xycolorspace
Display spectral locus
xycolorspace() display a fully colored spectral locus in terms of CIE x and y coordinates. xycolorspace(p) as above but plot the points whose xy-chromaticity is given by the columns of p. [im,ax,ay] = xycolorspace() as above returns the spectral locus as an image im, with corresponding x- and y-axis coordinates ax and ay respectively.

Notes
• The colors shown within the locus only approximate the true colors, due to the gamut of the display device.


See also
rg addticks

yaxis
Y-axis scaling
yaxis(max)
yaxis(min, max)
yaxis restore automatic scaling for this axis

zcross
Zero-crossing detector
iz = zcross(im) is a binary image with pixels set where the corresponding pixels in the signed image im have a zero crossing, a positive pixel adjacent to a negative pixel.

Notes
• Can be used in association with a Laplacian of Gaussian image to determine edges.

See also
ilog


zncc
Normalized cross correlation
m = zncc(i1, i2) is the zero-mean normalized cross-correlation between the two equally sized image patches i1 and i2. The result m is a scalar in the interval -1 to 1 that indicates similarity. A value of 1 indicates identical pixel patterns.

Notes
• The zncc similarity measure is invariant to affine changes in image intensity (brightness offset and scale).

See also
ncc, sad, ssd, isimilarity

zsad
Sum of absolute differences
m = zsad(i1, i2) is the zero-mean sum of absolute differences between the two equally sized image patches i1 and i2. The result m is a scalar that indicates image similarity, a value of 0 indicates identical pixel patterns and is increasingly positive as image dissimilarity increases.

Notes
• The zsad similarity measure is invariant to changes in image brightness offset.

See also
sad, ssd, ncc, isimilarity
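Example
The zero-mean measures are unaffected by a constant brightness offset (a sketch; p1 is any image patch of class double):
p2 = p1 + 0.3;        % add a brightness offset
sad(p1, p2)           % large: plain SAD is sensitive to the offset
zsad(p1, p2)          % near zero: the offset is removed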

zssd
Sum of squared differences
m = zssd(i1, i2) is the zero-mean sum of squared differences between the two equally sized image patches i1 and i2. The result m is a scalar that indicates image similarity, a value of 0 indicates identical pixel patterns and is increasingly positive as image dissimilarity increases.

Notes
• The zssd similarity measure is invariant to changes in image brightness offset.

See also
ssd, sad, ncc, isimilarity
